Abstract: Evolutionary Algorithms are population-based,
stochastic search techniques widely used as efficient global
optimizers. However, many real-life optimization tasks require
finding optimal solutions to complex, high-dimensional,
multimodal problems involving computationally very expensive
fitness function evaluations. The use of evolutionary algorithms
in such problem domains is thus practically prohibitive. An
attractive alternative is to build meta-models, i.e.
approximations of the actual fitness functions to be evaluated.
These meta-models are orders of magnitude cheaper to evaluate
than the actual function. Many regression and interpolation
tools are available to build such meta-models. This paper briefly
discusses the architectures and use of such meta-modeling tools in an evolutionary
optimization context. We further present two evolutionary
algorithm frameworks that use meta-models for fitness function
evaluation. The first framework, the Dynamic Approximate
Fitness based Hybrid EA (DAFHEA) model [14], reduces
computation time through the controlled use of meta-models (in
this case an approximate model generated by Support Vector
Machine regression) to partially replace actual function
evaluations with approximate ones. However, the underlying
assumption in DAFHEA is that the training samples for the
meta-model are generated from a single uniform model, which does
not account for uncertain scenarios involving noisy fitness functions.
The second model, DAFHEA-II, an enhanced version of the original
framework, incorporates a multiple-model based learning
approach for the support vector machine approximator to handle
noisy functions [15]. Empirical results obtained by evaluating the
frameworks on several benchmark functions demonstrate their
efficiency.
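The controlled-evaluation idea behind such surrogate-assisted EAs can be sketched as follows. This is a minimal illustration, not the authors' DAFHEA: the test function, population sizes, mutation scale, and the quarter of children re-evaluated exactly are all assumptions, and scikit-learn's `SVR` stands in for the SVM regression approximator.

```python
# Minimal sketch of a surrogate-assisted EA (not the authors' DAFHEA):
# an SVR meta-model trained on exact evaluations replaces most fitness calls.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def sphere(x):
    """Stand-in for a computationally expensive fitness function."""
    return float(np.sum(x ** 2))

dim, pop_size, n_gen = 5, 20, 30
pop = rng.uniform(-5, 5, (pop_size, dim))
archive_X = pop.copy()                                # exactly evaluated points
archive_y = np.array([sphere(x) for x in pop])

model = SVR(kernel="rbf", C=100.0)
for gen in range(n_gen):
    model.fit(archive_X, archive_y)                   # (re)train the meta-model
    parents = pop[rng.integers(0, pop_size, 2 * pop_size)]
    children = parents + rng.normal(0, 0.5, parents.shape)  # Gaussian mutation
    approx = model.predict(children)                  # cheap approximate fitness
    elite = np.argsort(approx)[: pop_size // 4]       # most promising children
    exact = np.array([sphere(children[i]) for i in elite])  # few exact calls
    archive_X = np.vstack([archive_X, children[elite]])
    archive_y = np.concatenate([archive_y, exact])
    scores = approx.copy()
    scores[elite] = exact                             # trust exact where known
    pop = children[np.argsort(scores)[:pop_size]]     # survivor selection

print(min(archive_y))   # best exactly evaluated fitness found
```

Only the elite quarter of each offspring population is evaluated exactly (170 exact calls in total instead of 1220), which is the cost-saving that motivates this family of methods.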
Abstract: Innovation, technology and knowledge form a trilogy
of impact that supports the challenges arising from uncertainty.
The evidence revealed an opportunity to ask how to manage in an
environment of constant innovation. In an attempt to obtain an
answer from the field of Management Sciences, grounded in
Contingency Theory, a study was conducted with
phenomenological and descriptive approaches, using the Case Study
Method and its usual procedures, involving a focus
group composed of managers and employees working in the
pharmaceutical field. The problem situation was raised, the state
of the art was interpreted, and the facts were dissected. Four
establishments were involved in this task. The results indicate
that the ventures studied have been managed empirically by their
founders and are experiencing the agility described in this work.
This study is expected to improve concepts for stakeholders
regarding creativity in business.
Abstract: ISO 9000 is the most popular and widely adopted meta-standard for quality and operational improvements. However, only limited empirical research has been conducted to examine the impact of ISO 9000 on operational performance based on objective and longitudinal data. To reveal any causal relationship between the adoption of ISO 9000 and operational performance, we examined the timing and magnitude of change in time-based performance as a result of ISO 9000 adoption. We analyzed the changes in operating cycle, inventory days, and accounts receivable days before and after the implementation of ISO 9000 in 695 publicly listed manufacturing firms. We found that ISO 9000 certified firms shortened their operating cycle time by 5.28 days one year after the implementation of ISO 9000. In the long run (3 years after certification), certified firms showed continuous improvement in time-based efficiency and experienced an operating cycle time 11 days shorter than that of non-certified firms. There was an average improvement of 6.5% in operating cycle time for ISO 9000 certified firms. Both inventory days and accounts receivable days showed similar significant improvements after the implementation of ISO 9000.
Abstract: An empirical study of web applications that use
software frameworks is presented here. The analysis is based on two
approaches. In the first, developers using such frameworks are
asked, based on their experience, to assign weights to parameters
such as database connection. In the second approach, a performance
testing tool, OpenSTA, is used to compute start time and other such
measures. From this analysis, it is concluded that open source
software is superior to proprietary software. The motivation behind
this research is to examine ways in which a quantitative assessment
can be made of software in general and frameworks in particular.
Concepts such as metrics and architectural styles are discussed along
with previously published research.
Abstract: The paper presents an application of quantitative analysis, the Data Envelopment Analysis (DEA) method, to the performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim of the paper is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part is devoted to the fundamental basis of performance theory and the methodology of DEA. The empirical part is aimed at measuring the degree of productivity and the level of efficiency changes of the evaluated countries using a basic DEA model, the CCR CRS model, and a specialized DEA approach, the Malmquist Index, which measures the change in technical efficiency and the movement of the production possibility frontier. Here, the DEA method becomes a suitable tool for establishing the competitive or uncompetitive position of each country, because not just one factor is evaluated but a set of different factors that determine the degree of economic development.
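The basic CCR CRS model applied in such studies can be sketched in its multiplier (linear programming) form, solved once per decision-making unit. The data below are made up for illustration, not the EU input-output set used in the paper.

```python
# Minimal sketch of the input-oriented CCR (CRS) DEA model in multiplier form,
# solved per DMU with scipy's linprog. Illustrative toy data only.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """X: (n, m) inputs, Y: (n, s) outputs; returns CRS efficiency per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    effs = []
    for o in range(n):
        # Variables z = [u (output weights), v (input weights)], all >= 0.
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u.y_o
        A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        effs.append(-res.fun)                             # efficiency of DMU o
    return np.array(effs)

inputs = np.array([[2.0], [4.0], [3.0]])   # e.g. one input per country
outputs = np.array([[2.0], [3.0], [3.0]])  # e.g. one output per country
print(ccr_efficiency(inputs, outputs))     # DMUs on the frontier score 1.0
```

With a single input and output, the CRS efficiency reduces to each unit's output/input ratio divided by the best ratio, so the second unit here scores 0.75 while the other two lie on the frontier.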
Abstract: Modern organizations operate under the pressure of
dynamic and often unpredictable changes in both their external
and internal environments. Market success, in this context,
requires a particular competence in the form of flexibility,
interpreted here both at the level of individuals and at the
level of the organization. This paper addresses the changes
taking place in the sphere of employment, as observed in economic
entities operating on the Polish market. Based on their own
empirical studies, the authors focus on the progressing trend of
‘flexibilization’ of employment, particularly in the context of
transformations in organizational structure designed to
facilitate the transition to management by projects and the
differentiation of labor forms.
Abstract: The Artificial Neural Network (ANN) has been
extensively used for the classification of heart sounds because
of its discriminative training ability and easy implementation.
However, it suffers from overparameterization if the number of
nodes is not chosen properly. In such cases, when the dataset
contains redundancy, the ANN is trained along with this redundant
information, which results in poor validation. A larger network
also means more computational expense, resulting in higher
hardware- and time-related costs. Therefore, an optimum design of
the neural network is needed for real-time detection of
pathological patterns, if any, from heart sound signals. The aims
of this work are to (i) select a set of input features that are
effective for the identification of heart sound signals and
(ii) make an optimum selection of nodes in the hidden layer for a
more effective ANN structure. Here, we present an optimization
technique that involves Singular Value Decomposition (SVD) and
QR factorization with column pivoting (QRcp) to optimize an
empirically chosen, over-parameterized ANN structure. The input
nodes of the ANN structure are optimized by SVD followed by QRcp,
while only SVD is required to prune undesirable hidden nodes.
Results are presented for classifying 12 common pathological
cases and normal heart sounds.
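The role of SVD and QRcp in detecting redundant nodes can be illustrated on a synthetic activation matrix. This is a minimal sketch under an assumed tolerance, not the paper's full pruning procedure; the matrix sizes and threshold are arbitrary choices.

```python
# Minimal sketch: SVD estimates how many hidden nodes carry independent
# information; QR with column pivoting (QRcp) then points at which ones.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)

# Hidden-layer activation matrix H: rows = training patterns, cols = nodes.
# Deliberately redundant: 10 nodes, but only 4 independent sources.
n_patterns, n_independent, n_nodes = 200, 4, 10
base = np.tanh(rng.normal(size=(n_patterns, n_independent)))
H = base @ rng.normal(size=(n_independent, n_nodes))   # rank(H) == 4

svals = np.linalg.svd(H, compute_uv=False)
tol = svals[0] * 1e-8                     # relative singular-value threshold
effective = int(np.sum(svals > tol))      # number of nodes worth keeping

_, _, piv = qr(H, pivoting=True)          # QRcp ranks columns by importance
keep = sorted(piv[:effective])            # indices of hidden nodes to retain
print(effective, keep)
```

The six singular values below the threshold correspond to hidden nodes whose activations are linear combinations of the others, which is exactly the redundancy that pruning removes.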