Abstract: The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support in both research and industry settings. So far, tool support has been proposed for many activities in aspect-oriented software development, to automate and facilitate them. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs, in any paradigm, as pursued in static analysis is best served by a high-level program representation such as the Control Flow Graph (CFG): such an analysis can more easily locate common programmatic idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be maintained more closely. However, although current research defines good concepts and foundations, to some extent, for the control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can construct the CFG of these programs on its own. Furthermore, most of this work focuses on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on CFG construction itself. This study is therefore dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied in many software engineering tasks in the context of AOSD, such as software testing, software metrics, and so forth.
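As a minimal illustration of the representation this abstract describes (not AJcFgraph Builder's actual API, which is not given here), a CFG can be sketched as basic blocks with directed edges, over which a typical static analysis such as unreachable-code detection is a graph walk. All names below are illustrative.

```python
from collections import defaultdict

class CFG:
    """Minimal control flow graph: basic blocks as nodes, directed edges."""
    def __init__(self):
        self.succ = defaultdict(set)  # block -> successor blocks
        self.nodes = set()

    def add_edge(self, src, dst):
        self.nodes.update((src, dst))
        self.succ[src].add(dst)

    def reachable(self, entry):
        """Blocks reachable from the entry block (depth-first walk)."""
        seen, stack = set(), [entry]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(self.succ[n])
        return seen

# A tiny method body: entry -> cond -> (then | else) -> exit, plus dead code.
g = CFG()
for s, d in [("entry", "cond"), ("cond", "then"), ("cond", "else"),
             ("then", "exit"), ("else", "exit")]:
    g.add_edge(s, d)
g.nodes.add("dead")  # a block no edge reaches
unreachable = g.nodes - g.reachable("entry")
print(unreachable)  # {'dead'}
```

Analyses such as testing-coverage computation or control-flow metrics then operate on this same node/edge structure.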
Abstract: In this paper, we develop a spatio-temporal graph as a key component of our knowledge representation scheme. We design an integrated representation scheme that depicts not only the present and the past but also the future, in parallel with the spaces, in an effective and intuitive manner. The resulting multi-dimensional, comprehensive knowledge structure accommodates a multi-layered virtual world developing in time, so as to maximize the diversity of situations in the historical context. This knowledge representation scheme is to be used as the basis for simulating the situations that compose the virtual world and for implementing the virtual agents' knowledge used to judge and evaluate those situations. To provide natural contexts for situated learning or simulation games, the virtual stage set by this spatio-temporal graph is to be populated by agents and other objects, interrelated and changing, which are abstracted in the ontology.
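A hedged sketch of the core data structure: one common way to encode a spatio-temporal graph (the paper's exact scheme is not given here) keys nodes by (entity, time), with temporal edges following one entity through time and spatial edges linking co-located entities at the same tick. The entities and locations below are invented.

```python
# Spatio-temporal graph sketch: nodes are (entity, time) pairs.
positions = {  # entity -> {time: location}; invented example data
    "agent_a": {0: "market", 1: "temple"},
    "agent_b": {0: "market", 1: "market"},
}
temporal, spatial = set(), set()
# Temporal edges: same entity, consecutive ticks.
for entity, track in positions.items():
    times = sorted(track)
    for t0, t1 in zip(times, times[1:]):
        temporal.add(((entity, t0), (entity, t1)))
# Spatial edges: distinct entities sharing a location at one tick.
for t in {t for tr in positions.values() for t in tr}:
    here = [(e, tr[t]) for e, tr in positions.items() if t in tr]
    for i, (e1, l1) in enumerate(here):
        for e2, l2 in here[i + 1:]:
            if l1 == l2:
                spatial.add(frozenset({(e1, t), (e2, t)}))
print(len(temporal), len(spatial))  # 2 1
```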
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization: the genetic algorithm (GA) and the Lipschitz Branch-and-Bound (LBB) algorithm. These techniques have been used to improve an agricultural system model developed by Dexcel Limited, New Zealand, which provides a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of the Sammon mapping and parallel coordinates visualization are described which help visualize some important properties of the model's output topography. From this study, it is found that the GA requires fewer function evaluations in optimization than the LBB algorithm.
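To make the GA side of the comparison concrete, here is a minimal, generic genetic algorithm of the kind the abstract refers to: truncation selection, one-point crossover, and Gaussian mutation on a cheap stand-in objective. The objective, population sizes, and rates are illustrative assumptions, not the Dexcel model or FPI.

```python
import random

random.seed(42)

def fitness(x):
    """Objective to maximise (negative sphere function, a cheap stand-in)."""
    return -sum(v * v for v in x)

def evolve(pop_size=20, dims=3, gens=30):
    pop = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    evals = 0
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        evals += len(pop)                          # count function evaluations
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, dims)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # Gaussian mutation
                child[random.randrange(dims)] += random.gauss(0, 0.5)
            children.append(child)
        pop = children
    return max(pop, key=fitness), evals

best, evals = evolve()
print(evals)  # 600
```

Counting `evals` this way is exactly how the evaluation budgets of the GA and LBB would be compared on an expensive simulation.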
Abstract: This paper presents the effects of migration on urban sites, using an integrated model under sustainable local development policies for the conservation and revitalization of such areas, with the Reyhan heritage site in Bursa as a case study. Bursa is known as the "City of Immigrants" because of the richness of its cultural plurality. The city has always regarded the dynamic impact of immigration as a positive contribution. As a result, the city created some of the earliest urbanization practices, being the first capital city of the Ottoman Empire; Bursa also created the first modern movement practices and set up the first Organized Industrial Zone. The most important aim of the study is to offer a model for similar areas in the context of the conservation and revitalization of historical areas, subject to the integrated sustainable development policies of local governments.
Abstract: A DEA model can generally evaluate performance using multiple inputs and outputs for the same period. However, it is often hard to avoid the production lead-time phenomenon, as in a long-term project or a marketing activity. A couple of models have been suggested to capture this time-lag issue in the context of DEA. This paper develops a dual-MPO model to deal with the time-lag effect in evaluating efficiency. A numerical example is also given to show that the proposed model can be used to obtain the efficiency and reference set of inefficient DMUs, and to obtain projected target values of the input attributes for inefficient DMUs to become efficient.
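A heavily simplified illustration of the DEA ideas named above, not the paper's dual-MPO model: with a single input and a single output, CCR efficiency reduces to each DMU's output/input ratio divided by the best ratio in the sample, and a projected input target follows directly. The DMU names and figures are invented; the general multi-input/multi-output case requires a linear-programming solver.

```python
# DMU name -> (input, output); invented single-input, single-output data.
dmus = {"A": (2.0, 4.0), "B": (3.0, 9.0), "C": (5.0, 5.0)}

ratios = {name: out / inp for name, (inp, out) in dmus.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}

for name, (inp, out) in dmus.items():
    if efficiency[name] < 1.0:
        # projected input target for the DMU to become efficient at its output
        target_input = out / best
        print(name, round(efficiency[name], 3), round(target_input, 3))
```

Here DMU B is efficient (ratio 3.0), while A and C receive efficiencies of 2/3 and 1/3 and correspondingly reduced input targets.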
Abstract: Using a methodology grounded in business process
change theory, we investigate the critical success factors that affect
ERP implementation success in the United States and India.
Specifically, we examine the ERP implementation at two case study
companies, one in each country. Our findings suggest that certain
factors that affect the success of ERP implementations are not
culturally bound, whereas some critical success factors depend on the
national culture of the country in which the system is being
implemented. We believe that the understanding of these critical
success factors will deepen the understanding of ERP
implementations and will help avoid implementation mistakes,
thereby increasing the rate of success in culturally different contexts.
Implications of the findings and future research directions for both
academicians and practitioners are also discussed.
Abstract: With the proliferation of multi-channel retailing, developing a better understanding of the factors that affect customers' purchase behaviors within a multi-channel retail context has become an important topic for practitioners and academics. While many studies have investigated the various customer behaviors associated with brick-and-mortar retailing, online retailing, and brick-and-click retailing, little research has explored how customer shopping value perceptions influence online purchase behaviors within the TV-and-online retail environment. The main purpose of this study is to investigate the influence of TV and online shopping values on online patronage intention. Data collected from 116 respondents in Taiwan are tested against the research model using the partial least squares (PLS) approach. The results indicate that utilitarian and hedonic TV shopping values have indirect, positive influences on online patronage intention through their online counterparts in the TV-and-online retail context. The findings of this study provide several important theoretical and practical implications for multi-channel retailing.
Abstract: A catastrophic earthquake measuring 6.3 on the Richter scale struck the Central Business District of Christchurch, New Zealand on February 22, 2011, abruptly disrupting the business of teaching and learning at Christchurch Polytechnic Institute of
Technology. This paper presents the findings from a study
undertaken about the complexity of delivering an educational
programme in the face of this traumatic natural event. Nine
interconnected themes emerged from this multiple method study:
communication, decision making, leader- and follower-ship,
balancing personal and professional responsibilities, taking action,
preparedness and thinking ahead, all within a disruptive and uncertain
context. Sustainable responses that maximise business continuity, and provide solutions to practical challenges, are among the study's recommendations.
Abstract: This paper continues our daily energy peak load forecasting approach using our modified network, which belongs to the recurrent network family and is called the feed-forward and feed-back multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, and endogenous variables, such as the past change in the loads. An endogenous variable, the current change in the loads, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN produces better results than using either exogenous or endogenous variables alone as inputs to the same network. Experiments also show that using the changes in variables such as the weather components and the past load as inputs to the FFFB-MCANN, rather than their absolute values, has a dramatic impact and produces better accuracy.
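The "change in variables" input encoding described above is just first differencing of the input series before they are fed to the network. A minimal sketch with invented load and temperature figures:

```python
# Invented series: daily peak loads (MW) and temperatures (°C).
loads = [1020.0, 1035.0, 1010.0, 1050.0]
temps = [21.5, 23.0, 22.0, 25.5]

def deltas(series):
    """First differences: the change from each step to the next."""
    return [b - a for a, b in zip(series, series[1:])]

load_changes = deltas(loads)   # endogenous input features
temp_changes = deltas(temps)   # exogenous input features
print(load_changes)  # [15.0, -25.0, 40.0]
```

These difference vectors, rather than the absolute series, would then form the network's input layer.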
Abstract: Selection of the best possible set of suppliers has a
significant impact on the overall profitability and success of any
business. For this reason, it is usually necessary to optimize all
business processes and to make use of cost-effective alternatives for
additional savings. This paper proposes a new efficient context-aware
supplier selection model that takes into account possible changes of
the environment while significantly reducing selection costs. The
proposed model is based on data clustering techniques and draws on certain principles of online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously determined best set of suppliers to obtain a new one. Any re-computation of unchanged elements of the environment is thereby avoided, and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it yields better solutions than common static selection models in this field.
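The incremental idea can be sketched as follows: keep the previous best set and re-score only the suppliers whose context changed, instead of re-running selection over the whole environment. The scoring function and supplier data are invented stand-ins for the paper's clustering-based scores.

```python
# Invented supplier scores; higher is better.
def select_best(scores, k):
    """Full selection run: top-k suppliers by score."""
    return set(sorted(scores, key=scores.get, reverse=True)[:k])

scores = {"s1": 0.9, "s2": 0.7, "s3": 0.6, "s4": 0.4}
best = select_best(scores, k=2)            # initial full run: {'s1', 's2'}

changes = {"s3": 0.95}                     # only s3's context changed
scores.update(changes)
# Superimpose the change on the previous best set: compare each changed
# supplier against the current worst member only; unchanged suppliers
# are never re-examined.
for name in changes:
    worst = min(best, key=scores.get)
    if scores[name] > scores[worst]:
        best = (best - {worst}) | {name}
print(sorted(best))  # ['s1', 's3']
```

With n suppliers and c changes, this touches O(c·k) entries per sub-period instead of re-sorting all n.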
Abstract: The join dependency provides the basis for obtaining a lossless join decomposition in a classical relational schema. The existence of a join dependency shows that the tables always represent the correct data after being joined. Since classical relational databases cannot handle imprecise data, they were extended to fuzzy relational databases so that uncertain, ambiguous, imprecise and partially known information can also be stored in a formal way. However, like classical databases, fuzzy relational databases also undergo decomposition during normalization, and the issue of joining the decomposed fuzzy relations remains open. Our effort in the present paper is to address this issue. We define fuzzy join dependency in the framework of type-1 and type-2 fuzzy relational databases using the concept of fuzzy equality, which is defined using fuzzy functions, and we use the fuzzy equi-join operator to compute the fuzzy equality of two attribute values. We also discuss the dependency preservation property on execution of this fuzzy equi-join, and derive the necessary condition for fuzzy functional dependencies to be preserved on joining the decomposed fuzzy relations. We further derive the conditions for a fuzzy join dependency to exist in the context of both type-1 and type-2 fuzzy relational databases. We find that, unlike in classical relational databases, even the existence of a trivial join dependency does not ensure a lossless join decomposition in type-2 fuzzy relational databases. Finally, we derive the conditions for the fuzzy equality to be non-zero and for the qualification of an attribute as a fuzzy key.
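A small sketch of the fuzzy equi-join idea: two tuples join when the fuzzy equality of their join attributes exceeds a threshold. The triangular membership function, the threshold, and the relations below are illustrative assumptions, not the paper's actual definitions.

```python
def fuzzy_eq(a, b, tolerance=5.0):
    """Degree to which a and b are 'equal': 1 when identical, 0 beyond tolerance."""
    return max(0.0, 1.0 - abs(a - b) / tolerance)

# Invented decomposed relations sharing an imprecise 'age' attribute.
r1 = [("p1", 20.0), ("p2", 34.0)]   # (person_id, approx_age)
r2 = [(19.0, "young"), (50.0, "old")]  # (approx_age, label)

# Fuzzy equi-join: keep pairs whose fuzzy equality is at least 0.5,
# carrying the equality degree along with the joined tuple.
joined = [(pid, label, fuzzy_eq(age, key))
          for pid, age in r1 for key, label in r2
          if fuzzy_eq(age, key) >= 0.5]
print(joined)  # [('p1', 'young', 0.8)]
```

Classical equi-join is recovered as the special case where `fuzzy_eq` returns 1 only on exact equality.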
Abstract: This study demonstrates the use of Class F fly ash in
combination with lime or lime kiln dust in the full depth reclamation
(FDR) of asphalt pavements. FDR, in the context of this paper, is a
process of pulverizing a predetermined amount of flexible pavement
that is structurally deficient, blending it with chemical additives and
water, and compacting it in place to construct a new stabilized base
course. Test sections of two structurally deficient asphalt pavements
were reclaimed using Class F fly ash in combination with lime and
lime kiln dust. In addition, control sections were constructed using
cement, cement and emulsion, lime kiln dust and emulsion, and mill
and fill. The service performance and structural behavior of the FDR
pavement test sections were monitored to determine how the fly ash
sections compared to other more traditional pavement rehabilitation
techniques. Service performance and structural behavior were
determined with the use of sensors embedded in the road and Falling
Weight Deflectometer (FWD) tests. Monitoring results of the FWD
tests conducted up to 2 years after reclamation show that the cement, fly ash+LKD, and fly ash+lime sections exhibited two-year resilient modulus values comparable to open-graded cement-stabilized aggregates (more than 750 ksi). The cement treatment resulted in a significant increase in resilient modulus within 3 weeks of construction; beyond this curing time, the stiffness increase was slow. On the other hand, the fly ash+LKD and fly ash+lime test sections showed a slower short-term increase in stiffness. The fly ash+LKD and fly ash+lime sections' average resilient modulus values at two years after construction were in excess of 800 ksi. Additional
longer-term testing data will be available from ongoing pavement
performance and environmental condition data collection at the two
pavement sites.
Abstract: Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of evolutionary algorithms in such
problem domains is thus practically prohibitive. An attractive
alternative is to build meta-models, or to use an approximation of the actual fitness functions to be evaluated. These meta-models are orders of magnitude cheaper to evaluate than the actual function evaluation. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary
optimization context. We further present two evolutionary algorithm
frameworks which involve use of meta models for fitness function
evaluation. The first framework, namely the Dynamic Approximate
Fitness based Hybrid EA (DAFHEA) model [14] reduces
computation time by controlled use of meta-models (in this case
approximate model generated by Support Vector Machine
regression) to partially replace the actual function evaluation by
approximate function evaluation. However, the underlying
assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take into account uncertain scenarios involving noisy fitness functions.
The second model, DAFHEA-II, an enhanced version of the original
DAFHEA framework, incorporates a multiple-model based learning
approach for the support vector machine approximator to handle
noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
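The core mechanism of meta-model-assisted evaluation can be sketched with the simplest possible surrogate: expensive calls are cached, and a cheap nearest-neighbour interpolator stands in for the true fitness whenever a close-enough sample already exists. DAFHEA uses a Support Vector Machine regressor; nearest-neighbour interpolation is used here only as a minimal stand-in, and the points and objective are invented.

```python
import math

archive = {}          # point -> true fitness (training samples for the surrogate)
true_calls = 0

def expensive_fitness(x):
    """Stand-in for a costly simulation run."""
    global true_calls
    true_calls += 1
    return -(x[0] ** 2 + x[1] ** 2)

def fitness(x, radius=0.1):
    """Controlled use of the surrogate: only trust it near an archived sample."""
    if archive:
        nearest = min(archive, key=lambda p: math.dist(p, x))
        if math.dist(nearest, x) <= radius:
            return archive[nearest]    # cheap approximate evaluation
    archive[x] = expensive_fitness(x)  # fall back to the true function
    return archive[x]

points = [(0.0, 0.0), (0.01, 0.0), (1.0, 1.0), (1.0, 1.05)]
values = [fitness(p) for p in points]
print(true_calls)  # 2
```

Only 2 of the 4 evaluations hit the expensive function; the trust `radius` plays the role of the framework's control over when the approximation may replace the actual evaluation.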
Abstract: In this work, we improve a previously developed
segmentation scheme aimed at extracting edge information from
speckled images using a maximum likelihood edge detector. The
scheme was based on finding a threshold for the probability density
function of a new kernel defined as the arithmetic mean-to-geometric
mean ratio field over a circular neighborhood set and, in a general
context, is founded on a likelihood random field model (LRFM). The
segmentation algorithm was applied to discriminated speckle areas
obtained using simple elliptic discriminant functions based on
measures of the signal-to-noise ratio with fractional order moments.
A rigorous stochastic analysis was used to derive an exact expression for the cumulative distribution function of the random field. Based on this, an accurate probability
of error was derived and the performance of the scheme was
analysed. The improved segmentation scheme performed well for
both simulated and real images and showed superior results to those
previously obtained using the original LRFM scheme and standard
edge detection methods. In particular, the false alarm probability was
markedly lower than that of the original LRFM method with
oversegmentation artifacts virtually eliminated. The importance of
this work lies in the development of a stochastic-based segmentation,
allowing an accurate quantification of the probability of false
detection. Non-visual quantification of misclassification in medical ultrasound speckle images is relatively new and is of interest to clinicians.
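The arithmetic-mean-to-geometric-mean ratio kernel at the heart of the scheme is easy to sketch: over a neighbourhood of positive intensities, AM/GM ≥ 1 by the AM-GM inequality, with equality only for a perfectly homogeneous patch, so edges and texture push the ratio above 1. The patch values below are invented; the circular neighbourhood and threshold selection of the paper are omitted.

```python
import math

def am_gm_ratio(patch):
    """Arithmetic-mean-to-geometric-mean ratio over a neighbourhood set."""
    am = sum(patch) / len(patch)
    gm = math.exp(sum(math.log(v) for v in patch) / len(patch))  # log-domain GM
    return am / gm

flat = [100.0] * 9               # homogeneous patch: ratio is exactly 1
edge = [20.0] * 4 + [200.0] * 5  # patch straddling a boundary: ratio > 1
print(round(am_gm_ratio(flat), 3), round(am_gm_ratio(edge), 3))
```

Thresholding this ratio field is what turns the statistic into an edge map.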
Abstract: Parsing is important in linguistics and natural language processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems such as ambiguity and inefficiency, and the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features that make it even more challenging. To obtain solutions, a lexicalized and statistical approach is to be applied in the parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A trigram-based statistical language model for Tamil, with a medium vocabulary of 5,000 words, has been built. Though statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has disadvantages: a focus on semantics rather than syntax, and a lack of support for free word order and long-distance relationships. To overcome these disadvantages, a structural component is to be incorporated into the statistical language model, which leads to the implementation of hybrid language models. This paper attempts to build a phrase-structured hybrid language model that resolves the above-mentioned disadvantages. In developing the hybrid language model, a new part-of-speech tag set for Tamil, with more than 500 tags offering wider coverage, has been developed. A phrase-structured treebank of 326 Tamil sentences, covering more than 5,000 words, has also been developed. The hybrid language model has been trained on this treebank using the immediate-head parsing technique. A lexicalized and statistical parser that employs this hybrid language model and the immediate-head parsing technique gives better results than pure grammar and trigram-based models.
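The trigram component described above can be sketched in a few lines: maximum-likelihood trigram probabilities estimated from counts. Toy English tokens stand in for Tamil words, and smoothing (which a real 5,000-word model would need) is omitted.

```python
from collections import Counter

# Toy corpus with sentence boundary markers; stand-ins for Tamil words.
corpus = [["<s>", "the", "dog", "runs", "</s>"],
          ["<s>", "the", "dog", "sleeps", "</s>"],
          ["<s>", "the", "cat", "runs", "</s>"]]

tri, bi = Counter(), Counter()
for sent in corpus:
    for a, b, c in zip(sent, sent[1:], sent[2:]):
        tri[(a, b, c)] += 1   # trigram counts
        bi[(a, b)] += 1       # context (history) counts

def p(w, prev2, prev1):
    """P(w | prev2 prev1), maximum-likelihood estimate."""
    n = bi[(prev2, prev1)]
    return tri[(prev2, prev1, w)] / n if n else 0.0

print(p("runs", "the", "dog"))  # 0.5
```

A hybrid model of the kind the abstract describes would combine these probabilities with phrase-structure rules from the treebank rather than use them alone.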
Abstract: The article emphasizes the ideological commitment of
the philosopher Emil Cioran. It presents firstly Cioran's works on the
theme announced by the title, then the European context that
determined the political option of Cioran and a brief analysis of his
relationship with History during his French period. The anti-
Semitism of Cioran was favored by his attachment to a few
philosophers, but also by the European extremist and anti-Semitic
context. The article seeks to demonstrate that the philosopher Cioran, known more for his pessimism and nihilism, maintained over time an obsessive relationship with History. His political philosophy is as important as his better-known subjective philosophy.
Abstract: Whilst there is growing evidence that activity
across the lifespan is beneficial for improved health, there are
also many changes involved with the aging process and
subsequently the potential for reduced indices of health. The
nexus between health, physical activity and aging is complex
and has raised much interest in recent times due to the
realization that a multifaceted approach is necessary in order to counteract a growing obesity epidemic. By
investigating age based trends within a population adhering to
competitive sport at older ages, further insight might be
gleaned to assist in understanding one of many factors
influencing this relationship.
BMI was derived using data gathered on a total of 6,071 masters athletes (51.9% male, 48.1% female) aged 25 to 91 years (mean = 51.5, s = 9.7), competing at the Sydney World Masters Games (2009). Using linear and loess regression, it was demonstrated that the usual tendency for the prevalence of higher BMI to increase with age was reversed in the sample.
This reversal was repeated for both the male-only and female-only subsets of the sample participants, indicating the possibility of an improved BMI profile with increasing age for both the sample as a whole and these individual subgroups.
This evidence of improved classification in one index of health (reduced BMI) for masters athletes, when compared to the general population, implies either that there are improved levels of this index of health with aging due to adherence to sport, or possibly that reduced BMI is advantageous and contributes to this cohort adhering (or being attracted) to masters sport at older ages. Demonstration of this
proportionately under-investigated World Masters Games
population having an improved relationship between BMI and
increasing age over the general population is of particular
interest in the context of the measures being taken globally to
curb an obesity epidemic.
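The linear-regression step the study describes amounts to a least-squares slope of BMI on age; a negative slope for this sample corresponds to lower BMI at older ages. The data points below are invented to illustrate the calculation, not the study's data.

```python
# Invented (age, BMI) observations for illustration only.
ages = [30.0, 40.0, 50.0, 60.0, 70.0]
bmis = [26.0, 25.4, 24.9, 24.1, 23.8]

n = len(ages)
mean_a, mean_b = sum(ages) / n, sum(bmis) / n
# Least-squares slope: covariance of (age, BMI) over variance of age.
slope = (sum((a - mean_a) * (b - mean_b) for a, b in zip(ages, bmis))
         / sum((a - mean_a) ** 2 for a in ages))
print(round(slope, 4))  # -0.057
```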
Abstract: Market competition and a desire to gain advantages in a globalized market drive companies towards innovation efforts. Project overload is an unpleasant phenomenon experienced by employees inside organizations that try to make the most efficient use of their resources in order to be innovative. But what are the impacts of project overload on an organization's innovation capabilities? Advanced engineering (AE) teams inside a major heavy equipment manufacturer are suffering from project overload in their quest for innovation. In this paper, agent-based modeling (ABM) is used to examine the current reality of the company context and of the AE team, and to identify the opportunities and challenges for reducing the risk of project overload and moving towards innovation. Project overload is likely to stifle innovation and creativity inside teams; on the other hand, motivation through appropriately challenging goals can help individuals alleviate the negative aspects of a low level of project overload.
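A toy agent-based sketch of the overload dynamic (the paper's actual ABM and the company's data are not given here; capacities and project counts below are invented): each engineer has a capacity, incoming projects go to the least-loaded agent, and an agent over capacity is treated as producing no innovation output.

```python
class Engineer:
    """Agent with a fixed project capacity and a current load."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.load = 0

agents = [Engineer(capacity=3) for _ in range(4)]
for _ in range(16):                                  # 16 incoming projects
    min(agents, key=lambda a: a.load).load += 1      # assign to least-loaded

overloaded = sum(a.load > a.capacity for a in agents)
innovating = sum(a.load <= a.capacity for a in agents)
print(overloaded, innovating)  # 4 0
```

Even perfectly balanced assignment overloads every agent once demand exceeds total capacity, which is the team-level effect the model is used to explore.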
Abstract: This paper critiques several existing strategic international human resource management (SIHRM) frameworks and
discusses their limitations to apply directly to emerging multinational
enterprises (EMNEs), especially those generated from China and
other BRICS nations. To complement the existing SIHRM
frameworks, key variables relevant to emerging economies are
identified and the extended model with particular reference to
EMNEs is developed with several research propositions. It is
believed that the extended model would better capture the recent
development of MNEs in transition, and alert emerging international managers to several human resource management challenges in the global context.
Abstract: System-level design based on high-level abstractions
is becoming increasingly important in hardware and embedded
system design. This paper analyzes meta-design techniques aimed at developing meta-programs and meta-models for well-understood
domains. Meta-design techniques include meta-programming and
meta-modeling. At the programming level of the design process, meta-design means developing generic components that are usable in a wider context of application than the original domain components. At the
modeling level, meta-design means developing design patterns that
describe general solutions to the common recurring design problems,
and meta-models that describe the relationship between different
types of design models and abstractions. The paper describes and
evaluates the implementation of meta-design in hardware design
domain using object-oriented and meta-programming techniques.
The presented ideas are illustrated with a case study.
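A small meta-programming sketch in the spirit described above: one generic generator instantiates a family of components for different parameters, the way a meta-program produces domain components usable in a wider context. The register-like component and its names are illustrative, not the paper's case study.

```python
def make_register(name, width):
    """Meta-program: generate a register-like component class for a bit width."""
    def store(self, value):
        self.value = value % (1 << width)   # wrap the value to the register width
    # Build the class dynamically from the generic template.
    return type(name, (object,), {"width": width, "store": store, "value": 0})

# Two concrete components instantiated from one generic description.
Reg8 = make_register("Reg8", 8)
Reg16 = make_register("Reg16", 16)

r = Reg8()
r.store(300)
print(r.value, Reg16.width)  # 44 16
```

The design-pattern level of meta-design plays the analogous role one abstraction layer up: templates over models rather than over components.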