Abstract: Cognitive science emerged about 40 years ago, in the wake of the challenge posed by artificial intelligence, as common ground for several scientific disciplines: computer science, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The new-born science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena on its own. Based on data supplied by experimental sciences such as psychology and neurology, cognitive science builds models of how the human mind operates. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction that is indispensable for mediating between the sciences mentioned.
The general problematics of the cognitive approach has given rise to two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which regards the products of thinking as the result of the interaction between all the component (included) systems. In psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculus methods. Considering both sides of cognitive science, we can notice a gap in the possibilities of measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality–quantity whole. In such an approach, measurement by calculus proves inefficient. Our research, carried out over more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.
Abstract: The ever-growing use of the aspect-oriented development methodology in the field of software engineering requires tool support for both research environments and industry. So far, tool support has been proposed for many activities in aspect-oriented software development, to automate and facilitate them. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring.
particular, it is well established that the abstract interpretation of
programs, in any paradigm, pursued in static analysis is best served
by a high-level programs representation, such as Control Flow Graph
(CFG). This is why such analysis can more easily locate common
programmatic idioms for which helpful transformation are already
known as well as, association between the input program and
intermediate representation can be more closely maintained.
However, although current research defines, to some extent, sound concepts and foundations for control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can construct the CFG of these programs on its own. Furthermore, most of these works focus on other issues of Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. Therefore, this study is dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied in many software engineering tasks in the context of AOSD, such as software testing, software metrics, and so forth.
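The abstract does not describe AJcFgraph Builder's internals, but the representation such a tool produces can be illustrated with a minimal sketch of classical basic-block CFG construction over a toy instruction list. The tuple instruction forms ("op",), ("jmp", target) and ("br", target) are invented for illustration; the actual tool must additionally handle aspect-oriented constructs such as advice and join points.

```python
def build_cfg(instrs):
    """Partition a toy instruction list into basic blocks and edges.
    Instructions are tuples: ("op",), ("jmp", target) for an unconditional
    jump, ("br", target) for a conditional branch with fall-through."""
    leaders = {0}                         # block entry points
    for i, ins in enumerate(instrs):
        if ins[0] in ("jmp", "br"):
            leaders.add(ins[1])           # the jump target starts a block
            if i + 1 < len(instrs):
                leaders.add(i + 1)        # so does the following instruction
    starts = sorted(leaders)
    blocks = {s: list(range(s, e))
              for s, e in zip(starts, starts[1:] + [len(instrs)])}
    edges = set()
    for s, idxs in blocks.items():
        last = instrs[idxs[-1]]
        nxt = idxs[-1] + 1
        if last[0] in ("jmp", "br"):
            edges.add((s, last[1]))       # taken-branch edge
        if last[0] != "jmp" and nxt < len(instrs):
            edges.add((s, nxt))           # fall-through edge
    return blocks, edges
```

For instance, `build_cfg([("op",), ("br", 3), ("jmp", 4), ("op",), ("op",)])` yields four blocks starting at indices 0, 2, 3 and 4, with the conditional branch contributing both a taken edge and a fall-through edge.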
Abstract: In this paper, we propose a geometric modeling of the illumination on a patterned image containing etched transistors. The image is captured by a commercial camera during the inspection of a TFT-LCD panel. Defect inspection is an important process in the production of LCD panels, but the regional difference in brightness, which has a negative effect on the inspection, is caused by an uneven illumination environment. In order to solve this problem, we present a geometric model of the illumination consisting of an interpolation using the least squares method and 3D modeling using a Bézier surface. Thanks to the sampling method, our computational time is shorter than that of previous methods. Moreover, the model can be further used to correct the brightness in every patterned image.
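The Bézier-surface part of the model can be sketched as follows: a surface patch is evaluated from a grid of scalar brightness control values using Bernstein polynomials. The 2×2 control grid below is an illustrative assumption; in the work described above, the control values would come from the least squares fit to sampled image brightness.

```python
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t ** i * (1 - t) ** (n - i)

def bezier_surface(ctrl, u, v):
    """Evaluate a Bezier surface patch at (u, v) in [0,1]^2 from a grid of
    scalar brightness control values `ctrl`."""
    n, m = len(ctrl) - 1, len(ctrl[0]) - 1
    return sum(bernstein(n, i, u) * bernstein(m, j, v) * ctrl[i][j]
               for i in range(n + 1) for j in range(m + 1))

# A 2x2 control grid gives a bilinear patch; the corner control values
# are reproduced exactly at the patch corners.
ctrl = [[1.0, 2.0],
        [3.0, 4.0]]
```

Evaluating `bezier_surface(ctrl, u, v)` over the image domain yields a smooth illumination estimate that can be divided out of (or subtracted from) the captured image to correct the regional brightness difference.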
Abstract: This paper describes a complex energy signal model
that is isomorphic with digital human fingerprint images. By using
signal models, the problem of fingerprint matching is transformed
into the signal processing problem of finding a correlation between
two complex signals that differ by phase-rotation and time-scaling. A
technique for minutiae matching that is independent of image
translation, rotation and linear-scaling, and is resistant to missing
minutiae is proposed. The method was tested using random data
points. The results show that, for matching prints, the scaling factor and rotation angle are estimated closely, and a stronger match yields a higher correlation.
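The core idea, correlating complex signals that differ by a phase rotation and a scaling, can be sketched for paired minutiae encoded as complex numbers. The least-squares estimator below and the toy minutiae values are illustrative; the paper's method additionally handles time-scaling of the signal and missing minutiae.

```python
import cmath

def estimate_rotation_scale(z_ref, z_test):
    """Least-squares c = s*exp(i*theta) minimizing sum |c*z_ref - z_test|^2
    over paired minutiae encoded as complex numbers."""
    num = sum(b * a.conjugate() for a, b in zip(z_ref, z_test))
    den = sum(abs(a) ** 2 for a in z_ref)
    c = num / den
    return abs(c), cmath.phase(c)

def match_strength(z_ref, z_test):
    """Normalized correlation magnitude in [0, 1]; 1 for a perfect match."""
    num = abs(sum(b * a.conjugate() for a, b in zip(z_ref, z_test)))
    den = (sum(abs(a) ** 2 for a in z_ref)
           * sum(abs(b) ** 2 for b in z_test)) ** 0.5
    return num / den

z_ref = [1 + 1j, 2 - 1j, -1 + 3j]                     # reference minutiae
z_test = [1.5 * cmath.exp(0.3j) * z for z in z_ref]   # scaled, rotated copy
scale, angle = estimate_rotation_scale(z_ref, z_test)
```

For the synthetic pair above, the estimator recovers the scaling factor 1.5 and rotation angle 0.3 exactly, and the normalized correlation is 1, consistent with the abstract's observation that a stronger match has a higher correlation.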
Abstract: Unlike their conventional counterparts, Islamic banks are forbidden by Islamic principles from earning any interest-related income, which makes deposits from depositors an important source of funds for their operations and financing. Consequently, the risk of deposit withdrawal by depositors is an important aspect that should be well managed in Islamic banking. This paper aims to investigate the factors that influence depositors' withdrawal behavior in Islamic banks, particularly in Malaysia, using the framework of the theory of reasoned action. A total of 368 respondents from the Klang Valley are involved in the analysis. The paper finds that all the construct variables, i.e. normative beliefs, subjective norms, behavioral beliefs, and attitude towards behavior, are perceived as distinct by the respondents. In addition, the structural equation model is able to verify the structural relationships between subjective norms, attitude towards behavior, and behavioral intention. Subjective norms exert more influence on depositors' decisions on deposit withdrawal than attitude towards behavior.
Abstract: Using a methodology grounded in business process
change theory, we investigate the critical success factors that affect
ERP implementation success in the United States and India.
Specifically, we examine the ERP implementation at two case study
companies, one in each country. Our findings suggest that certain
factors that affect the success of ERP implementations are not
culturally bound, whereas some critical success factors depend on the
national culture of the country in which the system is being
implemented. We believe that the understanding of these critical
success factors will deepen the understanding of ERP
implementations and will help avoid implementation mistakes,
thereby increasing the rate of success in culturally different contexts.
Implications of the findings and future research directions for both
academicians and practitioners are also discussed.
Abstract: With the proliferation of multi-channel retailing, developing a better understanding of the factors that affect customers' purchase behaviors within a multi-channel retail context has become an important topic for practitioners and academics. While many studies have investigated the various customer behaviors associated with brick-and-mortar retailing, online retailing, and brick-and-click retailing, little research has explored how customer shopping value perceptions influence online purchase behaviors within the TV-and-online retail environment. The main purpose of this study is to investigate the influence of TV and online shopping values on online patronage intention. Data collected from 116 respondents in Taiwan are tested against the research model using the partial least squares (PLS) approach. The results indicate that utilitarian and hedonic TV shopping values have indirect, positive influences on online patronage intention through their online counterparts in the TV-and-online retail context. The findings of this study provide several important theoretical and practical implications for multi-channel retailing.
Abstract: Concrete performance is strongly affected by the
particle packing degree since it determines the distribution of the
cementitious component and the interaction of mineral particles. By
using packing theory designers will be able to select optimal
aggregate materials for preparing concrete with low cement content,
which is beneficial from the point of view of cost. Optimum particle packing implies minimizing porosity and thereby reducing the amount of cement paste needed to fill the voids between the aggregate particles, while also taking the rheology of the concrete into consideration. To achieve good fluidity, superplasticizers are required. The results of pilot tests at Luleå University of Technology (LTU) show various forms of the proposed theoretical models, and the empirical approach taken in the study seems to provide a safer basis for developing new, improved packing models.
Abstract: In this study, we propose a network architecture for providing secure wireless access to the information resources of an enterprise network from remote locations. Our proposed architecture offers a very promising solution for organizations in need of a secure, flexible and cost-effective remote access methodology. The security of the proposed architecture is based on Virtual Private Network technology and a special role-based access control mechanism with location and time constraints. The flexibility mainly comes from the use of the Internet as the communication medium, and the cost-effectiveness is due to the possibility of in-house implementation of the proposed architecture.
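The role-based access control with location and time constraints can be sketched as a constrained permission check. All role names, resource names, locations and the access window below are invented examples; the actual mechanism sits behind the VPN gateway and would draw these values from enterprise policy.

```python
from datetime import time

class Role:
    """A role carrying resource permissions plus location and daily time
    constraints (all names and windows below are invented examples)."""
    def __init__(self, permissions, locations, start, end):
        self.permissions = set(permissions)   # resources the role may access
        self.locations = set(locations)       # allowed connection origins
        self.start, self.end = start, end     # daily access window

    def allows(self, resource, location, at):
        """Grant access only if all three constraints hold simultaneously."""
        return (resource in self.permissions
                and location in self.locations
                and self.start <= at <= self.end)

engineer = Role({"wiki", "repo"}, {"vpn-home", "branch-office"},
                time(8, 0), time(18, 0))
```

A request is denied if any single constraint fails: the right resource from the wrong location, or at the wrong time of day, is rejected just as an unauthorized resource would be.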
Abstract: Interpretation of aerial images is an important task in
various applications. Image segmentation can be viewed as the essential
step for extracting information from aerial images. Among many
developed segmentation methods, the technique of clustering has been
extensively investigated and used. However, determining the number
of clusters in an image is inherently a difficult problem, especially
when a priori information on the aerial image is unavailable. This
study proposes a support vector machine approach for clustering
aerial images. Three cluster validity indices, distance-based index,
Davies-Bouldin index, and Xie-Beni index, are utilized as quantitative
measures of the quality of the clustering results. Comparisons of the effectiveness of these indices and of various parameter settings for the proposed method are conducted. Experimental results are provided
to illustrate the feasibility of the proposed approach.
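Of the three validity indices named above, the Davies-Bouldin index is the most compact to state: the average, over clusters, of the worst ratio of within-cluster scatter to between-centroid separation. A minimal sketch for 2D points (the toy clusters in the usage note are illustrative, not data from the study):

```python
def davies_bouldin(clusters):
    """Davies-Bouldin index for a list of clusters of 2D points; lower
    means tighter, better-separated clusters."""
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    cents = [centroid(c) for c in clusters]
    # within-cluster scatter: mean distance of points to their centroid
    scatter = [sum(dist(p, c) for p in pts) / len(pts)
               for pts, c in zip(clusters, cents)]
    k = len(clusters)
    # for each cluster, its worst similarity ratio against any other cluster
    worst = [max((scatter[i] + scatter[j]) / dist(cents[i], cents[j])
                 for j in range(k) if j != i) for i in range(k)]
    return sum(worst) / k
```

Two tight, well-separated clusters score much lower than two overlapping ones, which is why the index can rank candidate numbers of clusters when no a priori information about the image is available.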
Abstract: The purpose of this study is to examine the self-esteem and decision-making levels of students receiving education in schools of physical education and sports. The population of the study consisted of 258 students, of whom 152 were male and 106 were female (mean age = 19.3713 ± 1.6968 years), receiving education in the schools of physical education and sports of Selcuk University, Inonu University, Gazi University and Karamanoglu Mehmetbey University. To achieve the purpose of the study, the Melbourne Decision Making Questionnaire developed by Mann et al. (1998) [1] and adapted to Turkish by Deniz (2004) [2], and the Self-Esteem Scale developed by Aricak (1999) [3], were utilized. For analyzing and interpreting the data, the Kolmogorov-Smirnov test, t-test and one-way ANOVA were used, while for determining the differences between the groups the Tukey test and multiple linear regression were employed; significance was accepted at P
Abstract: Ramadan requires individuals to abstain from food and fluid intake between sunrise and sunset; physiological considerations predict that poorer mood, physical performance and mental performance will result. In addition, any difficulties will be worsened because preparations for fasting and recovery from it often mean that nocturnal sleep is decreased in length, and this independently affects mood and performance.
A difficulty of interpretation in many studies is that the observed changes could be due to fasting but also to the decreased length of sleep and altered food and fluid intakes before and after the daytime fasting. These factors were separated in this study, which took place over three separate days and compared the effects of different durations of fasting (4, 8 or 16h) upon a wide variety of measures (including subjective and objective assessments of performance, body composition, dehydration and responses to a short bout of exercise) - but with an unchanged amount of nocturnal sleep, controlled supper the previous evening, controlled intakes at breakfast and daytime naps not being allowed. Many of the negative effects of fasting observed in previous studies were present in this experiment also. These findings indicate that fasting was responsible for many of the changes previously observed, though some effect of sleep loss, particularly if occurring on successive days (as would occur in Ramadan) cannot be excluded.
Abstract: When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. We consider stereo vision a method that provides the reconstruction of a scene using only a pair of images of the scene and performing some computation. Through several images of a scene, captured from different positions, stereo vision can give us an idea of the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, by analogy with the mammalian vision system. In this work we employ only one camera, which is translated along a path, capturing images at regular distances. As we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses "flies" to reconstruct the principal characteristics of the world following certain evolutionary rules.
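The fly-algorithm idea can be illustrated with a deliberately simplified toy: a flat wall at depth 10 with a linear texture, two pinhole cameras, and flies (3D points) whose fitness is the photometric disagreement between their projections into the two views. The scene, camera geometry and mutation-only evolution are all illustrative assumptions; the actual algorithm compares pixel neighbourhoods in real images and applies full genetic operators.

```python
import random

# Toy scene: a textured wall at depth Z = 10 seen by two pinhole cameras
# (focal length 1) at x = -1 and x = +1.  With everything on the wall, the
# intensity a camera at cx records in image direction u is texture(cx + 10*u).
def texture(x):
    return x          # linear toy texture: depth is the only thing to recover

def image(cx, u):
    return texture(cx + 10.0 * u)

def fitness(fly):
    """Photometric disagreement between the two projections of a fly (x, z);
    zero exactly when the fly lies on the wall (z = 10)."""
    x, z = fly
    u_left = (x + 1.0) / z     # projection into the camera at x = -1
    u_right = (x - 1.0) / z    # projection into the camera at x = +1
    return abs(image(-1.0, u_left) - image(1.0, u_right))

random.seed(1)
flies = [(random.uniform(-2.0, 2.0), random.uniform(2.0, 20.0))
         for _ in range(30)]
for _ in range(200):           # evolutionary loop: mutate, keep the better fly
    for i in range(len(flies)):
        x, z = flies[i]
        child = (x + random.gauss(0.0, 0.3),
                 max(1.0, z + random.gauss(0.0, 0.5)))
        if fitness(child) < fitness(flies[i]):
            flies[i] = child

best = min(flies, key=fitness)  # should settle close to the wall depth z = 10
```

Because flies are scored only through their projections, the population concentrates on the visible surface without any exhaustive correspondence search, which is the property the abstract exploits for real-time partial reconstruction.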
Abstract: Due to the stringent emission legislation for diesel engines and the increasing demands on fuel consumption, the importance of detailed 3D simulation of fuel injection, mixing and combustion has increased in recent years. In the present
work, FIRE code has been used to study the detailed modeling of
spray and mixture formation in a Caterpillar heavy-duty diesel
engine. The paper provides an overview of the submodels
implemented, which account for liquid spray atomization, droplet
secondary break-up, droplet collision, impingement, turbulent
dispersion and evaporation. The simulation was performed from
intake valve closing (IVC) to exhaust valve opening (EVO). The
predicted in-cylinder pressure is validated by comparing with
existing experimental data. The good agreement between the predicted and experimental values confirms the accuracy of the numerical predictions obtained in the present work. Predictions of engine emissions were also performed, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained with the use of the Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper illustrate that numerical simulation can be one of the most powerful and beneficial tools for internal combustion engine design, optimization and performance analysis.
Abstract: With the gradual increase in enterprise scale, firms may possess many manufacturing plants located in geographically different places. This change results in multi-site production planning problems in an environment of multiple plants and production resources. Our research proposes a structural framework to analyze multi-site planning problems. The analytical
framework is composed of six elements: multi-site conceptual model,
product structure (bill of manufacturing), production strategy,
manufacturing capability and characteristics, production planning
constraints, and key performance indicators. In addition to discussing these six elements, we also review the related literature in this paper and relate it to our analytical framework. Finally, we use a real-world example of a TFT-LCD manufacturer in Taiwan to explain our proposed analytical framework for multi-site production planning problems.
Abstract: The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for the automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is its ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The method is formulated as a minimax problem, and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices, the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.
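The interpolation artifact the abstract refers to is easy to reproduce in one dimension: sampling a step function on a mesh that straddles the jump and interpolating linearly invents intermediate values the function never takes. The step function and mesh below are illustrative, not taken from the paper.

```python
def f(x):
    """Step function with a jump discontinuity at x = 0."""
    return 0.0 if x < 0 else 1.0

# Uniform mesh straddling the jump, as a plotting algorithm would sample it
xs = [-0.3, -0.1, 0.1, 0.3]
ys = [f(x) for x in xs]

def interp(x):
    """Piecewise-linear interpolation over the mesh, as used for plotting."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# interp(0.0) yields 0.5, a value f never takes: the jump is smeared into a
# spurious ramp, the kind of artifact the point-based method above targets.
```

The same smearing occurs along the discontinuity curve of a bivariate function plotted as a mesh surface, which is why the proposed method refines the mesh near the set of points where the function may be discontinuous.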
Abstract: The size, complexity and number of databases used
for protein information have caused bioinformatics to lag behind in
adapting to the need to handle this distributed information.
Integrating all the information from different databases into one
database is a challenging problem. Our main research goal is to develop a tool which can be used to access and manipulate protein information from different databases. In our approach, we have integrated different databases, such as Swiss-Prot, PDB, InterPro, and EMBL, and transformed these databases from flat file format into relational form using XML and BioPerl. As a result, we show that this tool can search protein information of different sizes stored in the relational database, and that results can be retrieved faster than from a flat file database. A web-based user interface is provided to allow users to access and search for protein information in the local database.
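The flat-file-to-relational step can be sketched with the standard library's sqlite3 on a toy record. The ID/AC/DE tags and two-space separator below are illustrative only; the real Swiss-Prot format is far richer and is parsed with BioPerl and XML in the work described above.

```python
import sqlite3

# A toy flat-file record loosely shaped like a Swiss-Prot entry (the tags
# and separator are illustrative assumptions, not the actual format).
record = """ID  INS_HUMAN
AC  P01308
DE  Insulin precursor"""

def parse_record(text):
    """Split each 'TAG  value' line of the flat record into a dict."""
    fields = {}
    for line in text.splitlines():
        tag, _, value = line.partition("  ")
        fields[tag] = value.strip()
    return fields

conn = sqlite3.connect(":memory:")    # relational target database
conn.execute("CREATE TABLE protein (id TEXT, accession TEXT, description TEXT)")
f = parse_record(record)
conn.execute("INSERT INTO protein VALUES (?, ?, ?)",
             (f["ID"], f["AC"], f["DE"]))
row = conn.execute(
    "SELECT accession FROM protein WHERE id = 'INS_HUMAN'").fetchone()
```

Once records live in indexed relational tables, lookups like the SELECT above run without scanning the whole flat file, which is the speed-up the abstract reports.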
Abstract: This paper demonstrates the application of the craziness-based particle swarm optimization (CRPSO) technique to the design of an 8th-order low-pass Infinite Impulse Response (IIR) filter. CRPSO, a much-improved version of PSO, is a population-based global heuristic search algorithm which finds a near-optimal solution in terms of a set of filter coefficients. The effectiveness of this algorithm is justified by a comparative study with some well-established algorithms, namely the real-coded genetic algorithm (RGA) and particle swarm optimization (PSO). Simulation results affirm that the proposed CRPSO algorithm outperforms its counterparts not only in terms of output quality, i.e. sharpness at cut-off, pass-band ripple, stop-band ripple, and stop-band attenuation, but also in convergence speed, with assured stability.
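A minimal sketch of PSO with a "craziness" perturbation, understood here as occasionally re-randomizing a velocity component so particles can escape local optima. The inertia and acceleration constants, the craziness probability, and the sphere objective are all illustrative assumptions; in the paper the fitness would be the filter's frequency-response error over the coefficient vector.

```python
import random

def crpso(fitness, dim, n=20, iters=300, craziness=0.05, seed=3):
    """PSO with a 'craziness' step that occasionally re-randomizes a
    velocity component to help particles escape local optima."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                if rng.random() < craziness:   # the 'craziness' operator
                    vel[i][d] = rng.uniform(-1.0, 1.0)
                pos[i][d] += vel[i][d]
            if fitness(pos[i]) < fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pbest[i]) < fitness(gbest):
                    gbest = pbest[i][:]
    return gbest

def sphere(x):
    """Stand-in objective with its minimum at the origin."""
    return sum(v * v for v in x)

best = crpso(sphere, dim=4)
```

Because the global best is only ever replaced by an improvement, the craziness kicks add exploration without losing the best solution found so far.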
Abstract: Current image-based individual human recognition
methods, such as fingerprints, face, or iris biometric modalities
generally require a cooperative subject, views from certain aspects,
and physical contact or close proximity. These methods cannot
reliably recognize non-cooperating individuals at a distance in the
real world under changing environmental conditions. Gait, which
concerns recognizing individuals by the way they walk, is a relatively
new biometric without these disadvantages. The inherent gait
characteristic of an individual makes it irreplaceable and useful in
visual surveillance.
In this paper, an efficient gait recognition system for human identification is proposed, based on extracting two features, namely the width vector of the binary silhouette and the MPEG-7 region-based shape descriptors. In the proposed method, foreground objects, i.e., humans and other moving objects, are extracted by estimating background information with a Gaussian Mixture Model (GMM); subsequently, a median filtering operation is performed to remove noise from the background-subtracted image. A moving target classification algorithm, which relies on shape and boundary information, is used to separate human beings (i.e., pedestrians) from other foreground objects (viz., vehicles). Subsequently, the width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are taken as the feature vector. Next, Principal Component Analysis (PCA) is applied to the selected feature vector to reduce its dimensionality. The extracted feature vectors are used to train a Hidden Markov Model (HMM) for the identification of individuals. The proposed system is evaluated using several gait sequences, and the experimental results show the efficacy of the proposed algorithm.
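The PCA dimensionality-reduction step of the pipeline can be sketched by extracting the first principal axis via power iteration on the sample covariance matrix. The synthetic 2D data below are illustrative; in the system described above the input rows would be the silhouette width vectors and Angular Radial Transform coefficients.

```python
def pca_first_component(data, iters=200):
    """Return the first principal axis of `data` (rows of equal length)
    via power iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):              # power iteration on the covariance
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v, means

def project(row, v, means):
    """1D feature obtained by projecting a centered row onto the axis."""
    return sum((row[j] - means[j]) * v[j] for j in range(len(v)))

# Synthetic feature vectors lying near the line y = 2x
data = [(i, 2.0 * i + (-1) ** i * 0.1) for i in range(10)]
axis, means = pca_first_component(data)
```

Projecting each row with `project` replaces a high-dimensional feature vector with its coordinates along the leading axes, which is what keeps the downstream HMM training tractable.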
Abstract: Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, many real life optimization problems often
require finding optimal solution to complex high dimensional,
multimodal problems involving computationally very expensive
fitness function evaluations. Use of evolutionary algorithms in such
problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e. approximations of the actual fitness functions, and evaluate those instead. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the
architectures and use of such meta-modeling tools in an evolutionary
optimization context. We further present two evolutionary algorithm
frameworks which involve use of meta models for fitness function
evaluation. The first framework, namely the Dynamic Approximate
Fitness based Hybrid EA (DAFHEA) model [14] reduces
computation time by controlled use of meta-models (in this case
approximate model generated by Support Vector Machine
regression) to partially replace the actual function evaluation by
approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model, which does not take into account uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
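The general pattern of interleaving true evaluations with cheap surrogate predictions can be sketched on a 1D toy problem. Note the hedges: a nearest-neighbour surrogate stands in for the Support Vector Machine regression the DAFHEA frameworks actually use, and the quadratic objective, population sizes and refresh schedule are illustrative assumptions.

```python
import random

def expensive_fitness(x):
    """Stand-in for a costly simulation; true minimum at x = 3."""
    return (x - 3.0) ** 2

def surrogate(x, archive):
    """Cheap meta-model: value of the nearest truly evaluated sample.
    (DAFHEA uses Support Vector Machine regression here instead.)"""
    nearest = min(archive, key=lambda p: abs(p[0] - x))
    return nearest[1]

random.seed(7)
pop = [random.uniform(-10.0, 10.0) for _ in range(20)]
archive = [(x, expensive_fitness(x)) for x in pop]   # truly evaluated samples

for gen in range(60):
    children = [x + random.gauss(0.0, 0.5) for x in pop for _ in range(2)]
    if gen % 5 == 0:          # periodic true evaluation refreshes the meta-model
        scores = [expensive_fitness(c) for c in children]
        archive.extend(zip(children, scores))
    else:                     # otherwise score children on the cheap surrogate
        scores = [surrogate(c, archive) for c in children]
    ranked = sorted(zip(children, scores), key=lambda p: p[1])
    pop = [c for c, _ in ranked[:20]]   # truncation selection

best = min(pop, key=expensive_fitness)
```

Only one generation in five pays for true evaluations, yet the periodic refresh keeps the surrogate anchored to the real landscape, which is the cost-saving mechanism the frameworks above exploit.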