Abstract: This paper deals with the localization of wideband sources. We develop a new approach for estimating wideband source parameters. The method is based on higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; the sea-bottom noise is in fact assumed to be Gaussian. Thanks to the coherent signal subspace algorithm, based on the cumulant matrix of the received data instead of the cross-spectral matrix, the wideband correlated sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
Abstract: In this paper the design of maximally flat linear-phase finite impulse response (FIR) filters is considered. The problem is handled with two entirely different approaches. The first is a completely deterministic numerical approach in which the problem is formulated as a Linear Complementarity Problem (LCP). The other is based on a combination of a Markov Random Field (MRF) approach with a messy genetic algorithm (MGA). MRFs are a class of probabilistic models that have been applied for many years to the analysis of visual patterns and textures. Our objective is to establish MRFs as an interesting approach to modeling messy genetic algorithms. We establish a theoretical result that every genetic algorithm problem can be characterized in terms of an MRF model. This allows us to construct an explicit probabilistic model of the MGA fitness function and to introduce the Ising MGA. Experiments with the Ising MGA are less costly than those with the standard MGA, since far fewer computations are involved; the LCP requires the fewest computations of all. Results of the LCP, random search, random seeded search, MGA, and Ising MGA are discussed.
Abstract: This paper presents a useful sub-pixel image
registration method using line segments and a sub-pixel edge detector.
In this approach, straight line segments are first extracted from gray
images at the pixel level before applying the sub-pixel edge detector.
Next, all sub-pixel line edges are mapped onto the orientation-distance
parameter space to solve for line correspondence between images.
Finally, the registration parameters with sub-pixel accuracy are solved analytically via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, the approach is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when target images contain a sufficient number of line segments, which is true in many industrial problems.
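The paper's two line-based least-squares problems are not reproduced in the abstract; as a generic, hedged illustration of solving registration parameters by linear least squares, the sketch below estimates a 2-D similarity transform from corresponding points (the parameterization and point sets are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2-D similarity transform with parameters (a, b, tx, ty):
    x' = a*x - b*y + tx,  y' = b*x + a*y + ty."""
    A, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, -y, 1.0, 0.0]); rhs.append(u)
        A.append([y,  x, 0.0, 1.0]); rhs.append(v)
    # Overdetermined linear system solved in the least-squares sense
    params, *_ = np.linalg.lstsq(np.array(A), np.array(rhs), rcond=None)
    return params  # a, b, tx, ty
```

With noise-free correspondences the solve recovers the transform exactly; with noisy sub-pixel edge locations it returns the least-squares fit.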
Abstract: Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proven to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective is considered, namely the weighted sum of makespan and total completion time. The efficiency and effectiveness of a genetic algorithm depend on the optimization of its parameters, such as the crossover and mutation operators, crossover probability, and selection function. The effects of the GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of those parameters, have been studied using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. The experiments have been performed with different levels of the GA parameters, and analysis of variance has been performed to identify the parameters significant for minimizing makespan and total completion time simultaneously.
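The abstract does not give the exact fitness expression; a minimal sketch of a weighted-sum bi-criteria fitness, assuming independent tasks that run back-to-back on their assigned processor and an illustrative weight w, might look like:

```python
def bi_criteria_fitness(assignment, times, n_procs, w=0.5):
    """Weighted sum of makespan and total completion time.
    assignment[i] = processor of task i; times[i] = processing time of task i."""
    loads = [0.0] * n_procs
    completion = []
    for task, proc in enumerate(assignment):
        loads[proc] += times[task]      # tasks run back-to-back per processor
        completion.append(loads[proc])  # completion time of this task
    makespan = max(loads)
    total_completion = sum(completion)
    return w * makespan + (1 - w) * total_completion

# Example: 4 tasks on 2 processors
times = [3.0, 2.0, 4.0, 1.0]
print(bi_criteria_fitness([0, 1, 0, 1], times, 2, w=0.5))
```

A GA would minimize this value over candidate assignments; sweeping w trades makespan against total completion time.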
Abstract: Moisture is an important consideration in many contexts, ranging from irrigation, soil chemistry, golf courses, corrosion and erosion, road conditions, weather prediction, and livestock feed moisture levels to water seepage. Vegetation and crops always depend more on the moisture available at the root level than on the occurrence of precipitation. In this paper, the design of an instrument is discussed that measures the variation in the moisture content of soil. This is done by measuring the amount of water in the soil via the variation in its capacitance, using a capacitive sensor. The greatest advantage of a soil moisture sensor is reduced water consumption. The sensor can also be used to set lower and upper thresholds to maintain optimum soil moisture saturation, minimize wilting, and contribute to deeper plant root growth, reduced soil run-off/leaching, and less favorable conditions for insects and fungal diseases. The capacitance method is preferred because it provides the absolute amount of water content and can measure water content at any depth.
Abstract: MC (Management Control) and IC (Internal Control): what is the relationship? This empirical study of the definitions of MC and IC shows that, in the wider view of Internal Control and Management Control, attention is focused not only on the financial aspects but also, and increasingly, on the soft aspects of the business, such as culture, behaviour, standards, and values. Narrower views of Management Control focus mainly on the hard, financial aspects of business operation. The terms Management Control and Internal Control are often used interchangeably, and the results of this empirical study reveal that Management Control is part of Internal Control; there is no causal link between the two concepts. Based on the interpretations of the respondents, the term Management Control has moved from a broad term to a more limited one covering the soft aspects of influencing behaviour, performance measurement, incentives, and culture. This paper is an exploratory study based on qualitative research and on a qualitative matrix-method analysis of the thematic definitions of the terms Management Control and Internal Control.
Abstract: Validation of an automation system is an important issue. The goal is to check that the system under investigation, modeled by a Petri net, never enters the undesired states. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used for reachability analysis. The biggest problem with this approach is that it is impossible to generate the full occurrence graph of the system because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: an Automated Guided Vehicle (AGV) system and a traffic light system. Validation of these two systems took from 10 to 30 seconds on a PC, depending on the optimization parameters.
Abstract: For a given specific problem, efficient algorithms have long been a matter of study. However, an alternative approach orthogonal to this one exists, called reduction. In general, for a given specific problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem.
Abstract: In this paper, we present a novel approach to location systems for indoor environments. The key idea of our work is accurate distance estimation in a cricket-based location system using the A* algorithm. We also use a magnetic sensor to detect obstacles in the indoor environment. Finally, we suggest how this system can be used in various applications such as asset tracking and monitoring.
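The abstract does not detail how A* is combined with the cricket ranging data; as a generic illustration of the search itself, a standard A* shortest-path sketch on a 4-connected grid with obstacles (the grid representation and Manhattan heuristic are illustrative assumptions) is:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid; grid[r][c] == 1 marks an obstacle.
    Returns the path length in steps, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start)]
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g                      # admissible heuristic => optimal
        if g > best_g.get(node, float("inf")):
            continue                      # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

In an obstacle-aware location system, path length through the obstacle map (rather than straight-line distance) is what such a search would supply.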
Abstract: The development of signal compression algorithms is making impressive progress. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on decomposition of the original signal by means of analysis filters, followed by the compression stage, and, on the other hand, on the application of linear prediction of order 5 to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is then coded and transmitted. The decoding operation is then used to reconstruct the original signal. Thus, an adequate choice of the filter bank for the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
Abstract: In the policy discourse of the 1990s, more inclusive spaces have been constructed for realizing full and meaningful participation of common people in education. These participatory spaces provide an alternative possibility for universalizing elementary education against the backdrop of a history of entrenched forms of social and economic exclusion, inequitable education provision, and the shrinking role of the state in today's neo-liberal times. Drawing on case studies of bottom-up approaches to school governance, the study examines an array of innovative ways through which poor people gained a sense of identity and agency by evolving indigenous solutions to issues regarding the schooling of their children. In the process, the state's institutions and practices became more accountable and responsive to the educational concerns of marginalized people. Deliberative participation emerged as an active way of experiencing deeper forms of empowerment and democracy than their passive realization as mere bearers of citizen rights.
Abstract: The Internet today has a huge impact on all aspects of life, including the broader context of democracy, politics, and politicians. If democracy is freedom of choice, there are a number of conditions that can ensure in practice that this freedom is achieved and realized. These preconditions must be met regardless of the manner of voting. The key contribution of ICT to achieving freedom of choice is that technology enables a better connection between citizens and their elected representatives than was possible without the Internet. In this sense, we can say that the Internet and ICT are significantly changing, and potentially improving, the environment in which democratic processes take place. This paper aims to describe trends in the use of ICT in democratic processes and analyzes the challenges of implementing e-Democracy in Montenegro.
Abstract: Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. The time series correlation and persistency, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
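The abstract does not specify the wavelet family or thresholding rule used for preprocessing; a minimal one-level Haar soft-thresholding sketch (the threshold value is an illustrative assumption) is:

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet soft-threshold denoising.
    x must be a float array of even length."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    # Soft thresholding shrinks small detail coefficients to zero
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)         # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y
```

Practical river-flow preprocessing would typically use a multi-level decomposition and a data-driven threshold, but the shrink-then-reconstruct structure is the same.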
Abstract: This paper focuses on enhanced stiffness modeling of robotic manipulators that takes into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example dealing with the stiffness analysis of a parallel manipulator of the Orthoglide family.
Abstract: Web usage mining algorithms have been widely utilized for modeling user web navigation behavior. In this study we advance a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm. The EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models where the model depends on unobserved latent variables. The experimental results show that by decreasing the number of clusters, the log-likelihood converges toward lower values, and that the probability of the largest cluster decreases as the number of clusters increases in each treatment.
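The abstract does not give implementation details of the EM clustering; as a compact illustration of the alternating E- and M-steps and the tracked log-likelihood, a sketch of EM for a one-dimensional Gaussian mixture (the component count, iteration count, initialization, and variance regularizer are illustrative choices) is:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """EM for a one-dimensional Gaussian mixture.
    Returns component means and the log-likelihood history."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initial means
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    hist = []
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2.0 * np.pi * var)
        hist.append(np.log(dens.sum(axis=1)).sum())  # data log-likelihood
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return mu, hist
```

The log-likelihood history is the quantity the abstract discusses: it is non-decreasing over iterations for a fixed number of clusters.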
Abstract: This paper proposes a new decision-making approach based on quantitative possibilistic influence diagrams, which are an extension of standard influence diagrams in the possibilistic framework. We treat in particular the case where several expert opinions relative to value nodes are available. An initial expert assigns confidence degrees to the other experts and fixes a similarity threshold that the provided possibility distributions should respect. To illustrate our approach, an evaluation algorithm for these multi-source possibilistic influence diagrams is also proposed.
Abstract: Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for faulty conditions is essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.
Abstract: This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize total makespan for scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and implemented in Matlab® to solve this problem. The paper explains the steps of applying the Ant Colony approach to the problem of minimizing makespan for the worker assignment and job scheduling problem in a parallel machine model, and is aimed at evaluating the strength of ACO as compared with other conventional approaches. One data set containing 100 problems (12 jobs, 3 machines, and 10 workers), which is available on the internet, has been solved with this ACO algorithm. Our ACO-based algorithm showed drastically improved results, especially in terms of the negligible CPU effort required to reach the optimal solution. In our case, the time taken to solve all 100 problems is less than the average time taken to solve one problem in the data set by other conventional approaches such as the GA algorithm and SPT-A/LMC heuristics.
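The paper's full ACO (including the worker assignment dimension) is not reproduced here; a toy sketch of the core idea, pheromone-guided assignment of jobs to identical machines to minimize makespan, with illustrative parameter values, could read:

```python
import random

def aco_makespan(times, n_machines, n_ants=20, iters=50, rho=0.1, seed=42):
    """Toy ACO: assign jobs to identical machines to minimize makespan.
    Pheromone tau[j][m] biases assigning job j to machine m."""
    random.seed(seed)
    n_jobs = len(times)
    tau = [[1.0] * n_machines for _ in range(n_jobs)]
    best, best_span = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            loads = [0.0] * n_machines
            assign = []
            for j in range(n_jobs):
                # Desirability: pheromone * heuristic (prefer lightly loaded machines)
                w = [tau[j][m] / (1.0 + loads[m]) for m in range(n_machines)]
                m = random.choices(range(n_machines), weights=w)[0]
                assign.append(m)
                loads[m] += times[j]
            span = max(loads)
            if span < best_span:
                best, best_span = assign, span
        # Evaporate everywhere, then reinforce along the best assignment so far
        for j in range(n_jobs):
            for m in range(n_machines):
                tau[j][m] *= 1.0 - rho
            tau[j][best[j]] += 1.0 / best_span
    return best, best_span

assign, span = aco_makespan([4, 3, 3, 2, 2, 2], 3)
print(span)
```

The real problem adds worker-dependent and machine-dependent speeds, but the construct-evaluate-reinforce loop is the same.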
Abstract: Single-modality biometric recognition cannot meet the high performance requirements of most cases as its applications become broader. Multimodal biometric identification represents an emerging trend. This paper investigates a novel algorithm based on the fusion of fingerprint and finger-vein biometrics. For both modalities, we employ the Monogenic Local Binary Pattern (MonoLBP). This operator integrates the original LBP (Local Binary Pattern) with two other rotation-invariant measures: the local phase and the local surface type. Experimental results confirm that the proposed weighted-sum fusion achieves excellent identification performance compared with unimodal biometric systems. The AUC of the proposed approach combining the two modalities is close to unity (0.93).
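The weighted-sum fusion can be sketched at the match-score level; the weight and the example scores below are illustrative assumptions, not values from the paper:

```python
def fuse_scores(fp_score, fv_score, w=0.6):
    """Weighted-sum score fusion of fingerprint and finger-vein match scores.
    Scores are assumed normalized to [0, 1]; w is an illustrative weight."""
    return w * fp_score + (1 - w) * fv_score

# A comparison with strong fingerprint but weaker finger-vein evidence:
print(fuse_scores(0.9, 0.6))
```

A decision threshold on the fused score (or an AUC computed over genuine and impostor fused scores) then gives the multimodal operating point.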
Abstract: The Siemens Healthcare Sector is one of the world's
largest suppliers to the healthcare industry and a trendsetter in
medical imaging and therapy, laboratory diagnostics, medical
information technology, and hearing aids.
Siemens offers its customers products and solutions for the entire
range of patient care from a single source – from prevention and
early detection to diagnosis, and on to treatment and aftercare. By
optimizing clinical workflows for the most common diseases,
Siemens also makes healthcare faster, better, and more cost effective.
The optimization of clinical workflows requires a
multidisciplinary focus and a collaborative approach of e.g. medical
advisors, researchers and scientists as well as healthcare economists.
This new form of collaboration brings together experts with deep
technical experience, physicians with specialized medical knowledge
as well as people with comprehensive knowledge about health
economics.
As Charles Darwin is often quoted as saying, “It is neither the strongest of the species that survive, nor the most intelligent, but the one most responsive to change.” We believe that those who can successfully manage this change will emerge as winners, with valuable competitive advantage.
Current medical information and knowledge are some of the core
assets in the healthcare industry. The main issue is to connect
knowledge holders and knowledge recipients from various
disciplines efficiently in order to spread and distribute knowledge.