Abstract: Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. The correlation and persistency of the time series, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions are also investigated through a fractal assessment.
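A minimal sketch of the wavelet-based preprocessing idea: a single-level Haar decomposition with soft thresholding of the detail coefficients. The function names and the fixed threshold are illustrative choices, not the transform or threshold rule used in the paper.

```python
# Minimal single-level Haar wavelet soft-threshold denoising sketch.
# The threshold value and signal are illustrative assumptions.

def haar_decompose(x):
    """Split an even-length signal into approximation and detail coefficients."""
    approx = [(x[2*i] + x[2*i+1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small ones (likely noise) vanish."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def haar_reconstruct(approx, detail):
    x = []
    for a, d in zip(approx, detail):
        x.extend([a + d, a - d])
    return x

def denoise(signal, threshold=0.5):
    a, d = haar_decompose(signal)
    return haar_reconstruct(a, soft_threshold(d, threshold))

# Example: a step signal with small sample-to-sample noise.
noisy = [1.0, 1.2, 1.0, 1.0, 5.0, 5.0, 5.0, 5.2]
clean = denoise(noisy)
```

After thresholding, the small high-frequency fluctuations are removed while the step structure of the flow series is preserved.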
Abstract: The paper focuses on the enhanced stiffness modeling of robotic manipulators by taking into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example that deals with the stiffness analysis of a parallel manipulator of the Orthoglide family.
Abstract: Web usage mining algorithms have been widely utilized for modeling user web navigation behavior. In this study we advance a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm. An EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models, where the model depends on unobserved latent variables. The experimental results show that as the number of clusters decreases, the log-likelihood converges toward lower values, while the probability of the largest cluster decreases as the number of clusters increases in each treatment.
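To make the EM step concrete, here is a toy EM fit of a two-cluster mixture of Bernoulli page-visit profiles, a plausible stand-in for navigation sessions. The data, cluster count, and iteration budget are illustrative assumptions, not the paper's setup.

```python
import math
import random

# Toy EM for a two-cluster mixture of Bernoulli page-visit profiles.
# Each session is a binary vector over four pages (illustrative data).
sessions = [
    [1, 1, 0, 0], [1, 1, 0, 0], [1, 0, 0, 0],   # "news" users
    [0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1],   # "shop" users
]

K, D = 2, 4
random.seed(0)
pi = [0.5, 0.5]                                  # mixing weights
theta = [[random.uniform(0.3, 0.7) for _ in range(D)] for _ in range(K)]

for _ in range(50):
    # E-step: responsibility of each cluster for each session.
    resp = []
    for x in sessions:
        lik = []
        for k in range(K):
            p = pi[k]
            for j in range(D):
                p *= theta[k][j] if x[j] else (1 - theta[k][j])
            lik.append(p)
        s = sum(lik)
        resp.append([l / s for l in lik])
    # M-step: re-estimate mixing weights and Bernoulli parameters.
    for k in range(K):
        nk = sum(r[k] for r in resp)
        pi[k] = nk / len(sessions)
        theta[k] = [sum(r[k] * x[j] for r, x in zip(resp, sessions)) / nk
                    for j in range(D)]

# Log-likelihood of the data under the fitted mixture.
log_lik = sum(math.log(sum(
    pi[k] * math.prod(theta[k][j] if x[j] else 1 - theta[k][j]
                      for j in range(D))
    for k in range(K))) for x in sessions)
```

On this toy data the two fitted profiles separate the "news" and "shop" sessions, and the log-likelihood can be compared across different cluster counts, which is the quantity tracked in the abstract.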
Abstract: This paper proposes a new decision making approach based on quantitative possibilistic influence diagrams, which are an extension of standard influence diagrams in the possibilistic framework. We will in particular treat the case where several expert opinions relative to value nodes are available. An initial expert assigns confidence degrees to the other experts and fixes a similarity threshold that the provided possibility distributions should respect. To illustrate our approach, an evaluation algorithm for these multi-source possibilistic influence diagrams will also be proposed.
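A small sketch of the similarity-screening step described above: the initial expert keeps only those expert-supplied possibility distributions whose similarity to a reference distribution reaches the fixed threshold. The similarity measure (one minus the largest pointwise difference) and all numbers are illustrative assumptions, not the paper's definitions.

```python
# Screening expert possibility distributions by a similarity threshold.
# The similarity measure and the values below are illustrative only.

def similarity(p, q):
    """Similarity of two possibility distributions on the same domain."""
    return 1.0 - max(abs(a - b) for a, b in zip(p, q))

reference = [1.0, 0.7, 0.2]          # initial expert's distribution
experts = {
    "expert_A": [1.0, 0.6, 0.3],     # close to the reference
    "expert_B": [0.1, 1.0, 0.9],     # disagrees strongly
}
threshold = 0.8

accepted = {name: dist for name, dist in experts.items()
            if similarity(reference, dist) >= threshold}
```

Only distributions passing the threshold would then enter the multi-source evaluation.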
Abstract: Detection of incipient abnormal events is important to
improve safety and reliability of machine operations and reduce losses
caused by failures. Improper set-up or misalignment of parts often leads to
severe problems in many machines. The construction of prediction
models for predicting faulty conditions is quite essential in making
decisions on when to perform machine maintenance. This paper
presents a multivariate calibration monitoring approach based on the
statistical analysis of machine measurement data. The calibration
model is used to predict two faulty conditions from historical reference
data. This approach utilizes genetic algorithms (GA) based variable
selection, and we evaluate the predictive performance of several
prediction methods using real data. The results show that the
calibration model based on supervised probabilistic principal
component analysis (SPPCA) yielded the best performance in this work.
By adopting a proper variable selection scheme in calibration models,
the prediction performance can be improved by excluding
non-informative variables from the model-building step.
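The GA-based variable selection can be sketched with bit-mask chromosomes scored by prediction error. The trivial model (predicting the target as the sum of the selected variables), the toy data, and the GA parameters are illustrative assumptions; the paper pairs GA selection with calibration models such as SPPCA.

```python
import random

# Toy GA variable selection: chromosomes are bit masks over candidates;
# fitness is the squared error of a trivial sum-of-selected-variables model.
random.seed(1)
n_vars = 4
# The target depends only on variables 0 and 1; 2 and 3 are noise.
data = [([1.0, 2.0, 9.0, -3.0], 3.0),
        ([2.0, 1.0, -7.0, 4.0], 3.0),
        ([0.5, 0.5, 3.0, 8.0], 1.0)]

def fitness(mask):
    err = 0.0
    for x, y in data:
        pred = sum(v for v, m in zip(x, mask) if m)
        err += (pred - y) ** 2
    return err  # lower is better

def mutate(mask):
    child = list(mask)
    child[random.randrange(n_vars)] ^= 1   # flip one random bit
    return child

population = [[random.randint(0, 1) for _ in range(n_vars)] for _ in range(8)]
for _ in range(40):
    population.sort(key=fitness)
    parents = population[:4]                      # elitist selection
    population = parents + [mutate(random.choice(parents)) for _ in range(4)]

best = min(population, key=fitness)
```

The GA converges toward the mask selecting only the informative variables, mirroring the exclusion of non-informative variables described above.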
Abstract: This article proposes an Ant Colony Optimization
(ACO) metaheuristic to minimize the total makespan for scheduling a set
of jobs and assigning workers on uniformly related parallel machines.
An algorithm based on ACO has been developed and implemented in
Matlab® to solve this problem. The paper
explains various steps to apply Ant Colony approach to the problem
of minimizing makespan for the worker assignment & jobs
scheduling problem in a parallel machine model and is aimed at
evaluating the strength of ACO as compared to other conventional
approaches. One data set containing 100 problems (12 jobs, 3
machines, and 10 workers), which is available on the internet, has been
taken and solved with this ACO algorithm. Our ACO-based algorithm
showed drastically improved results, especially in terms of the
negligible CPU effort required to reach the optimal solution. In our
case, the time taken to solve all 100 problems is even less than the
average time taken to solve one problem in the data set by other
conventional approaches such as genetic algorithms (GA) and the
SPT-A/LMC heuristics.
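A compact sketch of the ACO mechanics for makespan minimization on parallel machines: each ant assigns jobs to machines with a pheromone-biased choice, and the best-so-far schedule reinforces its pheromone trail. The instance, number of ants, and evaporation rate are illustrative assumptions, not the paper's benchmark set or tuning.

```python
import random

# Minimal ACO sketch: assign jobs to parallel machines to minimize makespan.
random.seed(3)
jobs = [4, 3, 3, 2, 2, 2]                  # processing times (illustrative)
n_machines = 3
tau = [[1.0] * n_machines for _ in jobs]   # pheromone: job -> machine

def build_assignment():
    """One ant assigns each job to a machine, biased by pheromone."""
    loads = [0.0] * n_machines
    assignment = []
    for j in range(len(jobs)):
        weights = tau[j]
        r = random.uniform(0, sum(weights))
        m, acc = 0, weights[0]
        while r > acc:
            m += 1
            acc += weights[m]
        assignment.append(m)
        loads[m] += jobs[j]
    return assignment, max(loads)

best_assign, best_makespan = build_assignment()
for _ in range(100):
    for _ in range(10):                    # 10 ants per iteration
        assign, mk = build_assignment()
        if mk < best_makespan:
            best_assign, best_makespan = assign, mk
    # Evaporate pheromone, then reinforce the best-so-far assignment.
    for j in range(len(jobs)):
        tau[j] = [0.9 * t for t in tau[j]]
        tau[j][best_assign[j]] += 1.0 / best_makespan
```

For this tiny instance the lower bound on the makespan is 6 (total work 16 on 3 machines), and the colony settles on a balanced schedule.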
Abstract: Single-modality biometric recognition cannot meet high performance requirements in most cases, as its applications become increasingly broad. Multimodal biometric identification has recently emerged as a trend. This paper investigates a novel algorithm based on the fusion of fingerprint and finger-vein biometrics. For both biometric modalities, we employ the Monogenic Local Binary Pattern (MonoLBP). This operator integrates the original LBP (Local Binary Pattern) with two other rotation-invariant measures: the local phase and the local surface type. Experimental results confirm that the proposed weighted-sum fusion achieves excellent identification performance compared with unimodal biometric systems. The AUC of the proposed approach based on combining the two modalities is close to unity (0.93).
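The weighted-sum fusion itself is a one-line rule; the sketch below shows it on normalized matcher scores. The weights and decision threshold are illustrative assumptions, not the values tuned in the paper.

```python
# Weighted-sum score fusion of two biometric matchers.
# Scores are assumed normalized to [0, 1]; weights are illustrative.

def fuse(finger_score, vein_score, w_finger=0.6, w_vein=0.4):
    """Combine the two matcher scores into a single decision score."""
    return w_finger * finger_score + w_vein * vein_score

def decide(finger_score, vein_score, threshold=0.5):
    return fuse(finger_score, vein_score) >= threshold

# A strong fingerprint match can compensate for a weaker vein match.
genuine = fuse(0.9, 0.4)    # 0.6*0.9 + 0.4*0.4 = 0.70
impostor = fuse(0.2, 0.3)   # 0.6*0.2 + 0.4*0.3 = 0.24
```

Sweeping the threshold over fused scores is what produces the ROC curve whose area (AUC) is reported in the abstract.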
Abstract: The Siemens Healthcare Sector is one of the world's
largest suppliers to the healthcare industry and a trendsetter in
medical imaging and therapy, laboratory diagnostics, medical
information technology, and hearing aids.
Siemens offers its customers products and solutions for the entire
range of patient care from a single source – from prevention and
early detection to diagnosis, and on to treatment and aftercare. By
optimizing clinical workflows for the most common diseases,
Siemens also makes healthcare faster, better, and more cost effective.
The optimization of clinical workflows requires a
multidisciplinary focus and a collaborative approach of e.g. medical
advisors, researchers and scientists as well as healthcare economists.
This new form of collaboration brings together experts with deep
technical experience, physicians with specialized medical knowledge
as well as people with comprehensive knowledge about health
economics.
As Charles Darwin is often quoted as saying, “It is neither the
strongest of the species that survive, nor the most intelligent, but the
one most responsive to change.” We believe that those who can
successfully manage this change will emerge as winners, with a
valuable competitive advantage.
Current medical information and knowledge are some of the core
assets in the healthcare industry. The main issue is to connect
knowledge holders and knowledge recipients from various
disciplines efficiently in order to spread and distribute knowledge.
Abstract: Human Computer Interaction (HCI) is an emerging field that draws in experts from various fields to enhance the application of computer programs and the ease of use for computer users. HCI has much to do with learning and cognition, and an emerging approach to learning and problem-solving is problem-based learning (PBL). The processes of PBL involve important cognitive functions in their various stages. This paper illustrates how the closely related fields of HCI, PBL and cognitive psychology can benefit from informing each other through analysing various cognitive functions. Several cognitive functions from the cognitive function disc (CFD) are presented and discussed in relation to the human-computer interface. The paper concludes with the implications of bridging the gaps amongst these disciplines.
Abstract: This paper examines several mathematical methods for modeling the hourly price forward curve (HPFC). The model is constructed using numerous regression methods, such as polynomial regression, radial basis function neural networks, and a Fourier series. The goodness of fit of the models is examined by means of statistical and graphical tools. The criterion for choosing the model is the minimization of the Root Mean Squared Error (RMSE); using a correlation analysis approach for the regression analysis, the optimal model is determined, one that is robust against model misspecification. A supervised learning technique is employed to determine the optimal parameters corresponding to each measure of overall loss. Using all the numerical methods mentioned previously, the explicit expressions for the optimal model are derived and the optimal designs are implemented.
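The RMSE selection criterion can be sketched directly: score several candidate curve shapes for an hourly price profile and keep the best. The candidate functions and the synthetic 24-hour price profile below are illustrative stand-ins for the fitted regression models, not the paper's data.

```python
import math

# Model selection by RMSE over candidate hourly-price curve shapes.
hours = list(range(24))
prices = [30 + 10 * math.sin(2 * math.pi * h / 24) for h in hours]

candidates = {
    "flat":    lambda h: 30.0,
    "linear":  lambda h: 25.0 + 0.4 * h,
    "fourier": lambda h: 30.0 + 10.0 * math.sin(2 * math.pi * h / 24),
}

def rmse(model):
    """Root mean squared error of a candidate model on the profile."""
    return math.sqrt(sum((model(h) - p) ** 2
                         for h, p in zip(hours, prices)) / len(hours))

scores = {name: rmse(f) for name, f in candidates.items()}
best_model = min(scores, key=scores.get)
```

Here the Fourier-shaped candidate matches the daily cycle and wins, illustrating why periodic terms suit hourly price curves.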
Abstract: Focusing on environmental issues, including the reduction of scrap and consumer residuals, along with benefiting from the economic value generated during the life cycle of goods/products, gives companies an important competitive advantage. The aim of this paper is to present a new mixed nonlinear facility location-allocation model for recycling collection networks that considers multiple echelons, suppliers, collection centers and facilities in the recycling network. To make the decision appropriate to reality, demands, returns, capacities, costs and distances are regarded as uncertain in our model. For this purpose, a fuzzy mathematical programming-based possibilistic approach from the recent literature is introduced as a solution methodology to solve the proposed mixed nonlinear programming (MNLP) model. Computational experiments are provided to illustrate the applicability of the designed model in a supply chain environment and to help decision makers facilitate their analysis.
Abstract: MicroRNAs (miRNAs) are small, non-coding and
regulatory RNAs about 20 to 24 nucleotides long. Their conserved
nature among various organisms makes them a good source for the
discovery of new miRNAs by a comparative genomics approach. The
study resulted in 21 miRNAs of 20 pre-miRNAs belonging to 16
families (miR156, 157, 158, 164, 165, 168, 169, 172, 319, 390, 393,
394, 395, 400, 472 and 861) in evergreen spruce tree (Picea). The
miRNA families miR157, 158, 164, 165, 168, 169, 319, 390, 393,
394, 400, 472 and 861 are reported for the first time in Picea. All
20 miRNA precursors form stable minimum free energy stem-loop
structure as their orthologues form in Arabidopsis and the mature
miRNA resides in the stem portion of the stem-loop structure. Sixteen
(16) miRNAs are from Picea glauca and five (5) belong to Picea
sitchensis. Their targets consist of transcription factors, growth-related,
stress-related and hypothetical proteins.
Abstract: The triumph of inductive neuro-stimulation since its rediscovery in the 1980s has been quite spectacular. In many branches ranging from clinical applications to basic research, this technique is absolutely indispensable. Nevertheless, the basic knowledge about the processes underlying the stimulation effect is still very rough and rarely refined in a quantitative way. This seems to be not only an inexcusable blank spot in biophysics and for stimulation prediction, but also a fundamental hindrance to technological progress. The already very sophisticated devices have reached a stage where further optimization requires better strategies than those provided by simple linear membrane models of the integrate-and-fire style. Addressing this problem for the first time, we suggest in the following text a way of performing virtual quantitative analysis of a stimulation system. Concomitantly, this ansatz seems to provide a route towards a better understanding by using nonlinear signal processing and taking the nerve as a filter that is adapted for neuronal magnetic stimulation. The model is compact and easy to adjust. The whole setup behaved very robustly during all performed tests. As an example, a recent innovative stimulator design known as cTMS is analyzed and dimensioned with this approach. The results show hitherto unforeseen potentials.
Abstract: As wireless sensor networks are energy-constrained networks,
energy efficiency of sensor nodes is the main design issue.
Clustering of nodes is an energy efficient approach. It prolongs the
lifetime of wireless sensor networks by avoiding long distance communication.
Clustering algorithms operate in rounds. Performance of
clustering algorithm depends upon the round time. A large round
time consumes more energy of cluster heads while a small round
time causes frequent re-clustering. Existing clustering algorithms
therefore apply a trade-off to the round time and calculate it from the
initial parameters of the network. However, it is not appropriate to use
an initial-parameters-based round time value throughout the network
lifetime, because wireless sensor networks are dynamic in nature (nodes
can be added to the network or run out of energy). In this paper
a variable round time approach is proposed that calculates round
time depending upon the number of active nodes remaining in the
field. The proposed approach makes the clustering algorithm adaptive
to network dynamics. For simulation the approach is implemented
with LEACH in NS-2 and the results show that there is 6% increase
in network lifetime, 7% increase in 50% node death time and 5%
improvement over the data units gathered at the base station.
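The core idea, recomputing the round time from the number of nodes still alive rather than from the initial parameters, can be sketched as below. The linear scaling rule and all constants are illustrative assumptions, not the exact formula used in the paper.

```python
# Variable round time: shrink the round as the network loses nodes.
# The scaling rule and constants are illustrative assumptions.

def round_time(active_nodes, initial_nodes=100, base_round=20.0,
               min_round=5.0):
    """Round length proportional to surviving nodes, with a floor."""
    scaled = base_round * active_nodes / initial_nodes
    return max(scaled, min_round)

t_start = round_time(100)   # full network
t_mid = round_time(50)      # half the nodes remain
t_end = round_time(10)      # few nodes left: clamped to the floor
```

A shorter round when few nodes remain re-balances cluster-head duty more often, which is the adaptivity the abstract attributes to the 6% lifetime gain.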
Abstract: Compensating physiological motion in the context
of minimally invasive cardiac surgery has become an attractive
issue since it outperforms traditional cardiac procedures offering
remarkable benefits. Owing to space restrictions, computer vision
techniques have proven to be the most practical and suitable solution.
However, the lack of robustness and efficiency of existing methods
make physiological motion compensation an open and challenging
problem. This work focuses on increasing robustness and efficiency
via exploration of the classes of ℓ1- and ℓ2-regularized optimization,
emphasizing the use of explicit regularization. Both approaches are
based on natural features of the heart using intensity information.
Results pointed out the ℓ1-regularized optimization class as the best
since it offered the shortest computational cost, the smallest average
error and it proved to work even under complex deformations.
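The qualitative difference between the two regularizer classes shows up already in their proximal (shrinkage) updates: ℓ1 soft-thresholds coefficients to exact zeros, while ℓ2 only scales them down. This is a generic illustration of explicit regularization, not the paper's tracking formulation.

```python
# Proximal updates distinguishing l1 and l2 regularization.

def prox_l1(v, lam):
    """Soft threshold: promotes sparsity (exact zeros)."""
    return [max(abs(x) - lam, 0.0) * (1 if x >= 0 else -1) for x in v]

def prox_l2(v, lam):
    """Uniform shrinkage: keeps all coefficients, just smaller."""
    return [x / (1.0 + lam) for x in v]

v = [3.0, -0.2, 0.05]
sparse = prox_l1(v, 0.5)    # small entries become exactly zero
smooth = prox_l2(v, 0.5)    # every entry shrinks toward zero
```

Sparsity is one reason an ℓ1-regularized formulation can be cheaper per iteration: many coefficients drop out of the computation entirely.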
Abstract: In recent years, environmental regulations have been forcing
manufacturers to consider recovery activities for end-of-life and/or
returned products, for refurbishing, recycling, remanufacturing/repair
and disposal, in supply chain management. In this paper, a mathematical
model is formulated for a single-product production-inventory system
considering remanufacturing/reuse of returned products, where the rate
of returned products follows a demand-like function dependent on the
purchasing price and the acceptable quality level.
It is useful in decision making to determine whether to go for
remanufacturing or disposal of returned products along with newly
produced products to satisfy a stationary demand. In addition, a
modified genetic algorithm approach is proposed, inspired by particle
swarm optimization method. Numerical analysis of the case study is
carried out to validate the model.
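One way to read "a genetic algorithm inspired by particle swarm optimization" is a population update that nudges candidates toward the best-so-far solution, mimicking the global-best attraction of PSO. The sketch below illustrates that idea on a one-variable objective; the objective, step sizes, and iteration count are illustrative assumptions, not the paper's hybrid.

```python
import random

# PSO-inspired population update: move candidates toward the global
# best plus a small random perturbation (illustrative objective).
random.seed(7)

def objective(x):
    return (x - 4.0) ** 2          # minimized at x = 4

population = [random.uniform(-10, 10) for _ in range(6)]
global_best = min(population, key=objective)

for _ in range(200):
    children = []
    for x in population:
        # Move partway toward the global best, plus a random step.
        child = x + 0.5 * (global_best - x) + random.gauss(0, 0.1)
        children.append(child)
    population = children
    candidate = min(population, key=objective)
    if objective(candidate) < objective(global_best):
        global_best = candidate
```

The global-best attraction replaces blind mutation with directed search, which is the usual motivation for hybridizing GA with PSO ideas.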
Abstract: ICA, which is generally used for the blind source separation problem, has been tested for feature extraction in a speech recognition system to replace the phoneme-based approach of MFCC. Applying the generated cepstral coefficients to ICA as preprocessing has produced a new signal processing approach. This gives much better results than MFCC and ICA separately, both for word and speaker recognition. The mixing matrix A is different before and after MFCC, as expected, since the Mel scale is nonlinear. However, cepstral coefficients generated from Linear Predictive Coding, being independent, prove to be the right candidate for ICA. Matlab is the tool used for all comparisons. The database used consists of samples from ISOLET.
Abstract: In this paper we introduce the notion of a protein interaction
network. This is a graph whose vertices are the protein's
amino acids and whose edges are the interactions between them.
Using a graph theory approach, we identify a number of properties of
these networks. We compare them to the general small-world network
model and we analyze their hierarchical structure.
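The graph representation and two of the properties used in small-world analysis (degree and local clustering coefficient) can be sketched as follows. The tiny edge list is an illustrative stand-in for real residue-interaction data.

```python
from collections import defaultdict

# Residues as vertices, interactions as edges (illustrative data).
edges = [("A1", "A2"), ("A2", "A3"), ("A1", "A3"), ("A3", "A4")]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def clustering(node):
    """Fraction of a node's neighbor pairs that are themselves linked."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

degrees = {n: len(adj[n]) for n in adj}
c_a3 = clustering("A3")     # A3's neighbors: A1, A2, A4; one linked pair
```

A high average clustering coefficient combined with short path lengths is the signature compared against the small-world model mentioned above.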
Abstract: Construction of tunnels is connected with high
uncertainty in the field of costs, construction period, safety and
impact on surroundings. Risk management became therefore a
common part of tunnel projects, especially after a set of fatal
collapses occurred in the 1990s. Such collapses are usually caused by a
combination of factors that can be divided into three main groups, i.e.
unfavourable geological conditions, failures in the design and
planning or failures in the execution.
This paper suggests a procedure enabling quantification of the
excavation risk related to extraordinary accidents using FTA and
ETA tools. It will elaborate on a common process of risk analysis and
enable the transfer of information and experience between particular
tunnel construction projects. Further, it gives a guide for designers,
management and other participants, how to deal with risk of such
accidents and how to make qualified decisions based on a
probabilistic approach.
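The FTA side of such a quantification reduces to propagating basic-event probabilities through OR and AND gates. The sketch below shows the mechanics; the events, gate structure, and probabilities are illustrative, not calibrated tunnelling data.

```python
# Minimal fault-tree evaluation: OR/AND gates over independent events.

def or_gate(probs):
    """P(at least one event occurs), events assumed independent."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), events assumed independent."""
    p = 1.0
    for q in probs:
        p *= q
    return p

p_geology = or_gate([0.02, 0.01])        # unfavourable geological conditions
p_design = or_gate([0.01, 0.005])        # failures in design and planning
p_execution = 0.03                        # failures in execution
# Illustrative top event: a triggering condition AND an execution failure.
p_trigger = or_gate([p_geology, p_design])
p_collapse = and_gate([p_trigger, p_execution])
```

ETA would then branch from the top event over mitigation outcomes; chaining the two tools gives the accident-frequency estimate the procedure above is built on.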
Abstract: The connected dominating set (CDS) problem in unit disk graphs has a significant impact on the efficient design of routing protocols in wireless sensor networks, where the searching space for a route is reduced to the nodes in the set. A set is dominating if all the nodes in the system are either in the set or neighbors of nodes in the set. In this paper, a simple and efficient heuristic method is proposed for finding a minimum connected dominating set (MCDS) in ad hoc wireless networks, based on the new parameter support of vertices. With this parameter the proposed heuristic approach effectively finds the MCDS of a graph. Extensive computational experiments show that the proposed approach outperforms the recently proposed heuristics found in the literature for the MCDS.