Abstract: Validation of an automation system is an important issue. The goal is to check that the system under investigation, modeled by a Petri net, never enters undesired states. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used for reachability analysis. The biggest problem with this approach is that the full occurrence graph of the system is often too large to generate. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: an Automated Guided Vehicle (AGV) system and a traffic light system. Validation of these two systems took 10 to 30 seconds on a PC, depending on the optimization parameters.
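The reachability question at the core of this validation can be illustrated with a toy sketch: a breadth-first exploration of a small Petri net's markings, checking that no undesired marking is reachable. The net, its transitions, and the undesired marking below are illustrative stand-ins, not the AGV or traffic light models from the paper.

```python
from collections import deque

# Each transition is a (consume, produce) pair of token vectors over 3 places.
transitions = [((1, 0, 0), (0, 1, 0)),   # t1: move token p0 -> p1
               ((0, 1, 0), (0, 0, 1))]   # t2: move token p1 -> p2

def reachable_markings(initial):
    """Breadth-first search over the occurrence graph."""
    seen, queue = {initial}, deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(m[i] >= consume[i] for i in range(len(m))):  # enabled?
                new = tuple(m[i] - consume[i] + produce[i] for i in range(len(m)))
                if new not in seen:
                    seen.add(new)
                    queue.append(new)
    return seen

markings = reachable_markings((1, 0, 0))
undesired = (1, 1, 1)
print(sorted(markings), undesired not in markings)
```

For nets of realistic size this explicit enumeration is exactly what becomes infeasible, which motivates the symbolic methods the paper proposes.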
Abstract: In this paper we study a system composed of a carbon nanotube (CNT) and a bundle of carbon nanotubes (BuCNT) interacting with a specific fatty acid as a molecular probe. The full system is represented by an open nanotube (or nanotubes) and linoleic acid (LA) relaxing due to its interaction with the CNT and BuCNT. The LA has an asymmetric shape with a COOH termination, which promotes close interaction with the BuCNT, mainly through van der Waals forces. The simulations were performed by classical molecular dynamics with standard parameterizations. Our results show that the BuCNT and CNT are dynamically stable and exhibit a preferential interaction position with LA, resulting in three features: (i) when the LA interacts with the CNT or BuCNT (with either termination, CH2 or COOH), the LA is repelled; (ii) when the CH2-terminated end of the LA approaches the open extremity of the BuCNT, the LA is also repelled; and (iii) when the COOH-terminated end of the LA approaches the open extremity of the BuCNT, the LA is encapsulated by the BuCNT. These simulations are part of a more extensive effort to find efficient selective molecular devices and could be useful toward that goal.
Abstract: In research on natural ventilation and passive cooling with forced convection, it is essential to know how heat flows in a solid object, the pattern of temperature distribution on its surfaces, and, ultimately, how air flows through and convects heat from the surfaces of the steel under the roof. This paper presents results from a computational fluid dynamics (CFD) program comparing natural ventilation and forced convection within a roof attic that receives direct solar radiation. The CFD program for modeling air flow inside the roof attic was adapted to handle two cases: in the first case, analyzed under natural ventilation, the roof attic is a closed area; in the second case, analyzed under forced convection, the roof attic is an open area. Both cases yield predictions of the temperature, pressure, and mass flow rate distributions within the roof attic. The comparison shows that the CFD program is an effective model for predicting the air temperature and heat transfer coefficient distributions within the roof attic. The results show that forced convection can help reduce heat transfer through the roof attic, and that the inner zone around the steel core has a lower temperature than with natural ventilation. The temperature difference at the steel core of the roof attic between the two cases was 10-15 K.
Abstract: Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
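The standard one-factor-at-a-time permutation importance that the paper improves upon can be sketched as follows; the synthetic data and model settings are illustrative, not the Monash attrition study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=6,
                           n_informative=3, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
baseline = model.score(X, y)

importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])              # permute one variable at a time
    importance.append(baseline - model.score(Xp, y))  # drop in accuracy

print([round(v, 3) for v in importance])
```

Each variable requires its own permutation pass, which is the inefficiency a fractional factorial design over the permuted/unpermuted settings is meant to reduce.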
Abstract: Natural resources management, including water resources, requires reliable estimates of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have large effects on management decisions. Noise reduction using wavelet techniques is an effective approach to preprocessing practical data sets. The predictability enhancement of the river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. The time series correlation and persistency, the minimum sufficient length for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
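A generic sketch of the wavelet noise-reduction step, using a single-level Haar transform with a universal soft threshold; the signal, wavelet, and threshold rule below are illustrative stand-ins, not the authors' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)               # stand-in "flow" signal
noisy = clean + 0.3 * rng.standard_normal(t.size)

# one-level Haar decomposition
approx = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)
detail = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)

# universal soft threshold applied to the detail coefficients
sigma = np.median(np.abs(detail)) / 0.6745       # robust noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))
detail = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)

# inverse Haar transform
denoised = np.empty_like(noisy)
denoised[0::2] = (approx + detail) / np.sqrt(2)
denoised[1::2] = (approx - detail) / np.sqrt(2)

print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

In practice a multi-level decomposition with a smoother wavelet would typically be used; the single Haar level suffices to show how thresholding the detail coefficients suppresses noise.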
Abstract: The paper focuses on enhanced stiffness modeling of robotic manipulators, taking into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example that deals with the stiffness analysis of a parallel manipulator of the Orthoglide family.
Abstract: Web usage mining algorithms have been widely utilized for modeling user web navigation behavior. In this study we propose a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm. The EM algorithm is used in statistics to find maximum likelihood estimates of parameters in probabilistic models that depend on unobserved latent variables. The experimental results show that as the number of clusters decreases, the log-likelihood converges toward lower values, and that the probability of the largest cluster decreases as the number of clusters increases in each treatment.
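The trend described above can be reproduced in a small sketch with scikit-learn's EM-based GaussianMixture; the synthetic 2-D data and cluster counts below are illustrative, since the actual model operates on web navigation sessions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic "session" vectors drawn from four separated groups
X = np.vstack([rng.normal(m, 0.5, size=(100, 2)) for m in (0, 3, 6, 9)])

scores, top_weight = {}, {}
for k in (2, 4, 8):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    scores[k] = gm.score(X)              # mean per-sample log-likelihood
    top_weight[k] = gm.weights_.max()    # probability of the largest cluster
    print(k, round(scores[k], 3), round(top_weight[k], 3))
```

With fewer components the mixture fits the data less well (lower log-likelihood), and with more components the mixing weight of the largest cluster shrinks, matching the two observations in the abstract.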
Abstract: This paper proposes a new decision making approach based on quantitative possibilistic influence diagrams, which are an extension of standard influence diagrams in the possibilistic framework. We treat in particular the case where several expert opinions relative to value nodes are available. An initial expert assigns confidence degrees to the other experts and fixes a similarity threshold that the provided possibility distributions should respect. To illustrate our approach, an evaluation algorithm for these multi-source possibilistic influence diagrams is also proposed.
Abstract: Modeling of panel zone (PZ) seismic behavior, because of its role in the overall ductility and lateral stiffness of steel moment frames, has been considered a challenge for years. There are some studies regarding the effects of different doubler plate thicknesses and geometric properties of the PZ on its seismic behavior. However, there is not much investigation of the effect of the number of provided continuity plates in the presence of one triangular haunch, two triangular haunches, or a rectangular haunch (T-shape haunches) for exterior columns. In this research, detailed finite element models of 12 tested connections of the SAC joint venture were first created and analyzed; the cyclic behavior backbone curves obtained from these models, together with other FE models for similar tests, were then used for neural network training. The seismic behavior of these data is categorized according to the continuity plate arrangements and the type of haunch. A PZ with one-sided haunches has little plastic rotation. As the number of continuity plates increases due to the presence of two triangular haunches (four continuity plates), there is no plastic rotation; in other words, the PZ behaves in its elastic range. In the case of a rectangular haunch, the PZ shows more plastic rotation than with a one-sided triangular haunch and especially with double-sided triangular haunches. Moreover, the models presented for the cases of triangular one-sided and double-sided haunches and rectangular haunches appear to provide a proper estimation of PZ seismic behavior.
Abstract: The Siemens Healthcare Sector is one of the world's
largest suppliers to the healthcare industry and a trendsetter in
medical imaging and therapy, laboratory diagnostics, medical
information technology, and hearing aids.
Siemens offers its customers products and solutions for the entire
range of patient care from a single source – from prevention and
early detection to diagnosis, and on to treatment and aftercare. By
optimizing clinical workflows for the most common diseases,
Siemens also makes healthcare faster, better, and more cost effective.
The optimization of clinical workflows requires a multidisciplinary focus and a collaborative approach involving, for example, medical advisors, researchers and scientists, as well as healthcare economists.
This new form of collaboration brings together experts with deep
technical experience, physicians with specialized medical knowledge
as well as people with comprehensive knowledge about health
economics.
As Charles Darwin is often quoted as saying, “It is neither the strongest of the species that survive, nor the most intelligent, but the one most responsive to change." We believe that those who can successfully manage this change will emerge as winners, with a valuable competitive advantage.
Current medical information and knowledge are some of the core
assets in the healthcare industry. The main issue is to connect
knowledge holders and knowledge recipients from various
disciplines efficiently in order to spread and distribute knowledge.
Abstract: It is sometimes difficult to differentiate between
innocent murmurs and pathological murmurs during auscultation. In
these difficult cases, an intelligent stethoscope with decision support
abilities would be of great value. In this study, using a dog model,
phonocardiographic recordings were obtained from 27 boxer dogs
with various degrees of aortic stenosis (AS) severity. As a reference
for severity assessment, continuous wave Doppler was used. The data
were analyzed with recurrence quantification analysis (RQA) with the aim of finding features able to distinguish innocent murmurs from murmurs caused by AS. Four out of eight investigated RQA features
showed significant differences between innocent murmurs and
pathological murmurs. Using a plain linear discriminant analysis
classifier, the best pair of features (recurrence rate and entropy)
resulted in a sensitivity of 90% and a specificity of 88%. In
conclusion, RQA provides valid features that can be used to differentiate between innocent murmurs and murmurs caused by AS.
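For illustration, one of the RQA features, the recurrence rate, can be sketched for a 1-D signal as follows; the embedding dimension, delay, and threshold are illustrative choices, not the settings used in the study.

```python
import numpy as np

def recurrence_rate(x, dim=3, delay=1, eps=0.2):
    """Fraction of recurrent points in the recurrence matrix."""
    # time-delay embedding of the signal
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    # pairwise distances -> binary recurrence matrix
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    R = d < eps
    return R.sum() / R.size

t = np.linspace(0, 4 * np.pi, 200)
periodic = np.sin(t)                       # structured, murmur-like tone
noise = np.random.default_rng(0).standard_normal(200)

print(recurrence_rate(periodic), recurrence_rate(noise))
```

A structured (periodic) signal revisits the same neighborhoods of its embedded trajectory far more often than noise does, which is why recurrence-based features can separate the two murmur classes.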
Abstract: In a recent major industry-supported research and development study, a novel framework was developed and applied for the assessment of reliability and quality performance levels in real-life power systems of practical large-scale sizes. The new assessment methodology is based on three metaphors (dimensions) representing the relationship between available generation capacities and required demand levels. The paper shares the results of the successfully completed study and describes the implementation of the new methodology on practical zones in the Saudi electricity system.
Abstract: Service-oriented systems have become popular and present many advantages in the development and maintenance process. Coupling is the most important attribute of services when they are integrated into a system. In this paper, we propose a suite of metrics to evaluate a service's quality with respect to its coupling. We use the coupling metrics to measure the maintainability, reliability, testability, and reusability of services. Our proposed metrics are computed at run time, which yields more exact results.
Abstract: This study was conducted to determine the physical properties and stability of mayonnaise-like emulsions as affected by modified yam starches. Native yam starch was modified via pre-gelatinization and cross-linking phosphorylation procedures. The emulsions (50% oil dispersed phase) were prepared with 0.3% native potato, native yam, pre-gelatinized yam and cross-linking phosphorylation yam starches. The surface-weighted mean droplet diameter was found to be significantly (p < 0.05) lower in the sample with cross-linking phosphorylation yam starch than in the other samples. Moreover, the viscosity of the sample with pre-gelatinized yam starch was higher than that of the other samples. The phase separation stability was low in the freshly prepared and stored (45 days, 5°C) emulsions containing native yam starch. This study thus generally suggests that modified yam starches are more suitable (i.e. better physical properties and stability) than native yam starch for use as stabilizers in similar systems, i.e. light mayonnaises.
Abstract: In the modern era, the biggest challenge facing the software industry is the advent of new technologies. Software engineers are therefore gearing up to meet and manage change in large software systems. They also find it difficult to deal with software cognitive complexity. In the last few years many metrics have been proposed to measure the cognitive complexity of software. This paper aims at a comprehensive survey of software cognitive complexity metrics. Some classic and efficient software cognitive complexity metrics, such as Class Complexity (CC), Weighted Class Complexity (WCC), Extended Weighted Class Complexity (EWCC), Class Complexity due to Inheritance (CCI) and Average Complexity of a program due to Inheritance (ACI), are discussed and analyzed. The comparison and relationships among these software complexity metrics are also presented.
Abstract: The saturated hydraulic conductivity of soil is an important property in processes involving water and solute flow in soils. It is difficult to measure and can be highly variable, requiring a large number of replicate samples. In this study, 60 sets of soil samples were collected in the Saqhez region of Kurdistan province, Iran. Statistics such as the correlation coefficient (R), root mean square error (RMSE), mean bias error (MBE) and mean absolute error (MAE) were used to evaluate multiple linear regression models as the size of the dataset varied. The multiple linear regression models were evaluated both when only the percentages of sand, silt, and clay content (SSC) were used as inputs, and when SSC and bulk density, Bd, (SSC+Bd) were used as inputs. For the 50-sample dataset, the R, RMSE, MBE and MAE values for the SSC method were 0.925, 15.29, -1.03 and 12.51, and for the SSC+Bd method were 0.927, 15.28, -1.11 and 12.92, respectively, for the relationships obtained from multiple linear regression on the data. For the 10-sample dataset, the R, RMSE, MBE and MAE values for the SSC method were 0.725, 19.62, -9.87 and 18.91, and for the SSC+Bd method were 0.618, 24.69, -17.37 and 22.16, respectively, which shows that as the number of samples increases, the precision of the estimated saturated hydraulic conductivity increases.
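The four evaluation statistics used above can be written compactly; the measured/predicted values in the example are illustrative, not the soil data from this study.

```python
import numpy as np

def evaluate(measured, predicted):
    """Return (R, RMSE, MBE, MAE) for predicted vs. measured values."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = predicted - measured
    r = np.corrcoef(measured, predicted)[0, 1]   # correlation coefficient
    rmse = np.sqrt(np.mean(err ** 2))            # root mean square error
    mbe = np.mean(err)                           # mean bias error (sign = over/under)
    mae = np.mean(np.abs(err))                   # mean absolute error
    return r, rmse, mbe, mae

measured = [10.0, 20.0, 30.0, 40.0]
predicted = [12.0, 18.0, 33.0, 39.0]
r, rmse, mbe, mae = evaluate(measured, predicted)
print(r, rmse, mbe, mae)
```

MBE keeps the sign of the errors, so a negative value (as reported for both methods above) indicates systematic underprediction, while RMSE and MAE measure overall magnitude.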
Abstract: The triumph of inductive neuro-stimulation since its rediscovery in the 1980s has been quite spectacular. In many fields, ranging from clinical applications to basic research, this technique is absolutely indispensable. Nevertheless, the basic knowledge about the processes underlying the stimulation effect is still very rough and rarely refined in a quantitative way. This is not only an inexcusable blank spot in biophysics and for stimulation prediction, but also a fundamental hindrance to technological progress. The already very sophisticated devices have reached a stage where further optimization requires better strategies than those provided by simple linear membrane models of the integrate-and-fire style. Addressing this problem for the first time, we suggest in the following text a way to perform virtual quantitative analysis of a stimulation system. Concomitantly, this ansatz seems to provide a route toward better understanding by using nonlinear signal processing and treating the nerve as a filter that is adapted for neuronal magnetic stimulation. The model is compact and easy to adjust, and the whole setup behaved very robustly during all performed tests. As an example, a recent innovative stimulator design known as cTMS is analyzed and dimensioned with this approach. The results show hitherto unforeseen potential.
Abstract: Since wireless sensor networks are energy-constrained, the energy efficiency of sensor nodes is the main design issue. Clustering of nodes is an energy-efficient approach: it prolongs the lifetime of wireless sensor networks by avoiding long-distance communication. Clustering algorithms operate in rounds, and their performance depends upon the round time. A large round time consumes more energy at the cluster heads, while a small round time causes frequent re-clustering. Existing clustering algorithms therefore apply a trade-off to the round time and calculate it from the initial parameters of the network. However, it is not appropriate to use a round time based on initial parameters throughout the network lifetime, because wireless sensor networks are dynamic in nature (nodes can be added to the network, or some nodes run out of energy). In this paper, a variable round time approach is proposed that calculates the round time depending upon the number of active nodes remaining in the field. The proposed approach makes the clustering algorithm adaptive to network dynamics. For simulation, the approach was implemented with LEACH in NS-2, and the results show a 6% increase in network lifetime, a 7% increase in the 50% node death time, and a 5% improvement in the data units gathered at the base station.
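The core idea, recomputing the round time from the number of active nodes instead of fixing it from initial parameters, can be sketched as below. The linear scaling rule is an illustrative assumption; the abstract does not specify the exact formula.

```python
def round_time(active_nodes, initial_nodes, initial_round_time):
    """Shrink the round time as nodes die, so re-clustering
    overhead stays proportional to the remaining network.
    Linear scaling is an assumed, illustrative rule."""
    if active_nodes <= 0:
        return 0.0
    return initial_round_time * active_nodes / initial_nodes

print(round_time(100, 100, 20.0))  # full network -> full round time
print(round_time(50, 100, 20.0))   # half the nodes -> half the round time
```

Recomputing the value each round (rather than once at deployment) is what makes the clustering adaptive to node deaths and additions.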
Abstract: Deprivation indices are widely used in public health studies. These indices are also referred to as indices of inequality or disadvantage. Although many indices have been built before, it is considered less appropriate to apply existing indices to other countries or areas with different socio-economic conditions and different geographical characteristics. The objective of this study is to construct an index based on the geographical and socio-economic factors of Peninsular Malaysia, defined as the weighted household-based deprivation index. The study employs variables based on household items, household facilities, school attendance and education level obtained from the Malaysia 2000 census report. Factor analysis is used to extract latent variables from the indicators, i.e. to reduce the observed variables to a smaller number of components or factors. Based on the factor analysis, two extracted factors were selected, labeled the Basic Household Amenities factor and the Middle-Class Household Items factor. It is observed that districts with lower index values are located in the less developed states such as Kelantan, Terengganu and Kedah, while areas with high index values are located in developed states such as Pulau Pinang, W.P. Kuala Lumpur and Selangor.
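A hedged sketch of the factor-extraction step: reducing household indicator variables to two latent factors with scikit-learn's FactorAnalysis. The data below are synthetic stand-ins, not the census indicators.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500
basic = rng.normal(size=n)     # latent "basic amenities" factor
middle = rng.normal(size=n)    # latent "middle-class items" factor

# six observed indicators, each loading on one latent factor plus noise
X = np.column_stack([basic + 0.1 * rng.normal(size=n) for _ in range(3)] +
                    [middle + 0.1 * rng.normal(size=n) for _ in range(3)])

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
scores = fa.transform(X)       # per-household factor scores
print(fa.components_.shape, scores.shape)
```

The factor scores (one column per extracted factor) are what would then be weighted and aggregated per district to form the deprivation index.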
Abstract: Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes that readily produce a smooth and work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness aspects of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters. They were then ranked in order of significance, followed by setting the levels of the factors to minimize surface roughness and/or maximize surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. It was observed that the LPB process improved surface hardness by 167%, while the DCR process improved it by 442%. It was also found that the force, ball diameter, number of tool passes and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece's surface during deep cold rolling and low plasticity burnishing.