Abstract: Computer worm detection is commonly performed by
antivirus software tools that rely on prior explicit knowledge of the
worm's code (detection based on code signatures). We present an
approach for detection of the presence of computer worms based on
Artificial Neural Networks (ANN) using the computer's behavioral
measures. Identification of significant features, which describe the
activity of a worm within a host, is commonly acquired from security
experts. We suggest acquiring these features by applying feature
selection methods. We compare three different feature selection
techniques for dimensionality reduction and identification of the
most prominent features to efficiently capture the computer's behavior
in the context of worm activity. Additionally, we explore three
different temporal representation techniques for the most prominent
features. In order to evaluate the different techniques, several
computers were infected with five different worms and 323 different
features of the infected computers were measured. We evaluated
each technique by preprocessing the dataset according to each one
and training the ANN model with the preprocessed data. We then
evaluated the ability of the model to detect the presence of a new
computer worm, in particular, during heavy user activity on the
infected computers.
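The feature-selection step described above can be illustrated with a minimal sketch, assuming a simple filter method (Fisher score) over per-feature clean/infected samples; the feature names and numbers here are hypothetical and not from the study.

```python
# Illustrative sketch (not the paper's pipeline): rank behavioral features
# by Fisher score and keep the top-k before training a classifier.
# All feature names and measurements below are invented.

def fisher_score(clean, infected):
    """Score one feature: separation of class means relative to variances."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    num = (mean(clean) - mean(infected)) ** 2
    den = (var(clean) + var(infected)) or 1e-12   # avoid division by zero
    return num / den

def select_top_k(samples_by_feature, k):
    """samples_by_feature: {name: (clean_values, infected_values)}."""
    scored = sorted(samples_by_feature.items(),
                    key=lambda kv: fisher_score(*kv[1]), reverse=True)
    return [name for name, _ in scored[:k]]

# Hypothetical behavioral measures: one separates the classes, one does not.
data = {
    "tcp_packets_sent": ([10, 12, 11, 9], [95, 110, 102, 99]),
    "cpu_idle_pct":     ([60, 62, 58, 61], [59, 63, 60, 62]),
}
print(select_top_k(data, 1))  # → ['tcp_packets_sent']
```

The selected features would then form the input vector of the ANN; the ranking itself is classifier-independent.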
Abstract: The effect of varying holding temperature on hatching success, occurrence of deformities and mortality rates were investigated for goldlined seabream eggs. Wild broodstock (600 g) were stocked at a 2:1 male-female ratio in a 2 m3 fiberglass tank supplied with filtered seawater (37 g L-1 salinity, temp. range 24±0.5 oC [day] and 22±1 oC [night], DO2 in excess of 5.0mg L-1). Females were injected with 200 IU kg-1 HCG between 08.00 and 10.00 h and returned to tanks to spawn following which eggs were collected by hand using a 100μm net. Fertilized eggs at the gastrulation stage (120 L-1) were randomly placed into one of 12 experimental 6 L aerated (DO2 5 mg L-1) plastic containers with water temperatures maintained at 24±0.5 oC (ambient), 26±0.5 oC, 28± 0.5 oC and 30±0.5 oC using thermostats. Each treatment was undertaken in triplicate using a 12:12 photophase:scotophase photoperiod. No differences were recorded between eggs reared at 24 and 26 oC with respect to viability, deformity, mortality or unhatched egg rates. Increasing temperature reduced the number of viable eggs with those at 30 oC returning poorest performance (P < 0.05). Mortality levels were lowest for eggs incubated at 24 and 26 oC. The greatest level of deformities recorded was that for eggs reared at 28 oC.
Abstract: This article presents a detailed analysis and comparative
performance evaluation of model reference adaptive control systems.
In contrast to classical control theory, adaptive control methods make
it possible to deal with time-variant processes. Inspired by the works [1] and
[2], two methods based on the MIT rule and Lyapunov rule are
applied to a linear first order system. The system is simulated and
it is investigated how changes to the adaptation gain affect the
system performance. Furthermore, variations in the reference model
parameters, that is, changes to the desired closed-loop behaviour, are
examined.
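The MIT-rule adaptation applied to a first-order system can be sketched with a short Euler simulation; the plant gain, adaptation gain, and step size below are illustrative assumptions, not values from the paper.

```python
# Minimal MIT-rule sketch for a first-order plant y' = -y + k*u with
# feedforward controller u = theta*uc and reference model ym' = -ym + k0*uc.
# The MIT rule adjusts theta along the negative gradient of e^2/2.

def simulate_mit_rule(k_plant=2.0, k_model=1.0, gamma=0.5,
                      dt=0.01, steps=20000, uc=1.0):
    y = ym = 0.0          # plant and reference-model outputs
    theta = 0.0           # adjustable feedforward gain
    for _ in range(steps):
        u = theta * uc                 # controller
        dy = -y + k_plant * u          # plant dynamics
        dym = -ym + k_model * uc       # reference-model dynamics
        e = y - ym                     # tracking error
        dtheta = -gamma * e * ym       # MIT rule: theta' = -gamma * e * ym
        y += dt * dy
        ym += dt * dym
        theta += dt * dtheta
    return theta

theta = simulate_mit_rule()
print(round(theta, 2))  # converges near k_model / k_plant = 0.5
```

Increasing `gamma` speeds up adaptation at the cost of oscillation, which is exactly the trade-off the abstract says is investigated.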
Abstract: In recent years, adaptive pushover methods have been
developed for seismic analysis of structures. Herein, the accuracy of
the displacement-based adaptive pushover (DAP) method, which was
introduced by Antoniou and Pinho [2004], is evaluated for irregular
buildings. The results are compared to the force-based procedure.
Both concrete and steel frame structures, asymmetric in plan and
elevation are analyzed, and torsional effects are also taken into
account. These analyses are performed using both near fault and far
fault records. In order to verify the results, Incremental Dynamic
Analysis (IDA) is performed.
Abstract: Transmission network expansion planning (TNEP) is an important component of power system planning whose task is to minimize the network construction and operational cost while satisfying increasing demand and the imposed technical and economic conditions. To date, various methods have been presented to solve the static transmission network expansion planning (STNEP) problem. However, in all of these methods the line adequacy rate has not been studied beyond the planning horizon, i.e. when the expanded network loses its adequacy and needs to be expanded again. In this paper, in order to take the condition of the transmission lines after expansion into account from the line-loading viewpoint, the adequacy of the transmission network is considered in the solution of the STNEP problem. To obtain the optimal network arrangement, a decimal codification genetic algorithm (DCGA) is used to minimize the network construction and operational cost. The effectiveness of the proposed idea is tested on Garver's six-bus network. Evaluation of the results reveals that the annual worth of network adequacy has a considerable effect on the network arrangement. In addition, the network obtained with the DCGA has a lower investment cost and a higher adequacy rate. Thus, the network satisfies the requirements of delivering electric power more safely and reliably to load centers.
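The decimal-codification idea can be illustrated with a toy genetic algorithm in which each gene is the integer number of lines built in a corridor. The corridor costs, load requirements, and penalty below are invented for illustration and do not model the Garver network or the paper's cost function.

```python
# Toy decimal-codification GA sketch: minimize construction cost plus a
# heavy penalty for unserved demand. All numbers are hypothetical.
import random

random.seed(0)
N_CORRIDORS = 3
LINE_COST = [10, 15, 8]   # construction cost per line (hypothetical)
REQUIRED = [2, 1, 3]      # lines needed to carry the load (hypothetical)

def cost(plan):
    build = sum(c * n for c, n in zip(LINE_COST, plan))
    shortfall = sum(max(r - n, 0) for r, n in zip(REQUIRED, plan))
    return build + 1000 * shortfall   # penalize inadequate plans

def evolve(pop_size=30, generations=60):
    pop = [[random.randint(0, 4) for _ in range(N_CORRIDORS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randint(1, N_CORRIDORS - 1)
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.2:             # decimal-gene mutation
                child[random.randrange(N_CORRIDORS)] = random.randint(0, 4)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))  # the analytic optimum is [2, 1, 3] with cost 59
```

The decimal (integer) genes avoid the binary encode/decode step of a classical GA, which is the point of the DCGA representation.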
Abstract: Analysis of blood vessel mechanics in normal and
diseased conditions is essential for disease research, medical device
design and treatment planning. In this work, 3D finite element
models of normal vessel and atherosclerotic vessel with 50% plaque
deposition were developed. The models were meshed using a finite
number of tetrahedral elements and simulated using actual blood
pressure signals. Based on the transient analysis performed on these
models, parameters
such as total displacement, strain energy density and entropy per unit
volume were obtained. Further, the obtained parameters were used to
develop artificial neural network models for analyzing normal and
atherosclerotic blood vessels. In this paper, the objectives of the
study, methodology and significant observations are presented.
Abstract: The modified Claus process is the major technology
for the recovery of elemental sulfur from hydrogen sulfide. The
chemical reactions that can occur in the reaction furnace are
numerous, and many byproducts such as carbon disulfide and
carbonyl sulfide are produced. These compounds can often contribute
from 20 to 50% of the pollutants and therefore, should be hydrolyzed
in the catalytic converter. The inlet temperature of the first catalytic
reactor should be maintained above 250 °C to hydrolyze COS
and CS2. In this paper, various configurations for reheating the first
converter of the sulfur recovery unit are investigated. As a
result, the performance of each method is presented for a typical
Claus unit. The results show that the hot gas method seems to be
better than the other methods.
Abstract: This paper proposes a method for speckle reduction in
medical ultrasound imaging while preserving the edges with the
added advantages of adaptive noise filtering and speed. A nonlinear
image diffusion method is proposed that incorporates a local image
parameter, namely scatterer density, in addition to the gradient, to
weight the nonlinear diffusion process. The method was tested for
the isotropic case with a contrast detail phantom and a variety of
clinical ultrasound images, and then compared to linear and some
other diffusion enhancement methods. Different diffusion parameters
were tested and tuned to best reduce speckle noise and preserve
edges. The method showed superior performance measured both
quantitatively and qualitatively when incorporating scatterer density
into the diffusivity function. The proposed filter can be used as a
preprocessing step for ultrasound image enhancement before
applying automatic segmentation, automatic volumetric calculations,
or 3D ultrasound volume rendering.
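The gradient-weighted nonlinear diffusion that this method builds on can be sketched as a Perona–Malik-style iteration; the scatterer-density term of the proposed filter is not reproduced here, so this is only the baseline, with illustrative parameters and synthetic data.

```python
# Perona-Malik-style nonlinear diffusion sketch: the diffusivity shrinks
# where the local gradient is large, preserving edges while smoothing noise.
# Uses periodic borders via np.roll as a simplification.
import numpy as np

def nonlinear_diffusion(img, n_iter=20, kappa=0.1, step=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping diffusivity: near 1 in flat regions, near 0 at edges
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += step * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy flat patch: diffusion should reduce the noise variance.
rng = np.random.default_rng(0)
noisy = 0.5 + 0.05 * rng.standard_normal((64, 64))
smooth = nonlinear_diffusion(noisy)
print(smooth.var() < noisy.var())  # → True
```

The paper's contribution amounts to replacing the purely gradient-based `g` above with one that also depends on the estimated scatterer density.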
Abstract: Construction cost in India is increasing at around 50
per cent over average inflation levels. It has registered increases
of up to 15 per cent every year, primarily due to the cost of basic
building materials such as steel, cement, bricks, timber and other
inputs, as well as the cost of labour. As a result, the cost of
construction using conventional building materials and methods is
moving beyond affordable limits, particularly for low-income groups
of the population as well as a large cross section of the middle-income
groups. Therefore, there is a need to adopt cost-effective construction
methods either by up-gradation of traditional technologies using local
resources or applying modern construction materials and techniques
with efficient inputs leading to economic solutions. This has become
the most relevant aspect in the context of the large volume of housing
to be constructed in both rural and urban areas and the consideration
of limitations in the availability of resources such as building
materials and finance. This paper presents an overview of the housing
status in India and the adoption of appropriate and cost-effective
technologies in the country.
Abstract: Algorithm 2, for n-link manipulator movement amidst arbitrary unknown static obstacles in the case when a sensor system supplies information about local neighborhoods of different points in the configuration space, is presented. Algorithm 2 guarantees reaching a target position in a finite number of steps. It is reduced to a finite number of calls of a subroutine for planning a trajectory in the presence of known forbidden states. The polynomial approximation algorithm used as this subroutine is presented. The results of the Algorithm 2 implementation are given.
Abstract: A prototype model of an emulsion separator was
designed and manufactured. Generally, it is a cylinder filled with
different fractal modules. The emulsion was fed into the reactor by a
peristaltic pump through an inlet placed at the boundary between the
two phases. For hydrodynamic design and sizing of the reactor the
assumptions of the theory of filtration were used and methods to
describe the separation process were developed. Based on this
methodology, and using numerical methods and Autodesk software,
the process was simulated in different operating modes. The basic
hydrodynamic characteristics (speed and performance) for different
types of fractal systems were also defined, and decisions to optimize
the design of the reactor were made.
Abstract: This study systemizes processes and methods in
wooden furniture design that contain uniqueness in function and
aesthetics. The study was carried out through research and analysis of
designers' consideration factors that affect function and production.
Therefore, the study result indicates that such factors are design
process (planning for design, product specifications, concept design,
product architecture, industrial design, production), design evaluation
as well as wooden furniture design dependent factors i.e. art (art
style; furniture history, form), functionality (strength and
durability, place and manner of use), material (appropriateness to function, wood
mechanical properties), joints, cost, safety, and social responsibility.
Specifically, all the aforementioned factors affect good design. Drawing
on direct experience gained through users' usage, the designer
must design wooden furniture systematically and effectively. As a
result, this study selected a dining armchair as a case study with all
the involved factors and the whole design process stated in this study.
Abstract: Image target detection and tracking methods based on
target information such as intensity, shape model, histogram and
target dynamics have been proven to be robust to target model
variations and background clutter, as shown by recent research.
However, no definitive answer has been given for targets occluded by
countermeasures or a limited field of view (FOV). In this paper, we
will present a novel tracking method using filtering and computational
geometry. This paper has two central goals: 1) to deal with vulnerable
target measurements; and 2) to maintain target tracking out of FOV
using non-target-originated information. The experimental results,
obtained with airborne images, show robust tracking ability compared
to existing approaches. In exploring the questions of target
tracking, this paper is limited to the consideration of airborne images.
Abstract: Cognitive Science appeared about 40 years ago,
following the challenge of Artificial Intelligence, as common
territory for several scientific disciplines such as: IT, mathematics,
psychology, neurology, philosophy, sociology, and linguistics. The
newborn science was justified by the complexity of the problems
related to human knowledge on the one hand, and on the other by the
fact that none of the above-mentioned sciences alone could explain
the mental phenomena. Based on the data supplied by the
experimental sciences such as psychology or neurology, models of
the operation of the human mind are built in cognitive science. These
models are implemented in computer programs and/or electronic
circuits (specific to the artificial intelligence) – cognitive systems –
whose competences and performances are compared to the human
ones, leading to the psychology and neurology data reinterpretation,
respectively to the construction of new models. In these
processes, while psychology provides the experimental basis, philosophy
and mathematics provide the level of abstraction necessary for
the interplay of the mentioned sciences.
The ongoing general problematic of the cognitive approach
gives rise to two important types of approach: the computational one,
starting from the idea that mental phenomena can be reduced to
binary (1 and 0) computational operations, and the connectionist one,
which considers the products of thinking to be a result of the interaction
between all the composing (included) systems. In the field of
psychology, measurements in the computational register use classical
inquiries and psychometric tests, generally based on computational
methods. Considering both sides that represent
cognitive science, we can notice a gap in the possibilities for measuring
psychological products from the connectionist
perspective, which requires a unitary understanding of the quality–
quantity whole. In such an approach, measurement by calculus proves to
be inefficient. Our research, carried out over more than 20 years,
leads to the conclusion that measurement by forms properly fits the
laws and principles of connectionism.
Abstract: Instead of traditional (nominal) classification, we investigate
the subject of ordinal classification or ranking. An enhanced
method based on an ensemble of Support Vector Machines (SVMs)
is proposed. Each binary classifier is trained with specific weights
for each object in the training data set. Experiments on benchmark
datasets and synthetic data indicate that the performance of our
approach is comparable to state of the art kernel methods for
ordinal regression. The ensemble method, which is straightforward
to implement, provides a very good sensitivity-specificity trade-off
for the highest and lowest rank.
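The threshold decomposition behind such an ensemble (one binary classifier per question "is the rank greater than k?") can be sketched as follows, with a trivial 1-D decision stump standing in for each weighted SVM; the data, the stump learner, and the uniform object weights are illustrative simplifications, not the paper's setup.

```python
# Ordinal decomposition sketch: for ranks 1..K, train K-1 binary models
# "rank > k" and predict rank = 1 + number of thresholds passed.

def fit_stump(xs, ys):
    """Return threshold t minimising errors of the rule 'predict 1 if x > t'."""
    best_t, best_err = None, float("inf")
    for t in sorted(set(xs)) + [max(xs) + 1]:
        err = sum((x > t) != y for x, y in zip(xs, ys))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def fit_ordinal(xs, ranks):
    ks = sorted(set(ranks))[:-1]            # thresholds between consecutive ranks
    return [fit_stump(xs, [r > k for r in ranks]) for k in ks]

def predict(models, x):
    return 1 + sum(x > t for t in models)   # rank = 1 + #(thresholds passed)

xs    = [0.1, 0.2, 0.4, 0.5, 0.8, 0.9]
ranks = [1,   1,   2,   2,   3,   3]
models = fit_ordinal(xs, ranks)
print([predict(models, x) for x in [0.15, 0.45, 0.85]])  # → [1, 2, 3]
```

In the paper's method each binary problem would additionally carry per-object weights, which is what tunes the sensitivity-specificity trade-off at the extreme ranks.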
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of numerous applications linked to analysis of the Earth’s
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques by using images with different spatial and spectral
resolutions on different dates. The work was based on the principle
of control charts in order to set the upper and lower limits beyond
which a change would be noted. Later, the a contrario approach was
used. This was done by testing different thresholds for which the
difference calculated between two pixels was significant. Finally,
labeled images with a particularly low difference were considered,
which meant that the number of “false changes” could be estimated
according to a given limit.
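The control-chart step described above can be sketched as follows, assuming two co-registered single-band images and mean ± 3σ limits on the pixel-wise difference; the images are synthetic and the a contrario stage is not reproduced.

```python
# Control-chart change detection sketch: flag pixels whose temporal
# difference falls outside the mean ± k*sigma control limits.
import numpy as np

def change_mask(img_t1, img_t2, k=3.0):
    diff = img_t2.astype(float) - img_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    lower, upper = mu - k * sigma, mu + k * sigma   # control limits
    return (diff < lower) | (diff > upper)

rng = np.random.default_rng(1)
before = rng.normal(100, 2, size=(50, 50))
after = before + rng.normal(0, 2, size=(50, 50))    # sensor noise only
after[10:15, 10:15] += 40                           # one genuine change
mask = change_mask(before, after)
print(mask[10:15, 10:15].all(), int(mask.sum()))
```

Widening `k` trades missed changes for fewer false alarms, which is exactly the limit-setting question the abstract raises.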
Abstract: Many high-risk pathogens that cause disease in
humans are transmitted through various food items. Food-borne
disease constitutes a major public health problem. Assessment of the
quality and safety of foods is important in human health. Rapid and
easy detection of pathogenic organisms will facilitate precautionary
measures to maintain healthy food. The Polymerase Chain Reaction
(PCR) is a handy tool for rapid detection of low numbers of bacteria.
We have designed gene-specific primers for the most common
food-borne pathogens such as Staphylococci, Salmonella and E. coli.
Bacteria were isolated from food samples of various food outlets and
identified using gene-specific PCRs. We identified Staphylococci,
Salmonella and E. coli O157 using gene-specific primers by a rapid and
direct PCR technique in various food samples. This study helps us in
getting a complete picture of the various pathogens that threaten to
cause and spread food-borne diseases, and it would also enable the
establishment of a routine procedure and methodology for rapid
identification of food-borne bacteria using the technique of
direct PCR. This study will also enable us to judge the efficiency of
present food safety steps taken by food manufacturers and exporters.
Abstract: In this paper, we propose a geometric modeling of
illumination on a patterned image containing etched transistors. This
image is captured by a commercial camera during the inspection of
a TFT-LCD panel. Defect inspection is an important process in the
production of LCD panels, but regional differences in brightness,
which have a negative effect on the inspection, arise from the uneven
illumination environment. In order to solve this problem, we present
a geometric modeling of illumination consisting of an interpolation
using the least squares method and 3D modeling using a Bézier surface.
By using the sampling method, our computational time is shorter
than that of previous methods. Moreover, the model can be used to correct
brightness in every patterned image.
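The least-squares interpolation stage might be sketched as a low-order polynomial surface fit to sampled brightness values; the basis, sampling grid, and data below are assumptions for illustration, and the Bézier-surface stage is not reproduced.

```python
# Least-squares illumination surface sketch: fit a quadratic brightness
# surface z ~ c0 + c1*x + c2*y + c3*x*y + c4*x^2 + c5*y^2 to samples.
import numpy as np

def design_matrix(xs, ys):
    return np.column_stack([np.ones_like(xs), xs, ys, xs * ys, xs**2, ys**2])

def fit_illumination(xs, ys, zs):
    coeffs, *_ = np.linalg.lstsq(design_matrix(xs, ys), zs, rcond=None)
    return coeffs

def evaluate(coeffs, xs, ys):
    return design_matrix(xs, ys) @ coeffs

# Synthetic uneven illumination: brightness falls off quadratically.
x, y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
xs, ys = x.ravel(), y.ravel()
true = 1.0 - 0.4 * xs**2 - 0.3 * ys**2
coeffs = fit_illumination(xs, ys, true)
model = evaluate(coeffs, xs, ys)
print(np.abs(model - true).max() < 1e-6)  # truth lies in the model class
```

Dividing or subtracting the fitted surface from the captured image is what flattens the regional brightness differences before inspection.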
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors of multiple
facilities. The collected data from sensors have several different characteristics
due to variables such as types of products, former processes
and recipes. In general, Statistical Quality Control (SQC) methods
assume the normality of the data to detect out-of-control states of
processes. Although the collected data have different characteristics,
using them as inputs to SQC will increase the variation of the data,
require wide control limits, and decrease the ability to detect
out-of-control states. Therefore, it is necessary to separate similar data
groups from the mixed data for more accurate process control. In this paper,
we propose a regression tree using a split algorithm based on the Pearson
distribution to handle non-normal distributions in a parametric method.
The regression tree finds similar properties of data from different
variables. Experiments using real semiconductor manufacturing
process data show improved fault detection performance.
Abstract: Data mining incorporates a group of statistical
methods used to analyze a set of information, or a data set. It operates
with models and algorithms, which are powerful tools with great
potential. They can help people understand the patterns in a certain
chunk of information, so data mining tools have
a wide range of applications. For example, in theoretical chemistry
data mining tools can be used to predict molecule properties or
improve computer-assisted drug design. Classification analysis is one
of the major data mining methodologies. The aim of the contribution
is to create a classification model, which would be able to deal with a
huge data set with high accuracy. For this purpose logistic regression,
Bayesian logistic regression and random forest models were built
using R software. The Bayesian logistic regression in Latent GOLD
software was created as well. These classification methods belong to
supervised learning methods.
It was necessary to reduce the dimension of the data matrix before
constructing models, and thus factor analysis (FA) was used. These models
were applied to predict the biological activity of molecules, potential
new drug candidates.
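A classification model of the kind described can be sketched with a plain logistic regression trained by gradient descent (written in Python here, rather than the R and Latent GOLD software used in the study); the two-descriptor "factor scores" and labels are invented for illustration.

```python
# Logistic-regression sketch: stochastic gradient descent on the log-loss
# over a tiny synthetic data set standing in for FA-reduced descriptors.
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(active)
            g = p - yi                       # gradient of the log-loss
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, xi):
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if z > 0 else 0

# Hypothetical factor scores: the classes separate along the first factor.
X = [[-2.0, 0.1], [-1.5, -0.3], [-1.0, 0.2],
     [1.0, 0.0], [1.5, 0.4], [2.0, -0.2]]
y = [0, 0, 0, 1, 1, 1]
w, b = train_logreg(X, y)
print([predict(w, b, xi) for xi in X])  # → [0, 0, 0, 1, 1, 1]
```

The Bayesian and random-forest variants mentioned in the abstract differ in how the model is fitted, not in this basic predict-from-reduced-descriptors structure.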