Abstract: A prototype model of an emulsion separator was
designed and manufactured. It is essentially a cylinder filled with
different fractal modules. The emulsion was fed into the reactor by a
peristaltic pump through an inlet placed at the boundary between the
two phases. For the hydrodynamic design and sizing of the reactor, the
assumptions of filtration theory were used and methods to
describe the separation process were developed. Based on this
methodology, and using numerical methods and Autodesk software,
the process was simulated in different operating modes. The basic
hydrodynamic characteristics (speed and performance) for different
types of fractal systems were also determined, and decisions to
optimize the design of the reactor were made.
Abstract: This study systematizes the processes and methods used in
designing wooden furniture that is unique in function and
aesthetics. The study was carried out through research and analysis of
the factors designers must consider that affect function and production.
The results indicate that these factors are the design
process (planning for design, product specifications, concept design,
product architecture, industrial design, production), design evaluation,
and factors specific to wooden furniture design, i.e. art (art
style, furniture history, form), functionality (strength and
durability, placement area, usage), material (appropriateness to function, wood
mechanical properties), joints, cost, safety, and social responsibility.
All of the aforementioned factors affect good design. Drawing on
direct experience gained through users' usage, the designer
must design wooden furniture systematically and effectively. As a
case study, this study selected a dining armchair, designed taking into
account all the factors and design process stages described here.
Abstract: Image target detection and tracking methods based on
target information such as intensity, shape model, histogram, and
target dynamics have been proven robust to target model
variations and background clutter, as shown by recent research.
However, no definitive answer has been given for targets occluded by
countermeasures or by a limited field of view (FOV). In this paper, we
present a novel tracking method using filtering and computational
geometry. This paper has two central goals: 1) to deal with vulnerable
target measurements; and 2) to maintain tracking of targets outside the FOV
using non-target-originated information. The experimental results,
obtained with airborne images, show robust tracking ability with
respect to existing approaches. In exploring the questions of target
tracking, this paper is limited to the consideration of airborne images.
Abstract: Wood, a natural renewable material, is vulnerable to
degradation by microorganisms and susceptible to dimensional
change caused by water. In order to effectively improve the durability of
wood, an active reagent, maleic anhydride (Man), was selected for
wood modification. Man was first dissolved in a solvent, and then
penetrated into the porous structure of wood under vacuum/pressure
conditions. After a final catalyst-thermal treatment, the wood modification
was complete. The test results indicate that acetone is a good solvent for
transporting Man into the wood matrix. SEM observation showed that
wood samples treated with Man kept a good cellular structure, indicating
good penetration of Man into the wood cell walls. FTIR analysis
suggested that Man reacted, via its ring-ether group, with hydroxyl
groups on the wood cell walls, reducing the number of hydroxyl
groups and thus yielding good dimensional stability as well as fine decay
resistance. Consequently, modifying wood with Man is an effective
method to improve its durability.
Abstract: Recently, nanomaterials have been developed in the form of nano-films, nano-crystals and nano-pores. Lanthanide phosphates find extensive application in lasers, ceramics, sensors and phosphors, and also in optoelectronics, medical and biological labels, solar cells and light sources. Among the different kinds of rare-earth orthophosphates, yttrium orthophosphate has been shown to be an efficient host lattice for rare-earth activator ions, which have become a research focus because of their important role in the field of light display systems, lasers, and optoelectronic devices. It is in this context that the 4f^n ↔ 4f^(n-1)5d transitions of rare-earth ions in insulating materials, lying in the UV and VUV, are the aim of a large number of studies, although there have been only a few reports on Eu3+, Nd3+, Pr3+, Er3+, Ce3+ and Tm3+ doped YPO4. Since the 4f^n ↔ 4f^(n-1)5d transitions of rare-earth ions depend on the host matrix, several matrices have been used to study these transitions; in this work we study a very specific class of inorganic materials, orthophosphates doped with rare-earth ions. This study focused on the effect of Ce3+ concentration on the structural and optical properties of Ce3+ doped YPO4 yttrium orthophosphate in powder form prepared by the sol-gel method.
Abstract: Computing the facility location problem for every
location in a country simultaneously is not easy. This paper describes
solving the problem using cluster computing. The technique is to
design a parallel algorithm using local search with the single-swap
method in order to solve the problem on clusters. The parallel
implementation uses portable parallel programming, the
Message Passing Interface (MPI), on a Microsoft Windows Compute
Cluster. This paper presents the algorithm, which uses local search
with the single-swap method, and an MPI implementation on a cluster
of the system that decides which facilities to open. If large datasets are
considered, the process of calculating a reasonable cost for a facility
becomes time consuming. The results show that parallel computation of the
facility location problem on a cluster speeds up and scales well as the
problem size increases.
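As a minimal illustration of the sequential core that the paper parallelizes, the single-swap local search can be sketched as follows. The distance matrix, opening costs, and first-improvement acceptance rule are illustrative assumptions; the MPI distribution of swap evaluations is not reproduced here.

```python
# Serial sketch of local search with single swap for a facility
# location problem: repeatedly swap one open facility for one closed
# facility whenever doing so lowers the total cost.

def solution_cost(open_facs, open_cost, dist):
    """Opening cost plus nearest-open-facility assignment cost.
    dist[c][f] is the cost of serving client c from facility f."""
    assign = sum(min(dist[c][f] for f in open_facs) for c in range(len(dist)))
    return sum(open_cost[f] for f in open_facs) + assign

def single_swap_local_search(n_fac, k, open_cost, dist):
    current = set(range(k))            # arbitrary initial set of k facilities
    best = solution_cost(current, open_cost, dist)
    improved = True
    while improved:
        improved = False
        for f_out in list(current):    # candidate facility to close
            for f_in in range(n_fac):  # candidate facility to open
                if f_in in current:
                    continue
                cand = (current - {f_out}) | {f_in}
                cost = solution_cost(cand, open_cost, dist)
                if cost < best:        # accept the first improving swap
                    current, best = cand, cost
                    improved = True
                    break
            if improved:
                break
    return current, best
```

In the parallel version described in the abstract, the candidate swaps would be partitioned across MPI ranks and the best improvement reduced across processes.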
Abstract: In this paper we present a new method for over-height
vehicle detection in low-headroom streets and highways using digital
video processing. Its accuracy and lower price compared to
existing detectors such as laser radars, together with its capability of providing
extra information such as speed and height measurements, make this
method more reliable and efficient. In this algorithm, features are
selected and tracked using the KLT algorithm. A blob extraction
algorithm is also applied using background estimation and
subtraction. The world coordinates of the features inside the
blobs are then estimated using a novel calibration method. Once the heights
of the features are calculated, we apply a threshold to select over-height
features and eliminate the others. The over-height features are
segmented using association criteria and grouped using an
undirected graph, then tracked through sequential frames.
The resulting groups correspond to over-height vehicles in the scene.
Abstract: Dust acoustic solitary waves are studied in a warm
dusty plasma containing negatively charged dust grains, nonthermal ions,
and Boltzmann-distributed electrons. The Sagdeev pseudopotential
method is used to investigate solitary wave solutions in such
plasmas. The existence of compressive and rarefactive solitons is
studied.
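In generic form, the Sagdeev method reduces the stationary wave problem to that of a classical particle moving in a pseudopotential; the specific form of V(φ) depends on the dust, ion, and electron models adopted in the paper:

```latex
% Energy integral in the wave frame \xi = x - Mt, with M the Mach number:
\frac{1}{2}\left(\frac{d\phi}{d\xi}\right)^{2} + V(\phi) = 0
% Soliton conditions: V(0) = V'(0) = 0, V''(0) < 0, and V(\phi_m) = 0
% for some finite amplitude \phi_m; compressive solitons correspond to
% \phi_m > 0 and rarefactive solitons to \phi_m < 0.
```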
Abstract: Cognitive science appeared about 40 years ago,
subsequent to the challenge of artificial intelligence, as common
territory for several scientific disciplines: computer science, mathematics,
psychology, neurology, philosophy, sociology, and linguistics. The
newborn science was justified by the complexity of the problems
related to human knowledge on the one hand, and on the other by the
fact that none of the above-mentioned sciences could explain
mental phenomena alone. Based on the data supplied by
experimental sciences such as psychology and neurology, cognitive
science builds models of the operation of the human mind. These
models are implemented in computer programs and/or electronic
circuits specific to artificial intelligence (cognitive systems),
whose competences and performances are compared to human
ones, leading to the reinterpretation of psychology and neurology data
and to the construction of new models. In these
processes, psychology provides the experimental basis, while philosophy
and mathematics provide the level of abstraction utterly necessary for
mediating between the sciences mentioned.
The general problematic of the cognitive approach
gives rise to two important types of approach: the computational one,
starting from the idea that mental phenomena can be reduced to
calculus operations on 1s and 0s, and the connectionist one, which
considers the products of thinking to be the result of the interaction
between all the component (included) systems. In the field of
psychology, measurements in the computational register use classical
questionnaires and psychometric tests, generally based on calculus
methods. Considering both sides that make up
cognitive science, we can notice a gap in the possibilities for
measuring psychological products from the connectionist
perspective, which requires a unitary understanding of the quality-
quantity whole. In such an approach, measurement by calculus proves
inefficient. Our research, conducted over more than 20 years,
leads to the conclusion that measurement by forms properly fits the
laws and principles of connectionism.
Abstract: The ever-growing usage of the aspect-oriented
development methodology in the field of software engineering
requires tool support for both research environments and industry. So
far, tool support for many activities in aspect-oriented software
development has been proposed, to automate and facilitate
development. For instance, AJaTS provides a transformation
system to support aspect-oriented development and refactoring. In
particular, it is well established that the abstract interpretation of
programs, in any paradigm, as pursued in static analysis, is best served
by a high-level program representation such as the Control Flow Graph
(CFG). With such a representation, analyses can more easily locate common
programming idioms for which helpful transformations are already
known, and the association between the input program and the
intermediate representation can be maintained more closely.
However, although current research defines good concepts and
foundations, to some extent, for control flow analysis of aspect-oriented
programs, it does not provide a concrete tool that can
construct the CFG of these programs on its own. Furthermore, most of
these works focus on addressing other issues in Aspect-
Oriented Software Development (AOSD), such as testing or data flow
analysis, rather than the CFG itself. Therefore, this study is dedicated to
building an aspect-oriented control flow graph construction tool called
AJcFgraph Builder. The tool can be applied in many software
engineering tasks in the context of AOSD, such as software testing,
software metrics, and so forth.
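For readers unfamiliar with the representation, a control flow graph is simply a directed graph of program points supporting queries such as reachability. A minimal sketch follows; the node names and this tiny API are assumptions for illustration, not AJcFgraph Builder's actual interface.

```python
# Minimal control-flow-graph sketch: adjacency lists plus the
# reachability query that many static analyses are built on.

class CFG:
    def __init__(self):
        self.succ = {}                      # node -> list of successor nodes

    def add_edge(self, src, dst):
        self.succ.setdefault(src, []).append(dst)
        self.succ.setdefault(dst, [])       # make sure dst is a known node

    def reachable(self, entry):
        """All nodes reachable from entry via depth-first search."""
        seen, stack = set(), [entry]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(self.succ.get(n, []))
        return seen
```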
Abstract: Instead of traditional (nominal) classification, we investigate
the subject of ordinal classification, or ranking. An enhanced
method based on an ensemble of Support Vector Machines (SVMs)
is proposed. Each binary classifier is trained with specific weights
for each object in the training data set. Experiments on benchmark
datasets and synthetic data indicate that the performance of our
approach is comparable to state-of-the-art kernel methods for
ordinal regression. The ensemble method, which is straightforward
to implement, provides a very good sensitivity-specificity trade-off
for the highest and lowest ranks.
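The ordinal-to-binary reduction such an ensemble builds on can be illustrated as follows. This is the standard ordered-partition ("is rank greater than k?") decomposition; a weighted threshold stump stands in for each binary SVM so the sketch stays self-contained, and the paper's specific per-object weighting scheme is not reproduced.

```python
# Ordinal classification via an ensemble of binary classifiers:
# one "rank > k" classifier per rank boundary, combined by counting votes.

def train_stump(xs, ys, ws):
    """Weighted 1-D threshold stump: predict True iff x > t; pick best t."""
    best_t, best_err = None, float("inf")
    cands = sorted(set(xs))
    for t in [min(cands) - 1] + cands:
        err = sum(w for x, y, w in zip(xs, ys, ws) if (x > t) != y)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def fit_ordinal(xs, ranks, n_ranks, weights=None):
    ws = weights or [1.0] * len(xs)
    # one binary problem per rank boundary k: target is (rank > k)
    return [train_stump(xs, [r > k for r in ranks], ws)
            for k in range(n_ranks - 1)]

def predict_ordinal(thresholds, x):
    # predicted rank = number of "rank > k" classifiers voting positive
    return sum(x > t for t in thresholds)
```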
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of the numerous applications linked to analysis of the Earth's
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques using images with different spatial and spectral
resolutions acquired on different dates. The work was based on the principle
of control charts, setting the upper and lower limits beyond
which a change is noted. The a contrario approach was then
used, testing different thresholds at which the
difference calculated between two pixels becomes significant. Finally,
labeled images with a particularly low difference were considered,
so that the number of "false changes" could be estimated
with respect to a given limit.
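The control-chart step can be sketched as follows, assuming flattened, co-registered images and the conventional three-sigma limit width (the abstract does not specify the exact limits used in the paper):

```python
# Control-chart change detection: pixel-wise differences between two
# co-registered images are flagged as "changed" when they fall outside
# the limits mean +/- k*sigma computed over all differences.
from statistics import mean, stdev

def change_mask(img_a, img_b, k=3.0):
    diffs = [b - a for a, b in zip(img_a, img_b)]   # flattened images
    m, s = mean(diffs), stdev(diffs)
    lo, hi = m - k * s, m + k * s                   # control limits
    return [not (lo <= d <= hi) for d in diffs]
```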
Abstract: Many high-risk pathogens that cause disease in
humans are transmitted through various food items. Food-borne
disease constitutes a major public health problem. Assessment of the
quality and safety of foods is important to human health. Rapid and
easy detection of pathogenic organisms will facilitate precautionary
measures to maintain healthy food. The Polymerase Chain Reaction
(PCR) is a handy tool for rapid detection of low numbers of bacteria.
We have designed gene-specific primers for the most common food-borne
pathogens, such as Staphylococci, Salmonella, and E. coli.
Bacteria were isolated from food samples from various food outlets and
identified using gene-specific PCRs. We identified Staphylococci,
Salmonella, and E. coli O157 in various food samples using gene-specific
primers by a rapid and direct PCR technique. This study helps us to
obtain a complete picture of the various pathogens that threaten to
cause and spread food-borne diseases, and it would also enable the
establishment of a routine procedure and methodology for rapid
identification of food-borne bacteria using the rapid technique of
direct PCR. The study will also enable us to judge the efficiency of the
present food safety steps taken by food manufacturers and exporters.
Abstract: In this paper, we propose a geometric modeling of the
illumination of a patterned image containing etched transistors. The
image is captured by a commercial camera during the inspection of
a TFT-LCD panel. Defect inspection is an important process in the
production of LCD panels, but regional differences in brightness,
which have a negative effect on the inspection, are due to the uneven
illumination environment. In order to solve this problem, we present
a geometric model of the illumination consisting of an interpolation
using the least squares method and 3D modeling using a Bézier surface.
Thanks to the sampling method, our computational time is shorter
than that of previous methods. Moreover, the model can be further used to
correct the brightness of every patterned image.
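The Bézier-surface part of such a model can be sketched as follows; the control grid here is an illustrative stand-in for the values the least-squares interpolation step would produce:

```python
# Evaluating a Bezier surface patch over a grid of scalar control
# values (e.g. brightness), using Bernstein basis polynomials.
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_surface(ctrl, u, v):
    """ctrl: (n+1) x (m+1) grid of control values; u, v in [0, 1]."""
    n, m = len(ctrl) - 1, len(ctrl[0]) - 1
    return sum(bernstein(n, i, u) * bernstein(m, j, v) * ctrl[i][j]
               for i in range(n + 1) for j in range(m + 1))
```

For a 2x2 control grid this reduces to bilinear interpolation; larger grids give the smooth surface used to model slowly varying illumination.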
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors in multiple
facilities. The data collected from the sensors have several different
characteristics due to variables such as product type, former processes,
and recipes. In general, Statistical Quality Control (SQC) methods
assume normality of the data to detect out-of-control states of
processes. When the collected data have different characteristics,
using them as inputs to SQC will increase the variation of the data,
require wide control limits, and decrease the ability to detect out-of-
control states. Therefore, it is necessary to separate similar data groups
from the mixed data for more accurate process control. In this paper,
we propose a regression tree using a split algorithm based on the Pearson
distribution to handle non-normal distributions in a parametric method.
The regression tree finds similar properties of the data across different
variables. Experiments using real semiconductor manufacturing
process data show improved fault detection performance.
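The split search of a basic regression tree can be sketched as below. Note that this sketch uses the usual variance-reduction (squared-error) criterion; the paper's contribution is a split criterion based on the Pearson distribution family, which is not reproduced here.

```python
# Exhaustive search for the best binary split of a single numeric
# variable in a regression tree, scored by reduction in squared error.

def sse(ys):
    """Sum of squared errors of ys around their mean."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    """Return (threshold, gain) maximizing the reduction in squared error."""
    best_t, best_gain = None, 0.0
    parent = sse(ys)
    for t in sorted(set(xs))[:-1]:          # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        gain = parent - sse(left) - sse(right)
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```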
Abstract: The majority of pepper farmers in Malaysia use the
open-sun method for drying pepper berries. This method is time
consuming and exposes the berries to rain and contamination. A
maintenance-friendly and properly enclosed dryer is therefore
desired. A dryer design with a solar collector and a chimney was
studied and adapted to suit the needs of small-scale pepper farmers in
Malaysia. The dryer provides an environment with the optimum
operating temperature for drying pepper berries. The dryer
model was evaluated using commercially available computational
fluid dynamics (CFD) software in order to understand the heat and
mass transfer inside the dryer. Natural convection was the only mode
of heat transport considered in this study, in accordance with the
idea of a simple and maintenance-friendly design. To
accommodate the effect of the low buoyancy found in natural convection
dryers, a biomass burner was integrated into the solar dryer design.
Abstract: If an organization like Mellat Bank wants to identify its
customer market completely in order to reach its specified goals, it can
segment the market and offer the right product package to the right segment.
Our objective is to offer a segmentation model for the Iranian banking
market from Mellat Bank's point of view. The methodology of this project
combines "segmentation on the basis of four qualitative
variables" and "segmentation on the basis of difference in means".
The required data were gathered from e-systems and the researcher's personal
observations. Finally, the research proposes that the organization first
form a four-dimensional matrix with 756 segments using four
variables named value-based, behavioral, activity style, and activity
level; then, as a second step, calculate the mean profit for every
cell of the matrix at two distinct work levels (α1: normal
conditions and α2: high-pressure conditions) and compare the segments
by checking two conditions, namely 1) the homogeneity of every segment
with its sub-segments and 2) its heterogeneity with other segments;
it can then carry out the necessary segmentation process. Finally, the last
proposal (further explained by an operational example and a feedback
algorithm) is to test and update the model because of the dynamic
environment, technology, and banking system.
Abstract: Data mining incorporates a group of statistical
methods used to analyze a set of information, or a data set. It operates
with models and algorithms, which are powerful tools with great
potential. They can help people understand the patterns in a certain
chunk of information, so it is obvious that data mining tools have
a wide area of application. For example, in theoretical chemistry,
data mining tools can be used to predict molecule properties or
improve computer-assisted drug design. Classification analysis is one
of the major data mining methodologies. The aim of this contribution
is to create a classification model able to deal with a
huge data set with high accuracy. For this purpose, logistic regression,
Bayesian logistic regression, and random forest models were built
using the R software. A Bayesian logistic regression model in the Latent GOLD
software was created as well. These classification methods belong to the
supervised learning methods.
It was necessary to reduce the dimension of the data matrix before
constructing the models, and thus factor analysis (FA) was used. The models
were applied to predict the biological activity of molecules, potential
new drug candidates.
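As a minimal stand-in for the classification models discussed (the study itself used R and Latent GOLD), a plain logistic regression fitted by stochastic gradient descent can be sketched as follows, with illustrative data:

```python
# Logistic regression trained by stochastic gradient descent on the
# log-loss; predicts class 1 when the estimated probability is >= 0.5.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logreg(X, y, lr=0.5, epochs=500):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            g = p - yi                          # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return int(sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5)
```

In the study, the inputs to such a model would be the factor scores produced by the FA dimension-reduction step rather than raw descriptors.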
Abstract: Evolutionary Programming (EP) represents a branch
of Evolutionary Algorithms (EA) in which mutation is
considered the main reproduction operator. This paper presents a
novel EP approach to Artificial Neural Network (ANN) learning.
The proposed strategy consists of two components: a self-adaptive
component, which contains phenotype information, and a dynamic
component, which is described by the genotype. Self-adaptation is
achieved by the addition of a value, called the network weight, which
depends on the total number of hidden layers and the average number
of neurons in the hidden layers. The dynamic component changes its
value depending on the fitness of the chromosome exposed to mutation.
Thus, the mutation step size is controlled by two components,
encapsulated in the algorithm, which adjust it according to the
characteristics of the predefined ANN architecture and the fitness of
the particular chromosome. A comparative analysis of the proposed
approach and classical EP (Gaussian mutation) showed that a
significant acceleration of the evolution process is achieved by using
both phenotype and genotype information in the mutation strategy.
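The classical EP baseline mentioned above (Gaussian mutation with a self-adaptive step size) can be sketched as follows, minimizing a toy fitness function in place of an ANN error; the paper's network-weight and fitness-dependent components are its contribution and are not reproduced here.

```python
# Classical Evolutionary Programming sketch: each individual carries a
# real-valued chromosome and a mutation step size; mutation self-adapts
# the step size, then perturbs every gene with Gaussian noise.
import math
import random

def mutate(weights, sigma):
    tau = 1.0 / math.sqrt(2.0 * len(weights))   # common learning rate choice
    new_sigma = sigma * math.exp(tau * random.gauss(0, 1))
    new_w = [w + new_sigma * random.gauss(0, 1) for w in weights]
    return new_w, new_sigma

def evolve(fitness, dim, pop_size=20, gens=50, seed=0):
    """Minimize fitness; elitist (mu + mu) selection keeps the best."""
    random.seed(seed)
    pop = [([random.uniform(-1, 1) for _ in range(dim)], 0.5)
           for _ in range(pop_size)]
    for _ in range(gens):
        children = [mutate(w, s) for w, s in pop]
        pop = sorted(pop + children, key=lambda ind: fitness(ind[0]))[:pop_size]
    return pop[0]
```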
Abstract: A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are utilized together in order to create a speech coding algorithm assuring better signal quality than a traditional CELP parametric codec. Two approaches are discussed. One is based on the selection of voiced signal components, which are encoded using a parametric algorithm; unvoiced components, which are encoded perceptually; and transients, which remain unencoded. The second approach uses perceptual encoding of the residual signal in a CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved using the proposed hybrid codec is compared to the quality of some standard speech codecs.