Abstract: The Artificial Immune System has been applied as a heuristic
algorithm for decades. Nevertheless, many of these applications
take advantage of the algorithm's benefits but seldom propose
approaches for enhancing its efficiency. In this paper, a
Self-evolving Artificial Immune System is proposed by developing
the T and B cells of the immune system and building a self-evolving
mechanism that adapts to the complexities of different problems. This
research focuses on enhancing the efficiency of clonal selection,
which is responsible for producing high-affinity antibodies to resist
invading antigens. T and B cells are the main mechanisms by which
clonal selection produces different combinations of antibodies.
Therefore, the development of the T and B cells influences the
efficiency of clonal selection in searching for better solutions.
Furthermore, for better cooperation between the two cells, a
co-evolutionary strategy is applied to coordinate them for more
effective production of antibodies. This work finally adopts flow-shop
scheduling instances from the OR-Library to validate the proposed algorithm.
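The clonal selection loop that the T and B cell mechanisms drive can be sketched as follows. This is a minimal CLONALG-style illustration, not the authors' self-evolving variant: the fitness function, population sizes, and mutation schedule are assumptions chosen for the example.

```python
import random

def clonal_selection(fitness, dim, pop_size=20, n_clones=5,
                     mutate_scale=0.5, generations=100, seed=0):
    """Minimal CLONALG-style loop: clone the best antibodies,
    hypermutate the clones, and keep the fittest survivors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # lowest fitness = best
        clones = []
        for rank, antibody in enumerate(pop[:pop_size // 2]):
            for _ in range(n_clones):
                # hypermutation: lower-ranked antibodies mutate more
                scale = mutate_scale * (rank + 1) / pop_size
                clones.append([x + rng.gauss(0, scale) for x in antibody])
        pop = sorted(pop + clones, key=fitness)[:pop_size]
    return min(pop, key=fitness)

# usage: minimise the sphere function as a stand-in objective
best = clonal_selection(lambda v: sum(x * x for x in v), dim=3)
```

For a scheduling problem the real-valued antibodies would be replaced by permutations and the Gaussian mutation by swap or insertion moves.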
Abstract: The cables in a nuclear power plant are designed to be
used for about 40 years in a safe operating environment. However, the
heat and radiation in the plant cause rapid performance deterioration
of the cables in nuclear vessels and heat exchangers, which makes
cable lifetime estimation necessary. The most accurate method of
estimating cable lifetime is to evaluate the cables in a laboratory.
However, removing cables while the plant is operating is not allowed
because of safety and cost concerns. In this paper, a robot system for
estimating cable lifetime in nuclear power plants is developed and
tested. The developed robot system can calculate a modulus value to
estimate the cable lifetime even while the nuclear power plant is in
operation.
Abstract: The study of fire and explosion is very important,
particularly in the oil and gas industries, because of the many
accidents reported in the past and present. In this work, we
investigated the flammability of bio-oil vapour mixtures, which may
contribute to fire during storage and transportation. A bio-oil
sample derived from palm kernel shell was analysed using Gas
Chromatography-Mass Spectrometry (GC-MS) to examine the
composition of the sample. Mole fractions of 12 selected
components in the liquid phase were obtained from the GC-FID data
and used to calculate the mole fractions of the components in the gas
phase via modified Raoult's law. Lower Flammability Limits (LFLs) and
Upper Flammability Limits (UFLs) for individual components were
obtained from the published literature; for components whose
flammability limit values are not available in the literature, the
stoichiometric concentration method was used to calculate them.
The LFL and UFL values for the mixture
were calculated using the Le Chatelier equation. The LFLmix and
UFLmix values were used to construct a flammability diagram and
subsequently to determine the flammability of the mixture. The
findings of this study can be used to propose suitable inherently
safer methods to prevent a flammable mixture from forming and
to minimize the loss of property, business, and life due to fire
accidents in bio-oil production.
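The Le Chatelier mixing rule used for LFLmix and UFLmix is a one-line calculation. The sketch below uses two well-known fuels (methane and carbon monoxide) as illustrative stand-ins for the bio-oil vapour species; the mole fractions are assumed values, not the paper's GC-FID results.

```python
def le_chatelier(mole_fractions, limits):
    """Le Chatelier's rule for a fuel mixture: L_mix = 1 / sum(y_i / L_i),
    where y_i are the combustible mole fractions normalised to sum to 1
    and L_i is each component's LFL (or UFL) in vol%."""
    total = sum(mole_fractions)
    return 1.0 / sum((y / total) / L for y, L in zip(mole_fractions, limits))

# illustrative mixture: 60% methane (LFL 5.0, UFL 15.0 vol%)
#                       40% carbon monoxide (LFL 12.5, UFL 74.0 vol%)
y = [0.6, 0.4]
lfl_mix = le_chatelier(y, [5.0, 12.5])   # about 6.6 vol%
ufl_mix = le_chatelier(y, [15.0, 74.0])  # about 22.0 vol%
```

The mixture is flammable when its concentration in air lies between lfl_mix and ufl_mix, which is what the flammability diagram visualises.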
Abstract: This paper describes the process of recognizing and classifying brain images as normal or abnormal based on PSO-SVM. Image classification is becoming increasingly important in the medical diagnosis process. In the medical field, classifying a patient's abnormality plays a great role in helping doctors diagnose the patient according to the severity of the disease. For DICOM images, optimal recognition and early detection of diseases are very difficult. Our work focuses on the recognition and classification of DICOM images based on a collective approach to digital image processing. For optimal recognition and classification, Particle Swarm Optimization (PSO), a Genetic Algorithm (GA), and a Support Vector Machine (SVM) are used. The collective PSO-SVM approach gives high approximation capability and much faster convergence.
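A sketch of the PSO component of such an approach is given below. The inertia and acceleration coefficients are common textbook defaults rather than the authors' settings, and the SVM training step is replaced by a stand-in objective (a quadratic "validation error" surface) so the example stays self-contained.

```python
import random

def pso(objective, dim, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, bound=5.0, seed=1):
    """Textbook particle swarm optimisation (minimisation)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# e.g. tuning two hypothetical SVM hyperparameters by minimising a
# stand-in validation-error surface with its optimum at (2, 1):
best, err = pso(lambda p: (p[0] - 2) ** 2 + (p[1] - 1) ** 2, dim=2)
```

In the actual pipeline, `objective` would train an SVM with the candidate hyperparameters and return a cross-validation error on the image features.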
Abstract: Currently, there are many local area industrial networks
that can give guaranteed bandwidth to synchronous traffic, particularly
by providing CBR (Constant Bit Rate) channels, which allow
improved bandwidth management. Some such networks operate
over Ethernet, delivering channels with enough capacity, especially
with compression, to integrate multimedia traffic in industrial monitoring
and image processing applications with many sources. In
these industrial environments, where low latency is an essential
requirement, JPEG is an adequate compression technique, but it
generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic
in CBR channels is inefficient, and current solutions to this problem
significantly increase the latency or further degrade the quality. In
this paper an R(q) model is used that allows on-line calculation of
the JPEG quantization factor. We obtained increased quality and a lower
capacity requirement for the CBR channel, with a reduced number of
discarded frames and better use of the channel bandwidth.
Abstract: This paper deals with the efficient computation of
probability coefficients, which offers computational simplicity
compared to spectral coefficients. It eliminates the need for inner
product evaluations in determining the signature of a combinational
circuit realizing a given Boolean function. Methods for computing
the probability coefficients using a transform matrix, a fast transform,
and BDDs are given. Theoretical relations for the achievable
computational advantage, in terms of the additions required to compute
all 2^n probability coefficients of an n-variable function, have been
developed. It is shown that for n ≥ 5, only 50% of the additions are
needed to compute all probability coefficients compared to spectral
coefficients. Fault detection techniques based on the spectral
signature can also be used with the probability signature to gain this
computational advantage.
Abstract: The article presents test results on the changes
occurring in sewage sludge during storage. Tests
were conducted on mechanically dehydrated sewage sludge derived
from large municipal sewage treatment plants equipped with
biological sewage treatment systems. The testing presented in the paper
focused on the basic fuel properties of sewage sludge: moisture
content, heat of combustion, and carbon content. The first part of the
article presents an overview of the issues concerning sewage sludge
management and explains the genesis of the tests.
Further in the paper, selected results of the conducted tests are discussed.
Changes in the tested parameters were determined over a 10-month
sludge storage period.
Abstract: The equilibrium chemical reactions taking place in a converter reactor of the Khorasan Petrochemical Ammonia plant were studied using minimization of the Gibbs free energy. In minimizing the Gibbs free energy function, the Davidon-Fletcher-Powell (DFP) optimization procedure was used with penalty terms in a well-defined objective function. It should be noted that in the DFP procedure with the corresponding penalty terms, the Hessian matrices for the composition of the constituents in the converter reactor can be excluded; this can be considered the main advantage of the DFP optimization procedure. The effect of temperature and pressure on the equilibrium composition of the constituents was also investigated. The results obtained in this work were compared with data collected from the converter reactor of the Khorasan Petrochemical Ammonia plant, and it was concluded that they are in good agreement with the industrial data. Notably, the algorithm developed in this work, in spite of its simplicity, benefits from short computation and convergence times.
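The DFP quasi-Newton update at the core of the procedure can be sketched in a few lines: it maintains an inverse-Hessian approximation from gradient differences, which is why explicit Hessian matrices can be excluded. The quadratic test function, fixed step size, and starting point below are illustrative assumptions; the actual work applies the update to a penalised Gibbs free energy objective.

```python
import numpy as np

def dfp_minimize(grad, x0, iters=100, alpha=0.1):
    """Davidon-Fletcher-Powell quasi-Newton minimisation with a
    fixed step length (a line search would be used in practice)."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))            # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        d = -H @ g                # quasi-Newton search direction
        s = alpha * d             # step taken this iteration
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:         # curvature condition keeps H positive definite
            Hy = H @ y
            # DFP rank-two update of the inverse Hessian
            H += np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x

# illustrative quadratic f(x) = (x0-1)^2 + 2*(x1+2)^2, minimum at (1, -2)
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = dfp_minimize(grad, [0.0, 0.0])
```

Only gradients are supplied; the curvature information accumulates in `H` through the rank-two update.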
Abstract: In an era of knowledge explosion, data grows
rapidly day by day. Since data storage is a limited resource,
how to reduce the space data occupies becomes a challenging issue.
Data compression provides a good solution by lowering the
required space. Data mining has found many useful applications in recent
years because it can help users discover interesting knowledge in large
databases. However, existing compression algorithms are not
appropriate for data mining. In [1, 2], two different approaches were
proposed to compress databases and then perform the data mining
process. However, they lack the ability to decompress the data to
its original state and to improve data mining performance. In this
research a new approach called Mining Merged Transactions with the
Quantification Table (M2TQT) is proposed to solve these problems.
M2TQT uses the relationships between transactions to merge related
transactions and builds a quantification table to prune the candidate
itemsets that cannot become frequent, in order to improve
the performance of mining association rules. The experiments show
that M2TQT performs better than existing approaches.
Abstract: One major difficulty facing developers of
concurrent and distributed software is the analysis of concurrency-based
faults such as deadlocks. Petri nets are used extensively in
verifying the correctness of concurrent programs. ECATNets [2] are
a category of algebraic Petri nets based on a sound combination of
algebraic abstract data types and high-level Petri nets. ECATNets have
sound and complete semantics because of their integration in
rewriting logic [12] and its programming language Maude [13].
Rewriting logic is considered one of the most powerful logics for the
description, verification, and programming of concurrent systems.
In [4] we proposed a method for translating Ada-95 tasking
programs into the ECATNets formalism (Ada-ECATNet). In this paper,
we show that the ECATNets formalism provides a more compact
translation of Ada programs than other approaches based
on simple Petri nets or Colored Petri nets (CPNs). Such a translation
reduces not only the size of the program but also the number
of program states. We also show how this compact Ada-ECATNet
can be further reduced by applying reduction rules to it. This double
reduction of the Ada-ECATNet permits a considerable reduction of
the memory space and run time of the corresponding Maude program.
Abstract: This paper presents a remote on-line diagnostic system
for vehicles that uses On-Board Diagnostics (OBD), GPS, and 3G
techniques. The main parts of the proposed system are an on-board
computer, a vehicle monitor server, and a vehicle status browser. First,
the on-board computer obtains the location of the driver and the vehicle
status from the GPS receiver and the OBD interface, respectively. The
on-board computer then connects with the vehicle monitor server
through the 3G network to transmit the real-time vehicle system status.
Finally, the vehicle status browser shows the remote vehicle status,
including vehicle speed, engine rpm, battery voltage, engine coolant
temperature, and diagnostic trouble codes. According to the
experimental results, the proposed system helps fleet managers and
mechanics understand the remote vehicle status. This system can
therefore decrease fleet-management and vehicle-repair time, because
fleet managers and mechanics can find the diagnostic trouble messages
in time.
Abstract: This paper analyzes the perception of e-commerce
application services by construction material traders in Malaysia.
Five attributes were tested: usability, reputation, trust, privacy and
familiarity. The study methodology consists of a survey questionnaire
and statistical analysis that includes reliability analysis, factor analysis,
ANOVA and regression analysis. The respondents were construction
material traders, including hardware stores, in the Klang Valley, Kuala
Lumpur.
The findings support that usability of, and familiarity with, e-commerce
services in Malaysia have an insignificant influence on the acceptance of
e-commerce applications. However, the reputation, trust and privacy
attributes have a significant influence on the choice of e-commerce
acceptance by construction material traders. The e-commerce
applications studied included customer databases, e-selling, e-marketing,
e-payment, e-buying and online advertising. It is assumed
that the traders have basic knowledge of, and exposure to, ICT
services, i.e., Internet service and computers. The study concludes that
reputation, privacy and trust are the three website attributes that
influence the acceptance of e-commerce by construction material
traders.
Abstract: Monitoring tool flank wear without affecting
throughput is considered a prudent method in production
technology; the examination has to be done without affecting the
machining process. In this paper we propose a novel method for
determining tool flank wear by observing the sound signals
emitted during the turning process. The work-piece materials used
were steel and aluminum, and the cutting insert was of carbide
material. Two different cutting speeds were used in this work. The
feed rate and the cutting depth were constant, whereas the flank wear
was a variable. The sound signals emitted by a fresh tool (0 mm flank
wear), a slightly worn tool (0.2-0.25 mm flank wear), and a severely
worn tool (0.4 mm and above flank wear) during the turning process were
recorded separately using a highly sensitive microphone. Singular
Value Decomposition (SVD) was applied to these sound
signals to extract the characteristic sound components. The
results showed that an increase in tool flank wear correlates with an
increase in the values of the SVD features produced from the sound
signals for both materials. Hence it can be concluded that wear
monitoring of the tool flank during the turning process, using SVD features
with Fuzzy C-means classification of the emitted sound signal, is
a promising and relatively simple method.
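The SVD feature extraction step can be sketched as follows: the recorded sound signal is framed into a matrix and its leading singular values serve as the features. The frame length, the synthetic test tone, and the added noise standing in for a worn tool are assumptions made for illustration, not the paper's recording setup.

```python
import numpy as np

def svd_features(signal, frame_len=64, n_features=5):
    """Frame a 1-D sound signal into a matrix (one frame per row)
    and use the largest singular values as wear features."""
    n_frames = len(signal) // frame_len
    X = np.reshape(signal[:n_frames * frame_len], (n_frames, frame_len))
    s = np.linalg.svd(X, compute_uv=False)   # singular values, descending
    return s[:n_features]

# illustrative signals at an assumed 8 kHz sampling rate: a clean 440 Hz
# tone, and the same tone with broadband noise standing in for wear
t = np.arange(4096) / 8000.0
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 440 * t)
noisy = clean + 0.5 * rng.standard_normal(t.size)
f_clean, f_noisy = svd_features(clean), svd_features(noisy)
```

A pure tone framed this way yields a matrix of rank 2, so its trailing singular values are near zero; broadband content raises them, which is the behaviour the wear features exploit.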
Abstract: This paper aims to develop a robust optimization methodology for the mechatronic modules of machine tools by considering all the important characteristics from the structural and control domains in one single process. The relationship between these two domains is strongly coupled; in order to reduce the disturbance caused by the parameters of either one, the mechanical and controller design domains need to be integrated. Therefore, the concurrent integrated design method Design For Control (DFC) is employed in this paper. In this context, it is applied not only to achieve minimal power consumption but also to enhance the structural performance and system response at the same time. To investigate the method for integrated optimization, a mechatronic feed drive system of a machine tool is used as a design platform. Pro/ENGINEER and ANSYS are first used to build the 3D model and to analyze and design structural parameters such as elastic deformation, natural frequency and component size, based on their effects on and sensitivities to the structure. In addition, a robust controller based on Quantitative Feedback Theory (QFT) is applied to determine proper control parameters for the controller. The overall physical properties of the machine tool are thereby obtained in the initial stage. Finally, the design-for-control technology is carried out to modify the structural and control parameters to achieve overall system performance. Hence, the corresponding productivity is expected to be greatly improved.
Abstract: In order to study the effect of plant density and
competition from field bindweed (Convolvulus arvensis) on the
yield and agronomic properties of wheat (Triticum sativum) under
irrigated conditions, a factorial experiment based on a complete
randomized block design with three replications was conducted in the
field at Kamalvand in the Khoramabad (Lorestan) region of Iran during
2008-2009. Three plant densities (Factor A = 200, 230 and 260 kg/ha),
three cultivars (Factor B = Bahar, Pishtaz and Alvand) and weed control
(Factor C = control and no control of weeds) were assigned in the
experiment. The results show that plant density had a statistically
significant effect on seed yield, 1000-seed weight, weed density and
dry weight of weeds, and that seed yield and harvest index differed
significantly among cultivars. The interactions between plant density and
cultivar for weed density, seed yield, 1000-seed weight and
harvest index were significant. A plant density of 260 kg/ha had the
greatest effect on increasing the seed yield of the Bahar cultivar in the
Khoramabad region of Iran.
Abstract: Safety, the river environment, and sediment utilization are the elements targeted by sediment management. Since a change in one element caused by sediment management may affect the other two, and the priority among the three elements depends on the stakeholders, it is necessary to develop a method to evaluate the effect of sediment management on each element, together with an integrated evaluation method for the socio-economic effect. In this study, taking the Mount Merapi basin as the investigation field, such a method for an active volcanic basin was developed. An integrated evaluation method for sediment management was discussed from a socio-economic point of view covering safety, environment, and sediment utilization, and a case study of sediment management was evaluated by means of this method. To evaluate the effect of sediment management, several parameters on safety, utilization, and environment were introduced. From the utilization point of view, job opportunities, the additional income of local people, and the tax income to the local government were used to evaluate the effectiveness of sediment management. The risk degree of river infrastructure was used to describe the effect of sediment management on the safety aspect. To evaluate the effects of sediment management on the environment, the mean diameter of the grain size distribution of the riverbed surface was used. On the coordinate system designating these elements, the direction of the change in basin condition caused by sediment management can be predicted, so that the most preferable sediment management can be decided. The results indicate that the sediment management cases tend to have negative impacts on sediment utilization, but positive impacts on the safety and environmental conditions. The evaluation from a socio-economic point of view shows that the sediment management case study reduces job opportunities and additional income for inhabitants as well as tax income for the government.
Therefore, another policy for creating job opportunities for inhabitants is necessary to support these sediment management measures.
Abstract: Abrasive waterjet machining is a novel process capable of machining a wide range of hard-to-machine materials. This research addresses the modeling and optimization of the process parameters for this machining technique. To model the process, a set of experimental data was used to evaluate the effects of various parameter settings in cutting 6063-T6 aluminum alloy. The process variables considered here are nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. Depth of cut, one of the most important output characteristics, was evaluated for different parameter settings. The Taguchi method and regression modeling are used to establish the relationships between the input and output parameters, and the adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. The pairwise effects of the process parameter settings on the process response outputs are also shown graphically. The proposed model is then embedded into a simulated annealing algorithm to optimize the process parameters. The optimization can be carried out for any desired value of depth of cut; the objective is to determine proper levels of the process parameters in order to obtain a certain depth of cut. Computational results demonstrate that the proposed solution procedure is quite effective in solving such multi-variable problems.
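The simulated annealing stage can be sketched generically. The cost function below is a stand-in for the fitted regression model (penalising deviation from a target response), and the cooling schedule, step size, and target values are common defaults and assumed numbers, not the study's settings.

```python
import math, random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=500, seed=0):
    """Classic simulated annealing: accept worse moves with
    probability exp(-delta / T), cooling T geometrically."""
    rng = random.Random(seed)
    x, c = list(x0), cost(x0)
    best, best_c = x[:], c
    T = t0
    for _ in range(iters):
        cand = [v + rng.uniform(-step, step) for v in x]   # random neighbour
        delta = cost(cand) - c
        if delta < 0 or rng.random() < math.exp(-delta / T):
            x, c = cand, c + delta
            if c < best_c:
                best, best_c = x[:], c
        T *= cooling
    return best, best_c

# stand-in for the fitted depth-of-cut model: find parameter levels
# whose predicted response matches a target, optimum at (1.0, 2.0, 0.5)
target = (1.0, 2.0, 0.5)
cost = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target))
best, best_c = simulated_annealing(cost, [0.0, 0.0, 0.0])
```

In the paper's setting, `cost` would be the squared difference between the regression model's predicted depth of cut and the desired depth.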
Abstract: This paper presents a sensing system for 3D sensing
and mapping by a tracked mobile robot with an arm-type sensor
movable unit and a laser range finder (LRF). The arm-type sensor
movable unit is mounted on the robot, and the LRF is installed at the
end of the unit. This mechanism enables the sensor to change its position
and orientation so that it avoids occlusions caused by the terrain.
The sensing system is also able to change the height of
the LRF while keeping its orientation flat, for efficient sensing. In this kind
of mapping, it may be difficult for a moving robot to apply mapping
algorithms such as the Iterative Closest Point (ICP) algorithm, because
the sets of 2D data at each sensor height may be far apart on a common
surface. The authors therefore applied interpolation to generate
plausible model data for ICP. The results of several experiments
validated this kind of sensing and mapping with the proposed system.
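A minimal 2-D ICP iteration (nearest-neighbour matching followed by an SVD-based rigid alignment) is sketched below on synthetic points. The interpolation used in the paper to generate plausible model data is omitted, and the point sets, rotation, and translation are illustrative assumptions.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal 2-D ICP: match each source point to its nearest
    destination point, then solve the rigid transform via SVD."""
    src = np.asarray(src, float).copy()
    dst = np.asarray(dst, float)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force)
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # best rigid transform via the Kabsch/SVD method
        mu_s, mu_d = src.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (matched - mu_d))
        R = (U @ Vt).T
        if np.linalg.det(R) < 0:          # avoid reflections
            Vt[-1] *= -1
            R = (U @ Vt).T
        t = mu_d - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return src, R_total, t_total

# synthetic check: recover a small known rotation plus translation
rng = np.random.default_rng(0)
dst = rng.uniform(0, 10, size=(50, 2))
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
src = (dst - dst.mean(0)) @ R.T + dst.mean(0) + np.array([0.3, -0.2])
aligned, _, _ = icp_2d(src, dst)
```

When consecutive scans are too far apart on a common surface, as the paper notes, the nearest-neighbour step mismatches points, which is what the interpolated model data is meant to prevent.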
Abstract: This paper considers a scheduling problem in a flexible
flow shop environment with the aim of minimizing two important
criteria: makespan and the cumulative tardiness of jobs. Since
the proposed problem is known in the literature to be NP-hard,
a meta-heuristic must be developed to solve it. We considered the
general structure of the Genetic Algorithm (GA) and developed a new
version of it based on Data Envelopment Analysis (DEA). The two
objective functions are treated as two different inputs for each Decision
Making Unit (DMU). In this paper we focus on the efficiency scores of
the DMUs and the efficient-frontier concept of the DEA technique. After
introducing the method, we define two different scenarios with
two types of mutation operator. We also provide an
experimental design with computational results to show the
performance of the algorithm. The results show that the algorithm
runs in a reasonable time.
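The efficiency-score idea behind DEA can be illustrated in its simplest special case. With a single input and a single output, the CCR efficiency reduces to each DMU's output/input ratio divided by the best ratio; the full model with two inputs (makespan and cumulative tardiness), as used in the paper, requires solving a linear program per DMU and is omitted here. The sample values are illustrative.

```python
def dea_efficiency(inputs, outputs):
    """Single-input, single-output CCR efficiency scores:
    each DMU's output/input ratio divided by the best ratio,
    so efficient DMUs score exactly 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# DMUs = three candidate schedules; input = makespan, output held constant,
# so a lower makespan yields a higher efficiency score
eff = dea_efficiency([10.0, 12.0, 15.0], [1.0, 1.0, 1.0])
```

In the GA described above, such scores would rank chromosomes for selection, with efficient (score 1.0) individuals lying on the frontier.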
Abstract: In this article, the design of a Supply Chain Network
(SCN) consisting of several suppliers, production plants, distribution
centers and retailers is considered. The demands of the retailers are
treated as stochastic parameters, so we generate data via
simulation to extract a few demand scenarios. A mixed-integer
two-stage programming model is then developed to optimize
two objectives simultaneously: (1) minimization of the fixed and
variable costs, and (2) maximization of the service level. A weighting
method is used to solve this two-objective problem, and a numerical
example is presented to show the performance of the model.
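The weighting method for the two objectives can be sketched as a scalarisation: the cost objective and the (negated) service-level objective are combined with weights, and the best candidate is selected. The candidate designs and weights below are hypothetical values for illustration only.

```python
def weighting_method(designs, w_cost, w_service):
    """Scalarise (total cost, service level) pairs: minimise
    w_cost * cost - w_service * service and return the best design."""
    return min(designs, key=lambda d: w_cost * d[0] - w_service * d[1])

# hypothetical network designs as (total cost, service level) pairs
designs = [(100.0, 0.80), (120.0, 0.92), (150.0, 0.95)]
best = weighting_method(designs, w_cost=1.0, w_service=200.0)
```

Sweeping the weights traces out different trade-offs between the two objectives; in the full model the candidates are solutions of the mixed-integer two-stage program rather than an enumerated list.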