Abstract: Training neural networks to capture an intrinsic
property of a large volume of high dimensional data is a difficult
task, as the training process is computationally expensive. Input
attributes should be carefully selected to keep the dimensionality of
input vectors relatively small.
Technical indexes commonly used for stock market prediction
with neural networks are investigated to determine their
effectiveness as inputs. A feed-forward neural network trained with
the Levenberg-Marquardt algorithm is applied to perform one-step-ahead
forecasting of NASDAQ and Dow stock prices.
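As a minimal sketch of this setup, the following trains a tiny feed-forward network with a hand-rolled Levenberg-Marquardt loop on a synthetic series; the actual index data, selected input attributes, and network size are not given in the abstract, so everything below is a stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy price series standing in for the NASDAQ/Dow closes.
t = np.arange(120)
series = np.sin(0.15 * t) + 0.01 * t

# One-step-ahead windowing: predict series[k] from the previous `lag` values.
lag = 4
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]

HIDDEN = 5

def predict(p, X):
    # Tiny feed-forward net with one tanh hidden layer; all parameters
    # are flattened into the single vector p.
    n_in = X.shape[1]
    W1 = p[:n_in * HIDDEN].reshape(n_in, HIDDEN)
    b1 = p[n_in * HIDDEN:n_in * HIDDEN + HIDDEN]
    w2 = p[n_in * HIDDEN + HIDDEN:n_in * HIDDEN + 2 * HIDDEN]
    b2 = p[-1]
    return np.tanh(X @ W1 + b1) @ w2 + b2

def lm_train(p, X, y, iters=40, mu=1e-2):
    # Levenberg-Marquardt: damped Gauss-Newton steps on the residual
    # vector, with a forward-difference Jacobian for brevity.
    for _ in range(iters):
        r = predict(p, X) - y
        J = np.empty((r.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-6
            J[:, j] = (predict(p + dp, X) - r - y) / 1e-6
        step = np.linalg.solve(J.T @ J + mu * np.eye(p.size), J.T @ r)
        p_new = p - step
        if np.mean((predict(p_new, X) - y) ** 2) < np.mean(r ** 2):
            p, mu = p_new, mu * 0.5      # accept step, relax damping
        else:
            mu *= 10.0                   # reject step, increase damping
    return p

p0 = rng.normal(0.0, 0.5, size=lag * HIDDEN + 2 * HIDDEN + 1)
mse_before = np.mean((predict(p0, X) - y) ** 2)
p = lm_train(p0.copy(), X, y)
mse_after = np.mean((predict(p, X) - y) ** 2)
```

Because the step is accepted only when it lowers the error, the training loss is non-increasing by construction, which is the practical appeal of LM over plain gradient descent for small networks.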
Abstract: In this paper, the decomposition-aggregation method
is used to derive connective stability criteria for general linear
composite systems via aggregation. The large-scale system is
decomposed into a number of subsystems. By associating directed
graphs with dynamic systems in an essential way, we define the
relation between system structure and stability in the sense of
Lyapunov. The stability criteria are then expressed in terms of the
system matrices of the subsystems and of the interconnection
terms among them, using the concepts of vector differential
inequalities and vector Lyapunov functions. Then, we show that the
stability of each subsystem and stability of the aggregate model
imply connective stability of the overall system. An example is
reported, showing the efficiency of the proposed technique.
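The aggregation step can be illustrated with a small numerical sketch. The subsystem matrices, the logarithmic-norm decay estimate, and the M-matrix test below are common textbook choices for this kind of analysis, not the paper's exact construction:

```python
import numpy as np

# Two stable subsystems with weak interconnection blocks (illustrative values).
A = {
    (1, 1): np.array([[-2.0, 1.0], [0.0, -3.0]]),
    (2, 2): np.array([[-4.0, 0.5], [0.0, -2.5]]),
    (1, 2): 0.1 * np.ones((2, 2)),
    (2, 1): 0.2 * np.ones((2, 2)),
}

def decay(Aii):
    # Subsystem decay rate via the logarithmic norm: minus the largest
    # eigenvalue of the symmetric part (matches V_i(x) = ||x||).
    return -np.linalg.eigvalsh((Aii + Aii.T) / 2).max()

def gain(Aij):
    # Interconnection strength as the spectral norm of the coupling block.
    return np.linalg.norm(Aij, 2)

# Aggregate comparison matrix S: diagonal = subsystem decay rates,
# off-diagonal = minus the interconnection gains.
S = np.array([[decay(A[1, 1]), -gain(A[1, 2])],
              [-gain(A[2, 1]), decay(A[2, 2])]])

def is_m_matrix(S):
    # Sevastyanov-Kotelyanskii test: all leading principal minors positive.
    return all(np.linalg.det(S[:k, :k]) > 0 for k in range(1, S.shape[0] + 1))
```

If `is_m_matrix(S)` holds, stability of each subsystem plus weak coupling implies stability of the overall interconnected system, which is the essence of the decomposition-aggregation argument.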
Abstract: Air emissions from waste treatment plants often
consist of a combination of Volatile Organic Compounds (VOCs)
and odors. Hydrogen sulfide is one of the major odorous gases
present in the waste emissions coming from municipal wastewater
treatment facilities. Hydrogen sulfide (H2S) is odorous, highly toxic
and flammable. Exposure even to low concentrations can result in eye
irritation, a sore throat and cough, shortness of breath, and fluid in
the lungs. Biofiltration has become a widely accepted technology for
treating air streams containing H2S. Compared with non-biological
technologies, biofiltration is more cost-effective for treating large
volumes of air containing low concentrations of biodegradable compounds.
Optimization of the biofilter medium is essential for several reasons: it
should provide a high surface area for biofilm growth, a low pressure drop,
physical stability, and good moisture retention. In this work, a novel
biofilter medium is developed and tested at a pumping station of a
municipality located in the United Arab Emirates (UAE). The
medium is found to be very effective (>99% removal) for the H2S
concentrations expected at pumping stations under both steady-state
and shock-loading conditions.
Abstract: Society has grown to rely on Internet services, and the
number of Internet users increases every day. As more and more
users connect to the network, the window of opportunity for
malicious users to do damage grows ever larger and more
lucrative. The objective of this paper is to incorporate different
techniques into a classifier system to detect and classify intrusions
among normal network packets. Among several techniques, a Steady-State
Genetic-based Machine Learning Algorithm (SSGBML) is used to
detect intrusions; the Steady-State Genetic Algorithm (SSGA), the
Simple Genetic Algorithm (SGA), a Modified Genetic Algorithm, and the
Zeroth Level Classifier System are investigated in this research.
SSGA is used as the discovery mechanism instead of SGA, because SGA
replaces all old rules with newly produced ones, preventing good old
rules from participating in the next rule generation. The Zeroth Level
Classifier System plays the role of detector, matching
incoming environment messages against classifiers to determine whether
the current message is normal or an intrusion, and receiving feedback
from the environment. Finally, in order to attain the best results,
the modified SSGA enhances our discovery engine by using fuzzy
logic to optimize the crossover and mutation probabilities. The
experiments and evaluations of the proposed method were performed
with the KDD 99 intrusion detection dataset.
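A minimal steady-state GA can be sketched as follows; the target bit-pattern standing in for a "good" detection rule is hypothetical, and the KDD 99 rule encoding and the fuzzy-adapted operators are not reproduced:

```python
import random

random.seed(1)

# Hypothetical 8-bit target pattern standing in for an ideal detection rule.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

def fitness(rule):
    # Fraction of bits matching the target pattern.
    return sum(a == b for a, b in zip(rule, TARGET)) / len(TARGET)

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

def tournament(pop, k=3):
    # Select the fittest of k randomly sampled individuals.
    return max(random.sample(pop, k), key=fitness)

for _ in range(300):
    p1, p2 = tournament(pop), tournament(pop)
    cut = random.randrange(1, len(TARGET))
    child = p1[:cut] + p2[cut:]              # one-point crossover
    if random.random() < 0.1:                # mutation
        i = random.randrange(len(child))
        child[i] ^= 1
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    pop[worst] = child                       # steady-state: replace only the worst

best = max(pop, key=fitness)
```

The key steady-state property visible here is that only the worst rule is replaced each iteration, so good old rules survive into later generations — exactly the deficiency of generational SGA that the abstract points out.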
Abstract: The increasing complexity of software development based on peer-to-peer networks makes it necessary to create new frameworks in order to simplify the developer's task. Additionally, some applications, e.g. fire detection or security alarms, may require real-time constraints, and a high-level definition of these features eases application development. In this paper, a service model based on a component model with real-time features is proposed. The high-level model abstracts developers from implementation tasks such as discovery, communication, security, or real-time requirements. The model is oriented to deploying services on small mobile devices, such as sensors, mobile phones and PDAs, where computation is light-weight. Services can be composed with one another by means of the port concept to form complex ad-hoc systems, and their implementation is carried out using a component language called UM-RTCOM. In order to apply our proposals, a fire detection application is described.
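The port-based composition idea can be sketched as follows; the class and port names are hypothetical, and UM-RTCOM's real-time machinery is not modeled:

```python
class Component:
    # Minimal sketch of the port concept: each output port forwards
    # messages to the input port of the component it is connected to.
    def __init__(self, name):
        self.name = name
        self.connections = {}    # out_port -> (component, in_port)
        self.received = []

    def connect(self, out_port, other, in_port):
        # Wire this component's output port to another component's input port.
        self.connections[out_port] = (other, in_port)

    def send(self, out_port, message):
        target, in_port = self.connections[out_port]
        target.receive(in_port, message)

    def receive(self, in_port, message):
        self.received.append((in_port, message))

# Compose a tiny fire-detection pipeline from two services.
sensor = Component("smoke-sensor")
alarm = Component("fire-alarm")
sensor.connect("alert-out", alarm, "alert-in")
sensor.send("alert-out", "smoke detected")
```

Composition happens purely through port wiring, so the sensor needs no knowledge of the alarm's implementation — the decoupling that the service model relies on.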
Abstract: This study was conducted to investigate the incidence
of pathogenic bacteria: Salmonella, Shigella, Escherichia coli O157
and Staphylococcus aureus in cakes and tarts collected from thirty-five
confectionery producing and selling premises located within
Tripoli city, Libya. The results revealed an incidence of S. aureus
with 94.4 and 48.0 %, E. coli O157 with 14.7 and 4.0 % and Salmonella
sp. with 5.9 and 8.0 % in cakes and tarts samples respectively;
while Shigella was not detected in any sample. In order to determine
the source of these pathogenic bacteria, cotton swabs were taken
from the hands of workers on the production line, the surfaces of
preparation tables and cream whipping instruments. The results
showed that the cotton swabs obtained from the hands of workers
contained S. aureus and Salmonella sp. with an incidence of 42.9 and
2.9 %, the cotton swabs obtained from the surfaces of preparation
tables 22.9 and 2.9 % and the cotton swabs obtained from the cream
whipping instruments 14.3 and 0.0 %, respectively; while E. coli
O157 and Shigella sp. were not detected in any swab. Additionally,
other bacteria were isolated from the hands of workers and the surfaces
of production equipment, including Aeromonas sp., Pseudomonas
sp., E. coli, Klebsiella sp., Enterobacter sp., Citrobacter sp.,
Proteus sp., Serratia sp. and Acinetobacter sp. These results indicate
that some of the cakes and tarts might pose a threat to consumers'
health. Meanwhile, the occurrence of pathogenic bacteria on the hands
of production-line workers and on equipment surfaces reflects poor
hygienic practices at most of the confectionery premises examined in
this study. Thus, firm and continuous surveillance of these premises
is needed to ensure consumer health and safety.
Abstract: In this paper we analyze the core issues affecting
software architecture in enterprise projects where a large number of
people from different backgrounds are involved and complex business,
management and technical problems exist. We first give general
features of typical enterprise projects and then present foundations of
software architectures. The detailed analysis of core issues affecting
software architecture in software development phases is given. We
focus on three main areas in each development phase: people,
process, and management related issues, structural (product) issues,
and technology related issues. After we point out core issues and
problems in these main areas, we give recommendations for
designing good architecture. We observed these core issues, and the
importance of following best software development practices, across
many large enterprise commercial and military projects over about 10
years of experience, during which we also developed some novel practices.
Abstract: An electrocardiogram (ECG) feature extraction system
based on the calculation of the complex resonance frequency
employing Prony's method is developed. Prony's method is applied
to five different classes of ECG arrhythmia signals, modeling each
as a finite sum of exponentials determined by the signal's poles and
resonant complex frequencies. Those poles and resonance frequencies
are evaluated for a large number of examples of each arrhythmia. The
ECG signals of lead II (ML II) were taken from the MIT-BIH database
for five different types: ventricular couplet (VC), ventricular
tachycardia (VT), ventricular bigeminy (VB), ventricular fibrillation
(VF), and normal rhythm (NR). This
novel method can be extended to any number of arrhythmias.
Different classification techniques were tried using neural networks
(NN), K nearest neighbor (KNN), linear discriminant analysis (LDA)
and multi-class support vector machine (MC-SVM).
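Prony's method itself reduces to linear prediction followed by polynomial root finding, sketched here on a synthetic two-mode signal rather than MIT-BIH data:

```python
import numpy as np

# Synthetic test signal: a sum of two real exponential modes, standing
# in for an ECG segment (the MIT-BIH records are not reproduced here).
n = np.arange(40)
x = 0.9 ** n + (-0.5) ** n
p = 2  # model order: number of exponential terms

# Step 1: linear prediction -- solve x[k] = a1*x[k-1] + ... + ap*x[k-p]
# in the least-squares sense.
rows = np.array([x[k - p:k][::-1] for k in range(p, len(x))])
a, *_ = np.linalg.lstsq(rows, x[p:], rcond=None)

# Step 2: the signal poles are the roots of z^p - a1*z^(p-1) - ... - ap.
poles = np.roots(np.concatenate(([1.0], -a)))

# Step 3: amplitudes by least squares on the Vandermonde system
# built from the recovered poles.
V = poles[None, :] ** n[:, None]
amps, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
```

The recovered poles are the exponential bases of the signal; for classification, their locations (and the corresponding resonant frequencies) become the feature vector fed to NN, KNN, LDA or MC-SVM.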
Abstract: Mobile IPv6 (MIPv6) describes how a mobile node can change its point of attachment from one access router to another. As the demand for wireless mobile devices increases, many enhancements to macro-mobility (inter-domain) protocols have been proposed, designed and implemented in Mobile IPv6. Hierarchical Mobile IPv6 (HMIPv6) is one of them, designed to reduce the amount of signaling required and to improve handover speed for mobile connections. This is achieved by introducing a new network entity called the Mobility Anchor Point (MAP). This report presents a comparative study of the Hierarchical Mobile IPv6 and Mobile IPv6 protocols, with the scope narrowed down to micro-mobility (intra-domain). The architecture and operation of each protocol are studied, and they are evaluated based on one Quality of Service (QoS) parameter: handover latency. The simulation was carried out using Network Simulator-2, and its outcome is discussed. The results show that HMIPv6 performs better under intra-domain mobility than MIPv6, which suffers large handover latency. As an enhancement, we propose locating the MAP in the middle of the domain with respect to all access routers. This gives approximately the same, and possibly shorter, distance between the MAP and the Mobile Node (MN) regardless of the MN's new location, reducing the delay since the distance is shorter. As future work, a performance analysis is to be carried out for the proposed scheme and compared to standard HMIPv6.
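The intuition behind the comparison can be captured in a back-of-the-envelope latency model; the timing constants below are illustrative assumptions, not the NS-2 simulation results:

```python
# Illustrative timing constants (milliseconds), not measured values.
T_L2 = 50.0      # link-layer handover time
RTT_MAP = 10.0   # mobile node <-> MAP round trip (MAP is a nearby anchor)
RTT_HA = 120.0   # mobile node <-> home agent round trip (HA is far away)

def mipv6_handover():
    # MIPv6: every handover requires a binding update with the distant
    # home agent, so the full round trip is on the critical path.
    return T_L2 + RTT_HA

def hmipv6_intra_domain_handover():
    # HMIPv6: an intra-domain move only updates the nearby MAP; the
    # home agent is not involved.
    return T_L2 + RTT_MAP
```

Since the MAP round trip is much shorter than the home-agent round trip, the model reproduces the qualitative finding: intra-domain handover latency is dominated by the distance to the binding anchor, which is exactly what moving the MAP to the middle of the domain optimizes.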
Abstract: Condition monitoring of electrical power equipment
has attracted considerable attention for many years. The aim of this
paper is to use LabVIEW with a fuzzy logic controller to build a
simulation system to diagnose transformer faults and monitor their
condition. The front panel of the system was designed using
LabVIEW to enable the computer to act as a custom-designed
instrument. The dissolved gas-in-oil analysis (DGA) method was
used as the diagnosis technique for oil-type transformers, while
terminal voltage and current analysis was used for dry-type
transformers. Fuzzy logic was used as an expert system that assesses
all the information keyed in at the front panel to diagnose and predict
the condition of the transformer. The outcome of the fuzzy logic
interpretation is displayed on the LabVIEW front panel to show
the user the condition of the transformer at any time.
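A minimal fuzzy-inference sketch of the diagnosis step follows; the membership thresholds, gas reading, and severity scale are illustrative assumptions, not the system's actual rule base:

```python
def tri(x, a, b, c):
    # Triangular membership function with feet at a and c, peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(gas_ppm):
    # Membership of a dissolved-gas reading in "normal" / "caution" /
    # "fault" sets (threshold values are illustrative only).
    normal = tri(gas_ppm, -1, 0, 150)
    caution = tri(gas_ppm, 100, 300, 700)
    fault = tri(gas_ppm, 500, 1500, 2500)
    # Weighted (centroid-style) defuzzification onto a 0..1 severity
    # scale, with rule outputs centered at 0.1, 0.5 and 0.9.
    num = 0.1 * normal + 0.5 * caution + 0.9 * fault
    den = normal + caution + fault
    return num / den if den else 1.0  # off-scale readings -> max severity
```

A low reading defuzzifies near 0.1 (healthy) and a high one near 0.9 (fault), which is the graded assessment a crisp threshold rule cannot provide.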
Abstract: This paper presents a study of the parameters affecting
environmental protection in the printing industry. The paper
also compares LCA studies performed within the printing industry in
order to identify common practices, limitations, areas for
improvement, and opportunities for standardization. This comparison
is focused on the data sources and methodologies used in the printing
pollutants register. The presented concepts, methodology and results
represent the contribution to the sustainable development
management. Furthermore, the paper analyzes the results of the
quantitative identification of hazardous substances emitted by the
printing industry of Novi Sad.
Abstract: In current common research reports, salient regions
are usually defined as those regions that could present the main
meaningful or semantic contents. However, there are no uniform
saliency metrics that could describe the saliency of implicit image
regions. Most common metrics take as salient those regions that
have many abrupt changes or unpredictable characteristics, but such
metrics fail to detect salient, useful regions with flat textures.
In fact, according to human semantic perception, color and texture
distinctions are the main characteristics that distinguish different
regions. Thus, we present a novel saliency
metric coupled with color and texture features, and its corresponding
salient region extraction methods. In order to evaluate the
corresponding saliency values of implicit regions in one image, three
main colors and multi-resolution Gabor features are respectively used
for color and texture features. For each region, the saliency value
is computed as the sum of its Euclidean distances to the other
regions in the color and texture spaces. A special synthesized image
and several practical images with main salient regions are used to
evaluate the performance of the proposed saliency metric and several
other common metrics, i.e., scale saliency, wavelet transform
modulus maxima point density, and importance-index-based metrics.
Experimental results verify that the proposed saliency metric
achieves more robust performance than these common saliency
metrics.
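The distance-sum saliency metric can be sketched directly; the per-region feature vectors below are illustrative stand-ins for the three main colors and multi-resolution Gabor energies:

```python
import numpy as np

# Per-region feature vectors: [R, G, B, texture energy] (illustrative).
features = np.array([
    [0.90, 0.10, 0.10, 0.2],   # region 0: red object, fairly flat texture
    [0.20, 0.20, 0.20, 0.1],   # region 1: gray background
    [0.25, 0.20, 0.20, 0.1],   # region 2: near-identical background patch
])

def saliency(features):
    # Saliency of each region = the sum of its Euclidean distances to
    # all other regions in the joint color/texture feature space.
    diff = features[:, None, :] - features[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).sum(axis=1)

s = saliency(features)
```

Note that region 0 scores highest even though its texture is flat — it is distinct in color — which is precisely the failure case of abrupt-change metrics that the distance-sum formulation fixes.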
Abstract: In this paper, creep constitutive equations of the base
(parent) and weld materials of a weldment of cold-drawn 304L
stainless steel have been obtained experimentally. For this purpose,
test samples have been generated from cold drawn bars and weld
material according to the ASTM standard. The creep behavior and
properties have been examined for these materials by conducting
uniaxial creep tests. Constant-temperature, constant-load uniaxial
creep tests have been carried out at two high temperatures, 680 and
720 °C, under constant loads producing initial stresses
ranging from 240 to 360 MPa. The experimental data have been used
to obtain the creep constitutive parameters using numerical
optimization techniques.
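As one common constitutive form (the abstract does not name the form used), a Norton power law can be fitted to minimum-creep-rate data by log-log least squares; the data below are synthetic, not the paper's measurements:

```python
import numpy as np

# Hypothetical minimum-creep-rate data: strain rate (1/h) vs stress (MPa),
# spanning the 240-360 MPa range mentioned in the abstract.
stress = np.array([240.0, 280.0, 320.0, 360.0])
rate = 1e-12 * stress ** 5.0   # synthetic data generated from a Norton law

# Norton (power-law) creep: rate = A * stress**n.  Taking logarithms
# makes the fit linear: log(rate) = log(A) + n * log(stress).
n, logA = np.polyfit(np.log(stress), np.log(rate), 1)
A = np.exp(logA)
```

In practice the measured rates carry scatter, so the slope `n` and coefficient `A` come from an optimization over many specimens, as the abstract describes; the log-linear fit above is the simplest such estimator.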
Abstract: Complexity, as a theoretical background, has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up
systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in the urban
environment, and consequently to seek novel methods for
steering development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad;
their mechanisms are by nature devoid of reason. They are common
to natural processes and urban dynamics alike. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and new design approaches have been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on grounds of being
"non-human".
In this paper the ethical challenges of using the dynamic models
are contemplated in light of a few examples of new architecture and
dynamic urban models and literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: The segmentation of endovascular tools in fluoroscopy images can be performed accurately, automatically or with minimal user intervention, using known modern techniques; this has been proven in the literature, but no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge enhancement filtering, line detection, and segmentation. A new method is presented that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, and only a minimal image area is filtered overall. Such an algorithm is rapidly computed and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D implementation was able to cope with such planar difficulties.
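The propagate-and-filter-locally idea can be sketched on a synthetic frame; the search-window size and the test image are illustrative, and real fluoroscopy filtering is far richer:

```python
import numpy as np

# Synthetic fluoroscopy-like frame: dark background with a bright,
# slightly slanted one-pixel-wide "guidewire".
img = np.zeros((40, 40))
cols = np.clip(10 + (np.arange(40) // 8), 0, 39)
for r, c in enumerate(cols):
    img[r, c] = 1.0

def track(img, start_col, window=2):
    # Propagate a point row by row; at each step only a small window
    # around the projected path is examined, so the whole image is
    # never filtered globally.
    path = [start_col]
    for r in range(1, img.shape[0]):
        c = path[-1]
        lo, hi = max(0, c - window), min(img.shape[1], c + window + 1)
        path.append(lo + int(np.argmax(img[r, lo:hi])))
    return path

path = track(img, start_col=10)
```

Because the search is confined to a few pixels per row, the cost per frame is linear in the wire length rather than the image area — the property that makes real-time operation plausible. The sketch also shows the stated limitation: at an intersection, the local window cannot disambiguate which branch to follow.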
Abstract: The applicability of tuning the controller gains of a Stewart manipulator using a genetic algorithm as an efficient search technique is investigated. Kinematics and dynamics models are introduced in detail for simulation purposes. A PD task-space control scheme is used. To demonstrate the feasibility of the technique, a numerical model of a Stewart manipulator was built. A genetic algorithm was then employed to search for optimal controller gains. The controller was tested on a generic circular mission. The simulation results show that the technique converges reliably, with superior performance across different payloads.
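A minimal sketch of GA-based gain tuning follows, with a unit-mass double integrator standing in for the far richer Stewart-platform dynamics; the cost function, population size, and operators are illustrative:

```python
import random

random.seed(2)

def cost(kp, kd, ref=1.0, dt=0.01, steps=500):
    # Integral of absolute error (IAE) for a PD-controlled double
    # integrator tracking a unit step (stand-in plant).
    x = v = 0.0
    iae = 0.0
    for _ in range(steps):
        e = ref - x
        u = kp * e - kd * v          # PD law (de/dt = -v for a constant ref)
        v += u * dt                   # semi-implicit Euler integration
        x += v * dt
        iae += abs(e) * dt
    return iae

def evolve(pop_size=20, gens=15):
    # Genetic search over (Kp, Kd): keep the best half, breed the rest
    # by blend crossover plus Gaussian mutation.
    pop = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: cost(*g))
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            (a1, a2), (b1, b2) = random.sample(parents, 2)
            c1 = max(0.0, (a1 + b1) / 2 + random.gauss(0, 0.5))
            c2 = max(0.0, (a2 + b2) / 2 + random.gauss(0, 0.5))
            children.append((c1, c2))
        pop = parents + children
    return min(pop, key=lambda g: cost(*g))

kp, kd = evolve()
```

The GA treats the closed-loop simulation purely as a black-box cost, which is why the same tuning loop transfers to the full manipulator model without gradient information.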
Abstract: Radio-frequency identification (RFID) has emerged as a beneficial means, conforming to GS1 standards, of providing the best solutions in the manufacturing area. It competes with other automated identification technologies, e.g. barcodes and smart cards, with regard to high-speed scanning, reliability and accuracy. The purpose of this study is to improve production-line performance by implementing an RFID system in the manufacturing area, using 3D modeling in the program Cinema 4D R13, which provides clear graphical scenes for users to portray their applications. Finally, with regard to improving system performance, it shows how RFID appears as a well-suited technology, in comparison with the barcode scanner, for handling different kinds of raw materials in the production line based on a logical process.
Abstract: In this study we applied the thermal lens (TL) technique
to study the effect of particle size on the thermal diffusivity of cadmium sulphide
(CdS) nanofluids prepared using the γ-radiation method and containing
particles of different sizes. In the TL experimental setup, a diode laser
of wavelength 514 nm and an intensity-stabilized He-Ne laser were used
as the excitation source and the probe beam, respectively. The
experimental results showed that the thermal diffusivity value of the
CdS nanofluid increases as the particle size increases.
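For reference, the diffusivity follows from the standard thermal-lens relation t_c = w^2/(4D) once the characteristic time of the TL transient has been fitted; the beam radius and time below are illustrative values, not the paper's measurements:

```python
def thermal_diffusivity(beam_radius_m, t_c_s):
    # Standard thermal-lens relation: the characteristic time of the
    # TL transient is t_c = w**2 / (4*D), where w is the excitation
    # beam radius at the sample and D the thermal diffusivity.
    return beam_radius_m ** 2 / (4.0 * t_c_s)

# Illustrative numbers: a 100-um beam radius and a 5-ms fitted
# characteristic time.
D = thermal_diffusivity(100e-6, 5e-3)   # m^2/s
```

In an experiment like the one described, t_c is obtained by fitting the probe-beam intensity transient, and the size dependence of D emerges by repeating the fit for each particle-size batch.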
Abstract: Laboratory activities have produced benefits in
student learning. With the current drive toward new technology
resources and an evolving era of educational methods, the renewal of
learning and teaching in laboratory methods is in progress, for both
learners and educators. To enhance learning outcomes in laboratory
work, particularly in engineering practice and testing, hands-on
learning by instruction alone may not be sufficient. This paper
describes and compares the techniques and implementation of
traditional (expository) and open-ended (problem-based) laboratories
for two consecutive cohorts studying an environmental laboratory
course in a civil engineering program. The effects of the transition
from the traditional to the problem-based approach were investigated
in terms of course-assessment student feedback surveys, course-outcome
learning measurements, and student performance grades. Students
demonstrated better performance in their grades and a 12% increase in
the course outcome (CO) with the problem-based open-ended laboratory
style than with the traditional method, although, in terms of
perception, students responded less favorably in their feedback.
Abstract: This paper employs a new approach to regulate the
blood glucose level of a type I diabetic patient under intensive
insulin treatment. The closed-loop control scheme incorporates
expert knowledge about treatment by using reinforcement learning
theory to maintain the normoglycemic average of 80 mg/dl and the
normal condition for free plasma insulin concentration in severe
initial state. The insulin delivery rate is obtained off-line by using a
Q-learning algorithm, without requiring an explicit model of the
environment dynamics. The implementation of the insulin delivery
rate, therefore, requires simple function evaluation and minimal
online computations. Controller performance is assessed in terms of
its ability to reject the effect of meal disturbance and to overcome the
variability in the glucose-insulin dynamics from patient to patient.
Computer simulations are used to evaluate the effectiveness of the
proposed technique and to show its superiority in controlling
hyperglycemia over other existing algorithms.
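A tabular Q-learning sketch on a toy discretized glucose model illustrates the off-line, model-free training loop; the patient dynamics, meal disturbances, and state encoding used in the paper are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy discretized glucose model (illustrative only): bins of 20 mg/dl
# from 40 to 240, with a constant rise offset by the insulin action.
G_MIN, G_MAX, STEP = 40, 240, 20
n_states = (G_MAX - G_MIN) // STEP + 1      # 11 glucose bins
n_actions = 3                                # insulin units per step: 0, 1, 2

def step(s, a):
    g = G_MIN + s * STEP
    g_next = min(G_MAX, max(G_MIN, g + 20 - 20 * a))  # rise minus insulin effect
    r = -abs(g_next - 80)                    # penalty for leaving 80 mg/dl
    return (g_next - G_MIN) // STEP, r

# Q-learning: learn Q(s, a) directly from transitions, with no explicit
# model of the environment dynamics.
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.2, 0.9, 0.3
for _ in range(2000):                        # training episodes
    s = int(rng.integers(n_states))
    for _ in range(20):
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r = step(s, a)
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

hi = (180 - G_MIN) // STEP   # hyperglycemic state
lo = (80 - G_MIN) // STEP    # normoglycemic state
```

Once `Q` is learned off-line, the on-line controller is a simple table lookup — `argmax` over the current state's row — which is the "simple function evaluation with minimal online computations" property the abstract emphasizes. In this toy model the learned policy doses maximally when glucose is high and holds a maintenance dose at the 80 mg/dl target.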