Abstract: When architecting an application, key nonfunctional requirements such as performance, scalability, availability and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be examined until there is a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications to address performance issues early in the development life cycle. It also includes a performance modeling case study, with Proof-of-Concept (PoC) and implementation details for the .NET and J2EE platforms.
Abstract: In the present study, a procedure was developed to determine the optimum reaction rate constants in generalized Arrhenius form, optimized through the Nelder-Mead method. For
this purpose, a comprehensive mathematical model of a fixed bed
reactor for dehydrogenation of heavy paraffins over Pt–Sn/Al2O3
catalyst was developed. Utilizing appropriate kinetic rate expressions
for the main dehydrogenation reaction as well as side reactions and
catalyst deactivation, a detailed model for the radial flow reactor was
obtained. The reactor model was composed of a set of partial differential equations (PDEs), ordinary differential equations (ODEs) and algebraic equations, all of which were solved numerically to determine variations in the components' concentrations, in terms of mole percent, as a function of time and reactor radius. It was demonstrated that the most significant variations were observed at the entrance of the bed, and the initial olefin production was rather high. The
aforementioned method utilized a direct-search optimization
algorithm along with the numerical solution of the governing
differential equations. The usefulness and validity of the method were demonstrated by comparing the predicted values of the kinetic
constants using the proposed method with a series of experimental
values reported in the literature for different systems.
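As a minimal sketch of the direct-search step (Python with SciPy's Nelder-Mead implementation; the rate data below are synthetic placeholders, and in the paper the objective embeds the numerical solution of the reactor model rather than a direct rate comparison), generalized Arrhenius parameters k(T) = A·T^n·exp(−Ea/RT) can be fitted as follows:

```python
# Minimal sketch: direct-search (Nelder-Mead) fit of generalized
# Arrhenius parameters k(T) = A * T**n * exp(-Ea / (R * T)).
# The "observed" data below are synthetic placeholders, not the
# experimental values used in the paper.
import numpy as np
from scipy.optimize import minimize

R = 8.314  # J/(mol K)

T_obs = np.array([700.0, 750.0, 800.0, 850.0])  # K (hypothetical)
k_obs = np.array([0.12, 0.35, 0.90, 2.10])      # 1/s (hypothetical)

def k_model(params, T):
    logA, n, Ea = params
    return np.exp(logA) * T**n * np.exp(-Ea / (R * T))

def objective(params):
    # Least-squares error in log space keeps the scales comparable.
    return np.sum((np.log(k_model(params, T_obs)) - np.log(k_obs))**2)

result = minimize(objective, x0=[0.0, 0.0, 1.0e5], method="Nelder-Mead")
logA, n, Ea = result.x
print(f"A = {np.exp(logA):.3e}, n = {n:.2f}, Ea = {Ea:.3e} J/mol")
```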
Abstract: The mechanism of microwave heating is essentially
that of dielectric heating. After exposing the emulsion to the microwave electromagnetic (EM) field, molecular rotation and ionic conduction due to the penetration of the EM field into the emulsion are responsible for the internal heating. To determine the capability of
microwave technology in demulsification of crude oil emulsions,
the microwave demulsification method was applied to 50-50% and 20-80% water-in-oil emulsions with microwave exposure times varying from 20 to 180 s. Transient temperature profiles of the water-in-oil emulsions inside a cylindrical container were measured. The temperature rise at a given location was almost linear with time. The average rates of temperature increase of the 50-50% and 20-80% water-in-oil emulsions were 0.351 and 0.437 °C/s, respectively. The rate of temperature increase of the emulsions decreased at higher temperatures due to the decreasing dielectric loss of water. These results indicate that microwave demulsification of water-in-oil emulsions does not require chemical additives. Microwave heating thus has the potential to serve as an alternative approach in the demulsification process.
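For reference, the volumetric power deposited by dielectric heating is commonly written as below (a textbook relation, not a result of this paper); the decline of the dielectric loss factor of water with temperature is what slows the heating rate at higher temperatures.

```latex
% Standard volumetric dielectric heating power density:
% f: microwave frequency, \epsilon_0: vacuum permittivity,
% \epsilon'': dielectric loss factor, E: (rms) electric field strength.
P = 2 \pi f \, \epsilon_0 \, \epsilon'' \, |E|^{2}
```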
Abstract: Guaranteeing the availability of the required parts at
the scheduled time represents a key logistical challenge. This is
especially important when several parts are required together. This
article describes a tool that supports positioning in the area of conflict between low stock costs and a high service level for the consumer.
Abstract: Cryptographic algorithms play a crucial role in the
information society by providing protection from unauthorized
access to sensitive data. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These
new environments and applications will present new security
challenges, and there is no doubt that cryptographic algorithms and
protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in terms of computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most smart-card chips cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem based on neither factorization nor the discrete logarithm problem. This means that, even with substantial computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization drive the present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop
high-performance NTRU schemes using approaches such as the use
of high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are a proven approach to building high-end computing systems; by utilizing them, one can improve the performance of NTRU through parallel execution. In this paper we
propose and develop an application for NTRU using enterprise grid
middleware called Alchemi. An analysis and comparison of its
performance for various text files is presented.
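To make the "lists of very small integers and polynomials" concrete, the sketch below (Python; N and q are toy parameters, far too small to be secure, and this is not the Alchemi-based implementation itself) shows the core NTRU primitive, convolution multiplication in the truncated polynomial ring Z[x]/(x^N − 1) with coefficients reduced mod q:

```python
# Minimal sketch of the core NTRU primitive: star (convolution)
# multiplication in the ring Z[x]/(x^N - 1), coefficients mod q.
# N and q below are toy values for illustration only.
N, q = 11, 32

def star_multiply(f, g):
    """Convolution product h = f * g in Z[x]/(x^N - 1) mod q."""
    h = [0] * N
    for i in range(N):
        for j in range(N):
            h[(i + j) % N] = (h[(i + j) % N] + f[i] * g[j]) % q
    return h

# Example: two sparse polynomials with small coefficients (-1, 0, 1),
# the typical shape of NTRU private keys and messages.
f = [1, 0, -1, 1, 0, 0, -1, 0, 1, 0, 0]
g = [0, 1, 1, 0, -1, 0, 0, 1, 0, -1, 0]
print(star_multiply(f, g))
```

Each output coefficient is an independent inner product, which is why the operation parallelizes naturally across grid nodes.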
Abstract: This paper evaluates multilevel modulation for different techniques such as multilevel amplitude shift keying (M-ASK), M-ASK combined with differential phase shift keying (M-ASK-Bipolar), quaternary amplitude shift keying (QASK) and quaternary polarization ASK (QPol-ASK) at a total bit rate of 107 Gbps. The aim is to find a cost-effective very-high-speed transport solution. Numerical investigation was performed using Monte Carlo simulations. The obtained results indicate that some modulation formats can be operated at 100 Gbps in optical communication systems with low implementation effort and high spectral efficiency.
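To illustrate the Monte Carlo methodology at a sketch level (a generic baseband AWGN model, not the paper's 107 Gbps optical setup; the noise levels are arbitrary), the symbol error rate of a 4-level ASK constellation can be estimated as follows:

```python
# Minimal Monte Carlo sketch: symbol error rate of 4-level ASK over
# an AWGN channel. Illustrative baseband model only, not the optical
# simulation described in the paper.
import numpy as np

rng = np.random.default_rng(0)
levels = np.array([0.0, 1.0, 2.0, 3.0])  # 4-ASK amplitude levels

def simulate_ser(sigma, n_symbols=100_000):
    tx = rng.choice(levels, size=n_symbols)
    rx = tx + rng.normal(0.0, sigma, size=n_symbols)
    # Decide by nearest amplitude level (minimum-distance detection).
    decided = levels[np.argmin(np.abs(rx[:, None] - levels[None, :]), axis=1)]
    return np.mean(decided != tx)

for sigma in (0.1, 0.2, 0.3):
    print(f"sigma = {sigma:.1f}  SER ~ {simulate_ser(sigma):.4f}")
```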
Abstract: Many accidents happen because of fast driving, habitual overwork or fatigue. This paper presents a solution for remote warning for vehicle collision avoidance using vehicular communication. The developed system integrates dedicated short range communication (DSRC) and the global positioning system (GPS) with an embedded system into a powerful remote warning system. To transmit the vehicular information and broadcast vehicle positions, DSRC communication technology is adopted as the bridge. The proposed system is divided into two parts: the positioning and vehicular units in a vehicle. The positioning unit provides the position and heading information from a GPS module, while the vehicular unit receives the brake, throttle, and other signals via a controller area network (CAN) interface connected to each mechanism. The mobile hardware is built on an embedded system using an X86 processor running Linux. A vehicle communicates with other vehicles via DSRC in a non-addressed protocol with the wireless access in vehicular environments (WAVE) short message protocol. From the position data and vehicular information, this paper provides a conflict detection algorithm that performs time separation and remote warning with an error bubble taken into consideration, and the warning information is displayed on-screen online. This system is able to enhance driver assistance services and realize critical safety by using vehicular information from neighboring vehicles.
Keywords: Dedicated short range communication, GPS, controller area network, collision avoidance warning system.
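A minimal sketch of a time-separation conflict check of the kind described (Python; the constant-velocity model, the error-bubble radius and the warning horizon are illustrative assumptions, not the paper's algorithm):

```python
# Minimal sketch of a time-separation conflict check between two
# vehicles under a constant-velocity assumption. The error-bubble
# radius and warning horizon are illustrative values only.
import numpy as np

ERROR_BUBBLE_M = 5.0      # position uncertainty (GPS error), metres
WARNING_HORIZON_S = 10.0  # look-ahead time for remote warning

def conflict(p1, v1, p2, v2):
    """Return (warn, t_min): does the predicted separation fall below
    twice the error bubble within the horizon, and when is it closest?"""
    dp = np.asarray(p2, float) - np.asarray(p1, float)  # rel. position (m)
    dv = np.asarray(v2, float) - np.asarray(v1, float)  # rel. velocity (m/s)
    if np.dot(dv, dv) < 1e-9:            # same velocity: gap is constant
        t_min = 0.0
    else:                                # time of closest approach
        t_min = max(0.0, -np.dot(dp, dv) / np.dot(dv, dv))
    t_min = min(t_min, WARNING_HORIZON_S)
    min_dist = np.linalg.norm(dp + dv * t_min)
    return min_dist < 2 * ERROR_BUBBLE_M, t_min

# Example: two vehicles approaching the same intersection at 20 m/s.
print(conflict(p1=(0, 0), v1=(20, 0), p2=(100, -100), v2=(0, 20)))
```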
Abstract: The aim of the present study was to assess the effect of glucogenic (G) and lipogenic (L) diets on blood metabolites in Baloochi lambs. Three rumen-cannulated Baloochi sheep were used in a 3×3 Latin square design with 3 periods (28 days). The experimental diets were a glucogenic diet, a lipogenic diet and a 50:50 mixture of the G and L diets. The animals were fed diets consisting of 50% chopped alfalfa hay and 50% concentrate, offered once daily ad libitum. Blood samples were taken from the jugular vein before feeding and 2, 4 and 6 hours post-feeding on day 27.
and 6 hour post feeding at day 27. Results indicated that β-
hydroxybutyrate (BHBA), glucose, insulin and aspartate
aminotransferase (AST) were not affected by treatments (P > 0.05).
However, the lipogenic diet significantly increased the activity of alanine aminotransferase (ALT) and the concentration of non-esterified fatty acids (NEFA) in blood plasma (P < 0.05).
Abstract: A wireless ad hoc network is composed of wireless nodes which can move freely and are connected among themselves without central infrastructure. Due to the limited transmission range of wireless interfaces, in most cases communication has to be relayed over intermediate nodes. Thus, in such a multihop network each node (also called a router) is independent, self-reliant and capable of routing messages over the dynamic network topology. Various protocols
are reported in this field and it is very difficult to decide the best one.
A key issue in deciding which type of routing protocol is best for ad hoc networks is the communication overhead incurred by the protocol. In this paper, STAR (a table-driven protocol) and DSR (an on-demand protocol), both over IEEE 802.11, are analyzed for their performance on different metrics versus varying CBR traffic load using the QualNet 5.0.2 network simulator.
Abstract: Worldwide, many electrical equipment insulation failures caused by switching operations have been reported, even though the equipment had previously passed all the standard tests and complied with all quality requirements. The problem is mostly
associated with high-frequency overvoltages generated during
opening or closing of a switching device. The transients generated
during switching operations in a Gas Insulated Substation (GIS) are associated with high-frequency components on the order of a few tens of MHz.
The frequency spectrum of the very fast transient overvoltages (VFTO) generated in the 220/66 kV Wadi-Hoff GIS is analyzed using the Fast Fourier Transform technique.
The main frequency with high voltage amplitude due to the operation
of disconnector (DS5) is 5 to 10 MHz, with the highest amplitude at 9
MHz. The main frequency with high voltage amplitude due to the
operation of circuit breaker (CB5) is 1 to 25 MHz, with the highest
amplitude at 2 MHz.
Mitigation techniques damped the oscillating frequencies effectively. Using a cable termination reduced the frequency oscillations more effectively than an OHTL termination. Using a shunt capacitance makes the high-frequency components vanish. Ferrite rings reduce the high-frequency components effectively, especially in the range of 2 to 7 MHz. Using RC and RL filters likewise makes the high-frequency components vanish.
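The spectral-analysis step can be illustrated with a short sketch (Python/NumPy; the damped test waveform is synthetic and stands in for a measured VFTO trace, which is not reproduced here):

```python
# Minimal sketch of the FFT step: estimate the dominant frequency
# components of a transient waveform. The damped 9 MHz / 2 MHz test
# signal below is synthetic, standing in for a measured VFTO trace.
import numpy as np

fs = 200e6                                 # sampling rate, 200 MS/s
t = np.arange(0, 20e-6, 1 / fs)            # 20 microseconds of signal
signal = (np.exp(-t / 5e-6) * np.sin(2 * np.pi * 9e6 * t)
          + 0.5 * np.exp(-t / 8e-6) * np.sin(2 * np.pi * 2e6 * t))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"dominant component ~ {peak / 1e6:.1f} MHz")
```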
Abstract: This paper presents the development of analysis tools for the Home Agriculture project. The tools are required for monitoring the condition of a greenhouse and involve two components: measurement hardware and a data analysis engine. The measurement hardware measures environmental parameters such as temperature, humidity, air quality and dust, while the analysis tool is used to analyse and interpret the integrated data against weather conditions, health quality, irradiance, soil quality, etc. The current development of the tools is complete for the off-line data recording technique. The data is saved on an MMC and transferred via ZigBee to the Environment Data Manager (EDM) for data analysis. EDM converts the raw data and plots three combination graphs. It has been applied in monitoring three months of measurement data for irradiance, temperature and humidity of the greenhouse.
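A minimal sketch of the analysis side (Python; the CSV layout and column names are assumptions, since the EDM log format is not given in the abstract):

```python
# Minimal sketch of the EDM analysis step: read logged environment
# data and plot the three combination graphs. The file name, CSV
# layout and column names are assumptions, not the paper's format.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("greenhouse_log.csv", parse_dates=["timestamp"])

fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 9))
for ax, column, unit in zip(axes,
                            ["irradiance", "temperature", "humidity"],
                            ["W/m2", "degC", "%RH"]):
    ax.plot(data["timestamp"], data[column])
    ax.set_ylabel(f"{column} ({unit})")
axes[-1].set_xlabel("time")
plt.tight_layout()
plt.show()
```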
Abstract: This paper describes the development of a control
system model using a graphical software tool. This control system is
part of an operator training simulator developed for the National
Training Center for Operators of Ixtapantongo (CNCAOI, acronym according to its name in Spanish) of Mexico's Federal Commission of Electricity (CFE). The Department of Simulation of the Electrical Research Institute (IIE) developed this simulator using Unit I of the El Sauz Combined Cycle Power Plant, located at the centre of Mexico, as reference. The first step in the project was the development of the Gas Turbine System and its control system
simulator. The Turbo Gas simulator was finished and delivered to
CNCAOI in March 2007 for commercial operation. This simulator is
a high-fidelity real time dynamic simulator built and tested for
accurate operation over the entire load range. The simulator was used
primarily for operator training although it has been used for
procedure development and evaluation of plant transients.
Abstract: Previously, harmonic parameters (HPs) have been
selected as features extracted from EEG signals for automatic sleep
scoring. However, in previous studies, only one HP parameter was used, which was directly extracted from the whole epoch of the EEG signal.
In this study, two different transformations were applied to extract
HPs from EEG signals: Hilbert-Huang transform (HHT) and wavelet
transform (WT). EEG signals were decomposed by the two transformations, and features were extracted from the different components. Twelve parameters (four sets of HPs) were extracted. Some of the parameters are highly diverse among the different stages. Afterward, the HPs from the two transformations were used to build a rough sleep-stage scoring model using an SVM classifier. The performance of this model is about 78% using the features obtained by our proposed extractions. Our results suggest that these features may be useful for automatic sleep-stage scoring.
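The scoring stage can be sketched as follows (Python with scikit-learn as a stand-in classifier library; the feature matrix of twelve HPs per epoch is a dummy placeholder for the HHT/WT features described above):

```python
# Minimal sketch of the scoring stage: train an SVM on harmonic
# parameters (12 features per EEG epoch) and report accuracy.
# X and y are dummy placeholders; the real features come from the
# HHT- and WT-based extraction described above.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))     # 600 epochs x 12 HP features (dummy)
y = rng.integers(0, 5, size=600)   # 5 sleep stages (dummy labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf").fit(scaler.transform(X_train), y_train)
print("accuracy:", clf.score(scaler.transform(X_test), y_test))
```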
Abstract: In this paper the concepts of strongly (λ,M)p-Cesàro summability of a sequence of fuzzy numbers and strongly λM-statistically convergent sequences of fuzzy numbers are introduced.
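For orientation, definitions of this type usually take the following form (a hedged reconstruction with an Orlicz function M and a non-decreasing sequence λ = (λn) tending to infinity, since the abstract itself does not reproduce the definition):

```latex
% Hedged reconstruction of the usual form of such definitions: with an
% Orlicz function M, 0 < p < \infty, I_n = [n - \lambda_n + 1, n], and
% d a metric on fuzzy numbers, X = (X_k) is strongly
% (\lambda, M)_p-Cesàro summable to X_0 if, for some \rho > 0,
\lim_{n \to \infty} \frac{1}{\lambda_n}
  \sum_{k \in I_n} \left[ M\!\left( \frac{d(X_k, X_0)}{\rho} \right) \right]^{p} = 0 .
```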
Abstract: Wavelet transform has been extensively used in
machine fault diagnosis and prognosis owing to its strength in dealing with non-stationary signals. The existing wavelet-transform-based schemes for fault diagnosis employ wavelet decomposition of the entire vibration frequency band, which not only involves huge computational overhead in extracting the features but also increases
the dimensionality of the feature vector. This increase in the
dimensionality has the tendency to 'over-fit' the training data and
could mislead the fault diagnostic model. In this paper a novel technique, the envelope wavelet packet transform (EWPT), is proposed in which features are extracted based on the wavelet packet transform of the
filtered envelope signal rather than the overall vibration signal. It not
only reduces the computational overhead in terms of reduced number
of wavelet decomposition levels and features but also improves the
fault detection accuracy. Analytical expressions are provided for the
optimal frequency resolution and decomposition level selection in
EWPT. Experimental results with both actual and simulated machine
fault data demonstrate significant gain in fault detection ability by
EWPT at reduced complexity compared to existing techniques.
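A minimal sketch of the EWPT pipeline as described (Python, using SciPy's Hilbert transform for the envelope and PyWavelets for the packet decomposition; the filter band, wavelet and level are illustrative choices, not the paper's optimal values):

```python
# Minimal sketch of the EWPT idea: band-pass filter, take the
# envelope via the Hilbert transform, then wavelet-packet-decompose
# the envelope instead of the raw vibration signal. Filter band,
# wavelet and level are illustrative, not the paper's optimal values.
import numpy as np
import pywt
from scipy.signal import butter, filtfilt, hilbert

fs = 12_000                                  # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# Synthetic faulty-bearing-like signal: 3 kHz resonance modulated by
# 100 Hz fault impacts, plus noise (placeholder for real data).
x = (1 + 0.8 * np.cos(2 * np.pi * 100 * t)) * np.sin(2 * np.pi * 3000 * t)
x += 0.2 * np.random.default_rng(0).normal(size=t.size)

b, a = butter(4, [2500, 3500], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, x)))

wp = pywt.WaveletPacket(envelope, wavelet="db4", maxlevel=3)
energies = {node.path: float(np.sum(node.data ** 2))
            for node in wp.get_level(3, order="freq")}
print(energies)  # feature vector: energy per packet node
```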
Abstract: Data is available in abundance in any business organization. It includes records for finance, maintenance, inventory, progress reports, etc. As time progresses, the data keeps accumulating, and the challenge is to extract information from
this data bank. Knowledge discovery from these large and complex
databases is the key problem of this era. Data mining and machine
learning techniques are needed which can scale to the size of the problems and can be customized to the business application. To derive accurate and relevant information for a particular problem, a business analyst needs to develop multidimensional models which give reliable information so that the right decision can be taken. If the multidimensional model does not possess advanced features, accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating advanced features. The computation is based on data precision and includes a slowly changing time dimension. The final results are displayed in graphical
form.
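As one hedged illustration of the "slowly changing time dimension" feature (Python; a generic Type-2 slowly-changing-dimension update sketched for illustration, not the paper's specific model):

```python
# Minimal sketch of a Type-2 slowly changing dimension: instead of
# overwriting an attribute, close the old row and open a new one so
# history is preserved. Generic illustration, not the paper's model.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class DimensionRow:
    key: int
    attribute: str
    valid_from: date
    valid_to: Optional[date] = None  # None marks the current row

def apply_change(rows: List[DimensionRow], key: int,
                 new_value: str, change_date: date) -> None:
    """Type-2 update: close the current row for `key`, append the new version."""
    for row in rows:
        if row.key == key and row.valid_to is None:
            row.valid_to = change_date
    rows.append(DimensionRow(key, new_value, change_date))

history = [DimensionRow(1, "Region A", date(2020, 1, 1))]
apply_change(history, 1, "Region B", date(2021, 6, 1))
for row in history:
    print(row)
```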
Abstract: The motivation of this work was to find a suitable 3D scanner for digitizing human body parts in the field of prosthetics and orthotics. The main project objective is to compare three hand-held portable scanners (two optical and one laser) and two optical tripod scanners. The comparison was made with respect to
scanning detail, simplicity of operation and ability to scan directly on
the human body. Testing was carried out on a plaster cast of the
upper limb and directly on a few volunteers. The objectively monitored parameters were the time for digitizing and post-processing the 3D data and the resulting visual data quality. Subjectively, the ease of use and handling of the scanners were considered. A new tripod was developed to improve face-scanning conditions. The results provide an
overview of the suitability of different types of scanners.
Abstract: Brick is one of the most common masonry units used as a building material. Due to demand, different types of waste have been investigated for incorporation into bricks. Many types of sludge have been incorporated in fired clay bricks, for example marble sludge, stone sludge, water sludge, sewage sludge, and ceramic sludge. The utilization of these waste materials in fired clay bricks usually has positive effects on the properties, such as lightweight bricks with improved shrinkage, porosity, and strength. This paper reviews the utilization of different types of sludge waste in fired clay bricks. Previous investigations have demonstrated positive effects on the physical and mechanical properties as well as less impact on the environment. Thus, the utilization of sludge waste could produce good-quality bricks and could be an alternative disposal method for sludge wastes.
Abstract: A sequential treatment of ozonation followed by a
Fenton or photo-Fenton process, using black light lamps (365 nm) in
this latter case, has been applied to remove a mixture of
pharmaceutical compounds and the generated by-products both in
ultrapure and secondary treated wastewater. The scientific-technological innovation of this study stems from the in situ generation of hydrogen peroxide through the direct ozonation of pharmaceuticals, which can later be used in the application of Fenton and photo-Fenton processes. The compounds selected as models were sulfamethoxazole and acetaminophen. It should be remarked that
the use of a second process is necessary as a result of the low
mineralization yield reached by the exclusive application of ozone.
Therefore, the influence of the water matrix has been studied in terms
of hydrogen peroxide concentration, individual compound
concentration and total organic carbon removed. Moreover, the
concentration of different iron species in solution has been measured.
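For context, the classical Fenton reaction that the in-situ-generated hydrogen peroxide feeds is (a textbook relation, not a result of this study):

```latex
% Classical Fenton reaction: ferrous iron decomposes hydrogen peroxide
% into hydroxide and the strongly oxidizing hydroxyl radical.
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}
```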
Abstract: Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital
requirement for many applications like Event Planning and
Management systems and security applications. The key information
components needed from Event related text are Event title, location,
participants, date and time. Emails differ distinctly from other social text streams in layout, format and conversation style, and are the most commonly used communication channel for broadcasting and planning events. Therefore we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, namely Finite State Machines (FSM) and Hidden Markov Models (HMM), for the extraction of event-related contextual information. An application has been developed providing a comparison between the two methods
over the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input.
The results are evaluated using Precision, Recall and F-Score.
Experiments show that both methods produce high performance and accuracy; however, HMM performed better on title extraction, while FSM proved to be better for venue, date, and time.
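The evaluation metrics are the standard ones; a minimal sketch (Python; the tp/fp/fn counts are hypothetical, not the paper's results):

```python
# Minimal sketch of the standard evaluation metrics used above.
# The tp/fp/fn counts below are hypothetical, not the paper's results.
def precision(tp, fp):
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fn):
    return tp / (tp + fn) if tp + fn else 0.0

def f_score(p, r):
    return 2 * p * r / (p + r) if p + r else 0.0

p, r = precision(tp=42, fp=8), recall(tp=42, fn=12)
print(f"P = {p:.2f}, R = {r:.2f}, F = {f_score(p, r):.2f}")
```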