Abstract: Fake finger submission attack is a major problem in fingerprint recognition systems. In this paper, we introduce a liveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise, and several first-order statistics. Specifically, a correlation filter is adopted to measure individual pore spacing. The multiple static features reflect the physiological and statistical characteristics of live and fake fingerprints. Classification is made by calculating a liveness score from each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers, and the best classification rate of 85% is attained by a Reduced Multivariate Polynomial classifier. Our approach is fast and convenient for liveness checking in field applications.
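As a rough illustration of the score-fusion step, the sketch below trains a classifier on per-image static features. The feature extractor is a placeholder, and scikit-learn's LogisticRegression is substituted here for the paper's Reduced Multivariate Polynomial classifier; all data is synthetic.

```python
# Minimal sketch of fusing per-feature liveness scores with a classifier.
# Feature extraction is stubbed out; real features (pore spacing via a
# correlation filter, residual noise, first-order statistics) are not shown.
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_static_features(image):
    # Placeholder: would return pore-spacing, residual-noise, and
    # first-order-statistic scores for the fingerprint image.
    return np.array([image.mean(), image.std(), np.median(image)])

rng = np.random.default_rng(0)
live = [extract_static_features(rng.normal(0.5, 0.1, (64, 64))) for _ in range(50)]
fake = [extract_static_features(rng.normal(0.4, 0.2, (64, 64))) for _ in range(50)]
X = np.vstack(live + fake)
y = np.array([1] * 50 + [0] * 50)  # 1 = live, 0 = fake

fusion = LogisticRegression().fit(X, y)   # stand-in for the RM polynomial classifier
print(fusion.predict_proba(X[:3])[:, 1])  # fused liveness scores
```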
Abstract: This paper aims at developing a multilevel fuzzy decision support model for urban rail transit planning schemes in China, against the background that China is presently experiencing an unprecedented construction of urban rail transit. In this study, an appropriate model using the multilevel fuzzy comprehensive evaluation method is developed. In the decision process, the following influential objectives are considered: traveler attraction, environmental protection, project feasibility, and operation. In addition, the consistent matrix analysis method is used to determine the weights between objectives and between each objective and its sub-indicators, which reduces the rework caused by repeatedly establishing the decision matrix while ensuring its consistency. The application results show that the multilevel fuzzy decision model can effectively handle multivariable, multilevel decision processes, and is particularly useful in resolving the multilevel decision-making problem of urban rail transit planning schemes.
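To illustrate the aggregation at the heart of multilevel fuzzy comprehensive evaluation, here is a minimal sketch; the weights, grades, and membership values are made-up examples, and the weighted-average operator is one common choice among several.

```python
# Sketch of one level of fuzzy comprehensive evaluation: an objective's
# grade vector B is the weight vector W combined with the membership
# matrix R (here via the weighted-average operator).
import numpy as np

# Illustrative numbers only: 3 sub-indicators rated against 4 fuzzy
# grades (e.g. poor / fair / good / excellent).
R = np.array([[0.1, 0.2, 0.5, 0.2],
              [0.0, 0.3, 0.4, 0.3],
              [0.2, 0.3, 0.3, 0.2]])
W = np.array([0.5, 0.3, 0.2])  # sub-indicator weights from consistent-matrix analysis

B = W @ R                       # membership of the objective in each grade
print(B)                        # e.g. [0.09 0.25 0.43 0.23]

# At the next level, the objective-level B vectors are stacked into a new R
# and combined with the objective weights in the same way.
```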
Abstract: Medical image data hiding has strict constraints such as high imperceptibility, high capacity, and high robustness. Achieving these three requirements simultaneously is highly cumbersome. Some works on data hiding, watermarking, and steganography suitable for telemedicine applications have been reported in the literature, but none is reliable in all respects. Electronic Patient Report (EPR) data hiding for telemedicine demands that the scheme be blind and reversible. This paper proposes a novel approach to blind reversible data hiding based on the integer wavelet transform. Experimental results show that this scheme outperforms the prior art in terms of zero BER (Bit Error Rate), higher PSNR (Peak Signal to Noise Ratio), and large EPR data embedding capacity, with WPSNR (Weighted Peak Signal to Noise Ratio) around 53 dB, compared with existing reversible data hiding schemes.
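To show why an integer wavelet transform suits reversible hiding, here is a minimal sketch of the integer Haar (S-transform) lifting step; the embedding rule itself is not shown, and the sample data is illustrative.

```python
# Reversible integer Haar step via lifting: maps integers to integers and
# is exactly invertible, so the cover image can be restored bit-for-bit
# after the EPR payload is removed.
import numpy as np

def int_haar_forward(x):
    x0, x1 = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    h = x0 - x1                  # detail coefficients
    l = x1 + (h >> 1)            # approximation (floor division by 2)
    return l, h

def int_haar_inverse(l, h):
    x1 = l - (h >> 1)
    x0 = h + x1
    x = np.empty(2 * len(l), dtype=np.int64)
    x[0::2], x[1::2] = x0, x1
    return x

row = np.array([5, 3, 8, 8, 120, 119], dtype=np.int64)
l, h = int_haar_forward(row)
assert np.array_equal(int_haar_inverse(l, h), row)  # perfect reconstruction
# A hiding scheme would now embed EPR bits into selected coefficients
# before inverting the transform.
```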
Abstract: This paper discusses the applicability of the Data Distribution Service (DDS) for the development of automated and modular manufacturing systems which require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware systems that provides an infrastructure for platform-independent, many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer, have low memory footprints, and have high robustness requirements. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.
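The following sketch is purely illustrative of the data-centric, topic-based publish/subscribe pattern that DDS standardizes. It is not the DDS API; the FixtureState type and topic name are hypothetical, and a real system would use a DDS implementation with IDL-defined types, QoS policies, and discovery.

```python
# Hypothetical in-process sketch of topic-based publish/subscribe.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FixtureState:          # an IDL-like data type for a modular fixture
    module_id: str
    clamp_force_n: float

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)
    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)
    def publish(self, topic, sample):
        for cb in self._subs[topic]:   # many-to-many delivery by topic
            cb(sample)

bus = Bus()
bus.subscribe("FixtureState", lambda s: print("controller saw", s))
bus.subscribe("FixtureState", lambda s: print("monitor saw", s))
bus.publish("FixtureState", FixtureState("clamp-3", 412.0))
# Publishers and subscribers never reference each other directly: they are
# coupled only through the topic name and data type.
```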
Abstract: This paper is mainly concerned with the application of a novel data interpretation technique for classifying measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, Artificial Neural Networks and Multi-Class Support Vector Machines are employed to classify magnetic measurements used to determine the shape and position of the plasma with reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
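A minimal sketch of the classification stage follows: a multi-class SVM mapping magnetic measurement vectors to shape/position classes. The data here is synthetic; real inputs would be simulated ITER equilibria, and the class structure is an assumption for illustration.

```python
# Multi-class SVM on synthetic "magnetic probe" vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_probes = 12                                   # magnetic probe signals per sample
X = rng.normal(size=(300, n_probes))
y = rng.integers(0, 4, size=300)                # 4 hypothetical shape classes
X += y[:, None] * 0.8                           # make the classes separable

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```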
Abstract: We study spatial design of experiments, in which the goal is to select the most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset. This problem is NP-hard. When such designs are used in computer experiments, the design space is in many cases very large, and it is not possible to compute the exact optimal solution. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and exact solutions are unobtainable. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and demonstrate its successful application in a large design space. We consider a real design-of-experiments case in which the design space is very large, and we solve it with the proposed GA.
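A minimal sketch of a genetic algorithm for this subset-selection problem follows; the covariance matrix, population size, and genetic operators are illustrative choices, not necessarily those of the paper.

```python
# GA sketch for the D-optimal subset problem: choose k of n correlated
# points to maximize the determinant of the covariance submatrix.
import numpy as np

rng = np.random.default_rng(0)
n, k = 40, 8
A = rng.normal(size=(n, n))
C = A @ A.T + n * np.eye(n)          # a synthetic positive-definite covariance

def fitness(subset):
    sign, logdet = np.linalg.slogdet(C[np.ix_(subset, subset)])
    return logdet                     # log-det is a stabler objective than det

def crossover(p1, p2):
    pool = list(set(p1) | set(p2))    # inherit points from both parents
    return rng.choice(pool, size=k, replace=False)

def mutate(s):
    s = s.copy()
    out = rng.choice(list(set(range(n)) - set(s)))
    s[rng.integers(k)] = out          # swap one chosen point for an unchosen one
    return s

pop = [rng.choice(n, size=k, replace=False) for _ in range(50)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                  # keep the best designs
    children = [mutate(crossover(elite[rng.integers(10)], elite[rng.integers(10)]))
                for _ in range(40)]
    pop = elite + children
best = max(pop, key=fitness)
print(sorted(best), fitness(best))
```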
Abstract: The paper presents brief information on selected results of an experimental study focused on the behavior of structural plated components made of fiber-cement-based materials, used in building construction and exposed to atmospheric physical effects caused by weather changes in the summer period. Weather changes, represented mainly by temperature and rain, also cause changes in the temperature and moisture of the investigated structural components. This can affect their static behavior, that is, the stresses and deformations, which have been monitored as the main outputs of the tests performed. The experimental verification is based on simulating the influence of temperature and rain using a defined procedure of warming and water sprinkling corresponding to the weather conditions during the summer period in the South Moravian region of the Czech Republic, for which the application of these structural components is mainly planned. Two types of components have been tested: (i) glass-fiber-concrete panels used for building façades, and (ii) fiber-cement slabs used mainly for cladding, but also as part of floor structures, as lost shuttering, and so on.
Abstract: In this paper we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not just an extension of the traditional information system development life cycle; it is based on the generic life cycle. The paper proposes a model of the information system life cycle, supported by the assumption that a system has a limited life, but that this limited life may be extended. The model has also been applied in several cases; two examples of the framework's application, in a construction enterprise and in a manufacturing enterprise, are reported here.
Abstract: When architecting an application, key nonfunctional requirements such as performance, scalability, availability, and security, which influence the architecture of the system, are sometimes not adequately addressed. The performance of the application may not be examined until it becomes a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with Proof-of-Concept (PoC) and implementation details for the .NET and J2EE platforms.
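As one example of what an early-lifecycle performance model can look like, the sketch below uses a simple M/M/1 queueing approximation per tier; the tiers, service demands, and load are hypothetical and not taken from the paper's case study.

```python
# Early performance estimate from per-tier service demands.
def tier_response_time(service_time_s, arrival_rate_rps):
    utilization = arrival_rate_rps * service_time_s
    if utilization >= 1.0:
        raise ValueError("tier saturated: utilization >= 1")
    return service_time_s / (1.0 - utilization)   # M/M/1 residence time

# Hypothetical web / app / database tiers of a J2EE or .NET application;
# service demands would come from a Proof-of-Concept measurement run.
demands = {"web": 0.005, "app": 0.020, "db": 0.012}   # seconds per request
load = 30.0                                           # requests per second
total = sum(tier_response_time(d, load) for d in demands.values())
print(f"predicted response time at {load} req/s: {total * 1000:.1f} ms")
```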
Abstract: The static synchronous compensator (STATCOM) is a shunt-connected voltage source converter (VSC) which can effect rapid control of reactive power flow in a transmission line by controlling the generated a.c. voltage. The main aim of the paper is to design a power system installed with a STATCOM and to demonstrate the application of the linearized Phillips-Heffron model in analyzing the damping effect of the STATCOM for improving power system oscillation stability. The proposed PI controller is designed to coordinate two control inputs, the voltage of the injection bus and the capacitor voltage of the STATCOM, to improve the dynamic stability of a single-machine infinite-bus (SMIB) system. The power oscillation damping (POD) control and power system stabilizer (PSS), and their coordinated action with the proposed controllers, are tested. The simulation results show that the proposed damping controllers provide satisfactory performance in terms of improving the dynamic stability of the system.
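A minimal sketch of a discrete PI control loop, the kind of controller the paper proposes for the two STATCOM inputs, is given below; the gains, setpoint, and toy plant model are illustrative assumptions, not the paper's design values.

```python
# Discrete PI control law, one instance per coordinated control input.
class PIController:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0
    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# e.g. regulating the injection-bus voltage toward 1.0 pu
pi = PIController(kp=2.0, ki=10.0, dt=0.001)
v_meas = 0.95
for _ in range(5):
    u = pi.update(1.0, v_meas)      # control signal to the VSC
    v_meas += 0.01 * u              # crude stand-in for the plant dynamics
    print(round(v_meas, 4))
```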
Abstract: Cryptographic algorithms play a crucial role in the information society by providing protection from unauthorized access to sensitive data. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size, and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows the need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems; this means that, even given sufficient computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization drive the present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches for developing high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
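To make the parallelization target concrete, here is a sketch of NTRU's core operation, multiplication of small-coefficient polynomials in the ring Z_q[x]/(x^N - 1); the parameters are toy values, and the grid/Alchemi distribution layer is not shown.

```python
# Cyclic convolution of coefficient lists modulo q. Each output
# coefficient is independent of the others, which is what makes the
# operation easy to split across parallel grid workers.
def ring_multiply(a, b, N, q):
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Tiny illustrative parameters (real NTRU uses much larger N, e.g. 503).
N, q = 7, 128
f = [1, 0, -1, 1, 1, 0, -1]       # "small" polynomial coefficients
g = [0, 1, 1, 0, -1, 0, 1]
print(ring_multiply(f, g, N, q))
```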
Abstract: This paper presents the design features of a rescue robot named CEO Mission II. Its body is of the tracked-wheel type with double front flippers for climbing over collapse debris and rough terrain. A 125 cm long, 5-joint mechanical arm installed on the robot body is deployed not only for surveillance from the top view but also for easier and faster access to victims to read their vital signs. Two cameras and the sensors for reading vital signs are set up at the tip of the multi-joint mechanical arm; a third camera at the back of the robot is used for driving control. The hardware and software of the system that controls and monitors the rescue robot are explained. The control system is used for controlling the robot locomotion and the 5-joint mechanical arm, and for turning devices on and off. The monitoring system gathers information from 7 distance sensors, IR temperature sensors, 3 CCD cameras, a voice sensor, robot wheel encoders, yaw/pitch/roll angle sensors, a laser range finder, and 8 spare A/D inputs. All sensor and control data are communicated with a remote control station via IEEE 802.11b Wi-Fi. The audio and video data are compressed and sent via a separate IEEE 802.11g Wi-Fi transmitter to obtain real-time response. At the remote control station, the robot locomotion and the mechanical arm are controlled by joystick. Moreover, a user-friendly GUI control program based on click-and-drag interaction is developed to easily control the movement of the arm. The robot's traveling map is plotted by combining the wheel encoder information with the yaw/pitch data, and a 2D obstacle map is plotted from the laser range finder data. The concept and design of this robot can be adapted to suit many other applications. The robot won the Best Technique award at the Thailand Rescue Robot Championship 2006, and all testing results were satisfactory.
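A minimal sketch of the dead-reckoning computation behind such a traveling map, combining encoder distance with the yaw angle, is shown below; variable names and the sample data are illustrative, not taken from the robot's software.

```python
# Dead reckoning: accumulate (x, y) map vertices from per-step encoder
# distance and the yaw angle reported by the angle sensor.
import math

def dead_reckon(path, distance_m, yaw_rad):
    x, y = path[-1]
    path.append((x + distance_m * math.cos(yaw_rad),
                 y + distance_m * math.sin(yaw_rad)))

path = [(0.0, 0.0)]
# (encoder distance in meters, yaw in radians)
for d, yaw in [(0.5, 0.0), (0.5, 0.0), (0.4, math.pi / 2), (0.4, math.pi / 2)]:
    dead_reckon(path, d, yaw)
print(path)   # traveling-map vertices: forward 1 m, then left 0.8 m
```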
Abstract: In this paper, the differential quadrature method is applied to simulate natural convection in an inclined cubic cavity using the velocity-vorticity formulation. The numerical capability of the present algorithm is demonstrated by application to natural convection in an inclined cubic cavity. The velocity Poisson equations, the vorticity transport equations, and the energy equation are all solved as a coupled system of equations for the seven field variables consisting of three velocities, three vorticities, and temperature. The coupled equations are solved simultaneously by imposing the vorticity definition at the boundary, without requiring the explicit specification of vorticity boundary conditions. Test results obtained for an inclined cubic cavity with different angles of inclination at Rayleigh numbers of 10^3, 10^4, 10^5, and 10^6 indicate that the present coupled solution algorithm reproduces the benchmark results for the temperature and flow fields. This confirms that the present formulation is capable of solving the coupled Navier-Stokes equations effectively and accurately.
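To illustrate the differential quadrature idea, the sketch below computes first-derivative weighting coefficients from Shu's explicit Lagrange-based formula and checks them on a smooth test function; the grid and function are illustrative, not the paper's cavity problem.

```python
# Differential quadrature: f'(x_i) ~ sum_j a[i, j] f(x_j), with weights
# a[i, j] = M(x_i) / ((x_i - x_j) M(x_j)) for i != j, where
# M(x_i) = prod_{k != i} (x_i - x_k), and a[i, i] = -sum_{j != i} a[i, j].
import numpy as np

def dq_weights(x):
    n = len(x)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    a = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                a[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        a[i, i] = -a[i].sum()        # each row of the weight matrix sums to zero
    return a

# Chebyshev-Gauss-Lobatto points on [0, 1] give well-conditioned weights.
n = 11
x = 0.5 * (1 - np.cos(np.pi * np.arange(n) / (n - 1)))
a = dq_weights(x)
print(np.max(np.abs(a @ np.sin(x) - np.cos(x))))   # small: d/dx sin = cos
```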
Abstract: Noise has an adverse effect on human health and comfort. Noise not only causes hearing impairment but also acts as a causal factor for stress and raised systolic pressure. Additionally, it can be a causal factor in work accidents, both by masking hazards and warning signals and by impeding concentration. Industry workers also suffer psychological and physical stress, as well as hearing loss, due to industrial noise. This paper proposes an approach that enables engineers to identify, quantitatively, the noisiest source for modification while multiple machines are operating simultaneously. A model with point sources and spherical radiation in a free field was adopted to formulate the problem. The procedure works very well in the ideal case (point source in a free field). However, most industrial noise problems are complicated by the fact that the noise is confined in a room: reflections from the walls, floor, ceiling, and equipment create a reverberant sound field that alters the sound wave characteristics from those of the free field. The model was therefore validated in a relatively low-absorption room at the NIT Kurukshetra Central Workshop. The validation showed that the sound powers of noise sources estimated under simultaneous operating conditions were on the lower side, within error limits of 3.56-6.35%, suggesting that the methodology is suitable for practical implementation in industry. To demonstrate the application of the analytical procedure for estimating the sound power of noise sources under simultaneous operating conditions, a manufacturing facility (Railway Workshop at Yamunanagar, India) having five sound sources (machines) on its workshop floor is considered in this study. The case study identified the two most effective candidates (noise sources) for noise control in the workshop, and suggests that modifying the design of, and/or replacing, these two noisiest sources would be necessary to achieve an effective reduction in noise levels. Further, the estimated data allows engineers to better understand the noise situation of the workplace and to revise the noise map when levels change due to a workplace re-layout.
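Under the free-field point-source model, mean-square pressures from incoherent sources add linearly, so the simultaneous sound powers can be estimated from multiple microphone readings by least squares. The sketch below uses made-up geometry and levels, and is only an illustration of that inversion, not the paper's exact procedure.

```python
# p2[m] = sum_i (rho*c / (4*pi*r[m, i]**2)) * W[i]  ->  solve for W.
import numpy as np

rho_c = 413.0                                         # air impedance, kg/(m^2 s)
r = np.array([[1.0, 3.0], [3.0, 1.0], [2.0, 2.0]])    # mic-to-source distances, m
W_true = np.array([1e-3, 4e-3])                       # watts (unknown in practice)

G = rho_c / (4 * np.pi * r**2)
p2 = G @ W_true                                       # "measured" mean-square pressures
W_est, *_ = np.linalg.lstsq(G, p2, rcond=None)        # least-squares power estimate
print(W_est)                                          # recovers ~[1e-3, 4e-3]
Lw = 10 * np.log10(W_est / 1e-12)                     # sound power level, dB re 1 pW
print(Lw)
```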
Abstract: A flight management system (FMS) is a specialized computer system that automates a wide variety of in-flight tasks, reducing the workload on the flight crew to the point that modern aircraft no longer carry flight engineers or navigators. The primary function of the FMS is the in-flight management of the flight plan, using various sensors (such as GPS and INS, often backed up by radio navigation) to determine the aircraft's position. From the cockpit, the FMS is normally controlled through a Control Display Unit (CDU), which incorporates a small screen and keyboard or a touch screen. This paper investigates the performance of GPS/INS integration techniques in which the data fusion process is performed using Kalman filtering. This includes the importance of sensor calibration as well as the alignment of the strapdown inertial navigation system. The limitations of inertial navigation systems are investigated in order to understand why an INS is sometimes integrated with other navigation aids rather than operating in stand-alone mode. Finally, both the loosely coupled and tightly coupled configurations are analyzed for several types of situations and operational conditions.
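A minimal sketch of one predict/update cycle of a loosely coupled fusion follows: the INS propagates the state and a GPS position fix corrects it. The 1-D constant-velocity model and all noise values are illustrative assumptions, far simpler than a real navigation filter.

```python
# One-dimensional Kalman filter: INS-style prediction, GPS position update.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])           # GPS measures position only
Q = np.diag([1e-4, 1e-3])            # process noise (INS error growth)
R = np.array([[4.0]])                # GPS position noise variance (m^2)

x = np.array([[0.0], [10.0]])        # initial state estimate
P = np.eye(2)

for gps_pos in [1.1, 2.3, 2.9, 4.2]:           # simulated GPS fixes
    x, P = F @ x, F @ P @ F.T + Q              # predict (INS propagation)
    y = np.array([[gps_pos]]) - H @ x          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y                              # update with the GPS measurement
    P = (np.eye(2) - K @ H) @ P
print(x.ravel())                               # fused position and velocity
```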
Abstract: This paper discusses the design of a nonlinear observer using a formal linearization method based on Chebyshev interpolation, in order to facilitate the synthesis of the observer and to improve the precision of the linearization. A nonlinear dynamic system is linearized with respect to a linearization function, and the measurement equation is transformed into an augmented linear one by the formal linearization method based on Chebyshev interpolation. Linear estimation theory is then applied to the linearized system, and a nonlinear observer is derived. To show the effectiveness of the observer design, numerical experiments are presented; they indicate that the design performs remarkably well for nonlinear systems.
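The Chebyshev-interpolation building block can be sketched with numpy's polynomial module: a nonlinearity is replaced by a truncated Chebyshev expansion, whose coefficients would populate the augmented linear model. The nonlinearity and degree below are arbitrary examples, not the paper's system.

```python
# Fit a truncated Chebyshev expansion to a stand-in nonlinearity.
import numpy as np
from numpy.polynomial import chebyshev as C

f = lambda x: np.sin(x) + 0.3 * x**3          # illustrative nonlinearity
deg = 7
# Chebyshev nodes on [-1, 1]; with deg+1 nodes the fit interpolates exactly.
x = np.cos(np.pi * (np.arange(deg + 1) + 0.5) / (deg + 1))
coeffs = C.chebfit(x, f(x), deg)

xs = np.linspace(-1, 1, 5)
print(np.max(np.abs(C.chebval(xs, coeffs) - f(xs))))   # small truncation error
```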
Abstract: Can biometrics do what everyone is expecting it will? And, more importantly, should it be doing it? Biometrics is the buzzword on everyone's lips, with many trying to use the technology in a wide variety of applications. But all this hype about biometrics can be dangerous without a careful evaluation of the real needs of each application. In this paper I'll try to focus on the dangers of using the right technology at the right time in the wrong place.
Abstract: Data is available in abundance in any business organization. It includes records for finance, maintenance, inventory, progress reports, and so on. As time progresses, the data keeps accumulating, and the challenge is to extract information from this data bank. Knowledge discovery from these large and complex databases is a key problem of this era, and data mining and machine learning techniques are needed that can scale to the size of the problems and be customized to the business application. To derive accurate and relevant information for a particular problem, business analysts need to develop multidimensional models that provide reliable information, so that they can make the right decisions. If the multidimensional model does not possess advanced features, the required accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating such advanced features. The computation is based on data precision and includes a slowly changing time dimension. The final results are displayed in graphical form.
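One common way to realize a slowly changing time dimension is the Type-2 pattern, in which an attribute change appends a new dimension row with validity dates instead of overwriting the old value. The sketch below uses a hypothetical customer dimension; the table layout and data are illustrative, not the paper's model.

```python
# Type-2 slowly changing dimension: close the current row, append a new one.
import datetime as dt

dim_customer = [
    # (surrogate_key, customer_id, region, valid_from, valid_to, current)
    (1, "C001", "North", dt.date(2020, 1, 1), None, True),
]

def apply_change(dim, customer_id, new_region, change_date):
    rows = []
    for row in dim:
        key, cid, region, v_from, v_to, cur = row
        if cid == customer_id and cur:
            rows.append((key, cid, region, v_from, change_date, False))  # close old row
        else:
            rows.append(row)
    rows.append((len(rows) + 1, customer_id, new_region, change_date, None, True))
    return rows

dim_customer = apply_change(dim_customer, "C001", "South", dt.date(2023, 6, 1))
for r in dim_customer:
    print(r)   # facts can now join to the region that was valid at their date
```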
Abstract: A sequential treatment of ozonation followed by a Fenton or photo-Fenton process, using black light lamps (365 nm) in the latter case, has been applied to remove a mixture of pharmaceutical compounds and the generated by-products, both in ultrapure water and in secondary treated wastewater. The scientific-technological innovation of this study stems from the in situ generation of hydrogen peroxide by the direct ozonation of the pharmaceuticals, which can later be used in the application of the Fenton and photo-Fenton processes. The compounds selected as models were sulfamethoxazole and acetaminophen. It should be remarked that the use of a second process is necessary because of the low mineralization yield reached by the exclusive application of ozone. The influence of the water matrix has therefore been studied in terms of hydrogen peroxide concentration, individual compound concentration, and total organic carbon removed. Moreover, the concentration of the different iron species in solution has been measured.
Abstract: Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date, and time. Emails are distinct from other social text streams in layout, format, and conversation style, and they are the most commonly used communication channel for broadcasting and planning events; therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, namely Finite State Machines (FSM) and Hidden Markov Models (HMM), for the extraction of event-related contextual information. An application has been developed that provides a comparison between the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall, and F-score. Experiments show that both methods achieve high performance and accuracy; however, HMM performed better for title extraction, while FSM proved better for venue, date, and time.
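The evaluation metrics are standard; for reference, this sketch computes precision, recall, and F-score per extracted field from true/false positives and false negatives. The counts are illustrative, not the paper's results.

```python
# Precision / recall / F-score per event field.
def prf(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# field: (correct extractions, spurious extractions, missed extractions)
results = {"title": (40, 5, 10), "venue": (35, 8, 12), "date": (48, 2, 4)}
for field, counts in results.items():
    p, r, f = prf(*counts)
    print(f"{field}: P={p:.2f} R={r:.2f} F={f:.2f}")
```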