Abstract: In this paper, an innovative watermarking scheme for audio signals based on genetic algorithms (GA) in the discrete wavelet transform domain is proposed. It is robust against the watermarking attacks commonly employed in the literature. In addition, the quality of the watermarked audio is also considered. We employ a GA to find the optimal localization and intensity of the watermark. The watermark detection process can be performed without using the original audio signal. The experimental results demonstrate that the watermark is inaudible and robust to many digital signal processing operations, such as cropping, low-pass filtering, and additive noise.
Abstract: In this paper we address the issue of classifying the fluorescent intensity of a sample in Indirect Immuno-Fluorescence (IIF). Since IIF is by its very nature a subjective, semi-quantitative test, we discuss a strategy to reliably label the image data set using the diagnoses performed by different physicians. We then discuss image pre-processing, feature extraction and feature selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, which makes the method a candidate for use in daily medical practice, either to pre-select the cases to be examined or to act as a second reader.
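The idea of a classifier that sets aside intrinsically dubious samples, with a flexibly settable error tolerance, can be sketched as a reject-option rule on the classifier's output scores. This is a generic sketch, not the paper's ANN architecture; the threshold name and values are assumptions.

```python
def reject_option_predict(probs, threshold):
    """Reject-option classification (sketch): a sample whose top class
    probability falls below the threshold is flagged as dubious (None)
    and deferred to a human reader. Raising the threshold lowers the
    error rate on accepted samples at the cost of more rejections,
    which is how the error tolerance can be flexibly set."""
    label = max(range(len(probs)), key=lambda i: probs[i])
    return label if probs[label] >= threshold else None
```

In practice the threshold would be tuned on a validation set to hit the target error rate (e.g. under 1%).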
Abstract: A separation-kernel-based operating system (OS) has been designed for use in secure embedded systems by applying formal methods to the design of the separation-kernel part. The separation kernel is a small OS kernel that provides an abstract distributed environment on a single CPU. The design of the separation kernel was verified using two formal methods, the B method and the Spin model checker. A newly designed semi-formal method, the extended state transition method, was also applied. An OS comprising the separation-kernel part and additional OS services on top of the separation kernel was prototyped on the Intel IA-32 architecture. Developing and testing a prototype embedded application, a point-of-sale application, on the prototype OS demonstrated that the proposed architecture, and the use of formal methods to design its kernel part, are effective for achieving a secure embedded system with a high-assurance separation kernel.
Abstract: Sharing consistent and correct master data among disparate applications in a reverse-logistics chain has long been recognized as an intricate problem. Although a master data management (MDM) system can assume that responsibility, applications that need to cooperate with it must comply with the proprietary query interfaces provided by the specific MDM system. In this paper, we present an RFID-ready MDM system that makes master data readily available to any participating application in a reverse-logistics chain. We propose an RFID-wrapper as a part of our MDM system. It acts as a gateway between any data retrieval request and the query interfaces that process it. With the RFID-wrapper, any participating application in a reverse-logistics chain can easily retrieve master data in a way that is analogous to the retrieval of any other RFID-based logistics transactional data.
Abstract: Ad hoc networks are characterized by multihop wireless connectivity, frequently changing network topology, and the need for efficient dynamic routing protocols. We compare the performance of three routing protocols for mobile ad hoc networks: Dynamic Source Routing (DSR), Ad Hoc On-Demand Distance Vector Routing (AODV), and Location-Aided Routing (LAR1). The performance differentials are analyzed under varying network load, mobility, and network size. We simulate the protocols with the GloMoSim simulator. Based on the observations, we make recommendations about the conditions under which each protocol performs best.
Abstract: The volume of XML data exchange is increasing explosively, and the need for efficient mechanisms of XML data management is vital. Many XML storage models have been proposed for storing DTD-independent XML documents in relational database systems. Benchmarking is the best way to highlight the pros and cons of the different approaches. In this study, we use a common benchmark, XMark, to compare the most cited and the newly proposed DTD-independent methods in terms of logical reads, physical I/O, CPU time and duration. We show the effect of the label path, of extracting values and storing them in a separate table, and of the type of join needed for each method's query answering.
Abstract: The scale, complexity and worldwide geographical spread of the LHC computing and data analysis problems are unprecedented in scientific research. The complexity of processing and accessing this data is increased substantially by the size and global span of the major experiments, combined with the limited wide-area network bandwidth available. We present the latest generation of the MONARC (MOdels of Networked Analysis at Regional Centers) simulation framework as a design and modeling tool for large-scale distributed systems applied to HEP experiments. We present simulation experiments designed to evaluate the capabilities of the current real-world distributed infrastructure to support existing physics analysis processes, and the means by which the experiments band together to meet the technical challenges posed by the storage, access and computing requirements of LHC data analysis within the CMS experiment.
Abstract: In this paper, a neural tree (NT) classifier having a simple perceptron at each node is considered. A new concept for building a balanced tree is applied in the learning algorithm of the tree. At each node, if the perceptron's classification is inaccurate and unbalanced, the perceptron is replaced by a new one that separates the training set in such a way that an almost equal number of patterns falls into each class. Moreover, each perceptron is trained only on the classes present at its node, ignoring the other classes. Splitting nodes are introduced into the neural tree architecture to divide the training set when the current perceptron node repeats the same classification as its parent node. A new error function based on the depth of the tree is introduced to reduce the computational time for training a perceptron. Experiments are performed to check the efficiency of the approach, and encouraging results are obtained in terms of accuracy and computational cost.
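A single perceptron node of such a tree can be sketched as below. This is an illustrative sketch only: the abstract does not give the exact form of the depth-based error function, so the stopping tolerance used here (looser at deeper nodes, so deeper perceptrons train for less time) is an assumption, as are all names and constants.

```python
def train_perceptron(X, y, depth, max_epochs=100, lr=0.1):
    """One perceptron node of a neural tree (sketch).
    Assumption: the depth-based error function acts as a stopping
    criterion whose tolerated error rate grows with tree depth,
    cutting training time at deep nodes."""
    w = [0.0] * len(X[0])
    b = 0.0
    tolerance = depth / (depth + 10)  # hypothetical depth-based tolerance
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            if pred != yi:
                errors += 1
                delta = lr * (yi - pred)           # classic perceptron rule
                w = [wj + delta * xj for wj, xj in zip(w, xi)]
                b += delta
        if errors / len(X) <= tolerance:           # stop early at deep nodes
            break
    return w, b
```

A full neural tree would wrap this in a recursive split: train a node, partition the patterns by its output, and recurse, inserting a splitting node when a child repeats the parent's classification.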
Abstract: Optical 3D measurement of objects is valuable in numerous industrial applications. In many cases, shape acquisition of weakly textured objects is essential. Examples are mass-produced parts made of plastic or ceramic, such as housing parts or ceramic bottles, as well as agricultural products like tubers. These parts are often conveyed in a wobbling way during automated optical inspection, so conventional 3D shape acquisition methods such as laser scanning may fail. In this paper, a novel approach for acquiring the 3D shape of weakly textured and moving objects is presented. To facilitate such measurements, an active stereo vision system with structured light is proposed. The system consists of multiple camera pairs and auxiliary laser pattern generators. It performs the shape acquisition within one shot and is well suited for rapid inspection tasks. An experimental setup including hardware and software has been developed and implemented.
Abstract: The successful implementation of Service-Oriented Architecture (SOA) is not confined to Information Technology systems; it requires changes across the whole enterprise. In order to align IT and business, the enterprise requires adequate and measurable methods. The adoption of SOA creates new problems with regard to measuring and analysing performance. In fact, the enterprise should investigate to what extent the development of services will increase the value of the business. Every business needs to measure the extent to which SOA adoption aligns with the goals of the enterprise. Moreover, precise performance metrics, combined with advanced evaluation methodologies, should be defined as a solution. The aim of this paper is to present a systematic methodology for designing a measurement system at the technical and business levels, so that (1) measurement metrics are determined precisely, and (2) the results can be analysed by mapping the identified metrics to the measurement tools.
Abstract: Current OCR technology cannot accurately recognize small text images, such as those found in web images. Our goal is to investigate new approaches to recognizing very low resolution text images containing anti-aliased character shapes. This paper presents a preliminary study on the variability of such characters and the feasibility of discriminating them using geometrical features. In the first stage, we analyze the distribution of these features. In the second stage, we present a study of their discriminative power for recognizing isolated characters, using various rendering methods and font properties. Finally, we present the results of our evaluation tests, leading to our conclusions and future focus.
Abstract: This paper focuses on developing an integrated, reliable and sophisticated model for ultra-large wind turbines, and on studying the performance and analysis of vector control on large wind turbines. With the advance of power electronics technology, the direct-driven multi-pole radial-flux PMSG (Permanent Magnet Synchronous Generator) has proven to be a good choice for wind turbine manufacturers. To study wind energy conversion systems, it is important to develop a wind turbine simulator that is able to reproduce realistic and validated conditions occurring in real ultra-MW wind turbines. Three different packages are used to simulate this model, namely TurbSim, FAST and Simulink. TurbSim is a full-field wind simulator developed by the National Renewable Energy Laboratory (NREL). The wind turbine's mechanical parts are modeled by the FAST (Fatigue, Aerodynamics, Structures and Turbulence) code, which is also developed by NREL. Simulink is used to model the PMSG, the full-scale back-to-back IGBT converters, and the grid.
Abstract: WebGL is typically used with web browsers. In this paper, we present a standalone WebGL execution environment in which original WebGL source code produces the same results as in WebGL-capable web browsers. This standalone environment enables us to run WebGL programs without web browsers and/or internet connections. Our implementation shows the same rendering results as typical web browser outputs. This standalone environment is suitable for low-tier devices and/or debugging purposes.
Abstract: Current research on the semantic web aims at making web pages meaningful to machines, and ontologies play a primary role in this effort. We believe that logic can help ontology languages (such as OWL) become more expressive and efficient. In this paper we combine logic with OWL to reduce some of the disadvantages of this language: we extend OWL with logic and show how logic can satisfy our future expectations of an ontology language.
Abstract: Although face recognition seems an easy task for humans, automatic face recognition is much more challenging due to variations in time, illumination and pose. In this paper, the influence of time-lapse on visible and thermal images is examined. Orthogonal moment invariants are used as a feature extractor to analyze the effect of time-lapse on thermal and visible images, and the results are compared with conventional Principal Component Analysis (PCA). A new triangle square ratio criterion is employed instead of the Euclidean distance to enhance the performance of the nearest-neighbor classifier. The results of this study indicate that, owing to the global characteristic of orthogonal moment invariants, the ideal feature vectors can be represented with high discrimination power. Moreover, the effect of time-lapse is reduced, enhancing the accuracy of face recognition considerably in comparison with PCA. Furthermore, our experimental results based on moment invariants and the triangle square ratio criterion show that the proposed approach achieves a recognition rate that is on average 13.6% higher than that of PCA.
Abstract: Dengue fever is prevalent in Malaysia, with numerous cases, including fatalities, recorded over the years. Public education on the prevention of the disease through various means has been carried out, alongside the enforcement of legal measures to eradicate the breeding grounds of the Aedes mosquito, the dengue vector. Other means therefore need to be explored, such as predicting the seasonal peak period of dengue outbreaks and identifying the climate factors contributing to the increase in the number of mosquitoes. A simulation model can be employed for this purpose. In this study, we created a system dynamics simulation to predict the spread of dengue outbreaks in Hulu Langat, Selangor, Malaysia. The prototype was developed using the STELLA 9.1.2 software. The main data inputs are rainfall, temperature and dengue cases. Analysis of the resulting graphs showed that dengue cases can be predicted accurately using the two main variables, rainfall and temperature. However, the model will be further tested over a longer time period to ensure its accuracy, reliability and efficiency as a prediction tool for dengue outbreaks.
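The core of a system dynamics model like the one built in STELLA is a stock integrated over inflows and outflows driven by the input time series. The toy stock-and-flow sketch below is purely illustrative: the actual equations of the STELLA model are not given in the abstract, so the rate forms, constants and the 28 C temperature optimum here are all assumptions.

```python
def simulate_aedes(rainfall, temperature, dt=1.0,
                   hatch_rate=0.05, death_rate=0.1, m0=100.0):
    """Toy system-dynamics stock-and-flow sketch (hypothetical model).
    The mosquito stock grows via a rainfall-driven hatching inflow,
    modulated by a temperature-suitability factor, and declines at a
    constant per-capita death rate; the stock is Euler-integrated."""
    m = m0
    history = [m]
    for r, t in zip(rainfall, temperature):
        suitability = max(0.0, 1.0 - abs(t - 28.0) / 10.0)  # assumed peak at 28 C
        inflow = hatch_rate * r * suitability
        outflow = death_rate * m
        m += dt * (inflow - outflow)  # Euler step on the stock
        history.append(m)
    return history
```

Under constant rainfall and temperature the stock settles at the equilibrium inflow/death_rate, which is the kind of steady-state behaviour one would check against the STELLA graphs before trusting forecasts.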
Abstract: In this paper we use exponential particle swarm optimization (EPSO) to cluster data. We then compare the EPSO clustering algorithm, which uses an exponential variation of the inertia weight, with the particle swarm optimization (PSO) clustering algorithm, which uses a linear inertia weight. This comparison is evaluated on five data sets. The experimental results show that the EPSO clustering algorithm increases the chance of finding the optimal positions, as it decreases the number of failures. They also show that the EPSO clustering algorithm has a smaller quantization error than the PSO clustering algorithm, i.e., the EPSO clustering algorithm is more accurate than the PSO clustering algorithm.
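The only difference between the two variants above is the inertia-weight schedule, which can be sketched as follows. The exact exponential law used by EPSO is not given in the abstract, so the decay form and all parameter values below are assumptions; the clustering objective is replaced here by a generic minimization to keep the sketch short.

```python
import math, random

def linear_inertia(w_max, w_min, t, t_max):
    """Classic linearly decreasing inertia weight (standard PSO)."""
    return w_max - (w_max - w_min) * t / t_max

def exponential_inertia(w_max, w_min, t, t_max):
    """Exponentially decreasing inertia weight (EPSO sketch; the decay
    constant 4.0 is an assumption, not the paper's law)."""
    return w_min + (w_max - w_min) * math.exp(-4.0 * t / t_max)

def pso_minimize(f, dim, n_particles=20, iters=200, inertia=linear_inertia,
                 w_max=0.9, w_min=0.4, c1=1.5, c2=1.5, bounds=(-5, 5), seed=0):
    """Plain global-best PSO; only the inertia schedule is swappable."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = inertia(w_max, w_min, t, iters)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest
```

For clustering, f would be the quantization error of candidate centroid positions; the swappable `inertia` argument isolates the one design choice the paper compares.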
Abstract: True integration of multimedia services over wired or wireless networks increases productivity and effectiveness in today's networks. The IP Multimedia Subsystem (IMS) is a Next Generation Network architecture for providing multimedia services over fixed or mobile networks. This paper proposes an extended SIP-based QoS management architecture for IMS services over underlying IP access networks. To guarantee end-to-end QoS for IMS services in the interconnection backbone, SIP-based proxy modules are introduced to support QoS provisioning and to reduce the handoff disruption time over IP access networks. In our approach, these SIP modules implement a combination of the DiffServ and MPLS QoS mechanisms to assure guaranteed QoS for real-time multimedia services. To guarantee QoS over the access networks, the SIP modules make QoS resource reservations in advance to provide the best QoS to IMS users over heterogeneous networks. To obtain more reliable multimedia services, our approach allows the use of the SCTP protocol as the transport for SIP instead of UDP, owing to its multi-streaming feature. This architecture enables QoS provisioning for IMS roaming users, differentiating the IMS network from other common IP networks for the transmission of real-time multimedia services. To validate our approach, simulation models are developed on a small scale. The results show that our approach yields comparable performance for the efficient delivery of IMS services over heterogeneous IP access networks.
Abstract: Testing is an activity required in both the development and maintenance phases of the software development life cycle, and integration testing is an important part of it. Integration testing is based on the specification and functionality of the software and can thus be called a black-box testing technique. The purpose of integration testing is to test the integration between software components. In function or system testing, the concern is with overall behavior: whether the software meets its functional specifications or performance characteristics, and how well the software and hardware work together. This explains the importance and necessity of integration testing, whose emphasis is on the interactions between modules and their interfaces. Software errors should be discovered early, during integration testing, to reduce the cost of correction. This paper introduces a new type of integration error and presents an overview of integration testing techniques, comparing the techniques and identifying which technique detects which type of error.
Abstract: It is observed that the weighted least-squares (WLS) technique, including its modifications, results in an equiripple error curve. The resulting error, expressed as a percentage of the ideal value, is highly non-uniformly distributed over the range of frequencies for which the differentiator is designed. The present paper proposes a modification to the technique so that the optimization procedure yields a lower maximum error relative to the ideal values. Simulation results for first-order as well as higher-order differentiators are given to illustrate the excellent performance of the proposed method.