Abstract: We present a low-frequency watermarking method that adapts to image content. The image content is analyzed and properties of the human visual system (HVS) are exploited to generate a visual mask of the same size as the approximation image. Using this mask, we embed the watermark in the approximation image without degrading image quality. Watermark detection is performed without using the original image. Experimental results show that the proposed watermarking method is robust against the most common image processing operations, which are easy to apply and usually do not degrade the image quality.
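The abstract does not name the transform used to obtain the approximation image; the sketch below assumes a single-level Haar DWT (via PyWavelets), a precomputed HVS visual mask of the same size as the approximation band, and simple additive, mask-weighted embedding. It is an illustrative reconstruction of the idea, not the authors' algorithm.

```python
# Minimal sketch of mask-weighted embedding in the approximation band.
# Assumptions (not from the abstract): single-level Haar DWT, a precomputed
# visual mask `mask` with the same shape as the approximation image, and
# additive embedding with global strength `alpha`.
import numpy as np
import pywt

def embed_watermark(image, watermark_bits, mask, alpha=0.05):
    # Decompose the image; cA is the approximation (low-frequency) image.
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    # Spread the {-1, +1} watermark over the approximation coefficients,
    # scaled locally by the HVS mask so sensitive regions receive less energy.
    wm = np.resize(watermark_bits, cA.shape)
    cA_marked = cA + alpha * mask * wm * cA
    # Reconstruct the watermarked image.
    return pywt.idwt2((cA_marked, (cH, cV, cD)), "haar")
```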
Abstract: In the past decade, artificial neural networks (ANNs) have been regarded as an instrument for problem-solving and decision-making; indeed, they have already brought substantial improvements in efficiency and effectiveness to industries and businesses. In this paper, Back-Propagation neural Networks (BPNs) are modularized to demonstrate the performance of the collaborative forecasting (CF) function of a Collaborative Planning, Forecasting and Replenishment (CPFR®) system. CPFR balances sufficient product supply against the necessary customer demand in a Supply and Demand Chain (SDC). Several classical standard BPNs are grouped, combined and exploited for easy implementation of the proposed modular ANN framework based on the topology of an SDC. Each individual BPN is applied as a modular tool to forecast the SKU (Stock-Keeping Unit) levels that are managed and supervised at a POS (point of sale), a wholesaler, and a manufacturer in the SDC. The proposed modular BPN-based CF system is exemplified and experimentally verified using numerous datasets from the simulated SDC. The experimental results show that a complex CF problem can be divided into a group of simpler sub-problems based on the individual trading partners distributed over the SDC, and that SKU forecasting accuracy was satisfactory when the forecast values were compared with the original simulated SDC data. The primary task in implementing an autonomous CF is the study of supervised ANN learning methodology, which aims at making "knowledgeable" decisions for the best SKU sales plan and stock management.
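As an illustration of what one modular BPN in such a framework might look like, the sketch below trains a small back-propagation network to forecast the next-period SKU level at a single trading partner from a window of past demand. The network size, learning rate, window length and toy data are assumptions, not the paper's settings; in the proposed framework one such module would exist per POS, wholesaler and manufacturer.

```python
# Minimal BPN sketch: one hidden layer, sigmoid activations, plain gradient
# descent, inputs are the last 4 normalized demand values (all assumptions).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bpn(X, y, hidden=8, lr=0.1, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Back-propagate the squared error.
        err = out - y.reshape(-1, 1)
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X)
        W1 -= lr * X.T @ d_h / len(X)
    return W1, W2

def forecast(W1, W2, window):
    return sigmoid(sigmoid(window @ W1) @ W2).item()

# Toy usage: demand history scaled to [0, 1], sliding window of length 4.
demand = np.array([0.2, 0.3, 0.5, 0.4, 0.6, 0.7, 0.65, 0.8, 0.75, 0.9])
X = np.array([demand[i:i + 4] for i in range(len(demand) - 4)])
y = demand[4:]
W1, W2 = train_bpn(X, y)
print("next-period SKU forecast:", forecast(W1, W2, demand[-4:]))
```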
Abstract: Optical 3D measurement of objects is meaningful in
numerous industrial applications. In many cases, shape acquisition of weakly textured objects is essential. Examples are series-produced parts made of plastic or ceramic, such as housing parts or ceramic bottles, as well as agricultural products like tubers. These parts are often conveyed in a wobbling way during automated optical inspection. Thus, conventional 3D shape acquisition methods like laser scanning might fail. In this paper, a novel approach for acquiring the 3D shape of weakly textured, moving objects is presented. To facilitate such
measurements an active stereo vision system with structured light is
proposed. The system consists of multiple camera pairs and auxiliary
laser pattern generators. It performs the shape acquisition within one
shot and is beneficial for rapid inspection tasks. An experimental
setup including hardware and software has been developed and
implemented.
Abstract: The Ad hoc On-Demand Distance Vector (AODV) routing protocol is designed for mobile ad hoc networks (MANETs). AODV offers quick adaptation to dynamic link conditions; it is characterized by low memory overhead and low network utilization. The security issues related to the protocol remain challenging for wireless network designers. Numerous schemes have been proposed for establishing secure communication between end users; these schemes treat the secure operation of AODV as a two-tier task (routing and secure exchange of information at separate levels). Our endeavor in this paper is to achieve routing and secure data exchange in a single step. This allows the user nodes to perform routing, mutual authentication, and the generation and secure exchange of a session key in one step, thus ensuring confidentiality, integrity and authentication of data exchange in a more suitable way.
Abstract: Although face recognition seems an easy task for humans, automatic face recognition is much more challenging due to variations in time, illumination and pose. In this paper, the
influence of time-lapse on visible and thermal images is examined.
Orthogonal moment invariants are used as a feature extractor to
analyze the effect of time-lapse on thermal and visible images and the
results are compared with conventional Principal Component
Analysis (PCA). A new triangle square ratio criterion is employed
instead of Euclidean distance to enhance the performance of the nearest neighbor classifier. The results of this study indicate that the ideal
feature vectors can be represented with high discrimination power
due to the global characteristic of orthogonal moment invariants.
Moreover, the proposed approach reduces the effect of time-lapse and enhances the accuracy of face recognition considerably in comparison with PCA. Furthermore, our experimental results based on moment invariants and the triangle square ratio criterion show that the proposed approach achieves, on average, a 13.6% higher recognition rate than PCA.
Abstract: Information and Communication Technologies (ICT) in mathematics education is a very active field of research and innovation, in which learning is understood as meaningful and as grasping multiple linked representations rather than rote memorization. A large body of literature offers a wide range of theories, learning approaches, methodologies and interpretations, generally stressing the potential of ICT for teaching and learning. Despite the use of new ICT-based learning approaches, students still experience difficulties in learning the concepts relevant to understanding mathematics, and much remains unclear about the relationship between the computer environment, the activities it might support, and the knowledge that might emerge from such activities. Many questions arise in this regard: to what extent does the use of ICT help students in the process of understanding and solving tasks or problems? Is it possible to identify which aspects or features of students' mathematical learning can be enhanced by the use of technology? This paper highlights the value of integrating information and communication technologies (ICT) into the teaching and learning of mathematics (quadratic functions); it aims to investigate the effect of four instructional methods on students' mathematical understanding and problem solving. Quantitative and qualitative methods are used to report on 43 students in middle school. Results showed that mathematical thinking and problem solving evolve as students engage with ICT activities and learn cooperatively.
Abstract: The feature extraction method(s) used to recognize
hand-printed characters play an important role in ICR applications.
In order to achieve a high recognition rate for a recognition system, the choice of a feature that suits the given script is certainly an important task. Even if a new feature has to be designed for a given script, it is essential to know the recognition ability of the existing features for that script. Devanagari script is used in various Indian languages besides Hindi, the mother tongue of the majority of Indians. This research examines a variety of feature extraction approaches, which have been used in various ICR/OCR applications, in the context of Devanagari hand-printed script. The study is conducted theoretically and experimentally on more than 10 feature extraction methods. These feature extraction methods have been evaluated on a Devanagari hand-printed database comprising more than 25000 characters belonging to 43 alphabets. The recognition ability of the features has been evaluated using three classifiers, namely k-NN, MLP and SVM.
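For readers who want a concrete picture of such an evaluation, the sketch below scores one candidate feature set with the three classifiers named in the abstract using cross-validation. It uses scikit-learn with placeholder hyperparameters and is not the authors' experimental protocol; X is the extracted feature matrix and y the character labels.

```python
# Minimal sketch: evaluate one feature set with k-NN, MLP and SVM
# (placeholder settings, 5-fold cross-validation).
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def evaluate_feature_set(X, y):
    classifiers = {
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "MLP": MLPClassifier(hidden_layer_sizes=(100,), max_iter=500),
        "SVM": SVC(kernel="rbf"),
    }
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")
```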
Abstract: One of the long-standing challenges in mobile robotics is the ability to navigate autonomously, avoiding modeled and unmodeled obstacles, especially in crowded and unpredictably changing environments. A successful way of structuring the navigation task to deal with this problem is through behavior-based navigation approaches. In this study, the issues of individual behavior design and action coordination of the behaviors are addressed using fuzzy logic. A layered approach is employed in which a supervision layer decides, based on the context, which behavior(s) to process (activate), rather than processing all behaviors and then blending the appropriate ones; as a result, time and computational resources are saved.
Abstract: Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have shown improved classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to separate gamma signals from the overwhelming background of hadron and muon signals, a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. WEKA, a machine learning toolkit, was used to carry out the experiments.
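The experiments described were run in WEKA; purely as an illustration of the bagging mechanism applied to a non-tree base learner, the following Python sketch builds a bagged ensemble of SVMs by bootstrap resampling and majority voting. The base learner, ensemble size and label encoding are placeholder assumptions.

```python
# Minimal sketch of bagging with a non-tree base classifier.
import numpy as np
from sklearn.base import clone
from sklearn.svm import SVC

def bagging_fit(base, X, y, n_estimators=25, seed=0):
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap sample
        models.append(clone(base).fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    votes = np.stack([m.predict(X) for m in models])  # (n_estimators, n_samples)
    # Majority vote per sample; labels must be non-negative integers.
    return np.array([np.bincount(col).argmax() for col in votes.T])

# Usage on a gamma/hadron-style binary task with labels {0, 1}:
# models = bagging_fit(SVC(kernel="rbf"), X_train, y_train)
# y_pred = bagging_predict(models, X_test)
```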
Abstract: Although the World Wide Web is considered the largest source of information in existence today, its inherently dynamic characteristics mean that the task of finding useful and qualified information can become a very frustrating experience. This study presents research on Web information mining systems and proposes an implementation of these systems by means of components built using Web services technology. This implies that they can encompass the features offered by a service-oriented architecture (SOA), and that specific components may be used by other tools, independent of platform or programming language. Hence, the main objective of this work is to provide an architecture for Web mining systems, divided into stages, where each step is a component that incorporates the characteristics of SOA. The
separation of these steps was designed based upon the existing
literature. Interesting results were obtained and are shown here.
Abstract: Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
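The core alignment step behind PDDTW can be illustrated with a short sketch: dynamic time warping computed on first derivatives of the signals rather than the raw samples. The derivative estimate (simple first differences) and the plain quadratic-time dynamic program are assumptions for illustration, not the paper's exact formulation.

```python
# Minimal sketch of derivative dynamic time warping.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def derivative_dtw(x, y):
    # Align the derivative (slope) sequences instead of the raw samples.
    return dtw_distance(np.diff(x), np.diff(y))
```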
Abstract: We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We have carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown statistically significant differences among the corresponding mean task completion times. We have also found a strong correlation between task completion time and the age of the subjects, and we have performed test-retest reliability checks on a sample of 84 children, obtaining high Pearson coefficients for the dominant and non-dominant hand in the ranges 0.74-0.97 and 0.62-0.93, respectively.
A new MATLAB-based programming tool for the analysis of cardiologic RR intervals and blood pressure descriptors has also been developed. For each set of data, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain and 4 in Poincaré plot analysis. In addition, twelve different parameters of baroreflex sensitivity are calculated. All these data sets can be visualized in time
domain together with their power spectra and Poincaré plots. If
available, the respiratory oscillation curves can be also plotted for
comparison. Another application processes biological data obtained
from BLAST analysis.
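As an illustration of the kind of descriptors such a tool extracts, the sketch below computes two common time-domain measures and the Poincaré plot indices SD1/SD2 from an RR series. It is a Python illustration of the standard formulas, not the authors' MATLAB implementation, and the parameter set is only a subset of the ten mentioned above.

```python
# Minimal sketch of standard RR-interval descriptors.
import numpy as np

def rr_descriptors(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sdnn = rr.std(ddof=1)                      # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))        # short-term variability
    sd1 = diff.std(ddof=1) / np.sqrt(2)        # Poincaré width (short-term)
    sd2 = np.sqrt(max(2 * sdnn ** 2 - sd1 ** 2, 0.0))  # Poincaré length
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

# Example: print(rr_descriptors([812, 830, 795, 805, 840, 822]))
```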
Abstract: A sequential decision problem, based on the task of identifying the species of trees given acoustic echo data collected from them, is considered with well-known stochastic classifiers, including single and mixture Gaussian models. Echoes are processed with a preprocessing stage based on a model of mammalian cochlear filtering, using a new discrete low-pass filter characteristic. Stopping time performance of the sequential decision process is evaluated and compared. It is observed that the new low-pass filter processing results in faster sequential decisions.
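The sequential decision idea can be sketched as follows: per-echo log-likelihoods under each species' class model are accumulated until the evidence for one class exceeds a threshold, and the echo index at which this happens is the stopping time. One-dimensional single-Gaussian class models and a symmetric threshold are assumptions made here for brevity; the paper also considers mixture models.

```python
# Minimal sketch of a sequential (SPRT-style) decision between two species.
from scipy.stats import norm

def sequential_decide(echo_features, model_a, model_b, threshold=5.0):
    # model_a / model_b are (mean, std) pairs estimated from training echoes.
    llr = 0.0
    for t, x in enumerate(echo_features, start=1):
        llr += norm.logpdf(x, *model_a) - norm.logpdf(x, *model_b)
        if llr >= threshold:
            return "species A", t      # decision and stopping time
        if llr <= -threshold:
            return "species B", t
    return "undecided", len(echo_features)
```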
Abstract: High-frequency (HF) communications have been used by military organizations for more than 90 years. The possibility of very long range communication without the need for advanced equipment makes HF a convenient and inexpensive alternative to satellite communications. Besides its advantages, voice and data transmission over HF is a challenging task, because the HF channel generally suffers from Doppler shift and spread, multi-path propagation, co-channel interference, and many other sources of noise. In constructing an HF data modem, all these effects must be taken into account. STANAG 4539 is a NATO standard for high-speed data transmission over HF. It allows data rates of up to 12800 bps over an HF channel of 3 kHz. In this work, an efficient implementation of STANAG 4539 on a single Texas Instruments' TMS320C6747 DSP chip is described. The state-of-the-art algorithms used in the receiver and the efficiency of the implementation enable real-time high-speed data / digitized voice transmission over poor HF channels.
Abstract: Use of the Internet and the World-Wide-Web
(WWW) has become widespread in recent years and mobile agent
technology has proliferated at an equally rapid rate. In this scenario
load balancing becomes important for P2P systems. Moreover, P2P systems can be highly heterogeneous, i.e., they may consist of peers that range from old desktops to powerful servers connected to the Internet through high-bandwidth lines. Various load balancing policies have come into the picture. A primitive one is the Message Passing Interface (MPI). Its wide availability and portability make it an attractive choice; however, the communication requirements are sometimes inefficient when implementing the primitives provided by MPI. In this scenario we use the concept of mobile agents, because the mobile agent (MA) based approach has the merits of high flexibility, efficiency, low network traffic and low communication latency, and is highly asynchronous. In this study we present a decentralized load balancing scheme using mobile agent technology, in which, when a node is overloaded, tasks migrate to less utilized nodes so as to share the workload. The decision of which nodes receive migrating tasks is made in real time by defining certain load balancing policies. These policies are executed on PMADE (A Platform for Mobile Agent Distribution and Execution) in a decentralized manner using JuxtaNet, and various load balancing metrics are discussed.
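A minimal sketch of the kind of threshold-based migration decision described (which node an overloaded peer should hand a task to) is given below. The load representation, threshold value and function names are hypothetical; the actual policies run on PMADE agents rather than in a single process.

```python
# Minimal sketch: an overloaded node picks the least utilized remote node
# as the migration target (illustrative names and threshold).
def pick_migration_target(local_node, loads, high_water=0.8):
    """loads: dict mapping node id -> current utilization in [0, 1]."""
    if loads[local_node] < high_water:
        return None                      # not overloaded, keep the task
    candidates = {n: u for n, u in loads.items() if n != local_node}
    target = min(candidates, key=candidates.get)
    return target if candidates[target] < loads[local_node] else None

# Example: pick_migration_target("n1", {"n1": 0.92, "n2": 0.35, "n3": 0.60})
```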
Abstract: In this paper, we propose a fixed formatting method for PPX (Pretty Printer for XML). PPX is a query language for XML databases with an extensive formatting capability that produces HTML as the result of a query. The fixed formatting method completely specifies the combination of variables and layout specification operators within the layout expression of the GENERATE clause of PPX. In the experiment, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.
Abstract: This paper examines the modeling and analysis of a
cruise control system using a Petri net based approach, task graphs,
invariant analysis and behavioral properties. It shows how the
structures used can be verified and optimized.
Abstract: Tandem mass spectrometry (MS/MS) is the engine
driving high-throughput protein identification. Protein mixtures possibly
representing thousands of proteins from multiple species are
treated with proteolytic enzymes, cutting the proteins into smaller
peptides that are then analyzed, generating MS/MS spectra. The
task of determining the identity of the peptide from its spectrum
is currently the weak point in the process. Current approaches to de
novo sequencing are able to compute candidate peptides efficiently.
The problem lies in the limitations of current scoring functions. In this
paper we introduce the concept of proteome signature. By examining
proteins and compiling proteome signatures (amino acid usage) it is
possible to characterize likely combinations of amino acids and better
distinguish between candidate peptides. Our results strongly support
the hypothesis that a scoring function that considers amino acid usage
patterns is better able to distinguish between candidate peptides. This
in turn leads to higher accuracy in peptide prediction.
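A minimal sketch of the proteome-signature idea follows: compile amino acid usage frequencies from a set of proteins and score candidate peptides by how consistent their composition is with that usage. The log-frequency score and add-one smoothing are assumptions for illustration, not the paper's scoring function.

```python
# Minimal sketch: amino acid usage signature and a composition-based score.
from collections import Counter
from math import log

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def proteome_signature(proteins):
    counts = Counter(aa for seq in proteins for aa in seq)
    total = sum(counts[aa] + 1 for aa in AMINO_ACIDS)       # add-one smoothing
    return {aa: (counts[aa] + 1) / total for aa in AMINO_ACIDS}

def usage_score(peptide, signature):
    # Higher score = composition more consistent with the proteome's usage.
    return sum(log(signature[aa]) for aa in peptide if aa in signature)

# Candidate peptides from de novo sequencing could then be re-ranked:
# best = max(candidates, key=lambda p: usage_score(p, signature))
```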
Abstract: The tasks of this work were to study possible E. coli contamination in red deer meat, to identify pathogenic strains among the isolated E. coli, to determine their incidence in red deer meat, and to determine the presence of the VT1, VT2 and eaeA genes in the pathogenic E. coli. Eight (10%) samples were randomly selected from 80 analysed E. coli isolates and PCR was performed on them. PCR was done both on the initial materials - samples of red deer meat - and on the already isolated cultures. Two of the analysed venison samples contained verotoxin-producing strains of E. coli, which means that this meat is not safe for the consumer. It was proven by the sequencing reaction of E. coli and by comparison of the obtained results with the microorganism genome database available on the Internet that the isolated culture corresponds to the 16S rDNA region of E. coli, thus confirming the correctness of the microbiological methods.
Abstract: This paper presents a modified version of the maximum urgency first scheduling algorithm. The maximum urgency first algorithm combines the advantages of fixed and dynamic scheduling to provide dynamically changing systems with
flexible scheduling. This algorithm, however, has a major
shortcoming due to its scheduling mechanism which may cause a
critical task to fail. The modified maximum urgency first scheduling
algorithm resolves the mentioned problem. In this paper, we propose
two possible implementations for this algorithm by using either
earliest deadline first or modified least laxity first algorithms for
calculating the dynamic priorities. These two approaches are compared by simulating the two algorithms, and the earliest deadline first algorithm is then recommended as the preferred implementation. Afterwards, we compare our proposed algorithm with the maximum urgency first algorithm using simulation, and the results are presented. It is shown that modified maximum urgency first is superior to maximum urgency first, since it usually causes fewer task preemptions and hence less related overhead. It also leads to fewer failed non-critical tasks in overloaded situations.
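A minimal sketch of the dispatch rule implied by the preferred implementation follows: critical tasks always outrank non-critical ones, and within each group the task with the earliest deadline runs first. The task fields and names are illustrative, not the paper's notation, and preemption handling is omitted.

```python
# Minimal sketch: critical tasks first, then earliest deadline first (EDF)
# as the dynamic priority within each group.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline: float     # absolute deadline
    critical: bool      # True for the critical set

def pick_next(ready_tasks):
    if not ready_tasks:
        return None
    # Sort key: critical tasks first (not True sorts before not False is
    # inverted via negation), then earliest deadline.
    return min(ready_tasks, key=lambda t: (not t.critical, t.deadline))

# Example:
# pick_next([Task("sensor", 12.0, True), Task("log", 5.0, False)])
# -> the critical "sensor" task runs even though "log" has an earlier deadline.
```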