Abstract: In this paper, a fragile watermarking scheme is proposed for authenticating specified objects in color images. The color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. The T channel corresponds to the chrominance component of a color image and is orthogonal to the Y and S channels, and is therefore selected for embedding the watermark. The T channel is first divided into 2×2 non-overlapping blocks and the two LSBs of each pixel are set to zero. The object that is to be authenticated is also divided into 2×2 non-overlapping blocks, and each block's intensity mean is computed and encoded into eight bits. The generated watermark is then embedded into the LSBs of randomly selected 2×2 blocks of the T channel using a 2D torus automorphism. The selection of block size is paramount for exact localization and recovery of the work. The proposed scheme is blind, efficient and secure, with the ability to detect and locate even minor tampering applied to the image, with full recovery of the original work. The quality of the watermarked media is quite high both subjectively and objectively. The technique is suitable for the class of images with formats such as GIF, TIFF or bitmap.
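The embedding step described above can be sketched as follows; the automorphism parameter k, the block layout and the two-bits-per-block payload are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def torus_automorphism(x, y, k, n):
    """One iteration of a 2D torus automorphism used to scatter block
    coordinates: (x, y) -> (x + y, k*x + (k+1)*y) mod n. The matrix
    [[1, 1], [k, k+1]] has determinant 1, so the map is a permutation."""
    return (x + y) % n, (k * x + (k + 1) * y) % n

def embed_block_bits(channel, watermark_bits):
    """Zero the two LSBs of every pixel, then write two watermark bits
    into the LSBs of the top-left pixel of each 2x2 block (a simplified
    stand-in for the paper's per-block embedding)."""
    out = channel & 0b11111100
    blocks_per_row = channel.shape[1] // 2
    for i, (b0, b1) in enumerate(watermark_bits):
        r, c = 2 * (i // blocks_per_row), 2 * (i % blocks_per_row)
        out[r, c] |= (b0 << 1) | b1
    return out
```

Because the automorphism is a bijection on the block grid, the embedder and the (blind) verifier can regenerate the same pseudo-random block ordering from the key k alone.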
Abstract: In this paper, we propose a multiple-objective optimization model for the portfolio selection problem faced by investors looking to diversify their equity investments across a number of equity markets. Based on Markowitz's M-V model, we develop a Fuzzy Mixed Integer Multi-Objective Nonlinear Programming Problem (FMIMONLP) to maximize the investors' future gains on equity markets and to find the optimal proportion of the budget to be invested in different equities. A numerical example with a comprehensive analysis of artificial data from several equity markets is presented in order to illustrate the proposed model and its solution method. The model performed well compared with the deterministic version of the model.
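As a point of reference for the deterministic Markowitz core the model builds on, a minimal mean-variance sketch (not the fuzzy mixed-integer model itself, and with no integer or fuzzy constraints) might look like:

```python
import numpy as np

def markowitz_weights(mu, cov, risk_aversion):
    """Unconstrained mean-variance optimum: maximize
    w' mu - (lambda/2) w' Sigma w, whose solution is
    w = (lambda * Sigma)^-1 mu, then renormalize so the
    full budget is invested (weights sum to one)."""
    w = np.linalg.solve(risk_aversion * cov, mu)
    return w / w.sum()
```

With uncorrelated assets of equal variance, the budget splits in proportion to expected return, which matches intuition for this toy case.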
Abstract: The proper selection of the AC-side passive filter
interconnecting the voltage source converter to the power supply is
essential to obtain satisfactory performance of an active power filter
system. The use of an LCL-type filter has the advantage of
eliminating the high-frequency switching harmonics in the current
injected into the power supply. This paper is mainly focused on
analyzing the influence of the interface filter parameters on the active
filtering performance. Some design aspects are pointed out. Thus,
the design of the AC interface filter starts from transfer functions, by
imposing the filter performance requirement of significant
attenuation of the switching harmonics in the current without
affecting the harmonics to be compensated. A Matlab/Simulink model of the entire
active filtering system, including a concrete nonlinear load, has been
developed to examine the system performance. It is shown that a
gamma LC filter can accomplish the required attenuation of the
current provided by the converter. Moreover, the existence of an optimal
value of the grid-side inductance which minimizes the total harmonic
distortion factor of the power supply current is pointed out.
Nevertheless, a small converter-side inductance and a damping
resistance in series with the filter capacitance are absolutely needed
in order to keep the ripple and oscillations of the current at the
converter side within acceptable limits. The effect of changes in the
LCL-filter parameters is evaluated. It is concluded that good active
filtering performance can be achieved with small values of the
capacitance and converter-side inductance.
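The transfer-function starting point mentioned above can be illustrated with the textbook LCL model with series damping; the formula and the component values in the test are generic assumptions, not the paper's design data.

```python
import numpy as np

def lcl_gain(f, L1, L2, C, Rd):
    """Magnitude of the grid-side current over the converter voltage,
    i2(s)/v(s) = (1 + s*C*Rd) /
                 (s^3*L1*L2*C + s^2*C*Rd*(L1+L2) + s*(L1+L2)),
    for an LCL filter (converter-side L1, grid-side L2, capacitor C
    with damping resistor Rd in series) and an ideal stiff grid."""
    s = 1j * 2 * np.pi * f
    num = 1 + s * C * Rd
    den = s**3 * L1 * L2 * C + s**2 * C * Rd * (L1 + L2) + s * (L1 + L2)
    return abs(num / den)
```

Evaluating this gain at the fundamental and at the switching frequency shows the behavior the abstract describes: the low-frequency harmonics to be compensated pass almost unattenuated, while switching harmonics are suppressed by orders of magnitude.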
Abstract: Hierarchical Mobile IPv6 (HMIPv6) was designed to
support IP micro-mobility management in the Next Generation
Networks (NGN) framework. The main design idea behind this protocol
is the use of a Mobility Anchor Point (MAP), located at any router
level of the network, to support hierarchical mobility management.
However, the distance-based MAP selection in HMIPv6 causes MAP
overload and increasingly frequent binding updates as the network
grows. Therefore, to address these issues in designing a MAP selection
scheme, we propose a dynamic load control mechanism integrated with
a speed detection mechanism (DMS-DLC). The experimental results
show that the proposed scheme gives a better distribution of MAP load
and increases handover speed.
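A minimal sketch of the idea of combining load control with speed detection; the data structures, thresholds and tie-breaking rules below are entirely hypothetical, not the DMS-DLC specification.

```python
def select_map(maps, mn_speed, speed_threshold=20.0):
    """Hypothetical MAP selection combining speed detection with load
    control. Each MAP is (name, distance_in_hops, load_ratio). A fast
    mobile node prefers a higher (farther) MAP to reduce binding
    updates; a slow one picks the least-loaded nearby MAP. Fully
    loaded MAPs (load_ratio >= 1.0) are never chosen."""
    candidates = [m for m in maps if m[2] < 1.0]
    if mn_speed >= speed_threshold:
        return max(candidates, key=lambda m: m[1])[0]   # farthest usable MAP
    return min(candidates, key=lambda m: (m[2], m[1]))[0]  # least loaded, then nearest
```

The point of the split is the one the abstract makes: distance alone concentrates traffic on one anchor, whereas factoring in load and node speed spreads bindings across MAPs.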
Abstract: Nowadays, one of the most important problems of
metropolises and large world cities is traffic congestion and the
lack of sufficient parking sites for vehicles. The city of Esfahan,
the third metropolis of Iran, has encountered vehicle parking
problems in most parts of its fourteen districts. The unprincipled
and unsystematic dispersal, and the shortage, of parking sites in
the city have created an unfavorable traffic situation and have
increased air and noise pollution; in addition, they waste a large
portion of the money and time of citizens and travelers in urban
pathways and disturb their mental calm, leading to intense
dissatisfaction. In this study, using the AHP model in a GIS
environment, the effective criteria for selecting public parking
sites have been combined with each other, and the results of
overlaying the created layers represent the suitability extents for
parking. The results of this research indicate a fairly appropriate
selection of public parking sites in district 3 of Esfahan; but the
inconsequential dispersal and shortage of parking sites in this
district have caused abundant transportation problems in the city
of Esfahan.
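The AHP weighting at the heart of the method can be sketched as the standard principal-eigenvector computation on a pairwise-comparison matrix; the example matrix is made up, not the study's criteria.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise-comparison matrix
    (entry [i, j] says how much more important criterion i is than j):
    the principal eigenvector, normalized to sum to one. These weights
    would then scale the GIS layers before overlaying them."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return principal / principal.sum()
```

For a perfectly consistent 2x2 matrix saying criterion 1 is twice as important as criterion 2, the weights come out 2:1.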
Abstract: The paper presents a multimodal approach to biometric authentication based on multiple classifiers. The proposed solution uses a post-classification biometric fusion method in which the outputs of the biometric data classifiers are combined in order to improve the overall biometric system performance by decreasing the classification error rates. The paper also shows that the biometric recognition task can be improved by means of careful feature selection, since not all of the feature vector components contribute to the accuracy improvement.
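A minimal sketch of post-classification score-level fusion, assuming a weighted-sum combiner (one common rule; the paper's exact fusion method may differ):

```python
def fuse_scores(scores, weights, threshold=0.5):
    """Score-level fusion after classification: combine the match
    scores of the individual biometric classifiers as a normalized
    weighted sum, and accept when the fused score clears the decision
    threshold. Weights would typically reflect per-classifier accuracy."""
    fused = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
    return fused, fused >= threshold
```

The benefit claimed in the abstract comes from exactly this structure: a strong modality can outvote a noisy one, lowering the combined error rate relative to either classifier alone.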
Abstract: Concatenative speech synthesis is a method that can
produce speech with the naturalness and high individuality of a
speaker by drawing on a large speech corpus. Based on this method, in
this paper, we propose a voice conversion method whose converted
speech has high individuality and naturalness. The authors also
conducted two subjective evaluation experiments assessing the
individuality and sound quality of the converted speech. From the
results, the following three facts were confirmed: (a) the proposed
method can convert the individuality of speakers well, (b) employing
the unit selection framework (especially the join cost) of
concatenative speech synthesis in conventional voice conversion
improves the sound quality of the converted speech, and (c) the
proposed method is robust against a difference in gender between the
source speaker and the target speaker.
Abstract: Databases have become ubiquitous. Almost all IT applications store information into and retrieve it from databases. Retrieving information from a database requires knowledge of technical languages such as Structured Query Language (SQL). However, the majority of the users who interact with databases do not have a technical background and are intimidated by the idea of using languages such as SQL. This has led to the development of a few Natural Language Database Interfaces (NLDBIs). An NLDBI allows the user to query the database in a natural language. This paper highlights the architecture of a new NLDBI system, describes its implementation and discusses the results obtained. In most typical NLDBI systems, the natural language statement is converted into an internal representation based on the syntactic and semantic knowledge of the natural language. This representation is then converted into queries using a representation converter. A natural language query is translated into an equivalent SQL query after processing through various stages. The work has been tested on primitive database queries with certain constraints.
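The final translation stage can be caricatured with a toy pattern-based mapper; the sentence pattern, table and column names below are purely illustrative, and a real NLDBI would go through the syntactic/semantic intermediate representation first.

```python
import re

def nl_to_sql(question, table, columns):
    """Toy sketch of the last step of an NLDBI pipeline: map an
    already-parsed question of the form 'show X where Y is Z' to an
    SQL string. Unknown columns or unmatched phrasings yield None
    instead of a (possibly wrong) query."""
    m = re.match(r"show (\w+) where (\w+) is (\w+)", question.lower())
    if not m:
        return None
    col, filt_col, value = m.groups()
    if col not in columns or filt_col not in columns:
        return None
    return f"SELECT {col} FROM {table} WHERE {filt_col} = '{value}'"
```

Rejecting anything outside the known schema mirrors the "certain constraints" the abstract mentions: a restricted query class keeps the translation unambiguous.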
Abstract: The density estimates considered in this paper comprise
a base density and an adjustment component consisting of a linear
combination of orthogonal polynomials. It is shown that, in the
context of density approximation, the coefficients of the linear combination
can be determined either from a moment-matching technique
or a weighted least-squares approach. A kernel representation of
the corresponding density estimates is obtained. Additionally, two
refinements of the Kronmal-Tarter stopping criterion are proposed
for determining the degree of the polynomial adjustment. By way of
illustration, the density estimation methodology advocated herein is
applied to two data sets.
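One concrete instance of the moment-matching technique described above, assuming a standard normal base density and probabilists' Hermite polynomials as the orthogonal family (the paper's base density and polynomial family may differ):

```python
import numpy as np
from math import factorial

def hermite_he(j, x):
    """Probabilists' Hermite polynomial He_j(x) via the recurrence
    He_{j+1}(x) = x*He_j(x) - j*He_{j-1}(x)."""
    h0, h1 = np.ones_like(x), x
    if j == 0:
        return h0
    for k in range(1, j):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def adjusted_density(x, sample, degree):
    """Base N(0,1) density times a polynomial adjustment whose
    coefficients come from moment matching against the sample:
    c_j = mean(He_j(sample)) / j!  (a Gram-Charlier-type estimate)."""
    base = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    adj = sum(np.mean(hermite_he(j, sample)) / factorial(j) * hermite_he(j, x)
              for j in range(degree + 1))
    return base * adj
```

At degree 0 the adjustment is identically 1, so the estimate reduces to the base density; higher-degree terms bend the base toward the sample's moments, and a stopping criterion such as Kronmal-Tarter decides where to truncate.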
Abstract: Pharmacology curriculum plays an integral role in
medical education. Learning pharmacology to choose and prescribe
drugs is a major challenge encountered by students. We developed
pharmacology applied learning activities for first year medical
students that included realistic clinical situations with escalating
complications which required the students to analyze the situation
and think critically to choose a safe drug. Tutor feedback was
provided at the end of each session. An evaluation was done to assess
the students' level of interest and the usefulness of the sessions in
the rational selection of drugs. The majority (98%) of the students
agreed that the session was an extremely useful learning exercise and
that similar sessions would help in the rational selection of drugs. Applied
learning sessions in the early years of medical program may promote
deep learning and bridge the gap between pharmacology theory and
clinical practice. Besides, it may also enhance safe prescribing skills.
Abstract: Computer worm detection is commonly performed by
antivirus software tools that rely on prior explicit knowledge of the
worm's code (detection based on code signatures). We present an
approach for detection of the presence of computer worms based on
Artificial Neural Networks (ANN) using the computer's behavioral
measures. Identification of significant features, which describe the
activity of a worm within a host, is commonly acquired from security
experts. We suggest acquiring these features by applying feature
selection methods. We compare three different feature selection
techniques for the dimensionality reduction and identification of the
most prominent features to capture efficiently the computer behavior
in the context of worm activity. Additionally, we explore three
different temporal representation techniques for the most prominent
features. In order to evaluate the different techniques, several
computers were infected with five different worms and 323 different
features of the infected computers were measured. We evaluated
each technique by preprocessing the dataset according to each one
and training the ANN model with the preprocessed data. We then
evaluated the ability of the model to detect the presence of a new
computer worm, in particular, during heavy user activity on the
infected computers.
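A minimal sketch of one filter-type feature selection technique of the kind the paper compares; correlation ranking is an assumption for illustration, not necessarily one of the three methods actually used.

```python
import numpy as np

def rank_features(X, y, k):
    """Rank the measured behavioral features (columns of X) by
    absolute Pearson correlation with the infected/clean label y and
    return the indices of the top k. This reduces the 323 raw
    measurements to the most worm-indicative ones before training."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc**2).sum(axis=0) * (yc**2).sum())
    corr = np.abs(Xc.T @ yc) / np.where(denom == 0, np.inf, denom)
    return np.argsort(corr)[::-1][:k]
```

The selected column indices would then drive both the temporal-representation step and the ANN's input layer size.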
Abstract: A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are combined in order to create a speech coding algorithm assuring better signal quality than a traditional CELP parametric codec. Two approaches are discussed. One is based on separating the signal into voiced components, which are encoded using the parametric algorithm; unvoiced components, which are encoded perceptually; and transients, which remain unencoded. The second approach uses perceptual encoding of the residual signal in a CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved using the proposed hybrid codec is compared to the quality of some standard speech codecs.
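The voiced/unvoiced/transient split can be sketched with simple energy and zero-crossing measures; the decision rules and thresholds here are illustrative assumptions, not the paper's precise transient selection algorithm.

```python
import numpy as np

def classify_frame(frame, prev_energy, zcr_thresh=0.3, jump=4.0):
    """Illustrative frame classifier for the hybrid split: a sharp
    energy jump relative to the previous frame marks a transient, a
    high zero-crossing rate marks an unvoiced frame, and everything
    else is treated as voiced. Returns (label, frame_energy)."""
    energy = float(np.mean(frame**2))
    zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)
    if prev_energy > 0 and energy / prev_energy > jump:
        return "transient", energy
    if zcr > zcr_thresh:
        return "unvoiced", energy
    return "voiced", energy
```

Each label then routes the frame to its coder: voiced frames to the parametric stage, unvoiced frames to the perceptual stage, transients left unencoded as the abstract describes.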
Abstract: There is an urgent need to develop novel
Mycobacterium tuberculosis (Mtb) drugs that are active against
drug-resistant bacteria but, more importantly, kill persistent
bacteria. Our study is structured around an integrated analysis of
metabolic pathways, small-molecule screening and similarity search
in the PubChem database. Metabolic analysis approaches based on
unified weighting were used for potent target selection. Our results
suggest pantothenate synthetase (panC) and 3-methyl-2-oxobutanoate
hydroxymethyl transferase (panB) as appropriate drug targets. In our
study, we used pantothenate synthetase because of the existence of
known inhibitors. We report the discovery of new antitubercular
compounds through ligand-based approaches using computational tools.
Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) is an efficient method of data transmission for high speed
communication systems. However, the main drawback of OFDM
systems is that they suffer from a high Peak-to-Average
Power Ratio (PAPR), which causes inefficient use of the High Power
Amplifier and can limit transmission efficiency. An OFDM signal
consists of a large number of independent subcarriers, as a result of
which its amplitude can have high peak values. In this paper,
we propose an effective reduction scheme that combines DCT and
SLM techniques. The scheme is composed of the DCT followed by
the SLM using the Riemann matrix to obtain phase sequences for the
SLM technique. The simulation results show PAPR can be greatly
reduced by applying the proposed scheme. While plain OFDM exhibited
a high PAPR of about 10.4 dB, the proposed method achieved a
reduction of about 4.7 dB in the PAPR with low computational
complexity. This approach also avoids randomness in phase sequence
selection, which makes it simpler to decode at the receiver. As an
added benefit, the matrices can be generated at the receiver end to
recover the data signal, and hence there is no need to transmit side
information (SI).
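The PAPR measurement and the SLM candidate selection can be sketched as follows; generic unit-magnitude phase rows stand in here for the deterministic sequences the paper derives from the Riemann matrix, and the DCT precoding step is omitted.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a discrete-time signal, in dB."""
    p = np.abs(x)**2
    return 10 * np.log10(p.max() / p.mean())

def slm_select(symbols, phase_rows):
    """Core of the SLM step: multiply the symbol vector by each
    candidate unit-magnitude phase sequence, take the IFFT, and keep
    the candidate time-domain signal with the lowest PAPR."""
    best = None
    for row in phase_rows:
        cand = np.fft.ifft(symbols * row)
        if best is None or papr_db(cand) < papr_db(best):
            best = cand
    return best
```

With a deterministic phase bank (as with the Riemann-matrix rows), the receiver can regenerate every candidate itself, which is why no side information needs to be transmitted.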
Abstract: Testing accounts for the major percentage of technical
contribution in the software development process. Typically, it
consumes more than 50 percent of the total cost of developing a
piece of software. The selection of software tests is a very important
activity within this process to ensure the software reliability
requirements are met. Generally tests are run to achieve maximum
coverage of the software code and very little attention is given to the
achieved reliability of the software. Using an existing methodology,
this paper describes how to use Bayesian Belief Networks (BBNs) to
select unit tests based on their contribution to the reliability of the
module under consideration. In particular the work examines how the
approach can enhance test-first development by assessing the quality
of test suites resulting from this development methodology and
providing insight into additional tests that can significantly improve
the achieved reliability. In this way the method can produce an
optimal selection of inputs and the order in which the tests are
executed to maximize the software reliability. To illustrate this
approach, a belief network is constructed for a modern software
system incorporating the expert opinion, expressed through
probabilities of the relative quality of the elements of the software,
and the potential effectiveness of the software tests. The steps
involved in constructing the Bayesian network are explained, as is a
method to account for the test suite resulting from test-driven
development.
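The kind of probabilistic update a belief network chains together can be illustrated at a single node; the probabilities below are made up for the example, not elicited expert opinion from the paper.

```python
def posterior_reliable(prior, p_pass_if_reliable, p_pass_if_faulty):
    """Bayes update at one node of the belief network: the probability
    that the module is reliable after observing a single passing test,
    P(R | pass) = P(pass | R) P(R) / P(pass). The full BBN chains many
    such updates across the software's elements and test outcomes."""
    num = p_pass_if_reliable * prior
    den = num + p_pass_if_faulty * (1 - prior)
    return num / den
```

A test whose pass probability barely differs between reliable and faulty modules moves the posterior very little, which is exactly why the method prefers tests with high expected information gain when ordering the suite.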
Abstract: The selection of parents and breeding strategies for
successful maize hybrid production will be facilitated by
heterotic grouping of parental lines and determination of their
combining abilities. Fourteen maize inbred lines, used in maize breeding
programs in Iran, were crossed in a diallel mating design. The 91 F1
hybrids and the 14 parental lines were studied during two years at
four locations in Iran to investigate the combining ability of the
genotypes for grain yield and to determine heterotic patterns among
germplasm sources, using both Griffing's method and the biplot
approach for diallel analysis. The graphical representation offered by
biplot analysis allowed a rapid and effective overview of general
combining ability (GCA) and specific combining ability (SCA)
effects of the inbred lines, their performance in crosses, as well as
grouping patterns of similar genotypes. GCA and SCA effects were
significant for grain yield (GY). Based on significant positive GCA
effects, the lines derived from LSC could be used as parents in crosses
to increase GY. The maximum best-parent heterosis values and
highest SCA effects resulted from the crosses B73 × MO17 and A679 ×
MO17 for GY. The best heterotic pattern was LSC × RYD, which
would be potentially useful in maize breeding programs to obtain
high-yielding hybrids in similar climates of Iran.
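A simplified version of the general combining ability computation (a plain deviation-from-grand-mean estimator on the diallel table, not Griffing's exact method with its sample-size corrections) can be sketched as:

```python
import numpy as np

def gca_effects(cross_means):
    """Simplified GCA estimates from a diallel table of F1 cross means
    (symmetric matrix, diagonal ignored): for each parent, the mean of
    its crosses minus the grand mean of all crosses. A positive value
    marks a parent that raises yield across its hybrids."""
    m = np.asarray(cross_means, dtype=float)
    n = m.shape[0]
    mask = ~np.eye(n, dtype=bool)
    grand = m[mask].mean()
    parent_means = np.array([m[i][mask[i]].mean() for i in range(n)])
    return parent_means - grand
```

In a 14-parent diallel like the one described, the 91 F1 means fill the off-diagonal of such a matrix, and lines with significantly positive GCA (here, those from LSC) are the ones recommended as parents.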
Abstract: In this paper, we consider the effect of the initial
sample size on the performance of a sequential approach used
to select a good enough simulated system when the number
of alternatives is very large. We implement the sequential approach
on an M/M/1 queuing system under several parameter settings, with
different choices of the initial sample size, to explore the impacts on
the performance of this approach. The results show that the choice
of the initial sample size does affect the performance of our selection
approach.
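The simulated system can be sketched with a standard M/M/1 waiting-time simulation via Lindley's recursion; the rates and replication length are illustrative, not the paper's parameter settings.

```python
import random

def mm1_mean_wait(arrival_rate, service_rate, n_customers, seed=0):
    """One replication of an M/M/1 queue using Lindley's recursion for
    the waiting time in queue, W_{k+1} = max(0, W_k + S_k - A_{k+1}),
    with exponential service times S and interarrival times A.
    Returns the mean wait over n_customers, i.e. one sample of the
    performance measure the sequential procedure compares."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        s = rng.expovariate(service_rate)
        a = rng.expovariate(arrival_rate)
        w = max(0.0, w + s - a)
    return total / n_customers
```

The initial sample size studied in the paper corresponds to how many such replications (or customers per replication) each alternative gets before the sequential screening starts eliminating systems.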
Abstract: The study in this paper underlines the importance of
the correct joint selection of the spreading codes at the transmitter
side and of the detector at the receiver side for the uplink of
multi-carrier code division multiple access (MC-CDMA) in the presence
of nonlinear distortion due to the high power amplifier (HPA). The bit
error rate (BER) of the system for different spreading sequences
(Walsh codes, Gold codes, orthogonal Gold codes, Golay codes and
Zadoff-Chu codes) and different kinds of receivers (the minimum
mean-square error receiver (MMSE-MUD) and the microstatistic
multi-user receiver (MSF-MUD)) is compared by means of simulations
of the MC-CDMA transmission system. Finally, the results of the
analysis show that the application of MSF-MUD in combination with
Golay codes can significantly outperform the other tested spreading
codes and receivers for all commonly used HPA models.
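The Walsh spreading codes among those compared can be generated with the Sylvester Hadamard construction, whose row orthogonality is what makes them usable as MC-CDMA spreading sequences:

```python
import numpy as np

def walsh_codes(n):
    """Walsh spreading codes of length n (a power of two) as the rows
    of a Sylvester-constructed Hadamard matrix H_{2m} = [[H_m, H_m],
    [H_m, -H_m]]. The rows are mutually orthogonal: H H^T = n I."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h
```

Orthogonality holds only at perfect synchronization; the abstract's point is that under HPA nonlinearity the practical ranking of code families changes, with Golay codes plus MSF-MUD coming out ahead.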
Abstract: This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the work was split into two parts. The first part of the paper discussed the modeling of the problems and showed how real-coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The novel proposed chromosome representation produces only feasible solutions, which minimizes the computational time needed by the GA to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves the FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
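A random-key style decoding of a real-valued chromosome, sketched here as one way such a representation can guarantee feasibility without repair (the paper's actual encoding may differ):

```python
import numpy as np

def decode_chromosome(keys, capacity, processing_times):
    """Random-key decoding for a part type selection sketch: each gene
    is a real priority in [0, 1); part types are considered in
    descending priority and accepted only while machine capacity
    remains. Every real-valued chromosome therefore decodes to a
    feasible selection, so the GA never needs a repair step."""
    order = np.argsort(keys)[::-1]          # highest priority first
    remaining = float(capacity)
    selected = []
    for j in order:
        if processing_times[j] <= remaining:
            selected.append(int(j))
            remaining -= processing_times[j]
    return sorted(selected)
```

Standard real-coded crossover and mutation then operate directly on the key vector, and the population stays inside the feasible region by construction, which is the time saving the abstract attributes to the representation.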
Abstract: The number of features required to represent an image
can be very large. Using all available features to recognize objects
can suffer from the curse of dimensionality. Feature selection and
extraction is the pre-processing step of image mining. One main issue
in analyzing images is the effective identification of features, and
another is their extraction. The mining problem focused on here is
the grouping of features for different shapes. Experiments have been
conducted using the shape outline as the feature. Shape outline
readings are put through a normalization and dimensionality
reduction process using an eigenvector-based method to produce a
new set of readings. After this pre-processing step, the data are
grouped by their shapes. Through statistical analysis of these
readings together with peak measures, a robust classification and
recognition process is achieved. Tests showed that the suggested
methods are able to automatically recognize objects through their
shapes. Finally, experiments also demonstrate the system's invariance
to rotation, translation, scale, reflection and a small degree of
distortion.
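The eigenvector-based dimensionality reduction step is essentially principal component analysis, which can be sketched as:

```python
import numpy as np

def reduce_outlines(readings, n_components):
    """Eigenvector-based dimensionality reduction of normalized shape
    outline readings (PCA): center the data, take the eigenvectors of
    its covariance matrix, and project onto the leading n_components
    directions of greatest variance."""
    X = np.asarray(readings, dtype=float)
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    top = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return Xc @ top
```

For outline readings that actually lie near a low-dimensional subspace, the projection keeps nearly all the variance while shrinking the feature vector, which is what makes the subsequent grouping by shape tractable.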