Abstract: This paper presents a novel method for remaining
useful life prediction using the Elliptical Basis Function (EBF)
network and a Markov chain. The EBF structure is trained by a
modified Expectation-Maximization (EM) algorithm in order to take
into account the missing covariate set. No explicit extrapolation is
needed for internal covariates, while a Markov chain is constructed to
represent the evolution of external covariates in the study. The
estimated external and the unknown internal covariates constitute an
incomplete covariate set, which is then analyzed by the EBF
network to provide survival information for the asset. The
case study shows that the method slightly underestimates the remaining
useful life of an asset, which is a desirable property for early maintenance
decisions and resource planning.
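The evolution of an external covariate by a Markov chain can be sketched as follows. This is a minimal illustration: the three states (e.g., low/medium/high operating load) and the transition matrix are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical 3-state chain for an external covariate (e.g. load level:
# low / medium / high). The transition probabilities are illustrative only.
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

def state_distribution(p0, P, n_steps):
    """Propagate an initial state distribution n_steps ahead: p_n = p_0 P^n."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ P
    return p

# Distribution of the external covariate 10 steps into the future,
# starting from the "low" state with certainty.
p10 = state_distribution([1.0, 0.0, 0.0], P, 10)
```

In the paper's setting, such a forecast distribution over external covariate states would be fed to the EBF network together with the (missing) internal covariates.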
Abstract: The last is regarded as the critical foundation of shoe
design and development. A computer-aided methodology for various
last form designs is proposed in this study. Reverse engineering is
applied to scan the last form. Then, with minimum-energy revision of
surface continuity, the last surface is reconstructed from the
feature curves of the scanned last. Once the surface reconstruction of the
last is completed, the weighted arithmetic mean method is applied to
compute the shape morphing of the last's control mesh, so that 3D
last forms of different sizes are generated from the original form
while its functional features are preserved. Finally, the result of this study is
applied in a 3D last reconstruction system. The
practicability of the proposed methodology is verified through
case studies.
Abstract: Key management is the most critical and sensitive part
of cryptographic systems. It includes key generation,
key distribution, key storage, and key deletion. It is also considered
the hardest part of cryptography. Designing secure cryptographic
algorithms is hard, and keeping the keys secret is much harder.
Cryptanalysts usually attack both symmetric and public key
cryptosystems through their key management. We introduce a
protocol to exchange cipher keys over an insecure communication
channel. This protocol is based on public key cryptography,
specifically elliptic curve cryptosystems. It also tests the cipher
keys, selecting only the good keys and rejecting the weak ones.
Abstract: In a wide-area environment such as a Grid, data
placement is an important aspect of distributed database systems. In
this paper, we address the problem of the initial placement of
non-replicated database fragments in a Grid architecture. We propose
a graph-based approach that considers resource restrictions. The goal is to
optimize the use of computing, storage and communication
resources. The proposed approach is developed in two phases: in the
first phase, we perform fragment grouping using knowledge about
fragment dependencies and, in the second phase, we determine an
efficient placement of the fragment groups on the Grid. We also
show, via experimental analysis, that our approach gives solutions
that are close to being optimal for different databases and Grid
configurations.
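The two phases can be sketched in miniature. The abstract does not detail the paper's grouping or placement heuristics, so this sketch stands in connected components of the dependency graph for phase one and a greedy first-fit bin packing for phase two; the capacity model is an assumption.

```python
from collections import defaultdict

def group_fragments(fragments, dependencies):
    """Phase 1 (stand-in): group fragments into connected components
    of an undirected dependency graph."""
    adj = defaultdict(set)
    for a, b in dependencies:
        adj[a].add(b)
        adj[b].add(a)
    seen, groups = set(), []
    for f in fragments:
        if f in seen:
            continue
        stack, comp = [f], []
        seen.add(f)
        while stack:
            u = stack.pop()
            comp.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        groups.append(sorted(comp))
    return groups

def place_groups(groups, sizes, node_capacity):
    """Phase 2 (stand-in): greedy first-fit placement of fragment groups
    on Grid nodes, each with the same storage capacity."""
    nodes, loads = [], []
    for g in sorted(groups, key=lambda g: -sum(sizes[f] for f in g)):
        need = sum(sizes[f] for f in g)
        for i in range(len(loads)):
            if loads[i] + need <= node_capacity:
                nodes[i].extend(g)
                loads[i] += need
                break
        else:
            nodes.append(list(g))   # open a new node for this group
            loads.append(need)
    return nodes
```

Co-locating dependent fragments on one node is what reduces communication cost; the capacity check models the storage restriction.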
Abstract: Text categorization is the problem of classifying text
documents into a set of predefined classes. After a preprocessing
step, the documents are typically represented as large sparse vectors.
When training classifiers on large collections of documents, both the
time and memory restrictions can be quite prohibitive. This justifies
the application of feature selection methods to reduce the
dimensionality of the document-representation vector. In this paper,
three feature selection methods are evaluated: Random Selection,
Information Gain (IG) and Support Vector Machine feature selection
(called SVM_FS). We show that the best results were obtained with
the SVM_FS method for a relatively small dimension of the feature
vector. We also present a novel method to better correlate the SVM
kernel's parameters (polynomial or Gaussian kernels).
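Of the three selection methods, Information Gain has a compact closed form. A generic sketch for a binary term feature (this is the standard IG definition, not the paper's code):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(doc_labels, term_present):
    """IG of a binary term feature: H(C) - H(C | term present/absent).
    doc_labels[i] is the class of document i; term_present[i] says
    whether the term occurs in document i."""
    base = entropy(doc_labels)
    n = len(doc_labels)
    cond = 0.0
    for value in (True, False):
        subset = [c for c, t in zip(doc_labels, term_present) if t == value]
        if subset:
            cond += len(subset) / n * entropy(subset)
    return base - cond
```

Features are then ranked by IG and only the top-scoring dimensions of the sparse document vectors are kept.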
Abstract: Due to important issues, such as deadlock, starvation,
communication, non-deterministic behavior and synchronization,
concurrent systems are very complex, sensitive, and error-prone.
Thus, ensuring the reliability and accuracy of these systems is
essential, and there has been considerable interest in the formal
specification of concurrent programs in recent years. Nevertheless,
some features of concurrent systems, such as dynamic process
creation, scheduling and starvation have not been specified formally
yet. Also, some other features have been specified partially and/or
have been described using a combination of several different
formalisms and methods whose integration requires considerable effort. In
other words, a comprehensive and integrated specification that could
cover all aspects of concurrent systems has not been provided yet.
Thus, this paper makes two major contributions: firstly, it provides a
comprehensive formal framework to specify all well-known features
of concurrent systems. Secondly, it provides an integrated
specification of these features by using just a single formal notation,
i.e., the Z language.
Abstract: Character segmentation is an important preprocessing
step for text recognition. In degraded documents, the presence of
touching characters drastically decreases the recognition rate of any
optical character recognition (OCR) system. In this paper we have
proposed a complete solution for segmenting touching characters in
all the three zones of printed Gurmukhi script. A study of touching
Gurmukhi characters is carried out and these characters have been
divided into various categories after a careful analysis. Structural
properties of the Gurmukhi characters are used for defining the
categories. New algorithms have been proposed to segment the
touching characters in the middle, upper, and lower zones.
These algorithms have shown a reasonable improvement in
segmenting the touching characters in degraded printed Gurmukhi
script. The algorithms proposed in this paper are applicable only to
machine printed text. We have also discussed a new and useful
technique to segment the horizontally overlapping lines.
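A common building block for such segmentation is the vertical projection profile. The sketch below cuts a strip of touching characters at local minima of the profile; it is a generic baseline, not the structural-property-based algorithms the paper proposes for Gurmukhi.

```python
import numpy as np

def segment_touching(zone):
    """Split a strip of touching characters at columns where the vertical
    projection profile (count of ink pixels per column) has a strict
    local minimum. zone: 2D array with 1 = ink, 0 = background."""
    profile = zone.sum(axis=0)
    cuts = [x for x in range(1, len(profile) - 1)
            if profile[x] < profile[x - 1] and profile[x] < profile[x + 1]]
    pieces, start = [], 0
    for x in cuts:
        pieces.append(zone[:, start:x])
        start = x
    pieces.append(zone[:, start:])
    return pieces
```

Touching characters defeat the simpler rule "cut where the profile is zero", which is why the local-minimum criterion (and, in the paper, structural knowledge of the script) is needed.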
Abstract: Control of complex systems is an important field in complex systems research; it not only relies on the essence of complex systems, captured by the core concept of emergence, but also embodies the elementary concepts of control theory. Aiming at a clear and self-contained description of emergence, this paper introduces a formal way to completely describe the formation and dynamics of emergence in complex systems. It then presents the Emergence-Oriented Control methodology, which contains three basic control schemes: direct control, system re-structuring, and system calibration. As a universal ontology, Emergence-Oriented Control provides a powerful tool for identifying and resolving control problems in specific systems.
Abstract: Very large and/or computationally complex optimization problems sometimes require parallel or high-performance computing to achieve a reasonable computation time. One of the most popular and most complicated problems of this family is the Traveling Salesman Problem. In this paper we introduce a Branch and Bound based algorithm for the solution of such complicated problems; its main focus is the symmetric traveling salesman problem. We reviewed some of the already available algorithms and identified the need for a new algorithm that gives an optimal or near-optimal solution. Using logarithmic sampling, the proposed algorithm was found to produce near-optimal solutions and to deliver excellent performance compared with the traditional algorithms of this series.
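A minimal exact branch-and-bound for the symmetric TSP looks as follows. The bound used here (prune when the partial tour already costs as much as the best complete tour) is deliberately simple; the paper's logarithmic-sampling refinement is not reproduced.

```python
import math

def tsp_branch_and_bound(dist):
    """Exact symmetric TSP by depth-first branch and bound.
    dist: symmetric distance matrix; returns (best_cost, best_tour)."""
    n = len(dist)
    best = [math.inf, None]

    def dfs(path, cost, remaining):
        if cost >= best[0]:
            return                                   # bound: prune branch
        if not remaining:
            total = cost + dist[path[-1]][path[0]]   # close the tour
            if total < best[0]:
                best[0], best[1] = total, path[:]
            return
        # visit nearest cities first so a good incumbent is found early
        for city in sorted(remaining, key=lambda c: dist[path[-1]][c]):
            dfs(path + [city], cost + dist[path[-1]][city],
                remaining - {city})

    dfs([0], 0, set(range(1, n)))
    return best[0], best[1]
```

Ordering children by nearest-neighbour distance tightens the incumbent quickly, which is what makes the pruning test effective.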
Abstract: In this paper we propose the first two non-generic
constructions of multisignature schemes based on coding theory. The
first system makes use of the CFS signature scheme and is secure
in the random oracle model, while the second scheme is based on the KKS
construction. The security of our constructions relies
on a difficult problem in coding theory: the Syndrome Decoding
problem, which has been proved NP-complete [4].
Abstract: Distributed computing systems are usually considered the most suitable model for practical solutions of many parallel algorithms. In this paper an enhanced distributed system is presented to improve the time complexity of Binary Indexed Trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system shows that it reduces the time complexity of the read query to O(log(log(N))) and the update query to constant complexity, while the naive solution has a time complexity of O(log(N)) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation has shown that the overhead resulting from the wiring and communication between the system fragments can be safely neglected, which makes it practical to reach the maximum speed-up offered by the proposed model.
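For reference, the O(log N) single-processor baseline that the proposed distributed system improves upon is the classic Binary Indexed Tree:

```python
class BinaryIndexedTree:
    """Classic single-processor BIT (Fenwick tree): O(log N) prefix-sum
    read and O(log N) point update -- the naive baseline."""
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)      # 1-indexed internal array

    def update(self, i, delta):
        """Add delta to element i (1-indexed)."""
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)              # climb to the next responsible node

    def prefix_sum(self, i):
        """Sum of elements 1..i."""
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)              # drop the lowest set bit
        return s
```

Both loops touch one node per set bit of the index, hence the O(log N) bound that the distributed memory design shortens to O(log log N) reads and constant-time updates.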
Abstract: In this paper we use data mining techniques to investigate factors that contribute significantly to the risk of acute coronary syndrome. We take the diagnosis as the dependent variable, with dichotomous values indicating the presence or absence of the disease, and apply binary regression to the factors affecting it. The data set was taken from two different cardiac hospitals in Karachi, Pakistan. We have sixteen variables in total, of which one is the dependent variable and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, a data reduction technique, principal component analysis, is applied. Based on the results of the data reduction, we retained only fourteen of the sixteen factors.
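The PCA-then-regression pipeline can be sketched in NumPy. This is a generic implementation of the two standard techniques named in the abstract, not the paper's actual model or data; the synthetic usage below is purely illustrative.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project standardized data onto its top principal components."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xs, rowvar=False))
    top = np.argsort(eigvals)[::-1][:n_components]   # largest variance first
    return Xs @ eigvecs[:, top]

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Plain gradient-descent binary (logistic) regression for a
    dichotomous outcome. Returns weights including an intercept."""
    Xb = np.hstack([np.ones((len(X), 1)), X])        # intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(X, w):
    """Threshold the fitted probabilities at 0.5."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
```

Reducing correlated clinical factors to a few components before regression stabilizes the coefficients, which is the motivation the abstract gives for the data reduction step.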
Abstract: Acoustic-imaging-based sound localization using a
microphone array is a challenging task in digital signal processing.
Discrete Fourier transform (DFT) based near-field acoustical holography
(NAH) is an important acoustical technique for sound source
localization and provides an efficient solution to the ill-posed problem.
In practice, however, due to the use of a small, curtailed aperture
and the consequent significant spectral leakage, the DFT cannot
reconstruct the active-region-of-sound (AROS) effectively, especially
near the edges of the aperture. In this paper, we highlight the
fundamental problems of DFT-based NAH and provide a solution to
the spectral leakage effect through extrapolation based on linear predictive
coding and 2D Tukey windowing. This approach has been tested on
the localization of single and multi-point sound sources. We observe that
incorporating the extrapolation technique increases the spatial resolution
and localization accuracy and reduces spectral leakage when a small,
curtailed aperture with a low number of sensors is used.
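The 2D Tukey (tapered cosine) window mentioned above is separable, so it can be built as the outer product of two 1D windows. A self-contained NumPy sketch (SciPy's `scipy.signal.windows.tukey` provides the 1D window directly; the LPC extrapolation step is not reproduced here):

```python
import numpy as np

def tukey(n, alpha=0.5):
    """1D Tukey window: flat centre of width (1 - alpha), with
    cosine tapers of total width alpha at the two edges."""
    if alpha <= 0:
        return np.ones(n)
    x = np.linspace(0.0, 1.0, n)
    w = np.ones(n)
    left = x < alpha / 2
    w[left] = 0.5 * (1 + np.cos(np.pi * (2 * x[left] / alpha - 1)))
    right = x >= 1 - alpha / 2
    w[right] = 0.5 * (1 + np.cos(np.pi * (2 * x[right] / alpha
                                          - 2 / alpha + 1)))
    return w

def tukey2d(rows, cols, alpha=0.5):
    """Separable 2D Tukey window for tapering the measured aperture,
    which suppresses leakage from the abrupt aperture edges."""
    return np.outer(tukey(rows, alpha), tukey(cols, alpha))
```

Multiplying the (extrapolated) aperture data by this window rolls the edges smoothly to zero, which is what reduces the DFT spectral leakage near the aperture boundary.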
Abstract: Automatic keyphrase extraction is useful in efficiently
locating specific documents in online databases. While several
techniques have been introduced over the years, improvement in the
accuracy rate has been minimal. This research examines attribute scores for
author-supplied keyphrases to better understand how the scores affect
the accuracy rate of automatic keyphrase extraction. Five attributes
are chosen for examination: Term Frequency, First Occurrence, Last
Occurrence, Phrase Position in Sentences, and Term Cohesion
Degree. The results show that First Occurrence is the most reliable
attribute. Term Frequency, Last Occurrence and Term Cohesion
Degree display a wide range of variation but are still usable with
suggested tweaks. Only Phrase Position in Sentences shows a totally
unpredictable pattern. The results imply that the commonly used
ranking approach, which directly extracts the top-ranked potential phrases
from the candidate keyphrase list as the keyphrases, may not be reliable.
Abstract: This paper presents an application of content-based
image retrieval that extracts color features from natural images
stored in an image database by segmenting each image through
clustering. We employ a class of nonparametric techniques in which
the data points are regarded as samples from an unknown probability
density. Explicit computation of the density is avoided by using the
mean shift procedure, a robust clustering technique, which does not
require prior knowledge of the number of clusters, and does not
constrain the shape of the clusters. A non-parametric technique for
the recovery of significant image features is presented, and a
segmentation module is developed using the mean shift algorithm to
segment each image. In these algorithms, the only user-set parameter
is the resolution of the analysis, and either gray-level or color images
are accepted as inputs. Extensive experimental results illustrate
excellent performance.
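The mean shift procedure itself can be sketched with a flat kernel. This toy version operates on raw feature points (in the paper the points would be pixel colors, possibly joined with coordinates); the single `bandwidth` parameter plays the role of the analysis resolution, and no cluster count is ever specified:

```python
import numpy as np

def mean_shift(points, bandwidth, n_iter=30):
    """Flat-kernel mean shift: repeatedly move each point's mode estimate
    to the mean of the original points within `bandwidth`, then merge
    modes that converge close together into cluster centres."""
    modes = points.astype(float).copy()
    for _ in range(n_iter):
        for i in range(len(modes)):
            near = points[np.linalg.norm(points - modes[i], axis=1)
                          <= bandwidth]
            modes[i] = near.mean(axis=0)
    centres, labels = [], np.empty(len(points), dtype=int)
    for i, m in enumerate(modes):
        for k, c in enumerate(centres):
            if np.linalg.norm(m - c) < bandwidth / 2:
                labels[i] = k           # same basin of attraction
                break
        else:
            centres.append(m)           # a new mode / cluster centre
            labels[i] = len(centres) - 1
    return np.array(centres), labels
```

Because each point simply climbs the estimated density toward its nearest mode, clusters of arbitrary shape emerge without any prior on their number, which is the property the abstract emphasizes.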
Abstract: Over the past five decades, textured polyester yarns produced by the false-twist method have been the most
important and most mass-produced man-made fibers. There are
many cross-section parameters that affect the physical and mechanical properties of textured yarns. These parameters
are surface area, perimeter, equivalent diameter, large
diameter, small diameter, convexity, stiffness, eccentricity, and hydraulic diameter, and they were evaluated by
digital image processing techniques. To find trends between production criteria and the evaluated cross-section parameters, three criteria of the production line were adjusted and different types of yarns were produced. These criteria are
temperature, drafting ratio, and D/Y ratio. Finally, the relations between production criteria and cross-section parameters were
examined. The results showed that the presented technique can recognize and measure the parameters of the fiber cross section with acceptable accuracy. The optimum adjustment conditions
were also estimated from the results of the image analysis.
Abstract: In this paper, a parallel interface for a microprocessor
trainer is implemented. A programmable parallel-port device such
as the IC 8255A is initialized for simple input or output and for
handshake input or output by selecting the appropriate mode. The hardware
connections and the programs can be used to interface a
microprocessor trainer and a personal computer via the IC 8255A.
Assembly programs edited in the PC's editor can be downloaded to
the trainer.
Abstract: In this paper we present a novel technique for data
hiding in binary document images. We use the concept of entropy in
order to identify document specific least distortive areas throughout
the binary document image. The document image is treated as any
other image and the proposed method utilizes the standard document
characteristics for the embedding process. The proposed method
minimizes perceptual distortion due to embedding and allows
watermark extraction without requiring any side information
at the decoder end.
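The entropy-based selection of embedding regions can be sketched as follows. The interpretation that higher-entropy (busier) blocks are the least distortive places to flip pixels is our assumption for this sketch; the paper's document-specific criterion may differ.

```python
import math

def block_entropy(block):
    """Shannon entropy of a binary image block (pixel values 0/1)."""
    flat = [p for row in block for p in row]
    n = len(flat)
    ones = sum(flat)
    ent = 0.0
    for c in (ones, n - ones):
        if c:
            ent -= (c / n) * math.log2(c / n)
    return ent

def least_distortive_blocks(image, size, k):
    """Return the top-left corners of the k highest-entropy size x size
    blocks, where embedding flips are assumed to be least noticeable."""
    h, w = len(image), len(image[0])
    scored = []
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            block = [row[x:x + size] for row in image[y:y + size]]
            scored.append((block_entropy(block), (y, x)))
    scored.sort(reverse=True)
    return [pos for _, pos in scored[:k]]
```

Uniform background blocks score zero entropy and are skipped, so any bit flipped there would be conspicuous; mixed text/edge blocks score high and absorb the watermark bits.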
Abstract: The colors of the human skin represent a special
category of colors, because they are distinctive from the colors of
other natural objects. This category is found as a cluster in color
spaces, and the skin color variations between people are mostly due
to differences in the intensity. Besides, the face detection based on
skin color detection is a faster method as compared to other
techniques. In this work, we present a system to track faces by
carrying out skin color detection in four different color spaces: HSI,
YCbCr, YES and RGB. Once skin color regions have been
detected in each color space, we label each region and extract
characteristics such as size and position, assuming that a face
is located in one of the detected regions. Next, we compare the labeled
regions and apply a polling strategy among them to determine the final
region where the face has effectively been detected and located.
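One of the four detectors plus the polling step can be sketched as follows. The fixed YCbCr box (Cb in [77, 127], Cr in [133, 173]) is a commonly cited rule of thumb, not necessarily the thresholds used in the paper, and the majority vote stands in for the paper's region-level polling:

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Per-pixel skin classification with a fixed YCbCr chrominance box.
    rgb: H x W x 3 array. Luminance Y is ignored, which is what makes
    the rule largely intensity-invariant."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

def poll(masks):
    """Majority vote across the per-colour-space skin masks."""
    votes = np.sum(np.stack(masks).astype(int), axis=0)
    return votes > len(masks) / 2
```

In the full system, analogous masks from HSI, YES and RGB would be labeled into regions first, and the poll taken over regions rather than raw pixels.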
Abstract: In this paper a fast motion estimation method for
H.264/AVC named Triplet Search Motion Estimation (TS-ME) is
proposed. Similar to traditional fast motion estimation
methods and their improved variants, which restrict the search points
to some selected candidates to decrease the computational
complexity, the proposed algorithm separates the motion search process into
several steps, but with some new features. First, the proposed algorithm
searches for the real motion area using the proposed triplet patterns instead of
selected search points, to avoid dropping into a local minimum.
Then, in the localized motion area a novel 3-step motion search
algorithm is performed. The proposed search patterns are categorized into
three rings on the basis of their distance from the search center. These
three rings are adaptively selected by referencing the surrounding
motion vectors to terminate the motion search process early. On the
other hand, computation reduction for the sub-pixel motion search is also
discussed, considering the appearance probability of the sub-pixel
motion vector. The simulation results show that motion estimation speed
improves by a factor of up to 38 with the proposed algorithm compared
with the H.264/AVC reference software, with negligible picture
quality loss.
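The abstract does not fully specify the triplet or ring patterns, so as background here is the classic three-step block-matching search that such localized multi-step methods build on, using sum-of-absolute-differences as the matching cost:

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def three_step_search(ref, cur, by, bx, bs=8, step=4):
    """Classic three-step search: evaluate a 3x3 grid of offsets around the
    current best motion vector, keep the minimum-SAD offset, halve the
    step, repeat. Returns the (dy, dx) motion vector for the block of
    size bs at (by, bx) in the current frame."""
    block = cur[by:by + bs, bx:bx + bs]
    best = (0, 0)
    while step >= 1:
        candidates = [(best[0] + dy, best[1] + dx)
                      for dy in (-step, 0, step) for dx in (-step, 0, step)]
        def cost(v):
            y, x = by + v[0], bx + v[1]
            if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                return float("inf")      # candidate falls outside the frame
            return sad(ref[y:y + bs, x:x + bs], block)
        best = min(candidates, key=cost)
        step //= 2
    return best
```

Only 9 points are tested per step instead of the full search window, which is the computation saving that TS-ME's triplet/ring patterns refine further with adaptive early termination.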