Abstract: Motion capture devices have been used to produce content such as movies and video games. However, since motion capture devices are expensive and inconvenient to use, motions segmented from captured data are recycled and synthesized for reuse in other content, but these motions have generally been segmented manually by content producers. Automatic motion segmentation has therefore recently attracted considerable attention. Previous approaches are divided into on-line and off-line: on-line approaches segment motions based on similarities between neighboring frames, while off-line approaches segment motions by capturing global characteristics in a feature space. In this paper, we propose a graph-based high-level motion segmentation method. Since high-level motions consist of several frames repeated within a temporal distance, we consider the similarities among all frames within that distance. This is achieved by constructing a graph in which each vertex represents a frame and each edge between two frames is weighted by their similarity. The normalized cuts algorithm is then used to partition the constructed graph into several sub-graphs by globally finding minimum cuts. In our experiments, the proposed method outperformed a PCA-based on-line method and a GMM-based off-line method, as it segments motions globally from a graph built on the similarities between neighboring frames as well as among all frames within the temporal distance.
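The graph construction described above can be sketched as follows; the feature representation of each frame and the Gaussian similarity kernel (with width `sigma`) are assumptions, since the abstract does not specify them. The normalized cuts algorithm would then be applied to the resulting affinity matrix:

```python
import math

def build_affinity(frames, temporal_window, sigma=1.0):
    """Build a frame-similarity graph as an affinity matrix.

    Each frame is a feature vector (e.g., joint angles); edges connect
    frames within the temporal window and are weighted by a Gaussian
    similarity, following the graph construction in the abstract.
    """
    n = len(frames)
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j and abs(i - j) <= temporal_window:
                d2 = sum((a - b) ** 2 for a, b in zip(frames[i], frames[j]))
                w[i][j] = math.exp(-d2 / (2.0 * sigma ** 2))
    return w
```

Similar frames receive edge weights near 1, dissimilar frames near 0, so a minimum normalized cut naturally falls at motion boundaries.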
Abstract: Wireless Sensor Networks consist of inexpensive, low power sensor nodes deployed to monitor the environment and collect
data. Gathering information in an energy-efficient manner is critical to prolonging the network lifetime. Clustering algorithms have the advantage of enhancing the network lifetime. Current clustering algorithms usually treat global re-clustering and local re-clustering separately. This paper proposes a combination of these two re-clustering methods to reduce the energy consumption of the network. Furthermore, the proposed algorithm can be applied to homogeneous as well as heterogeneous wireless sensor networks. In addition, cluster head rotation occurs only when the cluster head's energy drops below a dynamic threshold value computed by the algorithm. Simulation results show that the proposed algorithm prolongs the network lifetime compared to existing algorithms.
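The dynamic-threshold rotation rule can be sketched as follows; the abstract does not give the threshold formula, so the fraction-of-average-residual-energy criterion and the `alpha` parameter are illustrative assumptions:

```python
def should_rotate(head_energy, member_energies, alpha=0.5):
    """Decide whether a cluster head should hand off its role.

    Illustrative assumption: rotate when the head's residual energy
    falls below a fraction (alpha) of the cluster's average residual
    energy, so rotation happens only when actually needed.
    """
    avg = sum(member_energies) / len(member_energies)
    threshold = alpha * avg  # dynamic: tracks the cluster's energy state
    return head_energy < threshold
```

Because the threshold falls as the whole cluster drains, this avoids the fixed-round rotations that waste energy in purely periodic schemes.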
Abstract: We present a hardware-oriented method for real-time measurement of an object's position in video. The targeted application area is light spots used as references for robotic navigation. Different algorithms for dynamic thresholding are explored in combination with component labeling and Center Of Gravity (COG) computation for the highest possible precision versus Signal-to-Noise Ratio (SNR). The method was developed with low hardware cost in focus, requiring only one convolution operation for preprocessing of the data.
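The COG step can be sketched as follows; the fixed threshold here stands in for the dynamic-thresholding stage, whose exact form the abstract does not specify:

```python
def center_of_gravity(image, threshold):
    """Estimate a light spot's sub-pixel position as the
    intensity-weighted Center Of Gravity of pixels above a threshold.

    `image` is a 2-D list of intensities (row-major, y then x).
    """
    sx = sy = total = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v > threshold:
                sx += x * v
                sy += y * v
                total += v
    if total == 0.0:
        return None  # no pixel exceeded the threshold
    return (sx / total, sy / total)
```

The intensity weighting is what yields sub-pixel precision, which is why thresholding quality directly trades off against SNR.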
Abstract: A robust still image face localization algorithm
capable of operating in an unconstrained visual environment is
proposed. First, construction of a robust skin classifier within a
shifted HSV color space is described. Then various filtering
operations are performed to better isolate face candidates and
mitigate the effect of substantial non-skin regions. Finally, a novel
Bhattacharyya-based face detection algorithm is used to compare
candidate regions of interest with an illumination-dependent
approximation of the face model's probability distribution function.
Experimental results show a 90% face detection success rate despite
the demands of the visually noisy environment.
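The Bhattacharyya comparison at the core of the detection step can be sketched as follows; the face model's actual PDF and histogram binning are not given in the abstract, so normalized histograms are assumed:

```python
import math

def bhattacharyya_coefficient(p, q):
    """Similarity between two normalized histograms (candidate region
    vs. face model PDF approximation); 1.0 means identical
    distributions, values near 0 mean little overlap.
    """
    return sum(math.sqrt(a * b) for a, b in zip(p, q))
```

A candidate region would be accepted as a face when its coefficient against the model exceeds a chosen threshold.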
Abstract: Due to the constant increase in the volume of information available to applications in fields varying from medical diagnosis to web search engines, accurate support of similarity becomes an important task. This is also the case for spam filtering techniques, where the similarities between known and incoming messages are the basis of the spam/not-spam decision. We present a novel approach to filtering based solely on layout, whose goal is not only to correctly identify spam but also to warn about major emerging threats. We propose a mathematical formulation of email message layout and, based on it, elaborate an algorithm to separate different types of emails and find new, numerically relevant spam types.
Abstract: In the modern day, disaster recovery missions have become one of the top priorities in any natural disaster management regime. Smart autonomous robots may play a significant role in such missions, including searching for life under earthquake rubble and on tsunami-hit islands, de-mining in war-affected areas, and many other situations. In this paper, the current state of many walking robots is compared and the advantages of hexapod systems over wheeled robots are described. In our research we have selected a hexapod spider robot, which we are developing with a focus on efficient navigation across different terrain using an apposite gait of locomotion, making it faster and at the same time more energy-efficient at navigating and negotiating difficult terrain. This paper describes the method of terrain negotiation and navigation in a hazardous field.
Abstract: The objective of this study is to present the test results of a variable air volume (VAV) air conditioning system optimized by a two-objective genetic algorithm (GA). The objective functions are energy savings and thermal comfort. The optimal set points for the fuzzy logic controller (FLC) are the supply air temperature (Ts), the supply duct static pressure (Ps), the chilled water temperature (Tw), and the zone temperature (Tz), which are taken as the problem variables. Supply airflow rate and chilled water flow rate are considered the constraints. The optimal set point values obtained from the GA process are assigned to the FLC in order to conserve energy and maintain thermal comfort in a real-time VAV air conditioning system. A VAV air conditioning system with FLC installed in a software laboratory was used for the energy analysis. The total energy saving obtained with the GA-optimized VAV system with FLC, compared with a constant air volume (CAV) system, is expected to reach 31.5%. The optimal duct static pressure obtained through the genetic-fuzzy methodology contributes to better air distribution by delivering the optimal quantity of supply air to the conditioned space. This combination enhances the advantages of uniform air distribution, thermal comfort and improved energy savings potential.
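A minimal GA sketch for tuning set points such as (Ts, Ps, Tw, Tz) follows; the paper uses a two-objective GA, so combining energy savings and thermal comfort into one scalar `fitness` to be minimized, as well as the selection, crossover and mutation choices, are illustrative assumptions:

```python
import random

def genetic_optimize(fitness, bounds, pop_size=30, generations=50, seed=0):
    """Minimal single-objective GA over real-valued set points.

    `bounds` is a list of (lo, hi) pairs, one per set point; lower
    fitness is better. Survivors are kept each generation (elitism).
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]   # crossover
            k = rng.randrange(len(bounds))                   # mutate one gene
            lo, hi = bounds[k]
            child[k] = min(hi, max(lo, child[k] + rng.gauss(0.0, 0.1 * (hi - lo))))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)
```

The returned set points would then be handed to the FLC, matching the GA-then-FLC pipeline the abstract describes.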
Abstract: In this study, we present a new and fast algorithm for lung segmentation from CTA images. This process is particularly important for lung vessel segmentation, detection of pulmonary emboli, finding nodules, and segmentation of airways. The applied method is carried out in four steps. In the first step, optimal thresholding is applied to the images. In the second, the small subsegmental vessels located in the lung region are removed. In the third, the lungs and airway edges are identified and segmented. Lastly, by discarding the airways, the lung segmentation is obtained.
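The optimal-thresholding step can be sketched as follows; iterative threshold selection (Ridler-Calvard style) is a common choice for this step in lung CT segmentation, though whether the paper uses exactly this variant is an assumption:

```python
def optimal_threshold(values, tol=0.5):
    """Iteratively find a threshold separating two intensity classes.

    Starting from the global mean, the threshold is repeatedly moved
    to the midpoint of the two class means until it stabilizes.
    """
    t = sum(values) / len(values)  # start at the global mean
    while True:
        low = [v for v in values if v <= t]
        high = [v for v in values if v > t]
        if not low or not high:
            return t  # one class is empty; nothing left to separate
        new_t = (sum(low) / len(low) + sum(high) / len(high)) / 2.0
        if abs(new_t - t) < tol:
            return new_t
        t = new_t
```

On CT intensities this separates the dark lung parenchyma from the surrounding brighter tissue, giving the initial lung mask for the later steps.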
Abstract: Nowadays, night vision devices are widely used in both military and civil applications. The variety of night vision applications requires a variety of night vision device designs. A web-based architecture for a software system for design assessment prior to the production of night vision devices is developed. The proposed architecture of the web-based system is based on the application of a mathematical model for designing night vision devices. An algorithm with two components, one for iterative design and one for intelligent design, is developed and integrated into the system architecture. The iterative component suggests compatible module combinations to choose from. The intelligent component provides compatible combinations of modules satisfying given user requirements on device parameters. The proposed web-based architecture for design assessment of night vision devices is tested via a prototype of the system. The testing showed the applicability of both the iterative and intelligent components of the algorithm.
Abstract: The most important property of the Gene Ontology is its terms. These controlled vocabularies are defined to provide consistent descriptions of gene products that are shareable and computationally accessible to humans, software agents, and other consumers of machine-readable metadata. Each term is associated with information such as its definition, synonyms, database references, amino acid sequences, and relationships to other terms. This information has made the Gene Ontology broadly applied in microarray and proteomic analysis. However, searching for terms is still carried out using a traditional approach based on keyword matching. The weaknesses of this approach are that it ignores semantic relationships between terms and depends heavily on a specialist to find similar terms. Therefore, this study combines a semantic similarity measure and a genetic algorithm to achieve better retrieval of semantically similar terms. The semantic similarity measure computes the strength of similarity between two terms. The genetic algorithm is then employed to perform batch retrievals and to handle the large search space of the Gene Ontology graph. Computational results are presented to show the effectiveness of the proposed algorithm.
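A semantic similarity measure over an ontology graph can be sketched as follows; the abstract does not specify which measure is used, so this shortest-path, edge-counting variant is an illustrative stand-in:

```python
from collections import deque

def path_similarity(graph, a, b):
    """Shortest-path semantic similarity between two ontology terms:
    sim = 1 / (1 + path_length), so directly related terms score
    higher than distantly related ones.

    `graph` maps each term to a list of related terms (undirected).
    """
    if a == b:
        return 1.0
    seen = {a}
    queue = deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt == b:
                return 1.0 / (2 + dist)  # path length is dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return 0.0  # no path: terms are unrelated
```

Unlike keyword matching, this scores terms by their position in the ontology, so synonym-free but related terms can still be retrieved.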
Abstract: Electronic banking must be secure and easy to use, and many banks heavily advertise an apparently 100% secure system, a claim that is contestable on many points. In this work, an alternative approach to the design of an e-banking system is introduced, through a new solution for user authentication and security with a digital certificate called LumaCert. The certificate applies a new algorithm for asymmetric encryption utilizing two mathematical operators called Pentors and UltraPentors. The public and private keys in this algorithm are represented by a quadruple of parameters which depend directly on the above-mentioned operators. The strength of the algorithm resides in the infeasibility of recovering the respective Pentor and UltraPentor operators from these parameters.
Abstract: The overall service performance of an I/O-intensive system depends mainly on the workload on its storage system. In a heterogeneous storage environment, where storage elements from different vendors with different capacities and performance are put together, the workload should be distributed according to storage capability. This paper addresses the data placement issue in a short-video sharing website. The workload contributed by a video is estimated from the number of views and the lifetime span of existing videos in the same category. An experiment was conducted on 42,000 video titles over six weeks. The results showed that the proposed algorithm distributed the workload and maintained balance better than round-robin and random algorithms.
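A capability-aware placement rule along these lines can be sketched as follows; the paper's actual workload estimator (views and lifetime span per category) is summarized here into a single `estimated_workload` value, which is an assumption:

```python
def place_video(nodes, estimated_workload):
    """Place a new video on the storage node whose relative load
    (load / capacity) would be lowest after placement, so faster or
    larger nodes absorb proportionally more work.

    Each node is a dict with "name", "load" and "capacity" keys.
    """
    def load_after(node):
        return (node["load"] + estimated_workload) / node["capacity"]
    target = min(nodes, key=load_after)
    target["load"] += estimated_workload
    return target["name"]
```

Round-robin and random placement ignore the capacity term in the denominator, which is why they balance heterogeneous nodes poorly.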
Abstract: An exchange algorithm with separate constraints on magnitude and phase error, applied in a new way, is presented in this paper. An important feature of the algorithms presented here is that they allow for design constraints which often arise in practical filter design problems. Meeting a required minimum stopband attenuation or a maximum deviation from the desired magnitude and phase responses in the passbands are common design constraints that can be handled by the methods proposed here. The new algorithm may have important advantages over existing techniques with respect to speed and stability of convergence, memory requirements and low ripple.
Abstract: Many problems in computer vision and image
processing present potential for parallel implementations through one
of the three major paradigms of geometric parallelism, algorithmic
parallelism and processor farming. Static process scheduling
techniques are used successfully to exploit geometric and algorithmic
parallelism, while dynamic process scheduling is better suited to
dealing with the independent processes inherent in the process
farming paradigm. This paper considers the application of parallel
multi-computers to a class of problems exhibiting the spatial data
characteristic of the geometric paradigm. However, by using the
processor farming paradigm, a dynamic scheduling technique is
developed to suit the MIMD structure of the multi-computers. A
hybrid scheme of scheduling is also developed and compared with
the other schemes. The specific problem chosen for the investigation
is the Hough transform for line detection.
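The Hough transform chosen as the test problem accumulates votes in (theta, rho) parameter space; a minimal sketch follows (the paper's discretization and parallel partitioning of this loop are not given in the abstract, so the bin sizes here are assumptions):

```python
import math

def hough_lines(points, n_theta=180):
    """Hough line transform: each edge point (x, y) votes for every
    discretized line rho = x*cos(theta) + y*sin(theta) that could
    pass through it; peaks in the accumulator are detected lines.
    """
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = int(round(x * math.cos(theta) + y * math.sin(theta)))
            key = (t, rho)
            acc[key] = acc.get(key, 0) + 1
    best = max(acc, key=acc.get)  # strongest line in the accumulator
    return best, acc
```

The per-point vote loops are independent, which is exactly what makes the transform a natural fit for farming the work across MIMD processors.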
Abstract: In data mining, association rules are used to discover relations among the items in a transaction database. Once the data are collected and stored, valuable rules can be found through association rule mining, assisting managers in devising marketing strategies and planning market frameworks. In this paper, we apply fuzzy partition methods and determine membership functions for the quantitative values of each transaction item. In addition, managers can express the importance of items as linguistic terms, which are transformed into fuzzy sets of weights. Fuzzy weighted frequent pattern growth (FWFP-Growth) is then used to complete the data mining process. This method is expected to improve on the Apriori algorithm in the efficiency of mining the whole set of association rules. An example is given to clearly illustrate the proposed approach.
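The fuzzy partition step can be sketched with a membership function for quantitative item values; triangular partitions are a common choice, though the paper's actual partition shapes are not given in the abstract:

```python
def triangular_membership(x, low, mid, high):
    """Degree to which a quantitative value x belongs to a fuzzy set
    (e.g., a "Medium" purchase quantity) defined by a triangle with
    feet at `low` and `high` and peak at `mid`.
    """
    if x <= low or x >= high:
        return 0.0
    if x <= mid:
        return (x - low) / (mid - low)   # rising edge
    return (high - x) / (high - mid)     # falling edge
```

Each transaction item's quantity is mapped through such functions into membership degrees, which FWFP-Growth then combines with the manager-supplied linguistic weights.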
Abstract: The direct synthesis of dimethyl ether (DME) from syngas in slurry reactors is considered promising because of its advantages in heat transfer. In this paper, the influences of operating conditions (temperature, pressure and weight hourly space velocity) on CO conversion and the selectivities of DME and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry catalyst, which is far more suitable for the liquid-phase dimethyl ether synthesis process than commercial bifunctional catalysts. A Langmuir-Hinshelwood-type global kinetics model for liquid-phase direct DME synthesis, based on methanol synthesis models and a methanol dehydration model, was developed by fitting our experimental data. The model parameters were estimated with a MATLAB program based on general genetic algorithms and the Levenberg-Marquardt method; the model fits the experimental data well and its reliability was verified by statistical tests and residual error analysis.
Abstract: The midpoint filter is quite effective at recovering images corrupted by short-tailed (uniform) noise. It performs poorly, however, in the presence of additive long-tailed (impulse) noise, and it does not preserve the edge structures of image signals. The median smoother discards outliers (impulses) effectively, but fails to provide adequate smoothing for images corrupted with non-impulse noise. In this paper, two nonlinear image filtering techniques, New Filter I and New Filter II, are proposed based on a nonlinear high-pass filter algorithm. New Filter I is constructed from a midpoint filter, a high-pass filter and a combiner, and suppresses uniform noise quite well. New Filter II is configured from an alpha-trimmed midpoint filter, a median smoother with a 3x3 window, the high-pass filter and the combiner; it is robust against impulse noise and attenuates uniform noise satisfactorily. Both filters are shown to exhibit good response at image boundaries (edges). The proposed filters are evaluated on a test image and the results obtained are included.
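The midpoint filter that New Filter I builds on can be sketched as follows; the clipped window at image borders is an implementation assumption, since the abstract does not specify border handling:

```python
def midpoint_filter(image, radius=1):
    """Midpoint filter: replace each pixel with (min + max) / 2 over
    its neighborhood. Effective against uniform noise, but a single
    impulse drags the max (or min), which is why it fails on
    long-tailed noise.
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [image[j][i]
                      for j in range(max(0, y - radius), min(h, y + radius + 1))
                      for i in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = (min(window) + max(window)) / 2.0
    return out
```

Alpha-trimming (discarding the most extreme window values before taking min and max), as in New Filter II, is what restores robustness to impulses.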
Abstract: Recently, Spherical Motion Models (SMMs) have been introduced [1]. These new models were developed for 3D local landmark-based Autonomous Navigation (AN). This paper presents new arguments and experimental results supporting the characteristics of the SMMs. The accuracy and robustness in performing a specific task are the main concerns of the new investigations. To analyze the performance of the SMMs, the most powerful tools of estimation theory, the extended Kalman filter (EKF) and the unscented Kalman filter (UKF), which give the best estimates in noisy environments, have been employed. Monte Carlo validation implementations were employed as well to test the stability and robustness of the models.
Abstract: Computational techniques derived from digital image processing play a significant role in the security and digital copyright protection of multimedia and visual arts, and this technology has become influential within the computing domain. This research presents a watermarking algorithm based on the discrete M-band wavelet transform (MWT) and the discrete cosine transform (DCT), incorporating principal component analysis (PCA). The proposed algorithm is expected to achieve higher perceptual transparency. Specifically, the developed watermarking scheme can successfully resist common signal processing attacks as well as geometric distortions and Gaussian noise. In addition, the proposed algorithm can be parameterized, resulting in more security. To meet these requirements, the image is transformed by a combination of MWT and DCT. To improve the security further, we randomize the watermark image to create three code books. During watermark embedding, PCA is applied to the coefficients in the approximation sub-band. Finally, the first few component bands represent an excellent domain for inserting the watermark.
Abstract: Progress in wireless and mobile communications provides opportunities for introducing new standards and improving existing services. Supporting multimedia traffic over wireless networks requires quality of service (QoS) provisioning. In this paper, a grey-fuzzy controller for radio resource management (GF-RRM) is presented to maximize the number of served calls and QoS provision in wireless networks. In a wireless network, the call arrival rate, the call duration and the communication overhead between the base stations and the control center are vague and uncertain. We develop a method to predict the cell load and to solve the RRM problem based on the GF-RRM; the supporting facility is built at the application level of the wireless networks. The GF-RRM exhibits better adaptability, fault-tolerance and performance than other algorithms. Through simulations, we evaluate the blocking rate, update overhead, and channel acquisition delay time of the proposed method. The results demonstrate that our algorithm has a lower blocking rate, less update overhead, and a shorter channel acquisition delay.