Abstract: Next-generation wireless/mobile networks will be IP-based cellular networks integrating the internet with cellular networks. In this paper, we propose a new architecture for a high-speed transport system and a mobility management protocol for mobile internet users in such a system. Existing mobility management protocols (MIPv6, HMIPv6) do not consider real-world fast-moving wireless hosts (e.g., passengers on a train). For this reason, we define a virtual organization (VO) and propose a VO architecture for the transport system. We also classify mobility as VO mobility (intra-VO) and macro mobility (inter-VO). Handoffs within a VO are locally managed and transparent to the CH, while macro mobility is managed with Mobile IPv6. From the features of the transport system, such as a fixed route and steady speed, we deduce the movement route and the handoff disruption time of each handoff. To reduce packet loss during the handoff disruption time, we propose a pre-registration scheme. Moreover, the proposed protocol can eliminate unnecessary binding updates resulting from sequential movement at high speed. Performance evaluations demonstrate that the proposed protocol performs well in the transport system environment and can be applied to wireless internet use on trains, subways, and high-speed trains.
Abstract: Cutting tools are widely used in manufacturing, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can damage the expensive workpiece being drilled and, at the same time, has a major impact on productivity. Predicting drill-bit breakage is therefore important for reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz. thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage from the degradation signals.
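As a concrete illustration of the three-way model comparison described above, the sketch below fits the same classifier families on synthetic stand-in data. The paper's actual thrust-force and torque features are not reproduced here; the data, labels, and hyperparameters are illustrative assumptions, and scikit-learn is assumed as the modeling library.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in: 20 features per drilling cycle (the study extracts
# these from thrust-force and torque signals); label 1 = bit about to break.
n = 400
X = rng.normal(size=(n, 20))
y = (X[:, 0] + 0.8 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "multinomial logistic": LogisticRegression(max_iter=1000),
}
# Fit each model and score held-out accuracy, mirroring the comparison setup.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```

In practice the comparison would also report per-class precision/recall, since missed breakages are costlier than false alarms.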
Abstract: The choice of data modeling technique for an information system is determined by the objective of the resultant data model. Dimensional modeling is the preferred technique for data destined for data warehouses and data mining; in contrast to entity-relationship modeling, it produces data models that ease analysis and querying. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently led to the development of dimensional modeling. This development has been significantly greater, and more widely reported, for commercial database management systems than for open source ones, making dimensional modeling less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open source modeling tools. It aims to take advantage of the fact that the regions most affected by HIV (sub-Saharan Africa) are also heavily resource-constrained, while holding large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open source dimensional modeling tools. Use of open source tools would reduce the software costs of dimensional modeling and in turn make data warehousing and data mining more feasible even for those in resource-constrained settings who have data available.
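A dimensional model of the kind described above typically takes the form of a star schema: one fact table surrounded by dimension tables. The sketch below builds a minimal, hypothetical star schema for HIV patient visits using only Python's built-in sqlite3 (itself open source); all table and column names are illustrative assumptions, not the schemas derived in the paper.

```python
import sqlite3

# Hypothetical star schema: a visit fact table keyed to patient, date,
# and facility dimensions (names and columns are illustrative only).
ddl = """
CREATE TABLE dim_patient  (patient_key INTEGER PRIMARY KEY, sex TEXT, birth_year INTEGER);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_facility (facility_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE fact_visit (
    patient_key  INTEGER REFERENCES dim_patient(patient_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    facility_key INTEGER REFERENCES dim_facility(facility_key),
    cd4_count    INTEGER,
    on_art       INTEGER
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.execute("INSERT INTO dim_patient VALUES (1, 'F', 1985)")
conn.execute("INSERT INTO dim_date VALUES (20240101, 2024, 1)")
conn.execute("INSERT INTO dim_facility VALUES (1, 'Clinic A', 'Western')")
conn.execute("INSERT INTO fact_visit VALUES (1, 20240101, 1, 350, 1)")

# A typical analytical query: average CD4 count by year, joining fact to dimension.
row = conn.execute(
    "SELECT d.year, AVG(f.cd4_count) FROM fact_visit f "
    "JOIN dim_date d ON f.date_key = d.date_key GROUP BY d.year"
).fetchone()
```

The point of the star layout is exactly what the abstract claims for dimensional modeling: analytical queries become simple joins from one fact table out to its dimensions.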
Abstract: The bond graph, as a unified multidisciplinary tool, is widely used not only for dynamic modelling but also for fault detection and isolation, because of its structural and causal properties. A binary fault signature matrix can be generated systematically, but making the final binary decision is not always feasible because of the problems revealed by such a method. The purpose of this paper is to introduce a methodology for improving the classical binary decision-making method, so that unknown and identical failure signatures can be treated and robustness improved. The approach consists of associating the evaluated residuals with component reliability data to build a hybrid Bayesian network. This network is used in two distinct inference procedures: one for the continuous part and the other for the discrete part. The continuous nodes of the network give the prior probabilities of component failures, which are used by the inference procedure on the discrete part to compute the posterior probabilities of the failures. The developed methodology is applied to a real steam generator pilot process.
Abstract: This paper investigates the feasibility of constructing a software multi-agent based monitoring and classification system and utilizing it to provide automated and accurate classification of end users developing applications in the spreadsheet domain. The agents function autonomously to provide continuous and periodic monitoring of Excel spreadsheet workbooks. This resulted in the development of the Multi-Agent Classification System (MACS), which complies with the specifications of the Foundation for Intelligent Physical Agents (FIPA). Different technologies were brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft .NET Windows-service-based agents were used to develop the monitoring agents of MACS; the .NET WCF services, together with the SOA approach, allowed distribution and communication between agents over the Web in order to support monitoring and classification across multiple developers. ODM was used to automate the classification phase of MACS.
Abstract: In this paper, we propose a hybrid machine learning
system based on Genetic Algorithm (GA) and Support Vector
Machines (SVM) for stock market prediction. A variety of indicators
from the technical analysis field of study are used as input features.
We also make use of the correlation between stock prices of different
companies to forecast the price of a stock, making use of technical
indicators of highly correlated stocks, not only the stock to be
predicted. The genetic algorithm is used to select the set of most
informative input features from among all the technical indicators.
The results show that the hybrid GA-SVM system outperforms the stand-alone SVM system.
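The wrapper-style GA feature selection described above can be sketched as follows, assuming scikit-learn's SVC as the base classifier. The chromosome is a binary mask over candidate features (technical indicators), and fitness is cross-validated accuracy; the population size, selection, crossover, and mutation settings here are illustrative defaults, not the paper's.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    # Fitness of a feature subset = cross-validated SVM accuracy.
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=3).mean()

def ga_select(X, y, pop_size=12, gens=10, p_mut=0.1):
    # Each individual is a 0/1 mask over the feature columns of X.
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))
    for _ in range(gens):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]            # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n) < p_mut                 # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[np.argmax(scores)].astype(bool)           # best feature mask
```

The final SVM would then be trained only on the columns selected by the returned mask.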
Abstract: This paper presents an effective method for detecting traffic lights at night. First, candidate blobs of traffic lights are extracted from the RGB color image. The input image is represented in the dominant color domain using the color transform proposed by Ruta, and red- and green-dominant regions are selected as candidates. After candidate blob selection, we apply a shape filter for noise reduction using blob properties such as length, area, and bounding-box area. A multi-class classifier based on the SVM (Support Vector Machine) is applied to the candidates. Three kinds of features are used. We use basic features such as blob width, height, center coordinates, and area. Brightness-based statistical features are also used. In particular, geometric moment values between the candidate region and the adjacent region are proposed and used to improve detection performance. The proposed system is implemented on an Intel Core CPU with 2.80 GHz and 4 GB RAM and tested on urban and rural road videos. Through these tests, we show that the proposed method using PF, BMF, and GMF reaches a detection rate of up to 93% with an average computation time of 15 ms/frame.
Abstract: Fuzzy random variables (FRVs) have been introduced as an imprecise concept of numeric values for characterizing imprecise knowledge. Descriptive parameters can be used to describe the primary features of a set of fuzzy random observations. In fuzzy environments, expected values are usually represented as fuzzy-valued, interval-valued, or numeric-valued descriptive parameters using various metrics. Instead of the area metric usually adopted in the relevant studies, in this study a numeric expected value is proposed based on a distance metric, using the two characteristics (fuzziness and randomness) of FRVs. Compared with existing measures, the results show that the proposed numeric expected value is the same as those obtained with the area metric when only triangular membership functions are used; however, the proposed approach has the advantages of intuitiveness and computational efficiency when the membership functions are not triangular. An example with three datasets is provided to verify the proposed approach.
Abstract: Among the most fundamental prerequisites for the successful development of electronic government services (e-Government) is citizen acceptance. Based on the UTAUT model, the paper describes a hypothetical framework that integrates the unique features of e-Government to improve our understanding of the acceptance and usage of e-Government in Saudi Arabia. The proposed model, based on UTAUT, includes the characteristics of e-Government and the consideration and inclusion of trust, privacy, and Saudi culture and context.
Abstract: The purpose of this work was to study the effect of irrigation using wastewater with various electrical conductivities (T (0.92 dS/m), EC3 (3 dS/m), and EC6 (6 dS/m)) on three varieties of quinoa cultivated in a field in southern Morocco. Following the evolution of the chemical and agronomic parameters throughout the culture made it possible to determine the responses to saline stress in arid conditions. Results showed that salinity reduced plant height and the fresh and dry weight in the different parts of the three varieties. The increase in irrigation water EC did not affect the yield of the varieties. Thus, quinoa resisted salinity and behaved as a facultative halophyte crop. In fact, cultivating this crop with treated wastewater is feasible, especially in arid areas, for a sustainable use of water resources.
Abstract: This paper starts with a critical view of the beautiful female images in the mass media, which are frequently generated by a stereotypical Korean concept of beauty. Several female beauty myths have evolved in Korea during the present decade. Nearly all of them have formed due to a deeply ingrained androcentric ideology which objectifies women. The mass media cause the public to hold a distorted concept of female beauty, and there is a huge gap between women in reality and the representative women of the mass media. It is essential to have an unbiased perception of the female images presented in the mass media. Because cosmetic advertisements project contemporary images of female beauty to promote products, cosmetics images are examined with regard to the female beauty myths portrayed by the mass media. This paper analyzes the features of female beauty myths in Korea and their intrinsic characteristics.
Abstract: The protein residue contact map is a compact representation of the secondary structure of a protein. The information held in the contact map has drawn the attention of researchers in related fields, and plenty of work has been done over the past decade. Artificial intelligence approaches have been widely adopted in related work, including neural networks, genetic programming, and Hidden Markov models, as well as support vector machines. However, prediction performance has not generalized well, probably because it depends on the data used to train and generate the prediction model. This shows the importance of the features, or information, used in determining prediction performance. In this research, a support vector machine was used to predict the protein residue contact map with different combinations of features, in order to show and analyze the effectiveness of the features.
Abstract: The advancement of wireless technology and the wide use of mobile devices have drawn the attention of the research and technological communities towards wireless environments, such as Wireless Local Area Networks (WLANs), Wireless Wide Area Networks (WWANs), mobile systems, and ad-hoc networks. Unfortunately, wired and wireless networks differ markedly in terms of link reliability, bandwidth, and propagation delay; by adapting new solutions for these enhanced telecommunications, superior quality, efficiency, and opportunities can be provided where wireless communications were otherwise unfeasible. Some researchers define 4G as a significant improvement of 3G, in which current cellular networks' issues will be solved and data transfer will play a more significant role. For others, 4G unifies cellular and wireless local area networks, and introduces new routing techniques, efficient solutions for sharing dedicated frequency bands, and increased mobility and bandwidth capacity. This paper discusses the possible solutions and enhancements proposed to improve the performance of the Transmission Control Protocol (TCP) over different wireless networks, and investigates each approach in terms of its advantages and disadvantages.
Abstract: Burying solid waste underground is one waste disposal method, and dumping is the usual final method in fast-growing cities like Rasht in Iran. Some municipalities select solid waste landfill sites without feasibility studies, programming, design, or management plans, and these sites therefore create several social and environmental impacts. In this study, the suitability of the solid waste landfill in Rasht, capital of Gilan Province, is reviewed using the Regional Screening Method (RSM), a Geographic Information System (GIS), and the Analytical Hierarchy Process (AHP). According to the resulting suitability maps, the value of the study site is mid-suitable to suitable based on RSM and mid-suitable based on AHP.
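The AHP step in such a suitability analysis rests on deriving criterion weights from a pairwise comparison matrix via its principal eigenvector, together with a consistency check. A minimal sketch follows; the comparison matrix and random-index table are illustrative, not the study's actual landfill criteria.

```python
import numpy as np

def ahp_weights(A, iters=100):
    # AHP priority weights = principal eigenvector of the pairwise
    # comparison matrix, computed here by power iteration.
    n = A.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    lam = (A @ w / w).mean()            # principal eigenvalue estimate
    ci = (lam - n) / (n - 1)            # consistency index
    # Saaty's random index values (illustrative subset).
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)
    cr = ci / ri if ri else 0.0         # consistency ratio; < 0.1 is acceptable
    return w, cr

# Example: a perfectly consistent 3-criterion matrix (values are made up).
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])
w, cr = ahp_weights(A)
```

The resulting weights would then score each candidate site across the GIS criterion layers.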
Abstract: This paper features the proposed modeling and design
of a Robust Decentralized Periodic Output Feedback (RDPOF)
control technique for the active vibration control of smart flexible
multimodel Euler-Bernoulli cantilever beams for a multivariable
(MIMO) case by retaining the first 6 vibratory modes. The beam
structure is modeled in state space form using the concept of
piezoelectric theory, the Euler-Bernoulli beam theory and the Finite
Element Method (FEM) technique by dividing the beam into 4 finite
elements and placing the piezoelectric sensor / actuator at two finite
element locations (positions 2 and 4) as collocated pairs, i.e., as
surface mounted sensor / actuator, thus giving rise to a multivariable
model of the smart structure plant with two inputs and two outputs.
Five such multivariable models are obtained by varying the
dimensions (aspect ratios) of the aluminum beam, thus giving rise to
a multimodel of the smart structure system. Using a model order reduction technique, the reduced-order model of the higher-order system is obtained based on dominant eigenvalue retention and the method of Davison. RDPOF controllers are designed for the above five multivariable plant models. The closed loop responses with the
RDPOF feedback gain and the magnitudes of the control input are
observed and the performance of the proposed multimodel smart
structure system with the controller is evaluated for vibration control.
Abstract: Heart sound is an acoustic signal, and many techniques used nowadays for human recognition tasks borrow from speech recognition. One popular choice for feature extraction from acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC), which map the signal onto a non-linear mel scale that mimics human hearing. However, the mel scale is almost linear in the frequency region of heart sounds and thus should produce results similar to standard cepstral coefficients (CC). In this paper, MFCC is investigated to see if it produces superior results for a PCG-based human identification system compared to CC. Results show that the MFCC system is still superior to CC despite the near-linear filter-banks in the lower frequency range, giving up to a 95% correct recognition rate for MFCC versus 90% for CC. Further experiments show that the high recognition rate is due to the implementation of filter-banks and not to mel scaling.
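The MFCC pipeline compared above (framing, windowing, power spectrum, mel filterbank, log, DCT) can be sketched end-to-end with NumPy/SciPy. The frame length, hop size, and filter counts below are common illustrative defaults, not necessarily the paper's settings; swapping the mel filterbank for a linearly spaced one would give the CC baseline.

```python
import numpy as np
from scipy.fft import dct

def mel_filterbank(n_filters, n_fft, sr, fmin=0.0, fmax=None):
    # Triangular filters spaced evenly on the mel scale.
    fmax = fmax or sr / 2
    mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    imel = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    hz_pts = imel(np.linspace(mel(fmin), mel(fmax), n_filters + 2))
    bins = np.floor((n_fft + 1) * hz_pts / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):
            fb[i - 1, k] = (k - l) / max(c - l, 1)   # rising slope
        for k in range(c, r):
            fb[i - 1, k] = (r - k) / max(r - c, 1)   # falling slope
    return fb

def mfcc(signal, sr, n_fft=512, hop=256, n_filters=26, n_ceps=13):
    # Frame and window the signal, then take the power spectrum per frame.
    frames = [signal[s:s + n_fft] * np.hamming(n_fft)
              for s in range(0, len(signal) - n_fft + 1, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames), n_fft)) ** 2
    # Mel filterbank energies -> log -> DCT, keeping the first n_ceps coefficients.
    energies = spec @ mel_filterbank(n_filters, n_fft, sr).T
    return dct(np.log(energies + 1e-10), type=2, norm="ortho", axis=1)[:, :n_ceps]
```

For PCG, the sampling rate would be low (heart sounds live mostly below a few hundred Hz), which is precisely the region where the mel scale is nearly linear.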
Abstract: In this paper three different approaches for person
verification and identification, i.e. by means of fingerprints, face and
voice recognition, are studied. Face recognition uses parts-based
representation methods and a manifold learning approach. The
assessment criterion is recognition accuracy. The techniques under
investigation are: a) Local Non-negative Matrix Factorization
(LNMF); b) Independent Components Analysis (ICA); c) NMF with
sparse constraints (NMFsc); d) Locality Preserving Projections
(Laplacianfaces). Fingerprint recognition was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. For voice/speaker recognition, mel-cepstral and delta-delta mel-cepstral analysis were used as the main methods to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g. "accept/reject" for a verification task) is taken by a majority voting technique applied to the three biometrics. Preliminary results, obtained for medium-sized databases of fingerprints, faces, and voice recordings, indicate the feasibility of our study, with an overall recognition precision of about 92%, permitting the use of our system in a future complex biometric card.
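The three-modality majority vote for verification described above reduces to a few lines; the decision labels and modality ordering here are illustrative.

```python
def fuse(decisions):
    # Majority vote over per-modality verification decisions
    # (e.g. fingerprint, face, voice): accept iff most modalities accept.
    accepts = sum(1 for d in decisions if d == "accept")
    return "accept" if accepts * 2 > len(decisions) else "reject"

# Example: two of the three modalities accept, so the fused decision is accept.
verdict = fuse(["accept", "reject", "accept"])
```

With an odd number of modalities there is never a tie, which is one practical reason to fuse exactly three biometrics.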
Abstract: Markov games are a generalization of Markov
decision process to a multi-agent setting. Two-player zero-sum
Markov game framework offers an effective platform for designing
robust controllers. This paper presents two novel controller design
algorithms that use ideas from game-theory literature to produce
reliable controllers that are able to maintain performance in the presence of noise and parameter variations. A more widely used approach for controller design is H∞ optimal control, which suffers from high computational demand and may at times be infeasible. Our approach
generates an optimal control policy for the agent (controller) via a simple linear program, enabling the controller to learn about the unknown environment. In our formulation, this unknown environment corresponds to the behavior of the noise, which is modeled as the opponent. Proposed
controller architectures attempt to improve controller reliability by a
gradual mixing of algorithmic approaches drawn from the game
theory literature and the Minimax-Q Markov game solution
approach, in a reinforcement-learning framework. We test the
proposed algorithms on a simulated Inverted Pendulum Swing-up
task and compare their performance against standard Q-learning.
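The linear-program step mentioned above (computing the agent's minimax-optimal mixed policy from a game matrix, as in Minimax-Q) can be sketched with SciPy. The payoff matrix below is an illustrative matching-pennies game, not the paper's pendulum task.

```python
import numpy as np
from scipy.optimize import linprog

def minimax_policy(Q):
    # Solve max_pi min_o sum_a pi[a] * Q[a, o] as a linear program.
    # Variables: x = [pi_0 ... pi_{n-1}, v]; maximize the game value v.
    n_a, n_o = Q.shape
    c = np.zeros(n_a + 1)
    c[-1] = -1.0                                   # minimize -v == maximize v
    # For every opponent action o: v - sum_a pi[a] * Q[a, o] <= 0.
    A_ub = np.hstack([-Q.T, np.ones((n_o, 1))])
    b_ub = np.zeros(n_o)
    A_eq = np.ones((1, n_a + 1))
    A_eq[0, -1] = 0.0                              # probabilities sum to 1
    b_eq = np.array([1.0])
    bounds = [(0.0, 1.0)] * n_a + [(None, None)]   # v is unbounded
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:n_a], res.x[-1]

# Matching pennies: the minimax policy mixes uniformly and the game value is 0.
Q = np.array([[1.0, -1.0], [-1.0, 1.0]])
pi, v = minimax_policy(Q)
```

In Minimax-Q, this LP is solved at each visited state with Q as the learned state-action-vs-opponent-action value matrix.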
Abstract: The study of the interaction between humans and computers has grown over the last few years. This interaction will be more powerful if computers are able to perceive and respond to human nonverbal communication such as emotions. In this study, we present an image-based approach to emotion classification through lower facial expressions. We employ a set of
feature points in the lower face image according to the particular face
model used and consider their motion across each emotive expression
of images. The vector of displacements of all feature points is input to an Adaptive Support Vector Machine (A-SVM) classifier, which classifies it into one of seven basic emotions, namely neutral, angry, disgust, fear, happy, sad, and surprise. The system was tested on the
Japanese Female Facial Expression (JAFFE) dataset of frontal view
facial expressions [7]. Our experiments on emotion classification
through lower facial expressions demonstrate the robustness of
Adaptive SVM classifier and verify the high efficiency of our
approach.
Abstract: Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance based have emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance based methods have already been presented in the literature, often with inconclusive and sometimes contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance based methods: principal component analysis, linear discriminant analysis, and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.