Abstract: In this paper we discuss the effect of an unbounded particle interaction operator on particle growth, and we study how this informs the choice of appropriate time steps for the numerical simulation. We also provide rigorous mathematical proofs showing that large particles become dominant with increasing time while small particles contribute negligibly. Finally, we assess the efficiency of the algorithm by performing numerical simulation tests and by comparing the simulated solutions with some known analytic solutions to the Smoluchowski equation.
Abstract: The emergence of blended learning has been
influenced by the rapid changes in Higher Education within the last
few years. However, there is a lack of studies that look into the future
of blended learning in the Saudi context. The most likely explanation
is that blended learning is relatively new and, with respect to learning
in general, under-researched. This study addresses this gap and
explores the views of lecturers and students towards the future of
blended learning in Saudi Arabia. This study was informed by the
interpretive paradigm, which appears to be the most appropriate for
understanding and interpreting the perceptions of students and
instructors towards a new learning environment. While globally there
has been considerable research on perceptions of e-learning and
blended learning in its different models, there is considerable scope
for further research in the Arab region, and specifically in Saudi
Arabia, where blended learning is now being introduced.
Abstract: When cars leave the factory, strut noises are very small and therefore difficult to perceive. As usage time and travel distance increase, however, strut noises grow large enough to cause users considerable discomfort. The noise generated in the field includes engine noise and flow noise, making it difficult to clearly discern the noise generated by the struts. This study developed a test method that can reproduce field strut noises in the lab. Using the newly developed noise evaluation test, this study analyzed the effects that insulator performance degradation and failure can have on car noise. The study also confirmed that an insulator durability test using simple back-and-forth motion cannot fully reproduce the part failures observed in the field, and that field noises can be reproduced through a durability test that accounts for heat aging.
Abstract: As the Internet continues to grow at a rapid pace as
the primary medium for communications and commerce and as
telecommunication networks and systems continue to expand their
global reach, digital information has become the most popular and
important information resource and our dependence upon the
underlying cyber infrastructure has been increasing significantly.
Unfortunately, as our dependency has grown, so has the threat to the
cyber infrastructure from spammers, attackers and criminal
enterprises. In this paper, we propose a new machine learning based
network intrusion detection framework for cyber security. The
detection process of the framework consists of two stages: model
construction and intrusion detection. In the model construction stage,
a semi-supervised machine learning algorithm is applied to a
collected set of network audit data to generate a profile of normal
network behavior and in the intrusion detection stage, input network
events are analyzed and compared with the patterns gathered in the
profile; events sufficiently far from the expected normal behavior
are flagged as anomalies. The
proposed framework is particularly applicable to the situations where
there is only a small amount of labeled network training data
available, which is very typical in real world network environments.
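The profile-then-compare idea described above can be sketched in a few lines. The two traffic features and the z-score distance rule below are illustrative assumptions, not the paper's actual semi-supervised algorithm.

```python
import random
import statistics

# Stage 1: model construction. Build a profile of "normal" behaviour from
# collected audit data. The two features (bytes/s, connections/s) and the
# synthetic data are invented for illustration.
random.seed(0)
normal = [(random.gauss(500, 50), random.gauss(20, 5)) for _ in range(1000)]
profile = [(statistics.mean(col), statistics.stdev(col)) for col in zip(*normal)]

# Stage 2: intrusion detection. Compare each input event with the stored
# profile and flag it if any feature lies far from normal behaviour.
def is_anomaly(event, k=4.0):
    """Flag the event if any feature is more than k standard deviations
    from the normal-behaviour profile (a stand-in distance rule)."""
    return any(abs(x - mu) / sigma > k for x, (mu, sigma) in zip(event, profile))

print(is_anomaly((510.0, 22.0)))    # ordinary traffic
print(is_anomaly((5000.0, 90.0)))   # flood-like traffic
```

In practice the profile would be built from real audit data with a richer model than per-feature statistics, but the two stages (model construction, then detection against the stored profile) are the same.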
Abstract: Quality of Service (QoS) routing aims to find a path between source and destination that satisfies the QoS requirements while efficiently using the network resources and the underlying routing algorithm, and to find low-cost paths that satisfy given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies distance-vector routing to construct a shortest path to each destination with reference to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding the feasible paths. Nodes running OMCR need not maintain a global view of network state such as topology and resource information, and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing method depends on complete network state for constrained path computation, which incurs excessive communication overhead.
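As an illustration of constrained path computation, the sketch below finds a least-cost path subject to an end-to-end delay bound. It uses a centralized label-setting search rather than OMCR's distributed distance-vector exchange, so it only demonstrates the feasibility-plus-optimization goal; the graph, metrics, and bound are invented.

```python
import heapq

def constrained_path(graph, src, dst, max_delay):
    """Least-cost path from src to dst whose total delay stays within
    max_delay. graph maps node -> [(neighbour, cost, delay), ...].
    Returns (cost, path) or None if no feasible path exists."""
    heap = [(0, 0, src, [src])]          # (cost, delay, node, path)
    labels = {}                          # node -> non-dominated (cost, delay) pairs
    while heap:
        cost, delay, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path            # popped in cost order: first hit is optimal
        seen = labels.setdefault(node, [])
        if any(c <= cost and d <= delay for c, d in seen):
            continue                     # dominated by an already-expanded label
        seen.append((cost, delay))
        for nxt, c, d in graph.get(node, []):
            if delay + d <= max_delay and nxt not in path:   # loop-free, feasible
                heapq.heappush(heap, (cost + c, delay + d, nxt, path + [nxt]))

graph = {"A": [("B", 1, 5), ("C", 4, 1)],
         "B": [("D", 1, 5)],
         "C": [("D", 1, 1)]}
print(constrained_path(graph, "A", "D", 4))
```

With a delay bound of 4, the cheaper path A-B-D (cost 2, delay 10) is infeasible, so the search returns A-C-D (cost 5, delay 2), which is the feasibility-versus-cost trade-off QoS routing must resolve.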
Abstract: As the enormous amount of on-line text grows on the
World-Wide Web, the development of methods for automatically
summarizing this text becomes more important. The primary goal of
this research is to create an efficient tool that is able to summarize
large documents automatically. We propose an Evolving
Connectionist System, an adaptive, incremental learning and
knowledge-representation system that evolves its structure and
functionality. In this paper, we propose a novel approach to Part of
Speech disambiguation using a recurrent neural network, a paradigm
capable of dealing with sequential data. We observed that the
connectionist approach to text summarization offers a natural way of
learning grammatical structures through experience. Experimental
results show that our approach achieves acceptable performance.
Abstract: Ontology is widely used as a tool for organizing
information, creating relations between the subjects within a
defined knowledge domain. Various fields such as Civil Engineering,
Biology, and Management have successfully integrated ontology into
decision support systems to manage domain knowledge and to
assist their decision makers. Gross pollutant traps (GPT) are devices
used to trap large items or hazardous particles and prevent them
from polluting and entering our waterways. However, choosing and
determining a GPT is a challenge in Malaysia, as few GPT data
repositories are captured and shared. Hence, an ontology is
needed to capture, organize and represent this knowledge as
meaningful information, which can contribute to the efficiency of
GPT selection in Malaysian urbanization. A GPT Ontology framework
is therefore built as the first step to capture GPT knowledge which
will then be integrated into the decision support system. This paper
will provide several examples of the GPT ontology, and explain how
it is constructed by using the Protégé tool.
Abstract: Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computation load. One method of searching for essential features is via the decision tree, which acts as an intermediate feature-space inducer from which essential features are chosen. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but use a pruning condition that acts as a threshold mechanism to choose features. This paper proposes a threshold measure using the Manhattan Hierarchical Cluster distance, utilized in feature ranking to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a higher number of attributes.
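One hedged way to picture a distance-based threshold over ranked features is to cut the ranking at the largest L1 (Manhattan) gap between consecutive importance scores. This is only a guess at the flavor of the proposed measure, since the abstract describes the Manhattan Hierarchical Cluster distance threshold only at a high level; the feature names and scores below are invented.

```python
# Illustrative distance-based cut-off over decision-tree feature scores:
# rank the features, measure the L1 distance between neighbouring scores,
# and keep everything above the largest gap.

def select_by_gap(scores):
    """scores: {feature_name: importance}; keep features above the largest
    Manhattan gap between consecutive ranked scores."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    gaps = [ranked[i][1] - ranked[i + 1][1] for i in range(len(ranked) - 1)]
    cut = gaps.index(max(gaps))          # position of the widest gap
    return [name for name, _ in ranked[:cut + 1]]

print(select_by_gap({"f1": 0.40, "f2": 0.35, "f3": 0.10, "f4": 0.08}))
```

Here the widest gap (0.35 to 0.10) separates the relevant features from the rest, so only f1 and f2 survive the threshold.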
Abstract: The complexity of today's software systems makes
collaborative development necessary to accomplish tasks.
Frameworks are necessary to allow developers to perform their tasks
independently yet collaboratively. Similarity detection is one of the
major issues to consider when developing such frameworks. It allows
developers to mine existing repositories when developing their own
views of a software artifact, and it is necessary for identifying the
correspondences between the views to allow merging them and
checking their consistency. Due to the importance of the
requirements specification stage in software development, this paper
proposes a framework for collaborative development of Object-
Oriented formal specifications along with a similarity detection
approach to support the creation, merging and consistency checking
of specifications. The paper also explores the impact of using
additional concepts on improving the matching results. Finally, the
proposed approach is empirically evaluated.
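A minimal picture of similarity detection between two developer views is set overlap of their named elements, e.g. the Jaccard coefficient. The paper's approach is richer (it exploits additional concepts to improve matching); the view contents below are invented.

```python
# Compare two developer views of a specification by the overlap of the
# named elements (classes, attributes, operations) they contain.

def jaccard(a, b):
    """Jaccard similarity of two sets of element names (1.0 for two
    empty views, by convention)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

view1 = {"Account", "deposit", "withdraw", "balance"}
view2 = {"Account", "deposit", "transfer", "balance"}
print(round(jaccard(view1, view2), 2))  # 3 shared names out of 5 distinct
```

A high score suggests the two views describe the same artifact and are candidates for merging and consistency checking.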
Abstract: Personal computers draw non-sinusoidal current in
which odd harmonics are the most significant. The power quality of
distribution networks is severely affected by the flow of these
harmonics during the operation of electronic loads. In this paper,
mathematical modeling of the odd current harmonics (3rd, 5th, 7th
and 9th) that influence power quality is presented. Live signals
have been captured with a power quality analyzer for analysis.
Interestingly, it has been verified theoretically that the Total
Harmonic Distortion (THD) of the current decreases as the number
of nonlinear loads increases. The results obtained using the
mathematical expressions have been compared with the practical
results, and good agreement has been found.
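The THD figure discussed above follows directly from the harmonic magnitudes. A minimal sketch, with assumed (not measured) RMS values:

```python
import math

def thd(fundamental, harmonics):
    """Total harmonic distortion of a current waveform, as a fraction:
    sqrt(sum of squared harmonic magnitudes) / fundamental magnitude.

    fundamental: RMS magnitude of the fundamental (50/60 Hz) component.
    harmonics:   RMS magnitudes of the higher-order components
                 (here the odd harmonics: 3rd, 5th, 7th and 9th).
    """
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Illustrative magnitudes (assumptions, not the paper's measurements):
# one PC drawing 1 A at the fundamental with strong odd harmonics.
i1, odd = 1.0, [0.8, 0.6, 0.4, 0.2]
print(round(thd(i1, odd), 3))
```

When several such loads are aggregated, partial phase cancellation of the harmonics relative to the growing fundamental is what drives the THD downward, which is the trend the paper verifies.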
Abstract: Simultaneous transient conduction and radiation heat
transfer with heat generation is investigated. Analysis is carried out
for both steady and unsteady situations. A two-dimensional gray
cylindrical enclosure with an absorbing, emitting, and isotropically
scattering medium is considered. Enclosure boundaries are assumed
at specified temperatures. The heat generation rate is considered
uniform and constant throughout the medium. The lattice Boltzmann
method (LBM) was used to solve the energy equation of a transient
conduction-radiation heat transfer problem. The control volume finite
element method (CVFEM) was used to compute the radiative
information. To study the compatibility of the LBM for the energy
equation and the CVFEM for the radiative transfer equation, transient
conduction and radiation heat transfer problems in 2-D cylindrical
geometries were considered. In order to establish the suitability of the
LBM, the energy equation of the present problem was also solved
using the finite difference method (FDM) of computational
fluid dynamics. The CVFEM used in the radiative heat transfer was
employed to compute the radiative information required for the
solution of the energy equation using the LBM or the FDM (of the
CFD). To study the compatibility and suitability of the LBM for the
solution of energy equation and the CVFEM for the radiative
information, results were analyzed for the effects of various
parameters such as the boundary emissivity. The results of the
LBM-CVFEM combination were found to be in excellent agreement with
the FDM-CVFEM combination. The number of iterations and the
steady state temperature in both of the combinations were found
comparable. Results are found for situations with and without heat
generation. Heat generation is found to have significant bearing on
temperature distribution.
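To illustrate how the LBM treats the energy equation, here is a minimal D1Q2 lattice Boltzmann sketch of one-dimensional transient conduction between two fixed-temperature walls, with a hook for a uniform generation term. The paper's actual problem (a 2-D cylindrical enclosure with CVFEM-computed radiation) is far more involved; all parameter values below are assumptions.

```python
import numpy as np

# D1Q2 thermal lattice Boltzmann: two populations streaming left/right,
# BGK collision toward the equilibrium T/2, relaxation time set from the
# thermal diffusivity. Values are illustrative, not from the paper.
nx, dt = 51, 1.0e-4
dx = 1.0 / (nx - 1)
c = dx / dt                          # lattice speed
alpha = 1.0                          # thermal diffusivity (assumed units)
tau = 0.5 + alpha / (c * c * dt)     # BGK relaxation time from diffusivity
g = 0.0                              # volumetric generation (0 = pure conduction)
t_left, t_right = 1.0, 0.0           # fixed boundary temperatures

f1 = np.full(nx, 0.5 * t_right)      # right-moving population
f2 = np.full(nx, 0.5 * t_right)      # left-moving population

for _ in range(20000):
    T = f1 + f2                      # macroscopic temperature
    feq = 0.5 * T                    # equilibrium split equally over 2 directions
    f1 += -(f1 - feq) / tau + 0.5 * dt * g
    f2 += -(f2 - feq) / tau + 0.5 * dt * g
    f1 = np.roll(f1, 1)              # streaming step
    f2 = np.roll(f2, -1)
    f1[0] = t_left - f2[0]           # Dirichlet walls: enforce T at boundaries
    f2[-1] = t_right - f1[-1]

T = f1 + f2
print(round(float(T[nx // 2]), 2))   # midpoint of the steady linear profile
```

With no generation the steady profile is linear between the wall temperatures; adding the radiative source from a CVFEM solve at each step is what couples the two methods in the paper.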
Abstract: We present a simplified equalization technique for a
π/4 differential quadrature phase shift keying (π/4-DQPSK) modulated
signal in a multipath fading environment. The proposed equalizer is
realized as a fractionally spaced adaptive decision feedback equalizer
(FS-ADFE), employing an exponential step-size least mean square
(LMS) algorithm as the adaptation technique. The main advantage of
the scheme stems from the use of the exponential step-size LMS algorithm
in the equalizer, which achieves similar convergence behavior
as that of a recursive least squares (RLS) algorithm with significantly
reduced computational complexity. To investigate the finite-precision
performance of the proposed equalizer along with the π/4-DQPSK
modem, the entire system is evaluated on a 16-bit fixed point digital
signal processor (DSP) environment. The proposed scheme is found
to be attractive even for those cases where equalization is to be
performed within a restricted number of training samples.
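The exponential step-size idea can be sketched as an LMS loop whose step size decays geometrically as adaptation proceeds. For brevity this trains a plain linear equalizer on an assumed two-tap multipath channel rather than a full fractionally spaced decision feedback structure; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_taps, n_train = 5, 400
mu, decay = 0.5, 0.995                  # initial step size, exponential decay

channel = np.array([1.0, 0.4])          # assumed two-tap multipath channel
symbols = rng.choice([-1.0, 1.0], size=n_train)
received = np.convolve(symbols, channel)[:n_train]

w = np.zeros(n_taps)
errors = []
for k in range(n_taps, n_train):
    x = received[k - n_taps + 1:k + 1][::-1]   # tap-delay-line input, newest first
    e = symbols[k] - w @ x                     # training error for symbol k
    w += mu * e * x / (x @ x)                  # normalized LMS tap update
    mu *= decay                                # exponential step-size schedule
    errors.append(abs(e))

print(round(float(np.mean(errors[-50:])), 3))  # residual training error
```

The large initial step gives the fast early convergence the abstract attributes to the scheme, while the decaying step reduces steady-state misadjustment, at plain LMS cost per update rather than the quadratic cost of RLS.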
Abstract: In this paper, we propose a reversible watermarking
scheme based on histogram shifting (HS) to embed watermark bits
into the H.264/AVC standard videos by modifying the last nonzero
level in the context adaptive variable length coding (CAVLC) domain.
The proposed method collects all of the last nonzero coefficients
(also called last-level coefficients) of the 4×4 sub-macroblocks in a
macroblock and uses predictions of the current last level from the
neighboring blocks' last levels to embed the watermark bits. The
proposed method has low computational cost and allows reversible
recovery. The experimental results have demonstrated that
our proposed scheme has acceptable degradation on video quality and
output bit-rate for most test videos.
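The reversible mechanics of histogram shifting can be shown on plain integers standing in for the last nonzero CAVLC levels: values above the histogram peak are shifted by one to create a gap, and each watermark bit is then encoded at an occurrence of the peak. The coefficient list and payload below are invented.

```python
from collections import Counter

def embed(values, bits):
    """Histogram-shifting embed: shift values above the peak up by one,
    then encode each bit at a peak occurrence (0 -> peak, 1 -> peak+1)."""
    hist = Counter(values)
    peak = max(hist, key=hist.get)           # most frequent value
    out, it = [], iter(bits)
    for v in values:
        if v > peak:
            out.append(v + 1)                # shift to free the bin at peak+1
        elif v == peak:
            b = next(it, None)
            out.append(v if b in (None, 0) else v + 1)
        else:
            out.append(v)
    return out, peak

def extract(marked, peak):
    """Recover the bits and restore the original values exactly."""
    bits, restored = [], []
    for v in marked:
        if v == peak:
            bits.append(0); restored.append(v)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif v > peak + 1:
            restored.append(v - 1)           # undo the shift
        else:
            restored.append(v)
    return bits, restored

vals = [2, 3, 3, 1, 3, 4, 3, 2]
marked, peak = embed(vals, [1, 0, 1, 1])
bits, restored = extract(marked, peak)
print(bits, restored == vals)
```

Capacity equals the number of peak occurrences, and extraction restores the host values exactly, which is the reversibility property the scheme relies on.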
Abstract: Classification of video sequences based on their contents is a vital process for adaptation techniques. It helps decide which adaptation technique best fits the resource reduction requested by the client. In this paper we used the principal feature analysis algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class, based on the similarities between the features within that class. Our results showed that, using this feature reduction technique, the full set of source video features can be replaced by the reduced subset in future classification of video sequences.
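The "one feature per class" idea can be sketched as follows. The greedy correlation-threshold grouping is an assumption made for brevity; principal feature analysis proper clusters rows of the principal-component loading matrix before picking a representative per cluster.

```python
import numpy as np

# Six synthetic features built from two underlying signals, so that two
# natural feature classes exist (invented data for illustration).
rng = np.random.default_rng(2)
s = rng.normal(size=(200, 2))            # two underlying source signals
noise = rng.normal(size=(200, 2)) * 0.05
X = np.column_stack([s[:, 0], 2 * s[:, 0] + noise[:, 0], -s[:, 0],
                     s[:, 1], 0.5 * s[:, 1], s[:, 1] + noise[:, 1]])

# Group features whose pairwise |correlation| exceeds a threshold, then
# keep one representative feature per group.
corr = np.abs(np.corrcoef(X, rowvar=False))
groups, assigned = [], set()
for i in range(X.shape[1]):
    if i in assigned:
        continue
    group = [j for j in range(X.shape[1])
             if j not in assigned and corr[i, j] > 0.9]
    assigned.update(group)
    groups.append(group)

selected = [g[0] for g in groups]        # one representative per class
print(groups, selected)
```

The six correlated features collapse to two representatives, one per class, which is the kind of reduction that lets the remaining features be dropped from later classification.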
Abstract: To further advance research on immune-related genes
from T. molitor, we constructed a cDNA library and analyzed
expressed sequence tag (EST) sequences from 1,056 clones. After
removing vector sequences and quality checking through the Phred
program (trim_alt 0.05, P-score > 20), 1,039 sequences were generated.
The average insert length was 792 bp. In addition, we identified 162
clusters, 167 contigs and 391 contigs after the clustering and assembly
process using the TGICL package. EST sequences were searched against
the NCBI nr database by local BLAST (blastx, E
Abstract: Soccer simulation is an effort to motivate researchers and practitioners to do artificial and robotic intelligence research and, at the same time, to put the results into practice and test them. Many researchers and practitioners throughout the world are continuously working to polish their ideas and improve their implemented systems. At the same time, new groups are forming and bringing bright new thoughts to the field. The research includes designing and executing robotic soccer simulation algorithms. In our research, a soccer simulation player is considered to be an intelligent agent that is capable of receiving information from the environment, analyzing it, and choosing the best action from a set of possible ones for its next move. We concentrate on developing a two-phase method for the soccer player agent to choose its best next move. The method is then implemented in our software system, the Nexus simulation team of Ferdowsi University. This system is based on the TsinghuAeolus [1] team, the champion of the world RoboCup soccer simulation contest in 2001 and 2002.
Abstract: One of the main concerns in the Information Technology field is the adoption of new technologies in organizations, which may increase the pace of usage of these technologies. This study looks at the role of culture in accepting and using new technologies in organizations, and examines the effect of culture on acceptance of, and intention to use, new technology. Studies show that culture is one of the most important barriers to adopting new technologies. The model used for accepting and using new technology is the Technology Acceptance Model (TAM), while for culture and its dimensions the well-known theory by Hofstede was used. Results of the study show a significant effect of culture on intention to use new technologies. All four dimensions of culture were tested to find the strength of their relationship with behavioral intention to use new technologies. Findings indicate the important role of culture in the level of intention to use new technologies and the different role of each dimension in improving the adaptation process. The study suggests that efforts to transfer new technologies are most likely to be successful if the parties are culturally aligned.
Abstract: Knowing consumers' preferences and perceptions in
the sensory evaluation of drink products is very significant to
manufacturers and retailers alike. Without appropriate sensory
analysis, there is a high risk of market disappointment. This paper
aims to rank selected coffee products and also to determine the
most important quality attribute through sensory evaluation using a fuzzy
decision making model. Three products of coffee drinks were used
for sensory evaluation. Data were collected from thirty judges at a
hypermarket in Kuala Terengganu, Malaysia. The judges were asked
to specify their sensory evaluation in linguistic terms of the quality
attributes of colour, smell, taste and mouth feel for each product and
also the weight of each quality attribute. Five fuzzy linguistic terms
representing the quality attributes were introduced prior to the
analysis. The judgment membership functions and the weights were
compared to rank the products and to determine the best quality
attribute. The Indoc product was ranked first, and 'taste' was judged
the best quality attribute. These results indicate the importance of
sensory evaluation in identifying consumers' preferences and also the
competency of the fuzzy approach in decision making.
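The fuzzy ranking step can be sketched by mapping linguistic judgments to triangular fuzzy numbers, taking the weighted average per product, and ranking by the defuzzified (centroid) score. The term scale, weights, and judgments below are invented, not the paper's survey data.

```python
TERMS = {  # linguistic term -> triangular fuzzy number (low, mid, high)
    "poor": (0, 0, 2.5), "fair": (0, 2.5, 5), "good": (2.5, 5, 7.5),
    "very good": (5, 7.5, 10), "excellent": (7.5, 10, 10),
}

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(tfn) / 3.0

def product_score(judgments, weights):
    """judgments: {attribute: linguistic term};
    weights: {attribute: weight} (assumed to sum to 1)."""
    return sum(weights[a] * centroid(TERMS[t]) for a, t in judgments.items())

weights = {"colour": 0.2, "smell": 0.2, "taste": 0.4, "mouthfeel": 0.2}
products = {
    "A": {"colour": "good", "smell": "very good",
          "taste": "excellent", "mouthfeel": "good"},
    "B": {"colour": "fair", "smell": "good",
          "taste": "good", "mouthfeel": "fair"},
}
ranking = sorted(products, key=lambda p: product_score(products[p], weights),
                 reverse=True)
print(ranking)
```

Giving 'taste' the largest weight is how a single attribute can dominate the ranking, mirroring the paper's finding that taste was the decisive quality attribute.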
Abstract: Recent advances in the field of computing have
massively increased the use of web-based electronic documents.
Current copyright protection laws are inadequate to prove
ownership of electronic documents and do not provide strong
protection against copying and manipulating information from the
web. This has
opened many channels for securing information and significant
evolutions have been made in the area of information security.
Digital Watermarking has developed into a very dynamic area of
research and has addressed challenging issues for digital content.
Watermarking can be visible (logos or signatures) and invisible
(encoding and decoding). Many visible watermarking techniques
have been studied for text documents but there are very few for web
based text. XML files are used to trade information on the internet
and contain important information. In this paper, two invisible
watermarking techniques, using synonyms and acronyms, are
proposed for XML files to prove intellectual ownership and to
achieve security. An analysis is made for different attacks, and
the embedding capacity of the XML file is also measured. A
comparative capacity analysis is made for both methods. The
system has been implemented in C#, and all tests were performed
practically to obtain the results.
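The synonym technique can be illustrated on plain text of the kind found in XML text nodes: each synonym pair carries one bit, so the chosen word encodes the watermark without visibly altering the meaning. The pair list is an invented stand-in for the paper's dictionary.

```python
# Each pair encodes one bit: index 0 -> bit 0, index 1 -> bit 1.
PAIRS = [("big", "large"), ("quick", "fast"), ("begin", "start")]
BIT = {w: b for pair in PAIRS for b, w in enumerate(pair)}
CANON = {w: pair for pair in PAIRS for w in pair}

def embed(text, bits):
    """Replace each watermarkable word with the pair member that
    encodes the next watermark bit."""
    out, it = [], iter(bits)
    for word in text.split():
        if word in CANON:
            b = next(it, None)
            out.append(word if b is None else CANON[word][b])
        else:
            out.append(word)
    return " ".join(out)

def extract(text):
    """Read the watermark back from the synonym choices."""
    return [BIT[w] for w in text.split() if w in BIT]

marked = embed("the big dog made a quick move to begin", [1, 0, 1])
print(marked)
print(extract(marked))
```

Capacity is bounded by the number of watermarkable words in the document, which is why the paper measures and compares embedding capacity for both techniques.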
Abstract: One major source of performance decline in speaker
recognition system is channel mismatch between training and testing.
This paper focuses on improving channel robustness of speaker
recognition system in two aspects of channel compensation technique
and channel robust features. The system is text-independent speaker
identification system based on two-stage recognition. In the aspect of
channel compensation technique, this paper applies the MAP
(Maximum A Posteriori Probability) channel compensation technique,
previously used in speech recognition, to the speaker recognition
system. In the
aspect of channel robust features, this paper introduces
pitch-dependent features and pitch-dependent speaker model for the
second stage recognition. Based on the first stage recognition to
testing speech using GMM (Gaussian Mixture Model), the system
uses GMM scores to decide if it needs to be recognized again. If it
needs to, the system selects a few speakers from all of the speakers
who participate in the first stage recognition for the second stage
recognition. For each selected speaker, the system obtains three
pitch-dependent results from his pitch-dependent speaker model, and
then uses an ANN (Artificial Neural Network) to combine the three
pitch-dependent results and one GMM score into a fused result. The
system makes the second-stage recognition decision based on these
fused results. The experiments show that the correct rate of the two-stage
results. The experiments show that the correct rate of two-stage
recognition system based on MAP channel compensation technique
and pitch-dependent features is 41.7% better than the baseline system
for closed-set test.
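The fusion step can be sketched as follows. The abstract trains an ANN to unite the scores; here a fixed weighted logistic combination stands in for that network, and the weights, bias, and candidate scores are invented.

```python
import math

def fuse(pitch_scores, gmm_score, weights=(0.2, 0.2, 0.2, 0.4), bias=-0.5):
    """Combine three pitch-dependent scores and one GMM score into a
    single fused value via a weighted sum and logistic squashing
    (a stand-in for the trained ANN)."""
    z = sum(w * s for w, s in zip(weights, list(pitch_scores) + [gmm_score])) + bias
    return 1.0 / (1.0 + math.exp(-z))

candidates = {   # speaker -> (pitch-dependent scores, GMM score)
    "spk1": ((0.9, 0.8, 0.7), 0.9),
    "spk2": ((0.4, 0.5, 0.3), 0.6),
}
best = max(candidates, key=lambda s: fuse(*candidates[s]))
print(best)
```

Only the few speakers shortlisted by the first-stage GMM scores reach this step, so the fusion runs over a small candidate set rather than the whole speaker population.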