Abstract: As the enormous amount of on-line text grows on the
World-Wide Web, the development of methods for automatically
summarizing this text becomes more important. The primary goal of
this research is to create an efficient tool that is able to summarize
large documents automatically. We propose an Evolving
Connectionist System: an adaptive, incremental learning and
knowledge representation system that evolves its structure and
functionality. In this paper, we propose a novel approach to part-of-speech
disambiguation using a recurrent neural network, a paradigm
capable of dealing with sequential data. We observed that the
connectionist approach to text summarization has a natural way of
learning grammatical structures through experience. Experimental
results show that our approach achieves acceptable performance.
Abstract: Ontology is widely being used as a tool for organizing
information, creating the relation between the subjects within the
defined knowledge domain. Various fields such as civil engineering,
biology, and management have successfully integrated ontology into
decision support systems to manage domain knowledge and
assist their decision makers. Gross pollutant traps (GPTs) are devices
used to trap large items or hazardous particles and prevent them from
polluting and entering our waterways. However, choosing and
specifying a suitable GPT is a challenge in Malaysia, as few
GPT data repositories are captured and shared. Hence, an ontology is
needed to capture, organize and represent this knowledge as
meaningful information, which can contribute to the efficiency of
GPT selection in Malaysian urbanization. A GPT ontology framework
is therefore built as the first step to capture GPT knowledge which
will then be integrated into the decision support system. This paper
will provide several examples of the GPT ontology, and explain how
it is constructed by using the Protégé tool.
Abstract: The circle grid space-filling plate is a flow conditioner with a fractal pattern, used to eliminate turbulence originating from pipe fittings in experimental fluid flow applications. In this paper, steady-state, incompressible, swirling turbulent flow through a circle grid space-filling plate has been studied. The solution and the analysis were carried out using the finite volume CFD solver FLUENT 6.2. Three turbulence models were used in the numerical investigation and their results were compared with the pressure drop correlation of BS EN ISO 5167-2:2003. The turbulence models investigated here are the standard k-ε, the realizable k-ε, and the Reynolds Stress Model (RSM). The results showed that the RSM gave the best agreement with the ISO pressure drop correlation. The effects of plate thickness and Reynolds number on the flow characteristics have been investigated as well.
Abstract: The development of shape and size of a crack in a
pressure vessel under uniaxial and biaxial loadings is important in
fitness-for-service evaluations such as leak-before-break. In this
work finite element modelling was used to evaluate the mean stress
and the J-integral along the front of a surface-breaking crack. A
procedure based on ductile tearing resistance curves of high- and
low-constraint fracture mechanics geometries was developed to
estimate the amount of ductile crack extension for surface-breaking
cracks and to show the evolution of the initial crack shape. The
results showed non-uniform constraint levels and crack driving forces
around the crack front at large deformation levels. It was also shown
that initially semi-elliptical surface cracks under biaxial load
developed higher constraint levels around the crack front than in
uniaxial tension. However, similar crack shapes were observed, with
more extension associated with cracks under biaxial loading.
Abstract: We demonstrate single-photon interference over 10 km using a plug-and-play system for quantum key distribution. The quality of the interferometer is measured using the interferometer
visibility. The signal is phase coded, and the visibility is derived from the interference effect, which results in a photon count. The setup gives full control of polarization inside
the interferometer. The quality measurement of the interferometer is based on the number of counts per second, and the system produces 94% visibility in one of the detectors.
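The visibility figure quoted above follows the conventional definition V = (Nmax - Nmin) / (Nmax + Nmin) over the interference counts. A minimal sketch, using illustrative counts chosen to reproduce the 94% figure (these are not the measured values):

```python
def visibility(n_max, n_min):
    """Interference visibility V = (Nmax - Nmin) / (Nmax + Nmin),
    computed from the maximum and minimum photon counts per second."""
    return (n_max - n_min) / (n_max + n_min)

# Illustrative counts only, not measured data.
print(round(visibility(9700, 300), 2))  # 0.94
```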
Abstract: The third phase of the web, the Semantic Web, requires many web pages to be annotated with metadata. Thus, a crucial question is where to acquire these metadata. In this paper we propose a semi-automatic method to annotate the text of documents and web pages, employing a comprehensive knowledge base to categorize instances with regard to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools, which works in the same way as ours. The approach, an annotation tool for the Semantic Web, is implemented in the .NET framework and uses WordNet as its knowledge base.
Abstract: Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. One way to search for essential features is via a decision tree, which acts as an intermediate feature space inducer from which essential features are chosen. In decision tree-based feature selection, some studies have used the decision tree as a feature ranker with a direct threshold measure, while others have kept the decision tree but utilized a pruning condition that acts as a threshold mechanism to choose features. This paper proposes a threshold measure using the Manhattan hierarchical cluster distance, to be utilized in feature ranking in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a higher number of attributes.
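The thresholding idea can be illustrated with a toy sketch. The paper's measure is a Manhattan hierarchical cluster distance over decision-tree-derived scores; the sketch below uses only the one-dimensional special case (the Manhattan distance between two scalar scores is their absolute gap) and hypothetical score values, cutting the ranking at the widest gap:

```python
def select_features(scores):
    """Toy sketch: rank features by a decision-tree importance score and
    cut the ranking at the widest gap.  In one dimension the Manhattan
    distance between two scores is simply |a - b|."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    gaps = [ranked[i][1] - ranked[i + 1][1] for i in range(len(ranked) - 1)]
    cut = gaps.index(max(gaps)) + 1   # keep features above the widest gap
    return [name for name, _ in ranked[:cut]]

# Hypothetical importance scores, e.g. taken from a trained decision tree.
scores = {"f1": 0.42, "f2": 0.38, "f3": 0.11, "f4": 0.06, "f5": 0.03}
print(select_features(scores))  # ['f1', 'f2']
```

Here the widest gap lies between f2 and f3, so only the two top-ranked features are kept; the full method clusters scores hierarchically rather than using a single gap.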
Abstract: The creation of a sustainable future depends on the knowledge and involvement of the people, as well as an understanding of the consequences of individual actions. The construction industry has long been associated with detrimental effects on our mother earth. In Malaysia, the government, professional bodies and private companies are beginning to take heed of the necessity to reduce this environmental problem without restraining the need for development. This paper focuses on the actions undertaken by the Malaysian government, non-governmental organizations and construction players in promoting sustainability in construction. To ensure that those concerted efforts are not only skin deep in their impact, a survey was conducted to investigate the awareness of developers regarding this issue and whether those developers have absorbed the concept of sustainable construction into their current practices. The survey revealed that although developers are aware of the rising issues of sustainability, little effort has been made by them to implement it. More effort is necessary to boost this application and further stimulate actions and strategies towards a sustainable built environment.
Abstract: The complexity of today's software systems makes
collaborative development necessary to accomplish tasks.
Frameworks are necessary to allow developers to perform their tasks
independently yet collaboratively. Similarity detection is one of the
major issues to consider when developing such frameworks. It allows
developers to mine existing repositories when developing their own
views of a software artifact, and it is necessary for identifying the
correspondences between the views to allow merging them and
checking their consistency. Due to the importance of the
requirements specification stage in software development, this paper
proposes a framework for collaborative development of Object-
Oriented formal specifications along with a similarity detection
approach to support the creation, merging and consistency checking
of specifications. The paper also explores the impact of using
additional concepts on improving the matching results. Finally, the
proposed approach is empirically evaluated.
Abstract: XML has become a popular standard for information exchange via the web. Each XML document can be represented as a rooted, ordered, labeled tree. A node's label shows the exact position of the node in the original document. Region and Dewey encoding are two well-known methods of labeling trees. In this paper, we propose a new insert-friendly labeling method named IFDewey, based on a recently proposed scheme called Extended Dewey. In Extended Dewey, many labels must be modified when a new node is inserted into the XML tree. Our method eliminates this problem by reserving even numbers for future insertions. Numbers generated by Extended Dewey may be even or odd; IFDewey modifies Extended Dewey so that only odd numbers are generated, and even numbers can then be used for much easier insertion of nodes.
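The reserved-even-number idea can be sketched as follows. This is a minimal illustration with function names of our own; the actual Extended Dewey scheme also encodes element tag information in the label components, which is omitted here:

```python
def child_label(parent, index):
    """IFDewey-style label for the index-th child (0-based): child
    components are the odd numbers 1, 3, 5, ..., leaving even numbers
    free for future insertions without relabeling existing nodes."""
    component = 2 * index + 1
    return f"{parent}.{component}" if parent else str(component)

def insert_between(left, right):
    """Pick a free even component between two existing odd siblings,
    so no existing label has to be modified."""
    gap = (left + right) // 2
    assert left < gap < right, "no free component left between siblings"
    return gap

print(child_label("1.3", 0))  # 1.3.1
print(child_label("1.3", 1))  # 1.3.3
print(insert_between(1, 3))   # 2
```

Between two adjacent odd components there is exactly one even slot, so repeated insertions at the same position would eventually still require a fallback; the sketch only shows the common single-insert case the abstract describes.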
Abstract: Personal computers draw non-sinusoidal current
with significant odd harmonics. The power quality of
distribution networks is severely affected by the flow of these
generated harmonics during the operation of electronic loads. In
this paper, mathematical modeling of the odd current harmonics
(3rd, 5th, 7th and 9th) influencing power quality is presented.
Live signals have been captured with the help of a power quality
analyzer for analysis. An interesting finding, verified theoretically,
is that the total harmonic distortion (THD) of the current decreases
as the number of nonlinear loads increases. The results
obtained using the mathematical expressions have been compared with
the practical results, and good agreement has been found.
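The THD behaviour discussed above follows the standard definition THD = sqrt(sum of In^2 for n >= 2) / I1. A small sketch with hypothetical (not measured) odd-harmonic amplitudes:

```python
import math

def thd(harmonics):
    """Total harmonic distortion of a current:
    THD = sqrt(sum of I_n^2 for n >= 2) / I_1,
    where `harmonics` maps harmonic order n to amplitude I_n."""
    fundamental = harmonics[1]
    distortion = math.sqrt(sum(a * a for n, a in harmonics.items() if n > 1))
    return distortion / fundamental

# Hypothetical odd-harmonic amplitudes in amperes, not measured values.
print(round(thd({1: 1.0, 3: 0.6, 5: 0.4, 7: 0.2, 9: 0.1}), 3))  # 0.755
```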
Abstract: Simultaneous transient conduction and radiation heat
transfer with heat generation is investigated. Analysis is carried out
for both steady and unsteady situations. A two-dimensional gray
cylindrical enclosure with an absorbing, emitting, and isotropically
scattering medium is considered. Enclosure boundaries are assumed
at specified temperatures. The heat generation rate is considered
uniform and constant throughout the medium. The lattice Boltzmann
method (LBM) was used to solve the energy equation of a transient
conduction-radiation heat transfer problem. The control volume finite
element method (CVFEM) was used to compute the radiative
information. To study the compatibility of the LBM for the energy
equation and the CVFEM for the radiative transfer equation, transient
conduction and radiation heat transfer problems in 2-D cylindrical
geometries were considered. In order to establish the suitability of the
LBM, the energy equation of the present problem was also solved
using the finite difference method (FDM) of computational
fluid dynamics. The CVFEM used in the radiative heat transfer was
employed to compute the radiative information required for the
solution of the energy equation using the LBM or the FDM (of the
CFD). Results were analyzed for the effects of various
parameters such as the boundary emissivity. The results of the
LBM-CVFEM combination were found to be in excellent agreement with
those of the FDM-CVFEM combination. The number of iterations and the
steady state temperature in both of the combinations were found
comparable. Results are found for situations with and without heat
generation. Heat generation is found to have significant bearing on
temperature distribution.
Abstract: We present a simplified equalization technique for a
π/4 differential quadrature phase shift keying (π/4-DQPSK) modulated
signal in a multipath fading environment. The proposed equalizer is
realized as a fractionally spaced adaptive decision feedback equalizer
(FS-ADFE), employing exponential step-size least mean square
(LMS) algorithm as the adaptation technique. The main advantage of
the scheme stems from the usage of exponential step-size LMS algorithm
in the equalizer, which achieves similar convergence behavior
as that of a recursive least squares (RLS) algorithm with significantly
reduced computational complexity. To investigate the finite-precision
performance of the proposed equalizer along with the π/4-DQPSK
modem, the entire system is evaluated on a 16-bit fixed point digital
signal processor (DSP) environment. The proposed scheme is found
to be attractive even for those cases where equalization is to be
performed within a restricted number of training samples.
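As a rough illustration of the adaptation rule described above, the following is a minimal real-valued sketch of LMS with an exponentially decaying step size (mu_k = mu0 * alpha^k). It is our own simplification: the paper's equalizer is fractionally spaced, complex-valued and decision-feedback, none of which is modeled here, and the demo identifies a hypothetical 2-tap channel rather than a fading channel:

```python
def lms_exponential(x, d, taps=4, mu0=0.01, alpha=0.9995):
    """LMS adaptation with an exponentially decaying step size
    mu_k = mu0 * alpha**k (simplified, real-valued sketch)."""
    w = [0.0] * taps
    mu = mu0
    for k in range(taps - 1, len(x)):
        u = x[k - taps + 1:k + 1][::-1]            # newest sample first
        y = sum(wi * ui for wi, ui in zip(w, u))   # filter output
        e = d[k] - y                               # a-priori error
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
        mu *= alpha                                # exponential decay
    return w

# Demo: identify a hypothetical 2-tap channel d[k] = 0.5*x[k] + 0.25*x[k-1]
x = [((i * 37) % 11) - 5 for i in range(2000)]     # deterministic test input
d = [0.5 * x[k] + (0.25 * x[k - 1] if k else 0.0) for k in range(len(x))]
w = lms_exponential(x, d)                          # w ≈ [0.5, 0.25, 0, 0]
```

The decaying step size gives fast initial adaptation followed by a shrinking misadjustment, which is the RLS-like behavior the abstract refers to, at plain-LMS cost per sample.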
Abstract: With the enormous growth of the web, users easily get
lost in its rich hyperstructure. Thus, developing user-friendly,
automated tools that provide users with relevant information free of
redundant links is a primary task for website owners. Most existing
web mining algorithms have concentrated on finding frequent patterns
while neglecting the less frequent ones, which are likely to contain
outlying data such as noise and irrelevant or redundant data. This
paper proposes a new algorithm for mining web content by detecting
redundant links in web documents using set-theoretic operations (from
classical mathematics) such as subset, union and intersection. The
redundant links are then removed from the original web content to
give the user the required information.
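The abstract describes the algorithm only at the level of subset, union and intersection, so the following toy sketch, with hypothetical page data, is one plausible reading: links common to all pages (e.g. navigation links) are treated as redundant and removed via set intersection and difference:

```python
def remove_redundant_links(pages):
    """Toy sketch of the set-theoretic idea: links appearing in the
    intersection of every page's link set (e.g. navigation bars) are
    treated as redundant and removed via set difference."""
    common = set.intersection(*pages.values())   # links shared by all pages
    return {url: links - common for url, links in pages.items()}

# Hypothetical page data for illustration.
pages = {
    "a.html": {"home", "login", "news/item1"},
    "b.html": {"home", "login", "news/item2"},
}
print(remove_redundant_links(pages))
# {'a.html': {'news/item1'}, 'b.html': {'news/item2'}}
```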
Abstract: The weight constrained shortest path problem
(WCSPP) is one of the best-known basic problems in
combinatorial optimization. Because of its importance in many areas
of application, such as computer science, engineering and operations
research, many researchers have studied the WCSPP extensively.
This paper concentrates on reducing the total search space
for the WCSPP using a Genetic Algorithm (GA). For
this purpose, controlled schemes of genetic operators are
applied to a list chromosome representation. This approach gives a
near-optimum solution in fewer elapsed generations than the classical
GA technique. From further analysis, a new
generalized schema theorem is also developed from the philosophy
of Holland's theorem.
Abstract: In this paper, we propose a reversible watermarking
scheme based on histogram shifting (HS) to embed watermark bits
into the H.264/AVC standard videos by modifying the last nonzero
level in the context adaptive variable length coding (CAVLC) domain.
The proposed method collects all the last nonzero coefficients (the
so-called last level coefficients) of the 4×4 sub-macroblocks in a
macroblock and utilizes predictions of the current last level from the
neighboring blocks' last levels to embed watermark bits. The proposed
method has low computational complexity and allows reversible
recovery. The experimental results have demonstrated that our
proposed scheme causes acceptable degradation in video quality and
output bit-rate for most test videos.
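Histogram shifting can be illustrated on a plain list of integer coefficients. This is a generic HS sketch of our own, not the paper's CAVLC-domain method: it omits the neighbor-based prediction of the last level and operates on a hypothetical coefficient list. Values above the peak are shifted up by one to empty a bin, each occurrence of the peak then carries one bit, and extraction restores the original list exactly:

```python
def hs_embed(coeffs, bits, peak=0):
    """Histogram-shifting embed: values above `peak` shift up by one to
    empty the bin peak+1; each occurrence of `peak` then carries one bit
    (peak stays for a 0 bit, becomes peak+1 for a 1 bit)."""
    out, it = [], iter(bits)
    for c in coeffs:
        if c > peak:
            out.append(c + 1)            # shift to make room
        elif c == peak:
            out.append(c + next(it, 0))  # embed one watermark bit
        else:
            out.append(c)
    return out

def hs_extract(coeffs, peak=0):
    """Recover the bits and restore the original coefficients exactly."""
    bits, orig = [], []
    for c in coeffs:
        if c == peak:
            bits.append(0); orig.append(peak)
        elif c == peak + 1:
            bits.append(1); orig.append(peak)
        elif c > peak + 1:
            orig.append(c - 1)           # undo the shift
        else:
            orig.append(c)
    return bits, orig

marked = hs_embed([0, 2, -1, 0, 1, 0], [1, 0, 1])
print(marked)              # [1, 3, -1, 0, 2, 1]
print(hs_extract(marked))  # ([1, 0, 1], [0, 2, -1, 0, 1, 0])
```

Capacity equals the number of coefficients equal to the peak value, and the shift-by-one is what causes the mild quality and bit-rate degradation the abstract reports.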
Abstract: Classification of video sequences based on their content is a vital process for adaptation techniques. It helps decide which adaptation technique best fits the resource reduction requested by the client. In this paper we used the principal feature analysis algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class, based on the similarities between the features within that class. Our results showed that, using this feature reduction technique, the redundant source video features can be completely omitted from future classification of video sequences.
Abstract: To further advance research on immune-related genes
from T. molitor, we constructed a cDNA library and analyzed
expressed sequence tag (EST) sequences from 1,056 clones. After
removing vector sequences and quality checking through the Phred
program (trim_alt 0.05, P-score > 20), 1,039 sequences were generated.
The average insert length was 792 bp. In addition, we identified 162
clusters, 167 contigs and 391 contigs after the clustering and assembly
process using the TGICL package. EST sequences were searched against
the NCBI nr database by local BLAST (blastx, E
Abstract: The use of machine vision to inspect the outcome of
surgical tasks is investigated, with the aim of incorporating this
approach in robotic surgery systems. Machine vision is a non-contact
form of inspection i.e. no part of the vision system is in direct contact
with the patient, and is therefore well suited for surgery where
sterility is an important consideration. As a proof of concept, three
primary surgical tasks for a common neurosurgical procedure were
inspected using machine vision. Experiments were performed on
cadaveric pig heads to simulate the two possible outcomes i.e.
satisfactory or unsatisfactory, for tasks involved in making a burr
hole, namely incision, retraction, and drilling. We identify low level
image features to distinguish the two outcomes, as well as report on
results that validate our proposed approach. The potential of using
machine vision in a surgical environment, and the challenges that
must be addressed, are identified and discussed.
Abstract: This paper outlines the development of a learning retrieval agent. The task of this agent is to extract knowledge from the Active Semantic Network with respect to user requests. Based on a reinforcement learning approach, the agent learns to interpret the user's intention. In particular, the learning algorithm focuses on the retrieval of complex long-distance relations. Increasing its learnt knowledge with every request-result-evaluation sequence, the agent enhances its capability of finding the intended information.