Abstract: SeqWord Gene Island Sniffer, a new program for
the identification of mobile genetic elements in sequences of bacterial chromosomes, is presented. The program is based on the
analysis of oligonucleotide usage variations in DNA sequences. 3,518 mobile genetic elements were identified in 637 bacterial
genomes and further analyzed by sequence similarity and by the
functionality of the encoded proteins. The results of this study are stored in an open database (http://anjie.bi.up.ac.za/geidb/geidbhome.php).
The program and the database provide information valuable for further investigation of the
distribution of mobile genetic elements and virulence factors among bacteria. The program is available for download at www.bi.up.ac.za/SeqWord/sniffer/index.html.
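The oligonucleotide-usage idea behind such tools can be sketched in a few lines: compare the k-mer (e.g., tetranucleotide) frequency pattern of a sliding window against the genome-wide pattern and flag windows that deviate strongly, since horizontally acquired islands tend to carry an alien compositional signature. The sketch below is a minimal illustration, not the SeqWord algorithm itself; the window size, step, distance measure, and threshold are arbitrary choices for demonstration.

```python
from collections import Counter

def kmer_freqs(seq, k=4):
    """Relative frequencies of all overlapping k-mers in seq."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def ou_deviation(genome, window, k=4):
    """Distance between a window's oligonucleotide usage and the genome-wide pattern."""
    g, w = kmer_freqs(genome, k), kmer_freqs(window, k)
    return sum(abs(g.get(m, 0.0) - w.get(m, 0.0)) for m in set(g) | set(w))

def scan(genome, win=48, step=24, k=4, threshold=0.5):
    """Start positions of windows whose usage deviates strongly (island candidates)."""
    return [s for s in range(0, len(genome) - win + 1, step)
            if ou_deviation(genome, genome[s:s + win], k) > threshold]
```

Running the scan on a sequence with a compositionally alien insert flags the windows covering the insert.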
Abstract: Recent developments in soft computing techniques,
power electronic switches, and low-cost computational hardware have
made it possible to design and implement sophisticated control
strategies for sensorless speed control of AC motor drives. Such an
attempt is made in this work for sensorless speed control of an
induction motor (IM) by means of Direct Torque Fuzzy Control
(DTFC), a PI-type fuzzy speed regulator, and an MRAS speed estimator,
a strategy that is inherently nonlinear. Direct torque
control is known to produce a quick and robust response in AC drive
systems. However, during steady state, torque, flux, and current ripples
occur. Hence, the performance of conventional DTC with a PI speed
regulator can be improved by applying fuzzy logic techniques.
Important design issues, including the space vector
modulated (SVM) 3-Φ voltage source inverter, the DTFC design, the
generation of the reference torque by the PI-type fuzzy speed regulator,
and the sensorless speed estimator, have been resolved. The proposed
scheme is validated through extensive numerical simulations in
MATLAB. The simulation results indicate that sensorless speed
control of the IM with DTFC and a PI-type fuzzy speed regulator provides
satisfactory dynamic and static performance compared to
conventional DTC with a PI speed regulator.
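As a rough illustration of how the speed loop generates the reference torque, the sketch below implements a plain PI speed regulator with output clamping and closes it around a first-order mechanical model. The paper's regulator is a PI-type *fuzzy* regulator that adapts its action online; here the gains and the torque limit are fixed, illustrative values.

```python
class PISpeedRegulator:
    """PI speed regulator producing the reference torque for the drive.
    (The paper's regulator is PI-type *fuzzy*, adapting its action online;
    kp, ki and the torque limit here are fixed illustrative values.)"""

    def __init__(self, kp=0.5, ki=10.0, t_max=20.0):
        self.kp, self.ki, self.t_max = kp, ki, t_max
        self.integral = 0.0

    def step(self, speed_ref, speed_meas, dt):
        err = speed_ref - speed_meas
        self.integral += err * dt
        torque_ref = self.kp * err + self.ki * self.integral
        # clamp the reference torque to the drive limit
        return max(-self.t_max, min(self.t_max, torque_ref))

# close the loop around a first-order mechanical model: J * dw/dt = T - T_load
reg = PISpeedRegulator()
J, T_load, dt = 0.05, 2.0, 1e-3
w = 0.0                      # rotor speed, rad/s
for _ in range(3000):        # 3 s of simulated time
    w += (reg.step(20.0, w, dt) - T_load) / J * dt
```

With these values the integral term absorbs the load torque, so the speed settles at the reference with zero steady-state error.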
Abstract: In this work, social stratification is considered one
of the significant factors generating the phenomenon of terrorism, and
the emphasis is placed on the correlation between the two, with the
objective of creating an info-logical model of the generation of
terrorism based on the stratification process.
Abstract: Using Dynamic Bayesian Networks (DBN) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring the interactions among genes. Averaging over a collection of models when predicting the network is preferable to relying on a single high-scoring model. In this paper, two kinds of model-searching approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but there has been no comparative study of which is better for DBN models. Different types of experiments have been carried out to provide a benchmark test for these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from the comparison between these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches to infer gene networks from a few snapshots of high-dimensional gene profiles. Through experiments on synthetic data as well as systematic data, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size, and system complexity.
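A minimal sketch of the GSR side of the comparison, under loud simplifying assumptions: binary variables, a BIC-style decomposable score, and moves that toggle a single t → t+1 edge (DBN transition edges need no acyclicity check). The score form, penalty weight, and restart scheme are illustrative choices, not those of the paper.

```python
import math, random
from collections import defaultdict

def family_score(i, parents, transitions, penalty=0.5):
    """BIC-style score of node i given a tuple of parent indices:
    max log-likelihood of x[t+1][i] given parent values at time t,
    minus a complexity penalty per free parameter."""
    counts = defaultdict(lambda: [0, 0])
    for x_prev, x_next in transitions:
        key = tuple(x_prev[j] for j in parents)
        counts[key][x_next[i]] += 1
    ll = 0.0
    for c0, c1 in counts.values():
        n = c0 + c1
        for c in (c0, c1):
            if c:
                ll += c * math.log(c / n)
    return ll - penalty * (2 ** len(parents)) * math.log(len(transitions))

def structure_score(parents_of, transitions):
    return sum(family_score(i, tuple(sorted(ps)), transitions)
               for i, ps in enumerate(parents_of))

def hill_climb_with_restarts(n_nodes, transitions, restarts=5, seed=0):
    """Greedy search with restarts (GSR): repeatedly toggle the single
    t -> t+1 edge that improves the score; restart from a random
    structure and keep the best local optimum found."""
    rng = random.Random(seed)
    best, best_s = None, -math.inf
    for _ in range(restarts):
        parents = [set(j for j in range(n_nodes) if rng.random() < 0.3)
                   for _ in range(n_nodes)]
        s = structure_score(parents, transitions)
        improved = True
        while improved:
            improved = False
            for i in range(n_nodes):
                for j in range(n_nodes):
                    parents[i].symmetric_difference_update({j})  # toggle j -> i
                    s2 = structure_score(parents, transitions)
                    if s2 > s + 1e-12:
                        s, improved = s2, True
                    else:
                        parents[i].symmetric_difference_update({j})  # revert
        if s > best_s:
            best, best_s = [set(p) for p in parents], s
    return best, best_s
```

On synthetic binary time series in which node 1 at t+1 simply copies node 0 at t, the search recovers the single true transition edge 0 → 1.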
Abstract: The compression-absorption heat pump (C-A HP), one
of the promising heat recovery technologies that produce process hot
water using the low-temperature heat of wastewater, was evaluated by
computer simulation. A simulation program was developed based on
continuity and the first and second laws of thermodynamics. Both
the absorber and the desorber were modeled using the UA-LMTD method. In
order to prevent unfeasible temperature profiles and to reduce
calculation errors arising from the curved temperature profile of a mixture,
the heat loads were divided into many segments. A single-stage
compressor was considered, and the compressor cooling load was also
taken into account. The isentropic efficiency was computed from
map data. Simulation conditions were set based on a system
consisting of conventionally designed components. The simulation results
show that most of the total entropy generation occurs during the
compression and cooling process, suggesting that
system performance could be enhanced if a rectifier were introduced.
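The segmented UA-LMTD idea can be illustrated as follows: instead of applying the log-mean temperature difference once between the terminal temperatures, the duty is split into segments bounded by intermediate temperatures, and UA_i = Q_i / LMTD_i is summed over the segments, which captures a curved temperature profile. The even split of the duty and the sample temperatures below are illustrative assumptions, not values from the paper.

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference across one segment."""
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def required_ua(hot_profile, cold_profile, q_total):
    """Split the duty q_total evenly across segments bounded by the given
    boundary temperatures (listed in the same spatial order for both
    streams, len = n_segments + 1) and sum UA_i = Q_i / LMTD_i."""
    n = len(hot_profile) - 1
    q_seg = q_total / n
    ua = 0.0
    for i in range(n):
        dt1 = hot_profile[i] - cold_profile[i]
        dt2 = hot_profile[i + 1] - cold_profile[i + 1]
        ua += q_seg / lmtd(dt1, dt2)
    return ua
```

For a curved hot-side profile, the single-segment calculation between the terminal temperatures underestimates the UA that the segmented calculation reports.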
Abstract: In this paper we propose a Multiple Description Image Coding (MDIC) scheme that generates two compressed, rate-balanced descriptions in the wavelet domain (Daubechies biorthogonal (9, 7) wavelet) using an optimal pairwise correlating transform, applying Generalized Multiple Description Coding (GMDC) to image coding in the wavelet domain. The GMDC produces statistically correlated streams such that lost streams can be estimated from the received data. Our performance tests show that the proposed method yields greater improvement and better quality of the reconstructed image when the wavelet coefficients are normalized by a Gaussian Scale Mixture (GSM) model rather than a Gaussian one.
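The correlating-transform idea behind GMDC can be sketched with the simplest pairwise transform, a 45-degree rotation of a coefficient pair: each description then carries information about both coefficients, so a lost description can be estimated linearly from the received one. The optimal transform design and the GSM normalization used in the paper are beyond this sketch; the estimator below is the standard linear MMSE estimate under the stated assumptions.

```python
import math, random

S = 1 / math.sqrt(2)

def pct(a, b):
    """Simplest pairwise correlating transform: rotate the coefficient pair
    by 45 degrees so each description carries information about both."""
    return S * (a + b), S * (a - b)

def inverse_pct(y1, y2):
    """Exact reconstruction when both descriptions arrive."""
    return S * (y1 + y2), S * (y1 - y2)

def estimate_lost(y_received, received_is_y1, r):
    """One description lost: the linear MMSE estimate of the missing
    transform coefficient is r * y_received, where for zero-mean,
    uncorrelated a, b: r = (var_a - var_b) / (var_a + var_b)."""
    y_hat = r * y_received
    return (inverse_pct(y_received, y_hat) if received_is_y1
            else inverse_pct(y_hat, y_received))
```

When both streams arrive, reconstruction is exact; when one is lost, the estimate of the dominant coefficient is far better than discarding it.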
Abstract: This paper presents the modeling and analysis of a 12-phase distribution static compensator (DSTATCOM), which is capable of balancing the source currents in spite of unbalanced loading and phase outages. In addition to balancing the supply currents, the power factor can be set to a desired value. The theory of instantaneous symmetrical components is used to generate the twelve-phase reference currents. These reference currents are then tracked using a current-controlled voltage source inverter operated in a hysteresis band control scheme. An ideal compensator is used in place of a physical realization of the compensator. The performance of the proposed DSTATCOM is validated through MATLAB simulation, and detailed simulation results are given.
Abstract: The increasing demand for sufficient and clean
energy forces industrial and service companies to align their strategies towards efficient consumption. This trend also applies to the
residential building sector, where large amounts of energy are consumed by house and facility heating. Many of the
operated hot water heating systems lack hydraulically balanced working
conditions for heat distribution and transmission and therefore heat
inefficiently. Through hydraulic balancing of heating systems,
significant savings of primary and secondary energy can be
achieved. This paper addresses the use of KNX technology (Smart
Buildings) in residential buildings to ensure a dynamic adaption of
the hydraulic system's performance, in order to increase the heating
system's efficiency. The procedure of segmenting a heating system
into hydraulically independent units (meshes) is
presented. Within these meshes, the heating valves are addressed and
controlled by a central facility server. Feasibility criteria for
such drives are identified. The dynamic hydraulic balance is
achieved by positioning these valves according to heating loads that
are derived from the temperature settings in the corresponding
rooms. The energetic advantages of single-room heating control
procedures, based on the application FacilityManager, are presented.
Abstract: Covering-based rough sets are an extension of rough
sets based on a covering instead of a partition of the
universe. They are therefore more powerful than rough sets in describing some practical
problems. However, by extending rough sets,
covering-based rough sets can increase the roughness of each model
in recognizing objects. How to obtain better approximations from
the models of covering-based rough sets is thus an important issue.
In this paper, two concepts, determinate elements and indeterminate
elements of a universe, are proposed and given precise definitions.
This research makes a reasonable refinement of the
covering elements from a new viewpoint, and the refinement may
generate better approximations for covering-based rough sets models.
To verify the theory, it is applied to eight major covering-based
rough sets models adapted from the literature.
The result is that in all these models the lower approximation increases
effectively, and correspondingly the upper approximation
decreases, with the exception of two models in some special situations.
Therefore, the roughness in recognizing objects is reduced. This
research provides a new approach to the study and application of
covering-based rough sets.
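The basic covering approximations, and one plausible reading of the determinate/indeterminate idea, can be sketched as follows. The refinement shown (splitting each covering block at the set of points covered by exactly one block) is an illustrative assumption, not necessarily the paper's exact construction, but it exhibits the reported behaviour: the lower approximation never shrinks and the upper approximation never grows.

```python
from collections import Counter

def lower_approx(covering, X):
    """Union of covering blocks entirely contained in X."""
    out = set()
    for K in covering:
        if K <= X:
            out |= K
    return out

def upper_approx(covering, X):
    """Union of covering blocks that intersect X."""
    out = set()
    for K in covering:
        if K & X:
            out |= K
    return out

def refine(covering):
    """Split each block into its determinate part (points covered by
    exactly one block) and the remainder, keeping non-empty parts.
    (An illustrative refinement, not the paper's exact construction.)"""
    count = Counter(x for K in covering for x in K)
    determinate = {x for x, c in count.items() if c == 1}
    refined = []
    for K in covering:
        for part in (K & determinate, K - determinate):
            if part:
                refined.append(part)
    return refined
```

On a small covering of {1, ..., 6}, refinement leaves the lower approximation intact while strictly shrinking the upper approximation.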
Abstract: In this work, two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis and are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model, the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation in which the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations, no approximation of the generalized forces along the interface is required; moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations, and the total number of degrees of freedom is reduced. The numerical examples discuss the differences between the two BEM formulations and compare the results with those of a well-known finite element code.
Abstract: In the proposed method for Web page ranking, a
novel theoretic model is introduced and tested on examples of order
relationships among IP addresses. Ranking is induced using a
convexity feature, which is learned according to these examples
using a self-organizing procedure. We consider the problem of
self-organizing learning from IP data to be represented by a semi-random
convex polygon procedure, in which the vertices correspond to IP
addresses. Based on recent developments in our regularization
theory for convex polygons and corresponding Euclidean distance
based methods for classification, we develop an algorithmic
framework for learning ranking functions based on a Computational
Geometric Theory. We show that our algorithm is generic, and
present experimental results demonstrating the potential of our approach.
In addition, we explain the generality of our approach by showing its
possible use as a visualization tool for data obtained from diverse
domains, such as Public Administration and Education.
Abstract: A large amount of valuable information is available in
plain-text clinical reports, and new techniques and technologies are being
applied to extract information from these reports. In this study, we
developed a domain-based software system to transform 600
otorhinolaryngology discharge notes into a structured form for
extracting clinical data. In order to decrease the system's
processing time, the discharge notes were transformed into a data
table after preprocessing. Several word lists were constituted to
identify common sections in the discharge notes, including patient
history, age, problems, and diagnosis. An n-gram method was used
to discover term co-occurrences within each section. Using this
method, a dataset of candidate concepts was generated for the
validation step, and the Predictive Apriori algorithm for Association
Rule Mining (ARM) was then applied to validate the candidate concepts.
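The n-gram co-occurrence step can be sketched as follows: word n-grams are counted within each section across notes, and those occurring at least a minimum number of times become concept candidates for the ARM validation step. The tokenization, n range, and threshold below are illustrative choices.

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous word n-grams of a token list."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def candidate_terms(sections, n_max=3, min_count=2):
    """Count word n-grams (n = 1..n_max) within each section text;
    n-grams seen at least min_count times become concept candidates."""
    counts = Counter()
    for text in sections:
        tokens = text.lower().split()
        for n in range(1, n_max + 1):
            counts.update(ngrams(tokens, n))
    return {term: c for term, c in counts.items() if c >= min_count}
```

On three toy "diagnosis" sections, recurring phrases such as "otitis media" survive the frequency filter while one-off phrases are dropped.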
Abstract: Microarray data profile gene expression on a whole-genome
scale and therefore provide a good way to study associations
between gene expression and the occurrence or progression of cancer.
More and more researchers have realized that microarray data are helpful
for predicting cancer samples. However, the dimensionality of the gene
expression data is much larger than the sample size, which makes this
task very difficult. Therefore, identifying the significant genes
causing cancer has become an urgent, active, and hard research
topic. Many feature selection algorithms proposed in
the past focus on improving cancer predictive accuracy at the
expense of ignoring the correlations between features. In this
work, a novel framework (named SGS) is presented for stable gene
selection and efficient cancer prediction. The proposed framework
first performs a clustering algorithm to find groups of genes
with high correlation coefficients, then
selects the significant genes in each group with the Bayesian Lasso and
the important gene groups with the group Lasso, and finally builds a prediction
model on the shrunken gene space with an efficient classification
algorithm (such as SVM, 1NN, or regression). Experimental
results on real-world data show that the proposed framework often
outperforms existing feature selection and prediction methods
such as SAM, IG, and Lasso-type prediction models.
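A toy version of this group-then-select-then-classify pipeline, with loud simplifications: greedy correlation-threshold grouping stands in for the clustering step, a per-group correlation-with-label pick stands in for the Bayesian Lasso / group Lasso selection, and 1NN is the final classifier. All names and thresholds are illustrative, not the paper's.

```python
import math, random

def corr(u, v):
    """Pearson correlation of two equal-length lists."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((x - mu) ** 2 for x in u)) or 1.0
    sv = math.sqrt(sum((x - mv) ** 2 for x in v)) or 1.0
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

def group_genes(expr, threshold=0.8):
    """Greedy correlation clustering: a gene joins the first group whose
    seed gene it correlates with (in absolute value) above the threshold."""
    groups = []
    for g in range(len(expr)):
        for grp in groups:
            if abs(corr(expr[g], expr[grp[0]])) >= threshold:
                grp.append(g)
                break
        else:
            groups.append([g])
    return groups

def select_genes(expr, labels, groups):
    """Pick, per group, the gene most correlated with the class label
    (a stand-in for the Bayesian Lasso / group Lasso steps)."""
    y = [float(l) for l in labels]
    return [max(grp, key=lambda g: abs(corr(expr[g], y))) for grp in groups]

def predict_1nn(train_x, train_y, x):
    """1-nearest-neighbour prediction in the selected gene subspace."""
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    return min(zip(train_x, train_y), key=lambda t: dist(t[0], x))[1]
```

On synthetic data with one informative gene cluster and two noise genes, leave-one-out 1NN on the selected genes classifies nearly every sample correctly.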
Abstract: Names are important in many societies, even in technologically oriented ones that use, e.g., ID systems to identify individual people. Names such as surnames are the most important, as they are used in many processes, such as identifying people and genealogical research. On the other hand, variation in names can be a major problem for the identification of and search for people, e.g., in web search or for security reasons. Name matching presumes a priori that a recorded name written in one alphabet reflects the phonetic identity of two samples, or some transcription error in copying a previously recorded name; we add the assumption that the two names refer to the same person. This paper describes name variations and gives a basic description of various name matching algorithms developed to overcome name variation and to find reasonable variants of names, which can be used to further reduce mismatches in record linkage and name search. The implementation contains algorithms computing a range of fuzzy matches based on different types of algorithms, e.g., composite and hybrid methods, and allows us to test and measure the algorithms for accuracy. NYSIIS, LIG2 and Phonex have been shown to perform well and provide sufficient flexibility to be included in the linkage/matching process for optimising name searching.
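Phonetic codes of the kind named above (NYSIIS, Phonex) map differently spelled variants of a name to a common key. The best-known member of this family, American Soundex, is small enough to sketch in full and illustrates the principle:

```python
def soundex(name):
    """American Soundex: initial letter plus three digits; similar-sounding
    consonants share a digit, vowels are dropped, h and w are transparent."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = name.lower()
    digits, prev = [], codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":
            continue                  # h and w do not separate duplicate codes
        code = codes.get(ch, "")
        if code and code != prev:
            digits.append(code)
        prev = code
    return (name[0].upper() + "".join(digits) + "000")[:4]
```

Smith and Smyth receive the same key (S530), so a name search over Soundex keys retrieves both spellings together.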
Abstract: Decision support based on risk analysis for
comparing electricity generation from different renewable
energy technologies can provide information about their effects on
the environment and society. The aim of this paper is to develop an
assessment framework covering the risks to health and the environment,
and the societal benefits, of power generation from
different renewable sources. A multicriteria framework combining a
multiattribute risk analysis technique and the decision analysis
interview technique is applied in order to support the decision-making
process for implementing renewable energy projects in
the Bangkok case study. Having analysed the local conditions and
appropriate technologies, five renewable power plants are postulated
as options. As this work demonstrates, the analysis can provide a tool
to aid decision-makers in achieving targets related to promoting a
sustainable energy system.
Abstract: Developing an accurate classifier for high-dimensional microarray datasets is a challenging task due to the small available sample size. It is therefore important to determine a set of relevant genes that classify the data well. Traditional gene selection methods often select the top-ranked genes according to their discriminatory power, but these genes are often correlated with each other, resulting in redundancy. In this paper, we propose a hybrid method using feature ranking and a wrapper method (a genetic algorithm with a multiclass SVM) to identify a set of relevant genes that classify the data more accurately. A new fitness function for the genetic algorithm is defined that focuses on selecting the smallest set of genes providing maximum accuracy. Experiments have been carried out on four well-known datasets. The proposed method provides better results than those found in the literature in terms of both classification accuracy and the number of genes selected.
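The kind of fitness function described, rewarding accuracy first and smaller gene subsets second, together with the usual bitmask chromosome representation, can be sketched as follows; the weighting alpha, the linear form, and the operators are illustrative choices, not necessarily the paper's.

```python
import random

def fitness(accuracy, n_selected, n_total, alpha=0.9):
    """Reward classification accuracy first, then smaller gene subsets.
    (alpha and the linear combination are illustrative assumptions.)"""
    return alpha * accuracy + (1 - alpha) * (1 - n_selected / n_total)

def crossover(mask_a, mask_b, rng):
    """One-point crossover of gene-selection bitmasks."""
    p = rng.randrange(1, len(mask_a))
    return mask_a[:p] + mask_b[p:]

def mutate(mask, rng, rate=0.02):
    """Flip each bit independently with a small probability."""
    return [bit ^ (rng.random() < rate) for bit in mask]
```

With alpha = 0.9, a subset with markedly better accuracy always wins, while among equally accurate subsets the smaller one scores higher.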
Abstract: Web-based cooperative learning focuses on (1) the interaction and collaboration of community members, and (2) the sharing and distribution of knowledge and expertise through network technology to enhance learning performance. Numerous studies of web-based cooperative learning have demonstrated that cooperative scripts have a positive impact on specifying, sequencing, and assigning cooperative learning activities. The literature has also indicated that role-play in web-based cooperative learning environments enables two or more students to work together toward the completion of a common goal. Since students generally do not know each other and lack the face-to-face contact necessary for negotiating group roles in web-based cooperative learning environments, this paper extends the application of the genetic algorithm (GA) and proposes a GA-based algorithm to tackle the problem of role assignment in web-based cooperative learning environments, which not only saves communication costs but also reduces conflict between group members in negotiating role assignments.
Abstract: In this paper, an analysis of blackouts in electric power
transmission systems is carried out using a model and studied on
simple networks with a regular topology. The proposed model
describes load demand and network improvements evolving on a
slow timescale as well as the fast dynamics of cascading overloads
and outages.
Abstract: The water vapour transport properties of gypsum block
are studied as a function of relative humidity using an inverse analysis
based on a genetic algorithm. The computational inverse analysis is
performed for relative humidity profiles measured along the
longitudinal axis of a rod sample. In the transient
experiment, the studied sample is exposed to two environments with
different relative humidities, while the temperature is kept constant.
For the basic characterisation of the gypsum and for the assessment of the input
material parameters necessary for the computational application of the
genetic algorithm, the basic material properties of gypsum are
measured, as well as its thermal and water vapour storage parameters.
By applying the genetic algorithm, the relative-humidity-dependent
water vapour diffusion coefficient and water
vapour diffusion resistance factor are calculated.
Abstract: The protection of the contents of digital products is
referred to as content authentication. In some applications, being able
to authenticate a digital product can be extremely important. For
example, if a digital product is used as a piece of evidence in
court, its integrity could mean life or death for the accused. Generally,
the problem of content authentication can be solved using semi-fragile
digital watermarking techniques. Recently, many authors have
proposed Computer Generated Hologram Watermarking (CGH-Watermarking)
techniques. Starting from these studies, this paper proposes
a semi-fragile Computer Generated Hologram coding technique
that is able to detect malicious tampering while
tolerating some incidental distortions. The proposed technique uses
an encrypted image as the watermark and is well suited for digital
image authentication.