Abstract: Cancers are typically marked by a number of differentially expressed genes, which show enormous potential as biomarkers for a given disease. In recent years, cancer classification based on gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is therefore an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework comprising analyses of feature relevance, feature interdependence, feature redundancy-dependence and subset rankings, and it has been evaluated on the colon cancer data set. Our experimental results show that the proposed method outperforms other information-theory-based filters in all aspects of classification error and classification performance.
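The relevance-analysis stage of such a filter can be sketched with a plain mutual-information ranking of genes against class labels. This is a minimal illustration only, assuming discretized expression values; the function names and toy data are hypothetical and do not reproduce the paper's four-stage framework:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log2(pj / ((px[x] / n) * (py[y] / n)))
    return mi

def rank_genes(expression, labels):
    """Rank genes (columns of the expression matrix) by relevance I(gene; class)."""
    scores = []
    for g in range(len(expression[0])):
        col = [row[g] for row in expression]
        scores.append((mutual_information(col, labels), g))
    return [g for _, g in sorted(scores, reverse=True)]
```

A gene whose discretized expression perfectly tracks the class labels scores 1 bit here, while an uninformative gene scores 0; the later stages (interdependence and redundancy analysis) would then prune this ranked list.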
Abstract: Present models and simulation algorithms of intracellular stochastic kinetics are usually based on the premise that diffusion is so fast that the concentrations of all the involved species are homogeneous in space. However, recent experimental measurements of intracellular diffusion constants indicate that the assumption of a homogeneous, well-stirred cytosol is not necessarily valid even for small prokaryotic cells. In this work a mathematical treatment of diffusion that can be incorporated in a stochastic algorithm simulating the dynamics of a reaction-diffusion system is presented. The movement of a molecule A from a region i to a region j of space is represented as a first-order reaction Ai -> Aj with rate constant k, where k depends on the diffusion coefficient. The diffusion coefficients are modeled as functions of the local concentration of the solutes, their intrinsic viscosities, their frictional coefficients and the temperature of the system. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the intrinsic reaction kinetics and diffusion dynamics. To demonstrate the method, simulation results for the reaction-diffusion system of chaperone-assisted protein folding in the cytoplasm are shown.
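The event-selection loop described above can be sketched as a minimal Gillespie-style simulation in which diffusion Ai -> Aj is handled exactly like a first-order reaction. The two-compartment system, the rate constants and the added decay reaction are illustrative assumptions, not the paper's chaperone model:

```python
import random

def simulate(n_i=100, n_j=0, k_diff=1.0, k_deg=0.1, t_max=5.0, seed=1):
    """Minimal Gillespie-style simulation where diffusion Ai -> Aj is treated
    as a first-order reaction alongside a chemical decay Aj -> 0."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_max:
        # propensities: diffusion i->j, diffusion j->i, degradation in region j
        a = [k_diff * n_i, k_diff * n_j, k_deg * n_j]
        a0 = sum(a)
        if a0 == 0:
            break
        t += rng.expovariate(a0)   # exponential waiting time to the next event
        r = rng.uniform(0, a0)     # pick an event proportional to its propensity
        if r < a[0]:
            n_i, n_j = n_i - 1, n_j + 1
        elif r < a[0] + a[1]:
            n_i, n_j = n_i + 1, n_j - 1
        else:
            n_j -= 1               # one Aj molecule degraded
    return n_i, n_j
```

Making k_diff a function of local concentration and viscosity, as the abstract describes, would simply mean recomputing the diffusion propensities from the current state at each step.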
Abstract: A business process model describes the process flow of a business and can be seen as the requirement for developing a software application. This paper discusses a BPM2CD guideline which complements the Model Driven Architecture concept by suggesting how to create a platform-independent software model, in the form of a UML class diagram, from a business process model. An important step is the identification of UML classes from the business process model. A technique for object-oriented analysis called domain analysis is borrowed, and key concepts in the business process model are discovered and proposed as candidate classes for the class diagram. The paper enhances this step by using ontology search to help identify important classes for the business domain. As an ontology is a source of knowledge for a particular domain that can itself link to ontologies of related domains, the search can give a refined set of candidate classes for the resulting class diagram.
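The domain-analysis step of proposing candidate classes can be caricatured as extracting recurring domain nouns from business-process task names. This is a naive keyword stand-in for both domain analysis and ontology search; the function name, the stopword list and the threshold of two occurrences are all hypothetical:

```python
from collections import Counter

def candidate_classes(task_names,
                      stopwords=frozenset({"the", "a", "an", "of", "to", "and", "for"})):
    """Naive stand-in for domain analysis: treat recurring non-stopword words in
    business-process task names as candidate UML classes."""
    counts = Counter(w.strip(".,").lower()
                     for name in task_names
                     for w in name.split() if w.lower() not in stopwords)
    # keep words that recur across tasks, most frequent first
    return [w.capitalize() for w, c in counts.most_common() if c >= 2]
```

An ontology lookup would then validate or extend this list with concepts from related domains, which plain frequency counting cannot do.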
Abstract: Camera calibration is an indispensable step for augmented reality or image-guided applications where quantitative information must be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of projected calibration marks, enabling the calculation of the projection from 3D world coordinates to 2D image coordinates. Such a procedure exhibits typical steps, including feature point localization in the acquired images, camera model fitting, correction of distortion introduced by the optics and, finally, an optimization of the model's parameters. In this paper we propose to extend this list by a further step concerning the identification of the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm along with a deterministic extension that automatically determines the images yielding an optimal calibration. Finally, we present results proving that the calibration can be significantly improved by automated image selection.
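The Monte Carlo selection step can be sketched as repeated random subset sampling, keeping the subset with the smallest calibration error. Here `error_fn` is a hypothetical stand-in for a full calibration run that returns the overall reprojection error of a subset:

```python
import random

def best_subset(images, error_fn, subset_size, trials=200, seed=0):
    """Monte Carlo search for the image subset with the smallest calibration
    error. error_fn(subset) stands in for calibrating on the subset and
    measuring the resulting reprojection error."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(trials):
        subset = rng.sample(images, subset_size)
        err = error_fn(subset)
        if err < best_err:
            best, best_err = subset, err
    return best, best_err
```

A deterministic extension, as in the paper, could then refine the best sampled subset by greedily swapping single images in and out while the error decreases.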
Abstract: The conjugate gradient optimization algorithm usually used for nonlinear least squares is presented and combined with the modified back propagation algorithm, yielding a new fast training multilayer perceptron (MLP) algorithm (CGFR/AG). The approach presented in this paper consists of three steps: (1) modification of the standard back propagation algorithm by introducing a gain variation term of the activation function, (2) calculation of the gradient descent on the error with respect to the weights and gain values, and (3) determination of the new search direction by exploiting the information calculated by gradient descent in step (2) as well as the previous search direction. The proposed method improves the training efficiency of the back propagation algorithm by adaptively modifying the initial search direction. Performance of the proposed method is demonstrated by comparison with the conjugate gradient algorithm from the neural network toolbox on the chosen benchmark. The results show that the number of iterations required by the proposed method to converge is less than 20% of what is required by the standard conjugate gradient and neural network toolbox algorithm.
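Step (3), combining the new gradient with the previous search direction, can be illustrated with the Fletcher-Reeves update on a toy quadratic. The gain-variation backpropagation itself is omitted; the quadratic objective and the exact line search are illustrative assumptions, not the paper's CGFR/AG algorithm:

```python
def fletcher_reeves_step(grad_new, grad_old, dir_old):
    """New search direction d = -g_new + beta * d_old, with the
    Fletcher-Reeves coefficient beta = ||g_new||^2 / ||g_old||^2."""
    beta = sum(g * g for g in grad_new) / sum(g * g for g in grad_old)
    return [-g + beta * d for g, d in zip(grad_new, dir_old)]

def minimize_quadratic():
    """Minimize f(w) = w0^2 + 4*w1^2 by conjugate gradient; a 2D quadratic
    converges in two steps with exact line search."""
    A = [2.0, 8.0]                        # diagonal Hessian of f
    w = [2.0, 1.0]
    g = [A[i] * w[i] for i in range(2)]   # gradient of f at w
    d = [-gi for gi in g]                 # initial direction: steepest descent
    for _ in range(2):
        # exact line search step length for a quadratic
        alpha = -sum(g[i] * d[i] for i in range(2)) / \
                sum(A[i] * d[i] * d[i] for i in range(2))
        w = [w[i] + alpha * d[i] for i in range(2)]
        g_new = [A[i] * w[i] for i in range(2)]
        d = fletcher_reeves_step(g_new, g, d)
        g = g_new
    return w
```

In MLP training the same update is applied with the backpropagated gradient (over weights and gains) taking the place of the analytic quadratic gradient.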
Abstract: The aim of this paper is to present a new method for progressive transmission of electrocardiograms (ECG). The idea consists of transforming any ECG signal into an image containing one beat in each row. In the first step, the beats are synchronized in order to reduce the high frequencies due to inter-beat transitions. The obtained image is then transformed using a discrete version of the Radon Transform (DRT). Hence, transmitting the ECG amounts to transmitting the most significant energy of the transformed image in the Radon domain. For decoding, the receiver needs to use the inverse Radon Transform as well as the two synchronization frames.
The presented protocol can be adapted for lossy to lossless compression systems. In lossy mode we show that the compression ratio can be multiplied by an average factor of 2 for an acceptable quality of the reconstructed signal. These results have been obtained on real signals from the MIT database.
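The first step, forming an image with one synchronized beat per row, can be sketched as follows. The R-peak positions and the row length are assumed inputs, and the Radon coding stage itself is not shown:

```python
def beats_to_image(signal, r_peaks, row_len):
    """Stack one beat per row, aligned on its R peak, to form a 2D 'image'
    whose columns are strongly correlated across rows (which is what makes
    the Radon-domain representation compact)."""
    half = row_len // 2
    image = []
    for r in r_peaks:
        # window centered on the R peak, zero-padded at the signal borders
        row = [signal[i] if 0 <= i < len(signal) else 0
               for i in range(r - half, r - half + row_len)]
        image.append(row)
    return image
```

Because consecutive beats are nearly identical, every column of this image is almost constant, so most of the image energy concentrates in a few Radon projections.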
Abstract: We present a new method to reconstruct a temporally
coherent 3D animation from single or multi-view RGB-D video data
using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames, independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to extract temporal coherence and faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
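The frame-to-frame matching step can be sketched as greedy nearest-neighbor matching of feature descriptors, here assumed to be small vectors mixing color and depth. The descriptors and the distance threshold are illustrative, not the paper's unbiased sampling scheme:

```python
def match_features(feats_a, feats_b, max_dist):
    """Greedy nearest-neighbor matching of feature descriptors between two
    consecutive frames; each feature in frame B is used at most once."""
    def d2(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))
    matches, used = [], set()
    for i, fa in enumerate(feats_a):
        j = min((j for j in range(len(feats_b)) if j not in used),
                key=lambda j: d2(fa, feats_b[j]), default=None)
        if j is not None and d2(fa, feats_b[j]) <= max_dist ** 2:
            matches.append((i, j))
            used.add(j)
    return matches
```

The resulting correspondences give the per-feature motion vectors from which a dynamic alignment of the two clouds can be estimated.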
Abstract: This paper introduces new algorithms (Fuzzy relative
of the CLARANS algorithm FCLARANS and Fuzzy c Medoids
based on randomized search FCMRANS) for fuzzy clustering of
relational data. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given current memberships, FCLARANS minimizes the same objective function by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may join the computation of medoids, while the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results with different samples of the Reuters-21578 and Newsgroups (20NG) corpora and generated datasets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient, and their outputs are almost the same as that of RFCMdd in terms of classification rate.
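The medoid-search idea behind FCLARANS can be sketched, in a crisp (non-fuzzy) form, as a CLARANS-style randomized swap search: repeatedly try replacing a current medoid with a random non-medoid and keep the swap when the total within-cluster dissimilarity decreases. The data, distance function and stopping rule below are illustrative:

```python
import random

def clarans_medoids(points, dist, k, max_neighbors=50, seed=0):
    """Randomized medoid search: accept a random medoid/non-medoid swap when it
    lowers the sum of within-cluster dissimilarities; stop after max_neighbors
    consecutive non-improving swaps. Returns (medoid indices, total cost)."""
    rng = random.Random(seed)
    medoids = rng.sample(range(len(points)), k)

    def cost(meds):
        return sum(min(dist(points[p], points[m]) for m in meds)
                   for p in range(len(points)))

    best = cost(medoids)
    tried = 0
    while tried < max_neighbors:
        cand = medoids[:]
        cand[rng.randrange(k)] = rng.choice(
            [i for i in range(len(points)) if i not in medoids])
        c = cost(cand)
        if c < best:
            medoids, best, tried = cand, c, 0
        else:
            tried += 1
    return medoids, best
```

Because medoids are always existing data points chosen by where most of a cluster lies, a single outlier cannot drag a medoid away the way it can drag a recomputed centroid.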
Abstract: The coalescer process is one method of treating oily water: it increases the oil droplet size in order to enhance the separation velocity and thus achieve effective separation. However, the presence of surfactants in an oily emulsion can limit these mechanisms because of the small oil droplet size associated with a stabilized emulsion. The purpose of this research is therefore to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate and staged (step-bed) coalescers on the treatment efficiency in terms of COD values were studied. The treatment efficiency obtained experimentally was estimated using the COD values and the oil droplet size distribution. The study has shown that the plastic media is more effective at attaching oil particles than the stainless one because of its hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary in order to obtain good coalescer performance. The application of the step-bed coalescer process provided higher treatment efficiencies in terms of COD removal than those obtained with the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single collector efficiency (ηT) and the attachment efficiency (α), shows relatively good agreement between the experimental and predicted treatment efficiencies in this study.
Abstract: This study investigated the number of Aedes larvae,
the key breeding sites of Aedes sp., and the relationship between
climatic factors and the incidence of DHF in the Samui Islands. We conducted questionnaire and larval surveys in 105 randomly selected households in the Samui Islands in July-September 2006. Pearson's correlation coefficient was used to explore the primary association between the DHF incidence and all climatic factors. A multiple stepwise regression technique was then used to fit the statistical model. The results showed that the positive indoor containers were small jars, cement tanks, and plastic tanks. The positive outdoor containers were small jars, cement tanks, plastic tanks, used cans, tires, plastic bottles, discarded objects, pot saucers, plant pots, and areca husks. All Ae. albopictus larval indices (i.e., CI, HI, and BI) were higher than the Ae. aegypti larval indices in this area. These larval indices were higher than the WHO standard, indicating a high risk of DHF transmission in the Samui Islands. The multiple stepwise regression model was y = −288.80 + 11.024x, where x is the mean temperature. The mean temperature was positively associated with the DHF incidence in this area.
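The fitted model can be applied directly. As a sketch (assuming the mean temperature is in degrees Celsius, which the abstract does not state):

```python
def predict_dhf_incidence(mean_temp):
    """DHF incidence from the fitted stepwise model y = -288.80 + 11.024 * x,
    where x is the mean temperature."""
    return -288.80 + 11.024 * mean_temp
```

The positive coefficient (11.024 per degree) is the reported positive association; for example, the predicted incidence rises by about 11 units for each one-degree increase in mean temperature.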
Abstract: Web usage mining has become a popular research
area, as a huge amount of data is available online. These data can be
used for several purposes, such as web personalization, web structure
enhancement, web navigation prediction etc. However, the raw log
files are not directly usable; they have to be preprocessed in order to
transform them into a suitable format for different data mining tasks.
One of the key issues in the preprocessing phase is to identify web
users. Identifying users based on web log files is not a
straightforward problem, thus various methods have been developed.
There are several difficulties that have to be overcome, such as client
side caching, changing and shared IP addresses and so on. This paper
presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method we developed an implementation called the Web Activity Tracking (WAT) system, which aims at a more precise distinction of web users based on log data. We present statistical analyses created by the WAT on real data about the behavior of Hungarian web users, along with a comprehensive analysis and comparison of the three methods.
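The cookie-based idea can be sketched as follows: a persistent cookie identifies a user across IP addresses, while cookie-less entries fall back to the common (IP, user-agent) heuristic, which shared or changing IPs can confuse. The log-entry format is an assumed simplification of real access-log records:

```python
def identify_users(log_entries):
    """Group log entries into users. A persistent cookie id identifies a user
    directly; entries without a cookie fall back to the (IP, user-agent) pair."""
    users = {}
    for ip, agent, cookie in log_entries:
        key = ("cookie", cookie) if cookie else ("ip-ua", ip, agent)
        users.setdefault(key, []).append((ip, agent, cookie))
    return users
```

Note how the same cookie seen from two different IP addresses is correctly merged into one user, which the pure IP-based methods cannot do.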
Abstract: Whilst there is growing evidence that activity
across the lifespan is beneficial for improved health, there are
also many changes involved with the aging process and
consequently the potential for reduced indices of health. The
nexus between health, physical activity and aging is complex
and has raised much interest in recent times due to the
realization that a multifaceted approach is necessary in
order to counteract a growing obesity epidemic. By
investigating age based trends within a population adhering to
competitive sport at older ages, further insight might be
gleaned to assist in understanding one of many factors
influencing this relationship.
BMI was derived using data gathered on a total of 6,071
masters athletes (51.9% male, 48.1% female) aged 25 to 91
years (mean = 51.5, s = 9.7), competing at the Sydney World
Masters Games (2009). Using linear and loess regression it
was demonstrated that the usual tendency for the prevalence of higher BMI to increase with age was reversed in this sample. This reversal was repeated in both the male-only and female-only subsets of the sample participants, indicating the possibility of an improved BMI profile with increasing age for the sample as a whole and for these individual subgroups.
This evidence of improved classification in one index of
health (reduced BMI) for masters athletes (when compared to
the general population) implies either that there are improved levels of this index of health with aging due to adherence to sport, or possibly that the reduced BMI is advantageous and contributes to this cohort adhering (or being attracted) to masters sport at older ages. Demonstration that this proportionately under-investigated World Masters Games population has an improved relationship between BMI and increasing age compared with the general population is of particular
interest in the context of the measures being taken globally to
curb an obesity epidemic.
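The two quantities behind the reported trend are straightforward to compute. A minimal sketch (the sample ages and BMI values below are invented, not the study's data):

```python
def bmi(weight_kg, height_m):
    """Body mass index, the health indicator analyzed in the study."""
    return weight_kg / height_m ** 2

def trend_slope(ages, bmis):
    """Least-squares slope of BMI against age; a negative slope corresponds to
    the reversed (decreasing) BMI-age trend reported for the athlete sample."""
    n = len(ages)
    ma, mb = sum(ages) / n, sum(bmis) / n
    return (sum((a - ma) * (b - mb) for a, b in zip(ages, bmis))
            / sum((a - ma) ** 2 for a in ages))
```

A loess fit, as used in the study, would relax the straight-line assumption and let the slope vary across the age range.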
Abstract: The paper presents a technique suitable for robot vision applications where it is not possible to establish the object position from one view. Usually, one-view pose calculation methods are based on the correspondence between image features established at a training step and exactly the same image features extracted at the execution step, for a different object pose. When such a correspondence is not feasible because of the lack of specific features, a new method is proposed. In the first step the method computes the 3D pose of feature points from two views. Subsequently, using a registration algorithm, the set of 3D feature points extracted at the execution phase is aligned with the set of 3D feature points extracted at the training phase. The result is a Euclidean transform, which is then used by the robot head for reorientation at the execution step.
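The registration step, recovering the Euclidean transform that aligns execution-phase points with training-phase points, can be illustrated in 2D, where the least-squares rotation has a closed form. The full 3D case typically uses an SVD-based solver such as the Kabsch algorithm; the point sets below are invented:

```python
import math

def rigid_align_2d(src, dst):
    """Least-squares rotation angle and translation mapping src points onto
    dst points (corresponding by index), i.e. a 2D Euclidean transform."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # rotation angle from summed dot/cross products of centered points
    sxx = sum((a[0]-csx)*(b[0]-cdx) + (a[1]-csy)*(b[1]-cdy)
              for a, b in zip(src, dst))
    sxy = sum((a[0]-csx)*(b[1]-cdy) - (a[1]-csy)*(b[0]-cdx)
              for a, b in zip(src, dst))
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    # translation maps the rotated source centroid onto the target centroid
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)
```

Applying this to points rotated by 90 degrees and shifted by (1, 2) recovers exactly that rotation and translation.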
Abstract: The daily growing use of agents in software environments, owing to qualities such as independence and intelligence, is no longer a secret. One such environment, in which agents have a prominent job, is the e-marketplace, where a user can give agents the responsibility of buying and selling instead of searching the e-marketplace himself. Designing a framework that pays sufficient attention to the required roles and their relations is the first step in achieving such e-markets. In this paper, we suggest a framework for establishing such e-markets and investigate in detail the roles, such as seller and buyer, and their relations in the JADE environment.
Abstract: In this study, a new and fast algorithm for Ascending Aorta (AscA) and Descending Aorta (DesA) segmentation is presented using Computed Tomography Angiography images. This process is quite important, especially for the detection of aortic plaques, aneurysms, calcification or stenosis. The applied method is carried out in four steps. In the first step, lung segmentation is achieved. In the second, the Mediastinum Region (MR) is detected for use in the segmentation. In the third, an optimal threshold is applied to the images and components outside the MR are removed. Lastly, identification and segmentation of the AscA and DesA are carried out. The performance of the applied method was judged quite good by radiologists, and it provides medically adequate results for surgical use.
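The "optimal threshold" step can be sketched with Otsu's method, which picks the threshold maximizing the between-class variance of the intensity histogram. Whether the paper uses Otsu specifically is an assumption, and the 8-bit histogram below is illustrative (CT data would first be windowed to a fixed intensity range):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's optimal threshold over an integer-valued image: choose t to
    maximize the between-class variance w0 * w1 * (m0 - m1)^2."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w0 += hist[t]                 # pixels at or below t (class 0)
        if w0 == 0:
            continue
        w1 = total - w0               # pixels above t (class 1)
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                # class means
        m1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal image (e.g. contrast-filled vessel lumen against mediastinal tissue) the chosen threshold falls between the two intensity modes.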
Abstract: Vehicle detection is the critical step in highway monitoring. In this paper we propose a background subtraction and edge detection technique for vehicle detection, which uses the advantages of both approaches. The method consists of two procedures: first, an automatic background extraction procedure, in which the background is extracted automatically from successive frames; second, a vehicle detection procedure, which depends on edge detection and background subtraction. Practical applications and experimental results confirm the effectiveness of this algorithm: the vehicle detection rate was higher than 91%.
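The background-extraction half of such a method can be sketched with a per-pixel median over successive frames followed by thresholded differencing; the edge-detection half is omitted, and the frame values and threshold are illustrative:

```python
def median_background(frames):
    """Estimate the static background as the per-pixel median over successive
    frames; passing vehicles are transient, so the median suppresses them."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[sorted(f[y][x] for f in frames)[len(frames) // 2]
             for x in range(w)] for y in range(h)]

def detect_foreground(frame, bg, thresh=30):
    """Mark pixels whose absolute difference from the background exceeds
    thresh; the resulting mask is the candidate-vehicle region."""
    return [[1 if abs(frame[y][x] - bg[y][x]) > thresh else 0
             for x in range(len(frame[0]))] for y in range(len(frame))]
```

Combining this mask with an edge map then separates true vehicle boundaries from shadows and noise, which is the advantage of using both cues.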
Abstract: Knowledge management (KM) is the process of taking the steps needed to get the most out of available knowledge resources. KM involves several steps: capturing knowledge, discovering new knowledge, sharing knowledge, and applying knowledge in the decision-making process. In applying knowledge, it is not necessary for the individual who uses the knowledge to comprehend it, as long as the available knowledge is used to guide decision making and actions. When an expert is called and provides a step-by-step procedure for solving the problem, the expert is transferring the knowledge or giving directions to the caller, and the caller is 'applying' the knowledge by following the instructions given by the expert. An appropriate mechanism is needed to ensure effective knowledge transfer, which in this case is by telephone or email. The problem with email and telephone is that the knowledge is not fully circulated and disseminated to all users. In this paper, drawing on the experience of a local university Help Desk, we propose the use of Information Technology (IT) to effectively support knowledge transfer in the organization. The issues covered include the existing knowledge, the related works, the methodology used in defining the knowledge management requirements, as well as an overview of the prototype.
Abstract: Motion estimation is the most computationally intensive part of video processing. Many fast motion estimation algorithms have been proposed to decrease the computational complexity by reducing the number of candidate motion vectors. However, these studies focus on the fast search algorithms themselves, while most image and video compression is implemented in software. The timing constraints for running these motion estimation algorithms therefore challenge not only the video codec but can also overwhelm some processors. In this paper, the performance of motion estimation is enhanced by using Intel's Streaming SIMD Extensions 2 (SSE2) technology on an Intel Pentium 4 processor.
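The kernel that SSE2 accelerates is the sum of absolute differences (SAD) evaluated over many candidate blocks. A scalar Python sketch of full-search block matching is shown below; the block size, search radius and data are illustrative, and the actual speedup comes from SSE2 instructions such as psadbw processing 16 pixels per instruction:

```python
def sad(block_a, block_b):
    """Sum of absolute differences: the inner kernel SIMD accelerates."""
    return sum(abs(a - b)
               for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def full_search(ref, cur, bx, by, bsize, radius):
    """Exhaustive motion search: the (dx, dy) within +-radius minimizing the
    SAD between the current block at (bx, by) and the reference frame."""
    cur_block = [row[bx:bx + bsize] for row in cur[by:by + bsize]]
    best, best_sad = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + bsize > len(ref[0]) or y + bsize > len(ref):
                continue  # candidate block would fall outside the frame
            cand = [row[x:x + bsize] for row in ref[y:y + bsize]]
            s = sad(cand, cur_block)
            if s < best_sad:
                best_sad, best = s, (dx, dy)
    return best, best_sad
```

Fast search algorithms prune the (dx, dy) grid; SIMD instead makes each SAD evaluation itself cheaper, so the two optimizations compose.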
Abstract: The Asiatic Houbara (Chlamydotis macqueenii) is a flagship and vulnerable species. In-situ conservation of this threatened species demands knowledge of its habitat selection. The aim of this study was to determine the habitat variables influencing the bird's wintering and breeding site selection in semi-arid central Iran. Habitat features of the detected nest and pellet sites were compared with paired and random plots by quantifying a number of habitat variables. In wintering habitat use at the micro scale, houbara selected sites where vegetation cover was significantly lower compared to control sites (p < 0.001). Areas with a low number of larger plant species (p = 0.03) that were not too close to a vegetation patch (p
Abstract: The electrocardiogram (ECG) is considered to be the backbone of cardiology. The ECG is composed of P, QRS and T waves, and information related to cardiac diseases can be extracted from the intervals and amplitudes of these waves. The first step in extracting ECG features is the accurate detection of R peaks in the QRS complex. We have developed a robust R wave detector using wavelets. The wavelets used for detection are Daubechies and Symmetric. The method does not require any preprocessing; it only needs correct ECG recordings for the detection. The data have been collected from the MIT-BIH arrhythmia database, and the signals from Lead II have been analyzed. MatLab 7.0 has been used to develop the algorithm. The ECG signal under test is decomposed to the required level using the selected wavelet, and the selection of detail coefficient d4 is based on energy, frequency and cross-correlation analysis of the decomposition structure of the ECG signal. The robustness of the method is apparent from the obtained results.
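The R-peak detection task can be made concrete with a much simpler amplitude rule than the paper's wavelet-detail analysis: local maxima above a threshold, separated by a refractory distance. This stand-in only illustrates the detection step; the threshold, refractory distance and synthetic signal are invented:

```python
def detect_r_peaks(signal, thresh, min_dist):
    """Find R-peak candidates as local maxima above thresh that are at least
    min_dist samples apart (a crude refractory-period constraint)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > thresh
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            if not peaks or i - peaks[-1] >= min_dist:
                peaks.append(i)
    return peaks
```

In the wavelet approach the same thresholding logic is applied to the d4 detail coefficients rather than to the raw samples, which makes the detector robust to baseline wander and noise.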