Abstract: A multicriteria linear programming problem with integer variables and a parameterized optimality principle "from lexicographic to Slater" is considered. A situation in which the initial coefficients of the penalty cost functions are not fixed but may be subject to variations is studied. For any efficient solution, appropriate quality measures are introduced which incorporate information about variations of the penalty cost function coefficients. These measures correspond to the so-called stability and accuracy functions defined earlier for efficient solutions of a generic multicriteria combinatorial optimization problem with Pareto and lexicographic optimality principles. Various properties of such functions are studied, and the maximum norms of perturbations for which an efficient solution preserves the property of being efficient are calculated.
Abstract: Batch adsorption of recalcitrant melanoidin onto abundantly available coal fly ash was carried out. The ash had a low specific surface area (SBET) of 1.7287 m2/g and a pore volume of 0.002245 cm3/g, while the predominant phases in it were evaluated qualitatively by XRD analysis. Colour removal efficiency was found to depend on the various factors studied. Maximum colour removal was achieved around pH 6, whereas increasing the sorbent mass from 10 g/L to 200 g/L enhanced colour reduction from 25% to 86% at 298 K. The negative Gibbs free energy indicated that the process is spontaneous, while the positive enthalpy change showed its endothermic nature. Non-linear optimization of error functions showed that the Freundlich and Redlich-Peterson isotherms describe the sorption equilibrium data best. The coal fly ash had a maximum sorption capacity of 53 mg/g and could thus be used as a low-cost adsorbent for melanoidin removal.
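As an illustration of how the Freundlich isotherm named above, q_e = K_F * C_e^(1/n), can be fitted to equilibrium data, the following sketch uses its standard log-linearization; the data values and the parameters K_F = 5.0, n = 2.0 are hypothetical, not taken from the study.

```python
import numpy as np

def fit_freundlich(Ce, qe):
    """Fit the Freundlich isotherm q_e = K_F * C_e**(1/n) by linear
    regression on ln(q_e) = ln(K_F) + (1/n) * ln(C_e)."""
    slope, intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
    return np.exp(intercept), 1.0 / slope  # K_F, n

# Hypothetical equilibrium data generated from K_F = 5.0, n = 2.0
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # mg/L
qe = 5.0 * Ce ** (1 / 2.0)                    # mg/g
KF, n = fit_freundlich(Ce, qe)
```

On noise-free data the regression recovers the generating parameters exactly; with real measurements, nonlinear least squares on the untransformed model (as the abstract's error-function optimization suggests) avoids the bias that the log transform introduces.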
Abstract: A framework to estimate the state of a dynamically
varying environment, where data are generated from heterogeneous
sources possessing partial knowledge about the environment, is presented.
It is derived entirely within the Dempster-Shafer and Evidence
Filtering frameworks. The belief about the current state is expressed
through belief and plausibility functions. In addition to the Single Input
Single Output Evidence Filter, a Multiple Input Single Output Evidence
Filtering approach is introduced. A variety of applications, such as
situational estimation of an emergency environment, can be developed
within the framework. A fire propagation scenario is used
to validate the proposed framework, and simulation results are presented.
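As a minimal sketch of the Dempster-Shafer machinery underlying such a framework (not the authors' filter itself), two mass functions from heterogeneous sources can be fused with Dempster's rule of combination; the frame of discernment and the mass values below are hypothetical.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses
    to masses) with Dempster's rule, normalizing out the conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}

# Two sensors reporting on the frame {fire, no_fire}
FIRE, NOFIRE = frozenset({"fire"}), frozenset({"no_fire"})
BOTH = FIRE | NOFIRE                   # total ignorance
m1 = {FIRE: 0.6, BOTH: 0.4}
m2 = {FIRE: 0.7, BOTH: 0.3}
fused = dempster_combine(m1, m2)
```

For these inputs the fused mass on {fire} is 0.88 and on the whole frame 0.12, so Bel({fire}) = 0.88 and Pl({fire}) = 1.0, the belief/plausibility pair the abstract refers to.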
Abstract: Fischer-Tropsch synthesis is one of the most
important catalytic reactions for converting synthesis gas to light
and heavy hydrocarbons. One of the main issues is selecting the type
of reactor. The slurry bubble reactor is a suitable choice for Fischer-
Tropsch synthesis because of its good heat and mass transfer
characteristics, high catalyst durability, and low maintenance and
repair costs. The most common catalysts for Fischer-Tropsch synthesis
are iron-based and cobalt-based catalysts; the advantage of one
over the other depends on which type of hydrocarbons is
desired. In this study, Fischer-Tropsch synthesis is modeled
with iron and cobalt catalysts in a slurry bubble reactor, considering
mass and momentum balances and the effect of the hydrodynamic relations
on reactor behavior. Profiles of reactant conversion and reactant
concentration in the gas and liquid phases were determined as
functions of residence time in the reactor. The effects of temperature,
pressure, liquid velocity, reactor diameter, catalyst diameter, gas-liquid
and liquid-solid mass transfer coefficients, and kinetic
coefficients on reactant conversion have been studied. With a 5%
increase in liquid velocity (iron catalyst), H2 conversion
increases by about 6% and CO conversion by about 4%; with an 8%
increase in liquid velocity (cobalt catalyst), H2 conversion
increases by about 26% and CO conversion by about 4%. With a
20% increase in the gas-liquid mass transfer coefficient, H2 conversion
increases by about 12% and CO conversion by about 10% with the iron
catalyst, and by about 10% and 6%, respectively, with the cobalt
catalyst. The results show that
the process is sensitive to the gas-liquid mass transfer coefficient and
that optimum operation occurs at the maximum possible liquid
velocity. This velocity must be greater than the minimum fluidization
velocity and less than the terminal velocity, so that catalyst
particles are prevented from leaving the fluidized bed.
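As a generic illustration of the gas-liquid mass transfer term whose sensitivity is reported above (not the authors' reactor model), the two-film volumetric flux is N = kLa * (C* - C_L); all numeric values below are hypothetical.

```python
def gas_liquid_flux(kLa, c_star, c_liquid):
    """Volumetric gas-liquid mass transfer rate N = kLa * (C* - C_L),
    in mol/(m^3 s) for kLa in 1/s and concentrations in mol/m^3."""
    return kLa * (c_star - c_liquid)

# Hypothetical operating point, then a 20% increase in kLa
base = gas_liquid_flux(kLa=0.10, c_star=2.0, c_liquid=0.5)
boosted = gas_liquid_flux(kLa=0.12, c_star=2.0, c_liquid=0.5)
rel_increase = (boosted - base) / base   # 0.20 at fixed driving force
```

At a fixed driving force the transfer rate scales linearly with kLa; in the full reactor model the conversion response is damped below 20% because the liquid-phase concentration C_L rises as transfer improves.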
Abstract: Transcription factors are a group of proteins that
help interpret the genetic information in DNA.
Protein-protein interactions play a major role in the execution
of key biological functions of a cell. These interactions are
represented in the form of a graph with nodes and edges.
Studies have shown that some nodes have a high degree of
connectivity; such nodes, known as hub nodes, are
indispensable parts of the network. In the present paper a method
is proposed to identify hub transcription factor proteins using
sequence information. On a complete data set of transcription
factor proteins available from the APID database, the
proposed method showed an accuracy of 77%, sensitivity of
79% and specificity of 76%.
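The reported accuracy, sensitivity, and specificity follow the usual confusion-matrix definitions; the sketch below uses hypothetical counts chosen only to reproduce figures of the same order, not the paper's actual data.

```python
def classification_metrics(tp, tn, fp, fn):
    """Standard metrics for a binary hub / non-hub classifier."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # recall on hub proteins
    specificity = tn / (tn + fp)   # recall on non-hub proteins
    return accuracy, sensitivity, specificity

# Hypothetical counts on a balanced 200-protein test set
acc, sens, spec = classification_metrics(tp=79, fn=21, tn=76, fp=24)
```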
Abstract: In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions as well as the exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
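The role of convolution in combining distributions can be illustrated on the discrete case: the pmf of the sum of two independent random variables is the linear convolution of their pmfs, and the output mean is the sum of the input means. This is a generic sketch, independent of the paper's exact derivations.

```python
import numpy as np

# pmfs of two independent discrete random variables on {0, 1, ...}
px = np.array([0.5, 0.5])          # values 0, 1 (fair coin)
py = np.array([0.25, 0.5, 0.25])   # values 0, 1, 2

pz = np.convolve(px, py)           # pmf of Z = X + Y

def mean(p):
    """Mean of a pmf supported on 0, 1, ..., len(p) - 1."""
    return float(np.dot(np.arange(len(p)), p))
```

Here pz = [0.125, 0.375, 0.375, 0.125], a valid pmf, and mean(pz) equals mean(px) + mean(py); variances of independent variables add in the same way.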
Abstract: The purpose of this paper is to present two different
approaches of financial distress pre-warning models appropriate for
risk supervisors, investors and policy makers. We examine a sample
of the financial institutions and electronic companies of Taiwan
Security Exchange (TSE) market from 2002 through 2008. We
present a binary logistic regression with panel data analysis. With
the pooled binary logistic regression we build a model including
more variables in the regression than with random effects, while the
in-sample and out-sample forecasting performance is higher in
random effects estimation than in pooled regression. On the other
hand we estimate an Adaptive Neuro-Fuzzy Inference System
(ANFIS) with Gaussian and Generalized Bell (Gbell) functions and
we find that ANFIS significantly outperforms the logit regressions in both
in-sample and out-of-sample periods, indicating that ANFIS is a
more appropriate tool for financial risk managers and for the
economic policy makers in central banks and national statistical
services.
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to addressing data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and
analysis stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model that satisfies the needs of
data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps with different levels of model elaboration, and it
makes it possible to gather requirements and present them to
business users in an accessible manner.
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters like prevention,
post-hospitalisation support and awareness. PARADIGMA will provide a
platform of information services - user oriented and optimised
against social, cultural and technological constraints - supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.
Abstract: The Wavelet-Galerkin finite element method for
solving the one-dimensional heat equation is presented in this work.
Two types of basis functions which are the Lagrange and multi-level
wavelet bases are employed to derive the full form of matrix system.
We consider both linear and quadratic bases in the Galerkin method.
The time derivative is approximated by a polynomial time basis that
allows the order of approximation in time to be extended easily. Our
numerical results show that the rates of convergence for the linear
Lagrange and the linear wavelet bases are the same, of order 2,
while the rates of convergence for the quadratic Lagrange and the
quadratic wavelet bases are approximately of order 4. The results also
reveal that the wavelet basis provides an easy way to improve
numerical resolution, simply by increasing the desired number of
levels in the multilevel construction process.
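A minimal sketch of the Galerkin finite element baseline the abstract compares against, using the linear Lagrange basis for u_t = u_xx on [0, 1] with homogeneous Dirichlet conditions and backward-Euler time stepping (the wavelet basis and polynomial time basis of the paper are not reproduced here; mesh sizes and the sin(pi x) test problem are illustrative choices).

```python
import numpy as np

def heat_fem_linear(n_elem, dt, n_steps):
    """Galerkin FEM with linear Lagrange basis for u_t = u_xx on [0, 1],
    u(0) = u(1) = 0, backward-Euler time stepping.
    Returns the grid x and the interior solution u at the final time."""
    h = 1.0 / n_elem
    n = n_elem - 1                       # interior nodes
    x = np.linspace(0.0, 1.0, n_elem + 1)
    # Standard tridiagonal mass and stiffness matrices on a uniform mesh
    M = h / 6.0 * (4.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1))
    K = 1.0 / h * (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
    u = np.sin(np.pi * x[1:-1])          # initial condition
    A = M + dt * K                       # backward-Euler system matrix
    for _ in range(n_steps):
        u = np.linalg.solve(A, M @ u)
    return x, u

# Exact solution of this test problem: exp(-pi^2 t) * sin(pi x)
x, u = heat_fem_linear(n_elem=32, dt=1e-4, n_steps=1000)   # t = 0.1
exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * x[1:-1])
err = float(np.max(np.abs(u - exact)))
```

Halving h reduces err by roughly a factor of 4 once the time-stepping error is small enough, which is the order-2 convergence the abstract reports for the linear bases.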
Abstract: In this paper, we present a novel statistical approach to
corpus-based speech synthesis. Classically, phonetic information is
defined and treated as an acoustic reference to be respected. Accordingly,
many studies have addressed acoustic unit classification.
This type of classification separates units according to their
symbolic characteristics. Indeed, a target cost and a concatenation cost
are classically defined for unit selection.
In corpus-based speech synthesis systems using large text
corpora, cost functions have been limited to a juxtaposition of symbolic
criteria, and the acoustic information of units is not exploited in the
definition of the target cost.
In this paper, we take into consideration the phonetic
information of units together with the corresponding acoustic
information. This is realized by defining a probabilistic linguistic
bigram model used for unit selection. The selected units are
extracted from the English TIMIT corpus.
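A bigram model of the kind invoked for unit selection can be sketched as maximum-likelihood estimates P(next | prev) from counts over a unit sequence; the toy unit labels below are illustrative, not TIMIT data.

```python
from collections import Counter

def bigram_probs(sequence):
    """Maximum-likelihood bigram probabilities P(next | prev)
    estimated from a sequence of unit labels."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    prev_counts = Counter(sequence[:-1])
    return {(p, n): c / prev_counts[p] for (p, n), c in pair_counts.items()}

# Toy sequence of phonetic unit labels
units = ["a", "b", "a", "b", "a", "c"]
probs = bigram_probs(units)
```

In a unit-selection cost, -log probs[(prev, cand)] can serve as the linguistic component of the target or concatenation cost for a candidate unit.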
Abstract: Specification-based testing enables us to detect errors
in the implementation of functions defined in given specifications.
Its effectiveness in achieving high path coverage and efficiency in
generating test cases are always major concerns of testers. The automatic
test cases generation approach based on formal specifications
proposed by Liu and Nakajima is aimed at ensuring high effectiveness
and efficiency, but this approach has not been empirically assessed.
In this paper, we present an experiment for assessing Liu's testing
approach. The result indicates that this testing approach may not be
effective in some circumstances. We discuss the result, analyse the
specific causes for the ineffectiveness, and describe some suggestions
for improvement.
Abstract: The direct implementation of interleaver functions
in WiMAX is not hardware efficient due to the presence of complex
functions. The conventional alternative, storing the permutation
tables in memories, is silicon consuming. This work
presents a 2-D transformation for WiMAX channel interleaver
functions which reduces the overall hardware complexity to
compute the interleaver addresses on the fly. A fully reconfigurable
architecture for address generation in WiMAX
channel interleaver is presented, which consumes 1.1 k-gates in
total. It can be configured for any block size and any modulation
scheme in WiMAX. The presented architecture can run at a
frequency of 200 MHz, thus fully supporting high bandwidth
requirements for WiMAX.
Abstract: In this paper we present a novel approach to human
body configuration estimation based on silhouettes. We propose to address
this problem within the Bayesian framework. We use an effective
model-based MCMC (Markov Chain Monte Carlo) method to solve
the configuration problem, in which the best configuration is
defined as the MAP (maximum a posteriori probability) estimate in the
Bayesian model. This model-based MCMC uses the human body model to
drive the MCMC sampling over the solution space. It converts the
original high-dimensional space into a restricted sub-space constructed
by the human model and uses a hybrid sampling algorithm. We
choose an explicit human model and carefully select the likelihood
functions to represent the best configuration solution. The
experiments show that this method obtains accurate
configurations in less time for different humans from multiple views.
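The MAP-by-MCMC idea above can be illustrated with a generic random-walk Metropolis-Hastings sampler that tracks the highest-posterior sample seen; the one-dimensional toy posterior below stands in for the paper's body-model likelihood, which is not reproduced here.

```python
import math
import random

def metropolis_map(log_post, x0, n_iter=5000, step=0.5, seed=1):
    """Random-walk Metropolis-Hastings; returns the visited sample
    with the highest log-posterior (a MAP estimate)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    best, best_lp = x, lp
    for _ in range(n_iter):
        cand = x + rng.gauss(0.0, step)
        cand_lp = log_post(cand)
        # Accept with probability min(1, posterior ratio)
        if math.log(rng.random()) < cand_lp - lp:
            x, lp = cand, cand_lp
            if lp > best_lp:
                best, best_lp = x, lp
    return best

# Toy posterior: standard normal in log form, whose MAP is 0
map_est = metropolis_map(lambda x: -0.5 * x * x, x0=3.0)
```

In the paper's setting the proposal is model-driven rather than an isotropic random walk, which is what restricts sampling to the sub-space constructed by the human model.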
Abstract: This paper presents a novel neural network controller
with a composite adaptation law to improve the trajectory tracking
of biped robots compared with classical controllers. The
biped model has 5 links and 6 degrees of freedom and is actuated by
Pleated Pneumatic Artificial Muscles, which have a very high
power-to-weight ratio and a large stroke compared to similar actuators. The
proposed controller employs a stable neural network to approximate
unknown nonlinear functions in the robot dynamics, thereby
overcoming some limitations of conventional controllers such as PD
or adaptive controllers and guaranteeing good performance. This NN
controller significantly improves tracking accuracy by
retaining the basic PD/PID loop while adding an inner adaptive loop
that allows the controller to learn unknown parameters such as
the friction coefficient.
Simulation results, together with graphical simulation in virtual reality,
show that the NN controller's tracking performance is considerably
better than that of a PD controller.
Abstract: Results of Chilean wine classification based on the
information provided by an electronic nose are reported in this paper.
The classification scheme consists of two parts; in the first stage,
Principal Component Analysis is used as feature extraction method to
reduce the dimensionality of the original information. Then, a Radial
Basis Function Neural Network is used as the pattern recognition
technique to perform the classification. The objective of this study is
to classify different Cabernet Sauvignon, Merlot and Carménère wine
samples from different years, valleys and vineyards of Chile.
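The first stage above, PCA as a feature extraction step, can be sketched as follows; the random matrix stands in for the electronic-nose measurements, which are not available here.

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # Eigen-decomposition of the sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]        # descending variance
    components = eigvecs[:, order[:k]]
    return Xc @ components

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                 # hypothetical sensor features
Z = pca_reduce(X, k=2)                       # reduced features for the RBF net
```

The reduced matrix Z would then be fed to the RBF network classifier in the second stage; the first column of Z always carries at least as much variance as the second.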
Abstract: This research work is aimed at speech recognition
using scaly neural networks. A small vocabulary of 11 words was
established first; these words are "word, file, open, print, exit, edit,
cut, copy, paste, doc1, doc2". The chosen words are associated with
executing computer functions such as opening a file, printing a
text document, cutting, copying, pasting, editing, and exiting.
The words are introduced to the computer and then subjected to a
feature extraction process using LPC (linear prediction coefficients).
These features are used as input to an artificial neural network in
speaker-dependent mode. Half of the words are used for training the
artificial neural network and the other half for testing the system;
the latter are used for information retrieval.
The system consists of three parts: speech processing and feature
extraction, training and testing using neural networks, and
information retrieval.
The retrieval process proved to be 79.5-88% successful, which is
quite acceptable considering variations in the surroundings, the state
of the speaker, and the microphone type.
Abstract: It is well recognized that one feature of a
successful company is its ability to align its business goals with its
information and communication technology platform.
Enterprise Resource Planning (ERP) systems contribute to better
performance by integrating various business functions and
providing support for information flows. However, the complexity of
these technological systems is known to prevent business users from
exploiting ERP systems efficiently.
This paper aims to investigate the role of training in improving the
usage of ERP systems. To this end, we designed a survey instrument
administered to employees of a Norwegian multinational global
provider of technology solutions. Based on the analysis of the
collected data, we have delineated a training model that could be of
high relevance for both researchers and practitioners as a step
towards a better understanding of ERP system implementation.
Abstract: Vehicle suspension design must fulfill
several conflicting criteria. Among them is ride comfort,
which is attained by minimizing the acceleration
transmitted to the sprung mass via the suspension spring
and damper. Good handling is also a
desirable property; it requires a stiff suspension and
is therefore in conflict with good ride comfort.
Another desirable feature of a suspension is
the minimization of its maximum travel.
This travel, called the suspension working space in
the vehicle dynamics literature, is also a design
constraint and favors good ride. In this research a
full-car 8-degrees-of-freedom model has been developed,
and the three criteria mentioned above, namely ride,
handling, and working space, have been adopted as objective
functions. The Multi Objective Programming (MOP)
discipline has been used to find the Pareto front, and
some reasoning is used to choose a design point among
these non-dominated points of the Pareto front.
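The non-dominated filtering behind the Pareto front can be sketched generically: given candidate designs scored on (ride, handling, working-space) objectives, all to be minimized, keep only the points no other point dominates. The objective values below are hypothetical.

```python
def pareto_front(points):
    """Return the non-dominated points (all objectives minimized).
    p dominates q if p <= q in every objective and p < q in at least one."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (ride, handling, working-space) scores for candidate designs
designs = [(1.0, 3.0, 2.0), (2.0, 2.0, 2.0),
           (3.0, 1.0, 1.0), (3.0, 3.0, 3.0)]
front = pareto_front(designs)
```

Here the last design is dominated by (2.0, 2.0, 2.0) and is discarded; the remaining three form the front, among which a single design point is then chosen by the kind of trade-off reasoning the abstract describes.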
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
Software can therefore also be studied using complex network theory. Over
the last decade, many papers on interdisciplinary research between
software engineering and complex networks have been published.
This research provides a different dimension to our understanding of
software and is also very useful for the design and development of software
systems. This paper will explore how to use the complex network
theory to analyze software structure, and briefly review the main
advances in corresponding aspects.