Abstract: In this paper, we give a decomposition of the
coefficient matrix of the fully fuzzy linear system (FFLS) to obtain
a simple algorithm for solving these systems. The new algorithm
solves an FFLS with less computation. We illustrate
the method by solving several examples.
Abstract: Advances in location-based data collection
technologies such as GPS and RFID, together with the rapid reduction
of their costs, provide us with a huge and continuously increasing
amount of data about the movement of vehicles, people and goods in an
urban area. This explosive growth of geospatially referenced data has
far outpaced planners' ability to utilize and transform the data into
insightful information, creating an adverse impact on the return
on the investment made to collect and manage this data. Addressing
this pressing need, we designed and developed DIVAD, a dynamic
and interactive visual analytics dashboard that allows city planners to
explore and analyze a city's transportation data to gain valuable
insights about its traffic flow and transportation requirements. We
demonstrate the potential of DIVAD through the use of interactive
choropleth and hexagon binning maps to explore and analyze large
taxi-transportation data of Singapore for different geographic zones
and time periods.
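Hexagon binning, one of the map types mentioned above, reduces to assigning each point to a hexagonal cell and counting. A minimal sketch of that binning step (pointy-top axial coordinates; the function names and layout are illustrative assumptions, not DIVAD's implementation):

```python
import math
from collections import Counter

def hex_bin(points, size):
    """Count (x, y) points per pointy-top hexagonal cell of circumradius
    `size`, keyed by axial coordinates (q, r)."""
    counts = Counter()
    for x, y in points:
        # pixel -> fractional axial coordinates for a pointy-top layout
        q = (math.sqrt(3) / 3 * x - y / 3) / size
        r = (2 / 3 * y) / size
        counts[_axial_round(q, r)] += 1
    return counts

def _axial_round(q, r):
    # round in cube coordinates (q + r + s = 0), fixing the largest error
    s = -q - r
    rq, rr, rs = round(q), round(r), round(s)
    dq, dr, ds = abs(rq - q), abs(rr - r), abs(rs - s)
    if dq > dr and dq > ds:
        rq = -rr - rs
    elif dr > ds:
        rr = -rq - rs
    return (rq, rr)
```

The resulting per-cell counts are what a hexagon binning map colors.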
Abstract: Titanium alloys like Ti-6Al-2Sn-4Zr-6Mo (Ti-
6246) are widely used in aerospace applications. Component
manufacturing, however, is difficult and expensive because their
machinability is extremely poor. A thorough understanding of the
chip formation process is needed to improve the related metal cutting
operations. In the current study, orthogonal cutting experiments were
performed and the resulting chips were analyzed by optical
microscopy and scanning electron microscopy. Chips from a Ti-
6246 ingot were produced at different cutting speeds and cutting
depths. During the experiments, depending on the cutting conditions,
continuous or segmented chips were formed. Narrow, highly
deformed and grain-oriented zones, the so-called shear zones,
separated individual segments. Different material properties were
measured in the shear zones and the segments.
Abstract: This study aimed at assessing whether and to what extent moral judgment and behaviour were: 1. situation-dependent; 2. selectively dependent on cognitive and affective components; 3. influenced by gender and age; 4. reciprocally congruent. In order to achieve these aims, four different types of moral dilemmas were constructed, and five types of thinking were presented for each of them, representing five possible ways to evaluate the situation. The judgment criteria included selfishness, altruism, sense of justice, and the conflict between selfishness and the two moral issues. The participants were 250 unpaid volunteers (50% male, 50% female) belonging to two age groups: young people and adults. The study entailed a 2 (gender) x 2 (age group) x 5 (type of thinking) x 4 (situation) mixed design: the first two variables were between-subjects, the others within-subjects. Results showed that: 1. moral judgment and behaviour are at least partially affected by the type of situation and by interpersonal variables such as gender and age; 2. moral reasoning depends in a similar manner on cognitive and affective factors; 3. there is no gender polarity between the ethic of justice and the ethic of care/altruism; 4. moral reasoning and behaviour are perceived as reciprocally congruent, even though their congruence decreases with a more objective assessment. These results are discussed in the light of contrasting theories of morality.
Abstract: In this paper, the effects of a restoring force device on the response of a space frame structure resting on sliding bearings are studied. The NS component of the El Centro earthquake and harmonic ground acceleration are considered as earthquake excitation. The structure is modeled with six degrees of freedom (three translations and three rotations) at each node. The sliding support is modeled as a fictitious spring with two horizontal degrees of freedom. The response quantities considered for the study are the top floor acceleration, base shear, bending moment and base displacement. It is concluded from the study that the displacement of the structure is reduced by the use of the restoring force device, and that the peak values of acceleration, bending moment and base shear also decrease. The simulation results show the effectiveness of the proposed method.
Abstract: Ground-source heat pumps achieve higher efficiencies
than conventional air-source heat pumps because they exchange heat
with the ground, which is cooler in summer and warmer in winter than
the ambient air. Earth heat exchangers are essential parts of
ground-source heat pumps, and the accurate prediction of their
performance is of fundamental importance. This paper presents the
development and validation of a numerical model of
incompressible fluid flow for the simulation of energy and
temperature changes in and around a U-tube borehole heat
exchanger. The FlexPDE software is used to solve the resulting
simultaneous equations that model the heat exchanger. The model,
validated through a comparison with experimental data, is then used to
draw conclusions on how various parameters, such as the U-tube
diameter, the ground thermal conductivity and specific heat, and the
borehole filling material, affect the temperature of the fluid.
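The borehole model itself is solved with FlexPDE; purely as an illustration of the class of transient-conduction problems involved, here is a minimal 1D explicit finite-difference sketch (the names, fixed-temperature boundaries and 1D simplification are assumptions, not the paper's model):

```python
def heat_1d(u, alpha, dx, dt, steps):
    """Explicit finite-difference integration of 1D transient conduction
    u_t = alpha * u_xx, with fixed-temperature (Dirichlet) boundaries."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    for _ in range(steps):
        # boundaries kept fixed; interior nodes updated from neighbors
        u = [u[0]] + [
            u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u
```

The stability limit r ≤ 0.5 is the standard constraint for this explicit scheme; implicit solvers such as FlexPDE's avoid it.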
Abstract: Wireless sensor networks have been used in wide
areas of application and have become an attractive field for researchers
in recent years. Because of the limited energy storage capability of
sensor nodes, energy consumption is one of the most challenging
aspects of these networks, and different strategies and protocols deal
with this issue. This paper presents general methods for designing
low-power wireless sensor networks. Different sources of energy
consumption in these networks are discussed, and techniques for
reducing energy consumption are presented.
Abstract: Phytases are acid phosphatase enzymes that
efficiently cleave phosphate moieties from phytic acid, thereby
generating myo-inositol and inorganic phosphate. Thirty-four
isolates of endophytic fungi were obtained from leaf, stem and root
fragments of soybean and screened for phytase production. Screening
of the 34 isolates identified phytases produced by
Rhizoctonia sp. and Fusarium verticillioides. Phytase
production was best induced by phytic acid and rice bran
compared with the other inducers in the submerged fermentation
medium used. The phytases produced by Rhizoctonia sp. and F.
verticillioides have pH optima at 4.0 and 5.0, respectively. The
characterization of the phytase from Fusarium verticillioides showed a
temperature optimum of 50°C with stability up to 60°C, a pH
optimum of 5.0 with pH stability between 2.5 and 6.0, and substrate
specificity in the order rice bran > soybean meal > corn > coconut cake.
Abstract: Artificial Immune Systems have been applied as heuristic
algorithms for decades. Nevertheless, many of these applications
took advantage of the benefits of the algorithm but seldom proposed
approaches for enhancing its efficiency. In this paper, a
Self-evolving Artificial Immune System is proposed by developing
the T and B cells of the Immune System and building a self-evolving
mechanism that adapts to the complexities of different problems. This
research focuses on enhancing the efficiency of clonal selection,
which is responsible for producing antibodies to resist
invading antigens. T and B cells are the main mechanisms by which
clonal selection produces different combinations of antibodies;
therefore, their development influences the
efficiency of clonal selection in searching for better solutions.
Furthermore, for better cooperation of the two cells, a co-evolutionary
strategy is applied to coordinate them for more effective production of
antibodies. This work adopts flow-shop scheduling
instances from the OR-Library to validate the proposed algorithm.
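The paper's self-evolving T/B-cell mechanism is not detailed in the abstract; as a hedged illustration, here is plain clonal selection applied to flow-shop makespan minimization (population size, cloning rate and the swap-mutation operator are all assumptions):

```python
import random

def makespan(perm, proc):
    """Completion time of the last job on the last machine of a
    permutation flow shop; proc[j][k] is job j's time on machine k."""
    m = len(proc[0])
    t = [0.0] * m
    for j in perm:
        t[0] += proc[j][0]
        for k in range(1, m):
            t[k] = max(t[k], t[k - 1]) + proc[j][k]
    return t[-1]

def clonal_selection(proc, pop=20, gens=100, clones=5, seed=0):
    """Clone the best antibodies (job sequences), hypermutate the clones
    by swapping two jobs, and keep the fittest individuals."""
    rng = random.Random(seed)
    n = len(proc)
    ab = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        ab.sort(key=lambda p: makespan(p, proc))
        new = ab[: pop // 2]                 # elite antibodies survive
        for p in ab[: pop // 2]:
            for _ in range(clones):
                q = p[:]
                i, j = rng.randrange(n), rng.randrange(n)
                q[i], q[j] = q[j], q[i]      # hypermutation: swap two jobs
                new.append(q)
        new.sort(key=lambda p: makespan(p, proc))
        ab = new[:pop]
    return ab[0], makespan(ab[0], proc)
```

On small instances this reliably reaches the optimal sequence; the proposed self-evolving mechanism would additionally tune such parameters per problem.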
Abstract: In the age of global communications, heterogeneous
networks are seen as the best strategy to ensure continuous and uninterrupted services. They allow mobile
terminals to stay connected even as they migrate across different segments of coverage through the handoff process. With the increase of
teletraffic demands in mobile cellular systems, hierarchical cellular systems have been adopted extensively for more efficient channel
utilization and better QoS (Quality of Service). This paper presents a
bidirectional call overflow scheme between two layers of microcells and macrocells, where handoffs are decided by the velocity of the mobile
making the call. To ensure that handoff calls are given higher priority, it is assumed that guard channels are assigned in both
macrocells and microcells. A hysteresis band introduced on the mobile velocity allows a mobile to remain in the same cell if its velocity
changes back within the set threshold values. This reduces the number of handoffs, thereby reducing the processing overhead and enhancing the quality of service to the end user.
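The velocity-with-hysteresis rule described above can be sketched as a small decision function (the names and thresholds are illustrative assumptions):

```python
def handoff_layer(current, velocity, v_low, v_high):
    """Velocity-based layer choice with hysteresis: while the velocity
    stays inside the [v_low, v_high] band, the mobile keeps its current
    layer, avoiding ping-pong handoffs near a single threshold."""
    if velocity > v_high:
        return "macrocell"   # fast mobiles overflow to the macrocell layer
    if velocity < v_low:
        return "microcell"   # slow mobiles are served by microcells
    return current           # inside the hysteresis band: no handoff
```

Widening the band trades slower layer reassignment for fewer handoffs and less processing overhead.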
Abstract: A predictive clustering hybrid regression (pCHR)
approach was developed and evaluated using a dataset from an H2-
producing, sucrose-based bioreactor operated for 15 months. The aim
was to model and predict the H2-production rate using available
information about the envirome and metabolome of the bioprocess.
Self-organizing maps (SOM) and Sammon mapping were used to visualize the
dataset and to identify the main metabolic patterns and clusters in the
bioprocess data. Three metabolic clusters were detected: acetate coupled
with other metabolites, butyrate only, and transition phases. The
developed pCHR model combines principles of k-means clustering,
kNN classification and regression techniques. The model performed
well in modeling and predicting the H2-production rate, with mean
square error values of 0.0014 and 0.0032, respectively.
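The exact pCHR formulation is not given in the abstract; the following sketch shows one way k-means clustering, kNN classification and per-cluster least-squares regression might be combined (all names and design choices here are assumptions, not the authors' model):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means; returns centers and cluster labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    return centers, labels

def fit_pchr(X, y, k=2):
    """Cluster the inputs, then fit one least-squares linear model
    (with intercept) per cluster."""
    _, labels = kmeans(X, k)
    models = {}
    for j in range(k):
        mask = labels == j
        if mask.any():
            Xj = np.c_[X[mask], np.ones(mask.sum())]
            models[j] = np.linalg.lstsq(Xj, y[mask], rcond=None)[0]
    return X, labels, models

def predict_pchr(model, x_new, n_neighbors=3):
    """kNN over the training points picks the cluster of a new sample;
    that cluster's regression model makes the prediction."""
    X, labels, models = model
    nn = np.argsort(((X - x_new) ** 2).sum(-1))[:n_neighbors]
    j = np.bincount(labels[nn]).argmax()
    w = models[j]
    return np.r_[x_new, 1.0] @ w
```

Fitting one model per cluster lets each metabolic regime (e.g. acetate-coupled vs. butyrate-only) keep its own input-output relation instead of forcing a single global fit.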
Abstract: The cables in a nuclear power plant are designed to be
used for about 40 years in a safe operating environment. However, the
heat and radiation in the plant cause rapid performance
deterioration of cables in nuclear vessels and heat
exchangers, which makes cable lifetime estimation necessary. The most
accurate method of estimating cable lifetime is to evaluate the
cables in a laboratory. However, removing cables while the plant is
operating is not allowed for safety and cost reasons. In this paper, a
robot system to estimate the cable lifetime in nuclear power plants is
developed and tested. The developed robot system can calculate a
modulus value to estimate the cable lifetime even while the nuclear
power plant is in operation.
Abstract: Network security attacks are violations of the
information security policy that have received much attention from the
computational intelligence community in recent decades. Data mining
has become a very useful technique for detecting network intrusions
by extracting useful knowledge from large volumes of network data
or logs. The naïve Bayesian classifier is one of the most popular data
mining algorithms for classification, providing an optimal way
to predict the class of an unknown example. It has been shown,
however, that a single set of probabilities derived from the data is not
good enough to achieve a high classification rate. In this paper, we
propose a new learning algorithm for mining network logs to detect
network intrusions using a naïve Bayesian classifier: it first clusters
the network logs into several groups based on their similarity, and then
calculates the prior and conditional probabilities for each group of
logs. To classify a new log, the algorithm determines which cluster
the log belongs to and then uses that cluster's probability set to classify
it. We tested the performance of the proposed algorithm on the
KDD99 benchmark network intrusion detection dataset,
and the experimental results showed that it improves detection rates
and reduces false positives for different types of network
intrusions.
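The per-cluster probability sets can be sketched as follows, assuming cluster assignments are already available from some clustering step (a simplified categorical naïve Bayes with add-one smoothing; all names, and the nominal two values per feature in the smoothing denominator, are illustrative assumptions):

```python
import math
from collections import Counter, defaultdict

def train(logs, labels, clusters):
    """Build one probability set per cluster: class priors plus
    per-(feature, value) conditional counts."""
    models = defaultdict(lambda: {"prior": Counter(),
                                  "cond": defaultdict(Counter)})
    for log, y, c in zip(logs, labels, clusters):
        m = models[c]
        m["prior"][y] += 1
        for i, v in enumerate(log):
            m["cond"][(i, v)][y] += 1
    return models

def classify(models, log, cluster):
    """Classify a log with the probability set of its own cluster."""
    m = models[cluster]
    total = sum(m["prior"].values())
    best, best_lp = None, -math.inf
    for y, n in m["prior"].items():
        lp = math.log(n / total)
        for i, v in enumerate(log):
            cnt = m["cond"][(i, v)][y]
            lp += math.log((cnt + 1) / (n + 2))  # add-one smoothing
        if lp > best_lp:
            best, best_lp = y, lp
    return best
```

Keeping a separate probability set per cluster is exactly what distinguishes this scheme from a single global naïve Bayes model.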
Abstract: Modeling and simulation of biochemical reactions is of great interest in the context of systems biology. The central dogma of this re-emerging area states that it is the system dynamics and organizing principles of complex biological phenomena that give rise to the functioning and function of cells. Cell functions such as growth, division, differentiation and apoptosis are temporal processes that can be understood if they are treated as dynamic systems. Systems biology focuses on an understanding of functional activity from a system-wide perspective and, consequently, it is defined by two key questions: (i) how do the components within a cell interact so as to bring about its structure and functioning? (ii) how do cells interact so as to develop and maintain higher levels of organization and function? In recent years, wet-lab biologists have embraced mathematical modeling and simulation as two essential means toward answering these questions. The credo of dynamical systems theory is that the behavior of a biological system is given by the temporal evolution of its state. Our understanding of the time behavior of a biological system can be measured by the extent to which a simulation mimics the real behavior of that system. Deviations of a simulation indicate either limitations or errors in our knowledge. The aim of this paper is to summarize and review the main conceptual frameworks in which models of biochemical networks can be developed. In particular, we review the stochastic molecular modeling approaches, reporting the principal conceptualizations suggested by A. A. Markov, P. Langevin, A. Fokker, M. Planck, D. T. Gillespie, N. G. van Kampen, and more recently by D. Wilkinson, O. Wolkenhauer, P. Sjöberg and by the author.
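Of the stochastic formulations reviewed, Gillespie's direct method admits a compact sketch (a generic well-mixed reaction system; the function and variable names are illustrative):

```python
import math
import random

def gillespie(x0, stoich, rates, t_end, seed=0):
    """Gillespie's direct method: exponential waiting times between
    reactions, with the next reaction chosen with probability
    proportional to its propensity. Returns (time, state) pairs."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    trace = [(t, tuple(x))]
    while t < t_end:
        a = [rate(x) for rate in rates]
        a0 = sum(a)
        if a0 == 0:          # no reaction can fire any more
            break
        t += -math.log(1.0 - rng.random()) / a0
        u, pick, acc = rng.random() * a0, 0, a[0]
        while acc < u:       # roulette-wheel choice of the reaction
            pick += 1
            acc += a[pick]
        x = [xi + d for xi, d in zip(x, stoich[pick])]
        trace.append((t, tuple(x)))
    return trace
```

For example, a single decay reaction A → ∅ with propensity c·A uses `stoich=[(-1,)]` and `rates=[lambda x: c * x[0]]`; the Langevin and Fokker-Planck formulations reviewed in the paper approximate the same jump process in the large-copy-number limit.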
Abstract: Classifying biomedical literature is a difficult and
challenging task, especially when a large number of biomedical
articles should be organized into a hierarchical structure. In this paper,
we present an approach for classifying a collection of biomedical text
abstracts downloaded from the Medline database with the help of
ontology alignment. To accomplish our goal, we construct two types
of hierarchies, the OHSUMED disease hierarchy and the Medline
abstract disease hierarchies, from the OHSUMED dataset and the
Medline abstracts, respectively. Then, we enrich the OHSUMED
disease hierarchy before adapting it to the ontology alignment process
for finding probable concepts or categories. Subsequently, we compute
the cosine similarity between the vectors of probable concepts (in the
"enriched" OHSUMED disease hierarchy) and the vectors in the Medline
abstract disease hierarchies. Finally, we assign a category to each new
Medline abstract based on the similarity score. The results obtained
from the experiments show that the performance of our proposed approach
for hierarchical classification is slightly better than that of
multi-class flat classification.
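The similarity-based assignment step can be sketched as follows (how the term vectors are built is omitted; the names are illustrative assumptions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length term vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def assign_category(abstract_vec, concept_vecs):
    """Assign the concept whose vector is most similar to the abstract's."""
    return max(concept_vecs,
               key=lambda c: cosine(concept_vecs[c], abstract_vec))
```

Restricting `concept_vecs` to the probable concepts found by ontology alignment, rather than all categories, is what narrows the flat comparison into a hierarchical one.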
Abstract: A numerical study of a two-dimensional supersonic
hydrogen-air mixing layer is performed to investigate the effect of
turbulence and a chemical additive on ignition distance. Chemical
reaction is treated using detailed kinetics. The advection upstream
splitting method is used to calculate the fluxes, and a one-equation
turbulence model is chosen to simulate the considered problem. Hydrogen
peroxide is used as the additive, and the results show that inflow
turbulence and the chemical additive may drastically decrease the
ignition delay in supersonic combustion.
Abstract: Currently, there are many local area industrial networks
that can give guaranteed bandwidth to synchronous traffic, in particular
by providing CBR (Constant Bit Rate) channels, which allow
improved bandwidth management. Some of these networks operate
over Ethernet, delivering channels with enough capacity, especially
when compression is used, to integrate multimedia traffic in industrial
monitoring and image processing applications with many sources. In
these industrial environments, where low latency is an essential
requirement, JPEG is an adequate compression technique, but it
generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic
over CBR channels is inefficient, and current solutions to this problem
significantly increase the latency or further degrade the quality. In
this paper, an R(q) model is used that allows on-line calculation of
the JPEG quantization factor. We obtained increased quality and a lower
capacity requirement for the CBR channel, with a reduced number of
discarded frames and better use of the channel bandwidth.
Abstract: This paper deals with the efficient computation of
probability coefficients, which offers computational simplicity
compared to spectral coefficients. It eliminates the need for inner
product evaluations in determining the signature of a combinational
circuit realizing a given Boolean function. Methods for computing
probability coefficients using a transform matrix, a fast transform
method, and BDDs are given. Theoretical relations for the achievable
computational advantage, in terms of the additions required to compute
all 2^n probability coefficients of an n-variable function, have been
developed. It is shown that for n ≥ 5, only 50% of the additions are
needed to compute all probability coefficients compared to spectral
coefficients. Fault detection techniques based on the spectral
signature can also be used with the probability signature to offer a
computational advantage.
Abstract: The article presents test results on the changes
occurring in sewage sludge during storage. Tests
were conducted on mechanically dewatered sewage sludge derived
from large municipal sewage treatment plants equipped with
biological sewage treatment systems. The testing presented in the paper
focused on the basic fuel properties of sewage sludge: moisture
content, heat of combustion and carbon content. The first part of the
article gives an overview of the issues concerning sewage sludge
management and explains the genesis of the tests.
Selected results of the conducted tests are then discussed.
Changes in the tested parameters were determined over a 10-month
storage period.
Abstract: In an era of knowledge explosion, the amount of data
grows rapidly day by day. Since data storage is a limited resource,
reducing the space that data occupies becomes a challenging issue.
Data compression provides a good solution, lowering the
required space. Data mining has found many useful applications in
recent years because it helps users discover interesting knowledge in
large databases. However, existing compression algorithms are not
appropriate for data mining. In [1, 2], two different approaches were
proposed to compress databases and then perform the data mining
process. However, they both lack the ability to decompress the data to
its original state and to improve data mining performance. In this
research, a new approach called Mining Merged Transactions with a
Quantification Table (M2TQT) is proposed to solve these problems.
M2TQT uses the relationships among transactions to merge related
transactions and builds a quantification table to prune the candidate
itemsets that cannot become frequent, in order to improve
the performance of mining association rules. The experiments show
that M2TQT performs better than existing approaches.