Abstract: Due to the ever-growing number of publications on
protein-protein interactions, information extraction from text is
increasingly recognized as one of the crucial technologies in
bioinformatics. This paper presents a Protein Interaction Extraction
system for biomedical abstracts using a Link Grammar parser
(PIELG). PIELG uses the linkage produced by the Link Grammar Parser to
start a case-based analysis of the contents of various syntactic roles, as
well as their linguistically significant and meaningful combinations.
The system uses phrasal-prepositional verb patterns to overcome
problems with preposition combinations. The recall and precision are
74.4% and 62.65%, respectively. Experimental comparisons with two
other state-of-the-art extraction systems indicate that PIELG
achieves better performance. For further evaluation, the system is
augmented with a graphical package (Cytoscape) for extracting
protein interaction information from sequence databases. The results
show that its performance is remarkably promising.
Abstract: This paper presents a mathematical model and a
methodology to analyze the losses in transmission expansion
planning (TEP) under uncertainty in demand. The methodology is
based on discrete particle swarm optimization (DPSO). DPSO is a
useful and powerful stochastic evolutionary algorithm to solve the
large-scale, discrete and nonlinear optimization problems like TEP.
The effectiveness of the proposed idea is tested on an actual
transmission network of the Azerbaijan regional electric company,
Iran. The simulation results show that considering the losses, even for
transmission expansion planning of a network with low load growth,
decreases operational costs considerably and lets the network satisfy
the requirement of delivering electric power to load centers more
reliably.
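The abstract does not give the DPSO formulation itself; purely as an illustration of how a discrete (binary) PSO handles a line-selection decision, the sketch below runs a standard sigmoid-mapped binary PSO on a made-up toy problem. The build costs, capacities, demand and penalty are all hypothetical, not the paper's TEP model.

```python
import numpy as np

def dpso(cost, n_bits, n_particles=20, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal binary (discrete) PSO: velocities are squashed by a sigmoid
    into per-bit probabilities of setting the bit to 1."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, (n_particles, n_bits))
    v = rng.normal(0.0, 1.0, (n_particles, n_bits))
    pb, pbc = x.copy(), np.array([cost(p) for p in x])
    g, gc = pb[pbc.argmin()].copy(), pbc.min()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = np.clip(w * v + c1 * r1 * (pb - x) + c2 * r2 * (g - x), -6, 6)
        x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)
        c = np.array([cost(p) for p in x])
        imp = c < pbc
        pb[imp], pbc[imp] = x[imp], c[imp]
        if c.min() < gc:
            g, gc = x[c.argmin()].copy(), c.min()
    return g, gc

# Toy stand-in for TEP: each bit decides whether a candidate line is built.
# Costs, capacities and demand below are made up for illustration only.
build_cost = np.array([4.0, 3.0, 5.0, 2.0, 6.0])
capacity = np.array([30.0, 20.0, 40.0, 15.0, 50.0])
demand = 70.0

def tep_cost(plan):
    shortfall = max(0.0, demand - capacity @ plan)
    return build_cost @ plan + 10.0 * shortfall   # penalize unserved demand

plan, cost_val = dpso(tep_cost, n_bits=5)
```

On this toy instance the cheapest plan that covers the demand (cost 9.0) is found quickly; in a real TEP model the bits would index candidate circuits and the cost would include losses and operational constraints, as the paper describes.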
Abstract: This paper presents the experimental results of silicone rubber polymer insulators for 22 kV systems under the salt water dip wheel test based on IEC 62217. Straight-shed silicone rubber polymer insulators with a leakage distance of 685 mm were tested continuously for 30,000 cycles. One test cycle includes four positions: energized, de-energized, salt water dip and de-energized, respectively. In each cycle, every test specimen remains stationary for about 40 seconds in each position and takes 8 seconds to rotate to the next position. By visual observation, severe surface erosion was observed on the trunk near the energized end of the tested specimens. A puncture was observed on the upper shed near the energized end. In addition, a decrease in hydrophobicity and an increase in hardness were measured on the tested specimens compared with a new specimen. Furthermore, chemical analysis by ATR-FTIR was conducted in order to elucidate the chemical changes of the tested specimens compared with a new specimen.
Abstract: The evaluation of the residual reliability of large-sized
parallel computer interconnection systems is not practicable with
the existing methods. Under such conditions, one must resort to
approximation techniques that provide upper and lower bounds on
this reliability. In this context, a new approximation method
for providing bounds on residual reliability is proposed here. The
proposed method is supported by two algorithms for simulation
purposes. The bounds on the residual reliability of three different
categories of interconnection topologies are efficiently found using
the proposed method.
Abstract: Motivated by the impact of maps on enhancing the
perception of the quality of life in a region, this work examines the
use of spatial analytical techniques in exploring the role of space in
shaping human development patterns in the Assiut governorate.
Variations in the human development index (HDI) of the governorate's
villages, districts and cities are mapped using geographic information
systems (GIS). Global and local spatial autocorrelation measures are
employed to assess the levels of spatial dependency in the data and to
map clusters of human development. The results show prominent
disparities in HDI between the regions of Assiut. Strong patterns of
spatial association were found, proving the presence of clusters in the
distribution of HDI. Finally, the study indicates several "hot-spots" in
the governorate as areas for further investigation to explore the
attributes of such levels of human development. This is very
important for accomplishing the development plan for the poorest regions
currently adopted in Egypt.
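Global spatial autocorrelation of the kind described is commonly measured with Moran's I. The sketch below (a toy grid with rook adjacency, not the study's GIS workflow or the Assiut HDI data) shows the statistic distinguishing a clustered surface from a perfectly dispersed one:

```python
import numpy as np

def rook_weights(rows, cols):
    """Binary rook-adjacency weight matrix for a regular grid."""
    n = rows * cols
    W = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if r + 1 < rows:
                W[i, i + cols] = W[i + cols, i] = 1
            if c + 1 < cols:
                W[i, i + 1] = W[i + 1, i] = 1
    return W

def morans_I(x, W):
    """Global Moran's I: I = (n / sum(W)) * (z' W z) / (z' z)."""
    z = x - x.mean()
    return len(x) / W.sum() * (z @ W @ z) / (z @ z)

# Clustered surface (high values grouped together) -> strongly positive I
clustered = np.array([1., 1., 0., 0.,
                      1., 1., 0., 0.,
                      1., 1., 0., 0.,
                      1., 1., 0., 0.])
# Checkerboard (every neighbour differs) -> I = -1 on this grid
checker = np.array([1., 0., 1., 0.,
                    0., 1., 0., 1.,
                    1., 0., 1., 0.,
                    0., 1., 0., 1.])
W = rook_weights(4, 4)
```

Local variants (LISA) follow the same quadratic form per observation and are what typically drive the "hot-spot" maps the abstract mentions.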
Abstract: In most cases, natural disasters lead to the
necessity of evacuating people. The quality of evacuation
management is dramatically improved by the use of information
provided by decision support systems, which become indispensable
in case of large scale evacuation operations. This paper presents a
best practice case study. In November 2007, officers from the
Emergency Situations Inspectorate "Crisana" of Bihor County,
Romania, participated in a cross-border evacuation exercise in which
700 people were evacuated from the Netherlands to Belgium. One
of the main objectives of the exercise was to test four different
decision support systems. Afterwards, based on that experience, a
software system called TEVAC (Trans Border Evacuation) was
developed in house by the experts of this institution. This original
software system was successfully tested in September 2008, during
the international exercise EU-HUROMEX 2008, whose scenario
involved the real evacuation of 200 persons from Hungary to
Romania. Based on the lessons learned and the results, since
April 2009 the TEVAC software has been used by all Emergency
Situations Inspectorates across Romania.
Abstract: Using a neural network, we model an unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of a crisp neural network produce different values of the weight factors, which are directly affected by changes in various parameters. We propose that, for each neuron in the network, quasi-fuzzy weight sets (QFWS) can be obtained through repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where multivariate crisp input needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data is subject to noise and uncertainty, QFWS may help simplify such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
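As a minimal illustration of the QFWS idea (assuming a single sigmoid neuron and a made-up toy dataset, not the paper's network or data), one can collect the final weights from repeated trainings with different random initializations and summarize each connection as a triangular (min, median, max) fuzzy set:

```python
import numpy as np

def train_neuron(X, y, epochs=200, lr=0.05, seed=0):
    """One sigmoid neuron trained by gradient descent on squared error."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0, 1, X.shape[1])
    b = rng.normal()
    for _ in range(epochs):
        out = 1 / (1 + np.exp(-(X @ w + b)))
        grad = (out - y) * out * (1 - out)   # d(error)/d(pre-activation)
        w -= lr * X.T @ grad
        b -= lr * grad.sum()
    return w, b

# Hypothetical data: a noisy OR-like mapping, for illustration only.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 1.])

# Repeated simulations with different random initializations give a set of
# crisp weights per connection; (min, median, max) then forms a triangular
# quasi-fuzzy weight set for that connection.
runs = np.array([train_neuron(X, y, seed=s)[0] for s in range(20)])
qfws = [(runs[:, j].min(), np.median(runs[:, j]), runs[:, j].max())
        for j in range(runs.shape[1])]
```

The spread of each triple reflects the run-to-run weight variability the abstract describes, and the triples can seed a fuzzy neural network's initial weights.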
Abstract: As technology advances day by day, the problems
associated with it also increase. Several studies have been
carried out to investigate the safe deployment of waste materials
in geotechnical engineering in particular and civil engineering in
general. Different types of waste material, such as
cement dust, fly ash and slag, have been proven suitable in several
applications. In this research, cement dust mixed with different
percentages of sand is used in some civil engineering
applications, as explained later in this paper through field
and laboratory tests. The resulting mixture (waste material with sand)
proved to offer high performance, durability against environmental
conditions, low cost and high benefit. At a higher cement dust ratio, a
small cement ratio is valuable for compressive strength and
permeability, while at a small cement dust ratio a higher cement ratio
is valuable for compressive strength.
Abstract: With advances in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has employed WiMAX for many years as a high-speed wireless data link because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive military applications, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic environmental effects (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO on WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate at close proximity to the rocket production facility, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
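The certified HERO limits themselves come from the referenced Air Force guidance and are not reproduced here. Purely as an illustration of how an SSD relates power density to distance, the sketch below applies the generic far-field relation S = EIRP / (4 pi d^2) with made-up transmitter numbers (not the paper's WiMAX parameters or limits):

```python
import math

def power_density(eirp_w, d_m):
    """Far-field power density S = EIRP / (4*pi*d^2), in W/m^2."""
    return eirp_w / (4 * math.pi * d_m ** 2)

def safe_separation_distance(eirp_w, s_limit):
    """Distance at which the density falls to the permitted limit:
    d = sqrt(EIRP / (4*pi*S_limit))."""
    return math.sqrt(eirp_w / (4 * math.pi * s_limit))

# Illustrative numbers only (NOT the certified HERO limits):
# a 10 W transmitter with 16 dBi antenna gain -> EIRP ~ 398 W.
eirp = 10 * 10 ** (16 / 10)
d = safe_separation_distance(eirp, s_limit=1.0)  # assumed 1 W/m^2 limit
```

Beyond distance d the assumed density limit is met; an actual HERO assessment would use the ordnance-specific susceptibility levels from the governing standard rather than an assumed limit.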
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared with that of the method in the literature, and the former proved to outperform the latter.
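As a sketch of the statistics such a method ranks ICA components by (the ICA step itself is omitted here, and the signals are synthetic, not EEG), kurtosis and an order-2 Renyi entropy cleanly separate a spiky artifact-like component from an oscillatory source:

```python
import numpy as np

def kurtosis(x):
    """Fourth standardized moment (a Gaussian gives ~3; spiky signals much more)."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4)

def renyi_entropy(x, alpha=2, bins=64):
    """Order-alpha Renyi entropy from a normalized histogram:
    H_a = log(sum(p^a)) / (1 - a)."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
# Oscillatory "brain-like" component vs. a sparse spiky "artifact"
brain_like = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
artifact = np.zeros(t.size)
artifact[::200] = 8.0                       # sparse blink-like spikes
artifact += 0.1 * rng.normal(size=t.size)
```

Components with anomalously high kurtosis or anomalously low entropy are the ones flagged for rejection before the remaining components are projected back to the sensor space.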
Abstract: The inphase/quadrature (I/Q) amplitude and phase
imbalance effects are studied in coherent optical orthogonal
frequency division multiplexing (CO-OFDM) systems. An analytical
model for the I/Q imbalance is developed and supported by
simulation results. The results indicate that the I/Q imbalance degrades the BER performance considerably.
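One common widely-linear form of the receiver I/Q-imbalance model writes the distorted baseband signal as y = alpha*x + beta*conj(x), with alpha and beta determined by the amplitude imbalance g and phase imbalance phi. The sketch below uses this generic form with illustrative numbers; it is not the paper's analytical model or its simulation setup:

```python
import numpy as np

def iq_imbalance(x, g, phi):
    """Apply amplitude imbalance g and phase imbalance phi (radians) to a
    complex baseband signal: y = alpha*x + beta*conj(x)."""
    alpha = (1 + g * np.exp(-1j * phi)) / 2
    beta = (1 - g * np.exp(1j * phi)) / 2
    return alpha * x + beta * np.conj(x)

def image_rejection_ratio_db(g, phi):
    """IRR = |alpha|^2 / |beta|^2 in dB; the image term conj(x) is what
    degrades BER in CO-OFDM by leaking mirror subcarriers."""
    alpha = (1 + g * np.exp(-1j * phi)) / 2
    beta = (1 - g * np.exp(1j * phi)) / 2
    return 10 * np.log10(abs(alpha) ** 2 / abs(beta) ** 2)

# Illustrative: a 5% amplitude and 3-degree phase imbalance
irr = image_rejection_ratio_db(1.05, np.deg2rad(3.0))

tone = np.exp(2j * np.pi * 0.01 * np.arange(100))   # clean complex tone
balanced = iq_imbalance(tone, 1.0, 0.0)             # g=1, phi=0 -> unchanged
```

With perfect balance (g = 1, phi = 0) the image term vanishes; the finite IRR at realistic imbalances is what drives the BER degradation the abstract reports.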
Abstract: HSDPA is a new feature introduced in the
Release-5 specifications of the 3GPP WCDMA/UTRA standard to
realize higher data rates together with lower round-trip times.
Moreover, the HSDPA concept offers outstanding improvement of
packet throughput and also significantly reduces the packet call
transfer delay compared to the Release-99 DSCH. The HSDPA system
currently uses turbo coding, which is among the best coding techniques
for approaching the Shannon limit. However, the main drawbacks of turbo
coding are high decoding complexity and high latency, which make
it unsuitable for some applications such as satellite communications,
since the transmission distance itself introduces latency due to the
finite speed of light. Hence, this paper proposes using LDPC
coding in place of turbo coding for the HSDPA system, which decreases
the latency and decoding complexity. LDPC coding, however, increases the
encoding complexity. Although the transmitter complexity at the NodeB
increases, the end user benefits in terms of receiver complexity and
bit error rate. In this paper, the LDPC encoder is implemented using a
sparse parity-check matrix H to generate a codeword, and the belief
propagation algorithm is used for LDPC decoding. Simulation results
show that with LDPC coding the BER drops sharply as the number of
iterations increases, with only a small increase in Eb/No, which is not
possible with turbo coding. The same BER was also achieved using fewer
iterations, so the latency and receiver complexity are decreased with
LDPC coding. HSDPA increases the downlink data rate within a cell to a
theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes
that HSDPA enables include better quality and more reliable, more
robust data services. In other words, while realistic data rates are
only a few Mbps, the actual quality and number of users achieved
will improve significantly.
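The paper's decoder is soft-decision belief propagation; as a much simpler hard-decision relative of it, Gallager-style bit flipping on a toy parity-check matrix shows the same check-then-correct iteration. The matrix below is the (7,4) Hamming code's H, used purely for illustration; real LDPC matrices are far larger and sparser:

```python
import numpy as np

# Toy parity-check matrix (the (7,4) Hamming code's H, illustration only)
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def bit_flip_decode(H, r, max_iter=10):
    """Hard-decision bit flipping: repeatedly flip the bit that takes part
    in the most unsatisfied parity checks (a simplified relative of
    soft-decision belief propagation)."""
    c = r.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c                  # all parity checks satisfied
        votes = syndrome @ H          # unsatisfied-check count per bit
        c[np.argmax(votes)] ^= 1      # flip the most suspicious bit
    return c

sent = np.zeros(7, dtype=int)         # the all-zero codeword
received = sent.copy()
received[2] = 1                       # one channel bit error
decoded = bit_flip_decode(H, received)
```

Belief propagation replaces the hard flip votes with iteratively exchanged log-likelihood messages between bit and check nodes, which is what produces the sharp BER drop with iteration count that the abstract reports.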
Abstract: Generally, administrative systems in an academic
environment are disjoint and support independent queries. The
objective in this work is to semantically connect these independent
systems to provide support to queries run on the integrated platform.
The proposed framework, by enriching educational material in the
legacy systems, provides a value-added semantics layer where
activities such as annotation, query and reasoning can be carried out
to support management requirements. We discuss the development of
this ontology framework with a case study of UAE University
program administration to show how semantic web technologies can
be used by administration to develop student profiles for better
academic program management.
Abstract: This paper proposes a “soft systems" approach to
domain-driven design of computer-based information systems. We
propose a systemic framework combining techniques from Soft
Systems Methodology (SSM), the Unified Modelling Language
(UML), and an implementation pattern known as "Naked Objects".
We have used this framework in action research projects that have
involved the investigation and modelling of business processes using
object-oriented domain models and the implementation of software
systems based on those domain models. Within the proposed
framework, Soft Systems Methodology (SSM) is used as a guiding
methodology to explore the problem situation and to generate a
ubiquitous language (soft language) which can be used as the basis
for developing an object-oriented domain model. The domain model
is further developed using techniques based on the UML and is
implemented in software following the "Naked Objects"
implementation pattern. We argue that there are advantages from
combining and using techniques from different methodologies in this
way.
The proposed systemic framework is overviewed and justified as a
multimethodology, using Mingers' multimethodology ideas.
This multimethodology approach is being evaluated through a
series of action research projects based on real-world case studies. A
peer-tutoring case study is presented here as a sample of the
framework evaluation process.
Abstract: This paper unifies power optimization approaches in
various energy converters, such as: thermal, solar, chemical, and
electrochemical engines, in particular fuel cells. Thermodynamics
leads to the converter's efficiency and limiting power. Efficiency
equations serve to solve problems of upgrading and downgrading of
resources. While optimization of steady systems applies the
differential calculus and Lagrange multipliers, dynamic optimization
involves variational calculus and dynamic programming. In reacting
systems chemical affinity constitutes a prevailing component of an
overall efficiency, thus the power is analyzed in terms of an active
part of chemical affinity. The main novelty of the present paper in the
energy yield context consists in showing that the generalized heat
flux Q (involving the traditional heat flux q plus the product of
temperature and the sum of products of partial entropies and fluxes of
species) plays in complex cases (solar, chemical and electrochemical)
the same role as the traditional heat q in pure heat engines.
The presented methodology is also applied to power limits in fuel
cells as to systems which are electrochemical flow engines propelled
by chemical reactions. The performance of fuel cells is determined by
magnitudes and directions of participating streams and mechanism of
electric current generation. Voltage lowering below the reversible
voltage is a proper measure of a cell's imperfection. The voltage losses,
called polarization, include the contributions of three main sources:
activation, ohmic and concentration. Examples show power maxima
in fuel cells and prove the relevance of the extension of the thermal
machine theory to chemical and electrochemical systems. The main
novelty of the present paper in the FC context consists in introducing
an effective or reduced Gibbs free energy change between products p
and reactants s, which takes into account the decrease of voltage and
power caused by the incomplete conversion of the overall reaction.
Abstract: The reduction of single-input single-output (SISO) discrete systems into a lower-order model, using a conventional and an evolutionary technique, is presented in this paper. In the conventional technique, the combined advantages of the Modified Cauer Form (MCF) and differentiation are used. In this method, the original discrete system is first converted into an equivalent continuous system by applying the bilinear transformation. The denominator of the equivalent continuous system and its reciprocal are differentiated successively, and the reduced denominator of the desired order is obtained by combining the differentiated polynomials. The numerator is obtained by matching the quotients of the MCF. The reduced continuous system is converted back into a discrete system using the inverse bilinear transformation. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.
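As an illustration of the evolutionary step only (with a made-up second-order discrete system, not the paper's numerical example), a basic continuous PSO can minimize the ISE between the unit-step responses of the original system and a first-order reduced model:

```python
import numpy as np

def step_resp_2nd(b1, b0, a1, a0, n=60):
    """Step response of G(z) = (b1*z + b0) / (z^2 + a1*z + a0),
    simulated through its difference equation."""
    y = np.zeros(n)
    u = lambda k: 1.0 if k >= 0 else 0.0   # unit step input
    for k in range(1, n):
        y[k] = (-a1 * y[k - 1] - (a0 * y[k - 2] if k >= 2 else 0.0)
                + b1 * u(k - 1) + b0 * u(k - 2))
    return y

def step_resp_1st(b, a, n=60):
    """Step response of the reduced model R(z) = b / (z - a)."""
    y = np.zeros(n)
    for k in range(1, n):
        y[k] = a * y[k - 1] + b
    return y

# Hypothetical original system: G(z) = (0.2z + 0.1) / (z^2 - 1.1z + 0.3)
y_orig = step_resp_2nd(0.2, 0.1, -1.1, 0.3)

def ise(p):
    """Sum of squared step-response errors (discrete ISE)."""
    a, b = p
    a = np.clip(a, -0.999, 0.999)          # keep the reduced model stable
    return np.sum((y_orig - step_resp_1st(b, a)) ** 2)

def pso(cost, dim=2, n=30, iters=150, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Basic global-best continuous PSO."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n, dim))
    v = np.zeros((n, dim))
    pb, pbc = x.copy(), np.array([cost(p) for p in x])
    g, gc = pb[pbc.argmin()].copy(), pbc.min()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pb - x) + c2 * r2 * (g - x)
        x = x + v
        c = np.array([cost(p) for p in x])
        imp = c < pbc
        pb[imp], pbc[imp] = x[imp], c[imp]
        if c.min() < gc:
            g, gc = x[c.argmin()].copy(), c.min()
    return g, gc

best, best_cost = pso(ise)   # best = (pole a, numerator b) of R(z)
```

The particle position here is simply the (pole, numerator) pair of the reduced model; the paper's method applies the same ISE objective to its own model structures.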
Abstract: With the power system's movement toward restructuring, along with factors such as environmental pollution, the problems of transmission expansion, and advances in the construction technology of small generation units, it is expected that small units like wind turbines, fuel cells and photovoltaics, which mostly connect to distribution networks, will play an essential role in the electric power industry. With the increasing use of small generation units, the management of distribution networks should be reviewed. The goal of this paper is to present a new method for the optimal management of active and reactive power in distribution networks with regard to the costs of various types of dispersed generation, capacitors and the cost of electric energy obtained from the network. In other words, this method endeavors to select optimal sources of active and reactive power generation and control equipment, such as dispersed generators, capacitors, under-load tap-changer transformers and substations, so that, first, the related costs are minimized and, second, technical and physical constraints are respected. Because the optimal management of distribution networks is an optimization problem with continuous and discrete variables, a new evolutionary method based on the Ant Colony Algorithm has been applied. The simulation results of the method, tested on two cases containing 23 and 34 buses, are shown in later sections.
Abstract: Silver nano-particles have been used for antibacterial
purposes and are also believed to remove odorous compounds and to have
oxidation capacity as a metal catalyst. In this study, silver
nano-particles (5-30 nm) were prepared on the surface of
NaHCO3, the supporting material, using a sputtering method that
provided high silver content and minimized the conglomeration problems
observed in the common AgNO3 photo-deposition method. The silver
nano-particles were dispersed by dissolving Ag-NaHCO3 in water,
and the dispersed silver nano-particles in the aqueous phase were
applied to remove an inorganic odor compound, H2S, in a scrubbing
reactor. Hydrogen sulfide in the gas phase was rapidly removed by the
silver nano-particles, and the concentration of the sulfate (SO4^2-) ion
increased with time due to the oxidation reaction with silver as a
catalyst. Consequently, the experimental results demonstrated that
silver nano-particles in aqueous solution can be successfully
applied to remove odorous compounds without additional
energy sources or harmful byproducts.
Abstract: Cellular wireless communication systems are
subject to short- and long-term fading, yet the effect of the wireless
channel has largely been ignored in most teletraffic assessment
research. In this paper, a mathematical teletraffic model is proposed
to estimate the blocking and forced termination probabilities of cellular
wireless networks as a result of teletraffic behavior as well as
outages of the propagation channel. To evaluate the proposed
teletraffic model, gamma inter-arrival and general service time
distributions have been considered based on the wireless channel fading
effect. The performance is evaluated and compared with the classical
model. The proposed model is investigated under different
operational conditions, which consider not only the
arrival rate process but also different faded channel models.
Abstract: X-ray computed tomography (CT) is a well-established
visualization technique in
medicine and nondestructive testing. However, since CT scanning
requires sampling radiographic projections from different viewing
angles, common CT systems with mechanically moving parts are too
slow for dynamic imaging, for instance of multiphase flows or live
animals. A large number of X-ray projections are needed to
reconstruct CT images, so the collection and calculation of the
projection data consume too much time and are harmful to the patient. To
solve this problem, in this study we propose a
method for the tomographic reconstruction of a sample from a limited
number of X-ray projections using linear interpolation. In
simulation, we present a reconstruction from an experimental X-ray
CT scan of an aluminum phantom that follows two steps: the X-ray
projections are interpolated using the linear interpolation method, and
the result is used for CT reconstruction based on the Ordered Subsets
Expectation Maximization (OSEM) method.
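The interpolation step can be sketched as follows, assuming projections stored as an angles-by-detectors sinogram. The synthetic example is exactly linear in angle, so the interpolated views are exact here; real projections are not linear in angle, which is why this step is only an approximation that the subsequent OSEM reconstruction must tolerate:

```python
import numpy as np

def interpolate_projections(sino_sparse, angles_sparse, angles_dense):
    """Linearly interpolate a sparse sinogram (n_angles x n_detectors)
    to a denser set of view angles, independently per detector bin."""
    n_det = sino_sparse.shape[1]
    out = np.empty((len(angles_dense), n_det))
    for d in range(n_det):
        out[:, d] = np.interp(angles_dense, angles_sparse, sino_sparse[:, d])
    return out

# Synthetic sinogram whose value varies linearly with view angle
# (illustration only, not phantom data)
angles_sparse = np.array([0.0, 30.0, 60.0, 90.0])     # measured views
angles_dense = np.arange(0.0, 91.0, 15.0)             # desired views
sino_sparse = np.outer(angles_sparse, np.ones(5))     # 4 views x 5 detectors
sino_dense = interpolate_projections(sino_sparse, angles_sparse, angles_dense)
```

The densified sinogram then feeds the OSEM iterations in place of the missing measured projections, reducing both the scan time and the dose, as the abstract argues.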