Abstract: In this study, a reformer model is simulated to use waste
natural gas from a refinery (Farashband refinery, Iran). In the
petroleum and allied sectors, natural gas is often encountered (in
the form of associated gas) without prior preparation for its
productive use, and its combustion in flares, the equipment through
which it is disposed of, has become a serious problem because of the
associated environmental damage from gaseous emissions. The proposed
model is used to produce syngas from waste natural gas. A detailed
steady-state model, described by a set of ordinary differential and
algebraic equations, was developed to predict the behavior of the
overall process. The proposed steady-state reactor model was
validated against recorded process data from a reformer synthesis
plant, and good agreement was achieved. The H2/CO ratio has an
important effect on the Fischer-Tropsch synthesis reactor product,
and we aim to achieve the target ratio through careful reformer
reactor design. We study different kinds of reformer reactors and
then select the autothermal reforming process of natural gas in a
fixed-bed reformer, which adjusts the H2/CO ratio by CO2 and H2O
injection. Finally, a strategy is proposed to prevent the venting of
excess natural gas to the atmosphere.
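As a rough illustration of how the syngas H2/CO ratio follows from the reforming chemistry, the sketch below computes the ratio from the stoichiometry of the three main autothermal-reforming reactions. The reaction extents used are illustrative assumptions, not values from the paper's reactor model.

```python
# Governing reactions (per mole of CH4 or CO converted):
#   Partial oxidation:  CH4 + 1/2 O2 -> CO  + 2 H2
#   Steam reforming:    CH4 + H2O    -> CO  + 3 H2
#   Water-gas shift:    CO  + H2O    -> CO2 + H2

def h2_co_ratio(x_pox, x_sr, x_wgs):
    """H2/CO ratio given molar extents of the three reactions."""
    h2 = 2.0 * x_pox + 3.0 * x_sr + 1.0 * x_wgs
    co = 1.0 * x_pox + 1.0 * x_sr - 1.0 * x_wgs
    return h2 / co

# Example: equal POX/SR split with a mild shift extent gives a ratio
# between 2 and 3, the range typically targeted for Fischer-Tropsch feed.
print(round(h2_co_ratio(0.5, 0.5, 0.1), 3))  # 2.889
```

Injecting CO2 or H2O shifts these extents, which is the lever the abstract describes for tuning the ratio.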
Abstract: Network reconfiguration in a distribution system is realized by changing the status of sectionalizing switches to reduce the power loss in the system. This paper presents a new method that applies an artificial bee colony (ABC) algorithm for determining the sectionalizing switches to be operated in order to solve the distribution system loss minimization problem. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover rate and mutation rate, as in the case of genetic algorithms and differential evolution, and these parameters are hard to determine a priori. The other advantage is that the global search ability of the algorithm is implemented by introducing a neighborhood source production mechanism, which is similar to the mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 14-, 33-, and 119-bus systems and compared with different approaches available in the literature. The proposed method outperformed the other methods in terms of solution quality and computational efficiency.
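The neighborhood source production step the abstract mentions can be sketched as follows on a generic continuous objective. The distribution-system loss model is not reproduced here, so a simple sphere function stands in for the loss objective; this is an illustration of the ABC perturbation-and-greedy-selection mechanic, not the paper's switch-selection encoding.

```python
import random

def sphere(x):                      # stand-in objective (minimize)
    return sum(v * v for v in x)

def neighbor(x, population):
    """Produce a candidate food source: perturb one dimension of x
    relative to a randomly chosen partner solution (mutation-like step)."""
    partner = random.choice(population)
    j = random.randrange(len(x))
    phi = random.uniform(-1.0, 1.0)
    y = list(x)
    y[j] = x[j] + phi * (x[j] - partner[j])
    return y

def greedy_select(x, y, f):
    """Keep the better of the current and candidate sources."""
    return y if f(y) < f(x) else x

random.seed(1)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
best0 = min(sphere(x) for x in pop)
for _ in range(200):                # employed-bee phase, repeated
    pop = [greedy_select(x, neighbor(x, pop), sphere) for x in pop]
best1 = min(sphere(x) for x in pop)
print(best1 <= best0)  # True: greedy selection never worsens the best source
```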
Abstract: The EDF (Earliest Deadline First) algorithm is a very important scheduling algorithm for real-time systems. The EDF algorithm assigns priorities to jobs according to their absolute deadlines and performs well when the real-time system is not overloaded. When the system is overloaded, many deadline misses occur, and these misses are not uniformly distributed: they usually concentrate on a few tasks. In this paper, we present an adaptive fuzzy control scheduling scheme based on the EDF algorithm. The improved algorithm achieves a uniform (rectangular) distribution of deadline-miss ratios among all real-time tasks when the system is overloaded. To evaluate the effectiveness of the improved algorithm, we have carried out extensive simulation studies. The simulation results show that the new algorithm is superior to the old one.
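The baseline EDF dispatch the paper builds on can be sketched as below: ready jobs sit in a heap keyed by absolute deadline and the earliest-deadline job runs first. The job representation and the non-preemptive overload behaviour shown are illustrative; the paper's fuzzy-control extension is not reproduced.

```python
import heapq

def edf_schedule(jobs):
    """jobs: list of (release, deadline, exec_time). Returns the indices
    of jobs that miss their deadlines under non-preemptive EDF."""
    jobs = sorted(enumerate(jobs), key=lambda j: j[1][0])  # by release
    ready, t, missed, i = [], 0, [], 0
    while i < len(jobs) or ready:
        while i < len(jobs) and jobs[i][1][0] <= t:
            idx, (r, d, c) = jobs[i]
            heapq.heappush(ready, (d, idx, c))
            i += 1
        if not ready:
            t = jobs[i][1][0]       # idle until the next release
            continue
        d, idx, c = heapq.heappop(ready)  # earliest absolute deadline
        t += c
        if t > d:
            missed.append(idx)
    return missed

# Two jobs fit; adding a third overloads the system and one deadline slips.
print(edf_schedule([(0, 4, 2), (0, 6, 2), (0, 5, 3)]))  # [1]
```

Note how the miss lands on a specific job: this is the non-uniform concentration of misses that the fuzzy-control extension is designed to even out.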
Abstract: This paper presents a methodology for the evaluation and
selection of a power plant based on its operational and economic
characteristics, using a graph-theoretic approach. A universal
evaluation index based on the operational and economic
characteristics of a plant is proposed, which evaluates and ranks the
various types of power plants. The index is obtained from the power
plant attributes digraph, developed by considering the operational
and economic attributes of the power plants and their relative
importance for smooth operation, installation, and commissioning, and
by prioritizing their selection. A sensitivity analysis of the
attributes has also been carried out to study their impact on the
desired outcome, i.e., the universal operational-economic index of
the power plant.
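In the graph-theoretic methodology, such an index is commonly computed as the permanent of the attributes matrix derived from the digraph (a determinant-like expansion in which all terms are added, so no attribute contribution cancels). The small matrix below, with diagonal entries as attribute scores and off-diagonal entries as relative-importance values, is an illustrative assumption, not the paper's data.

```python
from itertools import permutations

def permanent(m):
    """Permanent of a square matrix by direct expansion (fine for the
    small attribute matrices used in graph-theoretic evaluation)."""
    n = len(m)
    total = 0.0
    for p in permutations(range(n)):
        prod = 1.0
        for i in range(n):
            prod *= m[i][p[i]]
        total += prod
    return total

# Hypothetical 3-attribute plant: diagonal = attribute measures,
# off-diagonal = relative importance of attribute i over j.
plant = [[7.0, 0.6, 0.4],
         [0.4, 5.0, 0.7],
         [0.6, 0.3, 8.0]]
print(round(permanent(plant), 2))  # 284.89
```

Plants can then be ranked by this single scalar, and the sensitivity analysis amounts to observing how the permanent shifts as individual entries are perturbed.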
Abstract: 4G communication networks provide heterogeneous wireless
technologies to mobile subscribers through IP-based networks, and
users can enjoy high-speed access while roaming across multiple
wireless channels; this is made possible by an organized way of
managing the Quality of Service (QoS) functionalities in these
networks. This paper proposes a novel QoS optimization architecture
that assesses user requirements and, by knowing the peak times of
service utilization, can save bandwidth and cost. The proposed
architecture can be customized according to network usage priorities
so as to considerably improve a network's QoS performance.
Abstract: A real-time distributed computing system uses heterogeneous
networked computers to solve a single problem, so coordinating
activities among the computers is a complex task, and deadlines make
it more complex still. Performance depends on many factors such as
traffic workloads, database system architecture, underlying
processors, disk speeds, etc. A simulation study has been performed
to analyze performance under different transaction scheduling
settings: different workloads, arrival rates, priority policies,
varying slack factors, and a preemptive policy. The performance
metric of the experiments is the missed percent, the percentage of
transactions that the system is unable to complete. The throughput of
the system depends on the transaction arrival rate. Performance can
be enhanced by altering the slack factor value: tuning the slack
value of a transaction can help keep some transactions from being
killed or aborted. Under the preemptive policy, many extra executions
of new transactions can be carried out.
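The missed-percent metric is straightforward to state precisely: the share of transactions that finish after their deadlines. The slack-based deadline convention below (deadline = arrival + slack_factor × execution time) is a common one and an assumption here, not quoted from the paper.

```python
def missed_percent(transactions, slack_factor):
    """transactions: list of (arrival, exec_time, finish_time).
    A transaction misses if it finishes after arrival + slack * exec."""
    missed = sum(1 for a, c, f in transactions
                 if f > a + slack_factor * c)
    return 100.0 * missed / len(transactions)

# Hypothetical trace: one of three transactions overruns its deadline.
log = [(0, 2, 3), (1, 4, 10), (2, 1, 4)]
print(round(missed_percent(log, 2.0), 2))  # 33.33
```

Raising the slack factor loosens every deadline, which is exactly why altering the slack value reduces the number of killed or aborted transactions.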
Abstract: Intermittent connectivity breaks the “always on” network
assumption made by distributed query processing systems; in
modern-day systems, the absence of network connectivity is treated as
a fault. It might not be feasible to transmit all the data
accumulated since the last upload right away over the available
connection, and vital information may be delayed excessively when
less important information takes its place. Owing to the restricted
and uneven bandwidth, it is vital that mobile nodes make the most
advantageous use of connectivity when it arrives. Hence, some form of
data prioritization is essential to select the data that should be
transmitted first. This paper proposes a continuous query processing
system for intermittently connected mobile networks that comprises a
delay-tolerant continuous query processor distributed across the
mobile hosts. In addition, a mechanism for prioritizing query results
has been designed that guarantees enhanced accuracy and reduced
delay. Extensive simulation results illustrate that our architecture
reduces client power consumption and increases query efficiency.
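The kind of result prioritization the architecture calls for can be sketched with a priority queue: buffered query results are drained highest-priority first when a connection window opens. The priority function below (importance weighted by age) is an assumption for illustration, not the paper's mechanism.

```python
import heapq

def enqueue(buffer, result, importance, age):
    # heapq is a min-heap, so push the negated priority.
    heapq.heappush(buffer, (-(importance + 0.1 * age), result))

def drain(buffer, capacity):
    """Send up to `capacity` results over the available connection,
    highest priority first."""
    sent = []
    while buffer and len(sent) < capacity:
        sent.append(heapq.heappop(buffer)[1])
    return sent

buf = []
enqueue(buf, "temperature alert", importance=9, age=1)
enqueue(buf, "routine reading", importance=2, age=5)
enqueue(buf, "battery warning", importance=7, age=0)
# A short connection window fits only two results; the vital ones go first.
print(drain(buf, 2))  # ['temperature alert', 'battery warning']
```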
Abstract: The objective of this study is to identify the factors
that influence the online purchasing loyalty for Thai herbal products.
Survey research is used to gather data from Thai herb online
merchants to assess factors that have impacts on enhancing loyalty.
Data were collected from 300 online customers who had experience
in online purchasing of Thai Herbal products. Prior experience
consists of data from previous usage of online herbs, herb purchase
and internet usage. E-Quality data consists of information quality,
system quality, service quality and the product quality of Thai herbal
products sold online. The results suggest that prior experience,
E-Quality, attitude toward purchase, and trust in the online merchant
have major impacts on loyalty. A positive attitude and the E-Quality
of purchasing Thai herbal products online are the most significant
determinants of loyalty.
Abstract: In an era of intense competition, understanding and
satisfying customers' requirements are critical tasks for a company
seeking to make a profit. Customer relationship management (CRM) has
thus become an important business issue. With the help of data mining
techniques, managers can explore and analyze large quantities of data
to discover meaningful patterns and rules. Among these techniques,
the well-known association rule is the most commonly used. This paper
builds on the Apriori algorithm and combines genetic algorithms with
a data mining method to discover fuzzy classification rules. The
mined results can be applied in CRM to help decision makers make
correct business decisions for marketing strategies.
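The support and confidence computations at the heart of Apriori can be sketched on a toy transaction set as below; the paper's GA-driven fuzzy-rule layer is not reproduced here, and the basket data is illustrative.

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in itemset."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Conf(A -> B) = supp(A ∪ B) / supp(A)."""
    return (support(transactions, antecedent | consequent)
            / support(transactions, antecedent))

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
print(support(baskets, {"bread", "milk"}))                 # 0.5
print(round(confidence(baskets, {"bread"}, {"milk"}), 3))  # 0.667
```

Apriori prunes the candidate itemsets using exactly this support measure (any superset of an infrequent itemset is also infrequent); the genetic algorithm then searches over fuzzy membership functions for the rules mined this way.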
Abstract: The fact that a traditional food safety system is
inadequate in the absence of a food safety culture has recently
become a cause of concern for food safety professionals and other
stakeholders. Focusing on the implementation of a traditional food
safety system, i.e., HACCP prerequisite programs and HACCP, without
the presence of a food safety culture in the food industry has led to
the processing, marketing, and distribution of contaminated foods.
The results are regular outbreaks of foodborne illnesses and recalls
of foods from retail outlets, with serious consequences for consumers
and manufacturers alike. This article considers the importance of
food safety culture and cases of outbreaks and recalls that occurred
when companies did not make food safety culture a priority. Most
importantly, the food safety cultures of some food companies in South
Africa were assessed from responses to questionnaires completed by
food safety and food industry professionals in Durban, South Africa.
The article concludes by recommending that food industry employees
and employers alike take food safety culture seriously.
Abstract: Linear methods of heart rate variability analysis, such as
non-parametric methods (e.g., fast Fourier transform analysis) and
parametric methods (e.g., autoregressive modeling), have become
established non-invasive tools for assessing cardiac health, but
their sensitivity and specificity were found to be lower than
expected with positive predictive value
Abstract: Soil organic carbon (SOC) plays a key role in soil
fertility, hydrology, contaminants control and acts as a sink or source
of terrestrial carbon content that can affect the concentration of
atmospheric CO2. SOC supports the sustainability and quality of
ecosystems, especially in semi-arid regions. This study was conducted
to determine the relative importance of 13 different exploratory
climatic, soil, and geometric factors for the SOC content in one of
the semi-arid watershed zones of Iran. Two methods, canonical
discriminant analysis (CDA) and feed-forward back-propagation neural
networks, were used to predict SOC. Stepwise regression and
sensitivity analysis were performed to identify the relative
importance of the exploratory variables. Results from the sensitivity
analysis showed that the 7-2-1 neural network and the 5-input CDA
model have the highest predictive ability, explaining 70% and 65% of
the SOC variability, respectively. Since the neural network models
outperformed the CDA model, they should be preferred for estimating
SOC.
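A one-at-a-time sensitivity analysis of the kind used to rank the exploratory variables can be sketched as below: each input of a fitted model is perturbed and the change in predicted SOC recorded. The tiny 7-2-1 network here has random (untrained) weights and stands in for the fitted model; it is an illustration of the procedure, not the study's model.

```python
import math, random

def forward(x, w1, b1, w2, b2):
    """7-2-1 feed-forward network with tanh hidden units."""
    h = [math.tanh(sum(w1[j][i] * x[i] for i in range(7)) + b1[j])
         for j in range(2)]
    return sum(w2[j] * h[j] for j in range(2)) + b2

random.seed(3)
w1 = [[random.gauss(0, 1) for _ in range(7)] for _ in range(2)]
b1 = [random.gauss(0, 1) for _ in range(2)]
w2 = [random.gauss(0, 1) for _ in range(2)]
b2 = random.gauss(0, 1)

x0, eps, sens = [0.5] * 7, 1e-3, []
for i in range(7):
    xp = list(x0)
    xp[i] += eps                       # perturb one factor at a time
    sens.append(abs(forward(xp, w1, b1, w2, b2)
                    - forward(x0, w1, b1, w2, b2)) / eps)
ranking = sorted(range(7), key=lambda i: -sens[i])
print(len(ranking))  # 7: factors ordered by influence on predicted SOC
```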
Abstract: In many data mining applications, it is a priori known
that the target function should satisfy certain constraints imposed
by, for example, economic theory or a human-decision maker. In this
paper we consider partially monotone prediction problems, where the
target variable depends monotonically on some of the input variables
but not on all. We propose a novel method to construct prediction
models in which monotone dependencies on some of the input variables
are preserved by construction. Our method belongs to the class of
mixture models. The basic idea is to convolute monotone neural
networks with weight (kernel) functions to make predictions. Using
simulation and real case studies, we demonstrate the application of
our method. To obtain a sound assessment of the performance of our
approach, we use standard
neural networks with weight decay and partially monotone linear
models as benchmark methods for comparison. The results show that
our approach outperforms partially monotone linear models in terms
of accuracy. Furthermore, the incorporation of partial monotonicity
constraints not only leads to models that are in accordance with the
decision maker's expertise, but also reduces considerably the model
variance in comparison to standard neural networks with weight
decay.
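A standard device for making a network monotone in selected inputs is to force the weights on every path from those inputs to the output to be nonnegative. The tiny one-hidden-layer sketch below enforces this by squaring the constrained weights; it illustrates the monotonicity-by-construction idea, under assumptions of ours, and is not the paper's convolution-of-networks mixture model.

```python
import math, random

def mono_net(x_mono, x_free, w1m, w1f, b1, w2, b2):
    """One hidden layer; non-decreasing in x_mono because w1m and w2
    enter as squares (hence >= 0) and the sigmoid is increasing."""
    h = [1.0 / (1.0 + math.exp(-(w1m[j] ** 2 * x_mono
                                 + w1f[j] * x_free + b1[j])))
         for j in range(len(b1))]
    return sum(w2[j] ** 2 * h[j] for j in range(len(b1))) + b2

random.seed(0)
w1m = [random.gauss(0, 1) for _ in range(4)]
w1f = [random.gauss(0, 1) for _ in range(4)]
b1  = [random.gauss(0, 1) for _ in range(4)]
w2  = [random.gauss(0, 1) for _ in range(4)]

# The output never decreases as the monotone input grows, for any x_free,
# regardless of the (random, untrained) weight values.
ys = [mono_net(x / 10.0, 0.3, w1m, w1f, b1, w2, 0.0) for x in range(-20, 21)]
print(all(a <= b for a, b in zip(ys, ys[1:])))  # True
```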
Abstract: The application of neural networks to disease diagnosis has
made great progress and is widely used by physicians. An
electrocardiogram carries vital information about heart activity, and
physicians use this signal for cardiac disease diagnosis, which was
the main motivation for our study. In our work, the extracted
tachycardia features are used for the training and testing of a
neural network. In this study we use fuzzy probabilistic neural
networks as an automatic technique for ECG signal analysis. Because
every real signal recorded by the equipment can contain different
artifacts, some preprocessing steps are needed before feeding it to
our system. The wavelet transform is used to extract the
morphological parameters of the ECG signal. The results across a
variety of arrhythmias show that the presented approach is superior
to previously presented algorithms, with an average accuracy of about
95% for more than 7 tachyarrhythmias.
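One level of a discrete wavelet transform, of the kind used to extract morphological parameters from the ECG, can be sketched with the Haar wavelet. The wavelet family and decomposition depth actually used in the study are not specified, so Haar is an illustrative stand-in, and the sample values are a toy beat.

```python
import math

def haar_step(signal):
    """One DWT level: approximation (low-pass) and detail (high-pass)
    coefficients from adjacent sample pairs."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

ecg = [0.0, 0.1, 0.9, 1.0, 0.2, 0.1, 0.0, 0.0]   # toy beat samples
a, d = haar_step(ecg)
print(len(a), len(d))  # 4 4
```

The approximation coefficients capture the slow morphology (P-QRS-T shape) while the detail coefficients localize sharp transitions, which is why such coefficients serve as classifier features.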
Abstract: Emerging Bio-engineering fields such as Brain
Computer Interfaces, neuroprosthesis devices, and the modeling and
simulation of neural networks have led to increased research activity
in algorithms for the detection, isolation and classification of Action
Potentials (AP) from noisy data trains. Current techniques in the
field of 'unsupervised, no-prior-knowledge' biosignal processing
include energy operators, wavelet detection, and adaptive
thresholding. These tend to be biased towards larger AP waveforms;
APs may be missed owing to deviations in spike shape and frequency,
and correlated noise spectra can cause false detections. Such
algorithms also tend to incur large computational expense.
A new signal detection technique based upon the ideas of phase-space
diagrams and trajectories is proposed, using a delayed copy of the AP
to highlight discontinuities relative to
background noise. This idea has been used to create algorithms that
are computationally inexpensive and address the above problems.
Distinct AP have been picked out and manually classified from
real physiological data recorded from a cockroach. To facilitate
testing of the new technique, an Auto Regressive Moving Average
(ARMA) noise model has been constructed based upon the background
noise of the recordings. Along with the AP classification means, this
model enables generation of realistic neuronal data sets at arbitrary
signal to noise ratio (SNR).
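Generating synthetic background noise from an ARMA(p, q) model can be sketched as below; the coefficients are illustrative assumptions, not those fitted to the cockroach recordings.

```python
import random

def arma_noise(n, ar, ma, sigma=1.0, seed=7):
    """x[t] = sum_i ar[i]*x[t-1-i] + e[t] + sum_j ma[j]*e[t-1-j],
    with e[t] ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x, e = [], []
    for t in range(n):
        e.append(rng.gauss(0.0, sigma))
        v = e[-1]
        for i, a in enumerate(ar):
            if t - 1 - i >= 0:
                v += a * x[t - 1 - i]
        for j, m in enumerate(ma):
            if t - 1 - j >= 0:
                v += m * e[t - 1 - j]
        x.append(v)
    return x

noise = arma_noise(1000, ar=[0.6, -0.2], ma=[0.3])
print(len(noise))  # 1000
```

Scaling an AP template and adding it to such a noise trace at a chosen amplitude is what allows test data sets to be generated at an arbitrary SNR.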
Abstract: As is known, one of the priority research directions in the
natural sciences is putting applied branches of contemporary
mathematics, such as approximate and numerical methods for solving
integral equations, into practice. Integral equations arise in the
study of many natural phenomena, and quadrature methods are mainly
applied to solve them numerically. Taking into account certain
deficiencies of quadrature methods for finding the solution of
integral equations, some scientists have suggested multistep methods
with constant coefficients. Unlike those papers, here we consider the
application of hybrid methods to the numerical solution of the
Volterra integral equation. The efficiency of the suggested method is
proved, and a concrete method with accuracy order p = 4 is
constructed. This method is more precise than the corresponding known
methods.
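For concreteness, the baseline quadrature approach the hybrid methods improve upon can be sketched with the trapezoidal rule applied to a Volterra equation of the second kind, y(t) = f(t) + ∫₀ᵗ K(t,s) y(s) ds. This is the classical order-2 scheme, not the order-4 hybrid method of the paper; the test problem is a standard one with known solution.

```python
import math

def volterra_trapezoid(f, K, t_end, n):
    """Trapezoidal quadrature for y(t) = f(t) + ∫_0^t K(t,s) y(s) ds
    on a uniform grid; solves for y_i at each step."""
    h = t_end / n
    t = [i * h for i in range(n + 1)]
    y = [f(t[0])]
    for i in range(1, n + 1):
        s = 0.5 * K(t[i], t[0]) * y[0]
        s += sum(K(t[i], t[j]) * y[j] for j in range(1, i))
        y.append((f(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i])))
    return t, y

# Test problem with known solution y(t) = e^t:  y(t) = 1 + ∫_0^t y(s) ds
t, y = volterra_trapezoid(lambda t: 1.0, lambda t, s: 1.0, 1.0, 100)
print(abs(y[-1] - math.e) < 1e-3)  # True
```

The trapezoidal scheme's error decays like h²; the appeal of the hybrid method is precisely that it raises this accuracy order to p = 4.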
Abstract: One of the key research issues in wireless sensor networks (WSNs) is how to efficiently deploy sensors to cover an area. In this paper, we present a Fishnet Based Dispatch Scheme (FiBDS) with energy-aware mobility and interest-based sensing angle. We propose two algorithms: a FiBDS centralized algorithm and a FiBDS distributed algorithm. The centralized algorithm is designed specifically for non-time-critical applications, commonly known as non-real-time applications, while the distributed algorithm is designed specifically for time-critical applications, commonly known as real-time applications. The proposed dispatch scheme works in a phase-selection manner: in each phase a specific constraint is addressed according to the specified priority before moving on to the next phase, and at the end of each phase only the nodes best suited for that phase are chosen. Simulation results are presented to verify the effectiveness of both algorithms.
Abstract: We present a method to create special domain
collections from news sites. The method only requires a single
sample article as a seed. No prior corpus statistics are needed and the
method is applicable to multiple languages. We examine various
similarity measures and the creation of document collections for
English and Japanese. The main contributions are as follows. First,
the algorithm can build special domain collections from as little as
one sample document. Second, unlike other algorithms it does not
require a second “general” corpus to compute statistics. Third, in our
testing the algorithm outperformed others in creating collections
made up of highly relevant articles.
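One of the similarity measures examined, cosine similarity over raw term-frequency vectors, fits the single-seed setting because it needs no corpus-level statistics such as IDF, matching the "no second corpus" constraint. The toy articles below are illustrative assumptions.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity of two documents via term-frequency vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

seed = "central bank raises interest rates to curb inflation"
candidates = [
    "bank holiday schedule announced for the summer",
    "interest rates climb again as the central bank fights inflation",
    "local team wins the championship final",
]
# Grow the collection by pulling in the article most similar to the seed.
best = max(candidates, key=lambda doc: cosine(seed, doc))
print(best.startswith("interest rates climb"))  # True
```

Iterating this step, with newly accepted articles joining the query side, is the basic loop by which a special domain collection grows from a single sample document.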
Abstract: From a set of shifted, blurred, and decimated images, super-resolution image reconstruction can recover a high-resolution image, so it has become an active research branch in the field of image restoration. In general, super-resolution image restoration is an ill-posed problem. Prior knowledge about the image can be incorporated to make the problem well-posed, which leads to various regularization methods. In current regularization methods, however, the regularization parameter is in some cases selected by experience, while other techniques incur too heavy a computational cost in computing the parameter. In this paper, we construct a new super-resolution algorithm by transforming the solution of the original system into the solution of the matrix equation X + A*X^-1 A = I, and propose an inverse iterative method.
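The inverse iteration for an equation of the form X + A*X^-1 A = I can be sketched as the fixed-point scheme X_{k+1} = I - Aᵀ X_k⁻¹ A starting from X₀ = I, shown below on a small real 2x2 example. The matrix A is an illustrative assumption; the paper's super-resolution operator and exact iteration are not reproduced.

```python
def mat_mul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_inv2(M):
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def transpose(M):
    return [[M[j][i] for j in range(2)] for i in range(2)]

def solve_xe(A, iters=50):
    """Fixed-point iteration X_{k+1} = I - A^T X_k^{-1} A, X_0 = I."""
    I = [[1.0, 0.0], [0.0, 1.0]]
    X = [row[:] for row in I]
    for _ in range(iters):
        T = mat_mul(transpose(A), mat_mul(mat_inv2(X), A))
        X = [[I[i][j] - T[i][j] for j in range(2)] for i in range(2)]
    return X

A = [[0.2, 0.1], [0.0, 0.3]]
X = solve_xe(A)
# Residual of X + A^T X^{-1} A - I should be tiny after convergence.
R = mat_mul(transpose(A), mat_mul(mat_inv2(X), A))
res = max(abs(X[i][j] + R[i][j] - (1.0 if i == j else 0.0))
          for i in range(2) for j in range(2))
print(res < 1e-8)  # True
```

For A of small norm, the map is a contraction near the identity, which is why this simple iteration converges; the 50 iterations here are more than enough for machine precision.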
Abstract: IEEE 802.11e, an enhanced version of the 802.11 WLAN standards, incorporates Quality of Service (QoS), which makes it a better choice for multimedia and real-time applications. In this paper we study various aspects of the 802.11e standard. Further, the analysis results for this standard are compared with the legacy 802.11 standard. Simulation results show that IEEE 802.11e outperforms legacy IEEE 802.11 in terms of quality of service due to its flow-differentiated channel allocation and better queue management architecture. We also propose a method to improve the unfair allocation of bandwidth between downlink and uplink channels by varying the medium access priority level.