Abstract: Owing to stringent emission legislation for diesel engines and increasing demands on fuel economy, the importance of detailed 3D simulation of fuel injection, mixing and combustion has grown in recent years. In the present work, the FIRE code has been used to model spray and mixture formation in detail in a Caterpillar heavy-duty diesel engine. The paper provides an overview of the implemented submodels, which account for liquid spray atomization, droplet secondary break-up, droplet collision, impingement, turbulent dispersion and evaporation. The simulation was performed from intake valve closing (IVC) to exhaust valve opening (EVO). The predicted in-cylinder pressure is validated against existing experimental data; the good agreement between predicted and experimental values supports the accuracy of the numerical predictions obtained in the present work. Engine emissions were also predicted, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained using the Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper illustrate that numerical simulation can be one of the most powerful and beneficial tools for internal combustion engine design, optimization and performance analysis.
Abstract: Clustering in high-dimensional space is a difficult problem that recurs in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition and data mining. In high-dimensional space some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters. Hence, the performance of clustering algorithms decreases.
In this paper, we propose an algorithmic framework that combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions of a high-dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
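The two-stage idea described above (drop irrelevant dimensions, then cluster) can be illustrated with a minimal sketch. Note the paper's rough-set reduct computation is replaced here by a simple variance filter, which is an assumption made only for illustration; the k-means step uses a deterministic farthest-point initialization.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Deterministic farthest-point initialization, then standard Lloyd steps.
    centroids = [X[0]]
    for _ in range(k - 1):
        dist = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[int(np.argmax(dist))])
    centroids = np.array(centroids, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def filter_then_cluster(X, k, keep_ratio=0.5):
    """Drop the least informative dimensions (lowest variance -- a simple
    stand-in for the paper's rough-set reduct), then run k-means."""
    n_keep = max(1, int(X.shape[1] * keep_ratio))
    keep = np.argsort(X.var(axis=0))[-n_keep:]
    labels, _ = kmeans(X[:, keep], k)
    return labels, keep

# Two clusters living in dimensions 0-1; dimensions 2-9 are pure noise.
rng = np.random.default_rng(1)
signal = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                    rng.normal(5.0, 0.3, (50, 2))])
X = np.hstack([signal, rng.normal(0.0, 0.05, (100, 8))])
labels, kept = filter_then_cluster(X, k=2, keep_ratio=0.2)
```

With the noise dimensions removed, the two planted clusters are recovered cleanly, whereas clustering on all ten dimensions would be dominated by near-equidistant points.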
Abstract: The Short Message Service (SMS) has grown in popularity over the years and has become a common means of communication. It is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to one another.
SMS is usually used to carry unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between a business and its clients. By default, SMS does not guarantee confidentiality and integrity of the message content.
In mobile communication systems, the security (encryption) offered by the network operator applies only to the wireless link; data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public-key cryptosystems.
The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user.
This paper presents an encryption mechanism based on an ID-based scheme using elliptic curves to provide end-to-end security for SMS. The mechanism has been implemented over the standard SMS network architecture, and its encryption overhead has been estimated and compared with that of an RSA scheme. The study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in scaling the security level of the mobile service.
Abstract: This paper presents the potential of smartphones to support the mapping of indoor assets. The advantage of using a smartphone to generate an indoor map is its ability to capture, store and reproduce still or video images; indeed, most of us carry this powerful gadget. Captured images are usually used by maintenance teams as records for future reference. Here, these images are used to generate 3D models of objects precisely and accurately, providing an efficient and effective solution for data gathering. Thus, they could serve as a resource for an informative asset-management database.
Abstract: Auckland has a temperate climate with comfortably warm, dry summers and mild, wet winters. Auckland house design should focus not only on winter thermal performance and indoor thermal conditions, but also on indoor moisture control, which is closely related to indirect health effects such as dust mites, fungi, etc. Most Auckland houses are designed to use temporary heating for winter indoor thermal comfort. Based on field data on the indoor microclimate conditions of two Auckland townhouses, one with a whole-home mechanical ventilation system and the other with a passive wind-directional skylight vent, this study evaluates and compares the indoor moisture conditions of two insulated townhouses that use only temporary heating with different ventilation systems.
Abstract: The equilibrium, thermodynamics and kinetics of the biosorption of Cd(II) and Pb(II) by a spore-forming Bacillus (MGL 75) were investigated under different experimental conditions. The Langmuir, Freundlich and Dubinin-Radushkevich (D-R) equilibrium adsorption models were applied to describe the biosorption of the metal ions by MGL 75 biomass. The Langmuir model fitted the equilibrium data better than the other models. Maximum adsorption capacities qmax for lead (II) and cadmium (II) from the Langmuir model were 158.73 mg/g and 91.74 mg/g, respectively. The values of the mean free energy determined with the D-R equation showed that the adsorption process is physisorption. The thermodynamic parameters, the changes in Gibbs free energy (ΔG°), enthalpy (ΔH°) and entropy (ΔS°), were also calculated, and their values indicated that the biosorption process was exothermic and spontaneous. The experimental data were also used to study biosorption kinetics using pseudo-first-order and pseudo-second-order kinetic models. Kinetic parameters, rate constants, equilibrium sorption capacities and the related correlation coefficients were calculated and discussed. The results showed that the biosorption of both metal ions was well described by pseudo-second-order kinetics.
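The Langmuir model referred to above can be written as q = qmax·b·Ce/(1 + b·Ce). A short worked sketch using the reported qmax values follows; the affinity constant b is an illustrative assumption, not a value from the study.

```python
def langmuir(Ce, q_max, b):
    """Langmuir isotherm: uptake q (mg/g) at equilibrium concentration Ce (mg/L)."""
    return q_max * b * Ce / (1.0 + b * Ce)

Q_MAX_PB = 158.73  # mg/g for Pb(II), reported in the study
Q_MAX_CD = 91.74   # mg/g for Cd(II), reported in the study
B = 0.05           # L/mg -- illustrative affinity constant (assumed)

# Uptake rises with concentration and saturates toward q_max.
uptakes = [langmuir(Ce, Q_MAX_PB, B) for Ce in (10.0, 100.0, 1000.0)]
```

The monotone saturation toward qmax is the defining behavior that distinguishes the Langmuir fit from the unbounded Freundlich form.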
Abstract: Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new, efficient context-aware supplier selection model that takes into account possible changes in the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques and draws on principles of online algorithms for optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for each decision-making sub-period, our model considers only the changes and superimposes them on the previously determined best set of suppliers to obtain a new best set. Any recomputation of unchanged elements of the environment is therefore avoided, and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it outperforms common static selection models in this field.
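The incremental idea above, updating only changed elements instead of re-running selection from scratch, can be sketched minimally. This is an illustrative cost-based top-k stand-in for the paper's clustering-based model: suppliers are kept sorted by cost, and an update touches only the changed entries.

```python
import bisect

class SupplierSelector:
    """Keeps suppliers sorted by cost so that a change touches only the
    changed entries, instead of re-ranking the whole environment."""
    def __init__(self, costs):
        self.costs = dict(costs)
        self.order = sorted((c, s) for s, c in self.costs.items())

    def best(self, k):
        """Current best (lowest-cost) k suppliers."""
        return [s for _, s in self.order[:k]]

    def update(self, changes):
        """Apply only the changed costs; unchanged suppliers are not revisited."""
        for s, c in changes.items():
            old = (self.costs[s], s)
            self.order.pop(bisect.bisect_left(self.order, old))  # remove old entry
            bisect.insort(self.order, (c, s))                    # insert new entry
            self.costs[s] = c

sel = SupplierSelector({"A": 5.0, "B": 3.0, "C": 7.0, "D": 4.0})
top_before = sel.best(2)   # ["B", "D"]
sel.update({"C": 1.0})     # only C's entry is moved
top_after = sel.best(2)    # ["C", "B"]
```

Each update costs O(m log n) for m changed suppliers rather than the O(n log n) of a full re-ranking, which mirrors the cost saving the model claims.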
Abstract: Predicting short-term wind speed is essential for protecting systems in operation from the effects of strong winds. It also helps in using wind energy as an alternative energy source, mainly for electrical power generation. Wind speed prediction has applications in military and civilian fields, e.g., air traffic control, rocket launch and ship navigation. Wind speed in the near future depends on the values of other meteorological variables, such as atmospheric pressure, moisture content, humidity and rainfall. The values of these parameters, obtained from a nearby weather station, are used to train various forms of neural networks. The trained neural network model is validated using a similar set of data and is then used to predict wind speed from the same meteorological information. This paper reports an artificial neural network model for short-term wind speed prediction that uses the back-propagation algorithm.
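The training scheme described above (meteorological inputs, a neural network fitted by back-propagation) can be sketched as follows. The data here are synthetic stand-ins for station measurements, and the single-hidden-layer architecture is an assumption made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: "wind speed" as a noisy function of two
# meteorological inputs (say, normalized pressure and humidity).
X = rng.uniform(-1, 1, (200, 2))
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)).reshape(-1, 1)

# One hidden layer of 8 tanh units, linear output, trained by back-propagation.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.3

losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)           # forward pass, hidden activations
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # Backward pass: propagate the output error through both layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)   # tanh derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The mean-squared error falls steadily over training, which is the behavior the validation step in the abstract checks on held-out data.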
Abstract: This policy-participation action research explores the roles of Thai government units during the 2010 fiscal year in creating added value for the recycling business in central Thailand. The research aims a) to study how the government plays a role in supporting the business, and the problems and obstacles in doing so, and b) to design strategic actions (short-, medium- and long-term plans) to create added value for the recycling business, particularly in local full-loop companies and organizations licensed by the Wongpanit Waste Separation Plant as well as those licensed by the Department of Provincial Administration. A mixed-method research design, i.e., a combination of quantitative and qualitative methods, is used in both the data collection and the analysis procedures. Quantitative data were analyzed using frequencies, percentages, mean scores and standard deviations, with the aim of identifying trends and generalizations. Qualitative data were collected via semi-structured and focus-group interviews to explore the in-depth views of the operators. The sample included 1,079 operators in eight provinces in central Thailand.
Abstract: Evaluation of curriculum quality, as one of the most important components of the university system, is necessary at different levels of higher education. The main purpose of this study was to survey the curriculum quality of the actuarial science field, taking as cases Shahid Beheshti University and the Eco Insurance Higher Education Institute, according to the viewpoints of students, alumni, employers and faculty members. Descriptive statistics (means, tables, percentages and frequency distributions) and inferential statistics (chi-square) were used to analyze the data. Six criteria were considered for curriculum quality: objectives, content, teaching and learning methods, space and facilities, time, and assessment of learning. The content, teaching and learning methods, space and facilities, and assessment-of-learning criteria were at a relatively desirable level, while the objectives and time criteria were at a desirable level. Overall, the quality of the actuarial science curriculum was at a relatively desirable level.
Abstract: Spatial trends are among the valuable patterns in geo-databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when moving spatially away from a start object. Spatial trend detection is a graph-search problem, so heuristic methods can provide good solutions. The artificial immune system (AIS) is a method for search and optimization: a novel evolutionary paradigm inspired by the biological immune system. Models based on immune-system principles, such as the clonal selection theory, the immune network model and the negative selection algorithm, have been finding increasing applications in science and engineering.
In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, and then select spatial trends with high affinity as antibodies. In an evolutionary process driven by the artificial immune algorithm, the affinity of low-affinity trends is increased through mutation until a stopping condition is satisfied.
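The clonal selection loop described above (select high-affinity antibodies, clone them, mutate the clones, keep the best) can be sketched generically. The scalar "trend" and its affinity function here are illustrative stand-ins for scoring a spatial trend along a neighborhood path.

```python
import random

def affinity(trend):
    """Stand-in affinity: peaks at trend = 3.0. In the paper this would
    score a candidate spatial trend along a neighborhood path."""
    return 1.0 / (1.0 + (trend - 3.0) ** 2)

def clonal_selection(pop_size=20, generations=50, clones_per=5, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=affinity, reverse=True)
        clones = []
        for ab in pop[: pop_size // 2]:
            # Hypermutation: mutation strength inversely related to affinity.
            sigma = 1.0 / (1.0 + 10.0 * affinity(ab))
            clones.extend(ab + rng.gauss(0.0, sigma) for _ in range(clones_per))
        # Keep the best antibodies among parents and clones.
        pop = sorted(pop + clones, key=affinity, reverse=True)[:pop_size]
    return pop[0]

best = clonal_selection()
```

The inverse relation between affinity and mutation strength is the characteristic CSA ingredient: poor antibodies explore widely while good ones are refined locally.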
Abstract: In this paper, the application of neural networks to the design of short-term load forecasting (STLF) systems for Illam province, in western Iran, is explored. An important neural network architecture, the Multi-Layer Perceptron (MLP), is used to model the STLF system. The MLP was trained and tested using three years (2004-2006) of data. The results show that the MLP network has the minimum forecasting error and can be considered a good method for modeling STLF systems.
Abstract: The size, complexity and number of databases used for protein information have caused bioinformatics to lag behind in adapting to the need to handle this distributed information. Integrating all the information from different databases into one database is a challenging problem. Our main research goal is to develop a tool that can be used to access and manipulate protein information from different databases. In our approach, we have integrated different databases, such as Swiss-Prot, PDB, InterPro and EMBL, and transformed these flat-file databases into relational form using XML and BioPerl. As a result, we show that this tool can search protein information of different sizes stored in a relational database, and that results can be retrieved faster than from a flat-file database. A web-based user interface is provided to allow users to access and search for protein information in the local database.
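The flat-file-to-relational step described above can be illustrated with a minimal sketch. This uses Python and SQLite in place of the paper's XML/BioPerl pipeline, and the record format and accessions are invented for illustration only.

```python
import sqlite3

# Toy flat-file records (illustrative format and accessions, not real entries).
flat_records = [
    "ID P12345 | DB Swiss-Prot | LEN 231",
    "ID Q67890 | DB PDB | LEN 415",
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE protein (acc TEXT PRIMARY KEY, db TEXT, length INTEGER)")
for rec in flat_records:
    # Parse "KEY value" fields separated by '|' into a dict, then insert.
    fields = dict(p.strip().split(None, 1) for p in rec.split("|"))
    conn.execute("INSERT INTO protein VALUES (?, ?, ?)",
                 (fields["ID"], fields["DB"], int(fields["LEN"])))

# Indexed relational lookups are cheap compared with scanning flat files.
length = conn.execute("SELECT length FROM protein WHERE acc = ?",
                      ("P12345",)).fetchone()[0]
```

Once the records are relational, the speed advantage the abstract reports comes from indexed lookups instead of linear scans of the flat files.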
Abstract: Wireless Sensor Networks (WSNs) are wireless networks consisting of a number of tiny, low-cost and low-power sensor nodes that monitor physical phenomena such as temperature, pressure, vibration, landslides and the presence of objects. The major limitation of these networks is their use of non-rechargeable batteries with a limited power supply, and the main cause of energy consumption in a WSN is the communication subsystem. This paper presents an efficient grid-formation/clustering strategy known as Grid-based level Clustering and Aggregation of Data (GCAD). The proposed clustering strategy is simple and scalable and uses a low-duty-cycle approach that keeps non-cluster-head (non-CH) nodes in sleep mode, thus reducing energy consumption. Simulation results demonstrate that the proposed GCAD protocol performs better on various performance metrics.
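The grid formation and cluster-head election idea can be sketched generically. This is not the GCAD protocol itself, just a minimal illustration of the common pattern it builds on: partition the field into cells and elect one cluster head per cell (here, by residual energy, an assumed criterion).

```python
from collections import defaultdict

def grid_clusters(nodes, cell_size):
    """Assign each node (x, y, energy) to a square grid cell and elect the
    highest-energy node of each cell as its cluster head (CH)."""
    cells = defaultdict(list)
    for node in nodes:
        x, y, _energy = node
        cells[(int(x // cell_size), int(y // cell_size))].append(node)
    # Non-CH members of a cell can sleep between reporting rounds.
    return {cell: max(members, key=lambda n: n[2]) for cell, members in cells.items()}

nodes = [(5, 5, 0.9), (8, 2, 0.4), (25, 30, 0.7), (28, 33, 0.8)]
heads = grid_clusters(nodes, cell_size=10)
```

Each cell aggregates locally and only the CH transmits, which is where the energy saving over all-nodes-transmit schemes comes from.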
Abstract: This paper applies Bayesian networks to support information extraction from unstructured, ungrammatical and incoherent data sources for semantic annotation. A tool has been developed that combines ontologies, machine learning, information extraction and probabilistic reasoning techniques to support the extraction process. Data acquisition is performed with the aid of knowledge specified in the form of an ontology. Because the amount of information available varies across data sources, the extracted data often contains missing values for certain variables of interest, and it is desirable in such situations to predict the missing values. The methodology presented in this paper first learns a Bayesian network from the training data and then uses it to predict missing data and to resolve conflicts. Experiments have been conducted to analyze the performance of the presented methodology. The results look promising, as the methodology achieves a high degree of precision and recall for information extraction and reasonably good accuracy for predicting missing values.
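The learn-then-predict step can be illustrated on the smallest possible fragment of a Bayesian network: two discrete variables, where the conditional probability table P(price_band | category) is learned from complete records by counting and a missing value is filled with its most probable state. The variable names and records are invented for illustration.

```python
from collections import Counter, defaultdict

# Complete training records extracted from data sources: (category, price_band).
train = [("laptop", "high"), ("laptop", "high"), ("laptop", "mid"),
         ("cable", "low"), ("cable", "low"), ("phone", "mid")]

# Learn the conditional probability table P(price_band | category) by counting.
cpt = defaultdict(Counter)
for category, band in train:
    cpt[category][band] += 1

def predict_missing(category):
    """Fill a missing price_band with its most probable value given category."""
    return cpt[category].most_common(1)[0][0]
```

A full Bayesian network extends this to many variables with a learned dependency structure, but the inference step, picking the most probable value given the observed ones, is the same in spirit.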
Abstract: The join dependency provides the basis for obtaining a lossless-join decomposition of a classical relational schema. The existence of a join dependency ensures that the tables always represent the correct data after being joined. Since classical relational databases cannot handle imprecise data, they were extended to fuzzy relational databases so that uncertain, ambiguous, imprecise and partially known information can also be stored in a formal way. However, like classical databases, fuzzy relational databases also undergo decomposition during normalization, and the issue of joining the decomposed fuzzy relations remains open. Our effort in the present paper is to address this issue. We define fuzzy join dependency in the framework of type-1 and type-2 fuzzy relational databases using the concept of fuzzy equality, which is defined through fuzzy functions. We use the fuzzy equi-join operator to compute the fuzzy equality of two attribute values. We also discuss the dependency-preservation property under this fuzzy equi-join and derive the necessary condition for fuzzy functional dependencies to be preserved when the decomposed fuzzy relations are joined. We further derive the conditions for a fuzzy join dependency to exist in the context of both type-1 and type-2 fuzzy relational databases. We find that, unlike in classical relational databases, even the existence of a trivial join dependency does not ensure a lossless-join decomposition in type-2 fuzzy relational databases. Finally, we derive the conditions for the fuzzy equality to be non-zero and for an attribute to qualify as a fuzzy key.
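The fuzzy equi-join idea can be sketched concretely. The linear membership function and threshold below are illustrative assumptions, not the fuzzy functions defined in the paper: two values are joined when their fuzzy equality degree reaches the threshold, and the degree is carried into the result.

```python
def fuzzy_eq(a, b, d=10.0):
    """Fuzzy equality of two numeric attribute values: 1.0 when identical,
    decaying linearly to 0.0 at distance d (an assumed membership function)."""
    return max(0.0, 1.0 - abs(a - b) / d)

def fuzzy_equi_join(r1, r2, attr1, attr2, threshold=0.7):
    """Join tuples whose attribute values are fuzzily equal to at least
    the given degree; the membership degree is kept with each result."""
    out = []
    for t1 in r1:
        for t2 in r2:
            mu = fuzzy_eq(t1[attr1], t2[attr2])
            if mu >= threshold:
                out.append((t1, t2, mu))
    return out

employees = [{"name": "a", "salary": 100}, {"name": "b", "salary": 50}]
grades    = [{"grade": "G1", "pay": 98}, {"grade": "G2", "pay": 70}]
joined = fuzzy_equi_join(employees, grades, "salary", "pay")
```

Unlike a crisp equi-join, tuples with nearly equal values (100 vs. 98) still match, with their degree of equality (0.8) recorded, which is exactly the behavior the fuzzy join dependency conditions in the paper have to account for.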
Abstract: The use of renewable energy sources is becoming increasingly necessary and attractive. Wider application of renewable energy devices at the domestic, commercial and industrial levels has resulted not only in greater awareness but also in significant installed capacity. Biomass, principally in the form of wood, is a form of energy that humans have used for a long time. Gasification is a process that converts solid carbonaceous fuel into combustible gas by partial combustion. Gasifier models involve various operating conditions, and the parameters kept in each model differ. This study used experimental data with three inputs (biomass consumption, temperature at the combustion zone and ash discharge rate) and one output (gas flow rate). A neural network was used to identify a model of the gasifier system suitable for the experimental data, and the results show that a neural network can indeed be used for this purpose.
Abstract: In this paper we study the use of a new code, called the Random Diagonal (RD) code, for Spectral Amplitude Coding (SAC) optical Code Division Multiple Access (CDMA) networks using Fiber Bragg Gratings (FBGs). An FBG consists of a fiber segment whose index of refraction varies periodically along its length. The RD code is constructed from a code level and a data level; one of its important properties is that the cross-correlation at the data level is always zero, which means that Phase-Induced Intensity Noise (PIIN) is reduced. We find that the performance of the RD code is better than that of the Modified Frequency Hopping (MFH) and Hadamard codes. It has been observed through experimental and theoretical simulation that the BER of the RD code is significantly better than that of the other codes. Proof-of-principle simulations of encoding with 3 channels and 10 Gbps data transmission have been successfully demonstrated, together with an FBG decoding scheme for canceling the code level from the SAC signal.
Abstract: The aim of this research is to use artificial neural network computing techniques to estimate the net heating value (NHV) of crude oil from its properties. The approach is based on training a neural network simulator, using back-propagation as the learning algorithm, on a predefined range of analytically generated test responses. A network with 8 neurons in one hidden layer was selected, and its predictions were in good agreement with experimental data.
Abstract: Load forecasting has always been an essential part of efficient power system operation and planning. A novel approach based on support vector machines is proposed in this paper for annual power load forecasting. Different kernel functions are selected to construct a combinatorial algorithm. The performance of the new model is evaluated on a real-world dataset and compared with two neural networks and some traditional forecasting techniques. The results show that the proposed method exhibits superior performance.
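The role of kernel choice emphasized above can be illustrated with a small sketch. Kernel ridge regression is used here as a simple stand-in for support vector regression (both are kernel methods, but this is not the paper's SVM model), and the "annual load" curve is synthetic.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def linear_kernel(A, B):
    return A @ B.T

def kernel_ridge(K, y, lam=1e-2):
    """Fit alpha = (K + lam*I)^-1 y and return in-sample predictions K @ alpha."""
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return K @ alpha

# Illustrative nonlinear "annual load" curve.
X = np.linspace(0.0, 3.0, 40).reshape(-1, 1)
y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 0]

mses = {}
for name, kern in (("rbf", rbf_kernel), ("linear", linear_kernel)):
    pred = kernel_ridge(kern(X, X), y)
    mses[name] = float(((pred - y) ** 2).mean())
```

On a nonlinear series the RBF kernel fits far better than the linear one, which is why comparing and combining kernels, as the paper does, matters for load forecasting.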