Abstract: An end-member selection method for spectral unmixing based on Particle Swarm Optimization (PSO) is developed in this paper. The algorithm combines K-means clustering with dynamic selection of end-member subsets to find an appropriate set of end-members for a given set of multispectral images. The proposed algorithm has been successfully applied to test image sets from various platforms, such as LANDSAT 5 MSS and NOAA's AVHRR, and the experimental results are encouraging. The influence of different values of the algorithm's control parameters on performance is studied, and the performance of different versions of PSO is also investigated.
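For readers unfamiliar with PSO, the velocity and position update at the heart of such an algorithm can be sketched as follows. This is a generic minimal sketch with an illustrative objective, not the authors' end-member selection code; the inertia and acceleration values are assumptions of the kind whose influence the paper studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Illustrative objective; the paper's objective is a spectral unmixing error."""
    return float(np.sum(x ** 2))

# Control parameters of the kind studied: inertia w, cognitive c1, social c2.
n_particles, dim, iters = 20, 5, 100
w, c1, c2 = 0.7, 1.5, 1.5

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()                                  # per-particle best positions
pbest_val = np.array([sphere(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()            # swarm-wide best position

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(sphere(gbest))  # near zero after convergence
```

In an end-member selection setting, each particle would encode a candidate subset of end-members and the objective would score the unmixing reconstruction error.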
Abstract: Enzymatic saccharification of biomass for reducing
sugar production is one of the crucial processes in biofuel production
through biochemical conversion. In this study, enzymatic
saccharification of dilute potassium hydroxide (KOH) pre-treated
Tetraselmis suecica biomass was carried out using cellulase
obtained from Trichoderma longibrachiatum. Initially, the
pre-treatment conditions were optimised by varying the alkali reagent
concentration, reaction retention time, and temperature. The T.
suecica biomass after pre-treatment was also characterized using
Fourier Transform Infrared spectroscopy and Scanning Electron
Microscopy. These analyses revealed that functional groups such as
acetyl and hydroxyl groups, as well as the structure and surface of
the T. suecica biomass, were changed through pre-treatment, which is
favourable for the enzymatic saccharification process. Comparison of
enzymatic saccharification of untreated and pre-treated microalgal
biomass indicated that a higher level of reducing sugar can be obtained from
pre-treated T. suecica. Enzymatic saccharification of pre-treated T.
suecica biomass was optimised by changing temperature, pH, and
enzyme-to-solid concentration ratio ([E]/[S]). The highest conversion
of carbohydrate into reducing sugar, 95%, corresponding to a reducing
sugar yield of 20 wt% from pre-treated T. suecica, was obtained from
saccharification at a temperature of 40°C, pH 4.5, and an [E]/[S] of 0.1
after 72 h of incubation. The hydrolysate obtained from enzymatic
saccharification of pre-treated T. suecica biomass was further
fermented into biobutanol using Clostridium saccharoperbutylacetonicum
as the biocatalyst. The results of this study demonstrate a positive
prospect of application of dilute alkaline pre-treatment to enhance
enzymatic saccharification and biobutanol production from
microalgal biomass.
Abstract: A Mobile Ad-hoc Network (MANET) is a collection of self-configuring, rapidly deployable mobile nodes (routers) without any central infrastructure. Routing is one of its key challenges: many routing protocols have been reported, but it is difficult to decide which one is best in all scenarios. In this paper, the on-demand routing protocols DSR and DYMO, running over the IEEE 802.11 DCF MAC protocol, are examined, and a summary of their characteristics is presented. Their performance is analyzed and compared using the metrics of throughput, packets dropped due to non-availability of routes, duplicate RREQs generated for route discovery, and normalized routing load, by varying the CBR data traffic load in the QualNet 5.0.2 network simulator.
Abstract: This paper studies the mean square exponential synchronization problem for a class of stochastic neutral-type chaotic neural networks with mixed delays. On the basis of Lyapunov stability theory, sufficient conditions ensuring the mean square exponential synchronization of two identical chaotic neural networks are obtained using stochastic analysis and inequality techniques. These conditions are expressed as linear matrix inequalities (LMIs), whose feasibility can easily be checked with the Matlab LMI Toolbox. The feedback controller used in this paper is more general than those used in the previous literature. A simulation example demonstrates the effectiveness of the derived results.
Abstract: Data envelopment analysis (DEA) has gained great popularity in environmental performance measurement because it can provide a synthetic, standardized environmental performance index when pollutants are suitably incorporated into the traditional DEA framework. Since some environmental performance indicators cannot be controlled by company managers, the model must be developed so that it can be applied when discretionary and/or non-discretionary factors are involved. In this paper, we present a semi-radial DEA approach to measuring environmental performance that accommodates non-discretionary factors. The model is then applied to a real case.
Abstract: The design and development of information systems has undergone a variety of phases and stages, evolving in response to brisk changes in user requirements and business needs. Meeting these requirements and needs calls for flexible and agile business solutions that keep pace with the latest business trends and styles. Another obstacle to the agility of information systems has been the markedly different treatment given to two facets of the same concern: business processes and information services. Since the emergence of information technology, business processes and information systems have become counterparts, yet these two business halves have been governed by entirely different standards. There is a need to align the boundaries of these two pillars, which equally share an information system's burdens and liabilities. In the last decade, object orientation has evolved into one of the major solutions for modern business needs; SOA is now the solution for moving business onto an electronic platform, and BPM is another modern business solution that helps regularize the optimization of business processes. This paper discusses how object orientation can be applied to incorporate or embed SOA in BPM for improved information systems.
Abstract: Every organization is continually exposed to new damages and threats, which can result from its operations or from the pursuit of its goals. Methods of securing the workspace and its tools have changed considerably with the increasing application and development of information technology (IT). From this viewpoint, information security management systems evolved to build on proven methods and to avoid repeating past failures. In general, a correct response in information security management systems requires correct decision making, which in turn requires a comprehensive effort by managers and everyone involved in each plan or decision. Obviously, not all aspects of a task or decision are defined under every decision-making condition; therefore, possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence those decisions. An investigation of different approaches in the field of risk management demonstrates their progression from quantitative to qualitative methods with a process approach.
Abstract: This paper investigates the implementation of security
mechanisms in object-oriented database systems. Formal methods
play an essential role in computer security due to their powerful
expressiveness and concise syntax and semantics. In this paper, the
issues of both specification and implementation in a database
security environment are considered, and database security is
achieved through the development of an efficient implementation of
the specification without compromising its originality and expressiveness.
Abstract: Knowledge sharing in general, and contextual
access to knowledge in particular, still represent a key challenge in
the knowledge management framework. Researchers in the semantic
web and human-machine interface fields study techniques to enhance
this access. For instance, in the semantic web, information retrieval
is based on domain ontologies, while in human-machine interfaces,
keeping track of the user's activity provides elements of the context
that can guide access to information. We suggest an approach based
on these two key guidelines, whilst avoiding some of their
weaknesses. The approach permits a representation of both the
context and the design rationale of a project for efficient access to
knowledge. In fact, the method consists of an information retrieval
environment that, on the one hand, can infer knowledge modeled as
a semantic network and, on the other hand, is based on the context
and objectives of a specific activity (the design). The environment
we defined can also be used to gather similar project elements in
order to build classifications of the tasks, problems, arguments, etc.
produced in a company. These classifications can show the evolution
of design strategies in the company.
Abstract: The purpose of this paper is to develop and validate a
model that accurately predicts the cell temperature of a PV module,
adapts to various mounting configurations, mounting locations, and
climates, and requires only data readily available from the module
manufacturer. Results from this model are also compared with those
from published cell temperature models. The models were used to
predict the real-time performance of a PV water pumping system in
the desert of Medenine, in southern Tunisia, using 60-min intervals
of measured performance data over one complete year. Statistical
analysis of the predicted results and measured data highlights
possible sources of error and the limitations and/or adequacy of
existing models in describing the temperature and efficiency of PV
cells and, consequently, the accuracy of PV water pumping system
performance prediction models.
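One widely published baseline against which such models are often compared is the NOCT (Nominal Operating Cell Temperature) model, which needs only the NOCT value from the module datasheet. A minimal sketch (the NOCT value and example conditions here are illustrative, not taken from the study):

```python
def cell_temp_noct(t_amb_c, irradiance_w_m2, noct_c=45.0):
    """Standard NOCT cell-temperature estimate. NOCT is the cell temperature
    measured at 800 W/m^2 irradiance and 20 degC ambient, a datasheet value."""
    return t_amb_c + (noct_c - 20.0) / 800.0 * irradiance_w_m2

# Example: 35 degC ambient and 1000 W/m^2 gives 35 + 25/800*1000 = 66.25 degC.
print(cell_temp_noct(35.0, 1000.0))
```

More elaborate models add wind speed and mounting-configuration terms, which is precisely where a model driven only by manufacturer data must compensate.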
Abstract: Results of Chilean wine classification based on the
information provided by an electronic nose are reported in this paper.
The classification scheme consists of two stages: first, Principal
Component Analysis is used as a feature extraction method to
reduce the dimensionality of the original information; then, a Radial
Basis Function Neural Network is used as the pattern recognition
technique to perform the classification. The objective of this study is
to classify Cabernet Sauvignon, Merlot, and Carménère wine
samples from different years, valleys, and vineyards of Chile.
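The two-stage pipeline can be sketched numerically as follows. The data here are synthetic stand-ins for e-nose sensor readings, and the RBF centers, width, and class structure are assumptions for illustration, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for e-nose readings: 3 wine classes, 8 "sensors".
X = np.vstack([rng.normal(m, 0.3, (30, 8)) for m in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 30)

# Stage 1: PCA for dimensionality reduction (keep 2 principal components).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Stage 2: RBF network -- Gaussian hidden units placed at the class means,
# linear output weights fitted by least squares on one-hot targets.
centers = np.array([Z[y == k].mean(axis=0) for k in range(3)])
width = 1.0
H = np.exp(-((Z[:, None, :] - centers[None]) ** 2).sum(-1) / (2 * width ** 2))
T = np.eye(3)[y]
W, *_ = np.linalg.lstsq(H, T, rcond=None)

pred = (H @ W).argmax(axis=1)
print((pred == y).mean())  # training accuracy on the synthetic data
```

In practice the hidden-unit centers would be chosen by clustering the training samples, and performance would be assessed on held-out wine samples.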
Abstract: The aim of this study was to investigate the
environmental conservation behavior of the Applied Health Science
students of Suranaree University of Technology, a green and clean
university. The sample group was 184 Applied Health Science
students (medical, nursing, and public health). A questionnaire was
used to collect information.
The study found that the students had more negative than
positive behaviors towards energy, water, and forest conservation.
This result can be used as basic information for designing long-term
behavior modification activities or research projects on
environmental conservation, so that Applied Health Science students
will be encouraged to become conscious of, and good examples of,
environmental conservation behavior.
Abstract: Web-based technologies have created numerous
opportunities for electronic word-of-mouth (eWOM) communication.
Many factors affect customers' adoption and decision-making
processes, yet only a few studies focus on factors such as forum
membership time and propensity to trust. Using a discrete-time
event simulation of a diffusion model together with a consumer
decision model, this study examines the effect of each factor on the
adoption of opinions in an on-line discussion forum. The purpose of
this study is to examine the effects of factors affecting information
adoption and the decision-making process, and the model is
constructed to test quantitative aspects of each factor. The simulation
shows that membership time and propensity to trust affect
information adoption and purchasing decisions: longer membership
time in the communities and higher propensity to trust lead to
higher demand rates, because consumers find it easier and faster to
trust people in the community and then adopt the eWOM. Other
implications for both researchers and practitioners are provided.
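The qualitative effect of a trust parameter in a discrete-time diffusion simulation can be sketched as follows. This is a toy model of my own construction, not the authors' simulation: the functional form, the external-influence rate base_p, and the imitation coefficient are all assumptions.

```python
import random

random.seed(4)

def simulate_adoption(n_consumers, steps, trust, base_p=0.02, imitation=0.3):
    """Discrete-time diffusion: at each step a non-adopter adopts with a
    probability combining external influence (base_p) and eWOM influence,
    the latter scaled by the consumers' propensity to trust (0..1)."""
    adopted = 0
    history = []
    for _ in range(steps):
        frac = adopted / n_consumers           # current adopter fraction
        p = base_p + trust * imitation * frac  # eWOM term grows with adoption
        for _ in range(n_consumers - adopted):
            if random.random() < p:
                adopted += 1
        history.append(adopted)
    return history

low = simulate_adoption(1000, 50, trust=0.2)
high = simulate_adoption(1000, 50, trust=0.8)
print(low[-1], high[-1])  # higher trust -> faster and larger adoption
```

Even this toy reproduces the direction of the reported result: raising the trust parameter amplifies the word-of-mouth term and yields higher cumulative demand.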
Abstract: These days, people love to travel around the world.
Regardless of location and time, Muslims in particular still need to
perform their prayers. Travelers normally need to bring maps and a
compass, and Muslim travelers must also carry a Qibla pointer. It
is rather difficult to determine the Qibla direction and to know the
time for each prayer. As technology advances, many PDAs are
equipped with maps and GPS to determine their location. In this
paper we present a new electronic device, called the Mobile Qibla
and Prayer Time Finder, that locates the Qibla direction and
determines each prayer time based on the user's current location
using a PDA. The device uses a PIC microcontroller equipped with
a digital compass; it communicates with the PDA via Bluetooth
technology and automatically displays the exact Qibla direction and
prayer times at any place in the world. The device is reliable and
accurate in determining the Qibla direction and prayer times.
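The Qibla direction itself is commonly computed as the initial great-circle bearing from the user's GPS position toward the Kaaba in Mecca. A minimal sketch of that standard calculation (not the device's firmware; the example city is illustrative):

```python
import math

KAABA_LAT, KAABA_LON = 21.4225, 39.8262  # Kaaba, Mecca

def qibla_bearing(lat_deg, lon_deg):
    """Initial great-circle bearing (degrees clockwise from true north)
    from the given position toward the Kaaba."""
    p1, p2 = math.radians(lat_deg), math.radians(KAABA_LAT)
    dl = math.radians(KAABA_LON - lon_deg)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360.0

# Example: Kuala Lumpur (3.139 N, 101.687 E) -- Qibla is roughly west-northwest.
print(round(qibla_bearing(3.139, 101.687), 1))
```

In a device like the one described, this bearing would be compared against the digital compass heading so the display can point toward the Qibla regardless of how the unit is held.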
Abstract: Previous algorithms for 3D model texture generation and mapping from multi-view images have issues in texture chart generation, namely self-intersection and the concentration of the texture in texture space, and may also suffer from problems caused by occluded areas, such as the inner parts of the thighs. In this paper we propose a texture mapping technique for 3D models that uses multi-view images on the GPU. We perform texture mapping directly in the GPU fragment shader, per pixel, without generating a texture map, and we handle occluded areas using the 3D model's depth information. Our method requires more GPU computation than previous work, but it achieves real-time performance and the previously mentioned problems do not occur.
Abstract: This research work is aimed at speech recognition
using scaly neural networks. A small vocabulary of 11 words was
established first; these words are "word, file, open, print, exit, edit,
cut, copy, paste, doc1, doc2". The chosen words are associated with
executing computer functions such as opening a file, printing a
text document, cutting, copying, pasting, editing, and exiting.
The speech was introduced to the computer and then subjected to a
feature extraction process using LPC (linear prediction coefficients).
These features are used as input to an artificial neural network in
speaker-dependent mode. Half of the words are used for training the
artificial neural network and the other half for testing the system;
these are used for information retrieval.
The system consists of three parts: speech processing and feature
extraction, training and testing using neural networks, and
information retrieval.
The retrieval process proved to be 79.5-88% successful, which is
quite acceptable considering the variations in the surroundings, the
state of the speaker, and the microphone type.
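The LPC feature-extraction step named above is commonly implemented with the autocorrelation method and the Levinson-Durbin recursion. A minimal sketch of that step alone (the AR(1) test signal is a synthetic stand-in for framed speech, and the model order here is illustrative):

```python
import numpy as np

def lpc(x, order):
    """LPC via the autocorrelation method and Levinson-Durbin recursion.
    Returns [1, a1, ..., a_order] of the prediction-error filter."""
    n = len(x)
    r = [float(np.dot(x[:n - k], x[k:])) for k in range(order + 1)]
    a, e = [1.0], r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / e                      # reflection coefficient
        a = [1.0] + [a[j] + k * a[i - j] for j in range(1, i)] + [k]
        e *= 1.0 - k * k                  # updated prediction error
    return a

# Sanity check on a synthetic "speech-like" AR(1) signal
# x[t] = 0.9*x[t-1] + noise: the first coefficient should be near -0.9.
rng = np.random.default_rng(3)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.9 * x[t - 1] + rng.normal()
print(round(lpc(x, 2)[1], 2))
```

In a recognizer like the one described, each windowed speech frame would yield one such coefficient vector, and the concatenated vectors would form the neural network's input.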
Abstract: In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors; protocols that minimize sensor power consumption therefore receive the most attention in wireless sensor networks. One approach to reducing energy consumption is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Because processing information consumes less power than transmitting it, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is the data aggregation tree: related packets are combined at intermediate nodes into a single packet, so fewer packets are transmitted, less energy is consumed, and the longevity of the network improves. However, finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem, so heuristic optimization methods such as Simulated Annealing are used. In this article, we propose a new method for building the data collection tree in wireless sensor networks using a Simulated Annealing algorithm and evaluate its efficiency against a Genetic Algorithm.
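A generic Simulated Annealing skeleton for building such a tree can be sketched as follows. This is a toy illustration, not the authors' algorithm: the energy model (sum of squared link lengths), the cycle-avoidance trick (a parent must be nearer the sink than its child), and the cooling schedule are all assumptions.

```python
import math
import random

random.seed(0)

# Random sensor field; node 0 is the sink at the origin.
n = 20
pts = [(0.0, 0.0)] + [(random.uniform(-1, 1), random.uniform(-1, 1))
                      for _ in range(n - 1)]
# Rank nodes by distance to the sink; a parent must be nearer the sink
# than its child, so every candidate structure stays a tree (no cycles).
order = sorted(range(n), key=lambda i: math.hypot(*pts[i]))
rank = {node: r for r, node in enumerate(order)}

def cost(parent):
    # Energy proxy: sum of squared link lengths over the aggregation tree.
    return sum((pts[i][0] - pts[parent[i]][0]) ** 2 +
               (pts[i][1] - pts[parent[i]][1]) ** 2 for i in parent)

parent = {i: 0 for i in range(1, n)}   # initial tree: everyone sends to sink
cur = best_cost = cost(parent)
temp = 1.0
for _ in range(5000):
    i = random.choice(list(parent))
    cand = [j for j in range(n) if rank[j] < rank[i]]  # nearer-to-sink nodes
    new_parent = dict(parent)
    new_parent[i] = random.choice(cand)
    new_cost = cost(new_parent)
    # Metropolis rule: accept improvements, and worse moves with prob e^(-d/T).
    if new_cost < cur or random.random() < math.exp((cur - new_cost) / temp):
        parent, cur = new_parent, new_cost
        best_cost = min(best_cost, cur)
    temp *= 0.999   # geometric cooling schedule

print(best_cost)   # cost of the best aggregation tree found
```

The Metropolis acceptance of occasional uphill moves at high temperature is what lets the search escape the poor star topology it starts from.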
Abstract: Researchers have been applying artificial and computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI
methods with respect to each game application. In this paper, we report
our experimental results on the comparison of three evolutionary algorithms – evolution strategy (ES), genetic algorithm (GA), and their hybrid –
applied to evolving controller agents for the CIG 2007 Simulated Car Racing competition. Our experimental results show that premature
convergence of solutions was observed in the case of ES, and that GA outperformed ES in the latter half of the generations. Moreover, a hybrid
that uses GA first and ES next evolved the best solution among all solutions generated. This result shows the ability of GA in
globally searching promising areas in the early stage and the ability of ES in locally searching the focused area (fine-tuning solutions).
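The GA-then-ES switch can be illustrated on a toy objective. This sketch is not the competition controller code: the objective, operators, population sizes, and the simple step-size decay standing in for ES self-adaptation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(x):
    # Illustrative objective to minimize; the paper evolves racing controllers.
    return float(np.sum((x - 0.5) ** 2))

dim, pop_size, gens = 10, 30, 60

# Phase 1: a simple real-coded GA for the first half of the generations.
pop = rng.uniform(-2, 2, (pop_size, dim))
for _ in range(gens // 2):
    scores = np.array([fitness(p) for p in pop])
    # Binary tournament selection
    idx = rng.integers(0, pop_size, (pop_size, 2))
    winners = np.where(scores[idx[:, 0]] < scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # Uniform crossover followed by Gaussian mutation
    mates = parents[rng.permutation(pop_size)]
    mask = rng.random((pop_size, dim)) < 0.5
    pop = np.where(mask, parents, mates) + rng.normal(0, 0.1, (pop_size, dim))

# Phase 2: an elitist (1+lambda)-ES fine-tunes the GA's best individual.
best = min(pop, key=fitness)
sigma = 0.1
for _ in range(gens // 2):
    offspring = best + rng.normal(0, sigma, (pop_size, dim))
    cand = min(offspring, key=fitness)
    if fitness(cand) < fitness(best):
        best = cand
    sigma *= 0.95   # simple step-size decay in place of self-adaptation

print(fitness(best))  # small residual error near the optimum x = 0.5
```

The structure mirrors the reported finding: the GA's recombination explores broadly early on, while the ES's shrinking mutation radius fine-tunes the incumbent solution late in the run.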
Abstract: A ten-year grazing study was conducted at the
Agriculture and Agri-Food Canada Brandon Research Centre in
Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P,
K, and S) addition on economics and efficiency of non-renewable
energy use in meadow brome grass-based pasture systems for beef
production. Fertilizing grass-only or alfalfa-grass pastures to full soil
test recommendations improved pasture productivity, but did not
improve profitability compared to unfertilized pastures. Fertilizing
grass-only pastures resulted in the highest net loss of any pasture
management strategy in this study. Adding alfalfa at the time of
seeding, with no added fertilizer, was economically the best pasture
improvement strategy in this study. Because of moisture limitations,
adding commercial fertilizer to full soil test recommendations is
probably not economically justifiable in most years, especially with
the rising cost of fertilizer. Improving grass-only pastures by adding
fertilizer and/or alfalfa required additional non-renewable energy
inputs; however, the additional energy required for unfertilized
alfalfa-grass pastures was minimal compared to the fertilized
pastures. Of the four pasture management strategies, adding alfalfa
to grass pastures without adding fertilizer had the highest efficiency
of energy use. Based on energy use and economic performance, the
unfertilized alfalfa-grass pasture was the most efficient and
sustainable pasture system.
Abstract: Despite the widespread application of finite
mixture models in segmentation, finite mixture model selection is
still an important issue. In fact, the selection of an adequate number
of segments is a key issue in deriving latent segment structures, and
it is desirable that the selection criteria used to this end be effective.
In order to select among several information criteria that may
support the selection of the correct number of segments, we conduct
a simulation study. In particular, this study is intended to determine
which information criteria are more appropriate for mixture model
selection when considering data sets with only categorical
segmentation base variables. The generation of mixtures of
multinomial data supports the proposed analysis. As a result, we
establish a relationship between the level of measurement of the
segmentation variables and the performance of some (eleven)
information criteria. The criterion AIC3 shows the best performance
(it indicates the correct number of segments in the simulated
structure more often) when referring to mixtures of multinomial
segmentation base variables.
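Information criteria of this family share the form "-2 log-likelihood + penalty", differing only in the penalty per free parameter; AIC3 charges 3 per parameter instead of AIC's 2. A minimal sketch with hypothetical log-likelihoods (all numbers are illustrative, not from the study):

```python
import math

def aic(loglik, k):    return -2 * loglik + 2 * k
def aic3(loglik, k):   return -2 * loglik + 3 * k   # penalty of 3 per parameter
def bic(loglik, k, n): return -2 * loglik + k * math.log(n)

# Hypothetical maximized log-likelihoods for mixtures with 1..4 segments;
# the selected model is the one minimizing the criterion.
loglik = {1: -1450.0, 2: -1330.0, 3: -1320.0, 4: -1318.0}
k_per_segment = 9   # assumed free parameters per segment
n = 500             # assumed sample size

for name, crit in [("AIC", lambda s: aic(loglik[s], s * k_per_segment)),
                   ("AIC3", lambda s: aic3(loglik[s], s * k_per_segment)),
                   ("BIC", lambda s: bic(loglik[s], s * k_per_segment, n))]:
    print(name, "selects", min(loglik, key=crit), "segments")
```

With these illustrative numbers AIC selects 3 segments while AIC3 and BIC select 2, showing how AIC3's heavier penalty resists over-segmentation when extra segments buy only a small likelihood gain.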