Abstract: The Extended Kalman Filter (EKF) is probably the most widely used estimation algorithm for nonlinear systems. However, not only does it have difficulties arising from linearization, but it also often becomes numerically unstable because of computer round-off errors that occur in the course of its implementation. To overcome the limitations of linearization, the unscented transformation (UT) was developed as a method to propagate mean and covariance information through nonlinear transformations. A Kalman filter that uses the UT to calculate the first two statistical moments is called the Unscented Kalman Filter (UKF). The square-root form of the UKF (SR-UKF) was developed by Rudolph van der Merwe and Eric Wan to achieve numerical stability and to guarantee positive semi-definiteness of the Kalman filter covariances. This paper develops another implementation of the SR-UKF for the sequential measurement-update equation, and also derives a new UD covariance factorization filter for the implementation of the UKF. This filter is equivalent to the UKF but is computationally more efficient.
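The unscented transformation at the core of the UKF can be sketched generically as follows. This is a minimal illustration of the standard scaled UT (not the paper's SR-UKF or UD factorization), using a Cholesky factor as the matrix square root and the conventional default scaling parameters.

```python
import numpy as np

def unscented_transform(f, x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate mean x and covariance P through a nonlinearity f via the UT."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)           # matrix square root
    sigmas = np.vstack([x, x + S.T, x - S.T])       # 2n+1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    Wc = Wm.copy()                                  # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigmas])            # propagate each point
    y_mean = Wm @ Y
    diff = Y - y_mean
    y_cov = (Wc[:, None] * diff).T @ diff
    return y_mean, y_cov
```

For a linear map the UT recovers the exact mean and covariance, which makes a convenient sanity check on any implementation.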
Abstract: In this study, Li4SiO4 powder was successfully synthesized via the sol-gel method followed by drying at 150 °C. Lithium oxide (Li2O) and silicon oxide (SiO2) were used as the starting materials, with citric acid as the chelating agent. The obtained powder was then sintered at various temperatures. Crystallographic phase, morphology and ionic conductivity were investigated systematically employing X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, scanning electron microscopy and AC impedance spectroscopy. The XRD result showed the formation of a pure monoclinic Li4SiO4 crystal structure with lattice parameters a = 5.140 Å, b = 6.094 Å, c = 5.293 Å and β = 90° in the sample sintered at 750 °C. This observation was confirmed by FTIR analysis. The bulk conductivity of this sample at room temperature was 3.35 × 10⁻⁶ S cm⁻¹, and the highest bulk conductivity of 1.16 × 10⁻⁴ S cm⁻¹ was obtained at 100 °C. The results indicated that the Li4SiO4 compound has potential to be used as a host for a LISICON-structured solid electrolyte for low-temperature applications.
Abstract: The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for casting aluminium alloys. Good surface finish with the required tolerances and dimensional accuracy can be achieved by optimizing controllable process parameters such as solidification time, molten-metal temperature, filling time, injection pressure and plunger velocity. Moreover, by selecting optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material and flash are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects, and the subsequent setting of parameter levels, have been accomplished by Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of the different process parameters suggested by the L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percentage contribution of the different process parameters. A confidence interval has also been estimated for a 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. Overall, a 2.352% reduction in defects has been observed with the suggested optimum process parameters.
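Since defects are to be minimized, the signal-to-noise ratio used in the Taguchi analysis is of the "smaller-the-better" type. A minimal sketch follows; the replicate defect percentages are hypothetical, not the paper's measurements.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi S/N ratio for a 'smaller-the-better' response such as a
    defect percentage: S/N = -10 * log10(mean(y^2)). Larger S/N is better."""
    y = np.asarray(y, dtype=float)
    return float(-10.0 * np.log10(np.mean(y ** 2)))

# Hypothetical defect percentages from three replicates of one L18 trial
trial_sn = sn_smaller_the_better([2.4, 2.6, 2.5])
```

The parameter levels that maximize the mean S/N ratio across the L18 trials are the recommended optimum, since a high S/N setting is both low in defects and robust against noise.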
Abstract: Some quality control tools use non-metric, subjective information coming from experts, who qualify the intensity of relations existing inside processes without quantifying them. In this paper we have developed a quality control analytic tool that measures the impact, or strength, of the relationship between process operations and product characteristics. The tool includes two models: a qualitative model, allowing relationships to be described and analysed, and a formal quantitative model, by means of which relationship quantification is achieved. In the first, concepts from graph theory were applied to identify those process elements which can be sources of variation, that is, those quality characteristics or operations that have some sort of precedence over the others and that should become control items. The most dependent elements can also be identified, that is, those elements receiving the effects of the elements identified as variation sources. If controls are focused on those dependent elements, the efficiency of control is compromised by the fact that we are controlling effects, not causes. The second model adapts the multivariate statistical technique of covariance structure analysis. This approach allowed us to quantify the relationships. The computer package LISREL was used to obtain statistics and to validate the model.
Abstract: The Ad hoc On-demand Distance Vector (AODV) routing protocol is designed for mobile ad hoc networks (MANETs). AODV offers quick adaptation to dynamic link conditions and is characterized by low memory overhead and low network utilization. The security issues related to the protocol remain challenging for wireless network designers. Numerous schemes have been proposed for establishing secure communication between end users; these schemes treat the secure operation of AODV as a two-tier task (routing and secure exchange of information at separate levels). Our endeavor in this paper is to achieve routing and secure data exchange in a single step. This will allow user nodes to perform routing, mutual authentication, and generation and secure exchange of a session key in one step, thus ensuring confidentiality, integrity and authentication of the data exchange in a more suitable way.
Abstract: Wireless location is the determination of the mobile station (MS) location in a wireless cellular communications system. When fewer base stations (BSs) are available for location purposes, or when measurements carry large errors in non-line-of-sight (NLOS) environments, it is necessary to integrate all available heterogeneous measurements to achieve high location accuracy. This paper presents hybrid schemes that combine time-of-arrival (TOA) measurements at three BSs and angle-of-arrival (AOA) information at the serving BS to give a location estimate of the MS. The proposed schemes mitigate the NLOS effect simply by a weighted sum of the intersections between the three TOA circles and the AOA line, without requiring a priori information about the NLOS error. Simulation results show that the proposed methods achieve better accuracy compared with the Taylor series algorithm (TSA) and the hybrid lines of position algorithm (HLOP).
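The weighted-intersection idea can be sketched as follows. Each TOA range defines a circle around its BS, and the AOA defines a ray from the serving BS; the MS estimate is a weighted sum of the ray/circle intersection points. The inverse-range weights and the rule for choosing between the two intersection roots are illustrative assumptions, since the abstract does not specify the paper's exact weighting.

```python
import numpy as np

def toa_aoa_estimate(bs, r, theta):
    """Estimate the MS position from TOA ranges r at base stations bs and
    the AOA angle theta measured at the serving BS bs[0]."""
    bs = np.asarray(bs, dtype=float)
    d = np.array([np.cos(theta), np.sin(theta)])   # unit AOA ray direction
    pts, wts = [], []
    for c, ri in zip(bs, r):
        # |bs[0] + t*d - c|^2 = ri^2  ->  t^2 + 2*b*t + cc = 0 (|d| = 1)
        f = bs[0] - c
        b, cc = f @ d, f @ f - ri ** 2
        disc = b * b - cc
        if disc < 0:
            continue                               # ray misses this circle
        roots = [t for t in (-b - np.sqrt(disc), -b + np.sqrt(disc)) if t >= 0]
        if not roots:
            continue
        # keep the root closest to the serving-BS range (illustrative choice)
        t = min(roots, key=lambda tt: abs(tt - r[0]))
        pts.append(bs[0] + t * d)
        wts.append(1.0 / ri)                       # weight inversely by range
    w = np.array(wts) / np.sum(wts)
    return w @ np.vstack(pts)
```

With noise-free measurements all intersections coincide at the true MS position, so the weighted sum returns it exactly; under NLOS bias the averaging damps the error of any single measurement.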
Abstract: Nagaland, the 16th state of India in order of statehood, is situated between 25° 6′ N and 27° 4′ N latitude and between 93° 20′ E and 95° 15′ E longitude, in the north-eastern part of India. Endowed with varied topography, soil and agro-climatic conditions, it is known for its potential to grow almost all kinds of horticultural crops. Pineapple, long grown organically by default, is one of the most promising crops of the state, with emphasis being laid on commercialization by the government of Nagaland. In light of commercialization, globalization and the scope for setting up small-scale industries, a research study was undertaken to examine the socio-economic and personal characteristics, entrepreneurial characteristics and attitude of pineapple growers towards the improved package of practices for pineapple cultivation. The study was conducted in Medziphema block of Dimapur district of Nagaland, India, following an ex post facto research design. Ninety pineapple growers were selected from four different villages of Medziphema block based on a proportionate random selection procedure. Findings of the study revealed that the majority of the respondents had a medium level of entrepreneurial characteristics in terms of knowledge level, risk orientation, self-confidence, management orientation, farm decision-making ability and leadership ability, and most of them had a favourable attitude towards the improved package of practices for pineapple cultivation. The variables age, education, farm size, risk orientation, management orientation and sources of information utilized were found to be important in influencing the attitude of the respondents. The study revealed that the favourable attitude and entrepreneurial characteristics of the pineapple cultivators might be harnessed for increased production of pineapple in the state, thereby bringing socio-economic upliftment to marginal and small-scale farmers.
Abstract: In this paper we propose an NLP-based method for ontology population from texts and apply it to semi-automatically instantiate a generic knowledge base (generic domain ontology) in the risk management domain. The approach is semi-automatic and uses domain expert intervention for validation. The proposed approach relies on a set of instance recognition rules based on syntactic structures, and on the predicative power of verbs in the instantiation process. It is not domain dependent since it relies heavily on linguistic knowledge. A description of an experiment performed on a part of the ontology of the PRIMA project (supported by the European Community) is given. A first validation of the method is done by populating this ontology with Chemical Fact Sheets from the Environmental Protection Agency. The results of this experiment complete the paper and support the hypothesis that relying on the predicative power of verbs in the instantiation process improves performance.
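A verb-driven instance recognition rule can be illustrated with a toy copular pattern: in a sentence of the form "<X> is a <C>", the subject X is proposed as an instance of ontology class C. Both the class lexicon and the regular-expression pattern below are hypothetical and far simpler than the paper's syntactic rules.

```python
import re

# Toy ontology class lexicon (hypothetical, for illustration only)
CLASS_LEXICON = {"chemical", "hazard", "risk"}

def recognize_instances(sentence):
    """Toy instance-recognition rule: in a copular sentence '<X> is a/an <C>',
    propose X as an instance of class C when C is a known ontology class."""
    m = re.search(r"^(?P<inst>[\w\s-]+?) is an? (?P<cls>\w+)", sentence)
    if m and m.group("cls").lower() in CLASS_LEXICON:
        return m.group("inst").strip(), m.group("cls").lower()
    return None
```

The predicative verb ("is" here) anchors the rule: it both signals that an instantiation statement is being made and separates the candidate instance from its class.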
Abstract: The successful implementation of Service-Oriented Architecture (SOA) is not confined to Information Technology systems; it requires changes across the whole enterprise. In order to align IT and business, the enterprise requires adequate and measurable methods. The adoption of SOA creates new problems with regard to measuring and analysing performance. In fact, the enterprise should investigate to what extent the development of services will increase the value of the business. Every business needs to measure the alignment of its SOA adoption with the goals of the enterprise. Moreover, precise performance metrics, and their combination with advanced evaluation methodologies, should be defined as a solution. The aim of this paper is to present a systematic methodology for designing a measurement system at the technical and business levels, so that (1) measurement metrics are determined precisely and (2) the results are analysed by mapping the identified metrics to the measurement tools.
Abstract: Wireless links can be unreliable in realistic wireless sensor networks (WSNs). Energy-efficient and reliable data forwarding is important because each node has limited resources. Therefore, we must suggest an optimal solution that uses information about each node's characteristics. Previous routing protocols were unsuited to realistic asymmetric WSNs. In this paper, we propose a Protocol that considers Both sides of Link-quality and Energy (PBLE), an optimal routing protocol that balances modified link quality, distance and energy. Additionally, we propose a node scheduling method. PBLE achieves a longer lifetime than previous routing protocols and is more energy-efficient. PBLE uses energy, local information and the packet reception ratio (PRR) of both link directions within a 1-hop distance. We explain how to send data packets to the destination node using each node's information. Simulation shows that PBLE improves delivery rate and network lifetime compared to previous schemes. Moreover, we show the improvement in various WSN environments.
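The abstract does not give PBLE's actual metric, but one illustrative way to balance bidirectional link quality, residual energy and forwarding progress when scoring next-hop candidates is a weighted sum; all names and weights below are hypothetical.

```python
def next_hop_score(prr_fwd, prr_rev, energy_frac, progress, w=(0.5, 0.3, 0.2)):
    """Hypothetical next-hop score combining: the bidirectional delivery
    probability (forward PRR times reverse PRR, since both the data packet
    and its ACK must succeed on an asymmetric link), the neighbor's
    remaining energy fraction, and its normalized geographic progress
    towards the destination, with illustrative weights w."""
    return w[0] * prr_fwd * prr_rev + w[1] * energy_frac + w[2] * progress

# Toy neighbor table: (id, forward PRR, reverse PRR, energy fraction, progress)
neighbors = [
    ("a", 0.9, 0.5, 0.8, 0.6),   # strong forward link but weak reverse link
    ("b", 0.8, 0.8, 0.7, 0.5),   # symmetric link, decent energy
    ("c", 0.6, 0.6, 0.9, 0.8),   # weak link, plenty of energy and progress
]
best = max(neighbors, key=lambda n: next_hop_score(*n[1:]))
```

Multiplying the two PRR directions, rather than using the forward direction alone, is what penalizes the asymmetric links that defeat protocols designed for symmetric WSNs.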
Abstract: In general, small-scale vegetable farmers experience problems in improving the safety and quality of vegetables supplied to high-class consumers in modern retail. They also lack information for accessing markets. A farmers' group and/or cooperative (FGC) should be able to assist its members by providing training in handling and packing vegetables and by enhancing their marketing capabilities to sell commodities to modern retailers. This study proposes an agri-food supply chain (ASC) model that involves corporate social responsibility (CSR) activities to cultivate the capability of farmers to access markets. A multi-period ASC model is formulated as a Weighted Goal Programming (WGP) problem to analyze the impact of CSR programs in empowering the FGCs to manage small-scale vegetable farmers. The results show that the proposed model can be used to determine the priority of programs in order to maximize the four goals to be achieved in the CSR programs.
Abstract: The ability of the brain to organize information and generate the functional structures we use to act, think and communicate is a common and easily observable natural phenomenon. In object-oriented analysis, these structures are represented by objects. Objects have been extensively studied and documented, but the process that creates them is not understood. In this work, a new class of discrete, deterministic, dissipative, host-guest dynamical systems is introduced. The new systems have extraordinary self-organizing properties. They can host information representing other physical systems and generate the same functional structures as the brain does. A simple mathematical model is proposed. The new systems are easy to simulate by computer, and the measurements needed to confirm the assumptions are abundant and readily available. Experimental results presented here confirm the findings. Applications are many, but among the most immediate are object-oriented engineering, image and voice recognition, search engines, and neuroscience.
Abstract: With increasing complexity in electronic systems, there is a need for system-level anomaly detection and fault isolation. Anomaly detection based on vector similarity to a training set is used in this paper through two approaches: one that preserves the original information, Mahalanobis distance (MD), and one that compresses the data into its principal components, projection pursuit analysis. These methods have been used to detect deviations in system performance from normal operation and for critical-parameter isolation in multivariate environments. The study evaluates the detection capability of each approach on a set of test data with known faults against a baseline set of data representative of "healthy" systems.
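The Mahalanobis-distance approach can be sketched as follows: fit the mean and covariance of the "healthy" training set, then flag any test vector whose distance from that distribution exceeds a threshold. The synthetic baseline data and the threshold of 3 are illustrative assumptions.

```python
import numpy as np

def mahalanobis_detector(train):
    """Fit mean and covariance on 'healthy' training vectors and return a
    function computing the Mahalanobis distance of a new vector to them."""
    mu = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))
    def distance(x):
        d = np.asarray(x, dtype=float) - mu
        return float(np.sqrt(d @ cov_inv @ d))
    return distance

# Synthetic healthy baseline; a common flag threshold is ~3 (in sigma units)
rng = np.random.default_rng(0)
md = mahalanobis_detector(rng.normal(0.0, 1.0, size=(500, 3)))
```

Because the covariance is inverted, correlated parameters are whitened first, so MD preserves the full multivariate information, in contrast to the projection-based approach, which works in a reduced principal-component space.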
Abstract: In the information theory literature, there is a need to compare different measures of fuzzy entropy, and this, consequently, gives rise to the need for normalizing measures of fuzzy entropy. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance. Keeping this idea in mind, we explain the method of optimizing different measures of fuzzy entropy.
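The paper's own normalized measures are not given in the abstract, but a standard example of the idea is the De Luca-Termini fuzzy entropy divided by its maximum value n·ln 2, which maps any fuzzy set to [0, 1] and makes sets of different sizes comparable.

```python
import numpy as np

def normalized_fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of membership values mu, divided by its
    maximum n*ln(2). Returns 1 for the maximally fuzzy set (all mu = 0.5)
    and 0 for a crisp set (all mu in {0, 1})."""
    mu = np.asarray(mu, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = -(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))
    terms = np.nan_to_num(terms)        # 0*log(0) -> 0 at crisp memberships
    return float(terms.sum() / (len(mu) * np.log(2.0)))
```

Maximizing this quantity drives memberships towards 0.5 (maximum fuzziness), while minimizing a directed divergence pulls them towards a reference set, matching the optimization goals stated above.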
Abstract: In this study, a Multi-Layer Perceptron (MLP) with the back-propagation learning algorithm is used for the effective diagnosis of Parkinson's disease (PD), a challenging problem for the medical community. Typically characterized by tremor, PD occurs due to the loss of dopamine in the brain's thalamic region, which results in involuntary or oscillatory movement in the body. A feature selection algorithm is used along with biomedical test values to diagnose Parkinson's disease. Clinical diagnosis is done mostly through a doctor's expertise and experience, but cases of wrong diagnosis and treatment are still reported, and patients are asked to take a number of tests for diagnosis. In many cases, not all the tests contribute towards effective diagnosis of the disease. Our work is to classify the presence of Parkinson's disease with a reduced number of attributes. Originally, 22 attributes are involved in classification. We use information gain to select the attributes, reducing the number of tests that need to be taken by patients. An artificial neural network is then used to classify the diagnosis of patients. The twenty-two attributes are reduced to sixteen. The accuracy on the training data set is 82.051% and on the validation data set 83.333%.
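The information-gain ranking used to reduce the 22 attributes to 16 can be sketched generically as follows; the discrete attribute and label values in the test are illustrative, not the paper's clinical data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(attr_values, labels):
    """Reduction in class entropy obtained by splitting on one attribute:
    IG = H(labels) - sum_v p(v) * H(labels | attribute == v)."""
    n = len(labels)
    remainder = 0.0
    for v in set(attr_values):
        subset = [l for a, l in zip(attr_values, labels) if a == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder
```

Attributes are ranked by this gain and only the top-scoring ones (sixteen, in the study) are kept as inputs to the MLP; continuous biomedical measurements would need to be discretized first.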
Abstract: The majority of studies conducted on Iranian urban development plans indicate that they have been largely unsuccessful in terms of drafting, execution and goal achievement. A lack or shortage of essential statistics and information can be listed as an important reason for the failure of these plans. The lack of figures and information has become an evident shortcoming of the country's statistics authorities. This problem has forced urban planners themselves to embark on physical surveys, including real estate and land pricing and population and economic censuses of the city. Apart from the problems facing urban developers, the possibility of errors in such surveys is high. In the present article, applying the interview technique, it is argued that utilizing a multipurpose cadastre system as a land information system is essential for urban development plans in Iran. It can minimize, or even remove, the failures facing urban development plans.
Abstract: An automatic speech recognition system for formal Arabic is needed. The Quran is the most formal spoken book in Arabic and is recited all over the world. In this research, a speaker-independent automatic speech recognizer for Quranic Arabic was developed and tested. The system was developed based on tri-phone Hidden Markov Models and Maximum Likelihood Linear Regression (MLLR). MLLR computes a set of transformations which reduce the mismatch between an initial model set and the adaptation data; it uses a regression class tree and estimates a set of linear transformations for the mean and variance parameters of a Gaussian-mixture HMM system. The 30th chapter of the Quran, recited by five of the most famous readers of the Quran, was used for training and testing. The chapter includes about 2000 distinct words. The advantages of using Quranic verses as the database in this recognizer are the uniqueness of the words and the high level of orderliness between verses. The accuracy on the tested data ranged from 68% to 85%.
Abstract: Rapid advancement in computing technology will bring computers and humans to be seamlessly integrated in the future. The emergence of the smartphone has driven the computing era towards ubiquitous and pervasive computing. Recognizing human activity has garnered a lot of interest and has raised significant research concern in identifying contextual information useful for human activity recognition. Besides being unobtrusive to users in daily life, the smartphone has embedded built-in sensors capable of sensing the contextual information of its users, supported by a wide range of network connections. In this paper, we discuss the classification algorithms used in smartphone-based human activity recognition. Existing technologies pertaining to smartphone-based research in human activity recognition are highlighted and discussed. Our paper also presents our findings and opinions to formulate ideas for improving current research trends. Understanding research trends will enable researchers to have a clearer research direction and a common vision of the latest work in smartphone-based human activity recognition.
Abstract: Information and Communication Technologies (ICT) in mathematics education is a very active field of research and innovation, where learning is understood as meaningful grasping of multiple linked representations rather than rote memorization. A great amount of literature, offering a wide range of theories, learning approaches, methodologies and interpretations, generally stresses the potential of ICT for teaching and learning. Despite the use of new learning approaches with ICT, students experience difficulties in learning concepts relevant to understanding mathematics, and much remains unclear about the relationship between the computer environment, the activities it might support, and the knowledge that might emerge from such activities. Many questions arise in this regard: to what extent does the use of ICT help students in the process of understanding and solving tasks or problems? Is it possible to identify which aspects or features of students' mathematical learning can be enhanced by the use of technology? This paper highlights the interest of integrating ICT into the teaching and learning of mathematics (quadratic functions); it aims to investigate the effect of four instructional methods on students' mathematical understanding and problem solving. Quantitative and qualitative methods are used to report on 43 students in middle school. Results showed that mathematical thinking and problem solving evolve as students engage with ICT activities and learn cooperatively.
Abstract: A concern that researchers usually face in different applications of Artificial Neural Networks (ANNs) is determination of the size of the effective domain in a time series. In this paper, a trial-and-error method was applied to a groundwater depth time series to determine the size of the effective domain for an observation well in Union County, New Jersey, U.S. Different domains of 20, 40, 60, 80, 100, and 120 preceding days were examined, and 80 days was considered the effective length of the domain. Data sets for the different domains were fed to a feed-forward back-propagation ANN with one hidden layer, and the groundwater depths were forecast. The root mean square error (RMSE) and the correlation factor (R²) between estimated and observed groundwater depths were determined for all domains. In general, the groundwater depth forecast improved, as evidenced by lower RMSEs and higher R² values, when the domain length increased from 20 to 120 days. However, 80 days was selected as the effective domain because the improvement was less than 1% beyond that. Forecast groundwater depths utilizing measured daily data (set #1) and data averaged over the effective domain (set #2) were compared. It was postulated that the more accurate nature of the measured daily data was the reason for the better forecast, with a lower RMSE (0.1027 m compared to 0.255 m), in set #1. However, the size of the input data in this set was 80 times the size of the input data in set #2, a factor that may increase the computational effort unpredictably. It was concluded that 80 daily data points may be successfully utilized to lower the size of the input data sets considerably while maintaining the effective information in the data set.
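The input construction compared above (each target day predicted from the preceding 80 daily depths, set #1, versus their single average, set #2) and the RMSE criterion can be sketched as follows; the ANN itself is omitted, and the function names are illustrative.

```python
import numpy as np

def make_windows(series, domain=80):
    """Build ANN input/target pairs: each input row holds the `domain`
    preceding daily depths, the target is the next day's depth (set #1).
    Averaging each row to one value would give the reduced inputs of set #2."""
    X = np.array([series[i - domain:i] for i in range(domain, len(series))])
    y = np.asarray(series[domain:])
    return X, y

def rmse(pred, obs):
    """Root mean square error between forecast and observed depths."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))
```

With a domain of 80, each training example in set #1 carries 80 inputs, while set #2 carries one (the window mean), which is exactly the 80-fold difference in input size noted above.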