Abstract: Financial forecasting is an example of a signal processing problem. A number of methods are available for training the network; we have used the Levenberg-Marquardt algorithm for error back-propagation to adjust the weights. Pre-processing of the data reduced much of the large-scale variation to a smaller scale, reducing the variation in the training data.
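The Levenberg-Marquardt update the abstract refers to can be sketched as a damped Gauss-Newton step. The toy one-neuron model, data, and all parameters below are illustrative assumptions, not the paper's financial network:

```python
import numpy as np

# Toy model y = tanh(w*x + b), fitted by Levenberg-Marquardt (LM):
# solve (J^T J + lam*I) step = J^T r and move p -> p - step, adapting
# the damping lam depending on whether the step reduced the error.
def residuals(p, x, y):
    w, b = p
    return y - np.tanh(w * x + b)

def jacobian(p, x):
    w, b = p
    s = 1.0 - np.tanh(w * x + b) ** 2      # derivative of tanh
    # dr/dw = -s*x, dr/db = -s
    return np.column_stack([-s * x, -s])

def levenberg_marquardt(p, x, y, lam=1e-2, iters=50):
    for _ in range(iters):
        r = residuals(p, x, y)
        J = jacobian(p, x)
        A = J.T @ J + lam * np.eye(len(p))
        step = np.linalg.solve(A, J.T @ r)
        p_new = p - step
        if np.sum(residuals(p_new, x, y) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam * 0.5      # accept step, trust the model more
        else:
            lam *= 2.0                     # reject step, damp harder
    return p

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 200)
y = np.tanh(1.5 * x - 0.3) + 0.01 * rng.standard_normal(200)
p0 = np.array([0.5, 0.0])
p = levenberg_marquardt(p0.copy(), x, y)   # recovers w close to 1.5, b close to -0.3
```

Accepted steps can only lower the squared error, which is what makes LM a robust choice for back-propagation-style weight adjustment.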
Abstract: Wireless channels are characterized by serious bursty and location-dependent errors. Many packet scheduling algorithms have been proposed for wireless networks to guarantee fairness and delay bounds. However, most existing schemes do not consider the differing traffic characteristics of packet flows, which causes the delay-weight coupling problem; in particular, serious queuing delays may be incurred for real-time flows. In this paper, we propose a scheduling algorithm that takes the traffic types of flows into consideration when scheduling packets and that provides scheduling flexibility by trading off video quality to meet the playback deadline.
Abstract: The HIV genome is highly heterogeneous, so its features vary over a wide range. Consequently, the infectivity of the virus changes with the chemokine receptor it uses: R5 and X4 HIV viruses use the CCR5 and CXCR4 coreceptors respectively, while R5X4 viruses can utilize both. Recently, bioinformatics studies have attempted to classify R5X4 viruses by the coreceptor usage of the HIV genome. The aim of this study is to develop the optimal Multilayer Perceptron (MLP) for high classification accuracy of HIV sub-type viruses. To accomplish this purpose, the number of units in the hidden layer was incremented one by one, from one up to a fixed maximum. The data for the R5X4, R5 and X4 viruses were preprocessed by signal processing methods: accessible residues of the virus sequences were extracted and modeled with an Auto-Regressive (AR) model, because the numbers of residues are large and differ from sequence to sequence. Finally, the preprocessed dataset was used to train MLPs with various numbers of hidden units to identify R5X4 viruses, and ROC analysis was used to determine the optimal MLP structure.
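The AR preprocessing step described above can be illustrated with a small NumPy sketch. The point of AR modeling here is that sequences of different lengths all map to the same fixed-length feature vector (the AR coefficients); the sinusoidal "signals" and the model order are illustrative stand-ins for the real accessible-residue series:

```python
import numpy as np

def ar_features(signal, p=2):
    """Fit an order-p AR model x[t] ~ a1*x[t-1] + ... + ap*x[t-p] by least
    squares and return the coefficients as a fixed-length feature vector."""
    x = np.asarray(signal, dtype=float)
    # Row t of the design matrix holds the p previous samples x[t-1..t-p].
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Two "sequences" of very different lengths yield same-length features.
short = np.sin(0.3 * np.arange(40))
long_ = np.sin(0.3 * np.arange(300))
f_short, f_long = ar_features(short), ar_features(long_)
```

For a pure sinusoid both fits recover the exact recurrence x[t] = 2cos(0.3)·x[t-1] - x[t-2], so the features agree regardless of sequence length, which is exactly the property the paper exploits before feeding the MLP.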
Abstract: In this article, we introduce a mechanism by which the concept of differentiated services used in network transmission can be applied to provide quality-of-service levels to pervasive-systems applications. The classical DiffServ model, including marking and classification, assured forwarding, and expedited forwarding, is utilized to create quality-of-service guarantees for various pervasive applications requiring different levels of quality of service. Through a collection of sensors, personal devices, and data sources, the transmission of context-sensitive data can occur automatically within a pervasive system at a given quality-of-service level. Four entities are labeled in our mechanism: triggers, initiators, sources, and receivers. We explain the role of each and how quality of service is guaranteed.
Abstract: Systems Analysis and Design is a key subject in
Information Technology courses, but students do not find it easy to
cope with, since it is not “precise” like programming and not exact
like Mathematics. It is a subject working with many concepts,
modeling ideas into visual representations and then translating the
pictures into a real life system. To complicate matters, users who are
not necessarily familiar with computers need to give their input to
ensure that they get the system they need. Systems Analysis and
Design also covers two fields, namely Analysis, focusing on the
analysis of the existing system and Design, focusing on the design of
the new system. To be able to test the analysis and design of a
system, it is necessary to develop a system or at least a prototype of
the system to test the validity of the analysis and design. The skills
necessary in each aspect differ vastly: Project Management skills,
Database knowledge and Object-Oriented principles are all
necessary. In the context of a developing country where students
enter tertiary education underprepared and the digital divide is alive
and well, students need to be motivated to learn the necessary skills
and given an opportunity to test them in a “live” but protected environment –
within the framework of a university. The purpose of this article is to
improve the learning experience in Systems Analysis and Design
through reviewing the underlying teaching principles used, the
teaching tools implemented, the observations made and the
reflections that will influence future developments in Systems
Analysis and Design. Action research principles allow the focus to
be on a few problematic aspects during a particular semester.
Abstract: In this paper we address the issue of classifying the fluorescent intensity of a sample in Indirect Immuno-Fluorescence (IIF). Since IIF is a subjective, semi-quantitative test in its very nature, we discuss a strategy to reliably label the image data set by using the diagnoses performed by different physicians. Then, we discuss image pre-processing, feature extraction and selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, which makes the method a candidate for use in daily medical practice, either to pre-select the cases to be examined or to act as a second reader.
Abstract: Sharing consistent and correct master data among
disparate applications in a reverse-logistics chain has long been
recognized as an intricate problem. Although a master data
management (MDM) system can surely assume that responsibility,
applications that need to co-operate with it must comply with
proprietary query interfaces provided by the specific MDM system. In
this paper, we present an RFID-ready MDM system which makes
master data readily available to any participating application in a
reverse-logistics chain. We propose an RFID-wrapper as part of our
MDM system; it acts as a gateway between any data retrieval request
and the query interfaces that process it. With the RFID-wrapper, any
participating application in a reverse-logistics chain can easily
retrieve master data in a way that is analogous to the retrieval of any
other RFID-based logistics transaction data.
Abstract: The growth of the human population results in increasing water usage and demand every year. The Saen Saep canal is an important canal in Bangkok. The main objective of this study is to use an Artificial Neural Network (ANN) model to estimate the Chemical Oxygen Demand (COD) from data collected at 11 sampling sites. The data were obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, for the years 2007-2011. Twelve water quality parameters are used as the inputs of the models; these water quality indices affect the COD. The experimental results indicate that the ANN model achieves a high correlation coefficient (R = 0.89).
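A minimal sketch of this kind of model, not the authors' network: a one-hidden-layer neural net mapping twelve inputs to a COD-like target, evaluated by the Pearson correlation coefficient R. To keep the example short, the hidden weights stay at random values and only the output layer is fitted by least squares; all data are synthetic stand-ins for the Bangkok monitoring records:

```python
import numpy as np

# Synthetic stand-in data: 300 samples of 12 water-quality parameters and a
# noisy target playing the role of COD.
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 12))
w = rng.standard_normal(12)
y = X @ (w / np.linalg.norm(w)) + 0.1 * rng.standard_normal(300)

W1 = 0.1 * rng.standard_normal((12, 20))    # random tanh hidden layer, 20 units
H = np.tanh(X @ W1)                         # hidden activations
w2, *_ = np.linalg.lstsq(H, y, rcond=None)  # fit output weights by least squares
pred = H @ w2

R = np.corrcoef(pred, y)[0, 1]              # Pearson R, the metric the paper reports
```

In the paper the whole network is trained, whereas this sketch fixes the hidden layer; the evaluation by R is the same either way.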
Abstract: This report aims to utilize existing and future Multiple-Input Multiple-Output Orthogonal Frequency Division Multiplexing Wireless Local Area Network (MIMO-OFDM WLAN) system characteristics - such as multiple subcarriers, multiple antennas, and channel estimation - for indoor location estimation based on the Direction of Arrival (DOA) and Radio Signal Strength Indication (RSSI) methods. A hybrid of the DOA and RSSI methods is also evaluated. Experimental results show that location estimation accuracy can be increased by minimizing the multipath fading effect; this is done by using multiple subcarrier frequencies over a wideband to estimate one location. The proposed methods are analyzed in both a wide indoor environment and a typical room-sized office. In the experiments, WLAN terminal locations are estimated by measuring multiple subcarriers from arrays of three dipole antennas at the access points (APs). This research demonstrates highly accurate and robust hardware-free add-on software for indoor location estimation based on a MIMO-OFDM WLAN system.
Abstract: We have developed a distributed asynchronous Web-based
training system. In order to improve the scalability and robustness
of this system, all contents and functions are realized as mobile
agents. These agents are distributed to computers and can use
a peer-to-peer network based on a modified Content-Addressable Network.
In the proposed system, only text data can be included in an exercise.
To make the proposed system more useful, a mechanism is necessary
that not only accommodates multimedia data but also avoids disturbing
the user's learning even when an exercise becomes large.
Abstract: In this paper, an entirely new algorithm for solving the three-dimensional Poisson equation is developed and implemented. This equation is used in research on turbulent mixing, computational fluid dynamics, atmospheric fronts, ocean flows, and so on. Moreover, to raise the productivity of these difficult calculations, up-to-date and effective parallel programming technology was applied: MPI in combination with OpenMP directives, which makes it possible to handle problems with very large data volumes. The resulting software can be used to solve important applied and fundamental problems in mathematics and physics.
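The numerical core of such a solver can be sketched serially; the paper's MPI+OpenMP version would distribute this same 7-point stencil over subdomains. The grid size, the right-hand side, and the plain Jacobi iteration below are illustrative choices, not the authors' algorithm:

```python
import numpy as np

# Jacobi relaxation for the 3-D Poisson equation  Laplacian(u) = f  on the
# unit cube with zero Dirichlet boundary values.
n = 24
h = 1.0 / (n - 1)
u = np.zeros((n, n, n))
f = np.ones((n, n, n))          # stand-in right-hand side

for _ in range(400):
    # Jacobi update: each interior point becomes the average of its six
    # neighbours minus h^2/6 times the source term.
    u[1:-1, 1:-1, 1:-1] = (
        u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1] +
        u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1] +
        u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2] -
        h * h * f[1:-1, 1:-1, 1:-1]
    ) / 6.0

# Max-norm residual of the discrete Laplacian against f (1.0 before iterating).
lap = (u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1] +
       u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1] +
       u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2] -
       6.0 * u[1:-1, 1:-1, 1:-1]) / (h * h)
res = float(np.max(np.abs(lap - f[1:-1, 1:-1, 1:-1])))
```

A parallel version would split the cube into slabs per MPI rank, exchange one-cell-thick halo layers between neighbouring ranks after each sweep, and thread the stencil loop with OpenMP inside each rank.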
Abstract: Detecting protein-protein interactions is a central problem in computational biology, and aberrant interactions have been implicated in a number of neurological disorders. As a result, the prediction of protein-protein interactions has recently received considerable attention from biologists around the globe. Computational tools that can effectively identify protein-protein interactions are much needed. In this paper, we propose a method to detect protein-protein interactions based on a substring similarity measure: whether two protein sequences interact is predicted from the similarities of the substrings they contain. When applied to the currently available protein-protein interaction data for the yeast Saccharomyces cerevisiae, the proposed method delivered a reasonable improvement over existing ones.
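The abstract does not spell out the substring measure, so the sketch below shows one common choice: Jaccard similarity over the sets of length-k substrings (k-mers) of two sequences. The sequences are toy strings, not real yeast proteins:

```python
# Substring (k-mer) similarity between two protein sequences.
def kmers(seq, k=3):
    """All length-k substrings of seq, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def substring_similarity(a, b, k=3):
    """Jaccard similarity of the k-mer sets: |A & B| / |A | B|."""
    A, B = kmers(a, k), kmers(b, k)
    return len(A & B) / len(A | B) if A | B else 0.0

s1 = "MKVLATGCCARDQW"   # toy sequences, not real proteins
s2 = "MKVLATGAARDQW"    # similar to s1 apart from a small edit
s3 = "PPPQQQRRRSSS"     # shares no 3-mers with s1
```

A pair of proteins would then be predicted to interact when their score exceeds a threshold chosen on training data.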
Abstract: The volume of XML data exchange is increasing explosively, and efficient mechanisms for XML data management are vital. Many XML storage models have been proposed for storing DTD-independent XML documents in relational database systems. Benchmarking is the best way to highlight the pros and cons of the different approaches. In this study, we use a common benchmarking scheme, known as XMark, to compare the most cited and the newly proposed DTD-independent methods in terms of logical reads, physical I/O, CPU time and duration. We show the effect of the label path, of extracting values and storing them in a separate table, and of the type of join needed for each method's query answering.
Abstract: The scale, complexity and worldwide geographical
spread of the LHC computing and data analysis problems are
unprecedented in scientific research. The complexity of processing
and accessing this data is increased substantially by the size and
global span of the major experiments, combined with the limited
wide area network bandwidth available. We present the latest
generation of the MONARC (MOdels of Networked Analysis at
Regional Centers) simulation framework, as a design and modeling
tool for large scale distributed systems applied to HEP experiments.
We present simulation experiments designed to evaluate the
capabilities of the current real-world distributed infrastructure to
support existing physics analysis processes and the means by which
the experiments band together to meet the technical challenges
posed by the storage, access and computing requirements of LHC
data analysis within the CMS experiment.
Abstract: In the past decade, artificial neural networks (ANNs)
have been regarded as instruments for problem-solving and
decision-making; indeed, they have already delivered substantial
efficiency and effectiveness improvements in industry and business.
In this paper, Back-Propagation neural Networks (BPNs) are combined
in modules to demonstrate the performance of the collaborative
forecasting (CF) function of a Collaborative Planning, Forecasting and
Replenishment (CPFR®) system. CPFR balances sufficient product supply
against the necessary customer demand in a Supply and Demand Chain
(SDC). Several classical standard BPNs are grouped, made to collaborate,
and exploited for an easy implementation of the proposed modular ANN
framework based on the topology of an SDC. Each individual BPN is
applied as a modular tool to forecast the SKU (Stock-Keeping Unit)
levels that are managed and supervised at a POS (point of sale), a
wholesaler, and a manufacturer in an SDC. The proposed modular
BPN-based CF system is exemplified and experimentally verified using
numerous datasets from the simulated SDC. The experimental results
showed that a complex CF problem can be divided into a group of simpler
sub-problems based on the individual trading partners distributed over
the SDC, and that the SKU forecasting accuracy was satisfactory when the
system's forecast values were compared with the original simulated
SDC data. The primary task in implementing an autonomous CF is the
study of supervised ANN learning methodology, which aims at making
“knowledgeable” decisions for the best SKU sales plan and stock
management.
Abstract: The Ad hoc On-demand Distance Vector (AODV) routing protocol is designed for mobile ad hoc networks (MANETs). AODV offers quick adaptation to dynamic link conditions and is characterized by low memory overhead and low network utilization. The security issues related to the protocol remain challenging for wireless network designers. Numerous schemes have been proposed for establishing secure communication between end users; these schemes treat the secure operation of AODV as a two-tier task (routing and secure exchange of information at separate levels). Our endeavor in this paper is to achieve routing and secure data exchange in a single step. This allows the user nodes to perform routing, mutual authentication, and the generation and secure exchange of a session key in one step, thus ensuring the confidentiality, integrity and authentication of the data exchange in a more suitable way.
Abstract: The wireless link can be unreliable in realistic wireless
sensor networks (WSNs). Energy efficient and reliable data
forwarding is important because each node has limited resources.
Therefore, an optimal solution is needed that exploits information
about each node's characteristics. Previous routing protocols were
unsuited to realistic asymmetric WSNs. In this paper, we propose a
Protocol that considers Both sides of Link-quality and Energy (PBLE),
an optimal routing protocol that balances modified link-quality,
distance and energy. Additionally, we propose a node scheduling
method. PBLE achieves a longer lifetime than previous routing
protocols and is more energy-efficient. PBLE uses energy, local
information and the packet reception rate (PRR) of both link
directions within a 1-hop distance. We
explain how to send data packets to the destination node using the
node's information. Simulation shows PBLE improves delivery rate
and network lifetime compared to previous schemes. Moreover, we
show the improvement in various WSN environments.
Abstract: The available data on the cross sections of electron-impact
excitation of krypton 5s and 5p configuration levels out of the
ground state are represented in convenient and compact form. The
results are obtained by regression through all known published data
related to this process.
Abstract: With increasing complexity in electronic systems
there is a need for system level anomaly detection and fault isolation.
Anomaly detection based on vector similarity to a training set is applied
here through two approaches: one that preserves the original
information, the Mahalanobis Distance (MD), and one that
compresses the data into its principal components, Projection Pursuit
Analysis. These methods have been used to detect deviations in
system performance from normal operation and for critical parameter
isolation in multivariate environments. The study evaluates the
detection capability of each approach on a set of test data with known
faults against a baseline set of data representative of such “healthy”
systems.
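The Mahalanobis Distance approach can be sketched in a few lines: estimate the mean and covariance of the "healthy" baseline vectors, then score test vectors by their distance from that baseline. The four "system parameters" and the injected fault below are synthetic stand-ins, not the paper's test data:

```python
import numpy as np

# Healthy baseline: 500 observations of 4 system parameters.
rng = np.random.default_rng(0)
train = rng.standard_normal((500, 4))

mu = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

def mahalanobis(x):
    """Distance of x from the healthy baseline, whitened by its covariance."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

healthy = rng.standard_normal(4)          # behaves like the baseline
faulty = np.array([8.0, -7.0, 9.0, 6.0])  # deviates strongly from the baseline
```

In practice a detection threshold would be set from the distribution of distances over held-out healthy data, and the per-parameter contributions to the distance used for critical-parameter isolation.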
Abstract: Dengue fever is prevalent in Malaysia with numerous
cases including mortality recorded over the years. Public education
on the prevention of the disease through various means has been
carried out besides the enforcement of legal means to eradicate
Aedes mosquitoes, the dengue vector breeding ground. Hence, other
means need to be explored, such as predicting the seasonal peak
period of the dengue outbreak and identifying related climate factors
contributing to the increase in the number of mosquitoes. A simulation
model can be employed for this purpose. In this study, we created a
system dynamics simulation to predict the spread of dengue
outbreaks in Hulu Langat, Selangor, Malaysia. The prototype was
developed using the STELLA 9.1.2 software. The main data inputs are
rainfall, temperature and dengue cases. Analysis of the resulting graphs
showed that dengue cases can be predicted accurately using the
two main variables, rainfall and temperature. However, the model
will be further tested over a longer time period to ensure its
accuracy, reliability and efficiency as a prediction tool for dengue
outbreak.
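A heavily hedged sketch of the generic shape of such a system-dynamics model (not the authors' STELLA prototype): two stocks, mosquito population and active dengue cases, are updated each time step by flows driven by rainfall and temperature. Every rate constant and functional form below is an assumption for illustration only:

```python
def simulate(rainfall, temperature, dt=1.0):
    """Discrete-time stock-and-flow simulation driven by climate inputs."""
    mosquitoes, cases = 100.0, 0.0
    history = []
    for rain, temp in zip(rainfall, temperature):
        breeding = 0.05 * rain * max(temp - 20.0, 0.0)  # inflow to mosquito stock
        deaths = 0.1 * mosquitoes                       # outflow from mosquito stock
        infections = 0.001 * mosquitoes                 # inflow to case stock
        recoveries = 0.2 * cases                        # outflow from case stock
        mosquitoes += dt * (breeding - deaths)
        cases += dt * (infections - recoveries)
        history.append((mosquitoes, cases))
    return history

# Constant toy drivers: a wet month vs. a dry month, both at 30 degrees C.
wet = simulate([50.0] * 30, [30.0] * 30)
dry = simulate([5.0] * 30, [30.0] * 30)
```

The qualitative behaviour matches the abstract's claim: higher rainfall at warm temperatures yields a larger mosquito stock and more predicted cases.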