Abstract: This paper provides an in-depth study of Wireless
Sensor Network (WSN) application to monitor and control the
swiftlet habitat. A complete system is designed and developed
that includes the hardware design of the nodes, Graphical User
Interface (GUI) software, the sensor network, and interconnectivity for
remote data access and management. A system architecture is proposed
to address the requirements of habitat monitoring. Such application-driven
design identifies important areas of further work
in data sampling, communications and networking. For this
monitoring system, a sensor node (MTS400), IRIS and Micaz radio
transceivers, and a USB interfaced gateway base station of Crossbow
(Xbow) Technology WSN are employed. The GUI of this monitoring
system is written using a Laboratory Virtual Instrumentation
Engineering Workbench (LabVIEW) along with Xbow Technology
drivers provided by National Instruments. As a result, this monitoring
system is capable of collecting data and presenting them in both tables and
waveform charts for further analysis. The system is also able to send
notification messages by email, provided Internet connectivity is
available, whenever changes occur in the habitat at remote sites (swiftlet
farms). Other functions implemented in this system
include a database for record-keeping and management purposes and
remote access through the Internet using the LogMeIn software. Finally, this
research concludes that a WSN for monitoring the swiftlet
habitat can be used effectively to monitor and manage the swiftlet
farming industry in Sarawak.
Abstract: In the recent past, Learning Classifier Systems have
been used successfully for data mining. A Learning Classifier System
(LCS) is a machine learning technique that combines
evolutionary computing, reinforcement learning, supervised or
unsupervised learning, and heuristics to produce adaptive systems. An
LCS learns by interacting with an environment from which it
receives feedback in the form of a numerical reward. Learning is
achieved by trying to maximize the amount of reward received. All
LCS models, more or less, comprise four main components: a finite
population of condition–action rules, called classifiers; the
performance component, which governs the interaction with the
environment; the credit assignment component, which distributes the
reward received from the environment to the classifiers accountable
for the rewards obtained; the discovery component, which is
responsible for discovering better rules and improving existing ones
through a genetic algorithm. The concatenation of the production rules
in an LCS forms the genotype, and therefore the GA operates
on a population of classifier systems. This approach is known as the
'Pittsburgh' Classifier System. Other LCSs, which apply the GA at
the rule level within a population, are known as 'Michigan' Classifier
Systems. The most predominant representation of the discovered
knowledge is the standard production rules (PRs) in the form of IF P
THEN D. The PRs, however, are unable to handle exceptions and do
not exhibit variable precision. The Censored Production Rules
(CPRs), an extension of PRs, were proposed by Michalski and
Winston; they exhibit variable precision and support an efficient
mechanism for handling exceptions. A CPR is an augmented
production rule of the form: IF P THEN D UNLESS C, where
Censor C is an exception to the rule. Such rules are employed in
situations in which the conditional statement IF P THEN D holds
frequently while the assertion C holds rarely. By using a rule of this
type, we are free to ignore the exception condition when the
resources needed to establish its presence are limited, or when there is
simply no information available as to whether it holds or not. Thus, the IF P
THEN D part of a CPR expresses the important information, while the
UNLESS C part acts only as a switch that changes the polarity of D
to ~D. In this paper, the Pittsburgh-style LCS approach is used for
the automated discovery of CPRs. An appropriate encoding scheme is
suggested to represent a chromosome consisting of a fixed-size set of
CPRs. Suitable genetic operators are designed for both the set of CPRs
and individual CPRs, and an appropriate fitness function is proposed
that incorporates basic constraints on CPRs. Experimental results are
presented to demonstrate the performance of the proposed learning
classifier system.
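The UNLESS-censor semantics described in this abstract (assert D when P holds and the censor is false or unknown; flip the polarity of D when the censor is known to hold) can be sketched as follows. This is an illustrative Python sketch with hypothetical predicate values, not the paper's learning system:

```python
def evaluate_cpr(p, d, censor=None):
    """Evaluate a Censored Production Rule: IF P THEN D UNLESS C.

    p      -- truth value of the premise P
    d      -- the decision asserted by the rule
    censor -- True / False, or None when resources are too tight to
              establish C (or no information about C is available)

    Returns d, the negated decision ('not', d), or None if P fails.
    """
    if not p:
        return None          # the rule does not fire at all
    if censor is True:
        return ("not", d)    # the UNLESS part flips the polarity of D
    # censor is False or unknown: ignore the exception and assert D
    return d

# hypothetical example: "IF bird THEN flies UNLESS penguin"
result_unknown = evaluate_cpr(True, "flies")               # censor unknown
result_censored = evaluate_cpr(True, "flies", censor=True)
result_no_premise = evaluate_cpr(False, "flies", censor=False)
```

Note how the unknown-censor case falls through to asserting D, mirroring the abstract's point that the IF P THEN D part can be used alone when resources to establish C are tight.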
Abstract: Link reliability and transmitted power are two important design constraints in wireless network design. Error control coding (ECC) is a classic approach used to increase link reliability and to lower the required transmitted power. It provides coding gain, resulting in transmitter energy savings at the cost of added decoder power consumption. However, the choice of ECC is critical in the case of wireless sensor networks (WSNs). Since WSNs are energy-constrained in nature, both the BER and the power consumption have to be taken into account. This paper develops a step-by-step approach to finding suitable error control codes for WSNs. Several simulations are carried out with different error control codes, and the results show that RS(31,21) satisfies both the BER and the power consumption criteria.
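The trade-off at the heart of this abstract, coding gain versus decoder overhead, can be illustrated with a generic energy-budget sketch (a Python sketch under assumed BPSK/AWGN conditions and hypothetical energy figures, not the paper's simulation setup or its RS(31,21) model):

```python
import math

def ber_bpsk(ebn0_db):
    """Uncoded BPSK bit error rate over AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0))

def net_saving_per_bit(coding_gain_db, tx_energy_nj, decoder_energy_nj):
    """Transmitter energy saved thanks to the coding gain, minus the
    decoder's added cost. A positive value means the code pays off
    energetically; the nanojoule figures used below are hypothetical
    placeholders, not measured values from the paper.
    """
    saved = tx_energy_nj * (1.0 - 10.0 ** (-coding_gain_db / 10.0))
    return saved - decoder_energy_nj

# a hypothetical 3 dB coding gain on a 100 nJ/bit transmit budget,
# with a 10 nJ/bit decoder, still yields a net saving
saving = net_saving_per_bit(3.0, 100.0, 10.0)
```

This kind of per-bit accounting is what makes code selection in WSNs a joint BER-and-power question rather than a pure coding-gain question.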
Abstract: The paper deals with the estimation of amplitude and phase of an analogue multi-harmonic band-limited signal from irregularly spaced sampling values. To this end, assuming the signal fundamental frequency is known in advance (i.e., estimated at an independent stage), a complexity-reduced algorithm for signal reconstruction in time domain is proposed. The reduction in complexity is achieved owing to completely new analytical and summarized expressions that enable a quick estimation at a low numerical error. The proposed algorithm for the calculation of the unknown parameters requires O((2M+1)^2) flops, while the straightforward solution of the obtained equations takes O((2M+1)^3) flops (M is the number of the harmonic components). It is applied in signal reconstruction, spectral estimation, system identification, as well as in other important signal processing problems. The proposed method of processing can be used for precise RMS measurements (for power and energy) of a periodic signal based on the presented signal reconstruction. The paper investigates the errors related to the signal parameter estimation, and there is a computer simulation that demonstrates the accuracy of these algorithms.
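As a reference point for the complexity comparison, the straightforward O((2M+1)^3) approach, forming the normal equations of the least-squares fit and solving them by Gaussian elimination, can be sketched in Python (a generic sketch assuming the fundamental frequency is known; it is not the paper's reduced-complexity algorithm, whose analytical expressions avoid the full elimination):

```python
import math

def estimate_harmonics(times, samples, omega, M):
    """Least-squares estimate of c0 and (a_k, b_k), k = 1..M, for
    x(t) = c0 + sum_k a_k*cos(k*omega*t) + b_k*sin(k*omega*t),
    from irregularly spaced samples. Forms the (2M+1) normal equations
    and solves them by Gaussian elimination: O((2M+1)^3) flops, the
    baseline cost the reduced-complexity algorithm improves on.
    Amplitude and phase of harmonic k then follow as
    A_k = sqrt(a_k**2 + b_k**2), phi_k = atan2(b_k, a_k).
    """
    n = 2 * M + 1

    def basis(t):
        row = [1.0]
        for k in range(1, M + 1):
            row.append(math.cos(k * omega * t))
            row.append(math.sin(k * omega * t))
        return row

    # accumulate the normal equations A c = r
    A = [[0.0] * n for _ in range(n)]
    r = [0.0] * n
    for t, x in zip(times, samples):
        phi = basis(t)
        for i in range(n):
            r[i] += phi[i] * x
            for j in range(n):
                A[i][j] += phi[i] * phi[j]

    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for j in range(col, n):
                A[row][j] -= f * A[col][j]
            r[row] -= f * r[col]

    # back substitution
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = r[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))
        c[i] = s / A[i][i]
    return c  # [c0, a1, b1, ..., aM, bM]

# demo with a synthetic two-harmonic signal on irregular sample times
omega = 2.0 * math.pi
times = [0.00, 0.07, 0.19, 0.23, 0.41, 0.50,
         0.58, 0.66, 0.79, 0.84, 0.91, 0.97]
samples = [0.5 + math.cos(omega * t) + 0.3 * math.sin(2.0 * omega * t)
           for t in times]
coeffs = estimate_harmonics(times, samples, omega, M=2)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly (up to floating-point error), which is the sanity check a reduced-complexity variant must also pass.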
Abstract: In the past few years there has been a change in the view of high-performance applications and parallel computing. Initially, such applications were targeted at dedicated parallel machines. Recently, the trend has been shifting towards building meta-applications composed of several modules that exploit heterogeneous platforms and employ hybrid forms of parallelism. The aim of this paper is to propose a model of virtual parallel computing. The virtual parallel computing system provides a flexible object-oriented software framework that makes it easy for programmers to write various parallel applications.
Abstract: In recent years, scanning probe atomic force
microscopy (SPM AFM) has gained acceptance over a wide spectrum
of research and science applications. Most work focuses on physical,
chemical, and biological fields, while less attention is devoted to manufacturing
and machining aspects. The purpose of the current study is to assess
the possible implementation of the SPM AFM features and its
NanoScope software in general machining applications with special
attention to the tribological aspects of cutting tool. The surface
morphology of coated and uncoated as-received carbide inserts is
examined, analyzed, and characterized through the determination of
the appropriate scanning setting, the suitable data type imaging
techniques and the most representative data analysis parameters
using the MultiMode SPM AFM in contact mode. The NanoScope
operating software is used to capture three real-time data-type
images: "Height", "Deflection", and "Friction". Three scan sizes are
independently performed: 2, 6, and 12 μm, with a 2.5 μm vertical
range (Z). Offline mode analysis includes the determination of three
functional topographical parameters: surface "Roughness", power
spectral density "PSD", and "Section". The 12 μm scan size, in
association with "Height" imaging, is found efficient for capturing the
tiny features and tribological aspects of the examined surface. Also,
"Friction" analysis is found to produce a comprehensive explanation
of the lateral characteristics of the scanned surface. The configuration
of many surface defects and flaws has been precisely detected
and analyzed.
Abstract: This paper discusses the design of knowledge
integration of clinical information extracted from distributed medical
ontologies in order to improve a machine learning-based multi-label
coding assignment system. The proposed approach is
implemented using a decision tree technique of machine learning
on university hospital data for patients with Coronary Heart
Disease (CHD). The preliminary results show that
the use of medical ontologies improves the overall
system performance.
Abstract: The IETF defines mobility support in IPv6, i.e., MIPv6, to
allow nodes to remain reachable while moving around in the IPv6
internet. When a node moves and visits a foreign network, it is still
reachable through the indirect packet forwarding from its home
network. This triangular routing feature provides node mobility but
increases the communication latency between nodes. This deficiency
can be overcome by using a Binding Update (BU) scheme, which lets
nodes keep up-to-date IP addresses and communicate with each other
through direct IP routing. To further protect the security of BU, a
Return Routability (RR) procedure was developed. However, it has
been found that the RR procedure is vulnerable to many attacks. In this
paper, we propose a lightweight RR procedure based on
geometric computing. In consideration of the inherent limitations of
computing resources in mobile nodes, the proposed scheme is
developed to minimize the cost of computations and to eliminate the
overhead of state maintenance during binding updates. Compared with
other CGA-based BU schemes, our scheme is more efficient and
does not require nonce tables in nodes.
Abstract: This study was carried out in Ankara, the capital city of Turkey, in order to determine how people living in the slums of Ankara benefit from educational equality. Within the scope of the research, interviews were conducted with 64 families whose children have been receiving education in the primary schools of these districts, and the data of the study were collected by the researcher. The results of the research demonstrate that the children educated in the slums of Ankara cannot experience educational equality and justice. The results of this study show that the opportunities of the schools in the slums of Ankara are very limited, so the individuals in these districts cannot benefit equally from education. The families are aware of the problem they are faced with. Keywords: Discrimination, inequality, primary education, slums of Turkey.
Abstract: High quality requirements analysis is one of the most
crucial activities for ensuring the success of a software project, so
requirements verification for software systems becomes more and more
important in Requirements Engineering (RE); it is one of the most
helpful strategies for improving the quality of software systems.
Related works show that requirement elicitation and analysis can be
facilitated by ontological approaches and semantic web technologies.
In this paper, we propose a hybrid method which aims to verify
requirements with structural and formal semantics to detect
interactions. The proposed method is twofold: one is for modeling
requirements with the semantic web language OWL, to construct a
semantic context; the other is a set of interaction detection rules which
are derived from scenario-based analysis and represented with
semantic web rule language (SWRL). SWRL-based rules work
with rule engines such as Jess to reason over the semantic context of the
requirements and thus detect interactions. The benefits of the proposed
method lie in three aspects: the method (i) provides systematic steps
for modeling requirements with an ontological approach, (ii) offers
synergy of requirements elicitation and domain engineering for
knowledge sharing, and (iii) the proposed rules can systematically assist
in requirements interaction detection.
Abstract: Work stress causes the organizational work-life
imbalance of employees. Because of this imbalance, workers perform
with lower effort to finish assignments and thus an organization will
experience reduced productivity. In order to investigate the problem
of an organizational work-life imbalance, this qualitative case study
focuses on an organizational work-life imbalance among Thai
software developers in a German-owned company in Chiang Mai,
Thailand. In terms of knowledge management, the fishbone diagram is a
useful analysis tool for systematically investigating the root causes of an
organizational work-life imbalance in focus-group discussions.
Furthermore, the fishbone diagram clearly shows the relationship between
causes and effects. It was found that the organizational work-life
imbalance among Thai software developers is influenced by the
management team, the work environment, and the information tools used in
the company over time.
Abstract: Logistics is the part of the supply chain process that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research aims to investigate the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Johor. Therefore, this research focuses on the types of technology being adopted and the factors, benefits, and barriers affecting innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSPs). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed that SCM technology adoption among LSPs was high, as they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption. In contrast, E-Commerce technology adoption among LSPs is relatively low.
Abstract: In this paper, a new formulation for acoustics coupled with linear elasticity is presented. The primary objective of the work is to develop a three-dimensional hp-adaptive finite element method code intended for modeling the acoustics of the human head. The code will have numerous applications, e.g., in designing hearing protection devices for individuals working in high-noise environments. The presented work is in a preliminary stage. The variational formulation has been implemented and tested on a sequence of meshes with concentric multi-layer spheres, with material data representing the tissue (the brain), the skull, and the air. Thus, an efficient solver for coupled elasticity/acoustics problems has been developed and tested on high-contrast material data representing the human head.
Abstract: Recent studies in the area of supply chain networks
(SCN) have focused on disruption issues in distribution systems.
This paper extends the previous literature by providing a new
bi-objective model for cost minimization in designing a three-echelon
SCN across normal and failure scenarios, considering multiple
capacity options for manufacturers and distribution centers. Moreover,
in order to solve the problem by means of the LINGO software, the
model is reformulated through a branch of the LP-Metric method
called the Min-Max approach.
Abstract: Obfuscation is a low-cost software protection
methodology for preventing reverse engineering and re-engineering of
applications. Source code obfuscation aims at obscuring the source
code to hide the functionality of the code. This paper proposes an
array data transformation in order to obfuscate source code
that uses arrays. Applications using the proposed data
structures force the programmer to obscure the logic manually. This
makes the resulting obscured code hard to reverse engineer and
also protects the functionality of the code.
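One simple instance of an array data transformation of this kind can be sketched as an affine index permutation (an illustrative Python sketch with arbitrarily chosen parameters, not the specific transformation proposed in the paper):

```python
import math

def affine_perm(n, a, b):
    """Index permutation i -> (a*i + b) mod n; bijective when gcd(a, n) == 1."""
    assert math.gcd(a, n) == 1, "a must be coprime to n for a bijection"
    return [(a * i + b) % n for i in range(n)]

class ObfuscatedArray:
    """Stores elements in permuted slots so the physical layout no longer
    reveals the logical access pattern; every read re-applies the mapping."""
    def __init__(self, data, a=3, b=1):
        self.a, self.b, self.n = a, b, len(data)
        perm = affine_perm(self.n, a, b)
        self.store = [None] * self.n
        for i, v in enumerate(data):
            self.store[perm[i]] = v

    def __getitem__(self, i):
        return self.store[(self.a * i + self.b) % self.n]

arr = ObfuscatedArray([10, 20, 30, 40, 50])
logical = [arr[i] for i in range(5)]   # reads still yield the original order
```

A reverse engineer inspecting `arr.store` sees the scrambled layout `[40, 10, 30, 50, 20]`, while every logical read through the mapping still returns the original sequence.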
Abstract: This paper describes an optimal approach for feature
subset selection to classify leaves, based on the Genetic Algorithm
(GA) and Kernel-Based Principal Component Analysis (KPCA). Due
to the high complexity of selecting optimal features, the
classification has become a critical task in analysing leaf image
data. Initially, the shape, texture, and colour features are extracted
from the leaf images. These extracted features are optimized through
the separate functioning of GA and KPCA. This approach performs
an intersection operation over the subsets obtained from the
optimization process. Finally, the most common matching subset is
forwarded to train the Support Vector Machine (SVM). Our
experimental results successfully prove that the application of GA
and KPCA for feature subset selection using SVM as a classifier is
computationally effective and improves the accuracy of the classifier.
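The intersection step described above, taking the features common to the GA-selected and KPCA-selected subsets before training the SVM, can be sketched as follows (the feature names are hypothetical illustrations, and the GA and KPCA stages themselves are not shown):

```python
def common_feature_subset(ga_subset, kpca_subset):
    """Return the features selected by both the GA and the KPCA stages;
    this common matching subset is what gets forwarded to train the SVM."""
    return sorted(set(ga_subset) & set(kpca_subset))

# hypothetical shape/texture/colour feature names for illustration
ga_subset = ["aspect_ratio", "mean_hue", "contrast", "solidity"]
kpca_subset = ["mean_hue", "entropy", "solidity"]
common = common_feature_subset(ga_subset, kpca_subset)
```

Taking the intersection keeps only features that both optimizers independently judged informative, which is the dimensionality-reduction effect the abstract credits for the classifier's improved accuracy.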
Abstract: The group mutual exclusion (GME) problem is a
variant of the mutual exclusion problem. In the present paper a
token-based group mutual exclusion algorithm, capable of handling
transient faults, is proposed. The algorithm uses the concept of
dynamic request sets. A time-out mechanism is used to detect
token loss, and a distributed scheme is used to regenerate the token.
The worst-case message complexity of the algorithm is n+1. The
maximum concurrency and forum-switch complexity of the
algorithm are n and min(n, m), respectively, where n is the number of
processes and m is the number of groups. The algorithm also satisfies
another desirable property called smooth admission. The scheme can
also be adapted to handle the extended group mutual exclusion
problem.
Abstract: This paper describes how the correct endian mode of
the TMS320C6713 DSK board can be identified. It also explains how
the TMS320C6713 DSK board can be used in the little endian and in
the big endian modes for assembly language programming in
particular and for signal processing in general. Similarly, it discusses
how crucially important it is for a user of the TMS320C6713 DSK
board to identify the mode of operation and then use it correctly
during the development stages of the assembly language
programming; otherwise, it will cause unnecessary confusion and
erroneous results as far as storing data into and loading data
from memory are concerned. Furthermore, it highlights and
strongly recommends that users of the TMS320C6713 DSK board
be aware of the availability and importance of the various display
options in the Code Composer Studio (CCS) for correctly
interpreting and displaying the desired data in memory. The
information presented in this paper will be of great importance and
interest to those practitioners and developers who want to use the
TMS320C6713 DSK board for assembly language programming as
well as input-output signal processing manipulations. Finally,
examples that clearly illustrate the concept are presented.
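The core idea of identifying an endian mode, writing a known multi-byte value and inspecting which byte lands first in memory, can be illustrated on a host machine (a Python sketch of the principle only; it does not interact with the TMS320C6713 DSK or CCS):

```python
import struct
import sys

def detect_endianness():
    """Pack the 32-bit value 0x01020304 in native byte order and inspect
    the first stored byte: 0x01 means big endian, 0x04 little endian."""
    first_byte = struct.pack("=I", 0x01020304)[0]
    return "big" if first_byte == 0x01 else "little"

# sanity check against the interpreter's own report of the host byte order
matches_host = detect_endianness() == sys.byteorder
```

The same probe, implemented in assembly on the DSK (store a known word, then load it back byte by byte), tells the programmer which mode the board is operating in before any data-layout-sensitive code is written.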
Abstract: Transient simulation of power electronic circuits is of
considerable interest to the designer. The switching nature of the
devices used permits development of specialized algorithms which
allow a considerable reduction in simulation time compared to
general-purpose simulation algorithms. This paper describes a
method used to simulate power electronic circuits using the
SIMULINK toolbox within the MATLAB software. Theoretical results
are presented that provide the basis for transient analysis of power
electronic circuits.
Abstract: Sedimentation is a hydraulic phenomenon that is
emerging as a serious challenge in river engineering. When the flow
reaches a state in which it gathers sufficient potential energy, it shifts the
sediment load along the channel bed. The transport of such materials can
be in the form of suspended and bed loads. The movement of these loads
along the river course and channels, and the ways in which this could
influence the water intakes, are considered major challenges for the
sustainable O&M of hydraulic structures. This could be very serious
in arid and semi-arid regions like Iran, where inappropriate watershed
management could lead to shifting a great deal of sediments into the
reservoirs and irrigation systems. This paper aims to investigate
sedimentation in the Western Canal of Dez Diversion Weir in Iran,
identifying the factors which influence the process and providing ways
to mitigate its detrimental effects using the SHARC
software.
For the purposes of this paper, data from the Dezful water authority
and the Dezful Hydrometric Station, pertinent to a river course of about 6
km, were used.
The results estimated the sand and silt bed load concentrations to be 193
ppm and 827 ppm, respectively. Given the available data on average
annual bed loads and average suspended sediment loads of 165 ppm
and 837 ppm, there was a significant statistical difference (16%)
for the sand grains, whereas no significant difference (1.2%)
was found for the silt grain sizes. One explanation for this finding
is that along the 6 km river course there are considerable
meandering effects, which explain the recent shift in the hydraulic
behavior along the stream course under investigation. The sand
concentration downstream, relative to the present state of the canal,
showed a steeply descending curve. Sediment trapping, on the other
hand, showed a steeply ascending curve. These results occurred because the
diversion weir was not considered in the simulation model.