Abstract: Logistics is the part of the supply chain process that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research investigates the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Johor. It therefore focuses on the types of technology being adopted and the factors, benefits and barriers affecting innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSPs). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed that SCM technology adoption among LSPs was high: they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption. In contrast, E-Commerce technology adoption among LSPs remains relatively low.
Abstract: In this paper, a new formulation for acoustics coupled with linear elasticity is presented. The primary objective of the work is to develop a three-dimensional hp-adaptive finite element code intended for modeling the acoustics of the human head. The code will have numerous applications, e.g. in designing hearing protection devices for individuals working in high-noise environments. The presented work is at a preliminary stage. The variational formulation has been implemented and tested on a sequence of meshes with concentric multi-layer spheres, with material data representing the tissue (the brain), the skull and the air. Thus, an efficient solver for coupled elasticity/acoustics problems has been developed and tested on high-contrast material data representing the human head.
Abstract: Obfuscation is a low-cost software protection methodology for preventing reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide its functionality. This paper proposes an array data transformation to obfuscate source code that uses arrays. Applications using the proposed data structures force the programmer to obscure the logic manually. This makes the resulting obscured code hard to reverse engineer and also protects the functionality of the code.
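The general idea of an array data transformation can be illustrated with a small sketch. The example below is hypothetical (the permutation constants and the class name are invented for illustration, not taken from the paper): the logical index of each element is mapped to a permuted physical index, so the stored layout no longer reveals the access pattern.

```python
from math import gcd

# Hypothetical array index-permutation obfuscation: element i of the
# logical array is stored at physical index (a*i + b) % n.

class ObfuscatedArray:
    def __init__(self, n, a=7, b=3):
        # a must be coprime to n so the index mapping is a bijection
        assert n > 0 and gcd(a, n) == 1
        self.n, self.a, self.b = n, a, b
        self._store = [None] * n

    def _phys(self, i):
        return (self.a * i + self.b) % self.n

    def __setitem__(self, i, value):
        self._store[self._phys(i)] = value

    def __getitem__(self, i):
        return self._store[self._phys(i)]

arr = ObfuscatedArray(10)
for i in range(10):
    arr[i] = i * i

print(arr[3])      # the logical view is unchanged: 9
print(arr._store)  # the physical layout is scrambled
```

A reverse engineer inspecting the raw memory layout sees the scrambled order, while code written against the logical interface behaves normally.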
Abstract: This paper describes an optimal approach for feature subset selection to classify leaves, based on a Genetic Algorithm (GA) and Kernel-Based Principal Component Analysis (KPCA). Due to the high complexity of selecting the optimal features, classification has become a critical task in analysing leaf image data. Initially, shape, texture and colour features are extracted from the leaf images. These extracted features are optimized through the separate application of GA and KPCA. The approach then performs an intersection operation over the subsets obtained from the optimization process. Finally, the most common matching subset is forwarded to train a Support Vector Machine (SVM). Our experimental results show that applying GA and KPCA for feature subset selection, with an SVM as the classifier, is computationally effective and improves the accuracy of the classifier.
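The intersection step described above can be sketched minimally. The feature names below are invented stand-ins for the GA and KPCA outputs, not the paper's actual leaf features:

```python
# Hypothetical feature subsets as produced by the two optimizers;
# only features selected by BOTH methods are kept for the SVM.
ga_subset   = {"aspect_ratio", "solidity", "mean_hue", "contrast", "entropy"}
kpca_subset = {"solidity", "mean_hue", "entropy", "correlation"}

common = sorted(ga_subset & kpca_subset)  # set intersection
print(common)  # these features would be used to train the SVM classifier
```

The intersection keeps only features that both selection methods agree are informative, which is the "most common matching subset" the abstract refers to.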
Abstract: The thermal, epithermal and fast neutron fluxes were calculated for three irradiation channels at the Egypt Second Research Reactor (ETRR-2) using the CITVAP code. The validity of the calculations was verified by experimental measurements. Some deviations exist between the measurements and the calculations, due to approximations in the calculation models used, homogenization of regions, condensation of energy groups and uncertainty in the nuclear data used. Neutron flux data for the three irradiation channels are now available, which will enable prediction of the irradiation conditions needed for future radioisotope production.
Abstract: Nowadays, when most of the leading economies are service-oriented and e-business is widely used for their management, supply chain management has become one of the most studied and practiced fields. Quality plays an important role in today's business processes, so it is important to understand the impact of IT service quality on the performance of supply chains. This paper starts by analyzing the Supply Chain Operations Reference (SCOR) model and each of its five activities: Plan, Source, Make, Deliver, and Return. It then proposes a framework for analyzing the effect of IT service quality on supply chain performance. Using the proposed framework, hypotheses are framed for the direct effect of IT service quality on supply chain performance and its indirect effect through effective supply chain management. The framework will be validated empirically based on surveys of executives of various organizations and statistical analyses of the collected data.
Abstract: This paper describes how the correct endian mode of the TMS320C6713 DSK board can be identified. It also explains how the TMS320C6713 DSK board can be used in the little-endian and big-endian modes for assembly language programming in particular and for signal processing in general. It further discusses how crucially important it is for a user of the TMS320C6713 DSK board to identify the mode of operation and use it correctly during the development of assembly language programs; otherwise, unnecessary confusion and erroneous results will arise when storing data into and loading data from memory. Furthermore, it highlights and strongly recommends that users of the TMS320C6713 DSK board be aware of the availability and importance of the various display options in Code Composer Studio (CCS) for correctly interpreting and displaying the desired data in memory. The information presented in this paper will be of great importance and interest to practitioners and developers who want to use the TMS320C6713 DSK board for assembly language programming as well as input-output signal processing. Finally, examples that clearly illustrate the concepts are presented.
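The memory-layout issue the abstract warns about is not specific to the C6713; a general illustration (in Python rather than C6713 assembly) shows why the same 32-bit word is stored as different byte sequences in the two modes:

```python
import struct
import sys

# The same 32-bit word laid out under the two endian conventions.
word = 0x12345678
little = struct.pack("<I", word)  # least significant byte first
big    = struct.pack(">I", word)  # most significant byte first

print(little.hex())  # 78563412
print(big.hex())     # 12345678
print(sys.byteorder)  # endianness of the host running this script
```

Reading back bytes stored under one convention while assuming the other is exactly the kind of erroneous result the paper cautions against.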
Abstract: Sedimentation is a hydraulic phenomenon that is emerging as a serious challenge in river engineering. When the flow reaches a state in which it gathers sufficient potential energy, it shifts the sediment load along the channel bed. The transported material can take the form of suspended and bed loads. The movement of these loads along river courses and channels, and the ways in which it can affect water intakes, is considered a major challenge for the sustainable operation and maintenance of hydraulic structures. This can be very serious in arid and semi-arid regions such as Iran, where inappropriate watershed management can shift a great deal of sediment into reservoirs and irrigation systems. This paper investigates sedimentation in the Western Canal of the Dez Diversion Weir in Iran, identifying factors that influence the process and providing ways to mitigate its detrimental effects using the SHARC software.
For the purposes of this paper, data from the Dezful water authority and the Dezful Hydrometric Station pertaining to a river course of about 6 km were used.
The results estimated the sand and silt bed load concentrations to be 193 ppm and 827 ppm respectively. Given the available data on average annual bed loads and average suspended sediment loads of 165 ppm and 837 ppm, there was a significant statistical difference (16%) for the sand grains, whereas no significant difference (1.2%) was found for the silt grain sizes. One explanation for this finding is that along the 6 km river course there are considerable meandering effects, which explains the recent shift in hydraulic behavior along the stream course under investigation. The sand concentration downstream, relative to the present state of the canal, showed a steeply descending curve. Sediment trapping, on the other hand, showed a steeply ascending curve. These results occurred because the diversion weir was not considered in the simulation model.
Abstract: The electromagnetic imaging of inhomogeneous dielectric cylinders buried in a slab medium under transverse electric (TE) wave illumination is investigated. Dielectric cylinders of unknown permittivities are buried in the second space and scatter a group of unrelated waves incident from the first space, where the scattered field is recorded. By proper arrangement of the various unrelated incident fields, the difficulties of ill-posedness and nonlinearity are circumvented, and the permittivity distribution can be reconstructed through simple matrix operations. The algorithm is based on the moment method and the unrelated illumination method. Numerical results are given to demonstrate the capability of the inverse algorithm. Good reconstruction is obtained even in the presence of additive Gaussian random noise in the measured data. The effect of noise on the reconstruction result is also investigated.
Abstract: In this paper, we propose an efficient data compression strategy exploiting the multi-resolution characteristic of the wavelet transform. We have developed a sensor node called the “Smart Sensor Node" (SSN). The main goals of the SSN design are light weight, minimal power consumption, modular design and robust circuitry. The SSN is made up of four basic components: a sensing unit, a processing unit, a transceiver unit and a power unit. The FiOStd evaluation board was chosen as the main controller of the SSN for its low cost and high performance. The software implementation was done using a Simulink model and the MATLAB programming language. The experimental results show that the proposed data compression technique yields recovered signals of good quality. This technique can be applied to compress the collected data so as to reduce data communication as well as the energy consumption of the sensor, thereby extending the lifetime of the sensor node.
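A minimal sketch of the idea, assuming the common one-level Haar wavelet form of such a scheme (the paper's actual wavelet, level count and threshold are not specified here): the signal is split into averages and details, small detail coefficients are dropped, and the signal is reconstructed.

```python
# One-level Haar wavelet compression sketch (illustrative, not the SSN's
# actual implementation). Signal length must be even.

def haar_forward(x):
    avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    det = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, det

def haar_inverse(avg, det):
    x = []
    for a, d in zip(avg, det):
        x += [a + d, a - d]
    return x

signal = [4.0, 4.2, 8.0, 7.8, 1.0, 1.2, 5.0, 5.1]
avg, det = haar_forward(signal)
det_c = [d if abs(d) > 0.5 else 0.0 for d in det]  # drop small details
recovered = haar_inverse(avg, det_c)
print(recovered)  # close to the original signal
```

Only the averages and the few surviving detail coefficients would need to be transmitted, which is the source of the communication and energy savings the abstract describes.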
Abstract: Personal name matching is a core task in national citizen databases, text and web mining, information retrieval, online library systems, e-commerce and record linkage systems. This has necessitated comprehensive research on name matching. Traditional name matching methods are suitable for English and other Latin-based languages. Asian languages that have no word boundaries, such as the Myanmar language, still require a sounds-alike matching system in Unicode-based applications. Hence, we propose a matching algorithm that derives analogous sounds-alike (phonetic) patterns suited to Myanmar character spelling. Given the nature of Myanmar characters, we consider word boundary fragmentation and character collation. We thus use a pattern conversion algorithm that converts fragmented and collated words into patterns. We create Myanmar sounds-alike phonetic groups to assist in the phonetic matching. The experimental results show a fragmentation accuracy of 99.32% and a processing time of 1.72 ms.
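The phonetic-group idea can be sketched in miniature. The groups below use Latin letters purely for illustration (the paper's actual groups are for Myanmar script): characters in the same sounds-alike group are mapped to one canonical representative before comparison.

```python
# Toy sounds-alike matching via phonetic groups (illustrative groups,
# not the Myanmar groups from the paper).
GROUPS = [set("ck"), set("sz"), set("dt")]

def canonical(word):
    out = []
    for ch in word:
        for g in GROUPS:
            if ch in g:
                ch = min(g)  # one representative per group
                break
        out.append(ch)
    return "".join(out)

def sounds_alike(a, b):
    return canonical(a) == canonical(b)

print(sounds_alike("cat", "kad"))  # True: c~k and t~d
print(sounds_alike("cat", "cap"))  # False
```

Two spellings match when their canonical patterns coincide, which is the essence of the sounds-alike comparison described above.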
Abstract: This paper describes the design and implementation of a cyber video consultation system (CVCS) using hybrid P2P for video consultation between remote sites. The proposed system is based on a client-server and P2P (peer-to-peer) architecture, where client-server communication is used with the MCU (Multipoint Control Unit) and P2P is used for the cyber video consultation itself. The developed video consultation system decreases server traffic and cuts down network expenses, as the multimedia data is decentralized to the clients by the hybrid P2P architecture. The developed system was also tested as a group-type video consultation system using the communication protocol and application software over Ethernet networks.
Abstract: Baseball is unique among sports in Taiwan. It has become a “symbol of the Taiwanese spirit and Taiwan's national sport". Taiwan's first professional sports league, the Chinese Professional Baseball League (CPBL), was established in 1989. Starters pitch many more innings over the course of a season, and for a century teams have made all their best pitchers starters. In this study, we attempt to determine the on-field performance of these pitchers and which of them won the most CPBL games in 2009. We utilize the discriminant analysis approach to solve the problem, examining winning pitchers and their statistics to reliably find the best starting pitcher. The data employed in this paper include innings pitched (IP), earned run average (ERA) and walks plus hits per inning pitched (WHIP), provided by the official website of the CPBL. The results show that Aaron Rakers was the best starting pitcher of the CPBL. The top 10 CPBL starting pitchers won 14 to 8 games in the 2009 season. Through Fisher discriminant analysis, the top 10 CPBL starting pitchers were predicted to win 20 to 9 games, 1 to 7 games more than their actual counts for the 2009 season.
Abstract: This paper proposes a new facial feature extraction approach, the Walsh-Hadamard Transform (WHT). This approach is based on the correlation between local pixels of the face image. Its primary advantage is the simplicity of its computation. The paper compares the proposed approach, WHT, which has traditionally been used in data compression, with two other well-known approaches: Principal Component Analysis (PCA) and the Discrete Cosine Transform (DCT), using the face database of the Olivetti Research Laboratory (ORL). In spite of its simple computation, the proposed algorithm (WHT) gave results very close to those obtained by PCA and DCT. This paper initiates research into the WHT and the family of frequency transforms, and examines their suitability for feature extraction in face recognition applications.
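The computational simplicity claimed for the WHT is visible in its fast butterfly form, which uses only additions and subtractions. The sketch below (a toy 8-sample vector, not real face-image data) shows the standard unnormalized fast Walsh-Hadamard transform:

```python
# Fast Walsh-Hadamard Transform (unnormalized); input length must be 2**k.
def fwht(x):
    y = list(x)
    h = 1
    while h < len(y):
        for i in range(0, len(y), 2 * h):
            for j in range(i, i + h):
                a, b = y[j], y[j + h]
                y[j], y[j + h] = a + b, a - b  # butterfly: sums and differences only
        h *= 2
    return y

row = [1, 0, 1, 0, 0, 1, 1, 0]  # toy stand-in for a row of face-image pixels
coeffs = fwht(row)
print(coeffs)
# WHT is its own inverse up to scaling: applying it twice and dividing by n
# recovers the input.
print([c / len(row) for c in fwht(coeffs)])
```

Unlike the DCT, no multiplications (beyond the final scaling) are needed, which is why the transform is attractive for low-cost feature extraction.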
Abstract: Increasing detection rates and reducing false positive rates are important problems in Intrusion Detection Systems (IDS). Although preventative techniques such as access control and authentication attempt to keep intruders out, they can fail, and intrusion detection has been introduced as a second line of defence. Rare events are events that occur very infrequently, and the detection of rare events is a common problem in many domains. In this paper we propose an intrusion detection method that combines rough set theory and fuzzy clustering. Rough set theory is used to decrease the amount of data and eliminate redundancy. Fuzzy c-means clustering allows objects to belong to several clusters simultaneously, with different degrees of membership. Our approach allows us not only to recognize known attacks but also to detect suspicious activity that may be the result of a new, unknown attack. The experimental results on the Knowledge Discovery and Data Mining (KDD Cup 1999) dataset show that the method is efficient and practical for intrusion detection systems.
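The partial-membership property that makes fuzzy c-means attractive for spotting ambiguous (possibly novel) activity can be sketched with the standard membership formula, assuming the usual fuzzifier m = 2 (the paper's exact parameters are not given here):

```python
# Fuzzy c-means membership degrees for one 1-D point, fuzzifier m = 2.
def memberships(point, centres, m=2):
    d = [abs(point - c) for c in centres]
    if 0.0 in d:  # point coincides with a centre
        return [1.0 if di == 0 else 0.0 for di in d]
    return [1.0 / sum((di / dj) ** (2 / (m - 1)) for dj in d) for di in d]

centres = [1.0, 5.0, 9.0]      # illustrative cluster centres
u = memberships(3.0, centres)  # a point midway between two clusters
print(u)        # partial membership in several clusters at once
print(sum(u))   # membership degrees always sum to 1
```

A record with no strong membership in any "normal" cluster is exactly the kind of suspicious activity the abstract says the method can flag.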
Abstract: This study examines the issue of recommendation sources from the perspectives of gender and consumers' perceived risk, and validates a model for the antecedents of consumer online purchases. Quantitative data were obtained by means of a survey questionnaire. Data were collected via questionnaires from 396 undergraduate students aged 18-24, and a multiple regression analysis was conducted to identify causal relationships. Empirical findings established the link between recommendation sources (word-of-mouth, advertising, and recommendation systems) and the likelihood of making online purchases, and demonstrated the role of gender and perceived risk as moderators in this context. The results showed that the effects of word-of-mouth on online purchase intentions were stronger than those of advertising and recommendation systems. In addition, female consumers have less experience with online purchases, so they may be more likely than males to refer to recommendations during the decision-making process. The findings of the study will help marketers to address the recommendation factors that influence consumers' intention to purchase and to improve firm performance to meet consumer needs.
Abstract: The CMLP building was developed to be a model for sustainability, with strategies to reduce water use, energy use and pollution, and to provide a healthy environment for the building occupants. The aim of this paper is to investigate the environmental effects of the energy used by this building. A life cycle assessment (LCA) was conducted to measure the real environmental effects produced by the use of energy. The impact categories most affected by energy use were found to be human health effects and ecotoxicity. Natural gas extraction, uranium milling for nuclear energy production, and blasting for mining and infrastructure construction are the processes contributing most to emissions in the human health category. Comparing the LCA results of the CMLP building with those of a conventional building showed that the energy used by the CMLP building causes less damage to the environment and human health than that used by a conventional building.
Abstract: In recent years, the use of vector variance as a measure of multivariate variability has received much attention in a wide range of statistics. This paper deals with a more economical measure of multivariate variability, defined as the vector variance minus all duplicated elements. For high-dimensional data, this increases computational efficiency by almost 50% compared to the original vector variance. Its sampling distribution is investigated to make its application possible.
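A small numerical sketch of the saving, assuming the usual definition of vector variance as the sum of squares of all covariance-matrix entries (the economic version then counts each off-diagonal entry only once; the matrix below is an invented example):

```python
# Illustrative symmetric sample covariance matrix (3 variables).
S = [[4.0, 1.0, 0.5],
     [1.0, 3.0, 0.2],
     [0.5, 0.2, 2.0]]
p = len(S)

# Full vector variance: every off-diagonal entry is counted twice.
vector_variance = sum(S[i][j] ** 2 for i in range(p) for j in range(p))

# Economic measure: upper triangle only, duplicates removed.
economic = sum(S[i][j] ** 2 for i in range(p) for j in range(i, p))

print(vector_variance)
print(economic)
```

For large p the economic sum touches p(p+1)/2 entries instead of p², which is the source of the roughly 50% saving mentioned above.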
Abstract: An Eulerian numerical method is proposed to analyze explosions in tunnels. Based on this method, an original software package, M-MMIC2D, was developed in the Cµ programming language. With this software, the explosion problem in a tunnel with three expansion chambers is numerically simulated, and the results are found to be in full agreement with the observed experimental data.
Abstract: The purpose of this paper is to demonstrate the ability of a genetic programming (GP) algorithm to evolve a team of data classification models. The GP algorithm used in this work is “multigene" in nature, i.e. there are multiple tree structures (genes) that represent the team members. Each team member assigns a data sample to one of a fixed set of output classes. A majority vote, determined using the mode (highest occurrence) of the classes predicted by the individual genes, is used to determine the final class prediction. The algorithm is tested on a binary classification problem. For the case study investigated, compact classification models are obtained with accuracy comparable to alternative approaches.
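The majority-vote step described above reduces to taking the mode of the genes' predictions. A minimal sketch (the votes are hard-coded stand-ins; in the real algorithm each vote would come from evaluating one evolved tree on the sample):

```python
from statistics import mode

def team_predict(gene_votes):
    # final class = highest-occurrence class among the team members' votes
    return mode(gene_votes)

votes = [1, 0, 1, 1, 0]    # five genes voting on a binary-class sample
print(team_predict(votes)) # 1
```

An odd team size avoids ties on a binary problem, which keeps the mode well defined.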