Abstract: Wireless sensor networks (WSNs) have gained
tremendous attention in recent years due to their numerous
applications. Because of their limited energy resources, energy-efficient
operation of sensor nodes is a key issue in wireless sensor networks.
Cooperative caching, which enables data sharing among neighboring
nodes, reduces the number of transmissions over the wireless
channel and thus extends the overall lifetime of a wireless sensor
network. In this paper, we propose a cooperative caching scheme
called ZCS (Zone Cooperation at Sensors) for wireless sensor
networks. In the ZCS scheme, one-hop neighbors of a sensor node form a
cooperative cache zone and share the cached data with each other.
Simulation experiments show that the ZCS caching scheme achieves
significant improvements in byte hit ratio and average query latency
in comparison with other caching strategies.
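As an illustration of the zone-cooperation idea described above, the sketch below implements a hypothetical zone cache lookup: a node checks its own cache, then the caches of its one-hop neighbors, and only contacts the remote source on a zone-wide miss. The class names, eviction policy, and cache size are assumptions for illustration, not the authors' actual ZCS protocol.

```python
# Hypothetical sketch of zone-based cooperative cache lookup (not the
# paper's ZCS protocol): local cache first, then one-hop neighbors'
# caches (the "zone"), and the remote source only on a zone-wide miss.

class SensorNode:
    def __init__(self, node_id, capacity=2):
        self.node_id = node_id
        self.capacity = capacity
        self.cache = {}          # data_id -> value
        self.neighbors = []      # one-hop neighbors forming the zone

    def lookup(self, data_id, source):
        if data_id in self.cache:                 # local hit: zero hops
            return self.cache[data_id], "local"
        for nb in self.neighbors:                 # zone hit: one wireless hop
            if data_id in nb.cache:
                return nb.cache[data_id], "zone"
        value = source[data_id]                   # zone miss: costly remote fetch
        if len(self.cache) >= self.capacity:      # naive FIFO-style eviction
            self.cache.pop(next(iter(self.cache)))
        self.cache[data_id] = value
        return value, "remote"

source = {"d1": 10, "d2": 20}
a, b = SensorNode("a"), SensorNode("b")
a.neighbors, b.neighbors = [b], [a]
print(a.lookup("d1", source)[1])   # first fetch goes to the remote source
print(b.lookup("d1", source)[1])   # second request is served from a's cache
```

Serving the second request from a neighbor's cache is exactly the saving that raises byte hit ratio and lowers query latency in the abstract's terms.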
Abstract: 3G mobile networks have recently experienced an explosion
in data traffic, driven by the rapid growth in the number of smartphone
users. Unlike traditional wired infrastructure, 3G mobile networks
have limited wireless resources and rely on complex signaling
procedures for wireless resource management. Moreover, security
mechanisms against various kinds of abnormal and malicious traffic
are not yet mature. Malicious or potentially malicious traffic
originating from malware-infected smart devices can therefore cause
serious problems in 3G mobile networks, analogous to DoS and
scanning attacks in wired networks. This paper describes the DoS
security threat in 3G mobile networks and proposes a detection
technique.
Abstract: In this paper we propose a new traffic simulation
package, TDMSim, which supports both macroscopic and
microscopic simulation on free-flowing and regulated traffic systems.
Both simulators are based on travel demands, which specify the
numbers of vehicles departing from origins to arrive at different
destinations. The microscopic simulator implements a car-following
model over pre-defined vehicle routes and also supports the rerouting
of vehicles. We further propose a macroscopic simulator, built in
integration with the microscopic one, which allows the simulation to
scale to larger networks without sacrificing the precision achievable
through the microscopic simulator. The macroscopic simulator also
enables the reuse of previous simulation results when simulating
traffic on the same networks at a later time. Validations have been
conducted to show the correctness of both simulators.
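Although the abstract does not specify TDMSim's car-following model, the general shape of a microscopic car-following update can be sketched as follows. The acceleration rate, speed cap, and gap rule here are illustrative assumptions, not TDMSim's actual parameters.

```python
# A generic, minimal car-following update (illustrative only): each
# vehicle accelerates toward a maximum speed but never closes more
# than the available gap to its leader within one time step.

def step(positions, speeds, dt=1.0, v_max=30.0, accel=2.0, min_gap=5.0):
    """Advance all vehicles one time step; positions[0] is the leader."""
    new_pos, new_spd = [], []
    for i, (x, v) in enumerate(zip(positions, speeds)):
        desired = min(v + accel * dt, v_max)            # free-flow acceleration
        if i > 0:
            gap = new_pos[i - 1] - x - min_gap          # gap to updated leader
            desired = max(0.0, min(desired, gap / dt))  # never overrun leader
        new_spd.append(desired)
        new_pos.append(x + desired * dt)
    return new_pos, new_spd

pos, spd = [100.0, 90.0, 70.0], [20.0, 20.0, 20.0]
for _ in range(10):
    pos, spd = step(pos, spd)
# followers always keep at least min_gap behind their leaders
assert all(pos[i] > pos[i + 1] for i in range(len(pos) - 1))
```

Updating the leader first and bounding each follower's speed by the gap is what keeps the toy model collision-free at any time step size.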
Abstract: Linear stochastic estimation and quadratic stochastic
estimation techniques were applied to estimate the entire velocity
flow-field of an open cavity with a length-to-depth ratio of 2. The
estimations used instantaneous velocity magnitudes as estimators.
These measurements were obtained by
Particle Image Velocimetry. The predicted flow was compared
against the original flow-field in terms of the Reynolds stresses and
turbulent kinetic energy. Quadratic stochastic estimation proved
superior to linear stochastic estimation in resolving the shear
layer flow. When the velocity fluctuations were scaled up in the
quadratic estimate, both the time-averaged quantities and the
instantaneous cavity flow could be predicted to a reasonably accurate extent.
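The two estimation techniques compared above can be sketched on synthetic data: LSE picks the single coefficient A = ⟨u·u_ref⟩/⟨u_ref²⟩ that minimizes the mean-square error, while QSE adds a quadratic term in the estimator. The synthetic signal below is an assumption standing in for the paper's PIV measurements.

```python
import numpy as np

# Illustrative linear stochastic estimation (LSE) vs. quadratic
# stochastic estimation (QSE) on a synthetic reference signal.

rng = np.random.default_rng(0)
u_ref = rng.standard_normal(10_000)              # reference "estimator" signal
u = 2.0 * u_ref + 0.5 * u_ref**2 + 0.1 * rng.standard_normal(10_000)

# LSE: single linear coefficient from two-point correlations
A = np.mean(u * u_ref) / np.mean(u_ref**2)
u_lse = A * u_ref

# QSE: least-squares fit of linear + quadratic terms
X = np.column_stack([u_ref, u_ref**2])
coeffs, *_ = np.linalg.lstsq(X, u, rcond=None)
u_qse = X @ coeffs

mse_lse = np.mean((u - u_lse) ** 2)
mse_qse = np.mean((u - u_qse) ** 2)
assert mse_qse < mse_lse    # the quadratic estimate resolves the field better
```

Because the synthetic field has a genuine quadratic dependence on the estimator, QSE recovers it while LSE cannot, mirroring the abstract's finding for the shear layer.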
Abstract: I/O workload is a critical factor in analyzing I/O patterns
and maximizing file system performance. However, measuring the I/O
workload of a running distributed parallel file system is non-trivial
due to the collection overhead and the large volume of data involved.
In this paper, we measured and analyzed file system activities on
two large-scale cluster systems providing TFlops-level high-performance
computation resources. By comparing the file system activities of 2009
with those of 2006, we analyzed how I/O workloads have changed with
advances in system performance and high-speed network technology.
Abstract: The current study begins with an awareness that
today's media environment is characterized by technological
development and a new way of reading caused by the introduction of
the Internet. The researcher conducted a meta-analysis framed within
Technological Determinism to investigate the process of hypertext
reading, its differences from linear reading, and the effects such
differences can have on people's ways of mentally structuring their
world. The relationship between literacy and the comprehension
achieved by reading hypertexts is also investigated. The results show
that hypertexts are not always user-friendly. People experience
hyperlinks as interruptions that distract their attention, impairing
comprehension and causing disorientation. On the one hand, hypertextual
jumping between passages generates interruptions that ultimately make
people lose their concentration. On the other hand, hypertexts
fascinate people, who would rather read a document in such a format
even though the outcome is often frustrating and affects their ability
to elaborate and retain information.
Abstract: Optimization of rational geometrical and mechanical
parameters of a panel with curved plywood ribs is considered in this
paper. The panel consists of cylindrical ribs manufactured from
Finnish plywood, upper and bottom plywood flanges, and stiffness
diaphragms, and is filled with foam. The minimal ratio of the
structure's self-weight to the load that can be applied to the
structure is taken as the rationality criterion. The optimization is
performed using classical beam theory without nonlinearities, and the
discrete design variables are optimized with a genetic algorithm.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In wireless deployments, business scenarios help identify and understand the business needs involving wireless services, and thereby derive the business requirements that the architecture development has to address while taking various wireless challenges into account. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. The paper gives an overview of the business and technology environments and discusses examples of existing or proposed wireless solutions adopted, or to be adopted, in the region. Interactions among the key players, enabling technologies, and key processes in these wireless environments are studied. The analysis and discussion are divided into two areas, healthcare and education, where the merits of wireless solutions in improving quality of life are highlighted.
Abstract: The backpropagation algorithm generally employs a quadratic error function; indeed, most problems that involve minimization employ it. With alternative error functions, the performance of the optimization scheme can be improved: the alternatives help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of a complex-valued neural network using different error functions. In the first simulation, for a complex XOR gate, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex-valued neural network depends on the network architecture, whereas for a few other error functions the convergence speed is independent of the architecture.
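The error functions compared in this abstract can be written down directly. The sketch below uses real-valued residuals for simplicity (the paper's networks are complex-valued), and the Cauchy scale parameter c is an illustrative assumption.

```python
import numpy as np

# Quadratic, absolute, and Cauchy error functions applied to the same
# residuals; the last residual is an outlier, which dominates the
# quadratic loss far more than the robust alternatives.

def quadratic(e):          # standard sum-of-squares error
    return 0.5 * e**2

def absolute(e):           # L1 error: linear growth, robust to outliers
    return np.abs(e)

def cauchy(e, c=1.0):      # Cauchy (Lorentzian) error: logarithmic growth
    return 0.5 * c**2 * np.log1p((e / c) ** 2)

residuals = np.array([0.1, -0.2, 0.05, 5.0])   # last entry is an outlier
for f in (quadratic, absolute, cauchy):
    print(f.__name__, f(residuals).sum().round(3))

# outlier penalty: quadratic >> absolute > Cauchy
assert quadratic(residuals)[-1] > absolute(residuals)[-1] > cauchy(residuals)[-1]
```

The slower-than-quadratic growth of the absolute and Cauchy losses is precisely what suppresses the ill effects of outliers mentioned in the abstract.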
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it sustains the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA) operating directly on real variables was employed as the optimization technique. Since optimization via GAs is a long process, with most of the time consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. The optimization is thus carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Abstract: Computer worms, viruses, and Trojan horses, collectively
called malware, have become widespread. A decade ago, malware merely
damaged computers by deleting or rewriting important files; recent
malware, however, seems designed to earn money. Some malware collects
personal information so that malicious actors can obtain secrets such
as online banking passwords, evidence of a scandal, or contact
addresses related to a target. Moreover, the relation between money
and malware has become more complex: many kinds of malware spawn bots
to serve as springboards for further attacks. Meanwhile, the
countermeasures available to ordinary Internet users have come up
against a blank wall. Pattern matching wastes considerable computing
resources, since matching tools have to deal with the large number of
patterns derived from malware subspecies, and virus construction tools
can generate such subspecies automatically. Metamorphic and
polymorphic malware are no longer exceptional. Recently,
malware-checking sites have appeared that inspect content in place of
users' PCs; however, a new type of malicious site has emerged that
evades these checks. In this paper, existing web protocols and methods
are reconsidered in terms of protection from current attacks, and a
new protocol and method are proposed to improve the security of the
web.
Abstract: The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variations in temperature and contact pressure, was investigated. It was found that the method used in this work gives a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and their transitions. The results will be useful in the development of new facing materials.
Abstract: Independent spanning trees (ISTs) provide a number of advantages in data broadcasting; one can cite their use in fault-tolerant network protocols for distributed computing and in bandwidth utilization. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm to construct ISTs on hypercubes that requires minimal resources.
Abstract: This paper presents the design, analysis and
development of permanent magnet (PM) torque couplers. These
couplers employ rare-earth magnets. Based on finite element analysis
and earlier analytical work, both concentric and face-type synchronous
couplers have been designed and fabricated. The
experimental performance has good correlation with finite element
calculations.
Abstract: This paper presents an equivalent circuit model based on piecewise linear parallel branches (PLPB) to study partially shaded solar cell modules. The PLPB model can easily be used in circuit simulation software such as the ElectroMagnetic Transients Program (EMTP). It allows the user to simulate several different configurations of solar cells: the influence of partial shadowing on a single cell or on multiple cells, the influence of the number of solar cells protected by one bypass diode, and the effect of the cell connection configuration on partial shadowing.
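The parallel-branch idea behind PLPB can be sketched as follows: a nonlinear diode I-V curve is approximated by parallel linear branches, each conducting only above its breakpoint voltage, so the total current is a sum of clipped linear terms. The breakpoints and fitting rule below are illustrative assumptions, not the paper's actual parameterization.

```python
import math

# Illustrative piecewise-linear parallel-branch (PLPB) approximation of
# an exponential diode I-V characteristic.

def diode_current(v, i_s=1e-9, vt=0.026, n=1.5):
    """Reference exponential diode characteristic."""
    return i_s * (math.exp(v / (n * vt)) - 1.0)

def plpb_current(v, branches):
    """Parallel (breakpoint, conductance) branches; each conducts above its breakpoint."""
    return sum(g * max(0.0, v - vk) for vk, g in branches)

# Fit each branch's conductance so the PLPB curve matches the diode
# exactly at the next breakpoint (simple successive-segment construction).
breakpoints = [0.0, 0.3, 0.45, 0.55, 0.62]
branches = []
for vk, v_next in zip(breakpoints, breakpoints[1:] + [0.7]):
    target = diode_current(v_next) - plpb_current(v_next, branches)
    branches.append((vk, target / (v_next - vk)))

# The approximation is exact at the fitted breakpoints:
for v in [0.3, 0.45, 0.55, 0.62, 0.7]:
    assert abs(plpb_current(v, branches) - diode_current(v)) / diode_current(v) < 1e-9
```

Because each branch is linear, a model of this shape drops directly into a nodal circuit solver such as EMTP, which is the practical appeal of the PLPB form.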
Abstract: Medical image modalities such as computed
tomography (CT), magnetic resonance imaging (MRI), ultrasound
(US), and X-ray are used to diagnose disease. These modalities
provide flexible means of reviewing anatomical cross-sections and
physiological state in different parts of the human body. Raw
medical images have large file sizes and demand substantial storage,
so their size must be reduced to make them practical for telemedicine
applications. Image compression is thus a key factor in reducing the
bit rate for transmission or storage while maintaining acceptable
reproduction quality, but it is natural to raise the question of how
much an image can be compressed while still preserving sufficient
information for a given clinical application. Many techniques for
achieving data compression have
been introduced. In this study, three different MRI modalities which
are Brain, Spine and Knee have been compressed and reconstructed
using wavelet transform. Subjective and objective evaluation has
been done to investigate the clinical information quality of the
compressed images. For the objective evaluation, the results show
that the PSNR, which indicates the quality of the reconstructed image,
ranges from 21.95 dB to 30.80 dB for Brain, 27.25 dB to 35.75 dB for
Spine, and 26.93 dB to 34.93 dB for Knee images. For the subjective
evaluation test, the results show that a compression ratio of 40:1 was
acceptable for the brain images, whereas for the spine and knee images
50:1 was acceptable.
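The PSNR figures quoted above follow the standard definition PSNR = 10·log10(peak²/MSE) for 8-bit images. A minimal sketch, using synthetic data in place of the MRI images (which are not available here):

```python
import numpy as np

# Standard PSNR computation for 8-bit images: 10 * log10(255^2 / MSE).

def psnr(original, reconstructed, peak=255.0):
    mse = np.mean((original.astype(np.float64)
                   - reconstructed.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(1)
original = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
# a slightly distorted "reconstruction" stands in for the decompressed image
noisy = np.clip(original + rng.normal(0, 8.0, size=(64, 64)), 0, 255)

value = psnr(original, noisy)
print(round(value, 2))          # roughly 30 dB for noise sigma of about 8
```

Higher PSNR means smaller reconstruction error; the 21.95-35.75 dB range reported above corresponds to the trade-off between compression ratio and fidelity.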
Abstract: Extraction of laccase produced by L. polychrous in an
aqueous two-phase system, composed of polyethylene glycol and
phosphate salt at pH 7.0 and 25 °C was investigated. The effect of
PEG molecular weight, PEG concentration and phosphate
concentration was determined. Laccase preferentially partitioned to
the top phase. Good extraction of laccase to the top phase was
observed with PEG 4000. The optimum system contained 12% w/w PEG 4000
and 16% w/w phosphate salt, giving a partition coefficient (KE) of
88.3, a 3.0-fold purification factor, and a 99.1% yield.
Some properties of the enzyme such as thermal stability, effect of
heavy metal ions and kinetic constants were also presented in this
work. Thermal stability decreased sharply at temperatures above
60 °C. The enzyme was inhibited by Cd²⁺, Pb²⁺, Zn²⁺, and Cu²⁺.
The Vmax and Km values of the enzyme were 74.70 μmol/min/ml and
9.066 mM, respectively.
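The reported figures of merit follow standard aqueous two-phase partitioning definitions (KE = C_top/C_bottom; the top-phase yield depends on KE and the phase volume ratio). The sketch below uses an assumed phase volume ratio for illustration; only the defining formulas are standard.

```python
# Illustrative ATPS figures of merit. The phase volume ratio below is an
# invented example value, not taken from the paper.

def partition_coefficient(c_top, c_bottom):
    """KE: ratio of enzyme concentration in the top phase to the bottom phase."""
    return c_top / c_bottom

def top_phase_yield(ke, v_top, v_bottom):
    """Percentage of total enzyme activity recovered in the top phase."""
    return 100.0 / (1.0 + v_bottom / (v_top * ke))

ke = partition_coefficient(c_top=88.3, c_bottom=1.0)       # KE = 88.3 as reported
yield_pct = top_phase_yield(ke, v_top=1.0, v_bottom=0.8)   # assumed volume ratio
print(round(ke, 1), round(yield_pct, 1))
```

With so large a KE, almost all activity partitions to the top phase regardless of the exact volume ratio, which is why the reported yield approaches 100%.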
Abstract: The convergence of heterogeneous wireless access technologies characterizes 4G wireless networks. In such converged systems, seamless and efficient handoff between
different access technologies (vertical handoff) is essential and remains a challenging problem. The co-existence of access technologies with largely different characteristics creates the decision problem of determining the "best" available network at the
"best" time, so as to reduce unnecessary handoffs. This paper proposes a dynamic decision model for this problem. The proposed model makes the right vertical handoff decisions by determining the "best"
network at the "best" time among the available networks, based on dynamic
factors such as the received signal strength (RSS) of the network and
the velocity of the mobile station, together with static factors such as usage expense, link capacity (offered bandwidth), and power
consumption. The model not only meets individual user needs but also improves overall system performance by reducing unnecessary handoffs.
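A multi-attribute score of the kind such a decision model combines can be sketched as follows. The normalizations, weights, and candidate network values below are illustrative assumptions, not the paper's actual model.

```python
# Hedged sketch of a multi-attribute "best network" score combining the
# abstract's dynamic factors (RSS, station velocity) with its static
# ones (usage expense, link capacity, power consumption).

def network_score(rss_dbm, velocity_mps, cost, bandwidth_mbps, power_w,
                  weights=(0.35, 0.15, 0.15, 0.25, 0.10)):
    w_rss, w_vel, w_cost, w_bw, w_pwr = weights
    rss_term = (rss_dbm + 100) / 70.0            # normalize -100..-30 dBm to 0..1
    vel_penalty = min(velocity_mps / 30.0, 1.0)  # fast stations avoid small cells
    return (w_rss * rss_term
            - w_vel * vel_penalty
            - w_cost * cost                      # cost already normalized to 0..1
            + w_bw * min(bandwidth_mbps / 100.0, 1.0)
            - w_pwr * power_w)

# candidate networks: rss (dBm), velocity (m/s), cost, bandwidth (Mbps), power (W)
candidates = {
    "WLAN":     network_score(-60, 2.0, 0.1, 54.0, 0.8),
    "Cellular": network_score(-75, 2.0, 0.6, 10.0, 1.2),
}
best = max(candidates, key=candidates.get)
print(best)
```

Re-evaluating such a score only when inputs change significantly (rather than on every RSS sample) is one common way a model of this shape suppresses unnecessary handoffs.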
Abstract: Sediment and mangrove root samples from Iko River
Estuary, Nigeria were analyzed for microbial and polycyclic
aromatic hydrocarbon (PAH) content. The total heterotrophic
bacterial (THB) count ranged from 1.1×10⁷ to 5.1×10⁷ cfu/g, the total
fungal (TF) count from 1.0×10⁶ to 2.7×10⁶ cfu/g, and the total
coliform (TC) count from 2.0×10⁴ to 8.0×10⁴ cfu/g, while the
hydrocarbon-utilizing bacterial (HUB) count ranged from 1.0×10⁵ to
5.0×10⁵ cfu/g. A positive correlation (r = 0.72 to 0.93) was found
between the THB and HUB counts. The
organisms were Staphylococcus aureus, Bacillus cereus,
Flavobacterium breve, Pseudomonas aeruginosa, Erwinia
amylovora, Escherichia coli, Enterobacter sp, Desulfovibrio sp,
Acinetobacter iwoffii, Chromobacterium violaceum, Micrococcus
sedentarius, Corynebacterium sp, and Pseudomonas putrefaciens.
The PAHs were Naphthalene, 2-Methylnaphthalene, Acenaphthylene,
Acenaphthene, Fluorene, Phenanthrene, Anthracene, Fluoranthene,
Pyrene, Benzo(a)anthracene, Chrysene, Benzo(b)fluoranthene,
Benzo(k)fluoranthene, Benzo(a)pyrene, Dibenzo(a,h)anthracene,
Benzo(g,h,i)perylene, and Indeno(1,2,3-cd)pyrene, with individual PAH
concentrations ranging from 0.20 mg/kg to 1.02 mg/kg, 0.20 mg/kg
to 1.07 mg/kg, and 0.2 mg/kg to 4.43 mg/kg in the benthic sediment,
epipellic sediment, and mangrove roots, respectively. Total PAH
ranged from 6.30 to 9.93 mg/kg, 6.30 to 9.13 mg/kg, and 9.66 to
16.68 mg/kg in the benthic sediment, epipellic sediment, and
mangrove roots, respectively. The high concentrations in the
mangrove roots are indicative of bioaccumulation of the pollutant in
the plant tissue. The microorganisms are of ecological significance
and the detectable quantities of polycyclic aromatic hydrocarbon
could be partitioned and accumulated in tissues of infaunal and
epifaunal organisms in the study area.
Abstract: Tumor classification is a key area of research in the
field of bioinformatics. Microarray technology is commonly used in
the study of disease diagnosis using gene expression levels. The
main drawback of gene expression data is that it contains thousands
of genes but very few samples. Feature selection methods are used
to select the informative genes from the microarray. These methods
considerably improve the classification accuracy. In the proposed
method, Genetic Algorithm (GA) is used for effective feature
selection. Informative genes are identified based on the T-Statistics,
Signal-to-Noise Ratio (SNR) and F-Test values. The initial candidate
solutions of GA are obtained from top-m informative genes. The
classification accuracy of k-Nearest Neighbor (kNN) method is used
as the fitness function for GA. In this work, kNN and Support Vector
Machine (SVM) are used as the classifiers. The experimental results
show that the proposed work is suitable for effective feature
selection. With the selected genes, the GA-kNN method achieves 100%
accuracy on 4 of the 10 datasets, and the GA-SVM method on 5. The GA
with kNN and SVM classifiers is thus demonstrated to be an accurate
approach for microarray-based tumor classification.
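The GA-kNN pipeline described above can be sketched on synthetic data: binary chromosomes encode candidate gene subsets, and the leave-one-out accuracy of a kNN classifier on the selected genes serves as the GA fitness. The dataset sizes, GA parameters, and use of 1-NN are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Toy GA feature selection with kNN-accuracy fitness on synthetic data.

rng = np.random.default_rng(42)
n_samples, n_genes, n_informative = 40, 20, 3
y = rng.integers(0, 2, n_samples)
X = rng.standard_normal((n_samples, n_genes))
X[:, :n_informative] += 3.0 * y[:, None]         # first 3 "genes" carry signal

def knn_accuracy(mask):
    """Leave-one-out 1-NN accuracy using only the selected genes."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    d = np.linalg.norm(Xs[:, None] - Xs[None, :], axis=2)
    np.fill_diagonal(d, np.inf)                  # exclude the sample itself
    return float(np.mean(y[d.argmin(axis=1)] == y))

def evolve(pop_size=20, generations=15, p_mut=0.05):
    pop = rng.random((pop_size, n_genes)) < 0.3  # initial random gene subsets
    for _ in range(generations):
        fit = np.array([knn_accuracy(ind) for ind in pop])
        order = fit.argsort()[::-1]
        parents = pop[order[: pop_size // 2]]    # truncation selection
        cut = rng.integers(1, n_genes, pop_size // 2)
        kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                         parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cut)])   # one-point crossover
        kids ^= rng.random(kids.shape) < p_mut   # bit-flip mutation
        pop = np.vstack([parents, kids])
    fit = np.array([knn_accuracy(ind) for ind in pop])
    return pop[fit.argmax()], fit.max()

mask, acc = evolve()
print("selected genes:", np.flatnonzero(mask), "accuracy:", acc)
```

Keeping the parent half of the population each generation makes the best fitness monotone non-decreasing, a common choice when the fitness evaluation (here, leave-one-out kNN) is the dominant cost.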