Abstract: Access control is a critical security service in Wireless Sensor Networks (WSNs): it is required to prevent malicious nodes from joining the sensor network. On one hand, a WSN must be able to authorize users and grant them the right to access the network. On the other hand, a WSN must organize the data collected by sensors in such a way that an unauthorized entity (the adversary) cannot make arbitrary queries. This restricts network access to eligible users and sensor nodes only, so that queries from outsiders are neither answered nor forwarded by nodes. In this paper we present different access control schemes in order to identify their objectives, provisions, communication complexity, limits, etc. Using the node density parameter, we also compare these proposed access control algorithms according to the network topology, which can be flat or hierarchical.
Abstract: The stability of Newtonian and non-Newtonian extending films under local or global heating or cooling conditions is considered. The thickness-averaged mass, momentum and energy equations with convective and radiative heat transfer are derived, both for Newtonian and non-Newtonian fluids (Maxwell, PTT and Giesekus models are considered). The stability of the system is explored using either eigenvalue analysis or transient simulations. The results show that the influence of heating and cooling on stability strongly depends on the magnitude of the Peclet number. Examples of stabilization or destabilization by heating or cooling are shown for Pe
Abstract: Recently there has been growing interest in the field of bio-mimetic robots that resemble the behaviors of an insect or an aquatic animal, among many others. One of the various bio-mimetic robot applications is to explore pipelines, spotting any troubled areas or malfunctions and reporting the data. Moreover, the robot is able to prepare for and react to any abnormal routes in the pipeline. Special types of mobile robots are necessary for pipeline monitoring tasks. In order to move effectively along a pipeline, the robot's movement resembles that of insects or crawling animals. When situated in massive pipelines with complex routes, the robot places fixed sensors in several important spots in order to complete its monitoring. This monitoring task aims to prevent a major system failure by preemptively recognizing any minor or partial malfunctions. Areas not covered by fixed sensors usually cannot be observed and examined in real time, and thus depend on periodic offline monitoring. This paper proposes a monitoring system that is able to monitor the entire area of a pipeline, with and without fixed sensors, by using the bio-mimetic robot.
Abstract: Biofuels, such as biobutanol, are recognized as renewable and sustainable fuels that can be produced from lignocellulosic biomass. To convert lignocellulosic biomass to biofuel, pretreatment is an important step that removes hemicelluloses and lignin to improve enzymatic hydrolysis. Dilute acid pretreatment has been successfully developed for the pretreatment of corncobs, and the optimum conditions were obtained at 120 °C for 5 min with a 15:1 liquid-to-solid ratio for dilute sulfuric acid and at 140 °C for 10 min with a 10:1 liquid-to-solid ratio for dilute phosphoric acid. The results show that both acid pretreatments gave a total sugar content of approximately 34–35 g/l. In terms of inhibitor content (furfural), phosphoric acid pretreatment yields more than sulfuric acid pretreatment. Characterization of the corncobs after pretreatment indicates that both acid pretreatments can improve enzymatic accessibility, with better results for corncobs pretreated with sulfuric acid in terms of surface area, crystallinity, and composition analysis.
Abstract: This paper presents the simple six-port principle and its frequency bandwidth. The novel multi-six-port approach is presented together with its possibilities, typical parameters and frequency bandwidth. A practical implementation is shown with its measured parameters and calibration. A bandwidth of approximately 1:100 is obtained.
Abstract: Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. vision research, brain imaging, geological signals and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species emerging from aluminium sulphate. Mass spectra are typically complex because they are linear combinations of spectra from different types of oligomeric species. The results show that ICA can decompose the spectra into components that carry useful information. This information is essential in developing the coagulation phases of water treatment processes.
Abstract: In this paper, a new probability density function (pdf) is proposed to model the statistics of wavelet coefficients, and a simple Kalman filter is derived from the new pdf using Bayesian estimation theory. Specifically, we decompose the speckled image into wavelet subbands, apply the Kalman filter to the high-frequency subbands, and reconstruct a despeckled image from the modified detail coefficients. Experimental results demonstrate that our method compares favorably to several other despeckling methods on test synthetic aperture radar (SAR) images.
Abstract: Chicken fat was employed as a feedstock for producing biodiesel by a transesterification reaction with methanol and an alkali catalyst (KOH). In this study, chicken fat with 1.4% free fatty acid content was reacted with methanol and various amounts of potassium hydroxide for 2 hours. The progress of the reaction and the conversion of triglycerides to methyl esters were checked by IR spectroscopy.
Abstract: Introduction: Obesity is a major health risk in present-day life for everyone globally. Obesity is one of the major public health concerns, given recent increasing trends in obesity-related diseases such as Type 2 diabetes (Kazuya, 1994) and hyperlipidemia (Sakata, 1990), which are more prevalent in Japanese adults with body mass index (BMI) values ≥ 25 kg/m² (Japanese Ministry of Health and Welfare, 1997). The purpose of the study was to assess the effect of twelve weeks of brisk walking on blood pressure, body mass index, and anthropometric measurements of obese males. Method: Thirty obese (BMI above 30) males, aged 18 to 22 years, were selected from King Fahd University of Petroleum & Minerals, Saudi Arabia. The subjects' height (cm) was measured using a stadiometer and body mass (kg) was measured with an electronic weighing machine; BMI was subsequently calculated (kg/m²). Blood pressure was measured with a standardized sphygmomanometer in mm Hg. All measurements were taken twice before and twice after the experimental period. The pre- and post-experiment anthropometric measurements of waist and hip circumference were taken with a steel tape in cm. The subjects followed a walking schedule of two sessions per week for 12 weeks. The 45-minute sessions of brisk walking were undertaken at an average intensity of 65% to 85% of maximum heart rate (HRmax, calculated as 220 − age). Results & Discussion: Statistical findings revealed significant changes from pre-test to post-test in both systolic and diastolic blood pressure in the walking group. Results also showed a significant decrease in body mass index and anthropometric measurements (waist and hip circumference). Conclusion: It was concluded that twelve weeks of brisk walking is beneficial for lowering the blood pressure, body mass index, and anthropometric circumferences of obese males.
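The two formulas used in the methodology above, BMI = mass / height² and HRmax = 220 − age with a 65–85% intensity band, can be sketched as follows (the numeric example is illustrative, not a subject from the study):

```python
# Sketch of the study's two formulas; the sample values below are made up.

def bmi(mass_kg, height_m):
    """Body mass index in kg/m^2."""
    return mass_kg / height_m ** 2

def target_hr_range(age, low=0.65, high=0.85):
    """Brisk-walking intensity band as 65-85% of HRmax (220 - age)."""
    hr_max = 220 - age
    return hr_max * low, hr_max * high

# A hypothetical 20-year-old subject: HRmax = 200, band 130-170 bpm.
print(bmi(95.0, 1.75))        # ~31.0, i.e. obese (BMI above 30)
print(target_hr_range(20))
```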
Abstract: Many image watermarking methods using the properties of the human visual system (HVS) have been proposed in the literature. The component of the visual threshold is usually related to either the spatial contrast sensitivity function (CSF) or visual masking. Regarding contrast masking in particular, most methods do not consider the effect near edge regions, even though the HVS is sensitive to what happens in edge areas. This paper proposes ultrasound image watermarking using a visual threshold corresponding to the HVS, in which the coefficients in a DCT block are classified according to texture, edge, and plain areas. This classification improves not only the imperceptibility when the watermark is inserted into an image but also the robustness of watermark detection. A comparison of the proposed method with other methods shows that the proposed method is robust to blockwise memoryless manipulations and also robust against noise addition.
Abstract: Seaweed farming is emerging as a viable alternative
activity in the Indonesian fisheries sector. This paper aims to
investigate people's perceptions of seaweed farming, to analyze its
social and economic impacts and to identify the problems and
obstacles hindering its continued development. Structured and
semi-structured questionnaires were prepared to obtain qualitative
data, and interviews were conducted with fishermen who also plant
seaweed. The findings showed that fishermen in the Laikang Bay were
enthusiastic about cultivating seaweeds and that seaweed plays a major
role in supporting the household economy of fishermen. However,
current seaweed drying technologies cannot support increased
seaweed production on a farm or plot, especially in the rainy season.
Additionally, variable monsoon seasons and long marketing channels
are still major constraints on the development of the industry. Finally, capture fishing, the primary economic livelihood of older generations of fishermen, is being slowly replaced by seaweed farming.
Abstract: In this paper we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete polyphase matched (DPM) filters, a log-maximum a posteriori probability (Log-MAP) algorithm and/or a soft-output Viterbi algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme; the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, contrary to classical data-aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved. However, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests of bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms conventional schemes.
Abstract: Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown but useful and significant information from massive volumes of data. Data mining is a stage of the KDD process in which an algorithm is applied to extract interesting patterns. Usually, such algorithms generate a huge volume of patterns. These patterns have to be evaluated using interestingness measures in order to reflect the user's requirements. Interestingness is defined in two different ways: (i) objective measures and (ii) subjective measures. Objective measures such as support and confidence extract meaningful patterns based on the structure of the patterns, while subjective measures such as unexpectedness and novelty reflect the user's perspective. In this report, we briefly survey the most widespread and successful subjective measures and propose a new subjective measure of interestingness: shocking.
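The two objective measures named above, support and confidence, can be sketched over a toy set of transactions (the basket data is illustrative only):

```python
# Minimal sketch of support and confidence for a rule X -> Y over
# transactions represented as sets; the baskets below are made up.

def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """support(X u Y) / support(X) for the rule X -> Y."""
    return (support(transactions, antecedent | consequent)
            / support(transactions, antecedent))

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
print(support(baskets, {"bread", "milk"}))       # 0.5
print(confidence(baskets, {"bread"}, {"milk"}))  # 2/3
```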
Abstract: A trend in the agent community and in enterprises is a shift from closed to open architectures composed of a large number of autonomous agents. One implication is that interface agent frameworks are becoming more important in multi-agent systems (MAS), so that systems built for different application domains can share a common understanding of human-computer interface (HCI) methods as well as human-agent and agent-agent interfaces. However, interface agent frameworks usually receive less attention than other aspects of MAS. In this paper, we propose an interface web agent framework based on our former project, WAF, and a distributed HCI template. A group of new functionalities and implications is discussed, such as web agent presentation, off-line agent reference, and a reconfigurable activation map of agents. Their enabling techniques and current standards (e.g. an existing ontological framework) are also suggested and illustrated by examples from our own implementation in WAF.
Abstract: This study created new graphical icons and operating functions in a CAD/CAM software system by analyzing the icons in some popular systems, such as AutoCAD, AlphaCAM, Mastercam and the 1st edition of LiteCAM. These software systems all focus on geometric design and editing, so how intuitively an icon itself transmits its message to users is an important aspect of graphical icons. The primary purpose of this study is to design innovative icons and commands for new software.
This study employed the TRIZ method, an innovative design method, to generate new concepts systematically. Through a literature review, it then investigated and analyzed the relationship between TRIZ and idea development. The Contradiction Matrix and the 40 Principles were used to develop an assisting tool suitable for icon design in software development. We first gathered icon samples from the selected CAD/CAM systems, then grouped these icons by meaningful functions and compared their useful and harmful properties. Finally, we developed new icons for new software systems in order to avoid intellectual property problems.
Abstract: Medical imaging takes advantage of digital technology in imaging and teleradiology. In teleradiology systems, large amounts of data are acquired, stored and transmitted. A major technology that may help solve the problems associated with massive data storage and data transfer capacity is data compression and decompression. Many image compression methods are available; they are classified as lossless and lossy compression methods. With a lossy compression method, the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy compression method in which an image is coded as a set of contractive transformations in a complete metric space. The set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by partitioned iterated function systems (PIFS) using quadtree partitioning. PIFS is applied to different kinds of images: ultrasound, CT scan, angiogram, X-ray and mammograms. For each modality, approximately twenty images are considered and the average values of the compression ratio and PSNR are obtained. In this method of fractal encoding, the tolerance factor Tmax is varied from 1 to 10 while the other standard parameters are kept constant. For all image modalities, the compression ratio and Peak Signal-to-Noise Ratio (PSNR) are computed and studied. The quality of the decompressed image is assessed by its PSNR value. The results show that the compression ratio increases with the tolerance factor, and mammograms have the highest compression ratio. Because of the properties of fractal compression, image quality is not degraded up to an optimum tolerance factor of Tmax = 8.
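The two quality metrics reported in the study, compression ratio and PSNR, can be sketched as follows (the pixel values and byte counts are toy examples, not the paper's data):

```python
import math

# Sketch of the standard definitions: PSNR in dB for 8-bit images, and
# compression ratio as original size over compressed size.

def psnr(original, decompressed, peak=255):
    """Peak Signal-to-Noise Ratio in dB between two pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, decompressed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    return original_bytes / compressed_bytes

# Toy 4-pixel example:
print(round(psnr([100, 120, 130, 140], [101, 119, 131, 140]), 2))  # 49.38
print(compression_ratio(65536, 4096))                              # 16.0
```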
Abstract: In this paper, we use a nonlinear system identification method to predict and detect process faults in a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To identify the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained by the LOLIMOT algorithm, an incremental tree-structure algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln: one model for the normal condition of the kiln, with a 15-minute prediction horizon, and two models for the two faulty situations, with 7-minute prediction horizons. Finally, we detect these faults on validation data. The data used in this study were collected from the White Saveh Cement Company.
Abstract: Hydrogen used as a fuel in fuel cell vehicles can be produced from renewable sources such as wind, solar, and hydro technologies. A PV-electrolyzer is one of the promising methods of producing hydrogen with zero pollution emissions. Hydrogen production from a PV-electrolyzer system depends on the efficiency of the electrolyzer and the photovoltaic array, and on the sun irradiance at the site. In this study, the amount of hydrogen is obtained using mathematical equations for different driving distances and sun peak hours. The results show that a minimum of 99 PV modules is needed to generate 1.75 kg of H2 per day for two vehicles.
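The sizing calculation described above can be sketched as a back-of-the-envelope estimate. All parameter values below (module rating, electrolyzer energy demand, sun peak hours) are illustrative assumptions, not the paper's, so the module count it prints differs from the paper's figure of 99:

```python
import math

# Hypothetical parameters: ~50 kWh of electricity per kg of H2 for the
# electrolyzer, and each PV module assumed to deliver its rated power
# over the sun peak hours. Neither value is taken from the paper.

def pv_modules_needed(kg_h2_per_day, sun_peak_hours,
                      module_watts=200, kwh_per_kg_h2=50.0):
    daily_energy_kwh = kg_h2_per_day * kwh_per_kg_h2
    kwh_per_module = module_watts / 1000.0 * sun_peak_hours
    return math.ceil(daily_energy_kwh / kwh_per_module)

# The paper's two-vehicle figure of 1.75 kg H2/day, at 5 sun peak hours:
print(pv_modules_needed(1.75, 5.0))  # 88 under these assumed parameters
```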
Abstract: This study presents an exact general solution for steady-state conductive heat transfer in cylindrical composite laminates. An appropriate Fourier transformation has been obtained using the Sturm-Liouville theorem. The series coefficients are found by solving a set of equations related to the thermal boundary conditions at the inner and outer surfaces of the cylinder and to the temperature and heat flux continuity between adjacent layers. This set of equations is solved using the Thomas algorithm. In this paper, the effect of the fiber angle on the temperature distribution of a composite laminate is investigated under general boundary conditions. We show that the temperature distribution for any composite laminate lies between the distributions for laminates with θ = 0° and θ = 90°.
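The layer-interface equations mentioned above form a tridiagonal linear system, which the Thomas algorithm solves in O(n). A generic stdlib sketch of the algorithm (not the paper's specific system):

```python
# Thomas algorithm (tridiagonal matrix algorithm): forward elimination
# followed by back substitution.

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system: a = sub-diagonal, b = main diagonal,
    c = super-diagonal, d = right-hand side. a[0] and c[-1] are unused."""
    n = len(b)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Toy 3x3 system with solution x = [1, 2, 3] (up to rounding):
print(thomas_solve([0, 1, 1], [2, 2, 2], [1, 1, 0], [4, 8, 8]))
```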
Abstract: Schema matching plays a key role in many different applications, such as schema integration, data integration, data warehousing, data transformation, E-commerce, peer-to-peer data management, ontology matching and integration, the semantic Web, semantic query processing, etc. Manual matching is expensive and error-prone, so it is important to develop techniques to automate the schema matching process. In this paper, we present a solution to the automated XML schema matching problem which produces semantic mappings between corresponding schema elements of given source and target schemas. This solution contributes to solving the automated XML schema matching problem more comprehensively and efficiently. Our solution is based on combining the linguistic similarity, data type compatibility and structural similarity of XML schema elements. After describing our solution, we present experimental results that demonstrate the effectiveness of this approach.
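The combination idea described above can be sketched as a weighted blend of the three similarity components. The weights, the type-compatibility table and the element names below are illustrative assumptions, not the paper's method:

```python
from difflib import SequenceMatcher

# Hypothetical compatibility scores for a few XSD type pairs.
TYPE_COMPAT = {("xs:integer", "xs:decimal"): 0.8,
               ("xs:integer", "xs:string"): 0.3}

def linguistic_sim(name_a, name_b):
    """String similarity of element names, case-insensitive."""
    return SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()

def type_compat(t_a, t_b):
    if t_a == t_b:
        return 1.0
    return TYPE_COMPAT.get((t_a, t_b), TYPE_COMPAT.get((t_b, t_a), 0.0))

def element_sim(a, b, weights=(0.5, 0.2, 0.3)):
    """a, b: (name, xsd_type, parent_name) tuples for schema elements.
    Structural similarity is crudely approximated by parent-name similarity."""
    w_ling, w_type, w_struct = weights
    return (w_ling * linguistic_sim(a[0], b[0])
            + w_type * type_compat(a[1], b[1])
            + w_struct * linguistic_sim(a[2], b[2]))

src = ("CustomerName", "xs:string", "Customer")
tgt = ("custName", "xs:string", "Client")
print(round(element_sim(src, tgt), 3))
```

A real matcher would replace the parent-name proxy with a measure over the schema trees, but the weighted combination of the three components is the same shape.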