Abstract: Optimum communication and performance in
Wireless Sensor Networks constitute multi-faceted challenges due to
their specific networking characteristics as well as scarce resource
availability. Furthermore, it is becoming increasingly apparent that
isolated, layer-based approaches often do not meet the demands posed
by WSN applications, because they omit critical inter-layer
interactions and dependencies. In response, cross-layer design is
receiving growing interest, aiming to exploit these interactions and
increase network performance. However, in order to clearly identify
existing dependencies, comprehensive performance studies are
required that evaluate the effect of different critical network parameters
on system-level performance and behavior. This paper's main
objective is to address the need for multi-parametric performance
evaluations considering critical network parameters using a well-known
network simulator, offering useful and practical conclusions
and guidelines. The results reveal strong dependencies among the
considered parameters, which can inform and drive future
research efforts towards designing and implementing highly efficient
protocols and architectures.
Abstract: People usually have a telephone voice, which means
they adjust their speech to fit particular situations and to blend in with
other interlocutors. The question is: Do we speak differently to
different people? This possibility has been suggested by social
psychologists within Accommodation Theory [1]. Converging toward
the speech of another person can be regarded as a polite speech
strategy while choosing a language not used by the other interlocutor
can be considered as the clearest example of speech divergence [2].
The present study sets out to investigate such processes in the course
of everyday telephone conversations. Using Joos's [3] model of
formality in spoken English, the researchers explore
convergence to or divergence from the addressee. The results
support the view that lexical choice, and consequently patterns
of style, vary intriguingly according to the person being
addressed.
Abstract: Nowadays, consumption of whole flours and flours
with a high extraction rate is recommended because of their high
content of fibers, vitamins and minerals. Despite the nutritional benefits
of whole flour, the concentration of some undesirable components, such
as phytic acid, is higher than in white flour. In this study, the effect of
several lactic acid bacteria sourdoughs on toast bread is investigated.
Sourdoughs from lactic acid bacteria (Lb. plantarum, Lb. reuteri) with
different dough yields (DY = 250 and 300) were made, incubated at 30°C
for 20 hours, and then added to the dough at replacement ratios of
10, 20 and 30%. Breads supplemented with Lb. plantarum
sourdough had lower phytic acid. Higher sourdough replacement
and higher DY caused a greater decrease in phytic acid content.
Sourdough from Lb. plantarum with DY = 300 and 30% replacement
caused the highest decrease in phytic acid content (49.63 mg/100 g).
As indicated by panelists, Lb. reuteri sourdough had the
greatest effect on the overall quality score of the breads. DY reduction
caused a decrease in bread quality score. The sensory score of toast bread
was 81.71 in the samples treated with Lb. reuteri sourdough with
DY = 250 and 20% replacement.
Abstract: The growing outsourcing of logistics services,
resulting from the ongoing drive in firms towards cost
reduction and increased efficiency, means that it is becoming more and
more important for the companies doing the outsourcing to carry out
a proper evaluation.
The multiple definitions and measures of logistics service
performance found in research on the topic create a certain degree of
confusion and do not clear the way towards the proper measurement
of their performance. Do a model and a specific set of indicators exist
that can be considered appropriate for measuring the performance of
logistics services outsourcing in industrial environments? Are said
indicators in keeping with the objectives pursued by outsourcing? We
aim to answer these and other research questions in the study we have
initiated in the field within the framework of the international High
Performance Manufacturing (HPM) project of which this paper
forms part.
As the first stage of this research, this paper reviews articles
dealing with the topic published in the last 15 years with the aim of
detecting the models most used to make this measurement and
determining which performance indicators are proposed as part of
said models and which are most used. The first steps are also taken in
determining whether these indicators, financial and operational, cover
the aims that are being pursued when outsourcing logistics services.
The findings show there is a wide variety of both models and
indicators used. This would seem to testify to the need to continue
with our research in order to try to propose a model and a set of
indicators for measuring the performance of logistics services
outsourcing in industrial environments.
Abstract: This paper presents a simple approach for load
flow analysis of a radial distribution network. The proposed
approach utilizes a forward and backward sweep algorithm
based on Kirchhoff's current law (KCL) and Kirchhoff's voltage
law (KVL) for evaluating the node voltages iteratively. In this
approach, computation of branch current depends only on the
current injected at the neighbouring node and the current in
the adjacent branch. This approach starts from the end nodes
of the sub-lateral, lateral and main lines and moves
towards the root node during branch current computation. The
node voltage evaluation begins from the root node and moves
towards the nodes located at the far ends of the main, lateral
and sub-lateral lines. The proposed approach has been tested
on four radial distribution systems of different sizes and
configurations and found to be computationally efficient.
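The two sweeps can be sketched in a few lines of Python. The feeder below is a hypothetical 4-node radial chain; the impedances, loads and 230 V root voltage are illustrative assumptions, not values from the paper's test systems. The backward pass accumulates branch currents by KCL and the forward pass updates node voltages by KVL.

```python
# Hypothetical 4-node radial feeder (node 0 is the root/substation).
parent = {1: 0, 2: 1, 3: 2}                          # upstream node of each node
z = {1: 0.1 + 0.2j, 2: 0.15 + 0.3j, 3: 0.2 + 0.4j}   # branch impedances (ohm)
s = {1: 1000 + 400j, 2: 1500 + 600j, 3: 800 + 300j}  # complex loads (VA)
v_root = 230 + 0j
nodes = [1, 2, 3]                                    # ordered from root outwards

v = {n: v_root for n in [0] + nodes}                 # flat-voltage initial guess
for _ in range(20):
    # Backward sweep (KCL): branch currents from the end nodes to the root;
    # each branch carries its node's load current plus all downstream currents.
    i_branch = {n: (s[n] / v[n]).conjugate() for n in nodes}
    for n in reversed(nodes):
        if parent[n] in i_branch:
            i_branch[parent[n]] += i_branch[n]
    # Forward sweep (KVL): node voltages from the root towards the end nodes.
    for n in nodes:
        v[n] = v[parent[n]] - z[n] * i_branch[n]

print(abs(v[3]))   # voltage magnitude at the node farthest from the root
```

The per-branch work uses only the injected current at the neighbouring node and the adjacent branch current, as the abstract notes, so each sweep is linear in the number of branches.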
Abstract: Mitochondria are dynamic organelles, capable of
interacting with each other. While the number of mitochondria in a cell
varies, their quality and functionality depend on the operation of
fusion, fission, motility and mitophagy. Nowadays, several
studies identify disruptions in the regulation of mitochondrial
dynamics as an important factor in neurodegenerative diseases. In this
paper a stochastic model in the BioAmbients calculus is presented,
concerning mitochondrial fusion and its contribution to the renewal of
the mitochondrial population in a cell. This model describes the
successive and dependent stages of protein synthesis, protein
activation and the merging of two independent mitochondria.
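The three successive stages can be mimicked by a toy Gillespie-style stochastic simulation. This is only an illustrative sketch, in Python: the rate constants, initial counts and the simplified reaction scheme (synthesis, activation, fusion consuming one active protein and two mitochondria) are assumptions and are not taken from the BioAmbients model itself.

```python
import random

# Toy stochastic simulation of three dependent stages:
#   synthesis:  0 -> P        (new protein)
#   activation: P -> A        (protein becomes active)
#   fusion:     A + 2M -> F   (active protein merges two mitochondria)
# All rates and counts below are illustrative assumptions.
random.seed(1)
k_syn, k_act, k_fus = 2.0, 1.0, 0.05
P, A, M, F = 0, 0, 20, 0        # proteins, active proteins, mitochondria, fused
t, t_end = 0.0, 50.0
while t < t_end:
    a1 = k_syn                           # synthesis propensity
    a2 = k_act * P                       # activation propensity
    a3 = k_fus * A * M * (M - 1) / 2     # fusion propensity
    a0 = a1 + a2 + a3
    t += random.expovariate(a0)          # time to the next reaction
    r = random.uniform(0.0, a0)          # choose which reaction fires
    if r < a1:
        P += 1
    elif r < a1 + a2:
        P, A = P - 1, A + 1
    elif A > 0 and M >= 2:
        A, M, F = A - 1, M - 2, F + 1

print(M, F)   # remaining single mitochondria and fused ones
```

Each fusion event removes two mitochondria and creates one fused one, so the total mitochondrial material (M + 2F) is conserved in this sketch.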
Abstract: Tip vortex cavitation is one of the well-known patterns of
cavitation that occur in axial pumps. This pattern of
cavitation arises from the pressure difference between the pressure and
suction sides of the blades of an axial pump. Since the pressure on the
pressure side of a blade is higher than the pressure on its suction
side, a very small portion of the liquid flow leaks back from the
pressure side to the suction side. This leakage flow is the cause of the tip
vortex cavitation and gap cavitation that may occur in axial pumps. In this
paper the results of our experimental investigation of the movement
of tip vortex cavitation along the blade edge due to reduction of pump
flow rate in an axial pump are reported. The results show that reduction of
the pump flow rate in conjunction with increasing outlet pressure
causes the tip vortex cavitation to move along the blade edge towards
the blade tip. The results also show that as the tip vortex
cavitation approaches the blade tip, it is replaced by
a cavitation phenomenon on the blade tip itself. Furthermore, on
further reduction of the pump flow rate and increase of the outlet pressure,
an unstable cavitation phenomenon occurs between each blade's
leading edge and the next blade's trailing edge.
Abstract: Wireless LAN technologies have picked up
momentum in recent years due to their ease of deployment, low cost
and availability. The era of the wireless LAN has also given rise to
unique applications like VoIP, IPTV and unified messaging.
However, these real-time applications are very sensitive to network
and handoff latencies. To successfully support these applications,
seamless roaming during the movement of a mobile station has become
crucial. Nowadays, centralized architecture models support roaming
in WLANs; they have the ability to manage, control and
troubleshoot large-scale WLAN deployments. This model is managed
by the Control And Provisioning of Wireless Access Points protocol
(CAPWAP). This paper covers the CAPWAP architectural solution
along with the proposals that have emerged around it. Based on the
literature survey conducted in this paper, we found that the proposed
algorithms for reducing roaming latency in the CAPWAP architecture do
not support seamless roaming. Additionally, they are not sufficient
during the initial period of the network. This paper also suggests
important design considerations for mobility support in future
centralized IEEE 802.11 networks.
Abstract: Nowadays there is growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, and also as a way of enhancing the agricultural economy and producing transportation fuel for local needs. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis; a comparison between NaOH, KOH and Ca(OH)2 shows that NaOH is the most effective on bagasse. The enzymes required for biomass hydrolysis were produced by solid-state fermentation of sugarcane via two fungi, Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced via A. niger functioned better than that of T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of pretreated bagasse was also carried out using Celluclast 1.5L from Novozymes. The yields of ethanol production with the commercial enzyme and with the enzyme solution produced via T. longibrachiatum were 81% and 50%, respectively.
Abstract: Segmentation, filtering out of measurement errors and
identification of breakpoints are integral parts of any analysis of
microarray data for the detection of copy number variation (CNV).
Existing algorithms designed for these tasks have had some success
in the past, but they tend to be O(N²) in computation time,
memory requirement, or both, and the rapid advance of microarray
resolution has practically rendered such algorithms useless. Here we
propose an algorithm, SAD, that is much faster and far less
memory-hungry (O(N) in both computation time and memory requirement)
and offers higher accuracy. The two key ingredients of SAD are the
fundamental assumption in statistics that measurement errors are
normally distributed and the mathematical relation that the product of
two Gaussians is another Gaussian (function). We have produced a
computer program for analyzing CNV based on SAD. In addition to
being fast and small, it offers two important features: quantitative
statistics for predictions and, with only two user-decided parameters,
ease of use. Its speed shows little dependence on genomic profile.
Running on an average modern computer, it completes CNV analyses
for a 262 thousand-probe array in ~1 second and for a 1.8 million-probe
array in 9 seconds.
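The Gaussian-product relation the abstract cites can be checked numerically. A minimal Python sketch: the product of two Gaussian functions is an (unnormalized) Gaussian whose precision is the sum of the two precisions and whose mean is the precision-weighted average of the two means. The measurement values below are made up for demonstration.

```python
def gaussian_product(mu1, var1, mu2, var2):
    """Mean and variance of the (unnormalized) Gaussian product:
    precisions add; the mean is the precision-weighted average."""
    var = 1.0 / (1.0 / var1 + 1.0 / var2)
    mu = var * (mu1 / var1 + mu2 / var2)
    return mu, var

# Two hypothetical noisy probe readings of the same underlying level:
mu, var = gaussian_product(2.0, 0.5, 2.4, 0.5)
print(mu, var)   # equal variances: the means average, the variance halves
```

This closed-form update is what makes an O(N) single-pass combination of normally distributed measurements possible.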
Abstract: In this research, the use of the light beam size to design an adjustable mirror bender is presented. The focused beam, characterized by its size along the synchrotron light beam line, is investigated. COSMOSWorks is used to analyze all components of the curvature-adjustment system by the finite element method. Simulation results covering the forces applied during adjustment of the mirror radius are presented.
Abstract: The Internet is nowadays included in all national curricula of the elementary school. A comparative study of their
goals leads to the conclusion that a complete curriculum should aim at students' acquisition of the abilities to navigate and search for
information, and should additionally emphasize the evaluation of the information provided by the World Wide Web. In a constructivist knowledge framework, the design of a course has to take into
consideration the conceptual representations of students. The following paper presents the conceptual representations about the World Wide Web held by eleven-year-old students attending the sixth grade of a Greek elementary school, and their use in the design and
implementation of an innovative course.
Abstract: Human amniotic membrane (HAM) is a useful
biological material for the reconstruction of damaged ocular surface.
The processing and preservation of HAM is critical to prevent the
patients undergoing amniotic membrane transplant (AMT) from cross
infections. For HAM preparation human placenta is obtained after an
elective cesarean delivery. Before collection, the donor is screened
for seronegativity of HCV, HBsAg, HIV and syphilis. After
collection, placenta is washed in balanced salt solution (BSS) in
sterile environment. Amniotic membrane is then separated from the
placenta as well as chorion while keeping the preparation in BSS.
Scraping of HAM is then carried out manually until all the debris is
removed and a clear, transparent membrane is acquired. Nitrocellulose
membrane filters are then placed on the stromal side of the HAM and cut
around the edges, with a little membrane folded towards the other side,
making it easy to separate during surgery. HAM is finally stored in a
solution of glycerine and Dulbecco's Modified Eagle Medium
(DMEM) in 1:1 ratio containing antibiotics. The capped borosil vials
containing HAM are kept at -80°C until use. This vial is thawed to
room temperature and opened under sterile operation theatre
conditions at the time of surgery.
Abstract: In Algeria, at present, the oil pumping plants are fed with electric power by independent local sources. This type of feeding has some advantages (little climatic influence, independent operation). However, it requires a qualified maintenance staff, a rather high frequency of maintenance and repair, and additional fuel costs. Taking into account the increasing development of the national electric supply network (Sonelgaz), a real possibility appears of transferring from the local sources to centralized sources. The latter can be not only more economical but also more reliable than the independent local sources. In order to carry out this transfer, it is necessary to work out an optimal strategy for rebuilding these networks, taking into account the economic parameters and the reliability indices.
Abstract: Nowadays we face network threats that
cause enormous damage to the Internet community day by day. In
this situation, more and more people try to protect their network
security using traditional mechanisms, including firewalls,
Intrusion Detection Systems, etc. Among these, the honeypot is a versatile
tool for the security practitioner: honeypots are tools that are meant
to be attacked or interacted with in order to gain information about attackers,
their motives and their tools. In this paper, we describe the usefulness of
low-interaction and high-interaction honeypots and
compare them. We then propose a hybrid honeypot
architecture that combines low- and high-interaction honeypots to
mitigate the drawbacks of each. In this architecture, the low-interaction honeypot
is used as a traffic filter: activities like port scanning can be
effectively detected by the low-interaction honeypot and stopped there.
Traffic that cannot be handled by the low-interaction honeypot is handed
over to the high-interaction honeypot. In this case, the low-interaction
honeypot acts as a proxy, whereas the high-interaction honeypot offers
the optimal level of realism. To prevent the high-interaction honeypot
from infections, a containment environment (VMware) is used.
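The filtering role of the low-interaction tier can be sketched as a simple routing decision. This is an assumed toy illustration in Python, not a real honeypot implementation; the event labels are hypothetical names invented for the example.

```python
# Assumed event labels for probes a low-interaction honeypot can fully
# emulate on its own (hypothetical names, for illustration only).
KNOWN_SCAN_PATTERNS = {"SYN_SCAN", "NULL_SCAN", "FIN_SCAN"}

def route(event_type: str) -> str:
    """Decide which honeypot tier should handle an incoming event:
    simple scans are answered and logged locally and stop there;
    anything richer is proxied to the high-interaction honeypot."""
    if event_type in KNOWN_SCAN_PATTERNS:
        return "low-interaction"
    return "high-interaction"

print(route("SYN_SCAN"))        # low-interaction
print(route("HTTP_EXPLOIT"))    # high-interaction
```

The point of the split is economy: the cheap tier absorbs high-volume noise, so the expensive, fully realistic (and contained) tier only sees traffic worth studying.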
Abstract: This paper presents a comparative analysis of a new
unsupervised PCA-based technique for texture segmentation of steel
plates, aimed at defect detection. The proposed scheme, called
Variance-Based Component Analysis (VBCA), employs PCA for feature
extraction, applies a feature-reduction algorithm based on the variance of
eigenpictures, and classifies the pixels as defective or normal. While
the classic PCA approach uses a clusterer such as k-means for pixel clustering,
VBCA employs thresholding and some post-processing operations to
label pixels as defective or normal. The experimental results show
that the proposed VBCA algorithm is 12.46% more accurate and
78.85% faster than the classic PCA approach.
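The pipeline the abstract outlines can be sketched in Python: PCA for feature extraction, variance-based selection of a few eigenpictures, and thresholding (rather than a clusterer) to label defects. The synthetic patch data, patch size, number of components and threshold rule below are assumptions for illustration, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)
patches = rng.normal(0.0, 1.0, size=(500, 64))   # 500 flattened 8x8 patches
patches[::50] += 4.0                             # inject 10 "defective" patches

X = patches - patches.mean(axis=0)               # center the data
cov = X.T @ X / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)           # PCA via eigendecomposition
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:5]]               # top-5 eigenpictures by variance

energy = ((X @ components) ** 2).sum(axis=1)     # per-patch projected energy
threshold = energy.mean() + 2.0 * energy.std()   # simple variance-based cut
defective = energy > threshold                   # thresholding, no clusterer
print(int(defective.sum()))
```

Replacing an iterative clusterer like k-means with a single threshold pass is what plausibly buys the reported speedup: labeling becomes one comparison per pixel instead of repeated distance computations.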
Abstract: The main objectives of this paper are to measure
pollutants concentrations in the oil refinery area in Kuwait over three
periods during one year, obtain recent emission inventory for the
three refineries of Kuwait, use AERMOD and the emission inventory
to predict pollutants concentrations and distribution, compare model
predictions against measured data, and perform numerical
experiments to determine the conditions under which emission rates and
the resulting pollutant concentrations are below maximum allowable limits.
Abstract: Research on Brain-Computer Interfaces (BCI) has
recently increased. Functional Near-Infrared Spectroscopy (fNIRS)
is one of the latest technologies which utilize light in the near-infrared
range to determine brain activity. Because near-infrared technology
allows the design of safe, portable, wearable, non-invasive and wireless
monitoring systems, fNIRS monitoring of brain
hemodynamics can be of value in helping to understand brain tasks. In
this paper, we present results of fNIRS signal analysis indicating that
there exist distinct patterns of hemodynamic responses from which
brain tasks can be recognized, a step toward developing a BCI. We applied
two different mathematical tools separately: wavelet analysis for
preprocessing, as signal filtering and feature extraction, and neural
networks for classifying the cognitive brain tasks. We
also discuss and compare our proposal with other methods; it
performs better, with an average classification accuracy of 99.9%.
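The wavelet-preprocessing step can be illustrated with a one-level Haar decomposition and hard thresholding of the detail coefficients, applied to a synthetic slow hemodynamic-like signal. The signal, the choice of the Haar wavelet and the threshold below are illustrative assumptions in Python; the paper's actual wavelet family, decomposition depth and neural-network classifier are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0.0, 1.0, 256)
signal = np.sin(2 * np.pi * 0.5 * t)               # slow, smooth trend
noisy = signal + 0.3 * rng.normal(size=t.size)     # additive measurement noise

pairs = noisy.reshape(-1, 2)
approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)  # Haar approximation coeffs
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)  # Haar detail coeffs
detail[np.abs(detail) < 0.6] = 0.0                 # hard-threshold small details

rec = np.empty_like(noisy)                         # inverse Haar transform
rec[0::2] = (approx + detail) / np.sqrt(2)
rec[1::2] = (approx - detail) / np.sqrt(2)

mse_noisy = float(np.mean((noisy - signal) ** 2))
mse_rec = float(np.mean((rec - signal) ** 2))
print(mse_rec < mse_noisy)
```

Because the hemodynamic response is slow, most of its energy sits in the approximation coefficients, while the detail coefficients are mostly noise; thresholding them filters the signal before any features are fed to a classifier.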
Abstract: Place is a 'where' dimension formed by people's
relationship with physical settings, individual and group activities,
and meanings. 'Place Attachment', 'Place Identity' and 'Sense of
Place' are some of the concepts that can describe the quality of people's
relationships with a place. The concept of sense of place is used in
studying human-place bonding, attachment and place meaning. Sense
of place is usually defined as an overarching impression
encompassing the general ways in which people feel about places,
sense them, and assign concepts and values to them. Sense of place is
highlighted in this article as one of the prevailing concepts in
place-based research. Considering the dimensions of sense of place has
always been beneficial for investigating public place attachment and
pro-environmental attitudes towards places. The creation or
preservation of sense of place is important in maintaining the quality
of the environment as well as the integrity of human life within it.
While many scholars have argued that sense of place is a vague concept,
this paper summarizes and analyzes the existing seminal literature.
First, the concept of sense of place and its
characteristics are examined; afterwards the scales of sense of
place are reviewed and the factors that contribute to forming sense
of place are evaluated; finally, place attachment is described as an
objective dimension for measuring sense of place.
Abstract: Speckled images arise when coherent microwave,
optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar
systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted
by speckle noise is complicated by the nature of the noise and is not
as straightforward as detection and estimation in additive noise. In
this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The
motivation for this work is the problem of segmentation and tissue
classification using ultrasound imaging. Modeling of speckle in this
context involves a partially developed speckle model in which an
underlying Poisson point process modulates a Gram-Charlier series
of Laguerre-weighted exponential functions, resulting in a doubly
stochastic filtered Poisson point process. The statistical distribution
of partially developed speckle is derived in a closed canonical form.
It is observed that as the mean number of scatterers in a resolution
cell increases, the probability density function approaches an
exponential distribution. This is consistent with fully developed
speckle noise, as demonstrated by the Central Limit Theorem.
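The limiting behavior can be demonstrated with a Monte Carlo sketch (an illustration of the standard random-phasor-sum picture, not the paper's derivation). Each resolution cell sums many unit phasors with random phases; by the Central Limit Theorem the complex sum tends to a circular Gaussian, so the intensity (squared magnitude) tends to an exponential distribution, for which mean and variance are equal (unit speckle contrast).

```python
import math
import random

random.seed(0)

def cell_intensity(n_scatterers: int) -> float:
    """Intensity of one resolution cell: |sum of random unit phasors|^2,
    normalized by the scatterer count so the mean intensity is ~1."""
    re = im = 0.0
    for _ in range(n_scatterers):
        phase = random.uniform(0.0, 2.0 * math.pi)
        re += math.cos(phase)
        im += math.sin(phase)
    return (re * re + im * im) / n_scatterers

samples = [cell_intensity(100) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# For an exponential distribution with mean 1 the variance is also 1,
# i.e. fully developed speckle has unit contrast (std/mean = 1).
print(round(mean, 2), round(var, 2))
```

Re-running with a small scatterer count (say 2 or 3 per cell) breaks the exponential fit, which is the partially developed regime the closed-form derivation in the paper addresses.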