Abstract: We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a security loss, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon number of the decoy pulse to that of the signal pulse to be as close to unity as possible. In our method the switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal rather than by modulating the intensity of the optical signal, thus reducing system cost. We find an improvement of approximately a factor of 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuations on the key rate. Our simulation results show a key generation rate of 1.5×10⁻⁴ per pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
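For illustration, the decoy-state key-rate claim above can be connected to a concrete estimate. The sketch below evaluates the standard GLLP-style decoy-state lower bound R ≥ q{−Qμ f H2(Eμ) + Q1[1 − H2(e1)]}; both the formula and the numerical values are generic decoy-state illustrations, not the paper's frequency-coded B92 analysis.

```python
import math

def h2(x):
    """Binary entropy function H2(x) in bits."""
    if x <= 0 or x >= 1:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def key_rate_lower_bound(Q_mu, E_mu, Q1, e1, q=0.5, f_ec=1.16):
    """GLLP-style lower bound on the secure key rate per pulse:
    R >= q * (-Q_mu * f_ec * H2(E_mu) + Q1 * (1 - H2(e1))),
    where Q1 and e1 are the single-photon gain and error rate
    estimated via the decoy states."""
    return q * (-Q_mu * f_ec * h2(E_mu) + Q1 * (1 - h2(e1)))

# Illustrative (hypothetical) channel values, not taken from the paper:
R = key_rate_lower_bound(Q_mu=8e-3, E_mu=0.02, Q1=4e-3, e1=0.03)
```

The bound is positive whenever the decoy-estimated single-photon contribution outweighs the error-correction cost of the total signal.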
Abstract: Wireless mesh networking is a promising approach to
broadband data transmission over a large area with low cost and
acceptable QoS. The trade-offs among these features in WMNs are
an active research topic. In this paper a mathematical
optimization framework is developed to maximize throughput
subject to upper-bound delay constraints. The IEEE 802.11-based
infrastructure backhauling mode of WMNs is considered to
formulate the MINLP optimization problem. The proposed method
yields the complete routing and scheduling procedure in a WMN
needed to achieve these goals.
Abstract: Dhaka, the capital city of Bangladesh, is one of the
most densely populated cities in the world. Due to rapid
urbanization, 60% of its population lives in slum and squatter
settlements. The reasons behind this poverty are low economic
growth, inequitable distribution of income, unequal distribution
of productive assets, unemployment and underemployment, a high
rate of population growth, a low level of human resource
development, natural disasters, and limited access to public
services. Along with poverty, the pressure on urban land,
shelter, plots and open spaces creates environmental and
ecological degradation. These constraints mostly result from
failures of government policies and measures, and only the
Government can solve this problem. It is now high time to
establish planning and environmental management policy and
sustainable urban development for the city and for the urban
slum dwellers, free from eviction, criminals, rent seekers and
other miscreants.
Abstract: A new SUZ-4 zeolite membrane with
tetraethylammonium hydroxide as the template was fabricated on a
mullite tube via hydrothermal sol-gel synthesis in a rotating
autoclave reactor. The suitable synthesis condition was a
SiO2:Al2O3 ratio of 21.2 with crystallization for 4 days at
155 °C under autogenous pressure. The obtained SUZ-4 possessed a
high BET surface area of 396.4 m2/g, a total pore volume of
2.611 cm3/g, and a narrow pore size distribution, with
needle-shaped crystals of 97 nm mean diameter and 760 nm length.
The SUZ-4 layer obtained from seeded crystallization was thicker
than that obtained without seeds (in situ crystallization).
Abstract: Modeling product configurations requires large amounts of knowledge about technical and marketing restrictions on the product. Previous attempts to automate product configuration concentrate on the representation and management of the knowledge for specific domains in fixed and isolated computing environments. Since the knowledge about product configurations is subject to continuous change and hard to express, these attempts often failed to efficiently manage and exchange the knowledge in collaborative product development. In this paper, XML Topic Maps (XTM) are introduced to represent and exchange the knowledge about product configurations in collaborative product development. A product configuration model based on XTM, along with its merging and inference facilities, enables configuration engineers in collaborative product development to manage and exchange their knowledge efficiently. A prototype implementation is also presented to demonstrate that the proposed model can be applied to engineering information systems to exchange product configuration knowledge.
Abstract: Discrete particle swarm optimization (DPSO) is a
powerful stochastic evolutionary algorithm used to solve
large-scale, discrete and nonlinear optimization problems.
However, the standard DPSO algorithm has been observed to suffer
from premature convergence when solving complex optimization
problems such as transmission expansion planning (TEP). To
resolve this problem, an advanced discrete particle swarm
optimization (ADPSO) is proposed in this paper. Simulation
results show that ADPSO optimizes line loading in transmission
expansion planning more precisely than DPSO.
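As background for the DPSO discussion, a minimal binary PSO in the Kennedy-Eberhart style can be sketched as follows. The toy "count of ones" fitness stands in for the actual TEP cost function, and all parameter values are illustrative assumptions, not those of the proposed ADPSO.

```python
import random, math

def binary_pso(n_bits=20, n_particles=15, iters=60,
               w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal binary PSO: velocities are real-valued, and each bit is
    resampled as 1 with probability sigmoid(velocity)."""
    rng = random.Random(seed)
    fitness = lambda x: sum(x)  # toy objective, not the TEP cost
    X = [[rng.randint(0, 1) for _ in range(n_bits)]
         for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]                  # per-particle best positions
    gbest = max(pbest, key=fitness)[:]         # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                prob = 1.0 / (1.0 + math.exp(-V[i][d]))  # sigmoid transfer
                X[i][d] = 1 if rng.random() < prob else 0
            if fitness(X[i]) > fitness(pbest[i]):
                pbest[i] = X[i][:]
        gbest = max(pbest, key=fitness)[:]
    return gbest

best = binary_pso()
```

The premature-convergence issue the abstract mentions arises when all velocities saturate the sigmoid and the swarm stops exploring; advanced variants counteract this by perturbing velocities or re-randomizing stagnant particles.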
Abstract: This article describes the Lorentz forces in
containers of cuboid and cylindrical shape. The container holds
an electrically conductive melt, which is driven by a rotating
magnetic field. Input data for comparing the Lorentz forces in
the cuboid container were obtained from the computing program
NS-FEM3D, which uses the DDS method of computation. Values of
the Lorentz forces for the cylindrical container were obtained
from a derived analytical formula.
Abstract: This paper describes the design and results of FROID,
an outbound intrusion detection system built with agent technology
and supported by an attacker-centric ontology. The prototype
features a misuse-based detection mechanism that identifies remote
attack tools in execution. Misuse signatures composed of attributes
selected through entropy analysis of outgoing traffic streams and
process runtime data are derived from execution variants of attack
programs. The core of the architecture is a mesh of self-contained
detection cells organized non-hierarchically that group agents in a
functional fashion. The experiments show performance gains when
the ontology is enabled as well as an increase in accuracy achieved
when correlation cells combine detection evidence received from
independent detection cells.
Abstract: Efficiently assigning system resources to route
client demand through gateway servers is a difficult problem. In
this paper, we present an enhanced proposal for the autonomous
operation of gateway servers under highly dynamic traffic loads.
We devise a methodology to calculate queue length and waiting
time from gateway server information, reducing response time
variance in the presence of bursty traffic.
The primary consideration is performance, because gateway
servers must offer cost-effective, high-availability services
over the long term and therefore must be scaled to meet the
expected load. Performance measurements form the basis for
performance modeling and prediction. With the help of
performance models, performance metrics (such as buffer size and
waiting time) can be determined during the development process.
This paper describes the queue models that can be applied to
estimate queue length and thereby the required memory size. Both
simulation and experimental studies using synthesized workloads,
together with analysis of real-world gateway servers,
demonstrate the effectiveness of the proposed system.
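The queue-length and waiting-time estimation described above can be illustrated with the simplest analytic queue model. The abstract does not specify which queue models the authors apply, so the M/M/1 formulas and Little's law below are a generic sketch, not the proposed methodology.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics: utilization rho, mean number in
    system L = rho/(1-rho), and mean time in system W via Little's
    law, L = arrival_rate * W."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate
    L = rho / (1 - rho)      # mean queue length (number in system)
    W = L / arrival_rate     # mean waiting time (Little's law)
    return rho, L, W

# Hypothetical gateway load: 80 requests/s against a 100 requests/s server
rho, L, W = mm1_metrics(arrival_rate=80.0, service_rate=100.0)
```

Such closed-form estimates of L are what allow a buffer (memory) size to be chosen at development time, as the abstract suggests; bursty traffic pushes real queues above these averages, which motivates the variance-reduction goal.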
Abstract: Most file systems overwrite modified file data and
metadata in their original locations, while the Log-structured File
System (LFS) dynamically relocates them to other locations. We
design and implement the Evergreen file system that can select
between overwriting or relocation for each block of a file or metadata.
Therefore, the Evergreen file system can achieve superior write
performance by sequentializing write requests (similar to LFS-style
relocation) when space utilization is low and overwriting when
utilization is high. Another challenging issue is identifying
performance benefits of LFS-style relocation over overwriting on a
newly introduced SSD (Solid State Drive) which has only
Flash-memory chips and control circuits without mechanical parts.
Our experimental results measured on an SSD show that relocation
outperforms overwriting when space utilization is below 80% and vice
versa.
Abstract: The estimation of overall on-site and off-site greenhouse gas (GHG) emissions by wastewater treatment plants revealed that in anaerobic and hybrid treatment systems greater emissions result from off-site processes than from on-site processes. However, in aerobic treatment systems, on-site processes make a higher contribution to the overall GHG emissions. The total GHG emissions were estimated to be 1.6, 3.3 and 3.8 kg CO2-e/kg BOD in the aerobic, anaerobic and hybrid treatment systems, respectively. In the aerobic treatment system without the recovery and use of the generated biogas, the off-site GHG emissions were 0.65 kg CO2-e/kg BOD, accounting for 40.2% of the overall GHG emissions. This value changed to 2.3 and 2.6 kg CO2-e/kg BOD, accounting for 69.9% and 68.1% of the overall GHG emissions in the anaerobic and hybrid treatment systems, respectively. The increased off-site GHG emissions in the anaerobic and hybrid treatment systems are mainly due to material usage and energy demand in these systems. The anaerobic digester can contribute up to 100%, 55% and 60% of the overall energy needs of plants in the aerobic, anaerobic and hybrid treatment systems, respectively.
Abstract: Accurate timing alignment and stability are important
to maximize the true counts and minimize the random counts in
positron emission tomography. The signals output from the
detectors must therefore be time-centered for the two isotopes
before operation and fed into four pulse-processing units, each
of which can accept up to eight inputs. The dual-source computed
tomography system consists of two units on the left for 15
detector signals from the Cs-137 isotope and two units on the
right for 15 detector signals from the Co-60 isotope. The gamma
spectrum consists of either single or multiple photopeaks. This
allows energy-discrimination hardware associated with the data
acquisition system to acquire photon count data at a specific
energy, even if detectors with poor energy resolution are used.
It also helps to avoid counting Compton-scattered photons,
especially if a single discrete gamma photopeak is emitted by
the source, as in the case of Cs-137. In this study the
polyenergetic version of the alternating minimization algorithm
is applied to the dual-energy gamma computed tomography problem.
Abstract: A mathematical model for the hydrodynamics of a
surface water treatment pilot plant was developed and validated
through determination of the residence time distribution (RTD)
for the main equipment of the unit. The well-known models of
ideal/real mixing, ideal displacement (plug flow) and
(one-dimensional axial) dispersion were combined in order to
identify the structure that best fits the experimental data for
each piece of equipment in the pilot plant. The RTD experiments
have shown that the pilot plant hydrodynamics can be quite well
approximated by a combination of simple mathematical models, a
structure which is suitable for engineering applications. The
validated hydrodynamic models will be further used in the
evaluation and selection of the most suitable
coagulation-flocculation reagents and optimum operating
conditions (injection point, reaction times, etc.), in order to
improve the quality of the drinking water.
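As an illustration of combining ideal-flow models in the way described above, the sketch below evaluates the RTD curve E(t) of a hypothetical plug-flow section in series with a single ideally mixed tank; the structure and parameter values are assumptions for demonstration, not the pilot plant's fitted model.

```python
import math

def rtd_pfr_cstr(t, tau_plug, tau_mix):
    """RTD of a plug-flow section (pure delay tau_plug) in series with
    an ideally mixed tank (mean residence time tau_mix):
    E(t) = exp(-(t - tau_plug)/tau_mix)/tau_mix for t >= tau_plug,
    and 0 before the delay."""
    if t < tau_plug:
        return 0.0
    return math.exp(-(t - tau_plug) / tau_mix) / tau_mix

# Sanity checks: E(t) integrates to 1 and its first moment equals the
# total mean residence time tau_plug + tau_mix (simple Riemann sums).
tau_p, tau_m, dt = 2.0, 5.0, 0.001
ts = [i * dt for i in range(int(100.0 / dt))]
area = sum(rtd_pfr_cstr(t, tau_p, tau_m) * dt for t in ts)
mean_t = sum(t * rtd_pfr_cstr(t, tau_p, tau_m) * dt for t in ts)
```

Fitting such a combined model to measured tracer curves amounts to choosing the delay and mixing times (and, for the dispersion model, a Peclet number) that minimize the misfit to the experimental E(t).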
Abstract: Response surface methodology was used for
quantitative investigation of water and solids transfer during
osmotic dehydration of beetroot in aqueous salt solution. The
effects of temperature (25–45 °C), processing time (30–150 min),
salt concentration (5–25%, w/w) and solution-to-sample ratio
(5:1–25:1) on osmotic dehydration of beetroot were estimated.
Quadratic regression equations describing the effects of these
factors on the water loss and solids gain were developed. It was
found that the effects of temperature and salt concentration on
the water loss were more significant than those of processing
time and solution-to-sample ratio. For solids gain, processing
time and salt concentration were the most significant factors.
The osmotic dehydration process was optimized for water loss,
solids gain, and weight reduction. The optimum conditions were
found to be: temperature 35 °C, processing time 90 min, salt
concentration 14.31% and solution-to-sample ratio 8.5:1. At
these optimum values, water loss, solids gain and weight
reduction were found to be 30.86, 9.43 and 21.43 (g/100 g
initial sample), respectively.
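A quadratic response-surface model of the kind described above can be sketched as an ordinary least-squares fit of a full second-order polynomial. The two coded factors and the synthetic data below are illustrative assumptions, not the paper's beetroot measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experiments": two coded factors (e.g. temperature and
# salt concentration, scaled to [-1, 1]) with a known quadratic
# response plus measurement noise.
T = rng.uniform(-1, 1, 40)
C = rng.uniform(-1, 1, 40)
water_loss = (30 + 4*T + 6*C - 3*T**2 - 2*C**2 + 1.5*T*C
              + rng.normal(0, 0.1, 40))

# Design matrix for the full second-order (response-surface) model:
# y = b0 + b1*T + b2*C + b11*T^2 + b22*C^2 + b12*T*C
X = np.column_stack([np.ones_like(T), T, C, T**2, C**2, T*C])
coeffs, *_ = np.linalg.lstsq(X, water_loss, rcond=None)
```

With the fitted coefficients in hand, the stationary point of the quadratic surface gives candidate optimum factor settings, which is the role the optimization step plays in the abstract.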
Abstract: This paper presents the initial findings of
research into distributed computer rendering. The goal of the
research is to create a distributed computer system capable of
rendering a 3D model into an MPEG-4 stream. This paper outlines
the initial design, software architecture and hardware setup for
the system.
Distributed computing means designing and implementing
programs that run on two or more interconnected computing
systems. It is often used to speed up the rendering of graphical
imagery; distributed computing systems are used to generate
images for movies, games and simulations.
A topic of interest is the application of distributed
computing to the MPEG-4 standard. During the course of the
research, a distributed system will be created that can render a
3D model into an MPEG-4 stream. It is expected that applying
distributed computing principles will speed up rendering, thus
improving the usefulness and efficiency of the MPEG-4 standard.
Abstract: Wavelets have provided researchers with significant
positive results in the texture defect detection domain. The
weak point of wavelets is that they are one-dimensional by
nature, so they are not efficient enough to describe and analyze
two-dimensional functions. In this paper we present a new method
to detect defects in texture images using the curvelet
transform. Simulation results of the proposed method on a set of
standard texture images confirm its correctness. Comparison of
the obtained results indicates the superior ability of the
curvelet transform, relative to the wavelet transform, to
describe discontinuities in two-dimensional functions.
Abstract: Much has been written about the difficulties students
have with producing traditional dissertations. This includes both
native English speakers (L1) and students with English as a second
language (L2). The main emphasis of these papers has been on the
structure of the dissertation, but in all cases, even when electronic
versions are discussed, the dissertation is still in what most would
regard as a traditional written form.
Master of Science Degrees in computing disciplines require
students to gain technical proficiency and apply their knowledge to a
range of scenarios. The premise of this paper is that if a
dissertation is a means of showing that such a student has met
the criteria for a pass, which should be based on the learning
outcomes of the dissertation module, does meeting those outcomes
require the student to demonstrate their skills in a solely
text-based form, particularly in a highly technical research
project? Could a student instead produce a series of related
artifacts forming a cohesive package that meets the learning
outcomes of the dissertation?
Abstract: An important problem in speech research is the automatic extraction of information about the shape and dimensions of the vocal tract during real-time speech production. We have previously developed Southampton dynamic magnetic resonance imaging (SDMRI) as an approach to the solution of this problem. However, the SDMRI images are very noisy, so that shape extraction is a major challenge. In this paper, we address the problem of tongue shape extraction, which poses difficulties because the tongue is a highly deforming non-parametric shape. We show that combining active shape models with the dynamic Hough transform allows the tongue shape to be reliably tracked in the image sequence.
Abstract: RoboCup Rescue simulation, as a large-scale
multi-agent system (MAS), is a challenging environment for
maintaining coordination between agents to achieve the
objectives despite sensing and communication limitations. The
dynamics of the environment and the intensive dependency between
the actions of different kinds of agents make the problem more
complex. This encouraged us to use learning-based methods to
adapt our decision making to different situations. Our approach
utilizes reinforcement learning. The use of learning in rescue
simulation is a current direction that has been the subject of
several studies in recent years. In this paper we present an
innovative learning method implemented for the Police Force (PF)
agent. This method can cope with the main difficulties that
exist in other learning approaches. Different methods used in
the literature have been examined; their drawbacks and possible
improvements have led us to the method proposed in this paper,
which is fast and accurate. The Brain Emotional Learning Based
Intelligent Controller (BELBIC) is our solution for learning in
this environment. BELBIC is a physiologically motivated approach
based on a computational model of the amygdala and the limbic
system. The paper presents the results obtained by the proposed
approach, showing the power of BELBIC as a decision-making tool
in complex and dynamic situations.
Abstract: One of the most basic tasks of control engineers is
the tuning of controllers. There are always several process
loops in a plant that require tuning. Auto-tuned Proportional
Integral Derivative (PID) controllers are designed for
applications where large load changes are expected or where
extreme accuracy and fast response times are needed. The
algorithm presented in this paper is used to tune a PID
controller, obtaining its parameters with minimal computational
complexity. It requires continuous analysis of the variation in
a few parameters, and lets the program run the plant test and
calculate the controller parameters to adjust and optimize the
variables for the best performance. The developed algorithm
needs less time than a normal step response test for continuous
tuning of the PID through gain scheduling.
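A minimal discrete PID loop of the kind being tuned can be sketched as follows; the controller gains and the first-order plant model are illustrative assumptions, not the paper's auto-tuning algorithm.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Close the loop around a hypothetical first-order plant
# dy/dt = (u - y)/tau, integrated with forward Euler.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
y, dt, tau = 0.0, 0.01, 0.5
for _ in range(2000):  # simulate 20 s
    u = pid.update(setpoint=1.0, measurement=y)
    y += dt * (u - y) / tau
```

An auto-tuner's job is to pick the gains (here fixed by hand) from a plant test; gain scheduling then switches between such gain sets as the operating point changes.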