Abstract: Biochemical Oxygen Demand (BOD) is a measure of
the oxygen used in the bacteria-mediated oxidation of organic substances
in water and wastewater. Theoretically, an infinite time is required for
complete biochemical oxidation of organic matter, but the
measurement is made over a 5-day test period at 20 °C or a 3-day test
period at 27 °C, with or without dilution. Researchers have worked to further
reduce the time of measurement.
The objective of this paper is to review the advances made in
BOD measurement, primarily to minimize the measurement time and
overcome the measurement difficulties. The literature on four such
techniques is surveyed, namely the BOD-BART™ method, biosensors,
the ferricyanide-mediated approach, and the luminous bacterial
cells-immobilized chip method. The basic principle, method of
determination, data validation, and advantages and disadvantages of
each method have been incorporated.
In the BOD-BART™ method, the time lag for the system to change
from an oxidative to a reductive state is measured. Biosensors couple
a biological sensing element with a transducer that produces a signal
proportional to the analyte concentration. Each microbial species has
its own metabolic deficiencies; co-immobilization of bacteria in a
sol-gel biosensor increases the range of substrates. In the
ferricyanide-mediated approach, ferricyanide is used as the electron
acceptor instead of oxygen. In the luminous bacterial cells-immobilized
chip method, the bacterial bioluminescence caused by the lux genes is
observed; the physiological response, a reduction or emission of light,
is measured and correlated to BOD.
There is scope to probe further into the rapid estimation of BOD.
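The time-limited test described above rests on the standard first-order BOD model, BODt = L0(1 − e^(−kt)). A minimal sketch in Python, with the rate constant k treated purely as an illustrative assumption (in practice it must be determined for each waste):

```python
import math

def bod_exerted(L0, k, t):
    """First-order BOD model: oxygen demand (mg/L) exerted after t days,
    given ultimate BOD L0 (mg/L) and deoxygenation rate k (1/day, base e)."""
    return L0 * (1.0 - math.exp(-k * t))

def ultimate_bod(bod_t, k, t):
    """Invert the model: estimate the ultimate BOD from a t-day measurement."""
    return bod_t / (1.0 - math.exp(-k * t))
```

For example, with an assumed k of 0.23 per day at 20 °C, a 5-day reading recovers roughly two-thirds of the ultimate demand, which is why shortening the test further requires the alternative techniques reviewed here.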
Abstract: In this paper the application of a neuro-fuzzy system to the equalization of channel distortion is considered. The structure and operation algorithm of the neuro-fuzzy equalizer are described. The use of a neuro-fuzzy equalizer in digital signal transmission decreases the parameter training time and the complexity of the network. A simulation of the neuro-fuzzy equalizer is performed. The obtained results confirm the efficiency of applying neuro-fuzzy technology to channel equalization.
Abstract: As the demand for higher capacity in a cellular environment increases, the cell size decreases. This makes the role of suitable handoff algorithms, which reduce both the number of handoffs and the handoff delay, more important. In this paper we show that applying the grey prediction technique to handoff leads to a considerable decrease in handoff delay while using a small number of handoffs, compared with traditional hysteresis-based handoff algorithms.
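The abstract does not spell out which grey model is used; the usual workhorse of grey prediction is GM(1,1). A minimal sketch, assuming the handoff logic feeds it a short window of recent signal-strength samples (the function and its use here are illustrative, not the paper's exact algorithm):

```python
import math

def gm11_next(x0):
    """One-step-ahead GM(1,1) grey prediction for a short positive series
    (e.g. recent received-signal-strength samples); returns the next value."""
    # accumulated generating operation (AGO)
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    # background values and least-squares fit of x0(k) = -a*z(k) + b
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, len(x1))]
    y = list(x0[1:])
    n = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(zi * zi for zi in z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    det = n * szz - sz * sz
    a = -(n * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    # whitenized time-response function, then inverse AGO for the next point
    c = x0[0] - b / a
    xhat = lambda k: c * math.exp(-a * k) + b / a
    return xhat(len(x0)) - xhat(len(x0) - 1)
```

On an exponentially trending series such as [1, 1.1, 1.21, 1.331] the model predicts the next value to within a fraction of a percent, which is the property that lets a handoff algorithm anticipate signal decay instead of waiting for a hysteresis threshold.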
Abstract: Hepatocellular carcinoma, also called hepatoma, most
commonly appears in a patient with chronic viral hepatitis. In
patients with a higher suspicion of HCC, such as a small or subtle
rise in serum enzyme levels, the best method of diagnosis
involves a CT scan of the abdomen, but only at high cost. The aim of
this study was to increase the physician's ability to detect
HCC early, using a probabilistic neural network-based approach, in order
to save time and hospital resources.
Abstract: Web 2.0 (social networking, blogging and online
forums) can serve as a data source for social science research because
it contains vast amounts of information from many different users.
The volume of that information has been growing at a very high rate,
and it is becoming a network of heterogeneous data; this makes
information difficult to find and therefore hardly useful. We have proposed
a novel theoretical model for gathering and processing data from
Web 2.0, which would reflect the semantic content of web pages in a
better way. This article deals with the analysis part of the model and
its usage for the content analysis of blogs. The introductory part of the
article describes the methodology for gathering and processing data
from blogs. The next part of the article focuses on the evaluation
and content analysis of blogs that write about a specific trend.
Abstract: In this paper, the Differential Evolution (DE) algorithm, a promising new evolutionary algorithm, is proposed to train Radial Basis Function (RBF) networks with automatic configuration of the network architecture. Classification tasks on the Iris, Wine, New-thyroid, and Glass data sets are conducted to measure the performance of the neural networks. Compared with a standard RBF training algorithm in the Matlab neural network toolbox, DE achieves a more rational architecture for RBF networks. The resulting networks hence obtain strong generalization abilities.
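The classic DE scheme (DE/rand/1/bin) underlying such training can be sketched as follows; the objective here is a toy sphere function rather than the paper's RBF-training objective, and all parameter values are illustrative defaults:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           iters=200, seed=0):
    """Minimal DE/rand/1/bin sketch: minimize f over box constraints."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # guarantee one mutated component
            # mutation + binomial crossover
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == jr) else pop[i][k]
                     for k in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            fc = f(trial)
            if fc <= cost[i]:  # greedy selection
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]
```

In the paper's setting the candidate vectors would encode RBF centers, widths, and architecture choices, and f would be the network's classification error on a training set.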
Abstract: The Beijing road traffic system, as a typical huge
urban traffic system, provides a platform for analyzing the complex
characteristics and the evolving mechanisms of urban traffic systems.
Based on dynamic network theory, we construct the dynamic model
of the Beijing road traffic system in which the dynamical properties
are described completely. Furthermore, we conclude
that urban traffic systems can be viewed as static networks, stochastic
networks and complex networks at different system phases by
analyzing their structural randomness. In addition, we demonstrate the
evolving process of the Beijing road traffic network based on real
traffic data, and validate the stochastic characteristics and the scale-free
property of the network at different phases.
Abstract: Group work, projects and discussions are important
components of teacher education courses, whether they are face-to-face,
blended, or exclusively online in format. This paper examines the variety of
tasks and challenges with this learning format in a face-to-face
teacher education class, providing specific examples of both
failure and success from both the student and instructor perspectives.
The discussion begins with a brief history of collaborative and cooperative learning, moves to an exploration of the promised
benefits and then takes a look at some of the challenges which can
arise specifically from the use of new technologies. The discussion concludes with guidelines and specific suggestions.
Abstract: We propose a multi-agent based utilitarian approach
to model and understand information flows in social networks that
lead to Pareto optimal informational exchanges. We model the
individual expected utility function of the agents to reflect the net
value of information received. We show how this model, adapted
from a theorem by Karl Borch dealing with an actuarial risk
exchange concept in the insurance industry, can be used for social
network analysis. We develop a utilitarian framework that allows us
to interpret Pareto optimal exchanges of value as potential
information flows, while achieving a maximization of a sum of
expected utilities of information of the group of agents. We examine
some interesting conditions on the utility function under which the
flows are optimal. We illustrate the promise of this new approach to
attach economic value to information in networks with a synthetic
example.
Abstract: Pressure-driven microscale gas flow-separation has
been investigated by solving the compressible Navier-Stokes (NS)
system of equations. A two-dimensional explicit finite volume (FV)
compressible flow solver has been developed using modified
advection upwind splitting methods (AUSM+) with no-slip/first-order
Maxwell's velocity slip conditions to predict the flow-separation
behavior in microdimensions. The effects of the scale factor
of the flow geometry and of the gas species on microscale gas
flow-separation have been studied in this work. The intensity of
flow-separation is reduced as the scale of the flow geometry
decreases. In reduced dimensions, flow-separation may not be
present at all under flow conditions similar to those of the larger flow
geometry. The flow-separation patterns depend greatly on the
properties of the medium under similar flow conditions.
Abstract: There has been much research on detecting collisions between real and virtual objects in 3D space. In general, these techniques need huge computing power, so many studies rely on cloud computing, network computing, and distributed computing. For this reason, this paper proposes a novel, fast 3D collision detection algorithm between real and virtual objects using 2D intersection areas. The proposed algorithm uses four cameras and a coarse-and-fine method to improve the accuracy and speed of collision detection. In the coarse step, the system examines the intersection area between the real and virtual object silhouettes from all camera views. The result of this step is the set of indices of virtual sensors that have a possibility of collision in 3D space. To decide collision accurately, in the fine step, the system examines collision in 3D space using the visual hull algorithm. The performance of the algorithm is verified by comparison with an existing algorithm. We believe the proposed algorithm can help many other research and application fields such as HCI, augmented reality, and intelligent space.
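The coarse step described above can be sketched as a per-view 2D overlap test. This illustration uses axis-aligned bounding rectangles of the silhouettes (the paper works on the silhouette areas themselves, and the fine visual-hull step is omitted); a virtual sensor survives the coarse step only if it overlaps the real object in every camera view:

```python
def rects_intersect(r1, r2):
    """Overlap test for 2D axis-aligned rectangles given as (x, y, w, h)."""
    x1, y1, w1, h1 = r1
    x2, y2, w2, h2 = r2
    return x1 < x2 + w2 and x2 < x1 + w1 and y1 < y2 + h2 and y2 < y1 + h1

def coarse_candidates(real_rects, sensor_rects):
    """Coarse step: a virtual sensor is a collision candidate only if its
    projected silhouette intersects the real object's silhouette in all views.
    real_rects: one rect per camera view; sensor_rects: per sensor, one rect
    per camera view."""
    candidates = []
    for idx, per_view in enumerate(sensor_rects):
        if all(rects_intersect(rr, vr) for rr, vr in zip(real_rects, per_view)):
            candidates.append(idx)
    return candidates
```

Only the surviving indices need the expensive 3D visual-hull check, which is what makes the coarse-and-fine split fast.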
Abstract: In this work, we examine fluid mixing in a full three-stream mixing channel with longitudinal vortex generators (LVGs) built on the channel bottom by numerical simulation and experiment. The effects of the asymmetrical arrangement and the attack angle of the LVGs on fluid mixing are investigated. The results show that the micromixer with LVGs at a small asymmetry index (defined by the ratio of the distance from the center plane of the gap between the winglets to the center plane of the main channel to the width of the main channel) is superior to the micromixer with symmetric LVGs and that with LVGs at a large asymmetry index. Here, a section of channel with two pairs of staggered asymmetrical LVGs is called a mixing module. The micromixer using five mixing modules of the LVGs with an attack angle between 16.5 degrees and 22.5 degrees can achieve excellent mixing over a wide range of Reynolds numbers. Moreover, the micromixer with LVGs at a small attack angle is more efficient than that with a larger attack angle when pressure losses are taken into account.
Abstract: This study extends research on the relationship
between marketing strategy and market segmentation by
investigating market segments in the cement industry.
Competitive strength and rivals' distance from the factory were used
as business environment variables. Three segments (positive, neutral
or indifferent, and zero zones) were identified as strategic segments,
and for each segment a marketing strategy (aggressive, defensive, or
decline) was developed. This study employed data from the cement
industry to fulfill two objectives: the first is to give a framework for
the segmentation of the cement industry, and the second is to develop
marketing strategies under varying competitive strength. Fifty-six
questionnaires containing close- and open-ended questions were
collected and analyzed. Results supported the theory that segments
tend to be more aggressive than defensive when competitive strength
increases. It is concluded that high-strength segments follow total
market coverage, concentric diversification, and frontal attacks on their
competitors. With decreased competitive strength, businesses tend to
follow a multi-market strategy, product modification/improvement, and
flank attacks on direct competitors. Segments with weak competitive
strength followed focus and decline strategies.
Abstract: Recently, improvements in the processing performance
of computers and in the high-speed communication of optical fibers
have been achieved, so the amount of data processed by computers
and flowing over networks has been increasing greatly.
However, in a client-server system, since the server receives and
processes the data from the clients through the network, the
load on the server is increasing. Thus, it is necessary to introduce
a server with high processing ability and a line with high
bandwidth. In this paper, concerning P2P networks that resolve the
load on a specific server, a criterion called the Indexed-Priority Metric
is proposed and its performance is evaluated. The proposed metric
allocates files to each node so that the load on a specific server is
distributed among the nodes equally well. A P2P file
sharing system using the proposed metric is implemented. Simulation
results show that the proposed metric can distribute the files of the
specific server among the nodes.
Abstract: Along with the forward supply chain, organizations need
to consider the impact of reverse logistics, due to its economic
advantages, social awareness, and strict legislation. In this paper, we
develop a system dynamics framework for a closed-loop supply
chain with fuzzy demand and fuzzy collection rate by incorporating
a product exchange policy in the forward channel and various recovery
options in the reverse channel. The uncertainty issues associated with
acquisition and collection of used product have been quantified using
possibility measures. In the simulation study, we analyze order
variation at both retailer and distributor level and compare bullwhip
effects of different logistics participants over time between the
traditional forward supply chain and the closed-loop supply chain.
Our results suggest that the integration of reverse logistics can reduce
order variation and bullwhip effect of a closed-loop system. Finally,
sensitivity analysis is performed to examine the impact of various
parameters on recovery process and bullwhip effect.
Abstract: The Peng-Robinson (PR) cubic equation of state (EoS) is extended to polymers by using a single set of energy (A1, A2, A3) and co-volume (b) parameters per polymer, fitted to experimental volume data. Excellent results for the volumetric behavior of the 11 polymers up to 2000 bar pressure are obtained. The EoS is applied to the correlation and prediction of Henry's constants in polymer solutions comprising three polymers and many nonpolar and polar solvents, including supercritical gases. The correlation achieved with two adjustable parameters is satisfactory compared with the experimental data. As a result, the present work provides a simple and useful model for the prediction of Henry's constant for polymer-containing systems, including those containing polar, nonpolar and supercritical fluids.
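For reference, the classical Peng-Robinson form for a simple fluid can be sketched as below. The polymer extension in the paper replaces the usual alpha function with the fitted (A1, A2, A3) energy parameters, which are not reproduced here; this sketch shows only the standard pure-fluid EoS:

```python
import math

R = 8.314462618  # universal gas constant, J/(mol*K)

def pr_pressure(T, v, Tc, Pc, omega):
    """Classical Peng-Robinson EoS: pressure (Pa) of a pure fluid at
    temperature T (K) and molar volume v (m^3/mol), from critical
    temperature Tc (K), critical pressure Pc (Pa), and acentric factor."""
    a = 0.45724 * R**2 * Tc**2 / Pc                 # energy parameter
    b = 0.07780 * R * Tc / Pc                       # co-volume
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc)))**2
    return R * T / (v - b) - a * alpha / (v * (v + b) + b * (v - b))
```

At large molar volumes the expression approaches the ideal-gas pressure RT/v, a quick sanity check for any implementation.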
Abstract: Context awareness is a capability whereby mobile
computing devices can sense their physical environment and adapt
their behavior accordingly. The term context-awareness, in
ubiquitous computing, was introduced by Schilit in 1994 and has
become one of the most exciting concepts in early 21st-century
computing, fueled by recent developments in pervasive computing
(i.e. mobile and ubiquitous computing). These include computing
devices worn by users, embedded devices, smart appliances, sensors
surrounding users and a variety of wireless networking technologies.
Context-aware applications use context information to adapt
interfaces, tailor the set of application-relevant data, increase the
precision of information retrieval, discover services, make the user
interaction implicit, or build smart environments. For example, a
context-aware mobile phone will know that the user is currently in a
meeting room and reject any unimportant calls. One of the major
challenges in providing users with context-aware services lies in
continuously monitoring their contexts with numerous sensors
connected to the context-aware system through wireless
communication. A number of sensor-based context-aware frameworks
have been proposed, but many of them have neglected the
fact that monitoring with sensors imposes heavy workloads on
ubiquitous devices with limited computing power and battery. In this
paper, we present CALEEF, a lightweight and energy-efficient
context-aware framework for resource-limited ubiquitous devices.
Abstract: The current study describes a multi-objective optimization technique for the positioning of houses in a residential neighborhood. The main task is the placement of residential houses in a favorable configuration satisfying a number of objectives. Solving the house layout problem is a challenging task. It requires an iterative approach to satisfy design requirements (e.g. energy efficiency, sky view, daylight, road networks, visual privacy, and clear access to favorite views). These design requirements vary from one project to another based on location and client preferences. In the Gulf region, the most important socio-cultural factor is visual privacy in indoor space. Hence, most of the residential houses in this region are surrounded by high fences to provide privacy, which has a direct impact on other requirements (e.g. daylight and direction to favorite views). This investigation introduces a novel technique to optimally locate and orient residential buildings to satisfy a set of design requirements. The developed technique explores the search space for possible solutions. This study considers two-dimensional house planning problems. However, it can be extended to solve three-dimensional cases.
Abstract: Finger spelling is an art of communicating by signs
made with fingers, and has been introduced into sign language to serve
as a bridge between the sign language and the verbal language.
Previous approaches to finger spelling recognition are classified into
two categories: glove-based and vision-based approaches. The
glove-based approach is simpler and more accurate at recognizing
hand postures than the vision-based one, yet its interfaces require the
user to wear a cumbersome glove and carry a load of cables connecting
the device to a computer. In contrast, vision-based approaches provide
an attractive alternative to the cumbersome interface, and promise
more natural and unobtrusive human-computer interaction.
Vision-based approaches generally consist of two steps, hand
extraction and recognition, and the two steps are processed independently.
This paper proposes a real-time vision-based Korean finger spelling
recognition system that integrates hand extraction into recognition.
First, we tentatively detect a hand region using the CAMShift algorithm.
Then the fill factor and the aspect ratio, estimated from the width and
height given by CAMShift, are used to choose candidates from the
database, which reduces the number of matches in the recognition step.
To recognize the finger spelling, we use DTW (dynamic time warping)
based on modified chain codes, to be robust to scale and orientation
variations. In this procedure, since accurate hand regions, without
holes and noise, should be extracted to improve precision, we use the
graph cuts algorithm, which globally minimizes an energy function
elegantly expressed by Markov random fields (MRFs). In the
experiments, the computational times are less than 130 ms, and the
times are not related to the number of finger spelling templates in the
database, as candidate templates are selected in the extraction step.
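The DTW matching step can be sketched as below. The paper's modified chain codes are not specified in the abstract, so plain 8-direction chain codes with a cyclic symbol distance are assumed here for illustration:

```python
def chain_code_dist(a, b):
    """Distance between two 8-direction chain-code symbols (0..7),
    wrapping around so that e.g. directions 0 and 7 are neighbors."""
    d = abs(a - b)
    return min(d, 8 - d)

def dtw(seq1, seq2, dist=chain_code_dist):
    """Classic dynamic-time-warping cost between two symbol sequences."""
    n, m = len(seq1), len(seq2)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(seq1[i - 1], seq2[j - 1])
            # extend the cheapest of match, insertion, deletion
            D[i][j] = c + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because DTW aligns sequences of different lengths, a contour traced at a different scale (more chain-code symbols) can still match its template cheaply, which is the robustness property the recognition step relies on.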
Abstract: Dielectric sheet perturbation to the dominant TE111
mode resonant frequency of a circular cavity is studied and presented
in this paper. The dielectric sheet, placed at the middle of the
air-filled cavity, introduces discontinuities and disturbs the configuration
of electromagnetic fields in the cavity. For fixed dimensions of cavity
and fixed thickness of the loading dielectric, the dominant resonant
frequency varies quite linearly with the permittivity of the dielectric.
This quasi-linear relationship is plotted using Maple software and
verified using 3D electromagnetic simulations. Two probes are used
in the simulation for wave excitation into and out of the cavity. The
best probe length is found to be 3 mm, giving the resonant
frequency closest to the one calculated using Maple. A total of fourteen
different dielectrics, with permittivity ranging from 1 to 12.9, are tested
one by one in the simulation. The results show very close agreement
between Maple and the simulation: a constant
difference of 0.04 GHz is found between the resonant frequencies
collected during simulation and the ones from Maple. The success of
this project may lead to the possibility of using the middle-loaded
cavity at the TE111 mode for microwave non-destructive testing of solid
materials.
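As background, the resonant frequency of the TE111 mode of a uniformly filled circular cavity follows the standard formula f = c/(2π√εr)·√((x'₁₁/a)² + (π/d)²), where x'₁₁ ≈ 1.8412 is the first root of J₁'. A minimal sketch of this unperturbed, fully filled case (not the sheet-loaded cavity analyzed in the paper, which requires a perturbation or mode-matching treatment):

```python
import math

C0 = 299792458.0     # speed of light in vacuum, m/s
XP11 = 1.8411837813  # first root of J1'(x), governing TE11* circular modes

def te111_resonance(radius, length, eps_r=1.0):
    """Resonant frequency (Hz) of the TE111 mode of a circular cavity of
    given radius and length (m), uniformly filled with a lossless
    dielectric of relative permittivity eps_r."""
    kc = XP11 / radius        # transverse cutoff wavenumber
    kz = math.pi / length     # axial wavenumber for one half-wavelength (p = 1)
    return C0 / (2.0 * math.pi * math.sqrt(eps_r)) * math.sqrt(kc**2 + kz**2)
```

Since f scales as 1/√εr in the fully filled case, the quasi-linear dependence on permittivity reported for the thin-sheet-loaded cavity is a distinct, weaker perturbation effect.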