Abstract: The conventional assessment of human semen is
highly subjective, with considerable intra- and interlaboratory
variability. Computer-Assisted Sperm Analysis (CASA)
systems provide a rapid and automated assessment of the sperm
characteristics, together with improved standardization and quality
control. However, the outcome of CASA systems is sensitive to the
method of experimentation. While conventional CASA systems use
digital microscopes with phase-contrast accessories, producing
higher contrast images, we have used raw semen samples (no
staining materials) and a regular light microscope, with a digital
camera directly attached to its eyepiece, to ensure low cost and
simple assembly of the system. Since accurate detection
of sperm cells in the semen image is the first step in the examination and
analysis of the semen, any error in this step can affect the outcome of
the analysis. This article introduces and explains an algorithm for
detecting sperm cells in low-contrast images: First, an image enhancement
algorithm is applied to remove extra particles from the image. Then,
the foreground particles (including sperm cells and round cells) are
segmented from the background. Finally, based on certain features
and criteria, sperm cells are separated from the other cells.
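A minimal sketch of the three-stage pipeline just described (enhancement, segmentation, feature-based separation), assuming a bright-field image with dark cells; the median filter, global threshold, and area bounds are illustrative stand-ins for the paper's actual criteria:

```python
import numpy as np
from scipy import ndimage

def detect_sperm_candidates(image, min_area=3, max_area=50):
    """Sketch of the three-stage pipeline: enhance, segment, classify.

    The area bounds are illustrative assumptions, not values from the
    paper; a real system would use richer shape features to separate
    sperm heads from round cells.
    """
    # 1. Enhancement: median filtering suppresses small debris particles.
    smoothed = ndimage.median_filter(image.astype(float), size=3)
    # 2. Segmentation: foreground = pixels darker than the mean (dark
    #    cells on a bright field); a real system might use Otsu or
    #    adaptive thresholding instead.
    mask = smoothed < smoothed.mean()
    # 3. Feature-based separation: keep connected components whose area
    #    is plausible for a sperm head; large round cells and tiny
    #    residual debris are rejected.
    labels, n = ndimage.label(mask)
    kept = [i for i in range(1, n + 1)
            if min_area <= (labels == i).sum() <= max_area]
    return labels, kept
```

Here connected-component area stands in for the unspecified "certain features and criteria"; in practice eccentricity or head/tail geometry would be added.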
Abstract: Biometrics, which refers to identifying an individual
based on his or her physiological or behavioral characteristics, has
the capability to reliably distinguish between an authorized person
and an impostor. Signature verification systems can be categorized as
offline (static) and online (dynamic). This paper presents a
neural-network-based system for recognizing offline handwritten
signatures that is trained with low-resolution scanned signature images.
Abstract: This paper highlights the controversial socioscientific
issues and their misconceptions in Nigeria as well as in some other
low-literacy societies around the world. It outlines the relevance of
these issues in Nigeria, some of which may be neutral or absent in
other countries. The need to understand the issues, and how such an
understanding can contribute to the achievement of the Millennium
Development Goals (MDGs), is also discussed. The paper
concludes by outlining the responsibilities of science teachers in
dispelling the misconceptions surrounding these socioscientific issues.
Abstract: Fault detection determines the existence of a fault and the
time of its occurrence. This paper discusses two-layered fault detection
methods to enhance reliability and safety. The two-layered fault detection methods
consist of fault detection methods of component level controllers and
system level controllers. Component level controllers detect faults by
using limit checking, model-based detection, and data-driven
detection, while system level controllers perform detection through
stability analysis, which can detect unknown changes. System level controllers
compare detection results via stability with fault signals from lower
level controllers. This paper addresses fault detection methods via
stability and suggests fault detection criteria in nonlinear systems. The
fault detection method is applied to the hybrid control unit of a military
hybrid electric vehicle so that the hybrid control unit can detect faults
of the traction motor.
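As a minimal illustration of the component-level layer, limit checking and a model-based residual test might look like the following; the thresholds and the predicted signal are hypothetical, not values from the paper:

```python
def limit_check(samples, lower, upper):
    """Limit checking: flag the indices of samples that violate the
    static limits [lower, upper]."""
    return [i for i, x in enumerate(samples) if not lower <= x <= upper]

def residual_check(measured, predicted, threshold):
    """Model-based detection: flag samples whose residual |y - y_hat|
    exceeds a threshold derived from the model's expected accuracy."""
    return [i for i, (y, yh) in enumerate(zip(measured, predicted))
            if abs(y - yh) > threshold]
```

The system-level stability analysis described in the abstract would then cross-check these flags against its own detection results.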
Abstract: In this paper we introduce an effective ECG compression algorithm based on the two-dimensional multiwavelet transform. Multiwavelets offer simultaneous orthogonality, symmetry, and short support, which is not possible with scalar two-channel wavelet systems. These features are known to be important in signal processing; thus multiwavelets offer the possibility of superior performance for image processing applications. The SPIHT algorithm has achieved notable success in still image coding. We propose applying the SPIHT algorithm to the 2-D multiwavelet transform of 2-D arranged ECG signals. Experiments on selected ECG records from the MIT-BIH arrhythmia database revealed that the proposed algorithm is significantly more efficient than previously proposed ECG compression schemes.
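The 2-D arrangement step can be sketched as cutting the 1-D ECG signal into fixed-length segments stacked as rows; the segment length here is an illustrative parameter, whereas in practice it would be tied to the beat period:

```python
import numpy as np

def arrange_2d(ecg, row_len):
    """Cut a 1-D ECG signal into consecutive segments of row_len samples
    and stack them as rows of a 2-D array; trailing samples that do not
    fill a complete row are dropped. Aligning rows on heartbeats exposes
    the beat-to-beat correlation that 2-D transforms can exploit."""
    n_rows = len(ecg) // row_len
    return np.asarray(ecg[:n_rows * row_len]).reshape(n_rows, row_len)
```

The resulting matrix is what the 2-D multiwavelet transform and SPIHT coder would operate on.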
Abstract: Microscopic emission and fuel consumption models
have been widely recognized as an effective method to quantify real
traffic emission and energy consumption when they are applied with
microscopic traffic simulation models. This paper presents a
framework for developing the Microscopic Emission (HC, CO, NOx,
and CO2) and Fuel consumption (MEF) models for light-duty
vehicles. The variable of composite acceleration is introduced into
the MEF model with the purpose of capturing the effects of historical
accelerations interacting with current speed on emission and fuel
consumption. The MEF model is calibrated by multivariate
least-squares method for two types of light-duty vehicle using
on-board data collected in Beijing, China by a Portable Emission
Measurement System (PEMS). Instantaneous validation shows that the
MEF model performs better, with a lower Mean Absolute
Percentage Error (MAPE), than the other two models. Moreover,
aggregate validation indicates that the MEF model produces
reasonable estimations compared to actual measurements, with
prediction errors within 12%, 10%, 19%, and 9% for HC, CO, and NOx
emissions and fuel consumption, respectively.
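The multivariate least-squares calibration step might be sketched as follows; the regressor set (speed, composite acceleration, and their product) is an illustrative assumption, not the paper's actual MEF specification:

```python
import numpy as np

def calibrate_mef(speed, accel, measured):
    """Least-squares calibration of a hypothetical emission model of the
    form e = c0 + c1*v + c2*a + c3*v*a, where v is instantaneous speed
    and a is the composite acceleration. The coefficient vector is fit
    to on-board (PEMS-style) measurements."""
    X = np.column_stack([np.ones_like(speed), speed, accel, speed * accel])
    coeffs, *_ = np.linalg.lstsq(X, measured, rcond=None)
    return coeffs
```

One such coefficient vector would be fitted per vehicle type and per pollutant (HC, CO, NOx, CO2) and for fuel consumption.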
Abstract: Porous silicon (PS), formed by anodization
of a p+ type silicon substrate, consists of a network organized in a
pseudo-columnar structure with multiple side ramifications. The structural
micro-topology can be interpreted as the fraction of the interconnected
solid phase contributing to thermal transport. The
reduction in the dimensions of the silicon of each nanocrystallite during
oxidation induces a reduction in thermal conductivity. The integration of
thermal sensors in silicon microsystems requires effective
insulation of the sensor element. Indeed, the low thermal conductivity
of PS makes it very promising for the fabrication of integrated
thermal microsystems. In this work we are interested in
measuring the thermal conductivity of PS (on the surface and in depth)
by micro-Raman spectroscopy. The thermal conductivity is
studied as a function of the anodization parameters (initial doping and
current density). We also determine the porosity of the samples by
spectroellipsometry.
Abstract: The main objective of this paper is to identify and
disseminate good practice in quality assurance and enhancement, as
well as in teaching and learning, at the master's level. This paper focuses
on the experience of the Erasmus Mundus Master program CIMET
(Color in Informatics and Media Technology). Amongst topics
covered, we discuss the adjustments necessary to a curriculum
designed for excellent international students and their preparation for
a global labor market.
Abstract: This paper proposes a stroke extraction method for use in off-line signature verification. After giving a brief overview of current ongoing research, an algorithm is introduced for detecting and following strokes in static images of signatures. Problems such as the handling of junctions and variations in line width and line intensity are discussed in detail. Results are validated both by using an existing on-line signature database and by employing image registration methods.
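A minimal sketch of stroke following on a binarized skeleton, assuming the skeleton is given as a set of pixel coordinates; the paper's detailed junction handling is simplified here to "take the first unvisited neighbour":

```python
def follow_stroke(skeleton, start):
    """Follow a stroke in a skeletonized signature image: starting at
    `start`, repeatedly step to an unvisited 8-neighbour belonging to
    the skeleton (a set of (x, y) tuples) until a dead end is reached."""
    visited = {start}
    path = [start]
    x, y = start
    while True:
        nbrs = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]
        unvisited = [p for p in nbrs if p in skeleton and p not in visited]
        if not unvisited:
            return path
        # Junction handling is simplified: just take the first candidate.
        x, y = unvisited[0]
        visited.add((x, y))
        path.append((x, y))
```

A full stroke extractor would additionally resolve junctions by line-width and intensity continuity, as the paper discusses.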
Abstract: Let {X_i} be a lacunary system. We establish a large
deviations inequality for lacunary systems. Furthermore, we obtain the
Marcinkiewicz law of large numbers for sequences of dependent random
variables.
Abstract: Superelastic Shape Memory Alloy (SMA) has gained
acceptance for use as connections in steel structures. The seismic behaviour
of steel frames with SMA is assessed in this study. Three eight-storey
steel frames with different SMA systems are suggested: the
first is braced with a diagonal bracing system, the second
is braced with a knee bracing system, while in the last the
SMA is used as connections at the plastic hinge regions of beams.
Nonlinear time history analyses of steel frames with SMA subjected
to two different ground motion records have been performed using
Seismostruct software. To evaluate the efficiency of suggested
systems, the dynamic responses of the frames were compared. From
the comparison results, it can be concluded that using SMA element
is an effective way to improve the dynamic response of structures
subjected to earthquake excitations. Implementing the SMA braces
can lead to a reduction in residual roof displacement. The shape
memory alloy is effective in reducing the maximum displacement at
the frame top and it provides a large elastic deformation range. SMA
connections are very effective in dissipating energy and reducing the
total input energy of the whole frame under severe seismic ground
motion. The SMA connection system is more effective in
controlling the reaction forces at the frame base than the other bracing
systems, while using SMA as bracing is more effective in reducing the
displacements. The efficiency of SMA depends on the input
ground motions as well as on the construction system.
Abstract: We evaluate the average energy consumption per bit
in Optical Packet Switches equipped with BENES switching fabric
realized in Semiconductor Optical Amplifier (SOA) technology. We
also study the impact that the Amplified Spontaneous Emission
(ASE) noise generated by a transmission system has on the power
consumption of the BENES switches, due to the gain saturation of the
SOAs used to realize the switching fabric. As an example, for
32×32 switches supporting 64 wavelengths and an offered traffic equal
to 0.8, the average energy consumption per bit is 2.34 · 10⁻¹ nJ/bit,
and it increases as the ASE noise introduced by the transmission systems
increases.
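The quantities involved can be sketched as follows; the power figure, the line rate, and the reduction of the Benes fabric to 2×2 elements are illustrative assumptions, not values from the paper:

```python
import math

def benes_stages(n):
    """Number of stages in an N x N Benes network built from 2x2
    switching elements: 2*log2(N) - 1."""
    return 2 * int(math.log2(n)) - 1

def energy_per_bit(total_power_w, wavelengths, bitrate_bps, load):
    """Average energy per transported bit: total switch power divided by
    the aggregate carried bit rate (wavelengths x line rate x offered
    load). All parameter values passed in are illustrative."""
    return total_power_w / (wavelengths * bitrate_bps * load)
```

ASE-induced gain saturation raises the SOA drive power needed per stage, which is how noise feeds into the numerator of this ratio.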
Abstract: In the present study the efficiency of Big Bang-Big
Crunch (BB-BC) algorithm is investigated in discrete structural
design optimization. It is shown that a standard version of the BB-BC
algorithm is sometimes unable to produce reasonable solutions to
problems from discrete structural design optimization. Two
reformulations of the algorithm, which are referred to as modified
BB-BC (MBB-BC) and exponential BB-BC (EBB-BC), are
introduced to enhance the capability of the standard algorithm in
locating good solutions for steel truss and frame type structures,
respectively. The performance of the proposed algorithms is
evaluated and compared with that of the standard version, as well as with
some other algorithms, over several practical design examples. In these
examples, steel structures are sized for minimum weight subject to
stress, stability and displacement limitations according to the
provisions of AISC-ASD.
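A continuous-variable sketch of the standard BB-BC algorithm that the paper starts from (the discrete MBB-BC and EBB-BC reformulations are not reproduced here):

```python
import random

def bb_bc_minimize(f, dim, bounds, pop=30, iters=100, seed=0):
    """Minimal sketch of the standard Big Bang-Big Crunch algorithm.

    Big Crunch: contract the population to a fitness-weighted centre of
    mass (weights 1/f for minimization). Big Bang: scatter new candidates
    around that centre with a radius shrinking over the iterations."""
    rng = random.Random(seed)
    lo, hi = bounds
    pts = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for k in range(1, iters + 1):
        # Big Crunch: fitness-weighted centre of mass.
        w = [1.0 / (f(p) + 1e-12) for p in pts]
        centre = [sum(wi * p[d] for wi, p in zip(w, pts)) / sum(w)
                  for d in range(dim)]
        # Big Bang: new candidates around the centre, radius ~ (hi-lo)/k,
        # clipped to the search bounds.
        pts = [[min(hi, max(lo, centre[d] + rng.gauss(0, 1) * (hi - lo) / k))
                for d in range(dim)] for _ in range(pop)]
    return min(pts, key=f)
```

For the paper's structural problems, the design variables would instead be indices into discrete steel section tables, which is what motivates the MBB-BC and EBB-BC reformulations.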
Abstract: Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are made against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
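The bidding idea can be illustrated with a minimal sealed-bid auction sketch; the second-price (Vickrey) payment rule used here is an assumption for illustration, since the abstract does not fix a particular payment rule:

```python
def run_auction(bids):
    """Sealed-bid second-price auction sketch for placing one replica:
    each agent bids the benefit its site would derive from hosting the
    object; the highest bidder wins but pays the second-highest bid,
    which makes truthful bidding a dominant strategy and thereby reins
    in the agents' selfishness."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price
```

Repeating such an auction per data object yields a fine-grained placement driven by the agents' self-reported benefits.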
Abstract: Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. Moreover, it concentrates on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jet-audio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample process indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space. They also show that input/output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.
Abstract: Synchronous cooperative systems (SCS) bring together users that are geographically distributed and connected through a network to carry out a task. Examples of SCS include Tele-Immersion and Tele-Conferences. In SCS, coordination is the core of the system, and it has been defined as the act of managing interdependencies between activities performed to achieve a goal. Some of the main problems that SCS present deal with the management of constraints between simultaneous activities and the execution ordering of these activities. In order to resolve these problems, orderings based on Lamport's happened-before relation have been used, namely causal, Δ-causal, and causal-total orderings. They mainly differ in the degree of asynchronous execution allowed. One of the most important orderings is the causal order, which establishes that events must be seen in the cause-effect order in which they occur in the system. In this paper we show that for certain SCS (e.g. videoconferences, tele-immersion) where some degradation of the system is allowed, ensuring the causal order is still too rigid, which can have negative effects on the system. We illustrate how a more relaxed ordering, which we call Fuzzy Causal Order (FCO), is useful for such kinds of systems by allowing a more asynchronous execution than the causal order. The benefit of the FCO is illustrated by applying it to a particular scenario of intermedia synchronization of an audio-conference system.
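For contrast with FCO, the strict causal delivery condition can be sketched with vector clocks (a standard formulation of causal order, not code from the paper):

```python
def causally_deliverable(msg_vc, sender, local_vc):
    """Strict causal delivery test with vector clocks: a message m from
    process `sender` is deliverable at a process whose clock is
    `local_vc` iff (1) m is the next message expected from the sender,
    i.e. msg_vc[sender] == local_vc[sender] + 1, and (2) every message
    that causally precedes m from any other process has already been
    delivered, i.e. msg_vc[k] <= local_vc[k] for all k != sender.
    FCO relaxes exactly this kind of blocking condition."""
    return (msg_vc[sender] == local_vc[sender] + 1 and
            all(msg_vc[k] <= local_vc[k]
                for k in range(len(msg_vc)) if k != sender))
```

A message failing this test must be buffered until its causal past arrives, which is the rigidity the paper argues degradable media streams do not always need.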
Abstract: The orthogonal processes that shape a triangular steel plate into an equilateral vertical steel section are examined by an incremental elasto-plastic finite-element method based on an updated Lagrangian formulation. The highly non-linear problems due to the geometric changes, the inelastic constitutive behavior, and the boundary conditions that vary with deformation are taken into account in an incremental manner. On the contact boundary, a modified Coulomb friction model is specially considered. A weighting factor r-minimum is employed to limit the step size of the loading increment so that a linear relation holds. In particular, selective reduced integration was adopted to formulate the stiffness matrix. The simulated geometries of verticality clearly demonstrate the vertical processes until unloading. A series of experiments and simulations were performed to validate the formulation in the theory, leading to the development of the computer codes. The whole deformation history and the distribution of stress, strain, and thickness during the forming process were obtained by carefully considering the moving boundary condition in the finite-element method. Therefore, this model can be used for judging whether an equilateral vertical steel section can be shaped successfully. The present work may be expected to improve the understanding of the formation of the equilateral vertical steel.
Abstract: The IVE toolkit has been created for facilitating research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and for exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as an AI middleware without any changes. The main facility of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work, including an educational game, on this platform. Keywords: AI middleware, simulation, virtual world.
Abstract: Image Edge Detection is one of the most important
parts of image processing. In this paper, a new method based on fuzzy
techniques is proposed to improve digital image edge detection. In this
method, a 3x3 mask is employed to process each pixel by means of its
neighbourhood. Each pixel is treated as a fuzzy input, and by evaluating
fuzzy rules over its neighbourhood the edge pixels are identified;
standard image processing routines then display the edges more
clearly. This method shows significant improvement compared to
classic edge detection methods (e.g. Sobel, Canny).
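A minimal sketch of a 3x3 fuzzy edge detector in the spirit of this method; the membership function, thresholds, and defuzzification rule are illustrative assumptions, not the paper's rule base:

```python
import numpy as np

def fuzzy_edges(image, low=10.0, high=50.0):
    """For each interior pixel, the largest absolute intensity difference
    to its 8 neighbours is mapped through a ramp membership function:
    differences below `low` are fully 'not edge' (0), above `high` fully
    'edge' (1), and linear in between. Defuzzification here is a simple
    0.5 cut on the membership value."""
    img = image.astype(float)
    h, w = img.shape
    membership = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # Maximum contrast within the 3x3 neighbourhood of (i, j).
            diff = np.abs(img[i-1:i+2, j-1:j+2] - img[i, j]).max()
            membership[i, j] = np.clip((diff - low) / (high - low), 0.0, 1.0)
    return membership >= 0.5
```

A fuller rule base would distinguish edge directions within the mask rather than using a single contrast measure.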
Abstract: This study uses a simulation to establish a realistic
environment for laboratory research on Accountable Care
Organizations. We study network attributes in order to gain insights
regarding healthcare providers' conduct and performance. Our
findings indicate how network structure creates significant
differences in organizational performance. We demonstrate that
healthcare providers who position themselves at the central, pivotal
point of the network, while maintaining alliances with their
partners, produce better outcomes.