Abstract: This paper proposes a framework for the development
of products comprising hardware and software components. It
provides for the separation of hardware-dependent software,
modifications of the current product development process, and the
integration of software modules with existing product configuration
models and assembly product structures. To identify the dependent
software, the framework considers product configuration modules
and engineering changes of associated software and hardware
components. To support efficient integration of the two different
development streams, hardware and software, a modified product
development process is proposed. The process integrates the
dependent software development into product development through
the interchange of specific product information. By using existing
product data models in Product Data Management (PDM), the
framework represents software as modules for product
configurations and as software parts for the product structure. The
framework is applied to the development of a robot system in order
to show its effectiveness.
Abstract: We provide a maximum norm analysis of a finite
element Schwarz alternating method for a nonlinear elliptic boundary
value problem of the form -Δu = f(u) on two overlapping
subdomains with non-matching grids. We consider a domain which is
the union of two overlapping subdomains, where each subdomain
has its own independently generated grid. Since the two meshes are
mutually independent on the overlap region, a triangle belonging to
one triangulation does not necessarily belong to the other one. Under
a Lipschitz assumption on the nonlinearity, we establish, on each
subdomain, an optimal L∞ error estimate between the discrete Schwarz
sequence and the exact solution of the boundary value problem.
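The alternating procedure can be illustrated with a minimal one-dimensional finite difference sketch (a simplification, not the paper's finite element setting): two overlapping subintervals with independently chosen grids exchange Dirichlet data by linear interpolation, and each sweep solves the Picard-linearized problem -u'' = f(u_old). The nonlinearity f and all grid parameters below are illustrative assumptions.

```python
import math

def solve_tridiag(a, b, c, d):
    # Thomas algorithm for a tridiagonal system (a: sub-, b: main, c: super-diagonal)
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def subdomain_solve(xs, ul, ur, u_old, f):
    # one finite difference solve of -u'' = f(u_old) with Dirichlet data ul, ur
    h = xs[1] - xs[0]
    n = len(xs) - 2                          # interior unknowns
    a = [-1.0 / h**2] * n
    b = [2.0 / h**2] * n
    c = [-1.0 / h**2] * n
    d = [f(u_old[i + 1]) for i in range(n)]
    d[0] += ul / h**2
    d[-1] += ur / h**2
    return [ul] + solve_tridiag(a, b, c, d) + [ur]

def interp(xs, us, x):
    # linear interpolation: the two grids do not match on the overlap
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1 - t) * us[i] + t * us[i + 1]
    return us[-1]

f = lambda u: 1.0 + 0.1 * math.sin(u)        # Lipschitz nonlinearity (assumed example)

# Omega_1 = (0, 0.625) and Omega_2 = (0.375, 1) with independent grids
n1, n2 = 40, 56
x1 = [0.625 * i / n1 for i in range(n1 + 1)]
x2 = [0.375 + 0.625 * i / n2 for i in range(n2 + 1)]
u1, u2 = [0.0] * (n1 + 1), [0.0] * (n2 + 1)

for _ in range(30):                          # Schwarz alternating sweeps
    u1 = subdomain_solve(x1, 0.0, interp(x2, u2, x1[-1]), u1, f)
    u2 = subdomain_solve(x2, interp(x1, u1, x2[0]), 0.0, u2, f)
```

After enough sweeps the two discrete iterates agree on the overlap region up to discretization error, which is the behaviour the error estimate quantifies.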
Abstract: Neighborhood Rough Sets (NRS) has been proven to
be an efficient tool for heterogeneous attribute reduction. However,
most research has focused on complete and noiseless data. In
practice, most information systems are noisy, i.e., filled with
incomplete and inconsistent data. In this paper, we introduce a
generalized neighborhood rough set model, called VPTNRS, to deal
with the problem of heterogeneous attribute reduction in noisy
systems. We generalize the classical NRS model with a tolerance
neighborhood relation and probabilistic theory.
Furthermore, we use the neighborhood dependency to evaluate the
significance of a subset of heterogeneous attributes and construct a
forward greedy algorithm for attribute reduction based on it.
Experimental results show that the model handles noisy data
efficiently.
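This is not the VPTNRS algorithm itself, but a minimal sketch of the classical ingredients it generalizes: a neighborhood defined by a distance threshold δ, the neighborhood dependency as the fraction of consistently classified samples, and a forward greedy search that adds the attribute with the largest dependency gain. The toy data and the δ value are assumptions.

```python
def neighbors(data, i, attrs, delta):
    # samples within Chebyshev distance delta of sample i on attributes attrs
    return [j for j in range(len(data))
            if max(abs(data[i][a] - data[j][a]) for a in attrs) <= delta]

def dependency(data, labels, attrs, delta):
    # fraction of samples whose whole neighborhood shares their class label
    if not attrs:
        return 0.0
    pos = sum(1 for i in range(len(data))
              if all(labels[j] == labels[i]
                     for j in neighbors(data, i, attrs, delta)))
    return pos / len(data)

def greedy_reduct(data, labels, n_attrs, delta=0.15):
    # forward greedy: repeatedly add the attribute with the largest gain
    reduct, best = [], 0.0
    while True:
        gains = [(dependency(data, labels, reduct + [a], delta), a)
                 for a in range(n_attrs) if a not in reduct]
        if not gains:
            return reduct
        g, a = max(gains)
        if g <= best:
            return reduct
        reduct.append(a)
        best = g

# toy data: attribute 0 separates the classes, attribute 1 is noise (assumed)
data = [[0.0, 0.5], [0.1, 0.9], [0.2, 0.1], [0.8, 0.4], [0.9, 0.8], [1.0, 0.2]]
labels = [0, 0, 0, 1, 1, 1]
```

On this toy data the greedy search keeps only attribute 0, since the noise attribute adds no dependency.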
Abstract: Information technology managers nowadays face
tremendous pressure to plan, implement, and adopt new technology
solutions owing to the rapid pace of technological change. Given the
lack of studies on this topic, the aim of this paper is to provide a
comparative review of tools currently being used to respond to
technological change. The study is based on an extensive literature
review of published works, the majority of them ranging from 2000
to the first part of 2011. The works were gathered from journals,
books, and other information sources available on the Web. Findings
show that each tool has a different focus and none of the tools
provides a holistic framework, which should include technical,
people, process, and business-environment aspects. Hence, this result
provides useful information about currently available tools that IT
managers could use to manage changes in technology. Further, the
result reveals a research gap in the area, where industry is short of
such a framework.
Abstract: This paper focuses on sovereign credit risk, a topic
made prominent by the current Eurozone crisis. In the light of the
recent financial crisis, market perception of the creditworthiness of
individual sovereigns has changed significantly. Before the outbreak
of the financial crisis, market participants did not differentiate
between credit risk borne by individual states despite different levels
of public indebtedness. In the proceeding of the financial crisis, the
market participants became aware of the worsening fiscal situation in
the European countries and started to discriminate among
government issuers. Concerns about the increasing sovereign risk
were reflected in surging sovereign risk premiums. The main aim of
this paper is to shed light on the characteristics of sovereign risk with
the special attention paid to the mutual relation between credit spread
and the CDS premium as the main measures of the sovereign risk
premium.
Abstract: Free convection effects and heat transfer due to a pulsating point heat source embedded in an infinite, fluid-saturated, porous dusty medium are studied analytically. Both velocity and temperature fields are discussed in the form of series expansions in the Rayleigh number, for both the fluid and particle phases, based on the mean heat generation rate from the source and on the permeability of the porous dusty medium. This study is carried out by assuming a small Rayleigh number and the validity of Darcy's law. Analytical expressions for both phases are obtained for the second-order mean in both velocity and temperature fields, and the evolution of different wave patterns is observed in the fluctuating part. It has been observed that, in the vicinity of the origin, the second-order mean flow is influenced only by the relaxation time of the dust particles and not by the dust concentration.
Abstract: This paper aims to develop a NOx emission model of
an acid gas incinerator using Nelder-Mead least squares support
vector regression (LS-SVR). The Malaysian DOE is actively enforcing the
Clean Air Regulation to mandate the installation of analytical
instrumentation known as Continuous Emission Monitoring System
(CEMS) to report emission levels online to the DOE. As a
hardware-based analyzer, CEMS is expensive, maintenance-intensive
and often unreliable. Therefore, a software-based predictive
technique is often preferred and considered a feasible alternative to
the CEMS for regulatory compliance. The LS-SVR model is built on
the emissions from an acid gas incinerator that operates in an LNG
Complex. Simulated Annealing (SA) is first used to determine the
initial hyperparameters, which are then further optimized, based on
the performance of the model, using the Nelder-Mead simplex
algorithm.
The LS-SVR model is shown to outperform a benchmark model
based on backpropagation neural networks (BPNN) in both training
and testing data.
Abstract: A challenging problem in radar signal processing is to
achieve reliable target detection in the presence of interferences. In
this paper, we propose a novel algorithm for automatic censoring of
radar interfering targets in log-normal clutter. The proposed
algorithm, termed the forward automatic censored cell averaging
detector (F-ACCAD), consists of two steps: removing the corrupted
reference cells (censoring) and the actual detection. Both steps are
performed dynamically by using a suitable set of ranked cells to
estimate the unknown background level and set the adaptive
thresholds accordingly. The F-ACCAD algorithm requires neither
prior information about the clutter parameters nor the number of
interfering targets. The effectiveness of the F-ACCAD
algorithm is assessed by computing, using Monte Carlo simulations,
the probability of censoring and the probability of detection in
different background environments.
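The censoring-then-detection idea can be sketched in a few lines. This is a simplification: the censoring depth k and scale factor are fixed here, whereas F-ACCAD determines the number of censored cells dynamically, and the sample values are made up.

```python
def censored_ca_detect(cut, reference, k, scale):
    # rank the reference cells and censor the highest ones, where strong
    # interfering returns concentrate; cell-average the k lowest-ranked cells
    ranked = sorted(reference)
    background = sum(ranked[:k]) / k        # censored background level estimate
    threshold = scale * background          # adaptive detection threshold
    return cut > threshold

# reference window corrupted by two strong interfering returns (hypothetical)
reference = [1.0, 1.2, 0.9, 1.1, 9.0, 1.05, 0.95, 8.5]
```

Without censoring, the two interferers inflate the cell average to about 2.96 and a target return of 6.0 would be masked; with k = 6 the background estimate stays near 1.03 and the target is detected.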
Abstract: Atlantic herring (Clupea harengus) is an important
commercial fish and is increasingly in demand for human
consumption. It is therefore very important to find good methods for
monitoring the freshness of the fish in order to keep it in the best
quality for human consumption. In this study, the fish was
stored in ice up to 2 weeks. Quality changes during storage were
assessed by the Quality Index Method (QIM), quantitative
descriptive analysis (QDA) and Torry scheme, by texture
measurements: puncture tests and Texture Profile Analysis (TPA)
tests on texture analyzer TA.XT2i, and by electronic nose (e-nose)
measurements using the FreshSense instrument. Storage time of
herring in ice could be estimated by QIM to within ± 2 days using 5
herring per lot. No correlation was found between instrumental
texture parameters and storage time, or between sensory and
instrumental texture variables. E-nose measurements could be used
to detect the onset of spoilage.
Abstract: The aim of the study was to evaluate the effect of
texturizers on the rheological properties of the apple mass and
desserts made from various raw materials. The apple varieties -
‘Antonovka’, ‘Baltais Dzidrais’, and ‘Zarja Alatau’ harvested in
Latvia, were used for the experiment. The apples were processed in a
blender unpeeled to obtain a homogeneous mass. The apple mass
was analyzed fresh and after storage at –18ºC. Both fresh and thawed
apple mass samples with added gelatin, xanthan gum, and sodium
carboxymethylcellulose were whisked to obtain desserts. Pectin, pH
and soluble dry matter of the product were determined. Apparent
viscosity was measured using a rotational viscometer DV–III Ultra.
Pectin content in frozen apple mass decreased significantly (p
Abstract: A hardware-efficient, multi-mode, reconfigurable
architecture of an interleaver/de-interleaver for multiple standards,
such as DVB, WiMAX and WLAN, is presented. The interleavers
consume a large part of silicon area when implemented by using
conventional methods as they use memories to store permutation
patterns. In addition, different types of interleavers in different
standards cannot share the hardware due to different construction
methodologies. The novelty of the work presented in this paper is
threefold: 1) Mapping of vital types of interleavers including
convolutional interleaver onto a single architecture with flexibility
to change interleaver size; 2) Hardware complexity for channel
interleaving in WiMAX is reduced by using 2-D realization of the
interleaver functions; and 3) Silicon cost overheads are reduced by
avoiding the use of small memories. The proposed architecture
consumes 0.18 mm² of silicon area in a 0.12 μm process and can
operate at a frequency of 140 MHz. The reduced complexity helps
in minimizing the memory utilization, and at the same time
provides strong support to on-the-fly computation of permutation
patterns.
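The on-the-fly permutation idea, for the simplest case of a block (row/column) interleaver, can be sketched as follows; the read address is computed per symbol instead of being fetched from a stored pattern. The dimensions are illustrative, not a DVB/WiMAX/WLAN configuration.

```python
def read_address(k, rows, cols):
    # k-th symbol read column-wise from a rows x cols array written row-wise;
    # computed on the fly, so no memory holds the permutation pattern
    col, row = divmod(k, rows)
    return row * cols + col

def block_interleave(data, rows, cols):
    return [data[read_address(k, rows, cols)] for k in range(rows * cols)]

# example: a 2 x 3 block
data = list(range(6))
interleaved = block_interleave(data, 2, 3)
```

De-interleaving is the same operation with the dimensions swapped, which is one way a single address generator can serve both directions.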
Abstract: Querying a data source and routing data towards the
sink becomes a serious challenge in static wireless sensor networks if
the sink
and/or the data source is mobile. Often the event to be observed
either moves or spreads across a wide area, making maintenance of a
continuous path between source and sink a challenge. Also, the sink
can move while a query is being issued or data is on its way towards
it.
In this paper, we extend our already proposed Grid Based Data
Dissemination (GBDD) scheme which is a virtual grid based
topology management scheme restricting impact of movement of
sink(s) and event(s) to some specific cells of a grid. This obviates the
need for frequent path modifications and hence maintains continuous
flow of data while minimizing network energy consumption.
Simulation experiments show significant improvements in network
energy savings and in the average delay for a packet to reach the sink.
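The cell-restriction idea behind GBDD can be illustrated with a small sketch: node positions map to virtual grid cells, and a sink or event movement matters only when a cell boundary is crossed. The cell size and coordinates are made up, and the real scheme's query forwarding along the grid is omitted.

```python
def cell_of(x, y, cell_size):
    # map a node position to its virtual grid cell
    return (int(x // cell_size), int(y // cell_size))

def affected_cells(old_pos, new_pos, cell_size):
    # movement triggers topology updates only when a cell boundary is crossed
    old_c = cell_of(old_pos[0], old_pos[1], cell_size)
    new_c = cell_of(new_pos[0], new_pos[1], cell_size)
    return [] if old_c == new_c else [old_c, new_c]
```

Movement within a cell returns an empty list, i.e. no path modification is needed, which is what keeps the data flow continuous.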
Abstract: In today's technology era, clusters have become a
necessity for modern computing and data applications, since many
applications take a long time (even days or months) to compute.
Although parallelization speeds up computation, the time required
for many applications can still be large. Thus, the reliability of the
cluster becomes a very important issue and the implementation of a
fault-tolerance mechanism becomes essential. The difficulty of
designing a fault-tolerant cluster system increases with the variety of
possible failures. Most importantly, an algorithm that handles a
simple failure in a system must also tolerate more severe failures. In
this paper, we implement a watchdog timer in a parallel environment
to take care of failures. Implementing this simple algorithm helps us
handle different types of failures; consequently, we found that the
reliability of the cluster improves.
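A watchdog timer of the kind described, restarted by worker heartbeats and firing a failure handler on silence, can be sketched with Python threading. The timeouts and handlers below are illustrative; the paper's implementation runs in a parallel cluster environment.

```python
import threading
import time

class Watchdog:
    """Fires `on_failure` if `kick()` is not called within `timeout` seconds."""

    def __init__(self, timeout, on_failure):
        self.timeout = timeout
        self.on_failure = on_failure
        self._timer = None

    def kick(self):
        # heartbeat: cancel the pending timer and start a fresh one
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self.on_failure)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer is not None:
            self._timer.cancel()

# healthy worker: heartbeats arrive well inside the timeout, handler never fires
healthy_events = []
wd = Watchdog(0.5, lambda: healthy_events.append("failed"))
wd.kick()
for _ in range(3):
    time.sleep(0.1)
    wd.kick()
wd.stop()

# silent worker: no heartbeat, so the failure handler fires
failed_events = []
wd2 = Watchdog(0.1, lambda: failed_events.append("failed"))
wd2.kick()
time.sleep(0.5)
```

In a cluster, the handler would typically mark the node as failed and trigger task migration rather than just record an event.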
Abstract: With the fast evolution of digital data exchange, information security becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
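The keystream idea can be illustrated with a single Fibonacci LFSR used to whiten low-entropy plaintext before block encryption. This is a simplification: A5/1 combines three irregularly clocked LFSRs, and the AES stage is omitted here. The 19-bit register width and tap positions mirror A5/1's first register, but the seed is an arbitrary example.

```python
def lfsr_keystream_bits(seed, taps, width, nbits):
    # Fibonacci LFSR: output the LSB, feed the XOR of the taps back at the MSB
    state = seed
    bits = []
    for _ in range(nbits):
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        bits.append(state & 1)
        state = (state >> 1) | (fb << (width - 1))
    return bits

def whiten(data, seed, taps=(18, 17, 16, 13), width=19):
    # XOR each plaintext byte with 8 keystream bits; whitening with the
    # same seed twice restores the original bytes
    ks = lfsr_keystream_bits(seed, taps, width, len(data) * 8)
    out = bytearray()
    for i, byte in enumerate(data):
        k = 0
        for j in range(8):
            k = (k << 1) | ks[i * 8 + j]
        out.append(byte ^ k)
    return bytes(out)

# a run of identical bytes stands in for a low-entropy image region
plain = bytes(16)
masked = whiten(plain, seed=0x5A5A5)
```

In the modified scheme, `masked` rather than `plain` would be fed to AES, so identical plaintext blocks no longer produce identical cipher input.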
Abstract: Acute kidney injury (AKI) is a new worldwide public
health problem. A diagnosis of this disease using creatinine is still a
problem in clinical practice. Therefore, a measurement of biomarkers
responsible for AKI has received much attention in the past couple of
years. Cytokine interleukin-18 (IL-18) was reported as one of the
early biomarkers for AKI. The most commonly used method to
detect this biomarker is an immunoassay. This study used a planar
platform to perform an immunoassay using fluorescence for
detection. In this study, anti-IL-18 antibody was immobilized onto a
microscope slide using a covalent binding method. Make-up samples
were diluted to concentrations between 10 and 1000 pg/ml to create
a calibration curve. The precision of the system was determined
using the coefficient of variation (CV), which was found to be less
than 10%. The performance of this immunoassay system was
compared with the measurement from ELISA.
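The precision figure can be reproduced with the standard CV formula; the replicate readings below are hypothetical, not the paper's data.

```python
import math

def coefficient_of_variation(values):
    # CV (%) = sample standard deviation / mean * 100
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return math.sqrt(var) / mean * 100.0

# hypothetical replicate fluorescence readings for one IL-18 concentration
replicates = [512.0, 498.0, 505.0, 520.0, 490.0]
cv = coefficient_of_variation(replicates)
```

A CV below 10% across replicates, as reported, indicates the assay's within-run precision is acceptable for a calibration-curve measurement.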
Abstract: Fast depth estimation from binocular vision is often
desired for autonomous vehicles, but most algorithms cannot easily
be put into practice because of their high computational cost. We
present an image-processing technique that can quickly estimate a
depth image from binocular vision images. By finding the lines that
represent the best-matched areas in the disparity space image, the
depth can be estimated. When detecting these lines, an
edge-emphasizing filter is used. The final depth estimate is produced
after a smoothing filter. Our method is a compromise between local
methods and global optimization.
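A local block-matching step of the kind a disparity space image is built from can be sketched as follows: the matching cost is the sum of absolute differences over a small window, and the best-matching shift is the disparity. The synthetic image pair and window size are assumptions, and the paper's line-detection and filtering stages are omitted.

```python
def sad(left, right, x, y, d, w):
    # sum of absolute differences between a (2w+1)^2 window in the left image
    # and the same window shifted by disparity d in the right image
    return sum(abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
               for dy in range(-w, w + 1) for dx in range(-w, w + 1))

def disparity_at(left, right, x, y, max_d, w=1):
    # best-matching shift = disparity; depth is proportional to 1/disparity
    costs = [(sad(left, right, x, y, d, w), d)
             for d in range(min(max_d, x - w) + 1)]
    return min(costs)[1]

# synthetic pair: the right image is the left image shifted by 2 pixels
left = [[x * x + 3 * y for x in range(10)] for y in range(6)]
right = [[left[y][min(x + 2, 9)] for x in range(10)] for y in range(6)]
```

Depth then follows from the usual stereo relation (depth = focal length × baseline / disparity), which is why a fast disparity estimate yields a fast depth image.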
Abstract: Vacuum membrane distillation (VMD) process can be
used for water purification or the desalination of salt water. The
process simply consists of a flat-sheet hydrophobic microporous
PTFE membrane and a diaphragm vacuum pump without a condenser
for water recovery or a trap. The feed was an aqueous NaCl
solution. The VMD experiments were performed to evaluate the heat
and mass transfer coefficients of the boundary layer in a membrane
module. Only two operating parameters, feed inlet temperature and
feed flow rate, were investigated. The permeate flux was strongly
affected by the feed inlet temperature, feed flow rate, and boundary
layer heat transfer coefficient. Lowering the temperature polarization
coefficient is essential to enhance process performance considerably,
and maximizing the heat transfer coefficient maximizes the mass flux
of distillate water. In this paper, the results of VMD experiments are
used to measure the boundary layer heat transfer coefficient, and the
experimental results are used to re-evaluate the empirical constants
in the Dittus-Boelter equation.
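The correlation being re-evaluated has the standard form Nu = a·Re^b·Pr^c with h = Nu·k/d_h. A quick sketch, using the textbook constants (a = 0.023, b = 0.8, c = 0.4 for heating) and made-up flow properties rather than the paper's fitted values:

```python
def dittus_boelter_h(Re, Pr, k, d_h, a=0.023, b=0.8, c=0.4):
    # Nu = a * Re**b * Pr**c; a, b, c are the empirical constants that
    # the VMD experiments are used to re-evaluate
    Nu = a * Re**b * Pr**c
    return Nu * k / d_h   # boundary layer heat transfer coefficient, W/(m^2 K)

# illustrative water-like feed conditions (assumed, not measured values)
h = dittus_boelter_h(Re=5000.0, Pr=7.0, k=0.6, d_h=0.01)
```

Refitting a, b, c against measured fluxes replaces these default constants with values specific to the membrane module geometry.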
Abstract: Image synthesis is an important area in image processing.
To synthesize images, various systems have been proposed in
the literature. In this paper, we propose a bio-inspired system to
synthesize images and, to study the generative power of the system,
we define the class of languages generated by it. We refer to an
image as an array in this paper. We use a primitive called an
iso-array to synthesize an image/array. The operation is double
splicing on iso-arrays. The double splicing operation is used in DNA
computing, and we use it here to synthesize images. A comparison of
the family of languages generated by the proposed self-restricted
double splicing systems on
iso-arrays with the existing family of local iso-picture languages is
made. Certain closure properties such as union, concatenation and
rotation are studied for the family of languages generated by the
proposed model.
Abstract: Modern power systems demand fast energy management systems (EMS). Contingency analysis is among the time-consuming functions in an EMS. To address this limitation, this paper introduces agent-based technology into contingency analysis, where the main function of the agents is to speed up performance. The negotiation process in decision making is explained, and the objective considered is the minimization of operating costs. The IEEE 14-bus system and its line outages have been used in the research, and simulation results are presented.
Abstract: In this paper, an improved technique for contingency
ranking using an artificial neural network (ANN) is presented. The
proposed approach applies multi-layer perceptrons trained by
backpropagation to contingency analysis. Severity indices in dynamic
stability assessment are presented. These indices are based on the
concept of coherency and three dot products of the system variables.
It is well known that some indices work better than others for a
particular power system. This paper, along with test results on
several different systems, demonstrates that a combination of indices
with an ANN provides better ranking than a single index. The
presented results are obtained using the power system simulator
(PSS/E) and MATLAB 6.5 software.