Abstract: This paper gives an overview of how an OWL
ontology has been created to represent the template knowledge
models defined in CML that are provided by CommonKADS.
CommonKADS is a mature knowledge engineering methodology
which proposes the use of template knowledge models for
knowledge modelling. The aim of developing this ontology is to
present the template knowledge models in a knowledge
representation language that can be easily understood and shared
in the knowledge engineering community. Hence OWL is used, as it
has become a standard for ontologies and already has
user-friendly tools for viewing and editing.
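As a minimal illustration of the representation in question, OWL classes for a template knowledge model can be written in Turtle syntax; the class names, comment, and IRI below are illustrative assumptions, not the paper's actual ontology.

```python
# A minimal sketch (not the paper's ontology): a CommonKADS template
# knowledge model expressed as OWL classes in Turtle syntax. The
# names TemplateKnowledgeModel / Diagnosis and the IRI are assumptions.

PREFIX = "http://example.org/commonkads#"  # hypothetical IRI

def owl_class(name, subclass_of=None, comment=None):
    """Emit Turtle triples declaring an OWL class."""
    lines = [f"kads:{name} a owl:Class ."]
    if subclass_of:
        lines.append(f"kads:{name} rdfs:subClassOf kads:{subclass_of} .")
    if comment:
        lines.append(f'kads:{name} rdfs:comment "{comment}" .')
    return "\n".join(lines)

header = (f"@prefix kads: <{PREFIX}> .\n"
          "@prefix owl: <http://www.w3.org/2002/07/owl#> .\n"
          "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .\n")

ttl = "\n".join([
    header,
    owl_class("TemplateKnowledgeModel"),
    owl_class("Diagnosis", subclass_of="TemplateKnowledgeModel",
              comment="Analytic task template: find a fault from symptoms."),
])
print(ttl)
```

A file in this form can then be opened directly in the user-friendly OWL tools the abstract alludes to, such as ontology editors that accept Turtle.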
Abstract: Vector quantization is a powerful tool for speech
coding applications. This paper deals with LPC coding of speech
signals using a new technique called Multi Switched Split
Vector Quantization (MSSVQ), a hybrid of the multi-stage,
switched, and split vector quantization techniques. The spectral
distortion performance, computational complexity, and memory
requirements of MSSVQ are compared to those of split vector
quantization (SVQ), multi-stage vector quantization (MSVQ), and
switched split vector quantization (SSVQ). Results show that
MSSVQ has better spectral distortion performance, lower
computational complexity, and lower memory requirements than
all of the above-mentioned product code vector quantization
techniques. Computational complexity is measured in floating
point operations (flops), and memory requirements are measured
in floats.
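One building block of MSSVQ can be sketched concretely: in split vector quantization, the parameter vector is divided into sub-vectors, each quantized against its own small codebook, which is what reduces memory relative to one full-dimensional codebook. The dimensions, split sizes, and random codebooks below are toy placeholders, not trained LPC codebooks.

```python
# A toy sketch of split vector quantization (SVQ), one component of
# MSSVQ: a 10-dimensional parameter vector is split into parts, each
# quantized independently by nearest-neighbour search in its own
# codebook. Codebooks here are random placeholders, not trained ones.
import numpy as np

rng = np.random.default_rng(0)

def svq_quantize(x, codebooks, split_sizes):
    """Quantize each sub-vector of x against its own codebook."""
    out, start = [], 0
    for cb, size in zip(codebooks, split_sizes):
        part = x[start:start + size]
        idx = np.argmin(np.sum((cb - part) ** 2, axis=1))  # nearest code
        out.append(cb[idx])
        start += size
    return np.concatenate(out)

split_sizes = [3, 3, 4]                                      # 3 parts
codebooks = [rng.normal(size=(16, s)) for s in split_sizes]  # 16 codes each
x = rng.normal(size=10)
xq = svq_quantize(x, codebooks, split_sizes)
print("quantization error:", np.linalg.norm(x - xq))
```

Memory here is 16·3 + 16·3 + 16·4 = 160 floats, versus 16³ or more codewords for an unsplit codebook of comparable resolution, which illustrates the flops/floats accounting used in the comparison.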
Abstract: We provide a maximum norm analysis of a finite
element Schwarz alternating method for a nonlinear elliptic
boundary value problem of the form -Δu = f(u) on two overlapping
subdomains with non-matching grids. We consider a domain which is
the union of two overlapping subdomains, each of which has its
own independently generated grid. Since the two meshes are
mutually independent on the overlap region, a triangle belonging
to one triangulation does not necessarily belong to the other.
Under a Lipschitz assumption on the nonlinearity, we establish,
on each subdomain, an optimal L∞ error estimate between the
discrete Schwarz sequence and the exact solution of the boundary
value problem.
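The alternating iteration underlying such an analysis can be sketched as follows; the notation (iterate indices and interface conditions) is an illustrative reconstruction of the standard scheme, not quoted from the paper.

```latex
% Schwarz alternating iteration for -\Delta u = f(u) on
% \Omega = \Omega_1 \cup \Omega_2 (illustrative notation)
\begin{cases}
-\Delta u_1^{n+1} = f\big(u_1^{n+1}\big) & \text{in } \Omega_1,\\
u_1^{n+1} = u_2^{n} & \text{on } \partial\Omega_1 \cap \Omega_2,
\end{cases}
\qquad
\begin{cases}
-\Delta u_2^{n+1} = f\big(u_2^{n+1}\big) & \text{in } \Omega_2,\\
u_2^{n+1} = u_1^{n+1} & \text{on } \partial\Omega_2 \cap \Omega_1.
\end{cases}
```

Each half-step solves the nonlinear problem on one subdomain using the latest iterate from the other as artificial boundary data; the non-matching grids mean this interface data must be interpolated between the two independent triangulations.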
Abstract: Soils are normally dried in either a convection oven or on a stove. Laboratory moisture content testing indicated that the typical drying duration for a convection oven is 24 hours. The purpose of this study was to determine the accuracy and soil drying duration for both moisture content and liquid limit using microwave radiation. The soils were tested with both convection and microwave ovens. The convection oven was considered to produce the true values for both the natural moisture content and the liquid limit of soils; it was therefore used as the basis for comparison for the results of the microwave ovens. The samples used in this study were obtained from different projects of the Consulting Engineering Bureau of the College of Engineering of Sulaimani University. These samples were collected from different locations and at different depths, and consist mostly of brown and light brown clay and silty clay. A total of 102 samples were prepared: 26 of them were tested for natural moisture determination, while the other 76 were used for liquid limit determination.
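The quantity compared between the two drying methods is the standard gravimetric moisture content, which can be sketched directly; the sample masses below are illustrative numbers, not the study's data.

```python
# A small sketch of the computation behind both drying methods: the
# gravimetric moisture content from masses before and after drying.
# Sample masses are illustrative, not taken from the study.
def moisture_content(mass_wet, mass_dry):
    """Water mass as a percentage of the dry soil mass."""
    return 100.0 * (mass_wet - mass_dry) / mass_dry

# e.g. comparing the two drying methods on the same sample
w_convection = moisture_content(mass_wet=152.4, mass_dry=131.6)
w_microwave = moisture_content(mass_wet=152.4, mass_dry=131.2)
print(round(w_convection, 1), round(w_microwave, 1))
```

The accuracy question the study addresses is then how closely the microwave-derived dry mass (and hence this percentage) tracks the convection-oven value taken as ground truth.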
Abstract: This study investigates the electrical performance of a
planar solid oxide fuel cell unit with cross-flow configuration
when fuel utilization is high and the fuel inlet flow is
non-uniform. A software package developed in this study solves
two-dimensional, simultaneous, partial differential equations of
mass, energy, and electro-chemistry, without considering
variation along the stack direction. The results show that fuel
utilization increases, and the average current density decreases,
as the molar flow rate drops. In addition, the non-uniform
Pattern A induces a more severe non-reaction area at the corner
between the fuel exit and the air inlet. This non-reaction area
degrades the average current density and thereby reduces the
electrical performance by 7%.
Abstract: Information technology managers nowadays face
tremendous pressure to plan, implement, and adopt new technology
solutions owing to the rapid pace of technological change. Given
the lack of studies on this topic, the aim of this paper is to
provide a comparative review of the tools currently used to
respond to technological changes. The study is based on an
extensive review of published works, the majority of them
ranging from 2000 to the first part of 2011. The works were
gathered from journals, books, and other information sources
available on the Web. Findings show that each tool has a
different focus and that none of the tools provides a holistic
framework covering the technical, people, process, and business
environment aspects. Hence, this result provides useful
information about currently available tools that IT managers
could use to manage changes in technology. Further, the result
reveals a research gap: industry lacks such a framework.
Abstract: This paper focuses on sovereign credit risk, a hot
topic related to the current Eurozone crisis. In the light of the
recent financial crisis, market perception of the creditworthiness
of individual sovereigns has changed significantly. Before the
outbreak of the financial crisis, market participants did not
differentiate between the credit risk borne by individual states,
despite different levels of public indebtedness. As the financial
crisis proceeded, market participants became aware of the
worsening fiscal situation in the European countries and started
to discriminate among government issuers. Concerns about the
increasing sovereign risk were reflected in a surging sovereign
risk premium. The main aim of this paper is to shed light on the
characteristics of sovereign risk, with special attention paid to
the mutual relationship between the credit spread and the CDS
premium as the main measures of the sovereign risk premium.
Abstract: Free convection effects and heat transfer due to a pulsating point heat source embedded in an infinite, fluid-saturated, porous dusty medium are studied analytically. Both the velocity and temperature fields are discussed in the form of series expansions in the Rayleigh number, for both the fluid and particle phases, based on the mean heat generation rate from the source and on the permeability of the porous dusty medium. The study assumes that the Rayleigh number is small and that Darcy's law is valid. Analytical expressions for both phases are obtained for the second-order mean of both the velocity and temperature fields, and the evolution of different wave patterns is observed in the fluctuating part. It is observed that, in the vicinity of the origin, the second-order mean flow is influenced only by the relaxation time of the dust particles and not by the dust concentration.
Abstract: This paper aims to develop a NOx emission model of
an acid gas incinerator using Nelder-Mead least squares support
vector regression (LS-SVR). The Malaysian DOE is actively
imposing the Clean Air Regulation to mandate the installation of
analytical instrumentation, known as a Continuous Emission
Monitoring System (CEMS), to report emission levels online to
the DOE. As a hardware-based analyzer, a CEMS is expensive,
maintenance-intensive, and often unreliable. Therefore, a
software-based predictive technique is often preferred and
considered a feasible alternative to the CEMS for regulatory
compliance. The LS-SVR model is built on the emissions from an
acid gas incinerator that operates in an LNG complex. Simulated
annealing (SA) is first used to determine the initial
hyperparameters, which are then further optimized, based on the
performance of the model, using the Nelder-Mead simplex
algorithm. The LS-SVR model is shown to outperform a benchmark
model based on backpropagation neural networks (BPNN) on both
training and testing data.
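The tuning loop described above can be sketched in miniature: an LS-SVR (in the standard Suykens dual formulation) is fitted for candidate hyperparameters, and the Nelder-Mead simplex refines them against a validation error. The data, starting point, and train/validation split below are toy placeholders, not the incinerator's NOx data; a real run would start from the simulated-annealing estimates.

```python
# A minimal sketch of Nelder-Mead refinement of LS-SVR hyperparameters
# (RBF width sigma, regularisation gamma). Data are synthetic stand-ins.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)       # toy target
Xtr, ytr, Xva, yva = X[:30], y[:30], X[30:], y[30:]

def lssvr_fit_predict(Xtr, ytr, Xte, sigma, gamma):
    """Standard LS-SVR dual solution (Suykens formulation)."""
    def K(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d / (2 * sigma ** 2))
    n = len(ytr)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K(Xtr, Xtr) + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], ytr)))
    b, alpha = sol[0], sol[1:]
    return K(Xte, Xtr) @ alpha + b

def val_mse(log_params):
    sigma, gamma = np.exp(log_params)     # optimise in log-space
    pred = lssvr_fit_predict(Xtr, ytr, Xva, sigma, gamma)
    return np.mean((pred - yva) ** 2)

res = minimize(val_mse, x0=np.log([1.0, 10.0]), method="Nelder-Mead")
sigma, gamma = np.exp(res.x)
print("tuned sigma, gamma:", sigma, gamma, "val MSE:", res.fun)
```

Optimising in log-space keeps both hyperparameters positive without explicit bounds, which suits the unconstrained Nelder-Mead method.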
Abstract: Clustering is a well-known technique in data mining. One of the most widely used clustering techniques is the k-means algorithm. Solutions obtained from this technique depend on the initialization of the cluster centers. In this article we propose a new algorithm to initialize the clusters. The proposed algorithm is based on finding a set of medians extracted from the dimension with maximum variance. The algorithm has been applied to different data sets, and good results are obtained.
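The initialization idea can be sketched directly from the abstract: pick the feature with maximum variance, order the points along it, and take medians of equal-sized slices as initial centers. Details beyond the abstract (e.g. the exact slicing into k groups) are our assumptions.

```python
# A sketch of median-based k-means initialization: k medians taken
# along the maximum-variance dimension. Slicing details are assumed.
import numpy as np

def median_init(X, k):
    """Return k initial centers from medians along the max-variance dim."""
    d = np.argmax(X.var(axis=0))            # dimension with max variance
    order = np.argsort(X[:, d])
    slices = np.array_split(order, k)       # k contiguous groups
    return np.array([np.median(X[idx], axis=0) for idx in slices])

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),   # two synthetic clusters
               rng.normal(3, 0.3, (50, 2))])
centers = median_init(X, k=2)
print(centers)
```

On well-separated data like this, the slice medians land near the true cluster centers, which is what makes the subsequent k-means run insensitive to random seeding.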
Abstract: Lighting upgrades involve relatively low costs, which
allows the benefits to be spread more widely than is possible
with any other energy efficiency measure. In order to popularize
the adoption of CFLs in Taiwan, the authority proposes to
implement a new energy-efficient lamp comparative label system.
The current study was accordingly undertaken to investigate the
factors affecting the performance, and the deviation between the
actual and labeled performance, of commercially available
integrated CFLs. In this paper, standard test methods to
determine the electrical and photometric performance of CFLs
were developed based on CIE 84-1989 and CIE 60901-1987, and then
55 CFLs selected from the market were tested. The results show
that CFLs with a higher color temperature achieve lower
efficacy. It was noticed that most CFL packaging lacks Color
Rendering Index information. Also, no correlation between the
price and the performance of the CFLs was found in this work.
The results of this paper may help consumers make more informed
CFL-purchasing decisions.
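The basic quantity behind both the label and the measured deviation is luminous efficacy, which can be sketched simply; the lamp entries below are made-up examples, not any of the 55 tested CFLs.

```python
# A sketch of the quantity compared in the study: luminous efficacy
# (lumens per watt) and its deviation from the labeled value.
# The lamp data are illustrative, not the study's measurements.
lamps = [  # (name, measured lumens, measured watts, labeled lm/W)
    ("lamp A", 780.0, 13.2, 62.0),
    ("lamp B", 1050.0, 19.5, 58.0),
]

for name, lumens, watts, labeled in lamps:
    efficacy = lumens / watts
    deviation = 100.0 * (efficacy - labeled) / labeled
    print(f"{name}: {efficacy:.1f} lm/W ({deviation:+.1f}% vs label)")
```

A comparative label system of the kind the authority proposes would rank lamps by this measured efficacy rather than by the packaging claim.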
Abstract: Atlantic herring (Clupea harengus) is an important
commercial fish and is increasingly in demand for human
consumption. It is therefore very important to find good methods
for monitoring the freshness of the fish in order to keep it at
the best quality for human consumption. In this study, the fish
was stored in ice for up to 2 weeks. Quality changes during
storage were assessed by the Quality Index Method (QIM),
quantitative descriptive analysis (QDA), and the Torry scheme;
by texture measurements, namely puncture tests and Texture
Profile Analysis (TPA) tests on a TA.XT2i texture analyzer; and
by electronic nose (e-nose) measurements using the FreshSense
instrument. The storage time of herring in ice could be
estimated by QIM to within ± 2 days using 5 herring per lot. No
correlation was found between instrumental texture parameters
and storage time, or between sensory and instrumental texture
variables. E-nose measurements could be used to detect the onset
of spoilage.
Abstract: The purpose of this study was to analyze the travel
information sources of island tourists, as well as their
satisfaction with the services of the tourist destination. The
study distributed questionnaires, using convenience sampling, to
tourists from the island of Taiwan engaging in tourism
activities in the Penghu Islands; a total of 889 valid
questionnaires were collected. After statistical analysis, this
study found that: 1. the main travel information source for
tourists to the Penghu Islands was “friends and family who came
to Penghu”; 2. among the services of the outlying islands of
Penghu, tourists rated “friendly local residents” highest;
3. different demographic variables affect the tourists' travel
information sources and service satisfaction. Based on these
findings, the study offers operating suggestions to Penghu's
tourism industry and the unit in charge, as well as suggestions
for future research to other researchers.
Abstract: The aim of the study was to evaluate the effect of
texturizers on the rheological properties of apple mass and
desserts made from various raw materials. The apple varieties
‘Antonovka’, ‘Baltais Dzidrais’, and ‘Zarja Alatau’, harvested
in Latvia, were used for the experiment. The apples were
processed unpeeled in a blender to obtain a homogeneous mass.
The apple mass was analyzed fresh and after storage at –18 °C.
Both fresh and thawed apple mass samples with added gelatin,
xanthan gum, and sodium carboxymethylcellulose were whisked to
obtain desserts. The pectin content, pH, and soluble dry matter
of the product were determined. Apparent viscosity was measured
using a DV–III Ultra rotational viscometer. The pectin content
in frozen apple mass decreased significantly (p
Abstract: Security, and the importance of the role of the
police in providing practical and psychological safety in the
community, have been major topics among researchers and police
and security circles. This subject requires a review and
analysis of mechanisms within the police, and of its interaction
with other parts of the system, for providing community safety.
This paper examines national and social security on the Internet.
Abstract: Querying a data source and routing data towards a
sink become a serious challenge in static wireless sensor
networks if the sink and/or the data source are mobile. Often,
the event to be observed either moves or spreads across a wide
area, making the maintenance of a continuous path between the
source and the sink a challenge. Also, the sink can move while a
query is being issued or data is on its way towards it. In this
paper, we extend our previously proposed Grid Based Data
Dissemination (GBDD) scheme, a virtual-grid-based topology
management scheme that restricts the impact of movement of
sink(s) and event(s) to specific cells of a grid. This obviates
the need for frequent path modifications and hence maintains a
continuous flow of data while minimizing network energy
consumption. Simulation experiments show significant
improvements in network energy savings and in the average delay
for a packet to reach the sink.
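The virtual-grid idea can be sketched in a few lines: positions map to grid cells, and path re-computation is needed only when the sink or event crosses a cell boundary. The cell size and coordinates below are illustrative assumptions, not GBDD's actual parameters.

```python
# A simplified sketch of the virtual-grid idea behind GBDD: positions
# map to cells, and only a cell crossing triggers a path update.
# Cell size and the example coordinates are assumed, not from GBDD.
CELL = 50.0   # grid cell side (assumed units)

def cell_of(x, y):
    """Map a position to its virtual grid cell (row, col)."""
    return (int(y // CELL), int(x // CELL))

def needs_update(old_pos, new_pos):
    """Path re-computation is required only on a cell crossing."""
    return cell_of(*old_pos) != cell_of(*new_pos)

sink, event = (120.0, 30.0), (260.0, 190.0)
print("sink cell:", cell_of(*sink))
print("event cell:", cell_of(*event))
print(needs_update((120.0, 30.0), (140.0, 40.0)))  # moved within a cell
print(needs_update((120.0, 30.0), (180.0, 40.0)))  # crossed a boundary
```

Confining updates to cell crossings is what limits the impact of mobility to specific cells and keeps path modifications infrequent.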
Abstract: In today's new technology era, clusters have become a
necessity for modern computing and data applications, since many
applications take a long time (even days or months) for
computation. Although parallelization speeds up computation, the
time required by many applications can still be large. Thus, the
reliability of the cluster becomes a very important issue, and
the implementation of a fault tolerance mechanism becomes
essential. The difficulty of designing a fault-tolerant cluster
system increases with the variety of possible failures. Most
importantly, an algorithm that handles a simple failure in a
system must also tolerate more severe failures. In this paper,
we implement the watchdog timer concept in a parallel
environment to take care of failures. The implementation of this
simple algorithm in our project helps us handle different types
of failures; consequently, we found that the reliability of the
cluster improves.
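The watchdog-timer concept can be sketched in a threaded setting: a worker must "kick" the watchdog periodically, and if it stalls longer than the timeout, the monitor flags a failure so the task can be restarted. The timeouts and the simulated stall below are illustrative, not the paper's implementation.

```python
# A minimal sketch of a watchdog timer monitoring a parallel worker.
# Timeout values and the simulated stall are illustrative assumptions.
import threading, time

class Watchdog:
    def __init__(self, timeout):
        self.timeout = timeout
        self._last_kick = time.monotonic()
        self._lock = threading.Lock()
        self.failed = False

    def kick(self):                      # called by the monitored worker
        with self._lock:
            self._last_kick = time.monotonic()

    def check(self):                     # called by the monitor loop
        with self._lock:
            if time.monotonic() - self._last_kick > self.timeout:
                self.failed = True
        return self.failed

wd = Watchdog(timeout=0.2)

def worker(iterations, hang_after):
    for i in range(iterations):
        if i == hang_after:
            time.sleep(1.0)              # simulated failure: worker stalls
        wd.kick()
        time.sleep(0.01)

t = threading.Thread(target=worker, args=(20, 5))
t.start()
while t.is_alive():
    if wd.check():
        print("watchdog: worker stalled, restart it")
        break
    time.sleep(0.05)
t.join()
```

The same pattern generalizes to a cluster: each node kicks a watchdog over the network, and a missed deadline marks the node failed so its work can be reassigned.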
Abstract: With the fast evolution of digital data exchange, information security becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
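The keystream idea can be sketched in isolation: a linear feedback shift register (LFSR) produces a pseudo-random stream that is XOR-ed with the image bytes before the block cipher stage. The single 16-bit LFSR below is only a stand-in for A5/1 or W7 (which combine several clock-controlled LFSRs), its tap positions are arbitrary, and the AES stage itself is omitted.

```python
# A toy sketch of keystream pre-whitening: an LFSR stream XOR-ed with
# the data before block encryption. This 16-bit LFSR with arbitrary
# taps is a stand-in for A5/1 or W7; the AES stage is omitted here.
def lfsr_stream(state, taps=(0, 2, 3, 5), nbits=16):
    """Toy LFSR; yields one keystream byte per iteration."""
    while True:
        byte = 0
        for _ in range(8):
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1           # feedback parity of taps
            state = ((state >> 1) | (fb << (nbits - 1))) & ((1 << nbits) - 1)
            byte = (byte << 1) | (state & 1)
        yield byte

def whiten(data, seed=0xACE1):
    """XOR data with the keystream (self-inverse for a fixed seed)."""
    ks = lfsr_stream(seed)
    return bytes(b ^ next(ks) for b in data)

pixels = bytes(range(32))                         # stand-in image data
masked = whiten(pixels)
assert whiten(masked) == pixels                   # XOR whitening inverts
print(masked.hex())
```

The point of this stage for low-entropy images is that neighbouring identical pixel values no longer reach the block cipher as identical plaintext blocks.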
Abstract: We address the problem of creating a seismic alert
system, based upon artificial neural networks trained using the
well-known back-propagation and genetic algorithms, in order to
issue an alarm to the population of a specific city about an
imminent earthquake of magnitude greater than 4.5 on the Richter
scale, thereby helping to avoid disasters and human losses.
Instead of using the propagation wave, we employ the magnitude
of the earthquake to establish a correlation between the
magnitudes recorded in a monitored area and in the city where we
want to issue the alarm. To measure the accuracy of the proposed
method, we use a database provided by CIRES, which contains the
records of 2500 earthquakes originating in the State of Guerrero
and Mexico City. In particular, we applied the proposed method
to generate warnings in Mexico City, employing the magnitudes
recorded in the State of Guerrero.
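The correlation-learning step can be sketched in miniature: a tiny feed-forward network trained by plain back-propagation maps a magnitude recorded in the source region to an expected magnitude in the target city, and the alarm fires when the prediction exceeds the 4.5 threshold. The synthetic linear-plus-noise relation below stands in for the CIRES records, and the network size and learning rate are arbitrary choices.

```python
# A toy sketch of back-propagation learning of a source-region to city
# magnitude correlation. Data, architecture, and rates are assumptions.
import numpy as np

rng = np.random.default_rng(3)
m_src = rng.uniform(3.0, 7.0, size=(200, 1))              # source magnitudes
m_city = 0.8 * m_src - 0.5 + 0.05 * rng.normal(size=(200, 1))  # toy relation
x = (m_src - 5.0) / 2.0                                   # normalise inputs

W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)         # 1-8-1 network
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.01
for _ in range(2000):                                     # plain backprop
    h = np.tanh(x @ W1 + b1)
    err = (h @ W2 + b2) - m_city
    gW2 = h.T @ err / len(err); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x.T @ dh / len(err); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - m_city) ** 2))
print("training MSE:", mse)
xq = (np.array([[6.0]]) - 5.0) / 2.0
alarm = float(np.tanh(xq @ W1 + b1) @ W2 + b2) > 4.5      # alert threshold
print("alarm for a magnitude-6.0 source event:", alarm)
```

In the paper's setting, a genetic algorithm complements this gradient-based training; the sketch shows only the back-propagation half.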
Abstract: Acute kidney injury (AKI) is a new worldwide public
health problem. Diagnosing this disease using creatinine is
still a problem in clinical practice. Therefore, the measurement
of biomarkers responsible for AKI has received much attention in
the past couple of years. The cytokine interleukin-18 (IL-18)
has been reported as one of the early biomarkers for AKI. The
most commonly used method to detect this biomarker is an
immunoassay. This study used a planar platform to perform an
immunoassay with fluorescence detection. Anti-IL-18 antibody was
immobilized onto a microscope slide using a covalent binding
method. Made-up samples were diluted to concentrations between
10 and 1000 pg/ml to create a calibration curve. The precision
of the system was determined using the coefficient of variation
(CV), which was found to be less than 10%. The performance of
this immunoassay system was compared with measurements from an
ELISA.
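The precision check described above reduces to a simple computation: the coefficient of variation (CV = standard deviation / mean) of replicate readings at each calibrator level. The fluorescence readings below are made-up numbers for illustration, not the study's data.

```python
# A sketch of the CV-based precision check: coefficient of variation of
# replicate fluorescence readings per calibrator level. Readings are
# illustrative values, not the study's measurements.
import statistics

replicates = {  # pg/ml -> replicate fluorescence intensities (a.u.)
    10:   [102, 98, 105, 101],
    100:  [540, 512, 530, 525],
    1000: [4150, 4010, 4100, 4080],
}

for conc, values in replicates.items():
    cv = 100 * statistics.stdev(values) / statistics.mean(values)
    print(f"{conc:>5} pg/ml: CV = {cv:.1f}%")
```

A CV below 10% at every calibration level, as reported, indicates that replicate spots on the slide give consistent signals across the working range.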