Abstract: Scheduling algorithms are used in operating systems
to optimize processor usage. One of the most efficient
scheduling algorithms is the Multi-Level Feedback Queue (MLFQ)
algorithm, which uses several queues with different quanta. The most
important weakness of this method is the inability to determine the
optimal number of queues and the quantum of each queue, both of which
directly affect response time. This weakness is addressed by the IMLFQ
scheduling algorithm. In this paper, we review the IMLFQ algorithm for
solving these problems and minimizing response time. In this
algorithm, a recurrent neural network is utilized to find both
the number of queues and the optimized quantum of each queue.
In addition, to prevent probable faults in the computation of
process response times, a new fault-tolerant approach is presented
that uses combinational software redundancy. The experimental
results show that the IMLFQ algorithm yields better response times
than other scheduling algorithms, and that the fault-tolerant
mechanism further improves IMLFQ performance.
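The queue mechanism underlying an MLFQ scheduler can be sketched as follows. This is a generic illustration with hypothetical bursts and quanta, not the authors' IMLFQ algorithm: a job that exhausts the quantum of its current queue is demoted to the next, slower queue.

```python
from collections import deque

def mlfq_schedule(bursts, quanta):
    """Simulate a multi-level feedback queue: a job that exhausts the
    quantum of its current queue is demoted to the next (slower) queue;
    the last queue is served round-robin with its own quantum."""
    queues = [deque() for _ in quanta]
    remaining = list(bursts)
    finish = [0] * len(bursts)
    for job in range(len(bursts)):          # all jobs arrive at t = 0
        queues[0].append(job)
    t = 0
    while any(queues):
        level = next(i for i, q in enumerate(queues) if q)
        job = queues[level].popleft()
        run = min(quanta[level], remaining[job])
        t += run
        remaining[job] -= run
        if remaining[job] == 0:
            finish[job] = t                 # job completed
        else:                               # quantum expired: demote
            queues[min(level + 1, len(queues) - 1)].append(job)
    return finish

# Three jobs with CPU bursts of 3, 6 and 1 time units; two queues
# with quanta 2 and 4.
print(mlfq_schedule([3, 6, 1], [2, 4]))  # → [6, 10, 5]
```

The example shows why the number of queues and their quanta matter: a shorter first quantum lets the 1-unit job finish quickly, at the cost of extra demotions for the longer jobs.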
Abstract: In this paper, a combined feature selection method is
proposed that takes advantage of sample-domain filtering,
resampling, and feature subset evaluation methods to reduce the
dimensionality of huge datasets and select reliable features. This method
utilizes both feature space and sample domain to improve the process
of feature selection and uses a combination of Chi squared with
Consistency attribute evaluation methods to seek reliable features.
This method consists of two phases. The first phase filters and
resamples the sample domain and the second phase adopts a hybrid
procedure to find the optimal feature space by applying Chi squared,
Consistency subset evaluation methods and genetic search.
Experiments on various sized datasets from UCI Repository of
Machine Learning databases show that the performance of five
classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First
Decision Tree and JRIP) improves simultaneously and the
classification error for these classifiers decreases considerably. The
experiments also show that this method outperforms other feature
selection methods.
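Of the attribute-evaluation methods combined above, the chi-squared score is the simplest to illustrate. A minimal sketch on toy data (not the paper's datasets, and omitting its Consistency-evaluation and genetic-search components):

```python
from collections import Counter

def chi2_score(feature, labels):
    """Chi-squared statistic of a categorical feature against class
    labels: sum over cells of (observed - expected)^2 / expected."""
    n = len(labels)
    f_counts = Counter(feature)
    c_counts = Counter(labels)
    joint = Counter(zip(feature, labels))
    score = 0.0
    for f, nf in f_counts.items():
        for c, nc in c_counts.items():
            expected = nf * nc / n
            observed = joint.get((f, c), 0)
            score += (observed - expected) ** 2 / expected
    return score

# Toy data: the first feature tracks the class perfectly, the second
# is independent of it, so the first gets the higher score.
labels = ['a', 'a', 'b', 'b']
print(chi2_score([0, 0, 1, 1], labels))  # → 4.0
print(chi2_score([0, 1, 0, 1], labels))  # → 0.0
```

Ranking features by this score is one way a filter phase can discard unreliable attributes before the subset-evaluation phase.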
Abstract: Two-dimensional (2D) bar codes were designed to
carry significantly more data, with higher information density and
robustness, than their 1D counterparts. Thanks to the now-ubiquitous
combination of cameras and mobile phones, reading 2D bar codes
with a camera phone carries great commercial value. This paper
addresses the problem of designing 2D bar codes specifically for
mobile phones and introduces a low-level encoding method for matrix
codes. We also propose an efficient scheme for decoding 2D bar
codes, with the effort concentrated on overcoming the difficulties
introduced by the low image quality that is very common in bar code
images taken by phone cameras.
Abstract: This paper presents a vocoder that obtains high-quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model that extracts few coding parameters; three consecutive frames are grouped into a superframe and jointly vector-quantized to obtain high coding efficiency. The inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of 2.4 kbps LPC10e and approximately the same as that of 2.4 kbps MELP, with high robustness.
Abstract: A straightforward and intuitive combination of single simulations into an aggregated master simulation is not trivial. There are many problems that trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which is able to mitigate most of the problems and to support intuitive modeling at the same time. This meta-model is introduced and explained on the basis of an example from the e-commerce domain.
Abstract: Collateralized Debt Obligations are not as widely used
nowadays as they were before the 2007 subprime crisis. Nonetheless,
there remains an enthralling challenge: optimizing the cash flows
associated with synthetic CDOs. A Gaussian-based model is used
here in which default correlation and unconditional probabilities of
default are highlighted. Then numerous simulations are performed
based on this model for different scenarios in order to evaluate the
associated cash flows given a specific number of defaults at different
periods of time. Cash flows are not solely calculated on a single
bought or sold tranche but rather on a combination of bought and
sold tranches. With some assumptions, the simplex algorithm gives
a way to find the maximum cash flow according to correlation of
defaults and maturities. The Gaussian model used is not realistic in
crisis situations. Moreover, the present system does not handle buying
or selling a portion of a tranche, only a whole tranche. Nevertheless,
the work provides the investor with relevant guidance on what to buy
and sell, and when.
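Under a single notional budget with per-tranche exposure caps, the linear program that the simplex method solves reduces to a fractional knapsack, which the following sketch illustrates. The cash-flow figures are purely hypothetical, not outputs of the paper's Gaussian model:

```python
def allocate(cash_flows, caps, budget):
    """With a single budget constraint and per-tranche caps, the LP
    optimum simply fills tranches in decreasing order of expected
    cash flow per unit of notional (fractional knapsack)."""
    order = sorted(range(len(cash_flows)), key=lambda i: -cash_flows[i])
    x = [0.0] * len(cash_flows)
    left = budget
    for i in order:
        x[i] = min(caps[i], left)   # fill the best tranche first
        left -= x[i]
    total = sum(c * w for c, w in zip(cash_flows, x))
    return x, total

# Hypothetical per-unit expected cash flows for equity, mezzanine and
# senior tranches; notional budget 1.0, per-tranche cap 0.6.
x, total = allocate([0.12, 0.07, 0.02], [0.6, 0.6, 0.6], 1.0)
print(x, total)
```

The full problem with multiple maturities and combinations of bought and sold tranches needs a general simplex solver, but the greedy solution above coincides with it in this restricted setting.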
Abstract: In the present paper, an improved initial-value
numerical technique is presented to analyze the free vibration of
symmetrically laminated rectangular plates. A combination of the
initial value (IV) method and finite difference (FD) discretization is
utilized to develop the present (IVFD) technique. The resulting
technique is applied to the equation of motion of a vibrating laminated
rectangular plate under various types of boundary conditions. Three
common plate types, symmetric cross-ply, orthotropic, and
isotropic, are analyzed here. The convergence and accuracy of
the presented Initial Value-Finite Differences (IVFD) technique have
been examined. The merits and validity of the improved technique
are confirmed by comparing the obtained results with those available
in the literature, showing good agreement.
Abstract: Polarization modulation infrared reflection absorption
spectroscopy (PM-IRRAS), in combination with electrochemistry,
was employed to study the influence of surface charge (potential) on
the kinetics of bovine serum albumin (BSA) adsorption on a
biomedical-grade 316LVM stainless steel surface. The
BSA adsorption kinetics was found to greatly depend on the surface
potential. With an increase in surface potential towards more
negative values, both the BSA initial adsorption rate and the
equilibrium (saturated) surface concentration also increased. Both
effects were explained on the basis of replacement of well-ordered
water molecules at the 316LVM / solution interface, i.e. by the
increase in entropy of the system.
Abstract: Field mapping of an active volcano, particularly in
the Torrid Zone, is usually hampered by several problems such as steep
terrain and bad atmospheric conditions. In this paper we present a
simple solution to this problem through a combination of Synthetic
Aperture Radar (SAR) and geostatistical methods. By this combination, we
could reduce the speckle effect from the SAR data and then estimate
roughness distribution of the pyroclastic flow deposits. The main
purpose of this study is to detect spatial distribution of new pyroclastic
flow deposits, termed the P-zone, accurately using the β° data from two
RADARSAT-1 SAR level-0 data. Single scene of Hyperion data and
field observation were used for cross-validation of the SAR results.
Mt. Merapi in central Java, Indonesia, was chosen as a study site and
the eruptions in May-June 2006 were examined. The P-zones were
found in the western and southern flanks. The area size and the longest
flow distance were calculated as 2.3 km2 and 6.8 km, respectively. The
grain size variation of the P-zone was mapped in detail from fine to
coarse deposits regarding the C-band wavelength of 5.6 cm.
Abstract: Soil salinity is one of the main environmental problems affecting extensive areas of the world. Traditional data collection methods are neither sufficient for assessing this important environmental problem nor accurate enough for soil studies, and remote sensing data can overcome most of their limitations. Although satellite images are commonly used for such studies, the best calibration between the data and the real situation still needs to be found for each specific area. The Neyshaboor area in northeastern Iran was selected as the field study of this research. Landsat satellite images of this area were used to prepare suitable learning samples for processing and classifying the images. 300 locations were selected randomly in the area to collect soil samples, and 273 locations were finally reselected for further laboratory work and image processing analysis. The electrical conductivity of all samples was measured. Six reflective bands of ETM+ satellite images taken over the study area in 2002 were used for soil salinity classification. The classification was carried out using common algorithms based on the best band composition. The results showed that reflective bands 7, 3, 4 and 1 are the best band composition for preparing color composite images. We also found that hybrid classification is a suitable method for identifying and delineating different salinity classes in the area.
Abstract: In the context of sensor networks, where every few
dB of savings counts, novel node-cooperation schemes in which
MIMO techniques play a leading role are reviewed. These methods
can be treated as a joint approach to designing the physical layer
of the corresponding communication scenarios. We then analyze the
BER performance of transmission diversity schemes under a general
fading-channel model and propose a power allocation strategy for the
transmitting sensor nodes. This approach is compared to an equal-power
assignment method, and its performance enhancement is verified by
simulation. Another key contribution lies in the combination of
optimal power allocation and sensor-node cooperation in a
transmission diversity regime (MISO). Numerical results are given
through figures to demonstrate the optimality and efficiency of the
proposed combined approach.
Abstract: This paper presents a formant-tracking linear prediction
(FTLP) model for speech processing in noise. The main focus of this
work is the detection of formant trajectories based on Hidden Markov
Models (HMM), for improved formant estimation in noise. The
approach proposed in this paper provides a systematic framework for
modelling and utilizing a time sequence of spectral peaks that
satisfies continuity constraints on the parameters; the peaks
themselves are modelled by the LP parameters. The formant-tracking
LP model estimation
is composed of three stages: (1) a pre-cleaning multi-band spectral
subtraction stage to reduce the effect of residual noise on formants;
(2) an estimation stage, where an initial estimate of the LP model of
speech for each frame is obtained; and (3) a formant classification
stage using probability models of formants and Viterbi decoders. The evaluation
results for the estimation of the formant tracking LP model tested
in Gaussian white noise background, demonstrate that the proposed
combination of the initial noise reduction stage with formant tracking
and LPC variable order analysis, results in a significant reduction in
errors and distortions. The performance was evaluated with noisy
natural vowels extracted from French and English vocabulary
speech signals at an SNR of 10 dB. In each case, the
estimated formants are compared to reference formants.
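The Viterbi decoding step used for formant classification can be sketched generically as follows; the probabilities here are toy values, not the paper's formant models. Each state stands for a candidate formant track, and the transition matrix encodes the continuity constraint between frames:

```python
import numpy as np

def viterbi(log_init, log_trans, log_obs):
    """Most-likely state path through an HMM; all inputs are log
    probabilities, and log_obs[t, s] scores state s at frame t."""
    T, S = log_obs.shape
    score = log_init + log_obs[0]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans      # cand[from, to]
        back[t] = cand.argmax(axis=0)          # best predecessor
        score = cand.max(axis=0) + log_obs[t]
    path = [int(score.argmax())]               # backtrack from the end
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Two sticky states; the observations favour state 0 in the first two
# frames, then strongly favour state 1.
log_init = np.log([0.5, 0.5])
log_trans = np.log([[0.9, 0.1], [0.1, 0.9]])
log_obs = np.log([[0.8, 0.2], [0.8, 0.2], [0.05, 0.95]])
print(viterbi(log_init, log_trans, log_obs))  # → [0, 0, 1]
```

The sticky transitions play the role of the continuity constraint: the path switches state only when the observation evidence outweighs the transition penalty.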
Abstract: This paper introduces and studies new indexing techniques for content-based queries in image databases. Indexing is the key to providing sophisticated, accurate and fast searches for queries over image data. This research describes a new indexing approach that depends on linear modeling of signals, using bases for modeling. A basis is a set of chosen images, and modeling an image is a least-squares approximation of the image as a linear combination of the basis images. The coefficients of the basis images are taken together to serve as the index for that image. The paper describes the implementation of the indexing scheme and presents the findings of our extensive evaluation, conducted to optimize (1) the choice of the basis matrix (B) and (2) the size of the index (N). Furthermore, we compare the performance of our indexing scheme with other schemes. Our results show that our scheme has significantly higher performance.
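The least-squares indexing idea can be sketched as follows. The basis here is a set of random "images" chosen purely for illustration, not the paper's optimized basis matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Basis matrix B: each column is one flattened basis image (here 4
# random 8x8 "images").  Indexing an image means least-squares
# fitting it as a linear combination of the basis images.
B = rng.standard_normal((64, 4))

def index_of(image):
    """Coefficient vector that best approximates `image` as
    B @ coeffs in the least-squares sense; these coefficients
    serve as the image's index."""
    coeffs, *_ = np.linalg.lstsq(B, image.ravel(), rcond=None)
    return coeffs

# An image built exactly from the basis is recovered exactly, so its
# index equals the construction weights.
weights = np.array([1.0, -2.0, 0.5, 0.0])
img = (B @ weights).reshape(8, 8)
print(np.allclose(index_of(img), weights))  # → True
```

Because the index is just a short coefficient vector, similar images map to nearby points in a low-dimensional space, which is what makes fast content-based search possible.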
Abstract: Air emissions from waste treatment plants often
consist of a combination of Volatile Organic Compounds (VOCs)
and odors. Hydrogen sulfide is one of the major odorous gases
present in the waste emissions coming from municipal wastewater
treatment facilities. Hydrogen sulfide (H2S) is odorous, highly toxic
and flammable. Exposure to lower concentrations can result in eye
irritation, a sore throat and cough, shortness of breath, and fluid in
the lungs. Biofiltration has become a widely accepted technology for
treating air streams containing H2S. Compared with other non-biological
technologies, biofiltration is more cost-effective for treating large
volumes of air containing low concentrations of biodegradable compounds.
Optimization of biofilter media is essential for many reasons such as:
providing a higher surface area for biofilm growth, low pressure drop,
physical stability, and good moisture retention. In this work, a novel
biofilter media is developed and tested at a pumping station of a
municipality located in the United Arab Emirates (UAE). The
media is found to be highly effective (>99% removal) for the H2S
concentrations expected in pumping stations under both steady-state
and shock-loading conditions.
Abstract: This study analyzed environmental health risks and
people's perceptions of risks related to waste management in poor
settlements of Abidjan, to develop integrated solutions for health and
well-being improvement. The trans-disciplinary approach used relied
on remote sensing, a geographic information system (GIS),
qualitative and quantitative methods such as interviews and a
household survey (n=1800). Mitigating strategies were then
developed using an integrated participatory stakeholder workshop.
Waste management deficiencies resulting in lack of drainage and
uncontrolled solid and liquid waste disposal in the poor settlements
lead to severe environmental health risks. Health problems were
caused by direct handling of waste, as well as through broader
exposure of the population. People in poor settlements had little
awareness of health risks related to waste management in their
community and a general lack of knowledge pertaining to sanitation
systems. This unfortunate combination was the key determinant
affecting health and vulnerability. For example, increased
prevalences of malaria (47.1%) and diarrhoea (19.2%) were observed
in the rainy season compared to the dry season (32.3% and
14.3%, respectively). Concerted and adapted solutions that suited all the
stakeholders concerned were developed in a participatory workshop
to allow for improvement of health and well-being.
Abstract: Wheat gluten hydrolyzates (WGHs) and anchovy fine
powder hydrolyzates (AFPHs) were produced at 300 MPa using
combinations of Flavourzyme 500MG (F), Alcalase 2.4L (A),
Marugoto E (M) and Protamex (P), and then were compared to those
produced at ambient pressure concerning the contents of soluble solid
(SS), soluble nitrogen and electrophoretic profiles. The contents of SS
in the WGHs and AFPHs increased up to 87.2% according to the
increase in enzyme number both at high and ambient pressure. Based
on SS content, the optimum enzyme combinations for one-, two-,
three- and four-enzyme hydrolysis were determined as F, FA, FAM
and FAMP, respectively. Similar trends were found for the contents of
total soluble nitrogen (TSN) and TCA-soluble nitrogen (TCASN). The
contents of SS, TSN and TCASN in the hydrolyzates, together with the
electrophoretic mobility maps, indicate that the high-pressure
treatment used in this study accelerated protein hydrolysis compared
to the ambient-pressure treatment.
Abstract: Investment in a constructed facility represents a cost in
the short term that returns benefits only over the long-term use of the
facility. Thus, the costs occur earlier than the benefits, and the owners
of facilities must obtain the capital resources to finance the costs of
construction. A project cannot proceed without adequate
financing, and the cost of providing adequate financing can be
quite large. For these reasons, attention to project finance is an
important aspect of project management. Finance is also a concern to
the other organizations involved in a project, such as the general
contractor and material suppliers. Unless an owner immediately and
completely covers the costs incurred by each participant, these
organizations face financing problems of their own. At a more
general level, project finance is only one aspect of the general
problem of corporate finance. If numerous projects are considered
and financed together, then the net cash flow requirements constitute
the corporate financing problem for capital investment. Whether
project finance is performed at the project or at the corporate level
does not alter the basic financing problem. In this paper, we first
consider facility financing from the owner's perspective, with due
consideration for its interaction with the other organizations involved
in a project. Later, we discuss the problems of construction financing,
which are crucial to the profitability and solvency of construction
contractors. The objective of this paper is to present the steps used
to determine the best combination for minimum project financing.
The proposed model considers financing, schedule, and maximum net
area. The proposed model is called Project Financing and Schedule
Integration using Genetic Algorithms (PFSIGA). This model is
intended to determine further steps (maximum net area) for any project
with a subproject. An illustrative example demonstrates the
features of this technique. Model verification and testing are also
taken into consideration.
Abstract: To improve the classification rate of face
recognition, a features combination and a novel non-linear kernel are
proposed. The feature vector concatenates local binary patterns at
three different radii with Gabor wavelet features. The Gabor features
are the mean, standard deviation, and skew of each scaling and
orientation parameter. The aim of the new kernel is to combine
the power of kernel methods with an optimal balance between
the features. To verify the effectiveness of the proposed method,
numerous methods are tested on four datasets, which consist
of various emotions, orientations, configurations,
expressions, and lighting conditions. Empirical results show the
superiority of the proposed technique when compared to other
methods.
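A local binary pattern at radius 1, the smallest of the radii concatenated in such a feature vector, can be computed as follows; a minimal sketch for a single 3x3 patch with made-up pixel values:

```python
def lbp_code(patch):
    """8-neighbour local binary pattern of a 3x3 patch: each
    neighbour, visited in a fixed order around the centre,
    contributes one bit, set when it is >= the centre pixel."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum((v >= c) << i for i, v in enumerate(neighbours))

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # → 241
```

A histogram of these codes over the image, computed at several radii, is the kind of texture descriptor that gets concatenated with Gabor statistics in combined feature vectors.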
Abstract: Risk of infectious disease outbreaks is related to the
hygiene among the population. To assess the actual risks and modify
the relevant emergency procedures if necessary, a hygiene survey
was conducted among undergraduate students on the Rhodes
University campus. Soap was available to only 10.5% of the study
participants, and only 26.8% followed proper hygiene in relation to
food consumption. This combination increases the risk of infectious
disease outbreaks on campus. Around 83.6% were willing to wash
their hands if soap was provided. Procurement and availability of
soap in undergraduate residences on campus should be improved, as
the total cost is estimated at only 2000 USD per annum. Awareness
campaigns about food-related hygiene and the need for regular handwashing
with soap should be run among Rhodes University students.
If successful, rates of respiratory and hygiene-related diseases will be
decreased and emergency health management simplified.
Abstract: NFκB activation plays a crucial role in anti-apoptotic responses to apoptotic signaling during tumor necrosis factor (TNFα) stimulation in Multiple Myeloma (MM). Although several drugs have been found effective for the treatment of MM, mainly by inhibiting the NFκB pathway, there are no quantitative or qualitative comparative assessments of the inhibition effects of different single drugs or drug combinations. Computational modeling is becoming increasingly indispensable for applied biological research, mainly because it can provide strong quantitative predictive power. In this study, a novel computational pathway modeling approach is employed to comparatively assess the inhibition effects of specific single drugs and drug combinations on the NFκB pathway in MM, and especially to predict synergistic drug combinations.