Abstract: Scale defects are common surface defects in hot steel rolling. The modelling of such defects is problematic and their causes are not straightforward. In this study, we investigated genetic algorithms in search of a mathematical solution to scale formation. For this research, a high-dimensional data set from a hot steel rolling process was gathered. The synchronisation of the variables as well as the allocation of the measurements made on the steel strip were solved before the modelling phase.
Abstract: The indoor airflow with mixed natural/forced convection
was numerically calculated using both laminar and turbulent
approaches. The Boussinesq approximation was adopted to simplify
the mathematical model and the calculations. The results
obtained, such as mean velocity fields, were successfully compared
with experimental PIV flow visualizations. The effect of the distance
between the cooled wall and the heat exchanger on the temperature
and velocity distributions was calculated. In a room with a simple
shape, the computational code OpenFOAM demonstrated an ability to
numerically predict flow patterns. Furthermore, numerical techniques,
boundary type conditions and the computational grid quality were
examined. The k-omega turbulence model had a significant effect on
the results, influencing the temperature and velocity distributions.
Abstract: It has been shown that in most accidents the driver is responsible, due to being distracted or misjudging the situation. In order to solve such problems, research has been dedicated to developing driver assistance systems that are able to monitor the traffic situation around the vehicle. This paper presents methods for recognizing several circumstances on a road. The methods use both the in-vehicle warning systems and the roadside infrastructure. Preliminary evaluation results for fog and ice-on-road detection are presented. The ice detection results are based on data recorded on a test track dedicated to tyre friction testing. The achieved results suggest that ice detection could reach a detection rate of 70% with the right setup, which is a good foundation for implementation. However, the full benefit of the presented cooperative system is achieved by fusing the outputs of multiple data sources, which is the key point of discussion in this publication.
Abstract: This paper describes the application of a model
predictive controller to the problem of batch reactor temperature
control. Although a great deal of work has been done to improve
reactor throughput using batch sequence control, the control of the
actual reactor temperature remains a difficult problem for many
operators of these processes. Temperature control is important because
the formation of desired products in many chemical reactions is
sensitive to temperature. The controller consists of two parts: (1) a
nonlinear control method, GLC (Global Linearizing Control), used to
create a linear model of the system, and (2) a model predictive
controller used to obtain the optimal input control sequence. The
reactor temperature is tuned to track a predetermined temperature
trajectory applied to the batch reactor. To do so, two input signals,
the electrical power and the coolant flow in the coil, are used.
Simulation results show that the proposed controller has remarkable
performance in tracking the reference trajectory while remaining
robust against noise imposed on the system output.
Abstract: Constant upgrading of Enterprise Resource Planning
(ERP) systems is necessary, but can cause new defects. This paper
attempts to model the likelihood of defects after completed upgrades
with a Weibull defect probability density function (PDF). A case study
is presented analyzing recorded defect data for one ERP subsystem.
Trends in the values of the parameters of the fitted Weibull
distribution are observed over a one-year period. As a result, the
ability to predict the appearance of defects after the next upgrade is
described.
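The two-parameter Weibull PDF this abstract relies on can be stated and evaluated directly; the shape/scale notation below is illustrative, not necessarily the paper's:

```python
import math

def weibull_pdf(t, shape, scale):
    """Two-parameter Weibull PDF: f(t) = (k/s) * (t/s)**(k-1) * exp(-(t/s)**k).

    In a defect-arrival model, shape < 1 describes a defect rate that
    decreases over time after an upgrade, while shape > 1 describes an
    increasing rate.
    """
    if t < 0:
        return 0.0
    return (shape / scale) * (t / scale) ** (shape - 1) * math.exp(-(t / scale) ** shape)

# With shape = 1 the Weibull reduces to the exponential distribution
# f(t) = (1/scale) * exp(-t/scale).
print(weibull_pdf(0.0, 1.0, 2.0))  # 0.5 = 1/scale
```

Fitting the shape and scale parameters to recorded defect counts, as the paper does per upgrade period, then reduces to a standard maximum-likelihood estimation over this density.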
Abstract: Methods of contemporary mathematical physics such
as chaos theory are useful for analyzing and understanding the
behavior of complex biological and physiological systems. The three
dimensional model of HIV/AIDS is the basis of active research since
it provides a complete characterization of disease dynamics and the
interaction of HIV-1 with the immune system. In this work, the
behavior of the HIV system is analyzed using the three dimensional
HIV model and a chaotic measure known as the Hurst exponent.
Results demonstrate that Hurst exponents of CD4, CD8 cells and
viral load vary nonlinearly with respect to variations in system
parameters. Further, it was observed that the three dimensional HIV
model can accommodate both persistent (H > 0.5) and anti-persistent
(H < 0.5) dynamics.
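The Hurst exponent used above is classically estimated by rescaled-range (R/S) analysis. A minimal sketch, assuming the standard R/S procedure (window sizes and the toy white-noise input are illustrative, not the paper's data):

```python
import math
import random

def hurst_rs(series, window_sizes=(8, 16, 32, 64)):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis.

    For each window size n, the series is split into chunks; for each
    chunk we compute the range R of the mean-adjusted cumulative sum and
    the standard deviation S. The slope of log(R/S) against log(n)
    estimates H.
    """
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            chunk = series[start:start + n]
            mean = sum(chunk) / n
            dev = [x - mean for x in chunk]
            # Cumulative sum of deviations from the chunk mean.
            cum, z = 0.0, []
            for d in dev:
                cum += d
                z.append(cum)
            r = max(z) - min(z)
            s = math.sqrt(sum(d * d for d in dev) / n)
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(math.log(n))
            log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # Least-squares slope of log(R/S) on log(n).
    k = len(log_n)
    mx, my = sum(log_n) / k, sum(log_rs) / k
    num = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
    den = sum((x - mx) ** 2 for x in log_n)
    return num / den

random.seed(0)
# Uncorrelated noise should give an estimate near H = 0.5.
h = hurst_rs([random.gauss(0, 1) for _ in range(1024)])
```

Applying such an estimator to simulated CD4, CD8 and viral-load trajectories is how exponents of the kind reported above (persistent versus anti-persistent) would be obtained.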
Abstract: This paper presents the results of UV measurements at
different altitudes and the development of a new portable instrument
for measuring UV. The rapid growth of industrial sectors in
developing countries, including Malaysia, brings not only income to
the nation but also causes pollution in various forms. Air pollution is
a significant contributor to global warming through depletion of the
ozone layer, which reduces the filtration of UV rays. Prolonged
exposure to high levels of UV rays has many devastating health
effects on mankind, directly or indirectly, through the destruction of
natural resources. This study aimed to show the correlation between
UV and altitude, which can indirectly help predict ozone depletion.
An instrument was designed to measure and monitor the level of UV.
The instrument comprises two main blocks, namely a data logger and
a Graphical User Interface (GUI). Three sensors were used in the data
logger to detect changes in temperature, humidity and ultraviolet
radiation. The system underwent experimental measurements to
capture data under two different conditions: an industrial area and a
high-altitude area. The instrument showed consistency in the data
captured, and the results of the experiment showed significantly
higher UV readings at high altitudes.
Abstract: In this study, a mathematical model was proposed and
the accuracy of this model was assessed to predict the growth of
Pseudomonas aeruginosa and rhamnolipid production under nitrogen
limiting (sodium nitrate) fed-batch fermentation. All of the
parameters used in this model were obtained independently, without
using any data from the literature.
The overall growth kinetics of the strain were evaluated using a
dual-parallel substrate Monod equation fitted to several sets of batch
experimental data. Fed-batch data under different glycerol (as the sole
carbon source, C/N = 10) concentrations and feed flow rates were
used to determine the proposed fed-batch model and its remaining
parameters. In order to verify the accuracy of the proposed model,
several verification experiments were performed over a wide range of
initial glycerol concentrations. While the results showed acceptable
prediction for rhamnolipid production (less than 10% error), in the
case of biomass prediction the errors were less than 23%. It was also
found that rhamnolipid production by P. aeruginosa was more
sensitive at low glycerol concentrations.
Based on the findings of this work, it was concluded that the
proposed model could effectively be employed for rhamnolipid
production by this strain under fed-batch fermentation at up to
80 g l-1 glycerol.
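The Monod kinetics named above can be sketched as follows. The single-substrate term is standard; the additive ("parallel") dual-substrate combination and all parameter values here are assumptions for illustration, not the paper's fitted model:

```python
def monod(mu_max, s, ks):
    """Single-substrate Monod specific growth rate: mu = mu_max * S / (Ks + S)."""
    return mu_max * s / (ks + s)

def dual_parallel_monod(s1, s2, mu_max1, ks1, mu_max2, ks2):
    """One common 'parallel' dual-substrate form: the specific growth
    rates on the two substrates are summed (assumed form, for
    illustration only)."""
    return monod(mu_max1, s1, ks1) + monod(mu_max2, s2, ks2)

# Half-saturation behaviour: at S = Ks the rate is half of mu_max.
print(monod(0.5, 10.0, 10.0))  # 0.25
```

In a fed-batch model of this kind, such a rate term would be embedded in mass-balance ODEs for biomass, substrate and product, with the feed flow rate entering as a dilution term.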
Abstract: Vitamin A deficiency is a public health problem in
Zimbabwe. Addressing vitamin A deficiency has the potential of
enhancing resistance to disease and reducing mortality especially in
children less than 5 years of age. We adapted and implemented a
vitamin A outreach supplementation strategy within the National
Immunization Days and Extended Programme of Immunization in a
rural district in Zimbabwe. Despite the usual operational challenges
faced, this approach enabled the district to increase supplementation
coverage. This paper describes the outreach strategy that was
implemented in the remote rural district. The strategy covered 63
outreach sites, with 2 sites covered per day and each visited once per
month for the whole year. Coverage reached 71% in an area where
previous coverage rates were below 50%. We recommend further
exploration of
this strategy by others working in similar circumstances. This
strategy can be a potential way for use by Scaling-Up-Nutrition
member states.
Abstract: Dynamic bandwidth allocation in EPONs can be
generally separated into inter-ONU scheduling and intra-ONU scheduling. In our previous work, the active intra-ONU scheduling
(AS) utilizes multiple queue reports (QRs) in each report message to cooperate with the inter-ONU scheduling and makes the granted
bandwidth fully utilized without leaving unused slot remainder (USR).
This scheme successfully solves the USR problem originating from
the inseparability of Ethernet frames. However, without a proper
setting of the threshold value in AS, the number of QRs constrained
by the IEEE 802.3ah standard is not sufficient, especially in
unbalanced traffic environments. This limitation may be addressed by
enlarging the threshold value, but a large threshold implies a large gap
between adjacent QRs, resulting in a large difference between the
ideal granted bandwidth and the actual granted bandwidth. In this
paper, we integrate
AS with a cooperative prediction mechanism and distribute multiple
QRs to reduce the penalty brought by the prediction error.
Furthermore, to improve QoS and reduce the usage of queue reports,
the highest-priority (EF) traffic arriving during the waiting time is
granted automatically by the OLT and is not included in the requested
bandwidth of the ONU. The simulation results show that the proposed
scheme has better performance metrics in terms of bandwidth
utilization and average delay for different classes of packets.
Abstract: Compression algorithms reduce the redundancy in
data representation to decrease the storage required for that data.
Lossless compression researchers have developed highly
sophisticated approaches, such as Huffman encoding, arithmetic
encoding, the Lempel-Ziv (LZ) family, Dynamic Markov
Compression (DMC), Prediction by Partial Matching (PPM), and
Burrows-Wheeler Transform (BWT) based algorithms.
Decompression is also required to retrieve the original data by
lossless means. This paper presents a compression scheme for text
files coupled with the principle of dynamic decompression, which
decompresses only the section of the compressed text file required by
the user instead of decompressing the entire file. Dynamically
decompressed files offer better disk space utilization due to higher
compression ratios compared to most of the currently available text
file formats.
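As an illustration of one of the classical techniques named above, a minimal Huffman code construction can be sketched as follows (a standard textbook sketch, not the paper's scheme):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) from symbol counts."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: a single distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, extending their codes.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
# More frequent symbols ('a') receive shorter codes than rare ones ('d').
```

Because the code is prefix-free, the compressed bit stream can be decoded unambiguously, which is the property a dynamic, section-wise decompressor would depend on.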
Abstract: The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not immediately guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature some famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.
Abstract: An appropriate project delivery system (PDS) is crucial
to the success of a construction project. Case-based Reasoning (CBR)
is a useful support for PDS selection. However, the traditional CBR
approach represents cases as attribute-value vectors without taking
relations among attributes into consideration, and cannot calculate
similarity when the structures of cases are not strictly the same.
Therefore, this paper solves this problem by adopting the Relational
Case-based Reasoning (RCBR) approach for PDS selection,
considering both the structural similarity and feature similarity. To
develop the feature terms of the construction projects, the criteria and
factors governing PDS selection process are first identified. Then
feature terms for the construction projects are developed. Finally, the
mechanism of similarity calculation and a case study indicate how
RCBR works for PDS selection. The adoption of RCBR in PDS
selection expands the scope of application of traditional CBR method
and improves the accuracy of the PDS selection system.
Abstract: Feature selection has recently been the subject of intensive research in data mining, especially for datasets with a large number of attributes. Recent work has shown that feature selection can have a positive effect on the performance of machine learning algorithms. The success of many learning algorithms in their attempts to construct models of data hinges on the reliable identification of a small set of highly predictive attributes. The inclusion of irrelevant, redundant and noisy attributes in the model building process can result in poor predictive performance and increased computation. In this paper, a novel feature search procedure that utilizes Ant Colony Optimization (ACO) is presented. ACO is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It looks for optimal solutions by considering both local heuristics and previous knowledge. When applied to two different classification problems, the proposed algorithm achieved very promising results.
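To make the idea of a pheromone-guided feature search concrete, here is a toy ACO-style sketch under stated assumptions: the inclusion probability, update rule, and toy fitness below are illustrative, not the paper's algorithm:

```python
import random

def aco_feature_select(n_features, fitness, n_ants=10, n_iters=20,
                       evaporation=0.1, seed=0):
    """Toy ACO-style feature-subset search (illustrative sketch).

    Each feature carries a pheromone level that biases the probability
    of including it; pheromone evaporates everywhere and is reinforced
    on the best subset found so far.
    """
    rng = random.Random(seed)
    pheromone = [1.0] * n_features
    best_subset = set(range(n_features))
    best_score = fitness(best_subset)
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Include each feature with a probability growing with its pheromone.
            subset = {f for f in range(n_features)
                      if rng.random() < pheromone[f] / (1.0 + pheromone[f])}
            score = fitness(subset)
            if score > best_score:
                best_subset, best_score = subset, score
        # Evaporate everywhere, then reinforce the best subset.
        for f in range(n_features):
            pheromone[f] *= (1.0 - evaporation)
            if f in best_subset:
                pheromone[f] += 1.0
    return best_subset, best_score

# Toy fitness: features 0 and 1 are "relevant"; extra features carry a cost.
score = lambda s: len(s & {0, 1}) - 0.1 * len(s)
subset, best = aco_feature_select(6, score)
```

In the paper's setting the fitness would instead be a classifier's estimated accuracy on the candidate subset, and the local heuristic term would also enter the inclusion probability.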
Abstract: Segmentation, filtering out of measurement errors and
identification of breakpoints are integral parts of any analysis of
microarray data for the detection of copy number variation (CNV).
Existing algorithms designed for these tasks have had some successes
in the past, but they tend to be O(N^2) in either computation time or
memory requirement, or both, and the rapid advance of microarray
resolution has practically rendered such algorithms useless. Here we
propose an algorithm, SAD, that is much faster, requires far less
memory (O(N) in both computation time and memory requirement),
and offers higher accuracy. The two key ingredients of SAD are the
fundamental assumption in statistics that measurement errors are
normally distributed and the mathematical relation that the product of
two Gaussians is another Gaussian (function). We have produced a
computer program for analyzing CNV based on SAD. In addition to
being fast and small, it offers two important features: quantitative
statistics for predictions and, with only two user-decided parameters,
ease of use. Its speed shows little dependence on the genomic profile.
Running on an average modern computer, it completes CNV analyses
for a 262 thousand-probe array in ~1 second and a 1.8 million-probe
array in 9 seconds.
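The product-of-Gaussians relation the abstract leans on can be stated concretely: the pointwise product of two Gaussian densities is, up to a constant factor, another Gaussian with precision-weighted mean and reduced variance. A small sketch (parameter names here are illustrative):

```python
import math

def gaussian_product(mu1, var1, mu2, var2):
    """Mean and variance of the Gaussian proportional to the product
    N(x; mu1, var1) * N(x; mu2, var2):

        var = var1 * var2 / (var1 + var2)
        mu  = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    """
    var = var1 * var2 / (var1 + var2)
    mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    return mu, var

def npdf(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# The pointwise product differs from the combined Gaussian only by a
# constant factor, so their ratio is the same at every x.
mu, var = gaussian_product(0.0, 1.0, 2.0, 4.0)
ratio = lambda x: npdf(x, 0.0, 1.0) * npdf(x, 2.0, 4.0) / npdf(x, mu, var)
```

This closure under multiplication is what lets an algorithm combine normally distributed measurements in a single pass, which is consistent with the O(N) behavior claimed above.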
Abstract: The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives. Through the add-on location-based services of the devices, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the region of interest while keeping a good grasp of the surrounding context. This is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, and an automatic smoothing process to correct the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
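The fisheye principle described above can be illustrated with a standard radial magnification function (a Sarkar-Brown style transform; the exact lens used in the paper, and its generalization and smoothing steps, are not reproduced here):

```python
def fisheye(r, d=3.0):
    """Map a normalized radius r in [0, 1] through a fisheye lens:

        g(r) = (d + 1) * r / (d * r + 1)

    d controls the distortion: d = 0 is the identity, and larger d
    magnifies the focus region at the cost of compressing the periphery.
    """
    return (d + 1.0) * r / (d * r + 1.0)

# The focus (small r) is magnified while the boundary stays fixed.
print(fisheye(0.0), fisheye(1.0))  # 0.0 1.0
```

Applying such a mapping to each point's distance from the focus yields the variable-scale image; the compression it causes near r = 1 is exactly the peripheral distortion that the paper's generalization and smoothing steps are meant to counteract.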
Abstract: This study proposes a new recommender system based on collaborative folksonomy. The purpose of the proposed system is to recommend Internet resources (such as books, articles, documents, pictures, audio and video) to users. The proposed method includes four steps: creating the user profile based on tags, grouping similar users into clusters using agglomerative hierarchical clustering, finding similar resources based on the user's past collections by using content-based filtering, and recommending similar items to the target user. This study examines the system's performance on a dataset collected from "del.icio.us", a well-known social bookmarking website. Experimental results show that the proposed hybrid tag-based collaborative and content-based filtering recommender system is promising and effective for folksonomy-based bookmarking websites.
Abstract: In online context, the design and implementation of
effective remote laboratories environment is highly challenging on
account of hardware and software needs. This paper presents the
remote laboratory software framework modified from the iLab Shared
Architecture (ISA). The ISA is a framework which enables students to
remotely access and control experimental hardware over the Internet.
The need for remote laboratories arose from problems imposed by
traditional laboratories: the high cost of laboratory equipment, the
scarcity of space and of technical personnel, and restricted university
budgets create a significant bottleneck in building the required
laboratory experiments. The solution to these problems is to build
web-accessible laboratories. Remote laboratories allow students and
educators to interact with real laboratory equipment located
anywhere in the world at anytime. Recently, many universities and
other educational institutions, especially in third-world countries, rely
on simulations because they cannot afford the experimental equipment
their students require. Remote laboratories enable users to obtain real
data from real-time, hands-on experiments. To
implement many remote laboratories, the system architecture should
be flexible, understandable and easy to implement, so that different
laboratories with different hardware can be deployed easily. The
modifications were made to enable developers to add more equipment
in the ISA framework and to attract new developers to develop more
online laboratories.
Abstract: The article contains the results of a flour and bread
quality assessment of grains of spring spelt, also called an ancient
wheat. Spelt was cultivated on heavy and medium soils following the
principles of organic farming. Based on laboratory studies of the flour
and bread, as well as laboratory baking, the technological usefulness
of the studied flour has been determined. These results were compared
with a standard derived from common wheat cultivated under the
same conditions. Grain of spring spelt is a good raw material for
manufacturing bread flour, from which high-quality bakery products
can be obtained, but this is strictly dependent on the variety of ancient
wheat.
Abstract: This study examines the use of the persuasive strategy
of deixis and personalization in advertising slogans. This rhetorical/
stylistic and linguistic strategy has been found to be widely used in
advertising slogans for over a century. A total of five hundred
advertising slogans of multinational companies in both product and
service sectors were obtained. The analysis reveals the three main
components of this strategy to be deictic words, absolute uniqueness,
and personal pronouns. The percentage and mean use of the three
components are tabulated. The findings show that
advertisers have used this persuasive strategy in creative ways to
persuade consumers to buy their products and services.