Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands understanding its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline to estimate the performance characteristics of real-world simulation applications.
Abstract: As technology advances, the problems associated with it also increase. Several studies have been carried out to investigate the safe deployment of waste materials in geotechnical engineering in particular and in civil engineering in general, and different types of waste material, such as cement dust, fly ash and slag, have proven suitable in several applications. In this research, cement dust mixed with different percentages of sand is used in civil engineering applications, as explained later in this paper through field and laboratory tests. The mixture (waste material with sand) proved to offer high performance, durability under environmental conditions, low cost and high benefit. At a higher cement dust ratio, a small cement ratio is valuable for compressive strength and permeability; at a small cement dust ratio, a higher cement ratio is valuable for compressive strength.
Abstract: With the advance in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for several applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has for many years employed WiMAX as a high-speed wireless data link because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive areas of military application, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic environmental effects (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO for WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate in close proximity to rocket production, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Rényi's entropy. The performance of our method was tested and compared with that of the method in the literature, and the former proved to outperform the latter.
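As a toy illustration of the statistics involved, the sketch below computes kurtosis and a histogram-based Rényi entropy for already-extracted components and flags artifacts by a kurtosis z-score. The ICA step itself is omitted, and the z-threshold, bin count and entropy order are illustrative assumptions, not the paper's actual pipeline.

```python
import math

def kurtosis(x):
    # Excess kurtosis: E[(x - mu)^4] / sigma^4 - 3 (spiky artifacts score high).
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    m4 = sum((v - mu) ** 4 for v in x) / n
    return m4 / (var ** 2) - 3.0

def renyi_entropy(x, alpha=2.0, bins=16):
    # Histogram-based estimate of the Renyi entropy of order alpha:
    # H_alpha = log(sum_i p_i^alpha) / (1 - alpha).
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    probs = [c / len(x) for c in counts if c > 0]
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def flag_artifacts(components, z_thresh=2.0):
    # Flag components whose kurtosis deviates strongly from the ensemble.
    ks = [kurtosis(c) for c in components]
    mu = sum(ks) / len(ks)
    sd = (sum((k - mu) ** 2 for k in ks) / len(ks)) ** 0.5 or 1.0
    return [abs((k - mu) / sd) > z_thresh for k in ks]
```

A component that is near-zero except for a few large spikes has high kurtosis and low Rényi entropy, which is the signature such methods exploit.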
Abstract: HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. To date, the HSDPA system has used turbo coding, which is among the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Though the complexity of the transmitter at the NodeB increases, the end user benefits in terms of receiver complexity and bit error rate. In this paper an LDPC encoder is implemented using a sparse parity-check matrix H to generate the codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that in LDPC coding the BER drops sharply as the number of iterations increases with a small increase in Eb/No, which is not possible in turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity are decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
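The iterative decoding loop can be illustrated in miniature. The sketch below uses the small (7,4) Hamming parity-check matrix as a stand-in for a real sparse LDPC matrix H, and a hard-decision bit-flipping decoder, which is a simplified relative of the soft-message belief propagation algorithm named above; both the matrix and the decoder choice are illustrative assumptions, not the paper's implementation.

```python
def syndrome(H, c):
    # Parity checks: each row of H dotted with the codeword, mod 2.
    return [sum(h * b for h, b in zip(row, c)) % 2 for row in H]

def bit_flip_decode(H, r, max_iter=20):
    # Hard-decision bit flipping: while some checks fail, flip the bit
    # that participates in the most unsatisfied checks.
    c = list(r)
    for _ in range(max_iter):
        s = syndrome(H, c)
        if not any(s):
            break  # all parity checks satisfied
        votes = [sum(s[i] for i, row in enumerate(H) if row[j])
                 for j in range(len(c))]
        c[votes.index(max(votes))] ^= 1
    return c
```

On a valid codeword with one flipped bit, the loop converges in a single iteration; real LDPC decoders run the analogous message-passing loop over a much larger sparse graph.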
Abstract: Augmented reality is a technique used to insert virtual objects into real scenes. One of the most widely used libraries in the area is the ARToolkit library, which is based on the recognition of markers in the form of squares with a pattern inside. This pattern, which is mostly textual, is a source of confusion. In this paper, we present the results of a classification of Latin characters used as patterns on ARToolkit markers, in order to identify the most distinguishable among them.
Abstract: This paper proposes a “soft systems" approach to
domain-driven design of computer-based information systems. We
propose a systemic framework combining techniques from Soft
Systems Methodology (SSM), the Unified Modelling Language
(UML), and an implementation pattern known as “Naked Objects".
We have used this framework in action research projects that have
involved the investigation and modelling of business processes using
object-oriented domain models and the implementation of software
systems based on those domain models. Within the proposed
framework, Soft Systems Methodology (SSM) is used as a guiding
methodology to explore the problem situation and to generate a
ubiquitous language (soft language) which can be used as the basis
for developing an object-oriented domain model. The domain model
is further developed using techniques based on the UML and is
implemented in software following the “Naked Objects"
implementation pattern. We argue that there are advantages from
combining and using techniques from different methodologies in this
way.
The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.
Abstract: Lycopene, which can be extracted from plants and is popular as a component of fruit intake, is restricted in healthy-food development due to its high price. On the other hand, it raises serious safety concerns, especially in food or cosmetic applications, if the raw lycopene is produced by chemical synthesis. In this project, we provide a key technology to bridge the limitations mentioned above. Based on the abundant bioresources of the BCRC (Bioresource Collection and Research Center, Taiwan), a promising lycopene output is anticipated through the introduction of fermentation technology along with industry-related core energy. Our results showed that the addition of Tween 80 (0.2%) and Span 20 produced a higher amount of lycopene. Piperidine, when added at 48 h to the cultivation medium, could also promote lycopene excretion effectively.
Abstract: Internet computer games are becoming more and more attractive within the context of technology-enhanced learning. Educational games such as quizzes and quests have gained significant success in appealing to and motivating learners to study in a different way, and provoke steadily increasing interest in new methods of application. Board games are a specific group of games in which figures are manipulated in a competitive play mode with race conditions on a surface according to predefined rules. The article presents a new, formalized model of traditional quizzes, puzzles and quests rendered as multimedia board games, which facilitates the construction process of such games. The authors provide different examples of quizzes and their models in order to demonstrate that the model is quite general and supports not only quizzes, mazes and quests but also any set of teaching activities. The execution process of such models is explained, as well as how they can be useful for the creation and delivery of adaptive e-learning courseware.
Abstract: Detection and tracking of the lip contour is an important issue in speechreading. While there are solutions for lip tracking once a good contour initialization in the first frame is available, the problem of finding such a good initialization is not yet solved automatically, but is done manually. We have developed a new tracking solution for lip contour detection using only a few landmarks (15 to 25) and applying the well-known Active Shape Models (ASM). The proposed method is a new LMS-like adaptive scheme based on an autoregressive (AR) model that has been fitted to the landmark variations in successive video frames. Moreover, we propose an extra motion compensation model to address more general cases in lip tracking. Computer simulations demonstrate a fair match between the true and the estimated spatial pixels. Significant improvements over the well-known LMS approach have been obtained via a defined Frobenius norm index.
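The LMS-like adaptive AR idea can be sketched for a single landmark coordinate: an AR model of the coordinate's frame-to-frame variation is updated with the standard LMS rule w ← w + μ·e·x. The model order, step size and one-dimensional setting are illustrative assumptions, not the paper's exact scheme.

```python
import math

def lms_predict(series, order=2, mu=0.05):
    # Predict each sample from the previous `order` samples with an
    # adaptive AR model; weights adapt by the LMS rule w <- w + mu*e*x.
    w = [0.0] * order
    preds, errs = [], []
    for t in range(order, len(series)):
        x = series[t - order:t]                      # recent history
        y = sum(wi * xi for wi, xi in zip(w, x))     # AR prediction
        e = series[t] - y                            # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
        preds.append(y)
        errs.append(e)
    return preds, errs
```

On a smoothly varying trajectory the prediction error shrinks as the weights converge, which is what makes such a predictor usable for tracking landmark motion between frames.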
Abstract: In today's era of plasma and laser cutting, machines using an oxy-acetylene flame remain meritorious due to their simplicity and cost effectiveness. The objective of devising a computer-controlled oxy-fuel profile cutting machine arose from the increasing demands on metal cutting with respect to edge quality, circularity and reduced formation of redeposited material. The system has an 8-bit microcontroller-based embedded system, which assures a stipulated time response. A new window-based application was devised which takes a standard CAD file (.DXF) as input and converts it into the numerical data required for the controller; it uses VB6 as the front end, with MS-Access and AutoCAD at the back end. The system is designed around the AT89C51RD2, a powerful 8-bit ISP microcontroller from Atmel, and is optimized to achieve cost effectiveness while maintaining the required accuracy and reliability for complex shapes. The backbone of the system is a cleverly designed mechanical assembly which, together with the embedded system, results in an accuracy of about 10 microns while maintaining perfect linearity in the cut. This results in a substantial increase in productivity. The observed results also indicate reduced interlaminar spacing of pearlite with an increase in the hardness of the edge region.
Abstract: This paper presents a complete procedure for tool path
planning and blade machining in 5-axis manufacturing. The actual
cutting contact and cutter locations can be determined by lead and tilt
angles. The tool path generation is implemented by piecewise curved
approximation and chordal deviation detection. An application of the
drive surface method promotes flexibility of tool control and stability
of machine motion. A real manufacturing process is proposed to
separate the operation into three regions with five stages and to modify
the local tool orientation with an interactive algorithm.
Abstract: The main goal of microarray experiments is to quantify the expression of every object on a slide as precisely as possible, with a further goal of clustering the objects. Recently, many studies have discussed clustering issues involving similar patterns of gene expression. This paper presents an application of fuzzy-type methods for clustering DNA microarray data that can be applied to typical comparisons. Clustering and analyses were performed on microarray and simulated data. The results show that fuzzy-possibility c-means clustering substantially improves the findings obtained by others.
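A minimal sketch of the fuzzy c-means update cycle on one-dimensional toy data (the fuzzy-possibility variant the abstract favours differs in its membership constraint; this standard version, with centers initialised across the data range, is only illustrative):

```python
def fuzzy_cmeans(points, c=2, m=2.0, iters=50):
    # Standard fuzzy c-means on 1-D data: alternate the membership update
    # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)) with the centroid update
    # (mean weighted by u^m). Centers start spread over the data range.
    centers = [min(points) + k * (max(points) - min(points)) / (c - 1)
               for k in range(c)]
    U = [[0.0] * c for _ in points]
    for _ in range(iters):
        for i, p in enumerate(points):
            d = [abs(p - ck) + 1e-12 for ck in centers]  # avoid div by 0
            for k in range(c):
                U[i][k] = 1.0 / sum((d[k] / dj) ** (2 / (m - 1)) for dj in d)
        for k in range(c):
            den = sum(U[i][k] ** m for i in range(len(points)))
            centers[k] = sum(U[i][k] ** m * p
                             for i, p in enumerate(points)) / den
    return centers, U
```

Unlike hard k-means, every point keeps a graded membership in every cluster, which is what makes the approach attractive for gene-expression profiles that sit between groups.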
Abstract: The article describes a case study of one of the Czech Republic's medium-sized manufacturing enterprises (ME), where, due to the European financial crisis, production lines had to be redesigned and optimized in order to minimize the total cost of producing the goods. We consider an optimization problem of minimizing the total cost of the workload, according to the costs of the possible locations of the workplaces, with an application of a greedy algorithm and a partial analogy to the Set Packing Problem. The displacement of working tables in a company should be a one-to-one monotone increasing function in order for the total cost of producing the goods to be at a minimum. We use a heuristic approach with a greedy algorithm for solving this linear optimization problem, regardless of the possible greediness which may appear, and we apply it in a Czech ME.
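A minimal sketch of a greedy one-to-one placement in the spirit of the abstract: repeatedly take the globally cheapest remaining (workplace, location) pair, using each workplace and each location at most once. The cost matrix and this exact greedy criterion are illustrative assumptions, not the paper's reported procedure.

```python
def greedy_assign(cost):
    # cost[w][l] is the cost of placing workplace w at location l.
    # Sort all pairs by cost and pick greedily, keeping the assignment
    # one-to-one (each workplace and each location used once).
    pairs = sorted((cost[w][l], w, l)
                   for w in range(len(cost)) for l in range(len(cost[0])))
    used_w, used_l, plan = set(), set(), {}
    for c, w, l in pairs:
        if w not in used_w and l not in used_l:
            plan[w] = l
            used_w.add(w)
            used_l.add(l)
    return plan, sum(cost[w][plan[w]] for w in plan)
```

As the abstract notes, such greediness can miss the global optimum on adversarial cost matrices, which is why it is framed as a heuristic.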
Abstract: This research deals with a flexible flowshop scheduling problem with arrival and delivery of jobs in groups and processing them individually. Due to the special characteristics of each job, only a subset of machines in each stage is eligible to process that job. The objective function deals with minimizing the sum of the completion times of the groups on the one hand, and minimizing the sum of the differences between the completion time of each job and the delivery time of the group containing that job (the waiting period) on the other. The problem can be stated as FFc / rj, Mj / irreg, which has many applications in production and service industries. A mathematical model is proposed, the problem is proved to be NP-complete, and an effective heuristic method is presented to schedule the jobs efficiently. This algorithm can then be used within the body of any metaheuristic algorithm for solving the problem.
Abstract: The objective of the presented work is to implement the Kalman filter in an application that reduces the influence of environmental changes on a robot expected to navigate over a terrain of varying friction properties. The discrete Kalman filter is used to estimate the robot position: it projects the estimated current state ahead in time through the time update, and adjusts the projected estimate by an actual measurement at that time via the measurement update, using data coming from the infrared sensors, ultrasonic sensors and the visual sensor, respectively. The navigation test has been performed in a real-world environment and the approach has been found to be robust.
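The time-update/measurement-update cycle can be sketched for a one-dimensional constant-position model with a single noisy sensor; this is an illustrative simplification of the multi-sensor setup above, and the noise parameters q and r are assumed values.

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    # Discrete Kalman filter for a constant-position model:
    #   time update:        x_ = x,            p_ = p + q
    #   measurement update: k = p_/(p_+r),     x = x_ + k*(z - x_)
    #                       p = (1 - k) * p_
    x, p = x0, p0
    estimates = []
    for z in measurements:
        x_, p_ = x, p + q            # project state and covariance ahead
        k = p_ / (p_ + r)            # Kalman gain
        x = x_ + k * (z - x_)        # correct with measurement z
        p = (1 - k) * p_
        estimates.append(x)
    return estimates
```

Fed alternating noisy readings around a fixed true position, the estimate settles near the true value while the gain shrinks, which is the smoothing behaviour exploited for fusing jittery range-sensor data.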
Abstract: In this paper, a simple active contour based visual tracking algorithm is presented for an outdoor AGV application currently under development at the USM robotic research group (URRG) lab. The presented algorithm is computationally low-cost, able to track road boundaries in an image sequence, and can easily be implemented on available low-cost hardware. The proposed algorithm uses active shape modeling with a B-spline deformable template and a recursive curve fitting method to track the current orientation of the road.
Abstract: In this communication, an expression for the mean velocity of waste flow through an open channel is proposed as an improvement over the Manning formula. The discharges, storages and depths are computed at all locations of the Lyari River by utilizing the proposed expression. The results attained through the proposed expression are in good agreement with the observed data and better than those acquired using the Manning formula.
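For reference, the classic Manning formula that the proposed expression improves upon is V = (1/n) R^(2/3) S^(1/2) in SI units, with discharge Q = V·A. The improved expression itself is not given in the abstract, so only this baseline is sketched.

```python
def manning_velocity(n, R, S):
    # Manning formula (SI units): n = roughness coefficient,
    # R = hydraulic radius (m), S = channel slope (dimensionless).
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

def discharge(v, area):
    # Volumetric discharge Q = V * A, in m^3/s for SI inputs.
    return v * area
```

For example, a channel with n = 0.03, hydraulic radius 1 m and slope 0.0009 gives a mean velocity of 1 m/s.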
Abstract: Finding minimal logical functions has important applications in the design of logical circuits. This task is solved by many different methods but, frequently, they are not suitable for a computer implementation. We briefly summarise the well-known Quine-McCluskey method, which gives a unique, deterministic computing procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem which, unfortunately, is an NP-hard combinatorial problem. It must therefore be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
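The set-covering view can be made concrete: the minterms form the universe, the prime implicants are the subsets, and a chromosome selects a family of subsets. The toy genetic algorithm below searches for a small cover; the population size, penalty weight, one-point crossover and point mutation are all illustrative parameter choices, not the settings reported in the paper.

```python
import random

def ga_set_cover(universe, subsets, pop=30, gens=60, seed=1):
    # Tiny genetic algorithm for set cover: one bit per subset; fitness =
    # number of subsets used + a heavy penalty per uncovered element, so
    # any feasible cover beats any infeasible one.
    rng = random.Random(seed)
    n = len(subsets)

    def fitness(ch):
        covered = set()
        for i, bit in enumerate(ch):
            if bit:
                covered |= subsets[i]
        return sum(ch) + 10 * len(universe - covered)

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness)
        elite = popn[:pop // 2]                 # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1        # point mutation
            children.append(child)
        popn = elite + children
    best = min(popn, key=fitness)
    return [i for i, bit in enumerate(best) if bit], fitness(best)
```

On a small instance the optimum cover is found quickly; realistic Quine-McCluskey outputs yield much larger universes, which is where the heuristic character matters.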
Abstract: Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with real-time physiological measurements taken from the patient. This paper presents the benefits of using data mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. In the field of noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. The main features of the EUSE sample movies are digitized and summarized in vector form by means of exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE feature vectors in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these data mining techniques illustrates the suitability and reliability of this methodology in CAD.