Abstract: This paper presents a method for determining the
uniaxial tensile properties such as Young's modulus, yield strength
and the flow behaviour of a material in a virtually non-destructive
manner. To achieve this, a new dumb-bell shaped miniature
specimen has been designed. This avoids the removal of
large material samples from the in-service component for the
evaluation of its current material properties. The proposed miniature
specimen has an advantage in finite element modelling with respect
to computational time and memory space. Test fixtures have been
developed to enable the tension tests on the miniature specimen in a
testing machine. The studies have been conducted on a chromium
(H11) steel and an aluminum alloy (AR66). The output of the
miniature test, viz. the load-elongation diagram, is obtained and the finite
element simulation of the test is carried out using a 2D plane stress
analysis. The results are compared with the experimental results. It is
observed that the finite element simulation results
agree well with the miniature test results. The approach appears
to have potential to predict the mechanical properties of the
materials, which could be used in remaining life estimation of the
various in-service structures.
Abstract: Model-based approaches have been applied successfully
to a wide range of tasks such as specification, simulation, testing, and
diagnosis. But one bottleneck often prevents the introduction of these
ideas: Manual modeling is a non-trivial, time-consuming task.
Automatically deriving models by observing and analyzing running
systems is one possible way to ease this bottleneck. To
derive a model automatically, some a priori knowledge about the
model structure, i.e. about the system, must exist. Such a model
formalism would be used as follows: (i) by observing the network
traffic, a model of the long-term system behavior can be generated
automatically; (ii) test vectors can be generated from the model;
(iii) while the system is running, the model can be used to diagnose
abnormal system behavior.
The main contribution of this paper is the introduction of a model
formalism called 'probabilistic regression automaton' suitable for the
tasks mentioned above.
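As a toy illustration (much simpler than the paper's formalism, which also captures timing and regression information), a first-order Markov model of system behavior can be estimated from observed event traces; all names below are hypothetical:

```python
from collections import Counter, defaultdict

def learn_automaton(traces):
    """Estimate transition probabilities of a simple probabilistic
    automaton from observed event traces (first-order Markov sketch)."""
    counts = defaultdict(Counter)
    for trace in traces:
        state = "START"
        for event in trace:
            counts[state][event] += 1
            state = event
    # normalize counts into per-state transition probabilities
    return {s: {e: n / sum(c.values()) for e, n in c.items()}
            for s, c in counts.items()}

model = learn_automaton([["a", "b"], ["a", "c"]])
# model["START"]["a"] == 1.0, model["a"]["b"] == 0.5
```

Unlikely transitions under such a learned model can then flag abnormal behavior during diagnosis.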
Abstract: The presence of cold air, together with the convergent
topography of the Lut valley over the valley's sloping terrain, can
generate Low Level Jets (LLJ). Moreover, the valley-parallel
pressure gradients and northerly LLJ are produced as a result of the
large-scale processes. In this numerical study, the regional MM5
model was run to obtain an appropriate dynamical analysis
of flows in the region for summer and winter. The results of this
study show that summer synoptic systems cause the
formation of north-south pressure gradients in the valley, which can
lead to winds with velocities above 14 m/s
and to severe dust and wind storms lasting more than 120 days.
In winter, by contrast, the presence of cold air masses in the
region causes the average speed of the LLJs to decrease. During this
season, downslope flows play a noticeable role in creating the nocturnal LLJs.
Abstract: Bioinformatics and computational biology involve
the use of techniques including applied mathematics,
informatics, statistics, computer science, artificial intelligence,
chemistry, and biochemistry to solve biological problems
usually on the molecular level. Research in computational
biology often overlaps with systems biology. Major research
efforts in the field include sequence alignment, gene finding,
genome assembly, protein structure alignment, protein structure
prediction, prediction of gene expression and protein-protein
interactions, and the modeling of evolution. Various
global rearrangements of permutations, such as reversals and
transpositions, have recently become of interest because of their
applications in computational molecular biology. A reversal is
an operation that reverses the order of a substring of a permutation.
A transposition is an operation that swaps two adjacent
substrings of a permutation. The problem of determining the
smallest number of reversals required to transform a given
permutation into the identity permutation is called sorting by
reversals. Similar problems can be defined for transpositions
and other global rearrangements. In this work we present a
study of some genome rearrangement primitives. We show
how a genome is modelled by a permutation, introduce some
of the existing primitives and the lower and upper bounds
on them. We then provide a comparison of the introduced
primitives.
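As a minimal sketch of these primitives (hypothetical code, not from the paper), a genome can be modelled as a Python list and the two operations applied directly; a greedy placement loop gives a simple upper bound on the reversal distance:

```python
def reversal(perm, i, j):
    """Reverse the substring perm[i..j] (inclusive)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def transposition(perm, i, j, k):
    """Swap the adjacent substrings perm[i..j-1] and perm[j..k-1]."""
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

def greedy_sort_by_reversals(perm):
    """Greedily place each element; gives an upper bound
    (not the optimal reversal distance)."""
    perm = list(perm)
    count = 0
    for i in range(len(perm)):
        if perm[i] != i + 1:
            j = perm.index(i + 1)       # where the wanted element sits
            perm = reversal(perm, i, j)
            count += 1
    return count

p = [3, 1, 2, 4]
print(reversal(p, 0, 1))          # [1, 3, 2, 4]
print(transposition(p, 0, 1, 3))  # [1, 2, 3, 4]
print(greedy_sort_by_reversals(p))  # 2
```

Computing the *minimum* number of operations is the hard part: sorting by reversals on unsigned permutations is NP-hard, which is why bounds on the primitives matter.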
Abstract: In this study, the contact problem of a layered composite which consists of two materials with different elastic constants and heights resting on two rigid flat supports with sharp edges is considered. The effect of gravity is neglected. While friction between the layers is taken into account, it is assumed that there is no friction between the supports and the layered composite so that only compressive tractions can be transmitted across the interface. The layered composite is subjected to a uniform clamping pressure over a finite portion of its top surface. The problem is reduced to a singular integral equation in which the contact pressure is the unknown function. The singular integral equation is evaluated numerically and the results for various dimensionless quantities are presented in graphical forms.
Abstract: In this paper, a multi-agent robot system is presented. The system consists of four robots. The developed robots are able to automatically enter and patrol a harmful environment, such as a building contaminated with a virus or a factory leaking hazardous gas. Furthermore, every robot is able to perform obstacle avoidance and to search for victims. Several operation modes are designed: remote control, obstacle avoidance, automatic searching, and so on.
Abstract: An integrated Artificial Neural Network-Particle Swarm Optimization (PSO) approach is presented for analyzing global electricity consumption. To this end, the following steps are taken. STEP 1: PSO is applied to determine the world's oil, natural gas, coal and primary energy demand equations based on socio-economic indicators. The world's population, gross domestic product (GDP), oil trade movement and natural gas trade movement are used as socio-economic indicators in this study. For each socio-economic indicator, a feed-forward back-propagation artificial neural network is trained and projected over the future time domain. STEP 2: global electricity consumption is projected based on the oil, natural gas, coal and primary energy consumption using PSO, and is forecast up to the year 2040.
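A minimal PSO sketch (illustrative only; the paper's demand equations and data are not reproduced, so the linear demand model and all values below are made up) showing how such coefficients could be fitted by minimizing squared error:

```python
import numpy as np

rng = np.random.default_rng(1)

def pso(f, dim, n=30, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimization: velocities are pulled
    toward each particle's personal best and the global best."""
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.array([f(p) for p in x])
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        g = pbest[pval.argmin()].copy()
    return g

# hypothetical demand equation d = a + b * gdp, fitted to made-up points
gdp = np.array([1.0, 2.0, 3.0, 4.0])
demand = 0.5 + 1.5 * gdp
coeffs = pso(lambda p: ((p[0] + p[1] * gdp - demand) ** 2).sum(), dim=2)
```

In the paper's setting the objective would instead be the fit of each socio-economic demand equation.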
Abstract: Video Mosaicing is the stitching of selected frames of
a video by estimating the camera motion between the frames and
thereby registering successive frames of the video to arrive at the
mosaic. Different techniques have been proposed in the literature for
video mosaicing. Despite the large number of papers dealing with
techniques to generate mosaics, only a few authors have investigated the
conditions under which these techniques produce good estimates of the
motion parameters. In this paper, these techniques are evaluated on
different videos, and the reasons for their failures are identified. We
propose algorithms that incorporate outlier removal for better
estimation of the motion parameters.
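One common outlier removal scheme of this kind is RANSAC; the sketch below is hypothetical (it assumes a pure-translation camera motion model and point correspondences already obtained from feature matching), not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_translation(src, dst, iters=200, tol=1.0):
    """Estimate a 2-D inter-frame translation from matched points,
    rejecting outlier correspondences (RANSAC)."""
    best_t, best_count = None, 0
    for _ in range(iters):
        i = rng.integers(len(src))      # one correspondence fixes a translation
        t = dst[i] - src[i]
        count = int(np.sum(np.linalg.norm(dst - (src + t), axis=1) < tol))
        if count > best_count:
            best_t, best_count = t, count
    mask = np.linalg.norm(dst - (src + best_t), axis=1) < tol
    return (dst[mask] - src[mask]).mean(axis=0)   # refit on the consensus set

# 50 matches, of which the first 10 are gross mismatches (outliers)
src = rng.uniform(0, 100, (50, 2))
dst = src + np.array([5.0, -3.0])
dst[:10] += rng.uniform(200, 400, (10, 2))
t_est = ransac_translation(src, dst)              # recovers (5, -3)
```

The same hypothesize-and-verify loop generalizes to affine or homography motion models by sampling more correspondences per hypothesis.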
Abstract: In this paper we present a frequency-domain
classification method for video scenes. Videos from certain topical
areas often contain activities with repeating movements. Sports
videos, home improvement videos, or videos showing mechanical
motion are some example areas. Assessing the main and side frequencies
of each repeating movement reveals the motion type. We
obtain the frequency domain representation by transforming spatio-temporal motion
trajectories. We further explain how to compute frequency features
for video clips and how to use them for classification. The focus of
the experimental phase is on transforms utilized for our system.
By comparing various transforms, the experiments identify the most
suitable transform for a motion-frequency-based approach.
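A minimal sketch of the core step (hypothetical data; the paper compares several transforms, of which only the discrete Fourier transform is used here): extract the dominant frequency of one coordinate of a motion trajectory.

```python
import numpy as np

# hypothetical example: a point oscillating at 2 Hz,
# sampled at 30 fps (a typical video frame rate)
fps, seconds = 30, 4
t = np.arange(fps * seconds) / fps
trajectory_x = np.sin(2 * np.pi * 2.0 * t)      # main movement at 2 Hz

spectrum = np.abs(np.fft.rfft(trajectory_x))
freqs = np.fft.rfftfreq(len(trajectory_x), d=1.0 / fps)
main_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component
print(main_freq)  # 2.0
```

Side frequencies would be picked up as secondary peaks of the spectrum, and the resulting peak positions form the feature vector for the classifier.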
Abstract: Ethanol has been known for a long time, being
perhaps the oldest product obtained through traditional biotechnology
fermentation. Agricultural waste as a fermentation substrate is widely
discussed as an alternative to edible feedstocks and as a use for
organic material. Pineapple peel, a highly promising substrate,
is a by-product of the pineapple processing industry. Bio-ethanol
production from pineapple (Ananas comosus) peel extract was carried out by
controlled fermentation without any pretreatment. Saccharomyces
ellipsoideus was used as the inoculum in this fermentation process, as it is
naturally found on the pineapple skin. In this study, the capability of
Response Surface Methodology (RSM) for optimization of ethanol
production from pineapple peel extract using Saccharomyces
ellipsoideus in batch fermentation was investigated. The effects of
five test variables over defined ranges, namely inoculum concentration
(6-14% v/v), pH (4.0-6.0), sugar concentration (14-22 °Brix),
temperature (24-32 °C) and incubation time (30-54 h), on
ethanol production were evaluated. The experimental data
were analyzed with the RSM module of MINITAB software (version 15),
whereby an optimum ethanol concentration of 8.637% (v/v) was
determined. The optimum conditions were 14% (v/v) inoculum
concentration, pH 6, 22 °Brix, 26 °C and 30 hours of incubation. A
regression model significant at the 5% level, with a
correlation value of 99.96%, was also obtained.
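The RSM fit itself is an ordinary least-squares fit of a second-order polynomial; the sketch below uses two variables and made-up noiseless data (the study uses five variables and MINITAB), so all values are illustrative:

```python
import numpy as np

# hypothetical two-variable response surface:
# z = b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y
x = np.array([4.0, 5.0, 6.0, 4.0, 5.0, 6.0, 4.0, 5.0, 6.0])          # e.g. pH
y = np.array([24.0, 24.0, 24.0, 28.0, 28.0, 28.0, 32.0, 32.0, 32.0]) # e.g. temp (°C)
true_b = np.array([1.0, 0.5, 0.2, -0.05, -0.01, 0.03])

X = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
z = X @ true_b                        # synthetic response (ethanol yield stand-in)
b, *_ = np.linalg.lstsq(X, z, rcond=None)   # recovers true_b
```

The fitted quadratic can then be maximized (analytically or numerically) to locate the optimum operating conditions, which is what the RSM optimization reports.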
Abstract: The paper describes the evaluation of control quality
for cases of non-minimum phase controlled plants. Control
loops containing non-minimum phase plants have distinctive
properties: they exhibit an inverse response at the beginning of the unit
step response. For these types of plants, special criteria of control
quality have been developed that take this difference into account and
can be helpful for the synthesis of optimal controller tuning. All results are
clearly presented using Matlab/Simulink models.
Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that in order to
reconstruct a tomographic image it is necessary to acquire a large
volume of data (millions of events), so it is difficult to study rapidly
changing systems. In this respect, PEPT is a very fast
process compared with PET.
In PEPT, detection of both photons defines a line, and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm from coincident
detection of a small number of pairs of back-to-back gamma rays and
using triangulation. This can be achieved many times per second and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
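A sketch of the triangulation step (hypothetical code, not the Birmingham algorithm): each coincident gamma-ray pair gives a line, and the tracer position can be taken as the least-squares point closest to all such lines.

```python
import numpy as np

def locate_tracer(points, dirs):
    """Least-squares position closest to a set of lines a_i + s*d_i,
    one line per coincident gamma-ray pair."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for a, d in zip(points, dirs):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projector onto plane normal to d
        A += M
        b += M @ a
    return np.linalg.solve(A, b)

# three lines through the (unknown) tracer position [1, 2, 3]
pts  = [np.array([0.0, 2.0, 3.0]), np.array([1.0, 0.0, 3.0]),
        np.array([1.0, 2.0, 0.0])]
dirs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
        np.array([0.0, 0.0, 1.0])]
pos = locate_tracer(pts, dirs)           # recovers [1, 2, 3]
```

In practice corrupt events (scattered photons) do not pass near the true position and are discarded iteratively before the final solve.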
The attempt in PEPT is not to form an image of the tracer particle
but simply to determine its location with time. If this tracer is
followed for a long enough period within a closed, circulating system
it explores all possible types of motion.
The application of PEPT to industrial process systems carried out
at the University of Birmingham falls into two areas: the
behaviour of granular materials and of viscous fluids. Granular
materials are processed in industry, for example in the manufacture of
pharmaceuticals, ceramics, food and polymers, and PEPT has been used
in a number of ways to study the behaviour of these systems [2].
PEPT allows the possibility of tracking a single particle within the
bed [3]. PEPT has also been used to study fluid flow, for example
viscous fluids in mixers [4], using a neutrally buoyant tracer
particle [5].
Abstract: Automatic detection of bleeding is of practical
importance since capsule endoscopy produces an extremely large
number of images. Algorithm development of bleeding detection in
the digestive tract is difficult due to different contrasts among the
images, food dregs, secretions and other content. In this study, weighting
factors were assigned, derived from the independent features of
contrast and brightness between bleeding and normal regions. Spectral
analysis based on these weighting factors was fast and accurate. Results
were a sensitivity of 87% and a specificity of 90% when the accuracy
was determined for each pixel out of 42 endoscope images.
Abstract: Feature and model selection are at the center of
attention of many researchers because of their impact on classifier
performance. Both selections are usually performed separately, but
recent developments suggest using a combined GA-SVM approach to
perform them simultaneously. This approach improves the
performance of the classifier by identifying the best subset of variables
and the optimal parameter values. Although GA-SVM is an
effective method, it is computationally expensive, so a cheaper
approximate method is worth considering. The paper investigates a joint approach
of Genetic Algorithm and kernel matrix criteria to perform
simultaneously feature and model selection for SVM classification
problem. The purpose of this research is to improve the classification
performance of SVM through an efficient approach, the Kernel
Matrix Genetic Algorithm method (KMGA).
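One common kernel-matrix criterion that such a GA fitness could use (a sketch; the abstract does not name the exact criterion, so this choice is an assumption) is kernel-target alignment:

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F) between a kernel
    matrix K and labels y in {-1, +1}; 1.0 means K matches the labels."""
    Y = np.outer(y, y)
    return float((K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y)))

y = np.array([1, 1, -1, -1])
K_ideal = np.outer(y, y).astype(float)      # "perfect" kernel for these labels
print(kernel_target_alignment(K_ideal, y))  # 1.0
```

Because the criterion needs only the kernel matrix, no SVM has to be trained per GA individual, which is what makes the combined search cheap.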
Abstract: Using animated videos of teaching materials is an
effective learning method. However, we believe that an even more effective
method is for learners to produce the teaching videos themselves.
The learners acting as producers must learn and understand the
material well in order to produce and present the video teaching materials to
others. The purpose of this study is to propose a project-based learning (PBL)
technique based on co-producing videos of IT (information
technology) teaching materials. We used the T2V player to produce
the videos based on TVML, a TV program description language. With the
proposed method, we assigned the learners to produce
animated videos for the "National Examination for Information
Processing Technicians (IPA examination)" in Japan, in order to have
them learn various knowledge and skills in the IT field. Experimental
results showed that a learning effect occurred during the video production
process, which is useful for the development of IT human resources.
Abstract: Motion estimation is a key problem in video
processing and computer vision. Optical flow motion estimation can
achieve high estimation accuracy when the motion vector is small.
The three-step search algorithm can handle large motion vectors but is not
very accurate. A joint algorithm is proposed in this paper to
achieve high estimation accuracy regardless of whether the motion
vector is small or large, while keeping the computation cost much lower
than full search.
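A sketch of the classic three-step search (illustrative; the paper's joint algorithm additionally combines it with optical flow): SAD block matching with a step size that halves each round.

```python
import numpy as np

def three_step_search(ref, cur, block_xy, block=8, step=4):
    """Three-step search: SAD block matching, halving the step each round."""
    bx, by = block_xy
    target = cur[by:by + block, bx:bx + block].astype(int)
    cx, cy = bx, by                          # current best match position in ref
    while step >= 1:
        best = None
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                x, y = cx + dx, cy + dy
                if 0 <= x <= ref.shape[1] - block and 0 <= y <= ref.shape[0] - block:
                    sad = np.abs(ref[y:y + block, x:x + block].astype(int) - target).sum()
                    if best is None or sad < best[0]:
                        best = (sad, x, y)
        _, cx, cy = best
        step //= 2
    return cx - bx, cy - by                  # estimated motion vector

# toy frames: the 8x8 block at (12, 12) in `cur` moved by (2, 0) relative to `ref`
ref = np.arange(1600).reshape(40, 40)
cur = np.zeros_like(ref)
cur[12:20, 12:20] = ref[12:20, 14:22]
mv = three_step_search(ref, cur, (12, 12))   # (2, 0)
```

With step sizes 4, 2, 1 the search covers displacements up to ±7 per axis while evaluating at most 25 candidates, versus 225 for an exhaustive full search over the same window.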
Abstract: The fuzzy technique is an operator introduced in order
to simulate, at a mathematical level, the compensatory behavior in
processes of decision making or subjective evaluation. The following
paper introduces such operators by means of a computer vision
application.
In this paper a novel method based on fuzzy logic reasoning
strategy is proposed for edge detection in digital images without
determining the threshold value. The proposed approach begins by
segmenting the images into regions using a floating 3x3 binary matrix.
The edge pixels are mapped to a range of values distinct from each
other. The robustness of the proposed method is demonstrated by comparing its
results for different captured images with those obtained with the linear Sobel
operator. The method consistently improves the smoothness and
straightness of straight lines and the roundness of curved
lines. At the same time, the corners become sharper and can be defined
more easily.
Abstract: This article describes the design of an 8-bit asynchronous
microcontroller simulation model in VHDL. The model is created in
the ISE Foundation design tool and simulated in the Modelsim tool. This
model is a simple application example of asynchronous systems
designed in synchronous design tools. The design process of creating an
asynchronous system with a 4-phase bundled-data protocol and with
matched delays is described in the article. The model is described at the
gate level of abstraction.
The simulation waveform of the functional design is the main
result of this article. The described construction covers only the
simulation model; the next step would be to create a synthesizable
model for an FPGA.
Abstract: The excessive consumption of fossil energy (electrical energy) during summer, driven by technological development, contributes more and more to climate warming.
In order to reduce the worst impacts of the gas emissions produced by conventional air conditioning, the heat-driven solar absorption chiller is quite promising; it consists of using solar energy, which is clean and environmentally friendly, as the motive energy to provide cooling.
The solar absorption machine is composed of four components and uses lithium bromide/water as the refrigerant couple. LiBr-water is the most promising couple in chiller applications due to its high safety, high volatility ratio, high affinity, high stability and high latent heat. The lithium bromide solution consists of the salt lithium bromide, which absorbs water under certain conditions of pressure and temperature; however, if the solution concentration in the absorption chiller is high, exceeding 70%, the solution will crystallize.
The main aim of this article is to study the phenomenon of crystallization and to evaluate the dependence between the electric conductivity and the concentration, which should be controlled.
Abstract: The aim of this paper is to emphasize and alleviate the effect of phase noise due to imperfect local oscillators on the performance of a Multi-Carrier CDMA system. After the cancellation of the Common Phase Error (CPE), an iterative approach is introduced which estimates the Inter-Carrier Interference (ICI) components in the frequency domain and cancels their contribution in the time domain. Simulations are conducted in order to investigate the achievable performance for several parameters, such as the spreading factor, the modulation order, the phase noise power and the transmission Signal-to-Noise Ratio.