Abstract: In this paper we present a new method for tracking flying targets in color video sequences based on contour and kernel. The aim of this work is to overcome the problem of losing the target under changing illumination, large displacement, changing speed, and occlusion. The proposed method consists of three steps: estimating the target location with a particle filter, segmenting the target region using a neural network, and finding the exact contour with the greedy snake algorithm. In the proposed method we use both region and contour information to create the target candidate model, and this model is dynamically updated during tracking. To avoid the accumulation of errors during updating, the target region is fed to a perceptron neural network that separates the target from the background. Its output is then used for exact calculation of the size and center of the target, and also serves as the initial contour for the greedy snake algorithm to find the exact target edge. The proposed algorithm has been tested on a database containing many challenges such as high-speed, agile aircraft, background clutter, occlusion, and camera movement. The experimental results show that the use of the neural network increases the accuracy of tracking and segmentation.
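For illustration, the sketch below shows one predict-update-resample cycle of a bootstrap particle filter of the kind used for the location-estimation step; the motion model, noise level, and function names are illustrative assumptions, and a generic similarity score stands in for the paper's region/contour target model.

```python
import numpy as np

def particle_filter_step(particles, weights, measure_likelihood, motion_std=5.0):
    """One predict-update-resample cycle of a bootstrap particle filter.

    particles: (N, 2) array of candidate target centers (x, y)
    weights:   (N,) normalized importance weights
    measure_likelihood: callable mapping a center to a similarity score
                        between the candidate region and the target model
    """
    # Predict: diffuse particles with a random-walk motion model
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)

    # Update: re-weight by how well each candidate matches the target model
    weights = weights * np.array([measure_likelihood(p) for p in particles])
    weights = weights / (weights.sum() + 1e-12)

    # Estimate: the weighted mean gives the target-location estimate
    estimate = (weights[:, None] * particles).sum(axis=0)

    # Resample when the effective sample size collapses
    if 1.0 / (weights ** 2).sum() < 0.5 * len(weights):
        idx = np.random.choice(len(weights), size=len(weights), p=weights)
        particles, weights = particles[idx], np.full(len(weights), 1.0 / len(weights))

    return particles, weights, estimate
```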
Abstract: We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). As candidate models, we learn a few classifiers such as naïve Bayes, decision tree, and logistic regression that map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, and dish washing is generated with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transitions and the resulting appliance status are determined stochastically. To compare the performance of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
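A minimal sketch of the naïve Bayes option on binary appliance-status vectors, using scikit-learn and synthetic data in place of the paper's simulated situations; the appliance names and the labeling rule below are illustrative assumptions.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# columns (illustrative): [light, stove, tv, dishwasher, washing_machine], 1 = on
X = rng.integers(0, 2, size=(1000, 5))
# toy rule standing in for the simulated Yes/No labels:
# vacuum cleaning is allowed when the stove and dishwasher are off
y = (X[:, 1] == 0) & (X[:, 3] == 0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = BernoulliNB().fit(X_tr, y_tr)            # naive Bayes for binary features
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```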
Abstract: This paper covers the application of an elitist self-adaptive step-size search (ESASS) to the optimum design of steel skeletal structures. In ESASS, two approaches are considered for improving the convergence accuracy as well as the computational efficiency of the original technique, the so-called self-adaptive step-size search (SASS). Firstly, additional randomness is incorporated into the sampling step of the technique to preserve the exploration capability of the algorithm during the optimization. Moreover, an adaptive sampling scheme is introduced to improve the quality of the final solutions. Secondly, the computational efficiency of the technique is improved by avoiding unnecessary analyses during the optimization process using an upper-bound strategy. The numerical results demonstrate the usefulness of ESASS in sizing optimization problems of steel truss and frame structures.
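A hedged sketch of the underlying self-adaptive step-size search idea with an elite memory, written for a generic objective function; the sampling rule, adaptation factors, and the toy quadratic below are assumptions standing in for the paper's structural weight function, discrete sections, and constraint handling.

```python
import numpy as np

def sass_minimize(f, x0, steps=500, step0=1.0, shrink=0.9, grow=1.1, seed=0):
    """Sketch of a self-adaptive step-size search with elitism: a random
    perturbation is accepted if it improves the objective, the step size
    grows after successes and shrinks after failures, and the best (elite)
    design found so far is always retained."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    fx, step = f(x), step0
    best_x, best_f = x.copy(), fx                        # elite design
    for _ in range(steps):
        cand = x + step * rng.standard_normal(x.shape)   # sampling with added randomness
        fc = f(cand)
        if fc < fx:
            x, fx, step = cand, fc, step * grow
            if fc < best_f:
                best_x, best_f = cand.copy(), fc
        else:
            step *= shrink                               # adapt the step size downward
    return best_x, best_f

# toy usage: minimize a quadratic as a stand-in for a structural weight function
print(sass_minimize(lambda v: float(np.sum((v - 3.0) ** 2)), np.zeros(4)))
```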
Abstract: An accurate nonlinear analysis of a deep beam resting on elastic, perfectly plastic soil is carried out in this study. A nonlinear finite element model for large deflection and moderate rotation of an Euler-Bernoulli beam resting on linear and nonlinear random soil is investigated. The geometric nonlinear analysis of the beam is based on von Kármán theory, where the Newton-Raphson incremental-iterative method is implemented in a Matlab code to solve the nonlinear equations of the soil-beam interaction system. Two analyses (deterministic and probabilistic) are proposed to verify the accuracy and efficiency of the proposed model, where the local average theory based on the Monte Carlo approach is used to analyze the effect of the spatial variability of the soil properties on the nonlinear beam response. The effects of six main parameters are investigated: the external load, the length of the beam, the coefficient of subgrade reaction of the soil, the Young's modulus of the beam, and the coefficient of variation and the correlation length of the soil's coefficient of subgrade reaction. A comparison between the beam resting on linear and nonlinear soil models is presented for different beam lengths and external loads. Numerical results have been obtained for the combination of the geometric nonlinearity of the beam and the material nonlinearity of the random soil. This comparison highlights the need to include the material nonlinearity and spatial variability of the soil in the geometric nonlinear analysis when the beam undergoes large deflections.
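A compact sketch of the incremental-iterative Newton-Raphson scheme mentioned above, written in Python rather than Matlab, with a toy one-degree-of-freedom beam-on-nonlinear-spring problem as the example system; the stiffness values and load are illustrative assumptions, not the paper's model.

```python
import numpy as np

def newton_raphson_increments(residual, tangent, u0, load_steps, tol=1e-8, max_iter=50):
    """Incremental-iterative Newton-Raphson: the external load is applied in
    increments, and within each increment the nonlinear equilibrium equations
    R(u, lam) = 0 are solved by Newton iterations on the tangent stiffness."""
    u = np.asarray(u0, float)
    for lam in load_steps:                       # load factor for each increment
        for _ in range(max_iter):
            r = residual(u, lam)                 # out-of-balance force vector
            if np.linalg.norm(r) < tol:
                break
            du = np.linalg.solve(tangent(u, lam), -r)
            u = u + du
    return u

# toy 1-DOF example: cubic (hardening) soil spring under a point load P
k1, k3, P = 100.0, 5.0, 50.0
res = lambda u, lam: np.array([k1 * u[0] + k3 * u[0] ** 3 - lam * P])
tan = lambda u, lam: np.array([[k1 + 3 * k3 * u[0] ** 2]])
print(newton_raphson_increments(res, tan, [0.0], np.linspace(0.1, 1.0, 10)))
```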
Abstract: Texture is an important characteristic of real and synthetic scenes. Texture analysis plays a critical role in inspecting surfaces and provides important techniques for a variety of applications. Although several descriptors have been presented to extract texture features, object recognition remains a difficult task due to the complex aspects of texture. Recently, many robust and scale-invariant image features such as SIFT, SURF, and ORB have been successfully used in image retrieval and object recognition. In this paper, we compare the performance of these feature descriptors for texture classification using k-means clustering. Different classifiers, including k-NN, Naive Bayes, back-propagation neural network, decision tree, and KStar, were applied to three texture image sets: UIUCTex, KTH-TIPS, and Brodatz. Experimental results reveal that SIFT achieves the best average accuracy on UIUCTex and KTH-TIPS, while SURF performs best on the Brodatz texture set. Among all the classifiers used, the back-propagation neural network works best on the test sets.
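A minimal sketch of such a bag-of-visual-words pipeline with SIFT, k-means, and a 1-NN classifier, assuming lists of image paths and labels from one of the texture sets; the function names, vocabulary size, and classifier choice are illustrative assumptions, not the paper's exact setup.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def sift_descriptors(image_paths, sift):
    """One SIFT descriptor array per image (empty if no keypoints found)."""
    descs = []
    for p in image_paths:
        img = cv2.imread(p, cv2.IMREAD_GRAYSCALE)
        _, d = sift.detectAndCompute(img, None)
        descs.append(d if d is not None else np.empty((0, 128), np.float32))
    return descs

def bow_histograms(descs, kmeans):
    """Normalized visual-word histogram for each image's descriptors."""
    hists = np.zeros((len(descs), kmeans.n_clusters), np.float32)
    for i, d in enumerate(descs):
        if len(d):
            words, counts = np.unique(kmeans.predict(d), return_counts=True)
            hists[i, words] = counts / counts.sum()
    return hists

def classify_textures(train_paths, train_labels, test_paths, test_labels, n_words=200):
    """Learn a k-means visual vocabulary on training SIFT descriptors, encode
    each image as a word histogram, and classify with 1-NN (any of the other
    classifiers mentioned above could be substituted)."""
    sift = cv2.SIFT_create()
    train_descs = sift_descriptors(train_paths, sift)
    kmeans = KMeans(n_clusters=n_words, random_state=0).fit(np.vstack(train_descs))
    clf = KNeighborsClassifier(n_neighbors=1).fit(bow_histograms(train_descs, kmeans), train_labels)
    test_hists = bow_histograms(sift_descriptors(test_paths, sift), kmeans)
    return clf.score(test_hists, test_labels)
```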
Abstract: Heart diseases have a vital significance, affecting human life and quality of life. Sudden death events can be prevented through early diagnosis and treatment. The electrical signal recorded from the human body using non-invasive methods and reflecting the heart's activity is called the electrocardiogram (ECG). Clinicians use the ECG signal to follow the daily activity of the heart. Heart Rate Variability (HRV) is a physiological parameter describing the variation between heartbeats. ECG data taken from the MIT-BIH Arrhythmia Database are used in the model employed in this study. The aim is to detect arrhythmic heartbeats using features extracted from HRV time-domain parameters. The developed model provides a satisfactory performance with approximately 89% accuracy, 91.7% sensitivity, and 85% specificity for the detection of arrhythmic beats.
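For reference, a short sketch of common HRV time-domain parameters (SDNN, RMSSD, pNN50) computed from a series of RR intervals; the abstract does not list the exact parameters used, so these are standard choices, and the RR values below are synthetic.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Standard HRV time-domain parameters from a sequence of RR intervals (ms)."""
    rr = np.asarray(rr_ms, float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn":  rr.std(ddof=1),                      # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)),         # short-term variability
        "pnn50": np.mean(np.abs(diff) > 50) * 100.0,  # % of successive differences > 50 ms
    }

# toy usage with a short synthetic RR series (ms)
print(hrv_time_domain([812, 790, 805, 1120, 640, 802, 795]))
```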
Abstract: This paper details the progress made in the development of different state-of-the-art aerodynamic tools for the analysis of vertical axis wind turbines, including the flow simulation around the blade, viscous flow, stochastic wind, and dynamic stall effects. The paper highlights the capabilities of the wind turbine aerodynamic codes developed over the last thirty years, which are currently being used in North America and Europe by Sandia Laboratories, FloWind, IMST Marseilles, and Hydro-Quebec, among others. The aerodynamic codes developed at Ecole Polytechnique de Montreal, Canada, represent valuable tools for simulating the flow around wind turbines, including secondary effects. Comparison of theoretical results with experimental data has shown good agreement. The strength of the aerodynamic codes based on the Double-Multiple Streamtube (DMS) model lies in their simplicity, accuracy, and ability to analyze secondary effects that interfere with wind turbine aerodynamic calculations.
Abstract: In this paper, we present an analytical method for the analysis of a nano-scale spherical shell subjected to thermo-mechanical shocks based on nonlocal elasticity theory. The thermo-mechanical properties of the nanosphere are assumed to be temperature dependent. The governing partial differential equation of motion is solved analytically using the Laplace transform for the time domain and a power series for the spatial domain. The results in the Laplace domain are transferred to the time domain by employing the fast inverse Laplace transform (FLIT) method. The accuracy of the present approach is assessed by comparing the numerical results with results published in the literature. Furthermore, the effects of the nonlocal parameter and wall thickness on the dynamic characteristics of the nano-sphere are studied.
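The paper's FLIT routine is not described in the abstract; as a hedged stand-in, the sketch below uses mpmath's general-purpose numerical inverse Laplace transform (Talbot's method) to illustrate transferring a Laplace-domain solution back to the time domain, on a simple transform whose exact inverse is known.

```python
import mpmath as mp

# example Laplace-domain function F(s) = 1 / (s**2 + 1), whose exact inverse is sin(t)
F = lambda s: 1 / (s ** 2 + 1)

for t in [0.5, 1.0, 2.0]:
    ft = mp.invertlaplace(F, t, method='talbot')   # numerical inversion at time t
    print(t, ft, mp.sin(t))                        # compare with the exact sin(t)
```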
Abstract: Mining big data represents a big challenge nowadays. Many research efforts are concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold; it represents a big opportunity for maximizing the revenue streams in this industry. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.
Abstract: In recent years, a wide variety of applications have been developed with Support Vector Machine (SVM) methods and Artificial Neural Networks (ANN). In general, these methods depend on intrusion-knowledge databases such as KDD99, ISCX, and CAIDA, among others. New classes of detectors are generated by machine learning techniques, trained and tested on network databases. Thereafter, the detectors are employed to detect anomalies in network communication scenarios according to the behavior of users' connections. The first detector, based on the training dataset, is deployed in different real-world networks with mobile and non-mobile devices to analyze the performance and accuracy of static detection. The vulnerabilities are based on previous work on telemedicine apps that were developed within the research group. This paper presents the differences in detection results between several network scenarios when traditional detectors based on artificial neural networks and support vector machines are applied.
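A minimal sketch of the SVM side of such a detector on synthetic, KDD99-style numeric connection features; the data, feature dimensionality, and class sizes are placeholders, not the paper's datasets.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X_normal = rng.normal(0.0, 1.0, size=(500, 10))   # e.g. duration, bytes, counts...
X_attack = rng.normal(2.5, 1.5, size=(100, 10))   # anomalous connections
X = np.vstack([X_normal, X_attack])
y = np.array([0] * 500 + [1] * 100)               # 0 = normal, 1 = anomalous

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
detector = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
detector.fit(X_tr, y_tr)
print(classification_report(y_te, detector.predict(X_te)))
```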
Abstract: Growth and remodeling of biological structures have attracted considerable attention over the past decades. Determining the response of living tissues to mechanical loads is necessary for a wide range of developing fields such as prosthetics design or computer-assisted surgical interventions. It is a well-known fact that biological structures are never stress-free, even when externally unloaded. The exact origin of these residual stresses is not clear, but theoretically, growth is one of the main sources. Extracting the shapes of body organs from medical imaging does not produce any information regarding the residual stresses existing in those organs. The simplest cause of such stresses is gravity, since an organ grows under its influence from birth. Ignoring such residual stresses might cause erroneous results in numerical simulations. Accounting for residual stresses due to tissue growth can improve the accuracy of mechanical analysis results. This paper presents an original computational framework based on gradual growth to determine the residual stresses due to growth. To illustrate the method, we apply it to a finite element model of a healthy human face reconstructed from medical images. The distribution of residual stress in facial tissues is computed, which can overcome the effect of gravity and maintain tissue firmness. Our assumption is that tissue wrinkles caused by aging could be a consequence of decreasing residual stress, which then no longer counteracts gravity. Taking these stresses into account therefore seems extremely important in maxillofacial surgery. It would indeed help surgeons to estimate tissue changes after surgery.
Abstract: Open jet testing is a valuable testing technique which provides the desired results with reasonable accuracy. It has been used in the past for airships and has recently been applied to hybrid ones, which derive more non-buoyant force from the wings, empennage, and fuselage. In the present review work, an effort has been made to review the challenges involved in open jet testing. In order to shed light on the application of this technique, the experimental results of two different configurations are presented. Although the aerodynamic results of such vehicles are unique to their own designs, they provide a starting point for planning any future testing. A few important testing areas which need more attention are also highlighted. Most hybrid buoyant aerial vehicles are unconventional in shape, and the experimental data generated are unique to each design.
Abstract: This paper describes a simple way to control the speed of a PMBLDC motor using a fuzzy logic control (FLC) method. In the conventional approach, the performance of the motor system is simulated and the speed is regulated using a PI controller. Such methods are used to improve the performance of PMSM drives, but under some operating conditions the dynamics of the system vary over time, with changes in the reference speed, parameter variations, and load disturbances. The simulation is carried out in MATLAB to obtain a reliable and flexible model. In order to highlight the effectiveness of the speed control method, the FLC method is used. The proposed method targets improved dynamic performance and avoids the variations of the motor drive. The drive has high accuracy and robust operation from near zero to high speed. The effectiveness and flexibility of the individual techniques of the speed control method are thoroughly discussed in terms of their merits and demerits and finally verified through simulation and experimental results for comparative analysis.
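As a hedged illustration of how such an FLC speed loop can be structured (not the paper's controller), the sketch below uses triangular membership functions on a normalized speed error and its change, a 3x3 rule table, and centroid defuzzification over output singletons; all membership limits, rules, and scalings are assumptions.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_speed_controller(error, d_error):
    """Mamdani-style fuzzy speed controller sketch: fuzzify error and its
    change into negative/zero/positive sets, fire a 3x3 rule table for the
    change in control effort, and defuzzify by the centroid of singletons."""
    e_sets  = {"N": tri(error, -2, -1, 0),   "Z": tri(error, -1, 0, 1),   "P": tri(error, 0, 1, 2)}
    de_sets = {"N": tri(d_error, -2, -1, 0), "Z": tri(d_error, -1, 0, 1), "P": tri(d_error, 0, 1, 2)}
    out = {"NB": -1.0, "NS": -0.5, "Z": 0.0, "PS": 0.5, "PB": 1.0}     # output singletons
    rules = {("N", "N"): "NB", ("N", "Z"): "NS", ("N", "P"): "Z",
             ("Z", "N"): "NS", ("Z", "Z"): "Z",  ("Z", "P"): "PS",
             ("P", "N"): "Z",  ("P", "Z"): "PS", ("P", "P"): "PB"}
    num = den = 0.0
    for (e_lab, de_lab), o_lab in rules.items():
        w = min(e_sets[e_lab], de_sets[de_lab])        # rule firing strength (min t-norm)
        num += w * out[o_lab]
        den += w
    return num / den if den > 0 else 0.0

# usage: normalized error = (reference speed - measured speed), d_error = error - previous error
print(fuzzy_speed_controller(0.8, -0.2))
```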
Abstract: This paper discusses the applicability of a numerical model for predicting the damage caused by an accidental hydrogen explosion occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, where a G-equation model and a conserved scalar model express the propagation of the premixed flame surface and the diffusion combustion process, respectively. For validation of this numerical model, we simulated two previous hydrogen explosion tests. The first is an open-space explosion test, in which the source was a prismatic 5.27 m³ volume containing a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m away from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m² on one side. The test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18 vol.%. The results from the numerical simulations are compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data are in good agreement with the results of the two previous explosion tests.
Abstract: Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a data fusion approach on multimedia data for event detection on Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first consists of features extracted from text using the bag-of-words method, weighted by the term frequency-inverse document frequency (TF-IDF). The second consists of visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results show that the proposed method achieves a high accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
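For readers unfamiliar with the fusion step, the sketch below implements Dempster's rule of combination for two mass functions over a two-element frame ("event" / "no_event"); the mass values are toy numbers standing in for the text-based and image-based classifier outputs.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to conflicting evidence
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# toy usage: the text-based and image-based sources each provide a mass function
FRAME = frozenset({"event", "no_event"})
m_text  = {frozenset({"event"}): 0.7, frozenset({"no_event"}): 0.2, FRAME: 0.1}
m_image = {frozenset({"event"}): 0.6, frozenset({"no_event"}): 0.3, FRAME: 0.1}
print(dempster_combine(m_text, m_image))
```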
Abstract: This paper outlines the development of an experimental technique for quantifying supersonic jet flows, in an attempt to avoid the seeding-particle problems frequently associated with particle-image velocimetry (PIV) techniques at high Mach numbers. Based on optical flow algorithms, the idea behind the technique involves using high-speed cameras to capture Schlieren images of the supersonic jet shear layers, before they are subjected to an adapted optical flow algorithm based on the Horn-Schunck method to determine the associated flow fields. The proposed method is capable of offering full-field unsteady flow information with potentially higher accuracy and resolution than existing point measurements or PIV techniques. A preliminary study via numerical simulations of a circular de Laval nozzle jet successfully reveals flow and shock structures typically associated with supersonic jet flows, which serve as useful data for subsequent validation of the optical flow based experimental results. For the experimental technique, a Z-type Schlieren setup is proposed with the supersonic jet operated in cold mode at a stagnation pressure of 4 bar and an exit Mach number of 1.5. High-speed single-frame or double-frame cameras are used to capture successive Schlieren images. As the application of optical flow techniques to supersonic flows remains rare, the current focus revolves around methodology validation through synthetic images. The results of the validation tests offer valuable insight into how the optical flow algorithm can be further refined to improve robustness and accuracy. Despite these challenges, this supersonic flow measurement technique may potentially offer a simpler way to identify and quantify the fine spatial structures within the shock shear layer.
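For reference, a compact sketch of the classical Horn-Schunck iteration on two grayscale frames (e.g., successive Schlieren images); the derivative stencils and smoothness weight alpha are standard textbook choices, not the paper's adapted algorithm.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow between two grayscale frames."""
    im1, im2 = im1.astype(float), im2.astype(float)
    # spatial and temporal derivatives via simple averaging stencils
    kx = np.array([[-0.25, 0.25], [-0.25, 0.25]])
    ky = np.array([[-0.25, -0.25], [0.25, 0.25]])
    kt = np.full((2, 2), 0.25)
    Ix = convolve(im1, kx) + convolve(im2, kx)
    Iy = convolve(im1, ky) + convolve(im2, ky)
    It = convolve(im2, kt) - convolve(im1, kt)
    # neighbourhood averaging kernel from the original formulation
    avg = np.array([[1/12, 1/6, 1/12], [1/6, 0, 1/6], [1/12, 1/6, 1/12]])
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar, v_bar = convolve(u, avg), convolve(v, avg)
        der = (Ix * u_bar + Iy * v_bar + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u, v = u_bar - Ix * der, v_bar - Iy * der
    return u, v   # horizontal and vertical flow fields
```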
Abstract: The myoelectric control system is the fundamental component of modern prostheses, using the myoelectric signals from an individual's muscles to control the prosthesis movements. The surface electromyogram (sEMG) signal, being noninvasive, has been used as an input to prosthesis controllers for many years. Recent technological advances have led to the development of implantable myoelectric sensors, which enable the internal myoelectric signal (MES) to be used as an input to these prosthesis controllers. Intramuscular measurement can provide focal recordings from deep muscles of the forearm and independent signals relatively free of crosstalk, thus allowing for more independent control sites. However, little work has been done to compare the two inputs. In this paper we compare the classification accuracy of six pattern-recognition-based myoelectric controllers which use surface myoelectric signals recorded with untargeted (symmetric) surface electrode arrays to the same controllers with multichannel intramuscular myoelectric signals from targeted intramuscular electrodes as inputs. There was no significant enhancement in classification accuracy as a result of using the intramuscular EMG measurement technique compared to the results acquired using the surface EMG measurement technique. Impressive classification accuracy (99%) could be achieved by optimally selecting only five channels of surface EMG.
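A hedged sketch of a typical pattern-recognition pipeline of this kind: Hudgins-style time-domain features per EMG channel followed by an LDA classifier; this is one common choice of features and classifier, not necessarily the six controllers compared in the paper, and the window arrays and labels are assumed to be provided.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window, thresh=0.01):
    """Hudgins-style time-domain features for one EMG channel window."""
    x = np.asarray(window, float)
    dx = np.diff(x)
    mav = np.mean(np.abs(x))                                    # mean absolute value
    wl = np.sum(np.abs(dx))                                     # waveform length
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(dx) > thresh))   # zero crossings
    ssc = np.sum((dx[:-1] * dx[1:] < 0) &
                 ((np.abs(dx[:-1]) > thresh) | (np.abs(dx[1:]) > thresh)))  # slope sign changes
    return [mav, wl, zc, ssc]

def feature_matrix(windows):
    """windows: (n_windows, n_channels, n_samples) EMG segments -> feature matrix."""
    return np.array([[f for ch in w for f in td_features(ch)] for w in windows])

# usage sketch: train/test windows and motion-class labels are assumed given
# clf = LinearDiscriminantAnalysis().fit(feature_matrix(train_windows), train_labels)
# accuracy = clf.score(feature_matrix(test_windows), test_labels)
```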
Abstract: In this paper, an approach for liver tumor detection in computed tomography (CT) images is presented. The detection process is based on classifying the features of target liver cells as either tumor or non-tumor. Fractional differential (FD) operators are applied for the enhancement of liver CT images, with the aim of enhancing texture and edge features. A fusion method is then applied to merge the various enhanced images and produce improved features, which increases the accuracy of classification. Each image is divided into N×N non-overlapping blocks from which the desired features are extracted. A support vector machine (SVM) classifier is then trained on a supplied dataset different from the tested one. Finally, each block is identified as tumor or non-tumor. Our approach is validated on a group of patients' CT liver tumor datasets. The experimental results demonstrate the detection efficiency of the proposed technique.
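A hedged sketch of the block-wise classification stage, assuming a 2-D CT slice as a NumPy array; the simple intensity and gradient statistics below are placeholders for the paper's FD-enhanced and fused features, and the labeled training blocks are assumed to come from a separate dataset.

```python
import numpy as np
from sklearn.svm import SVC

def block_features(image, n=16):
    """Split a 2-D CT slice into non-overlapping n x n blocks and compute
    simple per-block statistics (placeholders for the fused texture/edge
    features described above)."""
    h, w = image.shape
    feats, positions = [], []
    for r in range(0, h - n + 1, n):
        for c in range(0, w - n + 1, n):
            blk = image[r:r + n, c:c + n].astype(float)
            feats.append([blk.mean(), blk.std(),
                          np.abs(np.diff(blk, axis=0)).mean(),   # vertical gradient energy
                          np.abs(np.diff(blk, axis=1)).mean()])  # horizontal gradient energy
            positions.append((r, c))
    return np.array(feats), positions

# usage sketch: labeled training blocks from a separate dataset are assumed given
# clf = SVC(kernel="rbf").fit(train_block_features, train_block_labels)
# feats, positions = block_features(test_ct_slice)
# tumor_blocks = [p for p, y in zip(positions, clf.predict(feats)) if y == 1]
```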
Abstract: Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart to diagnose disease. The Active Shape Model (ASM) is widely used for LV segmentation, but it suffers from the drawback that the initialization of the shape model is not sufficiently close to the target, especially when dealing with the abnormal shapes found in disease. In this work, a two-step framework is improved to achieve fast and efficient LV segmentation. First, a robust and efficient detector based on a Hough forest localizes cardiac feature points. These feature points are used to predict the initial fitting of the LV shape model. Second, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. With the robust initialization, ASM is able to achieve more accurate segmentation. The performance of the proposed method is evaluated on a dataset of 810 cardiac ultrasound images, most of which show abnormal shapes. The proposed method is compared with several combinations of ASM and existing initialization methods. Our experimental results demonstrate that the accuracy of the proposed method for feature-point detection during initialization is 40% higher than that of existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops and thus speeds up the whole segmentation process. Therefore, the proposed method achieves more accurate and efficient segmentation results and is applicable to unusual heart shapes associated with cardiac diseases, such as left atrial enlargement.
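As an illustration of how detected feature points can initialize a shape model, the sketch below fits a similarity transform (Procrustes/Umeyama style) from the mean shape's landmarks to the detected points and applies it to the whole mean shape; the point-correspondence convention is an assumption for illustration, not the paper's exact procedure.

```python
import numpy as np

def align_mean_shape(mean_shape, detected_points):
    """Fit a similarity transform (scale, rotation, translation) mapping the
    mean LV shape's corresponding landmarks onto the detected feature points,
    then apply it to the whole mean shape to obtain the initial ASM contour.

    mean_shape:      (N, 2) mean shape points
    detected_points: (K, 2) detected feature points; the first K points of
                     mean_shape are assumed to correspond to them
    """
    src = mean_shape[:len(detected_points)]
    src_c, dst_c = src.mean(axis=0), detected_points.mean(axis=0)
    src0, dst0 = src - src_c, detected_points - dst_c
    # optimal rotation via SVD of the cross-covariance matrix
    U, S, Vt = np.linalg.svd(dst0.T @ src0)
    R = U @ Vt
    if np.linalg.det(R) < 0:            # avoid reflections
        U[:, -1] *= -1
        R = U @ Vt
    scale = S.sum() / (src0 ** 2).sum()
    return scale * (mean_shape - src_c) @ R.T + dst_c
```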
Abstract: Background modeling and subtraction in video analysis have been widely used as an effective method for moving-object detection in many computer vision applications. Recently, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are the most frequently occurring problems in practical situations. This paper presents a favorable two-layer model based on the codebook algorithm incorporating the local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is a block-based codebook combining the LBP histogram with the mean value of each RGB color channel. Because of the invariance of the LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to refine the output of the first layer and further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under dynamic backgrounds and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
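A minimal sketch of the block-wise LBP descriptor used in the first layer: basic 8-neighbour LBP codes and their 256-bin histogram for one grayscale block; the codebook construction and matching themselves are not reproduced here.

```python
import numpy as np

def lbp_histogram(block):
    """Basic 8-neighbour LBP codes and their normalized 256-bin histogram
    for one grayscale block (the block-wise texture descriptor)."""
    b = block.astype(float)
    center = b[1:-1, 1:-1]
    neighbours = [b[:-2, :-2], b[:-2, 1:-1], b[:-2, 2:], b[1:-1, 2:],
                  b[2:, 2:],   b[2:, 1:-1],  b[2:, :-2], b[1:-1, :-2]]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        # comparisons against the center are invariant to monotonic gray-scale changes
        codes |= ((n >= center).astype(np.uint8) << bit)
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()
```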