Abstract: Brain-Computer Interfaces (BCIs) have recently attracted increasing research attention. Functional Near-Infrared Spectroscopy (fNIRS) is one of the latest technologies that utilize light in the near-infrared range to determine brain activity. Because near-infrared technology allows the design of monitoring systems that are safe, portable, wearable, non-invasive and wireless, fNIRS monitoring of brain hemodynamics can be of value in helping to understand brain tasks. In this paper, we present results of fNIRS signal analysis indicating that there exist distinct patterns of hemodynamic responses that can be used to recognize brain tasks, a step toward developing a BCI. We applied two different mathematical tools separately: wavelet analysis for preprocessing, as signal filters and feature extractors, and neural networks as a classification module for recognizing brain tasks. We also compare our approach with other methods; it performs better, with an average classification accuracy of 99.9%.
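The wavelet preprocessing stage can be illustrated with a minimal sketch: one level of a Haar wavelet decomposition, which splits a signal into approximation (low-pass) and detail (high-pass) coefficients. This is an illustrative stand-in under simple assumptions, not the authors' exact filter bank or wavelet family.

```python
import math

def haar_step(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficient lists; the approximation
    captures slow hemodynamic trends, while the detail captures
    high-frequency content that can be filtered out as noise.
    """
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Example: pairs of equal samples yield zero detail coefficients.
a, d = haar_step([1.0, 1.0, 2.0, 2.0])
```

The approximation coefficients (or statistics computed from them) would then serve as the feature vector fed to the neural network classifier.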
Abstract: The simulation of the extrusion process is widely studied in order to both increase output and improve quality, with broad application in wire coating. Annular tube-tooling extrusion was modelled by the Navier-Stokes equations together with a rheological model of differential form, based on the single-mode exponential Phan-Thien/Tanner constitutive equation, in a two-dimensional cylindrical coordinate system for predicting the contraction point of the polymer melt beyond the die. Numerical solutions are sought through a semi-implicit Taylor-Galerkin pressure-correction finite element scheme. The investigation focused on incompressible creeping flow with long relaxation times, with Weissenberg numbers up to 200. The isothermal case was considered, with surface tension effects on the free surface in extrudate flow and no slip at the die wall. The Streamline-Upwind Petrov-Galerkin (SUPG) method has been proposed to stabilize the solution. The structure of the mesh after the die exit was adjusted following the prediction of both the top and bottom free surfaces so as to keep the location of the contraction point around one
unit length, which is close to experimental results.
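For reference, the single-mode exponential Phan-Thien/Tanner model mentioned above is commonly written as follows, where $\tau$ is the extra-stress tensor, $\overset{\nabla}{\tau}$ its upper-convected derivative, $\lambda$ the relaxation time, $\eta_p$ the polymeric viscosity, $D$ the rate-of-deformation tensor, and $\epsilon$ the PTT parameter; the paper's exact notation may differ:

```latex
\exp\!\left(\frac{\epsilon \lambda}{\eta_p}\,\operatorname{tr}(\tau)\right)\tau
  + \lambda \overset{\nabla}{\tau} = 2\,\eta_p D,
\qquad
We = \frac{\lambda U}{L}
```

The second relation is the standard definition of the Weissenberg number from a characteristic velocity $U$ and length $L$.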
Abstract: This paper describes a new method for extracting the fetal heart rate (fHR) and the fetal heart rate variability (fHRV) signal non-invasively using abdominal maternal electrocardiogram (mECG) recordings. The extraction is based on the fundamental-frequency (Fourier) theorem. The fundamental frequency of the mother's electrocardiogram signal (fo-m) is calculated directly from the abdominal signal. The heart rate of the fetus is usually higher than that of the mother; as a result, the fundamental frequency of the fetal electrocardiogram signal (fo-f) is higher than that of the mother's (fo-f > fo-m). Notch filters to suppress the mother's higher harmonics were designed; then a bandpass filter to target fo-f and reject fo-m was implemented. Although the bandpass filter will pass some other frequencies (harmonics), we show in this study that those harmonics are actually carried on fo-f, and thus have no impact on the evaluation of the beat-to-beat changes (RR intervals). The oscillations of the time-domain extracted signal represent the RR intervals. We also show that zero-to-zero evaluation of the periods is more accurate than peak-to-peak evaluation. This method is evaluated both on simulated signals and on different abdominal recordings obtained at different gestational ages.
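A minimal sketch of the harmonic-suppression step described above: a second-order (biquad) notch filter tuned to a given frequency, using the standard audio-EQ cookbook coefficient formulas. The sampling rate, notch frequency and Q factor below are illustrative assumptions, not the paper's values.

```python
import math

def notch_filter(x, fs, f0, q):
    """Apply a biquad notch filter at frequency f0 (Hz) to signal x."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    b = [1.0, -2.0 * math.cos(w0), 1.0]
    a = [1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha]
    # Normalize by a0 and run the direct-form I difference equation.
    b = [c / a[0] for c in b]
    a = [c / a[0] for c in a]
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        y.append(yn)
        x1, x2, y1, y2 = xn, x1, yn, y1
    return y

# A 50 Hz tone sampled at 1 kHz is strongly attenuated by a 50 Hz notch.
fs = 1000.0
t = [n / fs for n in range(2000)]
tone_50 = [math.sin(2 * math.pi * 50 * ti) for ti in t]
filtered = notch_filter(tone_50, fs, f0=50.0, q=5.0)
```

In the paper's scheme, a bank of such notches placed at fo-m and its harmonics would suppress the maternal component before the fo-f bandpass stage.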
Abstract: This paper presents the automated methods employed
for extracting craniofacial landmarks in white light images as part of
a registration framework designed to support three neurosurgical
procedures. The intraoperative space is characterised by white light
stereo imaging while the preoperative plan is performed on CT scans.
The registration aims at aligning these two modalities to provide a
calibrated environment to enable image-guided solutions. The
neurosurgical procedures can then be carried out by mapping the
entry and target points from CT space onto the patient's space. The
registration basis adopted consists of natural landmarks (eye corner
and ear tragus). A 5 mm accuracy is deemed sufficient for these three
procedures and the validity of the selected registration basis in
achieving this accuracy has been assessed by simulation studies. The
registration protocol is briefly described, followed by a presentation
of the automated techniques developed for the extraction of the
craniofacial features and results obtained from tests on the AR and
FERET databases. Since the three targeted neurosurgical procedures
are routinely used for head injury management, the effect of
bruised/swollen faces on the automated algorithms is assessed. A
user-interactive method is proposed to deal with such unpredictable
circumstances.
Abstract: This paper proposes neural network weight and topology optimization using genetic evolution and the backpropagation training algorithm. The proposed crossover and mutation operators aim to adapt the network architectures and weights during the evolution process. Through a specific inheritance procedure, the weights are transmitted from the parents to their offspring, which allows re-exploitation of the already trained networks and hence acceleration of the global convergence of the algorithm. In the preprocessing phase, a new feature extraction method is proposed based on Legendre moments, with the maximum entropy principle (MEP) as a selection criterion. This allows a global search-space reduction in the design of the networks. The proposed method has been applied and tested on the well-known MNIST database of handwritten digits.
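The weight-inheritance idea can be sketched as follows: offspring receive contiguous segments of their parents' trained weight vectors, so retraining does not restart from scratch. The flat-vector representation and fixed crossover point are illustrative assumptions, not the paper's exact operators.

```python
def weight_crossover(parent_a, parent_b, point):
    """One-point crossover on flattened network weight vectors.

    Each child inherits already-trained weights from both parents,
    which lets backpropagation resume from a useful starting point.
    """
    child_1 = parent_a[:point] + parent_b[point:]
    child_2 = parent_b[:point] + parent_a[point:]
    return child_1, child_2

c1, c2 = weight_crossover([0.1, 0.2, 0.3, 0.4], [1.0, 2.0, 3.0, 4.0], 2)
```

In practice the crossover point would be chosen so that whole layers or neurons are exchanged, keeping the inherited substructures meaningful.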
Abstract: To produce sugar and ethanol, sugarcane processing generates several agricultural residues, straw and bagasse being the main ones. What to do with these residues has been the subject of many studies and experiments in an industry that, in recent years, has stood out for its ability to transform waste into valuable products such as electric power. Cellulose is the main component of these materials. It is the most common organic polymer, represents about 1.5 × 10^12 tons of the total annual production of biomass, and is considered an almost inexhaustible source of raw material. Pretreatment with mineral acids is one of the most widely used stages of cellulose extraction from lignocellulosic materials, serving to solubilize most of the hemicellulose content. The goal of this study was to find the best reaction time for sugarcane bagasse pretreatment with sulfuric acid, in order to minimize the loss of cellulose while removing as much hemicellulose and lignin as possible. It was found that the best time for this reaction was 40 minutes, at which a hemicellulose loss of around 70% was reached, with lignin and cellulose losses of around 15%. Beyond this time, the cellulose loss increased and there was no further loss of lignin or hemicellulose.
Abstract: The objective of this study is to investigate the effect of adding coal in order to obtain an insulating ceramic product. Mixtures were prepared with four different mass compositions, consisting of grey clay, yellow clay, and coal. Analyses were performed on local raw materials with coal added as an additive. The coal content varies from 5 to 20% by weight, with coal particle sizes ranging from 0.25 mm to 1.60 mm.
Initially, the natural moisture content of each raw material was determined at a temperature of 105 °C in a laboratory oven. The influence of low coal content on water absorption, apparent density, shrinkage and compressive strength was evaluated. The experimental results showed that an optimized composition could be obtained by adding 10% coal by weight, leading to insulating ceramic products with water absorption, density and compressive strength of 9.40%, 1.88 g/cm3 and 35.46 MPa, respectively. The results show that coal, when mixed with traditional raw materials, offers the conditions to be used as an additive in the production of lightweight ceramic products.
Abstract: In this paper, a new face recognition method based on PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis) and neural networks is proposed. The method consists of four steps: i) preprocessing, ii) dimension reduction using PCA, iii) feature extraction using LDA and iv) classification using a neural network. The combination of PCA and LDA is used to improve the capability of LDA when only a few sample images are available, and the neural classifier is used to reduce the number of misclassifications caused by non-linearly separable classes. The proposed method was tested on the Yale face database. Experimental results on this database demonstrate the effectiveness of the proposed method for face recognition, with fewer misclassifications than previous methods.
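The PCA dimension-reduction step can be sketched in miniature: find the dominant principal axis of centred data via power iteration on the covariance matrix. This toy two-dimensional version is a stand-in for illustration; real face images would use a full eigendecomposition over many dimensions.

```python
def principal_axis(points, iters=100):
    """Dominant principal component of 2-D data via power iteration."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    # Entries of the 2x2 covariance matrix.
    cxx = sum(x * x for x, _ in centered) / n
    cxy = sum(x * y for x, y in centered) / n
    cyy = sum(y * y for _, y in centered) / n
    v = (1.0, 0.0)
    for _ in range(iters):
        # Multiply by the covariance matrix and renormalize.
        w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v

# Data spread mostly along the (1, 1) direction.
axis = principal_axis([(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.1)])
```

Projecting the images onto the leading axes found this way yields the low-dimensional representation that LDA then refines into discriminant features.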
Abstract: Extraction of Fe(III) from aqueous solution using tri-n-butyl phosphate (TBP) as carrier needs a highly acidic medium (>6 N), as this favours formation of the chelating complex FeCl3·TBP. Similarly, stripping of iron(III) from loaded organic solvents requires a neutral pH or alkaline medium to dissociate the same complex. It is observed that TBP co-extracts acids along with the metal, which causes reversal of the driving force of extraction, and iron(III) is re-extracted back from the strip phase into the feed phase during Liquid Emulsion Membrane (LEM) pertraction. Therefore, the rate of extraction of different mineral acids (HCl, HNO3, H2SO4) using TBP, with and without the presence of the metal Fe(III), was examined. It is revealed that in the presence of the metal, acid extraction is enhanced. Determination of the mass transfer coefficients of both acid and metal extraction was performed using a Bulk Liquid Membrane (BLM). The average mass transfer coefficient was obtained by fitting the derived model equation to the experimentally obtained data. The mass transfer coefficients of the mineral acid extraction are in the order k_HNO3 = 3.3 × 10^-6 m/s > k_HCl = 6.05 × 10^-7 m/s > k_H2SO4 = 1.85 × 10^-7 m/s. The distribution equilibria of the above-mentioned acids between aqueous feed solution and a solution of TBP in organic solvents have been investigated. The stoichiometry of acid extraction reveals the formation of TBP·2HCl, HNO3·2TBP, and TBP·H2SO4 complexes. Moreover, extraction of iron(III) by TBP in HCl aqueous solution forms the complex FeCl3·TBP·2HCl, while in HNO3 medium it forms the complex 3FeCl3·TBP·2HNO3.
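The way a rate constant is recovered by fitting a model equation to concentration data can be sketched as a log-linear least-squares fit, assuming a simple first-order decay C(t) = C0·exp(−k t). Both this model form and the synthetic data below are illustrative, not the paper's derived BLM equation.

```python
import math

def fit_rate_constant(times, concentrations):
    """Least-squares slope of ln C versus t gives -k for C = C0 exp(-k t)."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    num = sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
    den = sum((t - t_mean) ** 2 for t in times)
    return -num / den

# Synthetic decay with k = 0.5 (arbitrary units).
ts = [0.0, 1.0, 2.0, 3.0, 4.0]
cs = [2.0 * math.exp(-0.5 * t) for t in ts]
k = fit_rate_constant(ts, cs)
```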
Abstract: Massive use of places with strong tourist appeal, with the consequent possibility of losing place identity, produces harmful effects on cities and their users. In order to mitigate this risk, areas close to such places can be identified so as to widen the visitor's range of action and offer alternative activities integrated with the main site. The cultural places and appropriate activities can be identified using a method of analysis and design able to trace the identity of the places, their characteristics and potential, and to provide a sustainable improvement. The aim of this work is to propose PlaceMaker as a method of urban analysis and design which both detects elements that do not feature in traditional mapping and which constitute the contemporary identity of the places, and identifies appropriate project interventions. Two final complex maps, the first of analysis and the second of design, respectively represent the identity of the places and the project interventions. In order to illustrate the method's potential, the results of the experimentation carried out on the Trevi-Pantheon route in Rome, and the appropriate interventions to decongest the area, are illustrated.
Abstract: In this paper, three basic approaches, and different methods under each of them, for extracting a region of interest (ROI) from stationary images are explored. The results obtained for each of the proposed methods are shown, and it is demonstrated where each method outperforms the others. Two main problems in ROI extraction, the channel selection problem and the saliency reversal problem, are discussed, along with how best these two are addressed by the various methods. The basic approaches are: 1) the saliency-based approach, 2) the wavelet-based approach, and 3) the clustering-based approach. The saliency approach performs well on images containing objects of high saturation and brightness. The wavelet-based approach performs well on natural scene images that contain regions of distinct textures. The mean shift clustering approach partitions the image into regions according to the density distribution of pixel intensities. The experimental results of the various methodologies show that each technique performs at a different acceptable level for various types of images.
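The mean-shift idea mentioned above can be sketched in one dimension with a flat kernel: each iterate moves to the mean of the samples within a fixed bandwidth until it settles on a local density mode. The bandwidth and data here are illustrative; image segmentation would run this jointly over pixel coordinates and intensities.

```python
def mean_shift_mode(x, data, bandwidth, iters=50):
    """Shift x to the mean of its neighbours until it reaches a mode."""
    for _ in range(iters):
        neighbours = [d for d in data if abs(d - x) <= bandwidth]
        new_x = sum(neighbours) / len(neighbours)
        if abs(new_x - x) < 1e-12:
            break  # converged to a local density mode
        x = new_x
    return x

# Two clusters of intensities; each seed converges to its cluster's mode.
samples = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2]
mode_low = mean_shift_mode(1.0, samples, bandwidth=1.0)
mode_high = mean_shift_mode(5.0, samples, bandwidth=1.0)
```

Pixels whose seeds converge to the same mode are grouped into one region, which is how the clustering approach partitions the image.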
Abstract: Coal tar is a liquid by-product of coal gasification and carbonation. This liquid oil mixture contains various useful compounds such as phenol, o-cresol, and p-cresol. These compounds are widely used as raw materials for insecticides, dyes, medicines, perfumes, colouring matters, and many others.
This research was carried out to determine the optimum conditions for the separation of phenol, o-cresol, and p-cresol from coal tar by solvent extraction. The aim of the present work was to study the effect of two kinds of aqueous solvents, methanol and acetone solutions, as well as the effects of temperature (298, 306, and 313 K) and mixing speed (30, 35, and 40 rpm), on the separation of phenol, o-cresol, and p-cresol from coal tar by solvent extraction.
Results indicated that phenol, o-cresol, and p-cresol in coal tar were selectively extracted into the solvent phase and that these components could be separated by solvent extraction. An aqueous solution of methanol, a solvent-to-feed mass ratio Eo/Ro = 1, an extraction temperature of 306 K and mixing at 35 rpm were the most efficient conditions for extraction of phenol, o-cresol, and p-cresol from coal tar.
Abstract: In this paper, we propose a method for the recognition of adult video based on the support vector machine (SVM). Different kernel features are proposed to classify adult videos. SVM has the advantage that it is insensitive to the relative numbers of training examples in the positive (adult video) and negative (non-adult video) classes. This advantage is illustrated by comparing the performance of different SVM kernels in the identification of adult video.
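The kernels being compared can be written out directly; below are the standard linear and RBF (Gaussian) kernels as generic examples. The paper does not specify its kernel features here, so these functions and the gamma value are assumptions for illustration only.

```python
import math

def linear_kernel(x, y):
    """k(x, y) = <x, y>, the ordinary dot product."""
    return sum(a * b for a, b in zip(x, y))

def rbf_kernel(x, y, gamma=0.5):
    """k(x, y) = exp(-gamma * ||x - y||^2), the Gaussian kernel."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

k_lin = linear_kernel([1.0, 2.0], [3.0, 4.0])   # 11.0
k_rbf = rbf_kernel([0.0, 0.0], [0.0, 0.0])      # 1.0 (identical inputs)
```

Swapping the kernel changes the implicit feature space the SVM separates in, which is what a kernel comparison of this kind evaluates.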
Abstract: Classification is an interesting problem in functional data analysis (FDA), because many science and application problems end up as classification problems, such as recognition, prediction, control, decision making, management, etc. Owing to the high dimension and high correlation of functional data (FD), a key problem is to extract features from FD while keeping its global characteristics, which strongly affects classification efficiency and precision. In this paper, a novel automatic method which combines a Genetic Algorithm (GA) with a classification algorithm to extract classification features is proposed. In this method, the optimal features and classification model are approached step by step through evolutionary search. Theoretical analysis and experimental tests show that this method has advantages in improving classification efficiency, precision and robustness while using fewer features, and that the dimension of the extracted classification features can be controlled.
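A GA-driven feature search of this kind can be sketched with binary masks over candidate features and a fitness function standing in for classification performance. Everything below (the toy fitness, population size, operators) is a hypothetical minimal example, not the paper's algorithm.

```python
import random

def ga_select_features(fitness, n_bits, pop_size=30, gens=60, seed=0):
    """Minimal elitist GA over binary feature masks (illustrative only)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]  # elitism: keep the best mask unchanged
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(n_bits)] ^= 1           # point mutation
            nxt.append(child)
        pop = nxt
        best = max(pop, key=fitness)
    return best

# Toy fitness: reward three "informative" features, penalise the rest.
informative = {0, 4, 7}
def fitness(mask):
    return sum(mask[i] for i in informative) - 0.5 * sum(
        mask[i] for i in range(len(mask)) if i not in informative)

best_mask = ga_select_features(fitness, n_bits=10)
```

In the paper's setting the fitness would be the accuracy of the classifier trained on the selected features, and the mask length caps the dimension of the extracted feature set.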
Abstract: Frequency-domain independent component analysis has a scaling indeterminacy and a permutation problem. The scaling indeterminacy can be solved by use of a decomposed spectrum. For the permutation problem, we have proposed rules in terms of the gain ratio and phase difference derived from the decomposed spectra and the sources' coarse directions.
The present paper experimentally clarifies that the gain ratio and the phase difference work effectively in a real environment, but that their performance depends on the frequency bands, the microphone spacing and the source-microphone distance. From these facts it is seen that it is difficult to attain a perfect solution to the permutation problem in a real environment by either the gain ratio or the phase difference alone. This paper therefore gives a solution to these problems in a real environment. The proposed method is simple and requires little computation, and it has high correction performance that does not depend on the frequency bands or the distances from the source signals to the microphones. Furthermore, it can be applied in a real environment. Several experiments in a real room verify the proposed method.
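The two permutation cues named above can be computed directly from a pair of complex spectral coefficients: the gain ratio is their magnitude ratio, and the phase difference is the argument of one coefficient times the conjugate of the other. A minimal sketch using Python's cmath (the coefficients below are illustrative):

```python
import cmath

def gain_ratio(a1, a2):
    """Magnitude ratio between two mixing coefficients at one frequency bin."""
    return abs(a1) / abs(a2)

def phase_difference(a1, a2):
    """Phase of a1 relative to a2, in radians, in (-pi, pi]."""
    return cmath.phase(a1 * a2.conjugate())

a1 = 2.0 * cmath.exp(1j * cmath.pi / 4)   # magnitude 2, phase pi/4
a2 = 1.0 + 0.0j                           # magnitude 1, phase 0
r = gain_ratio(a1, a2)
dphi = phase_difference(a1, a2)
```

Consistency of these two quantities across frequency bins is what the alignment rules exploit to resolve the permutation.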
Abstract: In this paper we focus on event extraction from Tamil news articles. The system utilizes a scoring scheme for extracting and grouping event-specific sentences. Using this scoring scheme, event-specific clustering is performed over multiple documents. Events are extracted from each document using a scoring scheme based on a feature score and a condition score. Similarly, event-specific sentences are clustered from multiple documents using this scoring scheme. The proposed system builds an event template based on a user-specified query. The templates are filled with event-specific details such as person, location and timeline extracted from the formed clusters. The proposed system applies these methodologies to Tamil news articles that have been converted into UNL graphs using a Tamil-to-UNL enconverter. The main intention of this work is to generate an event-based template.
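The general shape of a sentence-scoring scheme can be sketched as a weighted sum over event-indicative terms. The paper's actual feature and condition scores are not specified here, so the terms, weights and scoring rule below are purely hypothetical placeholders.

```python
def score_sentence(sentence, feature_weights):
    """Toy event score: sum of weights of event-indicative terms present.

    A real system would combine a feature score with a condition score
    computed over the UNL graph; this stand-in only illustrates the idea
    of ranking sentences by a numeric event score.
    """
    words = sentence.lower().split()
    return sum(w for term, w in feature_weights.items() if term in words)

# Hypothetical weights for demonstration only.
weights = {"earthquake": 2.0, "killed": 1.5, "yesterday": 0.5}
s = score_sentence("An earthquake struck the city yesterday", weights)
```

Sentences scoring above a threshold would be kept and clustered by event, then mined for the person, location and timeline slots of the template.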
Abstract: This paper shows the advantages of simulating the material failure process with improved finite elements with embedded discontinuities, using a new definition of the traction vector that depends on the discontinuity length and angle. In particular, two families of this kind of element are compared: kinematically optimal symmetric, and statically and kinematically optimal non-symmetric. The constitutive model used to describe the behaviour of the material in the symmetric formulation is a traction-displacement-jump relationship equipped with softening after the failure surface is reached.
To show the validity of this symmetric formulation, representative numerical examples illustrating its performance are presented. It is shown that the non-symmetric family may over- or underestimate the energy required to create a discontinuity, since this effect is related to the total length of the discontinuity, a fact that is not noticed when the discontinuity path is a straight line.
Abstract: DNA microarray technology is widely used by geneticists to diagnose or treat diseases through gene expression. This technology is based on the hybridization of a tissue's DNA sequence onto a substrate and the subsequent analysis of the image formed by the thousands of genes in the DNA as green, red or yellow spots. The process of DNA microarray image analysis involves finding the location of the spots and quantifying their expression levels. In this paper, a tool to perform DNA microarray image analysis is presented, including a spot addressing method based on image projections, spot segmentation through contour-based segmentation, and the extraction of relevant gene expression information.
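Projection-based spot addressing can be sketched as follows: summing the image along rows and along columns produces two 1-D intensity profiles whose peaks mark the grid lines of the spot array. The toy image below is an illustrative assumption, not the tool's actual pipeline.

```python
def projections(image):
    """Row and column intensity projections of a grey-level image."""
    rows = [sum(r) for r in image]
    cols = [sum(image[i][j] for i in range(len(image)))
            for j in range(len(image[0]))]
    return rows, cols

def peak_indices(profile):
    """Indices that are strict local maxima of a 1-D profile."""
    return [i for i in range(1, len(profile) - 1)
            if profile[i] > profile[i - 1] and profile[i] > profile[i + 1]]

# 7x7 toy image with bright spots centred at rows/columns 1 and 5.
img = [[0] * 7 for _ in range(7)]
for r in (1, 5):
    for c in (1, 5):
        img[r][c] = 9
rows, cols = projections(img)
grid_rows, grid_cols = peak_indices(rows), peak_indices(cols)
```

Intersections of the detected row and column peaks give candidate spot centres, which the contour-based segmentation stage then refines.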
Abstract: In this paper, we propose a new robust and secure system that is based on the combination of two different transforms, the Discrete Wavelet Transform (DWT) and the Contourlet Transform (CT). The combined transforms compensate for the drawbacks of using each transform separately. The proposed algorithm has been designed, implemented and tested successfully. The experimental results showed that selecting the best sub-band for embedding from both transforms improves the imperceptibility and robustness of the new combined algorithm. The evaluated imperceptibility of the combined DWT-CT algorithm gave a PSNR value of 88.11, and the combined DWT-CT algorithm improves robustness, since it proved more robust against Gaussian noise attacks. In addition, the implemented system showed a successful extraction method for extracting the watermark efficiently.
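The imperceptibility metric quoted above, PSNR, has a standard definition that can be sketched directly; the flattened toy images below are illustrative, and the 88.11 figure is the paper's own measurement, not reproduced here.

```python
import math

def psnr(original, distorted, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size images.

    Images are given as flat sequences of pixel values; a higher PSNR
    means the watermarked image is closer to the original.
    """
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_value ** 2 / mse)

# Every pixel off by 1 grey level: MSE = 1, PSNR = 20*log10(255) dB.
value = psnr([0.0, 0.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0])
```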
Abstract: This paper presents a comparison of two metaheuristic algorithms, the Genetic Algorithm (GA) and Ant Colony Optimization (ACO), in producing Freeman chain code (FCC). The main problem in representing characters using FCC is that the length of the FCC depends on the starting point. Isolated characters, especially upper-case characters, usually have branches that make the traversal process difficult. FCC construction using one continuous route has not been widely explored, which is our motivation for using population-based metaheuristics. The experimental results show that the route length obtained using GA is better than that using ACO; however, ACO has a better computation time than GA.
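For reference, the Freeman chain code itself encodes a pixel path as a sequence of eight direction codes (0 = east, numbered counter-clockwise). A minimal sketch of the encoding, independent of how GA or ACO chooses the route:

```python
# Standard 8-direction Freeman codes: 0 = east, counter-clockwise.
DIRECTIONS = {
    (1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
    (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7,
}

def freeman_chain_code(path):
    """Encode a pixel path (list of (x, y), unit steps) as direction codes."""
    return [DIRECTIONS[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(path, path[1:])]

# A unit square traversed counter-clockwise from the origin.
codes = freeman_chain_code([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)])
```

The metaheuristics compared in the paper search over the traversal order of the character's pixels; the resulting route is then encoded exactly as above, and shorter routes give shorter codes.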