Abstract: Segmentation and quantification of stenosis is an
important task in assessing coronary artery disease. One of the main
challenges is measuring the real diameter of curved vessels.
Moreover, uncertainty in segmentation of different tissues in the
narrow vessel is an important issue that affects accuracy. This paper
proposes an algorithm to extract coronary arteries and measure the
degree of stenosis. A Markovian fuzzy clustering method is applied to
model uncertainty arises from partial volume effect problem. The
algorithm employs: segmentation, centreline extraction, estimation of
orthogonal plane to centreline, measurement of the degree of
stenosis. To evaluate the accuracy and reproducibility, the approach
has been applied to a vascular phantom and the results are compared
with real diameter. The results of 10 patient datasets have been
visually judged by a qualified radiologist. The results reveal the
superiority of the proposed method compared to the Conventional
thresholding Method (CTM) on both datasets.
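The abstract does not state the stenosis formula; assuming the standard percent-diameter-stenosis definition (narrowing relative to a healthy reference diameter), the final measurement step can be sketched as:

```python
def stenosis_degree(d_ref, d_min):
    """Percent diameter stenosis: narrowing of the minimal lumen diameter
    d_min relative to a healthy reference diameter d_ref (assumed standard
    definition, not taken from the paper)."""
    return (1.0 - d_min / d_ref) * 100.0

# A lumen narrowed from 4.0 mm to 2.0 mm corresponds to 50% stenosis.
degree = stenosis_degree(4.0, 2.0)
```

Both diameters would come from the orthogonal cross-sections extracted along the centreline.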
Abstract: To extract the important physiological factors related to
diabetes from an oral glucose tolerance test (OGTT) by mathematical
modeling, highly informative but convenient protocols are required.
Current models require a large number of samples and an extended
period of testing, which is not practical for daily use. The purpose
of this study is to make model assessments possible even from a
reduced number of samples taken over a relatively short period.
For this purpose, test values were extrapolated using a support
vector machine. A good correlation was found between reference and
extrapolated values in the 741 OGTTs evaluated. This result indicates
that a reduction in the number of clinical tests is possible through a
computational approach.
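As a rough illustration of the extrapolation idea, a support vector regressor can be fit to a reduced sampling schedule and queried at the omitted time points. The glucose values, kernel, and parameters below are hypothetical; the paper's actual protocol and model settings are not given in the abstract.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical OGTT glucose values (mg/dL) at a reduced set of sample
# times (minutes); kernel and hyperparameters are illustrative only.
t_sampled = np.array([[0.0], [15.0], [30.0], [45.0], [60.0]])
glucose = np.array([90.0, 130.0, 160.0, 150.0, 140.0])

model = SVR(kernel="rbf", C=100.0, gamma=1e-3)
model.fit(t_sampled, glucose)

# Extrapolate the value a full-length protocol would have measured at 120 min.
g_120 = model.predict(np.array([[120.0]]))[0]
```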
Abstract: The explosive growth of the World Wide Web has posed
a challenging problem in extracting relevant data. Traditional web
crawlers focus only on the surface web, while the deep web keeps
expanding behind the scenes. Deep web pages are created
dynamically as a result of queries posed to specific web databases.
The structure of the deep web pages makes it impossible for
traditional web crawlers to access deep web contents. This paper
presents Deep iCrawl, a novel vision-based approach for extracting
data from the deep web. Deep iCrawl splits the process into two
phases. The first phase includes query analysis and query translation,
and the second covers vision-based extraction of data from the
dynamically created deep web pages. There are several established
approaches for the extraction of deep web pages but the proposed
method aims at overcoming the inherent limitations of the former.
This paper also aims at comparing the data items and presenting them
in the required order.
Abstract: Flood zoning studies have become more efficient in
recent years because of the availability of advanced computational
facilities and use of Geographic Information Systems (GIS). In the
present study, flood inundated areas were mapped using GIS for the
Dikrong river basin of Arunachal Pradesh, India, corresponding to
different return periods (2, 5, 25, 50, and 100 years). Further, the developed inundation maps corresponding to 25, 50, and 100 year return period floods were compared to corresponding maps
developed by conventional methods as reported in the Brahmaputra Board Master Plan for the Dikrong basin. It was found that the average
deviation of modelled flood inundation areas from reported map
inundation areas is below 5% (4.52%). Therefore, it can be said that
the modelled flood inundation areas matched satisfactorily with
reported map inundation areas. Hence, GIS techniques proved successful in extracting the flood inundation extent in a time- and cost-effective manner for the remotely located hilly basin of Dikrong, where conducting conventional surveys is very difficult.
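The reported deviation figure is, in principle, a simple average of per-map percentage deviations; a minimal sketch with hypothetical area values (the actual areas are not given in the abstract):

```python
def average_percent_deviation(modelled, reported):
    """Mean absolute deviation of modelled inundation areas from the
    reported map areas, expressed as a percentage of the reported areas."""
    devs = [abs(m - r) / r * 100.0 for m, r in zip(modelled, reported)]
    return sum(devs) / len(devs)

# Hypothetical areas (km^2) for the 25-, 50- and 100-year return periods.
dev = average_percent_deviation([96.0, 104.0, 119.0], [100.0, 100.0, 120.0])
```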
Abstract: In this study, the ability of Aspergillus niger and
Penicillium simplicissimum to extract heavy metals from a spent
refinery catalyst was investigated. For the first step, a spent
processing catalyst from one of the oil refineries in Iran was
physically and chemically characterized. Aspergillus niger and
Penicillium simplicissimum were used to mobilize Al/Co/Mo/Ni from
hazardous spent catalysts. The fungi were adapted to the mixture of
metals at 100-800 mg L-1 with increments in concentration of 100 mg
L-1. Bioleaching experiments were carried out in batch cultures. To
investigate the production of organic acids in sucrose medium,
analyses of the culture medium by HPLC were performed at specific
time intervals after inoculation. The results obtained from inductively
coupled plasma-optical emission spectrometry (ICP-OES) showed
that after the one-step bioleaching process using Aspergillus niger,
maximum removal efficiencies of 27%, 66%, 62% and 38% were
achieved for Al, Co, Mo and Ni, respectively. However, the highest
removal efficiencies using Penicillium simplicissimum were 32%,
67%, 65% and 38% for Al, Co, Mo and Ni, respectively.
Abstract: The main goal of this paper is to quantify the quality of
different radiation treatment planning techniques. A back-propagation
artificial neural network (ANN) combined with biomedicine theory
was used to model thirteen dosimetric parameters and to calculate
two dosimetric indices. The correlations between dosimetric indices
and quality of life were extracted as the features and used in the ANN
model to make decisions in the clinic. The simulation results show
that a trained multilayer back-propagation neural network model can
help a doctor accept or reject a plan efficiently. In addition, the
models are flexible and whenever a new treatment technique enters
the market, the feature variables simply need to be imported and the
model re-trained for it to be ready for use.
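A minimal sketch of the accept/reject model, with synthetic stand-ins for the thirteen dosimetric parameters and clinical labels; the paper's actual architecture, training data, and labels are not specified in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: 13 dosimetric parameters per plan and an
# accept/reject label; real features and labels come from the clinic.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 13))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# A small multilayer back-propagation network; re-training on new feature
# values is what makes the model adaptable to new treatment techniques.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
decisions = clf.predict(X[:5])  # 1 = accept, 0 = reject
```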
Abstract: Different biometric methods are presented for eigenface-based
face detection, including face recognition, identification and
verification. The theme of this research is to manage the critical
processing stages (accuracy, speed, security and monitoring) of face
analysis, with the flexibility to search and edit the secure authorized
database. In this paper we implement techniques such as eigenface
vector reduction, using texture and shape vectors to reduce
complexity, while a density matching score with Face Boundary Fixation
(FBF) extracts the most likely characteristics of the media content.
We examine the development and performance efficiency of the database
by applying our algorithms in both the recognition and detection
phases. Our results show encouraging gains in accuracy and security
compared with a number of previous approaches across all of the above
processes.
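The eigenface reduction step can be illustrated with a minimal PCA sketch on synthetic data; the paper's actual pipeline, image sizes, and number of retained components are not given in the abstract.

```python
import numpy as np

# Synthetic stand-in for a face database: 20 flattened 8x8 "images".
rng = np.random.default_rng(1)
faces = rng.random((20, 64))

# Eigenfaces are the principal components of the mean-centred face matrix.
mean_face = faces.mean(axis=0)
centred = faces - mean_face
_, _, vt = np.linalg.svd(centred, full_matrices=False)

# Keeping only the top components is the "eigenface vector reduction".
eigenfaces = vt[:10]
weights = centred @ eigenfaces.T  # each face as a compact 10-d feature vector
```

Recognition then compares these compact weight vectors instead of raw pixels.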
Abstract: A research project dealing with the phytoremediation
of a soil polluted by heavy metals is currently under way. The
case study is represented by a mining area in Hamedan province in
the central west part of Iran. The potential of phytoextraction and
phytostabilization of plants was evaluated considering the
concentration of heavy metals in the plant tissues and also the
bioconcentration factor (BCF) and the translocation factor (TF).
Several established criteria were also applied to identify
hyperaccumulator plants in the studied area. Results showed that
none of the collected plant species were suitable for phytoextraction
of Cu, Zn, Fe and Mn, but among the plants, Euphorbia macroclada
was the most efficient in phytostabilization of Cu and Fe, while
Ziziphora clinopodioides, Cousinia sp. and Chenopodium botrys
were the most suitable for phytostabilization of Zn and Chondrila
juncea and Stipa barbata had the potential for phytostabilization of
Mn. Using the most common criterion, Euphorbia macroclada and
Verbascum speciosum were Fe hyperaccumulator plants. The present
study showed that native plant species growing on contaminated sites
may have the potential for phytoremediation.
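The two screening factors have standard definitions, sketched below; the example concentrations are hypothetical, and the BCF/TF threshold comment reflects a commonly used criterion rather than the paper's exact rule.

```python
def bcf(c_plant, c_soil):
    """Bioconcentration factor: metal concentration in plant tissue
    divided by the concentration in the surrounding soil."""
    return c_plant / c_soil

def tf(c_shoot, c_root):
    """Translocation factor: metal concentration in shoots divided by
    the concentration in roots."""
    return c_shoot / c_root

# Hypothetical Fe concentrations (mg/kg): a BCF above 1 together with a
# TF below 1 is a commonly used indicator of phytostabilization potential.
example_bcf = bcf(1200.0, 800.0)
example_tf = tf(300.0, 900.0)
```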
Abstract: The rapid growth of e-Commerce services has been
significant over the past decade. However, methods for verifying
authenticated users still depend widely on numeric approaches. The
search for other verification methods suitable for online e-Commerce
is therefore an interesting issue. In this paper, a new online
signature-verification method using angular transformation is
presented. Delay shifts existing in online signatures are estimated by
the estimation method relying on angle representation. In the
proposed signature-verification algorithm, all components of input
signature are extracted by considering the discontinuous break points
on the stream of angular values. The estimated delay shift is then
captured by comparison with the selected reference signature, and the
matching error is computed as the main feature used in the
verification process. The threshold offsets are calculated from two
error characteristics of the signature verification problem, the False
Rejection Rate (FRR) and the False Acceptance Rate (FAR). Both error
rates depend on the decision threshold, whose value is chosen so as to
realize the Equal Error Rate (EER; FAR = FRR). The
experimental results show that, with a simple implementation deployed
on the Internet to demonstrate e-Commerce services, the proposed
method provides 95.39% correct verifications, 7% better than a
DP-matching-based signature-verification method. In addition,
signature verification with component extraction provides more
reliable results than deciding on the whole signature at once.
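The EER selection described above can be sketched as follows; the error-rate curves are hypothetical, and a real system would sweep the threshold over many genuine and forged signatures.

```python
import numpy as np

def equal_error_rate(thresholds, far, frr):
    """Pick the decision threshold where FAR and FRR are closest; at the
    crossing point the two error rates realize the EER (FAR = FRR)."""
    far, frr = np.asarray(far), np.asarray(frr)
    i = int(np.argmin(np.abs(far - frr)))
    return thresholds[i], (far[i] + frr[i]) / 2.0

# Hypothetical error-rate curves swept over candidate thresholds.
t, eer = equal_error_rate([0.1, 0.2, 0.3, 0.4],
                          [0.40, 0.20, 0.10, 0.05],   # FAR falls
                          [0.00, 0.10, 0.30, 0.50])   # FRR rises
```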
Abstract: It is difficult to judge ripeness by outward
characteristics such as size or external color. In this paper a nondestructive
method was studied to determine watermelon (Crimson
Sweet) quality. Responses of samples to excitation vibrations were
detected using laser Doppler vibrometry (LDV) technology. Phase
shifts between input and output vibrations were extracted over the
whole frequency range. The first and second resonances were derived
from the frequency response spectra. After the nondestructive tests,
the watermelons were sensory evaluated, and the samples were graded on
a scale of ripeness based on overall acceptability (the total traits
desired by consumers).
Regression models were developed to predict quality using obtained
results and sample mass. The determination coefficients of the
calibration and cross validation models were 0.89 and 0.71
respectively. This study demonstrated the feasibility of using
information derived from vibration response curves for predicting
fruit quality. The vibration response of a watermelon measured with
the LDV method requires no direct contact and is accurate and timely,
which could be a significant advantage for classifying watermelons
based on consumer opinions.
Abstract: The search for a tertiary substructure that geometrically
matches the 3D pattern of the binding site of a well-studied protein provides a way to predict protein function. In our previous work,
a web server has been built to predict protein-ligand binding sites
based on automatically extracted templates. However, a drawback of such templates is that the web server was prone to producing many
false positive matches. In this study, we present a sequence-order constraint to reduce the false positive matches of using automatically
extracted templates to predict protein-ligand binding sites. The binding site predictor comprises i) an automatically constructed template library and ii) a local structure alignment algorithm for
querying the library. The sequence-order constraint is employed to
identify the inconsistency between the local regions of the query protein and the templates. Experimental results reveal that the sequence-order constraint can largely reduce the false positive matches and is effective for template-based binding site prediction.
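A simplified version of the sequence-order constraint can be sketched as follows; the actual predictor couples this check with a local structure alignment, so the function below only illustrates the ordering test itself.

```python
def sequence_order_consistent(matched_query_positions):
    """Simplified sequence-order check: a template match is kept only if
    the matched residues occur in increasing order along the query
    sequence, mirroring the residue order of the template."""
    return all(a < b for a, b in
               zip(matched_query_positions, matched_query_positions[1:]))

# A match whose residues appear out of sequence order is flagged as a
# likely false positive and discarded.
keep = sequence_order_consistent([12, 45, 101])
drop = sequence_order_consistent([45, 12, 101])
```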
Abstract: Work Breakdown Structure (WBS) is one of the
most vital planning processes in project management, since it
is the foundation of other processes such as
scheduling, controlling, and assigning responsibilities. In fact
the WBS, or activity list, is the heart of a project, and omission of
a simple task can lead to an irrecoverable result. There are several
tools for generating a project WBS. One of the most
powerful is mind mapping, which is the basis of this article.
A mind map is a method for thinking together and helps a project
manager stimulate the minds of project team members to
generate the project WBS. Here we generate the WBS of a
sample building construction project with the
aid of a mind map and an artificial intelligence (AI) programming
language. Since the mind map structure cannot represent data in a
computerized way, we convert it to a semantic network that can
be used by the computer and then extract the final WBS from the
semantic network using the Prolog programming language. This
method results in a comprehensive WBS and decreases the
probability of omitting project tasks.
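The extraction step can be illustrated with a toy semantic network. The paper uses Prolog; the same depth-first traversal is sketched here in Python purely for illustration, with hypothetical task names.

```python
# A toy semantic network for a building construction project: each task
# maps to its sub-tasks (task names are hypothetical).
network = {
    "Building construction": ["Foundation", "Structure"],
    "Foundation": ["Excavation", "Concreting"],
    "Structure": ["Framing"],
}

def extract_wbs(node, depth=0, out=None):
    """Depth-first walk of the semantic network, emitting the WBS as an
    indented task list."""
    if out is None:
        out = []
    out.append("  " * depth + node)
    for child in network.get(node, []):
        extract_wbs(child, depth + 1, out)
    return out

wbs = extract_wbs("Building construction")
```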
Abstract: The application of neural networks to disease
diagnosis has made great progress, and they are widely used by
physicians. An electrocardiogram (ECG) carries vital information about
heart activity, and physicians use this signal for cardiac disease
diagnosis, which was the main motivation for our study. In our work,
tachycardia
features obtained are used for the training and testing of a Neural
Network. In this study we are using Fuzzy Probabilistic Neural
Networks as an automatic technique for ECG signal analysis. As
every real signal recorded by the equipment can have different
artifacts, we needed to do some preprocessing steps before feeding it
to our system. Wavelet transform is used for extracting the
morphological parameters of the ECG signal. The outcome of the
approach over a variety of arrhythmias shows that it is superior to
previously presented algorithms, with an average accuracy of about
95% for more than 7 tachyarrhythmias.
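As a minimal stand-in for the wavelet step: the abstract does not name the wavelet family, so a one-level Haar transform is used here purely to illustrate how approximation and detail coefficients separate signal morphology.

```python
def haar_dwt(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail), a simple
    stand-in for the wavelet decomposition applied to the ECG."""
    approx = [(signal[i] + signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Toy samples; the detail coefficients highlight sharp morphology such
# as the QRS complex, while the approximation keeps the slow trend.
approx, detail = haar_dwt([1.0, 3.0, 2.0, 4.0])
```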
Abstract: This paper characterizes the effects of artificial short
term aging in the laboratory on the rheological properties of virgin
80/100 penetration grade asphalt binder. After several years in
service, asphalt mixtures start to deteriorate due to aging. Aging is a
complex physico-chemical phenomenon that influences asphalt
binder rheological properties causing a deterioration in asphalt
mixture performance. To ascertain asphalt binder aging effects, the
virgin, artificially aged and extracted asphalt binder were tested via
the Rolling Thin film Oven (RTFO), Dynamic Shear Rheometer
(DSR) and Rotational Viscometer (RV). A comparative study
between laboratory and field aging conditions was also carried out.
The results showed that conditioning the specimens for 85 minutes
inside the RTFO was insufficient to simulate the actual short-term
aging that took place in the field under Malaysian
conditions.
Abstract: The empirical mode decomposition (EMD) decomposes any time series into a finite set of basis functions. The bases are termed intrinsic mode functions (IMFs), which are mutually orthogonal and contain a minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature and to observe its effects on climate change under global warming. The method decomposes the original time series into intrinsic time scales. It is capable of analyzing nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis results show that the modes of EMD present seasonal variability. Most of the IMFs have a normal distribution, and the energy density distribution of the IMFs follows a Chi-square distribution. The IMFs are effective in isolating physical processes of various time scales and are also statistically significant. The analysis results also show that the EMD method does a good job of revealing many characteristics of interannual climate. The results suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.
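A single sifting step, the core of the recursive IMF extraction described above, can be sketched as follows. The data are synthetic, and real EMD implementations iterate the sift until a stopping criterion is met and handle envelope end effects carefully.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One EMD sifting step: subtract the mean of the upper and lower
    envelopes (cubic splines through the local maxima and minima)."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2.0

# Synthetic "temperature" series: a fast oscillation riding on a slow one.
t = np.linspace(0.0, 1.0, 500)
x = np.sin(2 * np.pi * 25 * t) + np.sin(2 * np.pi * 3 * t)
h = sift_once(t, x)  # first candidate IMF after one sift
```

Repeating the sift on the residue `x - imf` yields successively slower IMFs, which is what gives EMD its low-pass-filter-bank behaviour.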
Abstract: Optimization of extraction of phenolic compounds
from Avicennia marina using response surface methodology was
carried out during the present study. A five-level, three-factor
central composite rotatable design (CCRD) was utilized to examine the optimum
combination of extraction variables based on the TPC of Avicennia
marina leaves. The best combination of the response function was a
drying temperature of 78.41 °C, an extraction temperature of 26.18 °C,
and an extraction time of 36.53 minutes. However, the procedure can be
promptly extended to the study of several other pharmaceutical
processes, such as
purification of bioactive substances, drying of extracts and
development of the pharmaceutical dosage forms for the benefit of
consumers.
Abstract: Automatic reading of handwritten cheques is a computationally
complex process that plays an important role in financial
risk management. Machine vision and learning provide a viable
solution to this problem. Research effort has mostly been focused
on recognizing diverse pitches of cheques and demand drafts with an
identical outline. However, most of these methods employ template matching
to localize the pitches and such schemes could potentially
fail when applied to different types of outline maintained by the
bank. In this paper, the so-called outline problem is resolved by
a cheque information tree (CIT), which generalizes the localizing
method to extract active-region-of-entities. In addition, the weight
based density plot (WBDP) is performed to isolate text entities and
read complete pitches. Recognition is based on texture features using
neural classifiers. Legal amount is subsequently recognized by both
texture and perceptual features. A post-processing phase is invoked
to detect the incorrect readings by Type-2 grammar using the Turing
machine. The performance of the proposed system was evaluated
using cheques and demand drafts of 22 different banks. The test data
consists of a collection of 1540 leaves obtained from 10 different
account holders from each bank. Results show that this approach
can easily be deployed without significant design amendments.
Abstract: Extracting and elaborating software requirements and
transforming them into a viable software architecture is still an
intricate task. This paper defines a solution architecture which is
based on the blurred amalgamation of problem space and solution
space. The dependencies between domain constraints, requirements
and architecture and their importance are described that are to be
considered collectively while evolving from problem space to
solution space. This paper proposes a revised version of the Twin
Peaks Model, named the Win Peaks Model, that reconciles software
requirements and architecture in a more consistent and adaptable
manner. Further, conflicts between stakeholders' win-requirements
are resolved by the proposed voting methodology, which is a simple
adaptation of the win-win requirements negotiation model and QARCC.
Abstract: This article presents the geometry and structure
reconstruction procedure of an aircraft model for flutter research
(based on the I22-IRYDA aircraft). For the reconstruction, Reverse
Engineering techniques and advanced surface-modeling CAD tools
are used. The authors discuss all stages of the data acquisition
process and the computation and analysis of the measured data. For
acquisition, a three-dimensional structured-light scanner was used.
In the further sections, details of the reconstruction process are
presented. The geometry reconstruction procedure transforms the
measured input data (a point cloud) into a three-dimensional
parametric computer model (a NURBS solid model) which is compatible
with CAD systems. In parallel with the aircraft geometry, the internal
structure (structural model) is extracted and modeled. In the last
chapter, the evaluation of the obtained models is discussed.
Abstract: Partial discharge (PD) detection is an important
method to evaluate the insulation condition of metal-clad apparatus.
Non-intrusive sensors, which are easy to install and cause no
interruption to operation, are preferred in onsite PD detection.
However, they often lack accuracy due to the interferences in PD
signals. In this paper a novel PD extraction method that uses frequency
analysis and entropy-based time-frequency (TF) analysis is introduced.
The repetitive pulses from the converter are first removed via frequency
analysis. Then, the relative entropy and relative peak-frequency of
each pulse (i.e. time-indexed vector TF spectrum) are calculated and
all pulses with similar parameters are grouped. According to the
characteristics of non-intrusive sensor and the frequency distribution
of PDs, the pulses of PD and interferences are separated. Finally the
PD signal and interferences are recovered via inverse TF transform.
The de-noised result of noisy PD data demonstrates that the
combination of frequency and time-frequency techniques can
discriminate PDs from interferences with various frequency
distributions.
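The relative-entropy feature can be sketched as a Kullback-Leibler divergence between normalized pulse spectra. This is an assumption, since the abstract does not give the exact definition; the spectra below are hypothetical.

```python
import math

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence between two normalized pulse spectra,
    used here as the 'relative entropy' parameter for grouping pulses."""
    sp, sq = sum(p), sum(q)
    return sum((pi / sp) * math.log((pi / sp + eps) / (qi / sq + eps))
               for pi, qi in zip(p, q))

# Identical spectral shapes give zero divergence; dissimilar shapes give
# a positive value, so pulses can be clustered by this parameter.
d_same = relative_entropy([1.0, 2.0, 1.0], [2.0, 4.0, 2.0])
d_diff = relative_entropy([1.0, 2.0, 1.0], [4.0, 1.0, 1.0])
```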