Abstract: The purpose of this work is the development of an
automatic classification system that could assist radiologists
in the investigation of breast cancer. The software has been designed
within the framework of the MAGIC-5 collaboration.
In the automatic classification system, suspicious regions with
a high probability of containing a lesion are extracted from the image
as regions of interest (ROIs). Each ROI is characterized by
features based on morphological lesion differences.
Several classifiers, namely a feed-forward neural network, a k-nearest-
neighbours classifier and a support vector machine, are used to
distinguish the pathological records from the healthy ones.
The results obtained in terms of sensitivity (percentage of
pathological ROIs correctly classified) and specificity (percentage of
non-pathological ROIs correctly classified) are presented through
the Receiver Operating Characteristic (ROC) curve. In particular, the
best performance, an area under the ROC curve of 88% ± 1%, is obtained
with the feed-forward neural network.
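As an illustration only, not the MAGIC-5 software itself, the reported quantities (sensitivity, specificity and area under the ROC curve) can be computed from classifier scores as in this minimal numpy sketch; the scores, labels and threshold are hypothetical:

```python
import numpy as np

def sensitivity_specificity(scores, labels, threshold):
    """Sensitivity = fraction of pathological (label 1) ROIs with
    score >= threshold; specificity = fraction of healthy (label 0)
    ROIs with score < threshold."""
    pred = scores >= threshold
    sens = np.mean(pred[labels == 1])
    spec = np.mean(~pred[labels == 0])
    return sens, spec

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random positive outscores a random negative."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

# Toy example with hypothetical classifier scores
scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.1])
labels = np.array([1,   1,   0,   1,   0,   0  ])
print(auc(scores, labels))                       # 8/9 = 0.888...
print(sensitivity_specificity(scores, labels, 0.5))
```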
Abstract: Value engineering is an effective tool for managerial
decision-making. Value studies offer managers a suitable
instrument for reducing life-cycle costs, improving quality,
improving structure, shortening the construction schedule,
prolonging service life, or a combination of these. The pressures
placed on planners, their accountability within their fields, and
the risks and uncertainties inherent in alternative options leave
some managers facing a dilemma. Given the complexity of
implementing projects, risk management and value engineering can
be wielded in project management as a tool to identify and
eliminate every item that causes unnecessary expense and wasted
time, without damaging essential project functions. It should
be noted that implementing risk management and value engineering
to improve efficiency and function may lengthen the project
implementation schedule; here, improving the schedule does not
always mean shortening it. This article first deals with the
concepts of risk and value engineering. The results of their
implementation at Iran Khodro Corporation are then examined,
together with the common features and integration of the two
approaches; finally, the proposed framework is presented for use
in engineering and industrial projects, including those of Iran
Khodro Corporation.
Abstract: Today's children, who are born into a more colorful,
more creative, more abstract and more accessible communication
environment than their ancestors as a result of dizzying advances in
technology, have a remarkable capacity to perceive and make sense
of the world. Millennium children, who live in an environment where
all kinds of marketing communication efforts are more intensive
than ever, are subject to all kinds of persuasive messages from
early childhood on. Among these, advertising communication
outperforms all the other marketing communication efforts in
creating little consumer individuals and, through the processing of
codes and signs, plays a significant part in building a world of
seeing, thinking and understanding for children. Children who are
raised with metaphorical expressions such as tales and riddles also
meet this fast and effective communication of meaning in
advertisements. The subject of this study is children's perception
of metaphors, which help them grasp the "product and its promise"
both verbally and visually and facilitate the association between
them. By stimulating and activating the imagination, metaphors have
unique advantages in promoting the product and its promise,
especially in print advertisements, which have certain limitations.
This study deals comparatively with literal and metaphoric versions
of print advertisements belonging to various product groups and
attempts to discover to what extent the advertisements are liked,
recalled, perceived and found persuasive. The sample group of the
study, which was conducted in two elementary schools situated in
areas with different socio-economic features, consisted of children
aged 12.
Abstract: The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system on a high-energy northeastern Brazilian coast and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period, from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphological features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed surfaces and volumes to be quantified in detail, the segments of PTI most vulnerable to erosion and/or sediment accumulation to be identified, and the observed changes to be related to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem, as well as oil and gas facilities installed in the vicinity, from the damaging effects of strong ocean waves and currents. The maintenance of PTI is therefore essential, but predicting its longevity is uncertain, because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
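Not the authors' workflow, but the kind of volumetric change the abstract describes can be quantified by differencing two co-registered, gridded DEMs of successive survey epochs; the grids and cell size below are hypothetical:

```python
import numpy as np

def volume_change(dem_t0, dem_t1, cell_area):
    """Net accretion/erosion volume between two co-registered DEM
    grids. Cells that gained elevation count as accretion, cells
    that lost elevation as erosion; NaN cells are ignored."""
    diff = dem_t1 - dem_t0
    valid = ~np.isnan(diff)
    accretion = diff[valid & (diff > 0)].sum() * cell_area
    erosion = diff[valid & (diff < 0)].sum() * cell_area
    return accretion, erosion, accretion + erosion

# 2 x 2 grid with 1 m^2 cells: one cell gains 0.5 m, one loses 0.2 m
t0 = np.array([[1.0, 1.0], [1.0, 1.0]])
t1 = np.array([[1.5, 1.0], [0.8, 1.0]])
print(volume_change(t0, t1, cell_area=1.0))   # approx (0.5, -0.2, 0.3)
```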
Abstract: A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic codes, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could have been induced from a primeval codon assignment. Moreover, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of present-day coding DNA sequences could also have existed in ancient coding DNA sequences.
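The period-3 property mentioned at the end is conventionally detected from the Fourier power spectrum of binary indicator sequences, where it appears as a peak at frequency k = n/3; this is a generic sketch of that test on a toy sequence over the extended alphabet, not the authors' analysis:

```python
import numpy as np

def indicator_spectrum(seq, alphabet="DGAUC"):
    """Sum of the Fourier power spectra of the binary indicator
    sequences for each letter of the (extended) base alphabet."""
    n = len(seq)
    total = np.zeros(n)
    for base in alphabet:
        u = np.array([1.0 if s == base else 0.0 for s in seq])
        total += np.abs(np.fft.fft(u)) ** 2
    return total

# A perfectly 3-periodic toy "coding" sequence: the period-3 property
# shows up as a sharp spectral peak at k = n/3
seq = "GAU" * 60
p = indicator_spectrum(seq)
n = len(seq)
k_peak = 1 + int(np.argmax(p[1 : n // 2 + 1]))   # skip the DC term
print(k_peak == n // 3)   # True
```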
Abstract: IPsec has now become a standard information security
technology throughout the Internet society. It provides a well-defined
architecture that takes into account confidentiality, authentication,
integrity, secure key exchange and protection against replay
attacks. For connectionless security services on a per-packet
basis, the IETF IPsec Working Group has standardized two extension
headers (AH and ESP) together with key exchange and authentication
protocols. It is also working on a lightweight key exchange protocol
and MIBs for security management. IPsec technology has been
implemented on various platforms in IPv4 and IPv6, gradually
replacing old application-specific security mechanisms. IPv4 and
IPv6 are not directly compatible, so programs and systems designed
for one standard cannot communicate with those designed for the
other. We propose the design and implementation of a controlled
Internet security system, an IPsec-based Internet information
security system for IPv4/IPv6 networks, and we also present
performance measurements. With IPv6 features such as improved
scalability and routing, security, ease of configuration and higher
performance, the controlled Internet security system provides a
consistent security policy and integrated security management on an
IPsec-based Internet security system.
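For readers unfamiliar with the ESP extension header mentioned above: per RFC 4303 it begins with a 32-bit Security Parameters Index (SPI) and a 32-bit sequence number (the field checked by anti-replay protection), both in network byte order. A minimal parsing sketch, with a hypothetical packet:

```python
import struct

def parse_esp_header(packet_bytes):
    """Parse the first 8 bytes of an ESP payload (RFC 4303):
    a 32-bit SPI followed by a 32-bit sequence number, both in
    network byte order ("!" in struct format strings)."""
    spi, seq = struct.unpack("!II", packet_bytes[:8])
    return spi, seq

# Hypothetical ESP payload: SPI 0x1234, sequence number 7,
# followed by (dummy) encrypted payload bytes
esp = struct.pack("!II", 0x1234, 7) + b"\x00" * 16
print(parse_esp_header(esp))   # (4660, 7)
```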
Abstract: Recently, permeable breakwaters have been suggested to overcome the disadvantages of fully protective breakwaters. These protection structures have a minor impact on the coastal environment and neighboring beaches, while providing more economical protection from waves and currents. For regular waves, a numerical model (FLOW-3D, VOF) is used to investigate the hydraulic performance of a permeable breakwater. The model of the permeable breakwater consists of a pair of identical vertical slotted walls with impermeable upper and lower parts, where the draft is a decimal multiple of the total depth. The middle part is permeable, with a porosity of 50%. The second barrier is located at a distance of 0.5 and 1.5 times the water depth from the first one. The numerical model is validated by comparison with previous laboratory data and semi-analytical results for the same model. Good agreement between the numerical results and both the laboratory data and the semi-analytical results is shown, and the results indicate that the numerical model reproduces most of the important features of the interaction. Through the numerical investigation, the friction factor of the model is discussed in detail.
Abstract: In this paper, a wavelet-based ANFIS for detecting
inter-turn faults in a generator is proposed. The detector responds
uniquely to winding inter-turn faults with remarkably high
sensitivity. Discrimination between different percentages of winding
affected by an inter-turn fault is provided by an ANFIS with an
eight-dimensional input vector. This input vector is obtained from
features extracted from the DWT of the faulty current leaving the
generator phase winding. Training data for the ANFIS are generated
by simulating a generator with inter-turn faults in MATLAB. With
selected statistical features of the decomposed levels of the faulty
current, the proposed ANFIS-based algorithm gives more satisfactory
performance than an ANN.
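The feature-extraction step (statistical features of DWT decomposition levels of a current signal) can be sketched in a library-free way with a multilevel Haar transform, using band energies as the features; this is an illustration under those assumptions, not the paper's MATLAB pipeline, and the test signal is synthetic:

```python
import numpy as np

def haar_dwt_energies(signal, levels=4):
    """Multilevel Haar DWT: at each level split the signal into
    approximation (scaled pairwise sums) and detail (scaled pairwise
    differences) coefficients, and record the energy of each detail
    band. Returns one energy per level plus the final approximation
    energy. The orthonormal scaling preserves total signal energy."""
    x = np.asarray(signal, dtype=float)
    feats = []
    for _ in range(levels):
        x = x[: len(x) // 2 * 2]               # drop a trailing odd sample
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        feats.append(np.sum(detail ** 2))
        x = approx
    feats.append(np.sum(x ** 2))
    return np.array(feats)

# Toy "fault current": a 50 Hz tone plus a high-frequency disturbance
t = np.linspace(0, 0.2, 1024, endpoint=False)
current = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 1000 * t)
print(haar_dwt_energies(current, levels=4))    # 5 feature values
```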
Abstract: By systematically applying different engineering
methods, difficult financial problems become approachable. Using a
combination of theory and techniques such as the wavelet transform,
time series data mining, Markov-chain-based discrete stochastic
optimization, and evolutionary algorithms, this work formulated a
strategy to characterize and forecast non-linear time series. It
attempted to extract typical features from the volatility data sets
of the S&P 100 and S&P 500 indices, which include abrupt drops,
jumps and other non-linearities. As a result, forecasting accuracy
reached an average of over 75%, surpassing any other publicly
available results on the forecasting of any financial index.
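As context only, a volatility series of the kind described (in which abrupt drops and jumps stand out) is commonly derived as a rolling standard deviation of log returns; this numpy sketch uses a synthetic price path, not S&P data:

```python
import numpy as np

def rolling_volatility(prices, window=20):
    """Log returns followed by a rolling standard deviation, giving
    a simple volatility series in which jumps appear as spikes."""
    returns = np.diff(np.log(prices))
    return np.array([returns[i - window:i].std()
                     for i in range(window, len(returns) + 1)])

# Synthetic price path with an abrupt 10% jump halfway through
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))
prices = prices.copy()
prices[250:] *= 1.10
vol = rolling_volatility(prices, window=20)
print(vol.argmax())   # the volatility spike sits near the jump
```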
Abstract: A method of collecting composition data and examining the structural features of pearlite lamellae and the parent austenite at the growth interface in a 13 wt.% manganese steel has been demonstrated using scanning transmission electron microscopy (STEM). The combination of the composition data and the structural features observed at the growth interface shows that available theories of pearlite growth cannot explain all of the observations.
Abstract: A novel feature selection strategy is proposed in this paper to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusion and varying expression. The technique is applicable especially in scenarios where a reliable intra-class probability distribution cannot be obtained because of the small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration, a property that makes lighting-invariant face recognition achievable. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features, which are arranged in order of increasing distance between the sub-regions involved in the merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, the ratio of the between-class variance to the within-class variance of the sample set, in the PCA domain. The results indicate a marked improvement in classification performance compared to baseline algorithms.
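The criterion function named above (between-class variance over within-class variance, the Fisher ratio) can be sketched per feature as follows; the data are synthetic and this is not the paper's PCA-domain implementation:

```python
import numpy as np

def fisher_ratio(features, labels):
    """Between-class variance over within-class variance, computed
    independently for each feature column; higher values indicate
    more discriminative features."""
    classes = np.unique(labels)
    overall = features.mean(axis=0)
    between = np.zeros(features.shape[1])
    within = np.zeros(features.shape[1])
    for c in classes:
        fc = features[labels == c]
        between += len(fc) * (fc.mean(axis=0) - overall) ** 2
        within += ((fc - fc.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

# Feature 0 separates the classes; feature 1 is pure noise
rng = np.random.default_rng(1)
a = np.column_stack([rng.normal(0, 1, 100), rng.normal(0, 1, 100)])
b = np.column_stack([rng.normal(5, 1, 100), rng.normal(0, 1, 100)])
X = np.vstack([a, b])
y = np.array([0] * 100 + [1] * 100)
ratios = fisher_ratio(X, y)
print(np.argsort(ratios)[::-1])   # feature 0 ranked first
```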
Abstract: Iris-based biometric authentication has been gaining
importance in recent times. Iris biometric processing, however, is a
complex and computationally very expensive process. In the overall
processing of an iris-based biometric authentication system, feature
processing is an important task: the iris features extracted there
are ultimately used in matching. Since the number of iris features
is large, and computational time grows as the number of features
increases, it is a challenge to develop an iris processing system
with as few features as possible without compromising correctness.
In this paper, we address this issue and present an approach to the
feature extraction and feature matching process. We apply a
Daubechies D4 wavelet with 4 decomposition levels to extract
features from iris images. These features are encoded with 2 bits
each by quantizing them into 4 quantization levels. With our
proposed approach, an iris template can be represented with only 304
bits, whereas existing approaches require as many as 1024 bits. In
addition, we assign different weights to different iris regions when
comparing two iris templates, which significantly increases
accuracy; accordingly, the iris templates are matched with a
weighted similarity measure. Experimental results on several iris
databases substantiate the efficacy of our approach.
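A weighted similarity between bit templates of the kind described can be sketched as a per-region Hamming match scaled by region weights; the 4-region split, the weights and the templates below are hypothetical, not the paper's parameters:

```python
import numpy as np

def weighted_similarity(t1, t2, region_weights, region_size):
    """Weighted fraction of matching bits between two bit templates.
    Each region of `region_size` bits contributes its Hamming match
    rate scaled by that region's weight."""
    matches = (t1 == t2).astype(float)
    score, total = 0.0, 0.0
    for i, w in enumerate(region_weights):
        seg = matches[i * region_size:(i + 1) * region_size]
        score += w * seg.mean()
        total += w
    return score / total

# Hypothetical 304-bit templates split into 4 regions of 76 bits
rng = np.random.default_rng(2)
a = rng.integers(0, 2, 304)
b = a.copy()
b[:76] = 1 - b[:76]                  # corrupt only the first region
weights = [0.1, 0.3, 0.3, 0.3]       # down-weight the unreliable region
print(weighted_similarity(a, b, weights, 76))   # 0.9
```

With uniform weights the same pair would score only 0.75, so down-weighting a known-unreliable region (e.g. one often occluded by eyelids) raises the similarity of genuine pairs.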
Abstract: Fingerprint-based identification is one of the best-known
biometric approaches in the area of pattern recognition and has long
been studied for its important role in forensic science, where it
assists the government criminal justice community. In this paper, we
propose a framework for identifying individuals by means of
fingerprints. Unlike most conventional fingerprint identification
frameworks, the extracted geometrical element features (GEFs) go
through a discretization process. The intention of discretization in
this study is to obtain unique individual features that reflect
individual variance, in order to discriminate one person from
another. Discretization has previously been shown to be
particularly effective for identification from English handwriting,
with an accuracy of 99.9%, and for discriminating between twins'
handwriting, with an accuracy of 98%. Owing to its high
discriminative power, this method is adopted in the framework as an
independent basis for assessing the accuracy of fingerprint
identification. The experimental results show that the
identification accuracy of the proposed system using discretization
is 100% for FVC2000, 93% for FVC2002 and 89.7% for FVC2004, which is
much better than the conventional or existing fingerprint
identification systems (72% for FVC2000, 26% for FVC2002 and 32.8%
for FVC2004). The results indicate that the discretization approach
boosts classification effectively, and it therefore promises to be
suitable for biometric features other than handwriting and
fingerprints.
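The abstract does not specify the discretization scheme; one common form is equal-width binning of each continuous feature, sketched below. The feature values, bin count and the name `discretize` are illustrative assumptions, not the paper's method:

```python
import numpy as np

def discretize(features, n_bins=8):
    """Equal-width discretization: map each continuous value to the
    index of the bin it falls into, computed per feature column
    between that column's minimum and maximum."""
    lo = features.min(axis=0)
    hi = features.max(axis=0)
    width = (hi - lo) / n_bins
    return np.clip(((features - lo) / width).astype(int), 0, n_bins - 1)

# Hypothetical continuous geometrical element features for 4 prints
gef = np.array([[0.10, 3.2],
                [0.95, 3.9],
                [0.50, 3.5],
                [0.12, 3.1]])
print(discretize(gef, n_bins=4))
```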
Abstract: This paper proposes a novel hybrid algorithm for feature selection based on a binary ant colony and an SVM. The final subset is obtained by eliminating features that produce noise or are strongly correlated with features already selected. The algorithm can improve classification accuracy with a small, appropriate feature subset. It is easy to implement and, because it uses a simple filter, its computational complexity is very low. The performance of the proposed algorithm is evaluated on a real rotary cement kiln dataset. The results show that our algorithm outperforms existing algorithms.
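The "eliminate strictly correlated features" filter step can be sketched greedily with Pearson correlations; the threshold and data are hypothetical, and this omits the ant-colony search and SVM evaluation entirely:

```python
import numpy as np

def correlation_filter(X, threshold=0.95):
    """Greedy filter: scan features left to right and keep a feature
    only if its absolute Pearson correlation with every already-kept
    feature is below `threshold`."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

# Feature 2 is almost a copy of feature 0 and gets dropped
rng = np.random.default_rng(3)
f0 = rng.normal(size=200)
f1 = rng.normal(size=200)
f2 = f0 + rng.normal(0, 0.01, 200)    # strongly correlated with f0
X = np.column_stack([f0, f1, f2])
print(correlation_filter(X))           # [0, 1]
```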
Abstract: In a Geographic Information System, one source of the
needed geographic data is the digitization of analog maps and the
evaluation of aerial and satellite photographs. This study discusses
a method for extracting vector features from aerial photographs and
creating vectorized drawing files; software was also developed for
this purpose. Conversion from raster to vector, known as
vectorization, is the most important step in creating vectorized
drawing files. In the developed algorithm, the aerial photograph is
first preprocessed: it is converted to grayscale if necessary, noise
is reduced, filters are applied and object edges are detected. After
these steps, each pixel of the photograph is traversed from the
upper left to the lower right, its neighborhood relationships are
examined, and one-pixel-wide lines or polylines are obtained. A
traced line must be erased to prevent confusion as vectorization
continues, because otherwise it could be perceived as a new line;
however, simply erasing it could cause discontinuities in the vector
drawing. The image is therefore converted from a 2-bit to an 8-bit
representation, and the detected pixels are marked with a distinct
value. In conclusion, an aerial photograph can be converted to a
vector form consisting of lines and polylines that can be opened in
any CAD application.
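The traversal-and-marking step can be sketched as follows: scan a binary edge image from the upper left, follow 8-connected neighbours to collect a one-pixel-wide polyline, and mark visited pixels with a distinct 8-bit value instead of erasing them. This is a simplified illustration (it follows a single branch and ignores junctions), not the developed software:

```python
import numpy as np

EDGE, TRACED = 255, 128          # distinct values in an 8-bit image

def trace_lines(img):
    """Scan from upper-left to lower-right; from each untraced edge
    pixel, follow 8-connected neighbours to build a polyline,
    marking visited pixels as TRACED rather than erasing them so
    they are neither re-detected nor lost."""
    h, w = img.shape
    polylines = []
    for y0 in range(h):
        for x0 in range(w):
            if img[y0, x0] != EDGE:
                continue
            line, y, x = [], y0, x0
            while True:
                img[y, x] = TRACED
                line.append((x, y))
                nxt = None
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny, nx] == EDGE:
                            nxt = (ny, nx)
                if nxt is None:
                    break
                y, x = nxt
            polylines.append(line)
    return polylines

# A single diagonal one-pixel-wide line in a 5x5 image
img = np.zeros((5, 5), dtype=np.uint8)
for i in range(5):
    img[i, i] = EDGE
lines = trace_lines(img)
print(lines)   # one polyline of 5 points
```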
Abstract: Today the automobile and aerospace industries use laser beam welding as a clean, non-contact source of heating and fusion for joining sheets. Welding performance depends mainly on the laser welding parameters. Concepts related to artificial neural networks, and how they can be applied to model weld bead geometry and mechanical properties in terms of equipment parameters, are reported in order to evaluate their accuracy and compare them with traditional modeling schemes. This review covers the prediction, by artificial neural networks, of titanium and aluminium weld bead geometry and of mechanical properties such as ultimate tensile strength, yield strength, elongation and reduction of area of the weld.
Abstract: Occurrences of spurious crests on the troughs of large,
relatively steep second-order Stokes waves are anomalous and not an
inherent characteristic of real waves. Here, the effects of such
occurrences on the statistics described by the standard second-order
stochastic model are examined theoretically and by way of
simulations. Theoretical results and simulations indicate that when
spurious occurrences are sufficiently large, the standard model leads
to physically unrealistic surface features and inaccuracies in the
statistics of various surface features, in particular, the troughs and
thus zero-crossing heights of large waves. Whereas inaccuracies can
be fairly noticeable for long-crested waves in both deep and
shallower depths, they tend to become relatively insignificant in
directional waves.
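To make the "spurious crest on the trough" concrete: for the deep-water second-order Stokes profile η(θ) = a cos θ + ½k a² cos 2θ (an assumption here; the paper's stochastic model is more general), the curvature at the trough θ = π is a − 2k a², so the trough turns into a local maximum, the anomalous secondary crest, once the steepness ka exceeds ½:

```python
import numpy as np

def stokes2_profile(theta, a, k):
    """Deep-water second-order Stokes surface elevation."""
    return a * np.cos(theta) + 0.5 * k * a**2 * np.cos(2 * theta)

def trough_has_spurious_crest(a, k, eps=1e-3):
    """Curvature at the trough (theta = pi) is a - 2*k*a**2, so the
    trough becomes a local maximum (a spurious crest) when ka > 1/2.
    Checked numerically by sampling around theta = pi."""
    theta = np.array([np.pi - eps, np.pi, np.pi + eps])
    e = stokes2_profile(theta, a, k)
    return bool(e[1] > e[0] and e[1] > e[2])

print(trough_has_spurious_crest(a=1.0, k=0.3))   # False: ka < 1/2
print(trough_has_spurious_crest(a=1.0, k=0.6))   # True:  ka > 1/2
```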
Abstract: A registration framework for image-guided robotic
surgery is proposed for three emergency neurosurgical procedures,
namely Intracranial Pressure (ICP) Monitoring, External Ventricular
Drainage (EVD) and evacuation of a Chronic Subdural Haematoma
(CSDH). The registration paradigm uses CT and white light as
modalities. This paper presents two simulation studies for a
preliminary evaluation of the registration protocol: (1) The loci of the
Target Registration Error (TRE) in the patient's axial, coronal and
sagittal views were simulated based on a Fiducial Localisation Error
(FLE) of 5 mm and (2) Simulation of the actual framework using
projected views from a surface rendered CT model to represent white
light images of the patient. Craniofacial features were employed as
the registration basis to map the CT space onto the simulated
intraoperative space. Photogrammetry experiments on an artificial
skull were also performed to benchmark the results obtained from the
second simulation. The results of both simulations show that the
proposed protocol can provide 5 mm accuracy for these
neurosurgical procedures.
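The relationship between FLE and TRE that the first simulation explores can be illustrated generically with point-based rigid registration (the Kabsch/SVD solution): perturb fiducials by a localisation error, register, and measure the residual error at a target point. The fiducial coordinates, noise model and target below are hypothetical, not the paper's setup:

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch/SVD least-squares rigid registration: the rotation R
    and translation t that best map src fiducials onto dst."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

rng = np.random.default_rng(4)
fiducials = rng.uniform(-80, 80, (6, 3))     # craniofacial points, mm
target = np.array([0.0, 0.0, 40.0])          # intracranial target, mm

# Identity ground truth; fiducials localized with ~5 mm RMS FLE
fle = rng.normal(0, 5.0 / np.sqrt(3), fiducials.shape)
R, t = rigid_register(fiducials + fle, fiducials)
tre = np.linalg.norm((R @ target + t) - target)
print(round(tre, 2), "mm")                   # target registration error
```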
Abstract: Purpose: To explore the use of the Curvelet transform to
extract texture features of pulmonary nodules in CT images, and of a
support vector machine to establish a prediction model for small
solitary pulmonary nodules, in order to improve the detection and
diagnosis rate of early-stage lung cancer. Methods: 2461 benign or
malignant small solitary pulmonary nodules in CT images from 129
patients were collected. Fourteen Curvelet-transform texture
features were used as parameters to establish the support vector
machine prediction model. Results: Compared with other methods,
using 252 texture features as parameters to establish the prediction
model is more appropriate, and the classification consistency,
sensitivity and specificity of the model are 81.5%, 93.8% and 38.0%,
respectively. Conclusion: Based on the texture features extracted by
the Curvelet transform, the support vector machine prediction model
is sensitive to lung cancer and can improve the diagnosis rate of
early-stage lung cancer to some extent.
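For reference, the three reported metrics (classification consistency, i.e. accuracy, plus sensitivity and specificity) come directly from the confusion matrix; a minimal sketch with toy labels, not the paper's data:

```python
import numpy as np

def evaluate(pred, truth):
    """Classification consistency (accuracy), sensitivity and
    specificity from predicted and true labels (1 = malignant)."""
    tp = np.sum((pred == 1) & (truth == 1))
    tn = np.sum((pred == 0) & (truth == 0))
    fp = np.sum((pred == 1) & (truth == 0))
    fn = np.sum((pred == 0) & (truth == 1))
    accuracy = (tp + tn) / len(truth)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Toy predictions for 10 hypothetical nodules
truth = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])
pred  = np.array([1, 1, 1, 1, 1, 0, 0, 0, 1, 1])
print(evaluate(pred, truth))   # (0.7, 0.833..., 0.5)
```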
Abstract: A feature weighting and selection method is proposed
which uses the structure of a weightless neuron and exploits the
principles that govern the operation of genetic algorithms and
evolution. Features are coded onto chromosomes in a novel way
that allows weighting information about the features to be
inferred directly from the gene values. The proposed method is
significant in that it addresses several problems with existing
algorithms for feature selection and weighting, while providing
significant advantages such as speed, simplicity and suitability for
real-time systems.
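A generic evolutionary feature-weighting loop of the kind alluded to can be sketched as follows: each chromosome's genes are read directly as feature weights, and fitness is the accuracy of a weighted nearest-centroid classifier. The encoding, fitness function and GA operators are all illustrative assumptions; the abstract's weightless-neuron structure is not modelled here:

```python
import numpy as np

rng = np.random.default_rng(5)

def fitness(weights, X, y, centroids):
    """Accuracy of nearest-centroid classification under a weighted
    squared distance: good weight vectors emphasise the features
    that actually separate the classes."""
    d = ((X[:, None, :] - centroids[None]) ** 2 * weights).sum(axis=2)
    return np.mean(d.argmin(axis=1) == y)

# Two features: feature 0 separates the classes, feature 1 is noise
X0 = rng.normal([0, 0], 1, (100, 2))
X1 = rng.normal([2, 0], 1, (100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)
centroids = np.array([X0.mean(axis=0), X1.mean(axis=0)])

# Genes are the feature weights themselves, so weighting information
# is read directly off the chromosome
pop = rng.uniform(0, 1, (20, 2))
for _ in range(30):
    scores = np.array([fitness(w, X, y, centroids) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    children = (parents[rng.integers(0, 10, 10)] +
                parents[rng.integers(0, 10, 10)]) / 2  # blend crossover
    children += rng.normal(0, 0.05, children.shape)    # mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmax([fitness(w, X, y, centroids) for w in pop])]
print(best)   # evolved feature weights
```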