Abstract: The evaluation and measurement of human body dimensions are achieved by physical anthropometry. This research was conducted in view of the importance of anthropometric indices of the face in forensic medicine, surgery, and medical imaging. The main goal of this research is to optimize facial feature points by establishing a mathematical relationship among facial features and to use the optimized feature points for age classification. Since the selected facial feature points are located in the areas of the mouth, nose, eyes and eyebrows on facial images, all desired facial feature points are extracted accurately. According to the proposed method, sixteen Euclidean distances are calculated from the eighteen selected facial feature points, vertically as well as horizontally. The mathematical relationships among the horizontal and vertical distances are established. Moreover, it is also discovered that the distances between the facial features follow a constant ratio under age progression. The distances between the specified feature points increase with the age progression of a human from his or her childhood, but the ratio of the distances does not change (d = 1.618). Finally, according to the proposed mathematical relationship, four independent feature distances related to eight feature points are selected from the sixteen distances and eighteen feature points respectively. These four feature distances are used for age classification using the Support Vector Machine (SVM)-Sequential Minimal Optimization (SMO) algorithm and show around 96% accuracy. Experimental results show that the proposed system is effective and accurate for age classification.
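As an illustrative sketch (not the authors' implementation), the core computation — Euclidean distances between facial feature points and their vertical/horizontal ratio — can be expressed as follows. The coordinates below are hypothetical, chosen only so the ratio reproduces the stated value d = 1.618:

```python
import math

def euclidean(p, q):
    """Euclidean distance between two 2D facial feature points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical feature-point coordinates (in pixels), chosen only so
# that the vertical/horizontal distance ratio reproduces the golden
# ratio d = 1.618 reported in the abstract.
phi = (1 + math.sqrt(5)) / 2
outer_eye_left = (0.0, 0.0)
outer_eye_right = (100.0, 0.0)                 # horizontal feature pair
eyebrow_mid = (50.0, 0.0)
mouth_mid = (50.0, 100.0 * phi)                # vertical feature pair

d_horizontal = euclidean(outer_eye_left, outer_eye_right)
d_vertical = euclidean(eyebrow_mid, mouth_mid)
ratio = d_vertical / d_horizontal
```

Under the abstract's claim, both distances grow with age while the ratio stays fixed, which is why the ratio rather than the raw distances serves as an age-invariant relationship.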
Abstract: This research paper presents numerical studies of the characteristics of warhead fragmentation, in terms of the initial velocities, spray angles and mass distribution of fragments of a high explosive (HE) warhead. The behavior of warhead fragmentation depends on the shape and size of the warhead, the thickness of the casing, the type of explosive, the number and position of detonators, etc. This paper focuses on the effects of the material properties of the warhead casing, i.e. failure strain, initial yield and ultimate strength, on the characteristics of warhead fragmentation. It was found that the initial yield and ultimate strength of the casing have minimal effects on the initial velocities and spray angles of fragments. Moreover, a brittle warhead casing with low failure strain tends to produce a higher number of fragments with a lower average fragment mass.
Abstract: In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as efficient computer proofs of security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the used database, which provides a formal basis for further computer proof constructions in this area.
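The scheme being formalized can be sketched in a few lines outside the proof assistant. This is a plain illustration of textbook Rabin encryption and decryption for primes p ≡ q ≡ 3 (mod 4), not the Isabelle/HOL formalization itself:

```python
def rabin_encrypt(m, n):
    """Rabin encryption: c = m^2 mod n, with public key n = p*q."""
    return pow(m, 2, n)

def rabin_decrypt(c, p, q):
    """Return the four square roots of c modulo n = p*q, assuming
    p ≡ q ≡ 3 (mod 4); the plaintext is one of them."""
    n = p * q
    mp = pow(c, (p + 1) // 4, p)            # square root of c mod p
    mq = pow(c, (q + 1) // 4, q)            # square root of c mod q
    yp, yq = pow(p, -1, q), pow(q, -1, p)   # CRT coefficients
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return {r, n - r, s, n - s}
```

Decryption yields four candidate plaintexts, so practical use needs redundancy in the message to pick the right root; proving facts like "the plaintext is among the decryption candidates" is exactly the kind of correctness property the formalization addresses.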
Abstract: The main objective of this study was to determine whether a minimal increase in road light level (luminance) could lead to improved driving performance among older adults. Older, middle-aged and younger adults were tested in a driving simulator following vision and cognitive screening. Comparisons were made of the performance of simulated night driving under two road light conditions (0.6 and 2.5 cd/m²). At each light level, the effects of self-reported night driving avoidance were examined along with vision/cognitive performance. It was found that increasing the road light level from 0.6 cd/m² to 2.5 cd/m² resulted in improved recognition of signage on straight highway segments. The improvement depends on driver-related factors such as vision and cognitive abilities, and confidence. On curved road sections, the results showed that drivers' performance worsened. It is concluded that while increased road lighting may be helpful to older adults, especially for sign recognition, it may also result in increased driving confidence and thus reduced attention in some driving situations.
Abstract: Clustering algorithms are attractive for the task of class identification in spatial databases. However, the application to large spatial databases raises the following requirements for clustering algorithms: minimal requirements of domain knowledge to determine the input parameters, discovery of clusters with arbitrary shape, and good efficiency on large databases. The well-known clustering algorithms offer no solution to the combination of these requirements. In this paper, a density-based clustering algorithm (DCBRD) is presented, relying on knowledge acquired from the data by dividing the data space into overlapped regions. The proposed algorithm discovers arbitrarily shaped clusters, requires no input parameters and uses the same definitions as the DBSCAN algorithm. We performed an experimental evaluation of its effectiveness and efficiency, and compared the results with those of DBSCAN. The results of our experiments demonstrate that the proposed algorithm is significantly efficient in discovering clusters of arbitrary shape and size.
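Since DCBRD reuses the cluster definitions of DBSCAN, a minimal sketch of those definitions (core points and density-reachable expansion) may help. This is baseline DBSCAN with explicit eps/min_pts inputs — precisely the parameters DCBRD aims to eliminate:

```python
import math

def region_query(points, i, eps):
    """Indices of all points within eps of points[i] (including i)."""
    return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

def dbscan(points, eps, min_pts):
    """Baseline DBSCAN: labels are cluster ids 1, 2, ... or -1 for noise."""
    labels = [0] * len(points)                 # 0 = not yet visited
    cluster = 0
    for i in range(len(points)):
        if labels[i] != 0:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1                     # noise (may be claimed later)
            continue
        cluster += 1
        labels[i] = cluster
        seeds, k = list(neighbors), 0
        while k < len(seeds):                  # expand density-reachable set
            j = seeds[k]
            if labels[j] == -1:                # border point, claim it
                labels[j] = cluster
            elif labels[j] == 0:
                labels[j] = cluster
                jn = region_query(points, j, eps)
                if len(jn) >= min_pts:         # j is a core point too
                    seeds.extend(jn)
            k += 1
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (5, 5)]
labels = dbscan(pts, eps=1.5, min_pts=3)
```

On this toy input the two dense groups become clusters 1 and 2 and the isolated point is labelled noise; DCBRD's contribution is deriving the density thresholds from the data instead of requiring eps and min_pts.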
Abstract: The Minimal Residual (MR) method is modified for adaptive filtering applications. Three forms of MR-based algorithms are presented: i) the low complexity SPCG, ii) MREDSI, and iii) MREDSII. The low complexity SPCG is a reduced-complexity version of a previously proposed SPCG algorithm. The approximations introduced reduce the algorithm to an LMS-type algorithm but maintain the superior convergence of the SPCG algorithm. Both MREDSI and MREDSII are MR-based methods with Euclidean directions of search. The choice of Euclidean directions is shown via simulation to give better misadjustment compared to their gradient-search counterparts.
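The SPCG variants themselves are not reproduced here, but the LMS-type baseline they reduce to and are compared against can be sketched on a hypothetical two-tap system identification problem:

```python
import random

def lms(x, d, taps, mu):
    """Least-mean-squares adaptive filter: w <- w + mu * e * x_vec."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        x_vec = x[n - taps + 1:n + 1][::-1]    # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, x_vec))
        e = d[n] - y                           # a-priori error
        w = [wi + mu * e * xi for wi, xi in zip(w, x_vec)]
    return w

# Hypothetical system identification: the unknown plant is the
# two-tap FIR filter [0.5, 0.3], driven by white input.
rng = random.Random(0)
x = [rng.uniform(-1.0, 1.0) for _ in range(2000)]
d = [0.0] + [0.5 * x[n] + 0.3 * x[n - 1] for n in range(1, len(x))]
w = lms(x, d, taps=2, mu=0.1)
```

In the noiseless case the weights converge to the plant taps; the MR-based algorithms in the abstract target faster convergence and lower misadjustment than this gradient update.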
Abstract: The issue of unintentional islanding in PV grid interconnection still remains a challenge in grid-connected photovoltaic (PV) systems. This paper provides an overview of popularly used anti-islanding detection methods practically applied in PV grid-connected systems. Anti-islanding methods can generally be classified into four major groups: passive methods, active methods, hybrid methods and communication-based methods. Active methods have been the preferred detection technique over the years due to their very small non-detection zone (NDZ) in small-scale distributed generation. Passive methods are comparatively simpler than active methods in terms of circuitry and operation. However, they suffer from a large NDZ that significantly reduces their performance. Communication-based methods inherit the advantages of active and passive methods with reduced drawbacks. The hybrid method, which evolved from the combination of both active and passive methods, has been proven by many researchers to achieve accurate anti-islanding detection. For each of the studied anti-islanding methods, the operation is analyzed and the advantages and disadvantages are compared and discussed. It is difficult to pinpoint a generic method for a specific application, because most of the methods discussed are governed by the nature of the application and system-dependent elements. This study concludes that setup and operation cost is the vital factor in anti-islanding method selection, in order to achieve a minimal compromise between cost and system quality.
Abstract: Several works regarding facial recognition have dealt with methods which identify isolated characteristics of the face or with templates which encompass several regions of it. In this paper a new technique is introduced which approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy and several other pixel characteristics of the image, generating a set of "p" variables. The multivariate data set is approximated with different polynomials minimizing the data fitness error in the minimax sense (L∞-norm). With the use of a Genetic Algorithm (GA) the problem of dimensionality inherent to higher-degree polynomial approximations is circumvented. The GA yields the degree and the values of a set of coefficients of the polynomials approximating the image of a face. The system is trained by finding a family of characteristic polynomials of several variables (pixel characteristics) for each face (say Fi) in the database through a resampling process. A face (say F) is recognized by finding its characteristic polynomials and applying an AdaBoost classifier from F's polynomials to each of the Fi's polynomials. The winner is the polynomial family closest to F's, corresponding to the target face in the database.
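The minimax (L∞-norm) fitness criterion can be sketched minimally, with plain enumeration standing in for the GA search and a hypothetical one-variable candidate set:

```python
def linf_error(coeffs, xs, ys):
    """Maximum absolute deviation (L-infinity norm) of a polynomial fit;
    coeffs[i] is the coefficient of x**i."""
    return max(abs(sum(c * x ** i for i, c in enumerate(coeffs)) - y)
               for x, y in zip(xs, ys))

def best_polynomial(candidates, xs, ys):
    """Candidate with the smallest minimax error (plain enumeration
    standing in for the paper's GA over degrees and coefficients)."""
    return min(candidates, key=lambda c: linf_error(c, xs, ys))

xs, ys = [0, 1, 2, 3], [0, 1, 4, 9]            # data sampled from y = x**2
candidates = [[0.0, 1.0], [1.0, 1.0], [0.0, 0.0, 1.0]]
best = best_polynomial(candidates, xs, ys)
```

The GA in the paper plays the role of this enumeration at scale: it searches degree and coefficient space for the polynomial whose worst-case deviation over the sampled pixel attributes is smallest.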
Abstract: The economic dispatch problem is an optimization problem whose objective function is highly nonlinear, non-convex and non-differentiable and may have multiple local minima. Therefore, classical optimization methods may not converge, or may get trapped in a local minimum. This paper presents a comparative study of four different evolutionary algorithms, i.e. genetic algorithm, bacterial foraging optimization, ant colony optimization and particle swarm optimization, for solving the economic dispatch problem. All the methods are tested on the IEEE 30-bus test system. Simulation results are presented to show the comparative performance of these methods.
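One of the four compared methods, particle swarm optimization, can be sketched in its basic global-best form. The sphere function below is a stand-in test objective, not the economic dispatch cost function:

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=1):
    """Basic global-best particle swarm optimization (minimization)."""
    rnd = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia, cognitive, social
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for k in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][k] = (w * vel[i][k]
                             + c1 * r1 * (pbest[i][k] - pos[i][k])
                             + c2 * r2 * (gbest[k] - pos[i][k]))
                pos[i][k] += vel[i][k]
            val = f(pos[i])
            if val < pbest_val[i]:             # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:            # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda v: sum(x * x for x in v), dim=2)
```

Applying this to economic dispatch means replacing the objective with the generation cost function and adding the power-balance and generator-limit constraints, which the abstract's comparison handles per method.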
Abstract: Automatic keyphrase extraction is useful for efficiently locating specific documents in online databases. While several techniques have been introduced over the years, improvement in the accuracy rate has been minimal. This research examines attribute scores for author-supplied keyphrases to better understand how the scores affect the accuracy rate of automatic keyphrase extraction. Five attributes are chosen for examination: Term Frequency, First Occurrence, Last Occurrence, Phrase Position in Sentences, and Term Cohesion Degree. The results show that First Occurrence is the most reliable attribute. Term Frequency, Last Occurrence and Term Cohesion Degree display a wide range of variation but are still usable with suggested tweaks. Only Phrase Position in Sentences shows a totally unpredictable pattern. The results imply that the commonly used ranking approach, which directly extracts the top-ranked potential phrases from the candidate keyphrase list as the keyphrases, may not be reliable.
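Three of the five attributes can be sketched under assumed definitions — raw count for term frequency, and fractional positions for first/last occurrence. The attribute names follow the abstract, but the exact formulas used in the paper are not reproduced here:

```python
def attribute_scores(words, term):
    """Illustrative scores for a candidate term, under assumed
    definitions: raw term frequency, and first/last occurrence as
    fractions of document length."""
    hits = [i for i, w in enumerate(words) if w == term]
    if not hits:
        return None
    return {"term_frequency": len(hits),
            "first_occurrence": hits[0] / len(words),     # earlier = smaller
            "last_occurrence": hits[-1] / len(words)}

words = "the cat sat on the mat while the cat slept".split()
scores = attribute_scores(words, "cat")
```

Under such definitions, a low first-occurrence score (the phrase appears early) is the kind of signal the study found most reliable.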
Abstract: We consider methods of constructing simple polygons for a set S of n points and applying them to search for the minimal-area polygon. In this paper we propose an approximate algorithm which generates simple polygonalizations of a fixed set of points and finds the minimal-area polygon in O(n³) time using O(n²) memory.
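Any minimal-area polygonalization search needs the area of a candidate simple polygon as a subroutine; the shoelace formula computes it in O(n) time:

```python
def polygon_area(pts):
    """Area of a simple polygon via the shoelace formula, O(n) time.
    pts is the vertex sequence in traversal order."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0
```

The algorithm in the abstract would evaluate this area over the simple polygonalizations it generates and keep the minimum.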
Abstract: A series of Ti-based shape memory alloys with compositions of Ti50Ni49Cr1, Ti50Ni47Cr3 and Ti50Ni45Cr5 was developed by vacuum arc-melting under a purified argon atmosphere. The histometric and corrosion evaluation of these Ti-Ni-Cr shape memory alloys was considered in this research work. The alloys were implanted subcutaneously in rabbits for 4, 8 and 12 weeks. The metallic implants were embedded in order to determine the outcome of implantation on the histometric and corrosion evaluation of the Ti-Ni-Cr metallic strips. Encapsulating membrane formation around the alloys was minimal for all materials. Histomorphometric analyses demonstrated that there were no statistically significant differences between the materials. The corrosion rate was also determined in this study and was within the acceptable range. The results showed that the Ti-Ni-Cr alloy was neither cytotoxic nor caused any systemic reaction in the living system in any of the tests performed. Implantation shows good compatibility and the potential for direct in vivo use.
Abstract: This paper presents a finite element model and analysis of the interaction between a piezoresistive tactile sensor and biological tissues. The tactile sensor is proposed for use in minimally invasive surgery to deliver tactile information about biological tissues to surgeons. The proposed sensor measures the relative hardness of soft contact objects as well as the contact force. Silicone rubbers were used as phantoms of biological tissues. Finite element analysis of the silicone rubbers and of the mechanical structure of the sensor was performed in the COMSOL Multiphysics (v3.4) environment. The simulation results verify the capability of the sensor to differentiate between different kinds of silicone rubber materials.
Abstract: Complexity, as a theoretical background has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up
systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control of the urban environment, and consequently to seek novel methods for steering development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew the design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism.
A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics and in natural processes alike. They are features of a complex system, and they cannot be prevented. Yet their dynamics can be studied and supported.
The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on grounds of being “non-human”.
In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.
Abstract: The segmentation of endovascular tools in fluoroscopy images can be performed accurately, automatically or with minimal user intervention, using known modern techniques; this has been proven in the literature, but no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge enhancement filtering, line detection, and segmentation. A new method is presented that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, and only a minimal image area is filtered globally. Such an algorithm is rapidly computed and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D implementation was able to cope with such planar difficulties.
Abstract: This paper employs a new approach to regulate the blood glucose level of a type 1 diabetic patient under intensive insulin treatment. The closed-loop control scheme incorporates expert knowledge about the treatment by using reinforcement learning theory to maintain the normoglycemic average of 80 mg/dl and a normal free plasma insulin concentration from a severe initial state. The insulin delivery rate is obtained off-line by using the Q-learning algorithm, without requiring an explicit model of the environment dynamics. The implementation of the insulin delivery rate therefore requires simple function evaluation and minimal online computations. Controller performance is assessed in terms of its ability to reject the effect of meal disturbance and to overcome the variability in the glucose-insulin dynamics from patient to patient. Computer simulations are used to evaluate the effectiveness of the proposed technique and to show its superiority in controlling hyperglycemia over other existing algorithms.
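The off-line Q-learning step — learning action values from interaction alone, without an explicit environment model — can be illustrated on a toy problem. The chain environment below is a hypothetical stand-in, not the glucose-insulin model:

```python
import random

def greedy(Q, s, rnd):
    """Argmax action with random tie-breaking."""
    return max(range(len(Q[s])), key=lambda a: (Q[s][a], rnd.random()))

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a toy 5-state chain: action 1 moves right,
    action 0 moves left; reaching state 4 yields reward 1 and ends."""
    rnd = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(5)]
    for _ in range(episodes):
        s = 0
        for _ in range(100):                   # cap episode length
            a = rnd.randrange(2) if rnd.random() < eps else greedy(Q, s, rnd)
            s2 = min(4, s + 1) if a == 1 else max(0, s - 1)
            r, done = (1.0, True) if s2 == 4 else (0.0, False)
            target = r + (0.0 if done else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])   # Q-learning update
            if done:
                break
            s = s2
    return Q

Q = train()
```

The learned table alone defines the controller: online use is a table lookup per state, which mirrors the abstract's point that the insulin delivery rate needs only simple function evaluation at run time.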
Abstract: The multi-agent system approach has proven to be an effective and appropriate abstraction level for constructing whole models of a diversity of biological problems, integrating aspects found in both "micro" and "macro" approaches to modeling this type of phenomena. Taking these considerations into account, this paper presents the important computational characteristics to be gathered into a novel bioinformatics framework built upon a multi-agent architecture. The version of the tool presented herein allows studying and exploring complex problems belonging principally to structural biology, such as protein folding. The bioinformatics framework is used as a virtual laboratory to explore a minimalist model of protein folding as a test case. In order to demonstrate the laboratory concept of the platform as well as its flexibility and adaptability, we studied the folding of two particular sequences, one of 45-mer and another of 64-mer, both described by an HP model (only hydrophobic and polar residues) on a coarse-grained 2D square lattice. As explained in the discussion section of this work, these two sequences were chosen as stress tests for the platform, in order to determine the tools to be created or improved so as to meet the needs of computing and analyzing a given difficult sequence. The underlying philosophy is that the continuous study of sequences itself provides important features to be added to the platform, continually improving its efficiency, as is demonstrated herein.
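The HP-model objective on the 2D square lattice can be sketched directly: the energy of a conformation is minus the number of lattice contacts between hydrophobic (H) residues that are not chain neighbours, and folding seeks the minimum-energy self-avoiding conformation:

```python
def hp_energy(seq, coords):
    """Energy of an HP-model conformation on the 2D square lattice:
    each lattice contact between two H residues that are not chain
    neighbours contributes -1. seq is a string of 'H'/'P'; coords is
    the list of lattice positions of the residues, in chain order."""
    e = 0
    for i in range(len(seq)):
        for j in range(i + 2, len(seq)):       # skip chain neighbours
            if seq[i] == 'H' and seq[j] == 'H':
                (xi, yi), (xj, yj) = coords[i], coords[j]
                if abs(xi - xj) + abs(yi - yj) == 1:   # lattice neighbours
                    e -= 1
    return e

# A 4-mer toy example: stretched out vs. folded into a unit square,
# which brings the two terminal H residues into contact.
stretched = hp_energy("HPPH", [(0, 0), (1, 0), (2, 0), (3, 0)])
folded = hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)])
```

The 45-mer and 64-mer benchmark sequences in the abstract are scored by exactly this kind of contact-counting objective; the platform's agents explore the conformation space searching for low-energy folds.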
Abstract: This investigation examines the effect of the sintering temperature curve on a manufactured nickel powder capillary structure (wick) for a loop heat pipe (LHP). The sintering temperature curve is composed of a region of increasing temperature, a region of constant temperature and a region of declining temperature. The most important region is that in which the temperature increases, which serves as the index for the process. The nickel powder wick is formed during the stage of fixed sintering temperature and the interval between the constant-temperature stage and the falling-temperature stage. When the slope of the curve in the region of increasing temperature is unity (equivalent to 10 °C/min), the structure of the wick is complete and the heat transfer performance is optimal. Experimental tests demonstrate that the heat transfer performance is optimal at 320 W: the minimal total thermal resistance is approximately 0.18 °C/W and the heat flux is 17 W/cm²; the internal parameters of the wick are an effective pore radius of 3.1 μm, a permeability of 3.25×10⁻¹³ m² and a porosity of 71%.
Abstract: The aim of this paper is to present the role of myotonometry in assessing muscle viscoelasticity by measuring the force index (IF) and stiffness (S) of thigh muscle groups. The results are used to improve muscle training. The method is based on a mechanical impulse applied to the muscle group, which evokes a muscle response recorded as acceleration, speed and amplitude curves. From these, information about elasticity and stiffness is obtained from the mechanical oscillations of the muscle tissue. This method offers the possibility of monitoring the muscle's capacity to produce mechanical energy, allowing efficient movement with minimal tissue deformation.
Abstract: A new method for fiber orientation angle optimization of symmetrical multilayer plates such as plywood is proposed. The method seeks minimal compliance by choosing an appropriate fiber orientation angle in the outer layers of a flexural plate. Discrete values of the fiber orientation angle are used in the method. Optimization results for a simply supported plate and for a multispan plate with a uniformly distributed load are provided. The results show that stiffness can be increased by up to 20% by changing the wood fiber orientation angle in one or two outer layers.