Abstract: The design of a pattern classifier includes an attempt
to select, among a set of possible features, a minimal subset of
weakly correlated features that best discriminates between the pattern classes.
This is usually a difficult task in practice, normally requiring the
application of heuristic knowledge about the specific problem
domain. The selection and quality of the features representing each
pattern have a considerable bearing on the success of subsequent
pattern classification. Feature extraction is the process of deriving
new features from the original features in order to reduce the cost of
feature measurement, increase classifier efficiency, and allow higher
classification accuracy. Many current feature extraction techniques
involve linear transformations of the original pattern vectors to new
vectors of lower dimensionality. While this is useful for data
visualization and increasing classification efficiency, it does not
necessarily reduce the number of features that must be measured
since each new feature may be a linear combination of all of the
features in the original pattern vector. In this paper a new approach is
presented to feature extraction in which feature selection, feature
extraction, and classifier training are performed simultaneously using
a genetic algorithm. In this approach each feature value is first
normalized by a linear equation, then scaled by the associated weight
prior to training, testing, and classification. A k-nearest-neighbor (k-NN) classifier is used to
evaluate each set of feature weights. The genetic algorithm optimizes
a vector of feature weights, which are used to scale the individual
features in the original pattern vectors in either a linear or a nonlinear
fashion. With this approach, the number of features used in
classification can be significantly reduced.
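As a rough illustration of the approach described above, the sketch below evolves a feature-weight vector with a simple genetic algorithm and scores each candidate by leave-one-out 1-NN accuracy on the weighted features. The toy dataset, the GA operators (elitist truncation selection, arithmetic crossover, Gaussian mutation) and all parameter values are illustrative assumptions, not the authors' actual configuration.

```python
import random

def loo_knn_accuracy(weights, X, y):
    # leave-one-out 1-NN accuracy on weight-scaled features
    correct = 0
    for i in range(len(X)):
        best_d, best_j = float("inf"), -1
        for j in range(len(X)):
            if j == i:
                continue
            d = sum(w * w * (a - b) ** 2 for w, a, b in zip(weights, X[i], X[j]))
            if d < best_d:
                best_d, best_j = d, j
        correct += (y[best_j] == y[i])
    return correct / len(X)

def evolve_weights(X, y, pop_size=20, gens=30, seed=1):
    rng = random.Random(seed)
    n = len(X[0])
    # seed the population with the unweighted (all-ones) solution
    pop = [[1.0] * n] + [[rng.random() for _ in range(n)] for _ in range(pop_size - 1)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda w: -loo_knn_accuracy(w, X, y))
        pop = scored[: pop_size // 2]                          # elitist truncation selection
        while len(pop) < pop_size:
            a, b = rng.sample(scored[: pop_size // 2], 2)
            child = [(u + v) / 2 for u, v in zip(a, b)]        # arithmetic crossover
            k = rng.randrange(n)
            child[k] = max(0.0, child[k] + rng.gauss(0, 0.3))  # Gaussian mutation
            pop.append(child)
    best = max(pop, key=lambda w: loo_knn_accuracy(w, X, y))
    return best, loo_knn_accuracy(best, X, y)

# toy two-class data: feature 0 is informative, feature 1 is noise
X_demo = [[0.0, 0.7], [0.1, 0.2], [0.2, 0.9], [0.1, 0.5],
          [0.9, 0.1], [1.0, 0.8], [0.8, 0.4], [1.1, 0.6]]
y_demo = [0, 0, 0, 0, 1, 1, 1, 1]
best_w, best_fit = evolve_weights(X_demo, y_demo)
```

Weights driven toward zero mark features that need not be measured at all, which is the feature-selection effect the abstract describes.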
Abstract: The special and unique advantages of explosive
forming have extended its use to different industries. Given the
importance of improving current explosive forming techniques for
increasing the efficiency of, and control over, the explosive
forming process, this paper examines the effects of air and water
as the energy-conveying medium, and the differences between them.
Accordingly, a large number of explosive forming
tests have been conducted on two sizes of thin walled cylindrical
shells by using air and water as the working medium. Comparative
diagrams of the maximum radial deflection of work-pieces of the
same size, as a function of the scaled distance, show that for the
points with the same values of scaled distance, the maximum radial
deformation caused by underwater explosive loading is 4 to 5
times greater than the deflection of shells explosively formed
in air. The results of this experimental research have also been
compared with those of other studies, which show that using water
as the energy-conveying medium increases the efficiency by up to
4.8 times. The
effect of the media on failure modes of the shells, and the necking
mechanism of the walls of the specimens, while being explosively
loaded, are also discussed in this paper. Measurements of the tested
specimens show that the increase in internal volume has been
accompanied by necking of the walls, which finally results in the
radial rupture of the structure.
Abstract: Rational Emotive Behaviour Therapy (REBT),
introduced by Albert Ellis, was the first cognitive behaviour
therapy. It is a systematic and structured psychotherapy that is
effective in treating various psychological problems. A
25-year-old male patient experienced intense fear and situational
panic attacks at the prospect of returning to his faculty and
facing his class-mates after a long absence (2
years). This social anxiety disorder was a major factor that impeded
the progress of his studies. He was treated with behavioural
techniques, such as relaxation breathing, and cognitive
techniques, such as imagery, cognitive restructuring,
rationalization and systematic desensitization. The patient
reported positive improvement in the anxiety disorder, was able to
progress well in his studies, and led a better quality of life as a student.
Abstract: This paper concerns the study of sustainable construction materials applied to the "Health Post", a prototype for primary health care situated in marginalized areas of the world. It is designed for the social and climatic context of Sub-Saharan Africa; however, it could be relocated to other countries of the world with similar urgent needs. The idea is to create a Health Post with local construction materials that have a low environmental impact and promote the local workforce, allowing the reuse of traditional building techniques and lowering production and transport costs. The aim is for the Primary Health Care Centre to be a flexible and expandable structure based on a modular form that can be repeated several times to expand its existing functions. In this way it could serve not only as a health care centre but also as a socio-cultural facility.
Abstract: In recent years, interest in 'Green Computing' has
extended research on energy-saving techniques from home
computers to enterprise client and server machines. Saving energy
and reducing carbon footprints are aspects of Green Computing,
but research in this direction encompasses more than just energy
savings and carbon-footprint reduction. This study provides a
brief account of Green Computing, with emphasis on its current
trends, the challenges in the field, and its future directions.
Abstract: Speckle noise affects all coherent imaging systems
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task. A tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance and
correlation coefficient. The objective of this paper is to
determine the number of levels of Laplacian pyramidal
decomposition, the number of directional decompositions to perform
at each pyramidal level, and the thresholding schemes that yield
optimal despeckling of medical ultrasound images in particular. In
the proposed method, the log-transformed original ultrasound image
is subjected to the contourlet transform to obtain the contourlet
coefficients. The transformed image is then denoised by
thresholding the individual band-pass subbands using the Bayes
shrinkage rule. We quantify the
achieved performance improvement.
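The Bayes shrinkage rule referred to above can be sketched in a few lines. This is a generic BayesShrink-style soft-thresholding sketch applied to an arbitrary subband array, not the authors' contourlet implementation; the median-absolute-deviation noise estimate from the finest subband is a commonly assumed choice.

```python
import numpy as np

def soft_threshold(coeffs, t):
    # shrink coefficients toward zero by threshold t
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def estimate_noise_sigma(finest_band):
    # robust noise estimate: median absolute deviation of the finest subband
    return np.median(np.abs(finest_band)) / 0.6745

def bayes_shrink_threshold(band, sigma_noise):
    # BayesShrink: T = sigma_n^2 / sigma_x, where the signal std sigma_x is
    # estimated from the observed band variance minus the noise variance
    var_band = np.mean(band ** 2)
    sigma_x = np.sqrt(max(var_band - sigma_noise ** 2, 1e-12))
    return sigma_noise ** 2 / sigma_x

# tiny demonstration on a synthetic subband with unit noise
rng = np.random.default_rng(0)
noisy_band = rng.standard_normal(10000)          # stand-in finest subband, sigma = 1
sigma_n = estimate_noise_sigma(noisy_band)
t = bayes_shrink_threshold(2.0 * noisy_band, sigma_n)
denoised = soft_threshold(2.0 * noisy_band, t)
```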
Abstract: Most existing text mining approaches were designed
with the transaction database model in mind. Thus, the mined
dataset is structured using just one concept, the "transaction",
while the whole dataset is modeled using the "set" abstract type.
In such cases, the structure of the whole dataset and the
relationships among the transactions themselves are not modeled
and, consequently, not considered in the mining process.
We believe that taking the structural properties of
hierarchically structured information (e.g. textual documents)
into account in the mining process can lead to better results. For
this purpose, a hierarchical association rule mining approach for
textual documents is proposed in this paper, and the classical
set-oriented mining approach is reconsidered in favour of a
Directed Acyclic Graph (DAG) oriented approach. Natural language
processing techniques are used to obtain the DAG structure. Based
on this graph model, a hierarchical bottom-up algorithm is
proposed. The main idea is that each node is mined together with
its parent node.
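To make the "each node is mined with its parent" idea concrete, here is a deliberately minimal sketch. It assumes a document's DAG is given as a child-to-parents mapping with a set of terms per node, and counts child-term/parent-term associations bottom-up; the data structures and the support-counting scheme are hypothetical simplifications, not the paper's algorithm.

```python
from collections import defaultdict

def bottom_up_mine(parents, terms, min_support=2):
    # parents: node -> list of parent nodes (a DAG, so multiple parents allowed)
    # terms:   node -> set of terms attached to that node
    counts = defaultdict(int)
    for node, ps in parents.items():
        for parent in ps:
            # mine each node together with its parent: count cross-level pairs
            for t in terms.get(node, set()):
                for u in terms.get(parent, set()):
                    counts[(t, u)] += 1
    # keep only the associations that reach the minimum support
    return {pair: c for pair, c in counts.items() if c >= min_support}

# tiny illustrative DAG: two sections under one document node
parents = {"sec1": ["doc"], "sec2": ["doc"]}
terms = {"sec1": {"mining"}, "sec2": {"mining", "graph"}, "doc": {"text"}}
rules = bottom_up_mine(parents, terms, min_support=2)
```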
Abstract: In this paper, we focus on the fusion of images from
different sources using multiresolution wavelet transforms. Based
on a review of popular image fusion techniques used in data
analysis, different pixel- and energy-based methods are evaluated
experimentally. A novel
architecture with a hybrid algorithm is proposed which applies pixel
based maximum selection rule to low frequency approximations and
filter mask based fusion to high frequency details of wavelet
decomposition. The key feature of the hybrid architecture is the
combination of the advantages of pixel- and region-based fusion in
a single image, which can support the development of sophisticated
algorithms enhancing the edges and structural details. A Graphical
User Interface is developed for image fusion to make the research
outcomes available to the end user. To utilize GUI capabilities for
medical, industrial and commercial activities without MATLAB
installation, a standalone executable application is also developed
using the MATLAB Compiler Runtime.
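The hybrid fusion rule described above can be illustrated with a one-level Haar decomposition standing in for the paper's MATLAB wavelet machinery. The 3x3 window-energy mask used on the detail bands is an assumed stand-in for the filter-mask fusion, so treat this as a sketch of the idea rather than the implemented algorithm.

```python
import numpy as np

def haar_dwt2(img):
    # one-level 2-D Haar transform on an even-sized array
    a = (img[0::2] + img[1::2]) / 2.0    # vertical average
    d = (img[0::2] - img[1::2]) / 2.0    # vertical difference
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def ihaar_dwt2(LL, bands):
    # exact inverse of haar_dwt2
    LH, HL, HH = bands
    a = np.empty((LL.shape[0], LL.shape[1] * 2)); d = np.empty_like(a)
    a[:, 0::2] = LL + LH; a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH; d[:, 1::2] = HL - HH
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2] = a + d; out[1::2] = a - d
    return out

def local_energy(coeffs, k=3):
    # k x k sliding-window sum of squared coefficients (edge-padded)
    p = k // 2
    cp = np.pad(coeffs ** 2, p, mode="edge")
    m, n = coeffs.shape
    return sum(cp[i:i + m, j:j + n] for i in range(k) for j in range(k))

def fuse_hybrid(img_a, img_b):
    LL1, det1 = haar_dwt2(img_a); LL2, det2 = haar_dwt2(img_b)
    # pixel-wise maximum-selection rule on the low-frequency approximations
    LLf = np.where(np.abs(LL1) >= np.abs(LL2), LL1, LL2)
    # window-energy mask selection on each high-frequency detail band
    detf = tuple(np.where(local_energy(d1) >= local_energy(d2), d1, d2)
                 for d1, d2 in zip(det1, det2))
    return ihaar_dwt2(LLf, detf)
```

Fusing an image with itself reproduces the image, which is a quick sanity check on both the transform pair and the selection rules.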
Abstract: Steam cracking reactions are always accompanied by the formation of coke, which deposits on the walls of the tubular reactors. This investigation has attempted to control catalytic coking by applying aluminum, zinc and ceramic coatings such as aluminum-magnesium by thermal spray and pack cementation methods. The rate of coke formation during steam cracking of naphtha has been investigated both for uncoated stainless steels (of different alloys) and for metal coatings produced by thermal spray and pack cementation with metal powders of aluminum, aluminum-magnesium, zinc, silicon, nickel and chromium. The results of the study show that passivating the surface of SS321 with a coating of aluminum or aluminum-magnesium can significantly reduce the rate of coke deposition during naphtha pyrolysis. SEM and EDAX techniques (Philips XL Series) were used to examine the coke deposits formed by the metal-hydrocarbon reactions. Our objective was to separate the different stages by identifying their characteristic morphologies.
Abstract: On-line (near-infrared) spectroscopy is widely used to support the operation of complex process systems. Information extracted from a spectral database can be used to estimate unmeasured product properties and to monitor the operation of the process. These techniques are based on looking for similar spectra using nearest-neighbor algorithms and distance-based searching methods. Searching for nearest neighbors in the spectral space is computationally expensive: the cost grows with the number of points in the discrete spectrum and with the number of samples in the database. To reduce the calculation time, some kind of indexing can be used. The main idea presented in this paper is to combine indexing and visualization techniques to reduce the computational requirements of estimation algorithms by providing a two-dimensional indexing that can also be used to visualize the structure of the spectral database. This 2D visualization of the spectral database not only supports the application of distance- and similarity-based techniques but also enables the use of advanced clustering and prediction algorithms based on the Delaunay tessellation of the mapped spectral space. This means the prediction need not use the high-dimensional space but can be based on the mapped space instead. The results illustrate that the proposed method is able to segment (cluster) spectral databases and to detect outliers that are not suitable for instance-based learning algorithms.
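A minimal sketch of the two-dimensional indexing idea: spectra are projected to 2D (a PCA projection is assumed here) and indexed on a uniform grid so that nearest-neighbor queries only inspect a few cells. The uniform grid stands in for the paper's Delaunay-based machinery, and the widening cell search is an approximate pre-filter, not the authors' method.

```python
import numpy as np

def map_to_2d(spectra):
    # project spectra to 2 dimensions with PCA (via SVD on centered data)
    X = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T

def build_grid_index(points2d, cell=1.0):
    # bucket each 2D point into its grid cell
    index = {}
    for i, (x, y) in enumerate(points2d):
        index.setdefault((int(x // cell), int(y // cell)), []).append(i)
    return index

def query_nearest(index, points2d, q, cell=1.0):
    # inspect the query cell and its neighbors, widening until candidates appear
    cx, cy = int(q[0] // cell), int(q[1] // cell)
    for radius in range(1, 50):
        cand = [i for dx in range(-radius, radius + 1)
                  for dy in range(-radius, radius + 1)
                  for i in index.get((cx + dx, cy + dy), [])]
        if cand:
            return min(cand, key=lambda i: np.sum((points2d[i] - q) ** 2))
    return None

# demonstration on synthetic spectra
rng = np.random.default_rng(1)
spectra = rng.standard_normal((50, 40))       # 50 spectra, 40 wavelengths each
pts = map_to_2d(spectra)
idx = build_grid_index(pts)
hit = query_nearest(idx, pts, pts[7])
```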
Abstract: This paper discusses the evolution of image
retrieval techniques, focusing on their development, challenges
and trends. It highlights both the issues already addressed and
those still outstanding. The explosive growth of image data has
led to the need for research and development in image retrieval.
Image retrieval research is moving from keywords, to low-level
features, and on to semantic features. The drive towards semantic
features is due to the problem that keywords can be very
subjective and time-consuming, while low-level features cannot
always describe the high-level concepts in the user's mind.
Abstract: In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a light-weight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the database used, which provides a formal basis for further computer proof constructions in this area.
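For reference, the Rabin scheme itself can be sketched in a few lines: encryption squares the message modulo n = pq with p, q ≡ 3 (mod 4), and decryption takes square roots modulo each factor and combines them with the Chinese remainder theorem. This is the textbook scheme, not the Isabelle/HOL formalization discussed in the abstract, and the tiny primes are for illustration only.

```python
# p and q must be distinct primes congruent to 3 (mod 4)

def rabin_encrypt(m, n):
    return (m * m) % n

def rabin_decrypt(c, p, q):
    n = p * q
    mp = pow(c, (p + 1) // 4, p)   # square root of c modulo p
    mq = pow(c, (q + 1) // 4, q)   # square root of c modulo q
    yp = pow(p, -1, q)             # p^{-1} mod q
    yq = pow(q, -1, p)             # q^{-1} mod p
    # combine the +/- root choices with the Chinese remainder theorem
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return sorted({r, n - r, s, n - s})

p, q = 7, 11                 # toy primes, both congruent to 3 (mod 4)
n = p * q
c = rabin_encrypt(20, n)
roots = rabin_decrypt(c, p, q)
```

Decryption yields up to four square roots; a real implementation must disambiguate among them, e.g. via message padding or redundancy.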
Abstract: In this study, the dispersion of a line of heavy
particles in an isotropic, incompressible three-dimensional
turbulent flow has been studied using Kinematic Simulation
techniques to determine the evolution of the line's fractal
dimension. The fractal dimension of the line is found for
different values of particle gravity (in practice, different
values of particle drift velocity) in the presence of small
particle inertia, and is compared with that obtained for the
diffusion of a material line at the same Reynolds number. It can
be concluded for the dispersion of a line of heavy particles in
turbulent flow that particle gravity affects the fractal dimension
of the line for drift velocities in the range 0.2 < W < 2. As the
particle drift velocity increases, the fractal dimension of the
line decreases, which may be explained by the particles passing
through many scales in their journey in the direction of gravity,
their trajectories being unaffected by these scales at high drift
velocities.
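The abstract does not specify the fractal-dimension estimator used; a standard choice is box counting, sketched below for an arbitrary set of 3-D points. The box sizes and the least-squares slope fit are illustrative assumptions.

```python
import math

def box_counting_dimension(points, box_sizes):
    # estimate the fractal dimension as the least-squares slope of
    # log N(s) versus log(1/s), where N(s) counts occupied boxes of size s
    logs, logN = [], []
    for s in box_sizes:
        occupied = {tuple(int(c // s) for c in p) for p in points}
        logs.append(math.log(1.0 / s))
        logN.append(math.log(len(occupied)))
    n = len(logs)
    mx, my = sum(logs) / n, sum(logN) / n
    num = sum((x - mx) * (y - my) for x, y in zip(logs, logN))
    den = sum((x - mx) ** 2 for x in logs)
    return num / den

# sanity check on a smooth line, whose dimension should be close to 1
line_points = [(i / 5000.0, 0.0, 0.0) for i in range(5000)]
dim = box_counting_dimension(line_points, [0.1, 0.05, 0.025, 0.0125])
```

A smooth material line gives a dimension near 1; turbulent stretching raises it, and the abstract's observation is that larger drift velocities W pull it back down.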
Abstract: In this paper, an extended method of the directionally constrained minimization of power (DCMP) algorithm for broadband signals is proposed. The DCMP algorithm is a useful technique for extracting a target signal from the signals observed by a microphone array system. In the DCMP algorithm, the output power of the microphone array is minimized under the constraint of constant responses to the directions of arrival (DOAs) of specific signals. In our algorithm, by limiting the directional constraint to the direction perpendicular to the sensor array, the computation time is reduced.
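A narrowband sketch of the DCMP idea: minimize the array output power w^H R w subject to the linear constraints C^H w = h, whose closed-form solution is w = R^{-1} C (C^H R^{-1} C)^{-1} h. The uniform linear array, the single unit-gain constraint and the simulated covariance below are assumptions for illustration; the paper's broadband extension and perpendicular-constraint speed-up are not reproduced here.

```python
import numpy as np

def steering_vector(theta, n_mics, spacing=0.5):
    # plane-wave steering vector for a uniform linear array,
    # with mic spacing given in wavelengths
    k = np.arange(n_mics)
    return np.exp(-2j * np.pi * spacing * k * np.sin(theta))

def dcmp_weights(R, C, h):
    # minimize w^H R w subject to C^H w = h:
    #   w = R^{-1} C (C^H R^{-1} C)^{-1} h
    Ri_C = np.linalg.solve(R, C)
    return Ri_C @ np.linalg.solve(C.conj().T @ Ri_C, h)

n = 8
a_target = steering_vector(0.0, n)        # look direction (broadside)
a_interf = steering_vector(0.5, n)        # interferer off-axis
# simulated covariance: unit sensor noise plus a strong interferer
R = np.eye(n) + 10.0 * np.outer(a_interf, a_interf.conj())
C = a_target[:, None]
h = np.array([1.0 + 0.0j])                # unit response toward the target
w = dcmp_weights(R, C, h)
```

The resulting weights keep unit gain toward the target DOA while placing a deep null on the interferer, which is the behavior the DCMP constraint enforces.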
Abstract: Text Mining is an important step of the Knowledge
Discovery process. It is used to extract hidden information from
non-structured or semi-structured data. This aspect is fundamental
because much of the Web's information is semi-structured due to
the nested structure of HTML code, much of it is linked, and much
of it is redundant. Web Text Mining supports the whole knowledge
mining process through the mining, extraction and integration of
useful data, information and knowledge from Web page contents.
In this paper, we present a Web Text Mining process able to
discover knowledge in a distributed and heterogeneous
multi-organization environment. The Web Text Mining process is
based on a flexible architecture and is implemented in four steps
able to examine Web content and to extract useful hidden
information through mining techniques. Our Web Text Mining
prototype starts from the retrieval of Web job offers, from which
a Text Mining process draws out information useful for their fast
classification; this information essentially comprises the job
offer's location and the required skills.
Abstract: This paper invites dialogue and reflection on
innovation and entrepreneurship by presenting concepts of
innovation leading to the introduction of a complex theoretical
framework: Cooperative Innovation (CO-IN). CO-IN is a didactic
model, drawing on a Scandinavian tradition, that enhances and
scaffolds processes of cooperation which create innovation.
CO-IN is based on a cross-sectorial and multidisciplinary
approach. We introduce the concept of complementarity to help
capture the validity of diversity and we suggest the concept of “the
space in between" to understand the creation of identity as a
collective mind. We see dialogue and the use of multimodal
techniques as essential tools for conceptualization, making it
possible to clarify the complexity and diversity and leading to
decision making based on knowledge as commons.
We introduce the didactic design and present our empirical
findings from an innovation workshop in Argentina. In a final
paragraph we reflect on the design as a support of the development of
common ground, collective mind and collective action and the
creation of knowledge as commons to facilitate innovation and
entrepreneurship.
Abstract: In this paper, the implementation of low-power,
high-throughput convolutional filters for the one-dimensional
Discrete Wavelet Transform (DWT) and its inverse is presented. The
analysis filters have already been used for the implementation of a
high performance DWT encoder [15] with minimum memory
requirements for the JPEG 2000 standard. This paper presents the
design techniques and the implementation of the convolutional filters
included in the JPEG2000 standard for the forward and inverse DWT
for achieving low-power operation, high performance and reduced
memory accesses. Moreover, they have the ability of performing
progressive computations so as to minimize the buffering between
the decomposition and reconstruction phases. The experimental
results illustrate the filters' low-power, high-throughput
characteristics as well as their memory-efficient operation.
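The 5/3 analysis/synthesis filters of the JPEG 2000 standard's reversible path can be expressed as a two-step lifting scheme. The sketch below is a plain-Python reference for even-length integer signals with a simplified symmetric boundary extension; it illustrates the filter pair, not the paper's low-power hardware implementation.

```python
def dwt53_forward(x):
    # one level of the reversible LeGall 5/3 lifting scheme (even-length input)
    s = list(x[0::2])            # even samples -> approximation channel
    d = list(x[1::2])            # odd samples  -> detail channel
    for i in range(len(d)):      # predict: d[i] -= floor((s[i] + s[i+1]) / 2)
        right = s[i + 1] if i + 1 < len(s) else s[i]   # symmetric extension
        d[i] -= (s[i] + right) // 2
    for i in range(len(s)):      # update: s[i] += floor((d[i-1] + d[i] + 2) / 4)
        left = d[i - 1] if i >= 1 else d[0]            # symmetric extension
        s[i] += (left + d[i] + 2) // 4
    return s, d

def dwt53_inverse(s, d):
    s, d = list(s), list(d)
    for i in range(len(s)):      # undo the update step
        left = d[i - 1] if i >= 1 else d[0]
        s[i] -= (left + d[i] + 2) // 4
    for i in range(len(d)):      # undo the predict step
        right = s[i + 1] if i + 1 < len(s) else s[i]
        d[i] += (s[i] + right) // 2
    x = [0] * (len(s) + len(d))
    x[0::2], x[1::2] = s, d      # interleave channels back into one signal
    return x

signal = [3, 1, 4, 1, 5, 9, 2, 6]
approx, detail = dwt53_forward(signal)
restored = dwt53_inverse(approx, detail)
```

Because every lifting step is exact integer arithmetic, the inverse reconstructs the input bit-for-bit, which is what makes the 5/3 filter the reversible choice in JPEG 2000.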
Abstract: This paper presents the application of Intelligent
Techniques to the various duties of Intelligent Condition Monitoring
Systems (ICMS) for Unmanned Aerial Vehicle (UAV) Robots. These
Systems are intended to support these Intelligent Robots in the event
of a Fault occurrence. Neural Networks are used for Diagnosis, whilst
Fuzzy Logic is intended for Prognosis and Remedy. The ultimate
goal of ICMS is to avoid large losses in financial cost, time and
data.
Abstract: The quality of a machined surface is becoming more and more important in meeting the increasing demands of sophisticated component performance, longevity, and reliability. Usually, any machining operation leaves its own characteristic evidence on the machined surface in the form of finely spaced micro-irregularities (surface roughness) left by the associated indeterministic characteristics of the different elements of the system: tool, machine, workpart and cutting parameters. However, one of the most influential sources affecting surface roughness in machining is the instantaneous state of the tool edge. The main objective of the current work is to relate the in-process immeasurable cutting edge deformation and surface roughness to more reliable, easy-to-measure force signals using robust non-linear time-dependent regression modeling techniques. Time-dependent modeling is beneficial when modern machining systems, such as adaptive control techniques, are considered, where the state of the machined surface and the health of the cutting edge are monitored, assessed and controlled online using real-time information provided by the variability encountered in the measured force signals. The correlation between wear propagation and roughness variation is developed throughout the different edge lifetimes. The surface roughness is further evaluated in the light of the variation in both the static and the dynamic force signals. Consistent correlation is found between surface roughness variation and tool wear progress within its initial and constant regions. In the first few seconds of cutting, the expected and well-known trend of the effect of the cutting parameters is observed: surface roughness is positively influenced by the level of the feed rate and negatively by the cutting speed. As cutting continues, roughness is affected, to different extents, by the rather localized wear modes on either the tool nose or its flank areas.
Moreover, it seems that roughness varies as the wear mode transfers from one form to another and, in general, it is shown that roughness improves as wear increases, though with possible corresponding workpart dimensional inaccuracy. The dynamic force signals are found to be reasonably sensitive to both the progressive and the random modes of tool edge deformation. While the frictional force components, feeding and radial, are found informative regarding progressive wear modes, the vertical (power) component is found to be a more representative carrier of the system instability resulting from the edge's random deformation.
Abstract: Utilization of bagasse ash as a silica source is one
of the most common applications of this agricultural waste, a
valuable biomass byproduct of sugar milling. The high silica
content of bagasse ash was exploited as the silica source for
sodium silicate solution. Different heating temperatures, times
and acid treatments were studied for silica extraction. The
silica was characterized using various techniques including X-ray
fluorescence, X-ray diffraction, scanning electron microscopy, and
Fourier Transform Infrared Spectroscopy. The synthesis conditions
were optimized to obtain the bagasse ash with the maximum silica
content. A silica content of 91.57 percent was achieved by heating
bagasse ash at 600°C for 3 hours under oxygen feeding combined
with HCl treatment. The result adds value to bagasse ash
utilization and minimizes the environmental impact of its
disposal.