Abstract: An unsupervised classification algorithm is derived
by modeling observed data as a mixture of several mutually
exclusive classes that are each described by linear combinations of
independent non-Gaussian densities. The algorithm estimates the
data density in each class by using parametric nonlinear functions
that fit the non-Gaussian structure of the data. This improves
classification accuracy compared with standard Gaussian mixture
models. When applied to textures, the algorithm can learn basis
functions for images that capture the statistically significant structure
intrinsic to the images. We apply this technique to the problem of
unsupervised texture classification and segmentation.
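As a sketch of the classification rule described above, each class can be scored by its own non-Gaussian density model; the Laplacian source density and the known per-class unmixing matrices below are assumptions of this illustration, not details given in the abstract:

```python
import numpy as np

def class_log_likelihood(x, W, log_prior=0.0):
    # ICA-style class density: log p(x | class) = sum_i log p(s_i) + log|det W|,
    # with sources s = W x and a Laplacian source density p(s) = 0.5 * exp(-|s|).
    s = W @ x
    return float(np.sum(-np.abs(s) - np.log(2.0))
                 + np.log(abs(np.linalg.det(W))) + log_prior)

def classify(x, Ws, log_priors):
    # Assign x to the class whose non-Gaussian density model scores highest.
    scores = [class_log_likelihood(x, W, lp) for W, lp in zip(Ws, log_priors)]
    return int(np.argmax(scores))
```

In practice the matrices `Ws` would be learned from data (e.g. by ICA within each mixture component) rather than given.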
Abstract: The purpose of this research was to study inspector performance using computer-based training (CBT). The visual inspection task was a simulated printed circuit board (PCB) with several types of defects. Subjects were 16 undergraduates randomly selected from King Mongkut's University of Technology Thonburi and tested for 20/20 vision. They were then divided by performance into two equal groups (control and treatment) and given information before running the experiment. Only the treatment group received feedback after the first experiment. Results revealed a significant difference for the treatment group at the 0.01 level: the treatment group detected a higher percentage of defects. Moreover, the inspectors' attitude toward using CBT for inspection was favorable. These results show that CBT can be used in training to improve inspector performance.
Abstract: In this paper, we present a new method for
incorporating global shift invariance in support vector machines.
Unlike other approaches which incorporate a feature extraction stage,
we first scale the image and then classify it by using the modified
support vector machines classifier. Shift invariance is achieved by
replacing dot products between patterns used by the SVM classifier
with the maximum cross-correlation value between them. Unlike the
normal approach, in which the patterns are treated as vectors, in our
approach the patterns are treated as matrices (or images). Cross-correlation is computed using computationally efficient techniques such as the fast Fourier transform. The method has been
tested on the ORL face database. The tests indicate that this method
can improve the recognition rate of an SVM classifier.
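The kernel substitution described above can be sketched as follows, assuming full linear 2-D cross-correlation computed via NumPy's FFT routines (the function name and pattern sizes are illustrative):

```python
import numpy as np

def max_xcorr_kernel(A, B):
    # SVM kernel sketch: replace the dot product between two image patterns
    # with the maximum value of their full 2-D cross-correlation, computed
    # with FFTs (correlation = convolution with a flipped kernel).
    h = A.shape[0] + B.shape[0] - 1
    w = A.shape[1] + B.shape[1] - 1
    Fa = np.fft.rfft2(A, s=(h, w))
    Fb = np.fft.rfft2(B[::-1, ::-1], s=(h, w))  # flip turns convolution into correlation
    xcorr = np.fft.irfft2(Fa * Fb, s=(h, w))
    return float(xcorr.max())
```

Because the zero-shift correlation equals the ordinary dot product, this kernel value always dominates the standard linear kernel on the same pair of patterns.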
Abstract: The automatic discrimination of seismic signals is an important practical goal for earth-science observatories due to the large amount of information they receive continuously. An essential discrimination task is to allocate an incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, we present new techniques for seismic signal classification: local, regional and global discrimination. These techniques were tested on seismic signals from the database of the National Geophysical Institute of the Centre National pour la Recherche Scientifique et Technique (Morocco), using the Moroccan software for seismic signal analysis.
Abstract: Pollution emission levels of aircraft engines are
nowadays a matter of high concern. Any technological advance that could reduce
emission levels is always welcome. In what concerns aircraft engines,
a possible solution for this problem could be the use of regenerators
and intercoolers. These components might reduce the specific fuel
consumption, increase efficiency and specific thrust and consequently
reduce the pollution levels of the engine. This is not a novel solution:
these heat exchangers are already in use in stationary engines. For
aircraft engines, however, the extra weight of the required hardware could
outweigh the fuel saved. This work compares a conventional engine
with configurations that use intercoolers and regenerators.
Abstract: In this study, the dispersed model is used to predict
the gas-phase concentration and the liquid drop concentration. The venturi
scrubber efficiency is calculated from the gas-phase concentration. The
modified model has been validated with available experimental data
of Johnstone, Field and Tasler for a range of throat gas velocities,
liquid to gas ratios and particle diameters and is used to study the
effect of some design parameters on collection efficiency.
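As an illustration of how efficiency follows from the gas-phase concentration, a simple first-order capture model can be assumed; the exponential form and the lumped decay constant `k` are assumptions of this sketch, not the paper's dispersed model:

```python
import math

def gas_phase_concentration(c_in, k, x):
    # Assumed first-order capture along the throat: dC/dx = -k C,
    # giving C(x) = C_in * exp(-k x); k lumps drop loading and velocity effects.
    return c_in * math.exp(-k * x)

def collection_efficiency(c_in, c_out):
    # Overall collection efficiency from inlet and outlet
    # gas-phase dust concentrations.
    return 1.0 - c_out / c_in
```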
Abstract: The informational infrastructures of small and medium-sized manufacturing enterprises are relatively poor; there are serious shortages of capital that can be invested in informatization, of computer hardware and software resources, and of human resources. To address the informatization problem in small and medium-sized manufacturing enterprises, and to enable them to apply advanced management thinking and enhance their competitiveness, this paper establishes a manufacturing-oriented informatization platform for small and medium-sized enterprises based on ASP business intelligence technology, which effectively improves the soundness of enterprise decision-making and management informatization.
Abstract: Hypernetworks are a generalized graph structure
representing higher-order interactions between variables. We present a
method for self-organizing hypernetworks to learn an associative
memory of sentences and to recall the sentences from this memory.
This learning method is inspired by the “mental chemistry” model of
cognition and the “molecular self-assembly” technology in
biochemistry. Simulation experiments are performed on a corpus of
natural-language dialogues of approximately 300K sentences
collected from TV drama captions. We report on the sentence
completion performance as a function of the order of word-interaction
and the size of the learning corpus, and discuss the plausibility of this
architecture as a cognitive model of language learning and memory.
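A toy version of the learn-and-recall loop can be sketched as follows, assuming hyperedges are contiguous word tuples of a fixed order and recall is multiplicity-weighted voting (a strong simplification of hypernetwork self-organization):

```python
from collections import Counter

def learn_hyperedges(sentences, order=3):
    # Sample contiguous word tuples of the given order as hyperedges;
    # their multiplicity in the corpus acts as the edge weight.
    library = Counter()
    for s in sentences:
        words = s.split()
        for i in range(len(words) - order + 1):
            library[tuple(words[i:i + order])] += 1
    return library

def complete(library, context, order=3):
    # Recall: hyperedges whose prefix matches the trailing context vote,
    # weighted by multiplicity, for the next word.
    ctx = tuple(context[-(order - 1):])
    votes = Counter()
    for edge, count in library.items():
        if edge[:order - 1] == ctx:
            votes[edge[-1]] += count
    return votes.most_common(1)[0][0] if votes else None
```

Varying `order` corresponds to the order of word-interaction studied in the abstract.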
Abstract: Location-based services (LBS) exploit the known
location of a user to provide services dependent on their geographic
context and personalized needs [1].
The development and arrival of broadband mobile data networks
supported with mobile terminals equipped with new location
technologies like GPS have finally created opportunities for
implementation of LBS applications. On the other hand,
collecting location information in general raises privacy
concerns.
This paper presents results from two surveys of LBS acceptance in
Croatia. The first survey was administered to 181 students, and the
second, extended survey involved a sample of 180 Croatian citizens.
We developed a questionnaire consisting of descriptions of 15
different applications, with a scale measuring users' perceptions
of and attitudes toward these applications.
We report the results to identify potential commercial applications
for LBS in the B2C segment. Our findings suggest that some types of
applications, such as emergency and safety services and navigation, have
significantly higher acceptance rates than other types.
Abstract: In aerospace applications, interactions of airflow with
aircraft structures can result in undesirable structural deformations.
This structural deformation, in turn, can be predicted if the natural
modes of the structure are known. This can be achieved through
conventional modal testing that requires a known excitation force in
order to extract these dynamic properties. This technique can be
experimentally complex because of the need for artificial excitation,
and it also does not represent actual operational conditions. The
current work presents part of a research effort that addresses the practical
implementation of operational modal analysis (OMA) applied to a
cantilevered hybrid composite plate, employing a single contactless
sensing system via a laser vibrometer. The OMA technique extracts the
modal parameters based only on the measurements of the dynamic
response. The OMA results were verified with impact hammer modal
testing and good agreement was obtained.
Abstract: In China, rapid urbanization, industrialization, and
highly accelerated economic development have resulted in the
degradation of water resources. Water quality deterioration usually
results from eutrophication, so treating this type of polluted water
efficiently is an urgent task. In contrast to traditional technologies,
constructed wetlands are effective treatment systems that can be very
useful because they rely on simple technology and have low operational
costs. A pilot-scale treatment system including constructed wetlands was
constructed at XingYun Lake, Yuxi, China, and operated as a primary
treatment measure before eutrophic lake water drained into the riverine
landscape. Water quality indices were determined during the experiment.
The results indicated that removal efficiencies were high for nitrate
nitrogen, chlorophyll-a and algae, with final removal efficiencies
reaching 95.20%, 93.33% and 99.87% respectively, while the removal
efficiencies of total phosphorus and total nitrogen reached only 68.83%
and 50.00% respectively.
Abstract: The mixing of pollutants and sediments in near-shore regions of natural water bodies depends heavily on characteristics such as the strength and frequency of flow instability. In the present paper, the instability of natural convection induced by absorption of solar radiation in littoral regions is considered. Spectral analysis is conducted on the quasi-steady-state flow to reveal the power and frequency modes of the instability at various positions. Results indicate that the power of instability, the number of frequency modes, the prominence of higher frequency modes, and the highest frequency mode increase with the offshore distance and/or Rayleigh number. Harmonic modes are present at relatively low Rayleigh numbers. For a given offshore distance, the position with the strongest power of instability is located adjacent to the sloping bottom, while the frequency modes are the same over the local depth. As the Rayleigh number increases, the unstable region extends toward the shore.
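The spectral analysis step can be sketched as follows, assuming a uniformly sampled time series and a plain FFT power spectrum (illustrative, not the paper's exact procedure):

```python
import numpy as np

def dominant_modes(signal, dt, n_modes=3):
    # Power spectrum of a quasi-steady time series via the FFT; return the
    # strongest (frequency, power) modes, skipping the mean (DC) bin.
    freqs = np.fft.rfftfreq(len(signal), dt)
    power = np.abs(np.fft.rfft(signal)) ** 2
    order = np.argsort(power[1:])[::-1] + 1  # descending power, DC excluded
    return [(float(freqs[i]), float(power[i])) for i in order[:n_modes]]
```

Applied at several positions, such a spectrum reveals how the number and prominence of frequency modes change with offshore distance.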
Abstract: Today’s technology is heavily dependent on web applications, which users are adopting at a very rapid pace and which have made our work more efficient. Examples include webmail, online retail, online gaming, wikis, train and flight departure and arrival information, and many more. Web applications are developed in different languages such as PHP, Python, C# and ASP.NET, using scripts such as HTML and JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites. This has led to the rise of web application security, which can be broadly classified into declarative security and program security. The most common attacks on applications are SQL injection and XSS, which give unauthorized users access with which they can damage or destroy the system. This paper presents a detailed literature description and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
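As an illustration of two standard mitigation steps, parameterized queries block SQL injection and output escaping blocks stored XSS; the sketch below uses Python's sqlite3 and html modules, with an invented table and invented function names:

```python
import html
import sqlite3

def find_user(conn, username):
    # Parameterized query: the input is bound as data, never concatenated
    # into the SQL string, so "x' OR '1'='1" cannot change the query structure.
    cur = conn.execute("SELECT id FROM users WHERE name = ?", (username,))
    return cur.fetchall()

def render_comment(text):
    # Escape user-supplied text before placing it in HTML, so injected
    # <script> tags are rendered as inert text instead of executing.
    return "<p>%s</p>" % html.escape(text)
```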
Abstract: The self-organizing map (SOM) is a well-known data
reduction technique used in data mining. It can reveal structure in
data sets through data visualization that is otherwise hard to detect
from raw data alone. However, interpretation through visual
inspection is prone to errors and can be very tedious. There are
several techniques for the automatic detection of clusters of code
vectors found by SOM, but they generally do not take into account
the distribution of code vectors; this may lead to unsatisfactory
clustering and poor definition of cluster boundaries, particularly
where the density of data points is low. In this paper, we propose the
use of an adaptive heuristic particle swarm optimization (PSO)
algorithm for finding cluster boundaries directly from the code
vectors obtained from SOM. The application of our method to
several standard data sets demonstrates its feasibility. The PSO algorithm
utilizes the so-called U-matrix of the SOM to determine cluster boundaries;
the results of this novel automatic method compare very favorably to
boundary detection through traditional algorithms, namely k-means
and hierarchical clustering, which are normally used to interpret
the output of the SOM.
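A minimal sketch of the idea: a global-best PSO whose fitness is the U-matrix value at a particle's grid cell, since high U-matrix values mark large distances between neighbouring code vectors, i.e. candidate cluster boundaries. The function name, the plain (non-adaptive) PSO variant and all parameter values are illustrative, not the paper's adaptive heuristic:

```python
import numpy as np

def pso_boundary_cell(u_matrix, n_particles=20, iters=60, seed=0):
    # Toy global-best PSO over the U-matrix grid; particles are attracted
    # toward cells with high inter-code-vector distance (boundary ridges).
    rng = np.random.default_rng(seed)
    rows, cols = u_matrix.shape
    pos = rng.uniform([0, 0], [rows - 1, cols - 1], (n_particles, 2))
    vel = np.zeros_like(pos)
    fit = lambda p: u_matrix[int(round(p[0])), int(round(p[1]))]
    pbest = pos.copy()
    pbest_val = np.array([fit(p) for p in pos])
    g = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = np.clip(pos + vel, [0, 0], [rows - 1, cols - 1])
        for i, p in enumerate(pos):
            v = fit(p)
            if v > pbest_val[i]:
                pbest_val[i], pbest[i] = v, p
        g = pbest[pbest_val.argmax()].copy()
    return int(round(g[0])), int(round(g[1]))
```

A full method would trace connected high-U ridges rather than a single cell, but the fitness definition is the key ingredient.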
Abstract: Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing new
parts directly from a 3-dimensional computer-aided design (3D CAD)
model, building new features onto an existing component, and repairing existing high-value components that would have
been discarded in the past. Despite these capabilities and its advantages over other additive manufacturing techniques, the
underlying physics of the LMD process is not yet fully understood, probably because of the high interaction between the processing
parameters; studying many parameters at the same time makes the
process further complex to understand. In this study, the effects of laser power
and powder flow rate on the physical properties (deposition height and
deposition width), metallurgical property (microstructure) and
mechanical property (microhardness) of laser-deposited Ti6Al4V, the most
widely used aerospace alloy, are studied. Also, because Ti6Al4V
is very expensive, and LMD is capable of reducing the buy-to-fly ratio
of aerospace parts, the material utilization efficiency is also studied.
Four sets of experiments were performed and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW, powder flow
rates of 2.88 g/min and 5.67 g/min, and keeping the gas flow rate and
scanning speed constant at 2 l/min and 0.005 m/s respectively. The
deposition height and width are found to increase with increasing laser
power and increasing powder flow rate. Material utilization is favoured by higher laser power, while a higher powder flow rate reduces
material utilization. The results are presented and fully discussed.
Abstract: Medical images require special safety and confidentiality because critical judgments are made on the information they provide. Transmission of medical images via the internet or mobile phones demands strong security and copyright protection in telemedicine applications. Here, a highly secure and robust watermarking technique is proposed for the transmission of image data via the internet and mobile phones. The Region of Interest (ROI) and Non-Region of Interest (RONI) of the medical image are separated, and only the RONI is used for watermark embedding. This technique achieves exact recovery of the watermark on standard medical database images of size 512x512, giving a correlation factor equal to 1. The correlation factor for different attacks such as noise addition, filtering, rotation and compression ranges from 0.90 to 0.95. The PSNR with weighting factor 0.02 is up to 48.53 dB. The presented scheme is non-blind and embeds a hospital logo of size 64x64.
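The ROI/RONI split can be sketched with a simple LSB embedding; the abstract does not specify the embedding method, so the LSB scheme, the rectangular ROI, and the normalized-correlation definition of the "correlation factor" are all assumptions of this sketch:

```python
import numpy as np

def embed_in_roni(image, bits, roi):
    # Write watermark bits into the least-significant bits of pixels lying
    # outside the ROI rectangle (r0:r1, c0:c1), leaving the ROI untouched.
    r0, r1, c0, c1 = roi
    out = image.copy()
    mask = np.ones(image.shape, dtype=bool)
    mask[r0:r1, c0:c1] = False
    idx = np.flatnonzero(mask)[:bits.size]
    flat = out.ravel()
    flat[idx] = (flat[idx] & 0xFE) | bits
    return out

def extract_from_roni(image, n_bits, roi):
    # Read the watermark bits back from the same RONI pixel positions.
    r0, r1, c0, c1 = roi
    mask = np.ones(image.shape, dtype=bool)
    mask[r0:r1, c0:c1] = False
    return image.ravel()[np.flatnonzero(mask)[:n_bits]] & 1

def correlation_factor(w, w_rec):
    # Normalized correlation between embedded and recovered watermark bits.
    w, w_rec = w.astype(float), w_rec.astype(float)
    return float(np.sum(w * w_rec) / np.sqrt(np.sum(w ** 2) * np.sum(w_rec ** 2)))
```

Exact recovery (no attack) gives a correlation factor of 1, matching the figure reported in the abstract.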
Abstract: Nowadays, the demand for high product quality
focuses extensive attention on the quality of the machined surface. CNC
milling machines provide a wide variety of parameter set-ups, making the
machining of glass excellent for manufacturing complicated special
products compared to other machining processes. Moreover, applying a
grinding process on the CNC milling machine can be an ideal solution to
improve product quality, but adopting the right machining
parameters is required. In glass milling operations, several machining
parameters are considered to be significant in affecting surface
roughness. These parameters include the lubrication pressure, spindle
speed, feed rate and depth of cut. In this research work, a fuzzy logic
model is offered to predict the surface roughness of a machined
surface in glass milling operation using CBN grinding tool. Four
membership functions are allocated to be connected with each input
of the model. The predicted results achieved via fuzzy logic model
are compared to the experimental results. The comparison demonstrated
agreement between the fuzzy model and the experimental results, with an
accuracy of 93.103%.
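A minimal sketch of how such a model can be assembled, using triangular membership functions and zero-order Sugeno (weighted-average) inference with two of the four inputs; the membership ranges, rule base and roughness consequents below are invented for illustration and are not the paper's values:

```python
def trimf(x, a, b, c):
    # Triangular membership function with feet at a and c and peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def predict_roughness(feed, speed):
    # Illustrative rule base: rule strength = min of antecedent memberships;
    # output = weighted average of the rules' roughness consequents.
    low_feed = trimf(feed, 0.0, 0.1, 0.3)
    high_feed = trimf(feed, 0.1, 0.3, 0.5)
    low_spd = trimf(speed, 0, 2000, 5000)
    high_spd = trimf(speed, 2000, 5000, 8000)
    rules = [
        (min(low_feed, high_spd), 0.2),   # low feed, high speed -> smoothest
        (min(low_feed, low_spd), 0.5),
        (min(high_feed, high_spd), 0.8),
        (min(high_feed, low_spd), 1.2),   # high feed, low speed -> roughest
    ]
    num = sum(w * ra for w, ra in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else None
```

The paper's model has four inputs with four membership functions each; the same pattern extends directly by adding inputs and rules.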
Abstract: R&D risk management has been suggested as one of
the management approaches for accomplishing the goals of public
R&D investment. Investment in basic science and core technology
development is an essential role of government in securing the
social base needed for continuous economic growth. It is also an
important role of the science and technology policy sectors to generate
a positive environment in which the outcomes of public R&D can be
diffused in a stable fashion by controlling the uncertainties and risk
factors in advance that may arise during the application of such
achievements to society and industry. Various policies have already
been implemented to manage uncertainties and variables that may
have a negative impact on accomplishing public R&D investment goals.
However, new policy measures may be derived for complementing the
existing policies and for exploring future directions by analyzing
them as a policy package from the viewpoint of R&D risk
management.
Abstract: Text mining applies knowledge discovery techniques to unstructured text; it is also termed knowledge discovery in text (KDT) or text data mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the patterns used to assess network performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original neural network, in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated on five data sets: contact-lenses, cpu, weather symbolic, weather and labor-nega-data. It is shown that, compared to the existing neural network, training is more than 10 times faster when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, the only one with missing attributes; for contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
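A minimal sketch of a back-propagation classifier evaluated with k-fold cross-validation (plain NumPy; the toy architecture, hyperparameters and function names are invented for illustration, not the paper's implementation):

```python
import numpy as np

def train_mlp(X, y, hidden=4, lr=0.5, epochs=300, seed=0):
    # One-hidden-layer sigmoid network trained by back-propagation
    # on the cross-entropy loss; returns a prediction function.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1 + b1)                     # forward pass
        p = sig(H @ W2 + b2).ravel()
        d2 = (p - y)[:, None] / len(y)           # back-propagated errors
        gW2, gb2 = H.T @ d2, d2.sum(0)
        d1 = (d2 @ W2.T) * H * (1 - H)
        gW1, gb1 = X.T @ d1, d1.sum(0)
        W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
    return lambda Z: (sig(sig(Z @ W1 + b1) @ W2 + b2).ravel() > 0.5).astype(int)

def cross_val_accuracy(X, y, k=5):
    # k-fold cross-validation: train on k-1 folds, test on the held-out fold.
    idx = np.arange(len(y))
    accs = []
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)
        model = train_mlp(X[tr], y[tr])
        accs.append(np.mean(model(X[fold]) == y[fold]))
    return float(np.mean(accs))
```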
Abstract: The problem of mapping tasks onto a computational grid with the aim to minimize the power consumption and the makespan subject to the constraints of deadlines and architectural requirements is considered in this paper. To solve this problem, we propose a solution from cooperative game theory based on the concept of Nash Bargaining Solution. The proposed game theoretical technique is compared against several traditional techniques. The experimental results show that when the deadline constraints are tight, the proposed technique achieves superior performance and reports competitive performance relative to the optimal solution.
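The bargaining idea can be sketched by brute-force enumeration on a toy instance: among feasible mappings, the Nash Bargaining Solution maximizes the product of utility gains (here, energy and makespan reductions) over a disagreement point. The machine model, disagreement values and function name are assumptions of this sketch, not the paper's formulation:

```python
import itertools

def nash_bargaining_mapping(tasks, machines, d_energy, d_makespan):
    # Enumerate all task-to-machine mappings and pick the one maximizing the
    # Nash product (d_energy - energy) * (d_makespan - makespan); mappings
    # with no gain over the disagreement point are discarded as infeasible.
    best, best_val = None, -1.0
    for mapping in itertools.product(range(len(machines)), repeat=len(tasks)):
        loads = [0.0] * len(machines)
        energy = 0.0
        for work, m in zip(tasks, mapping):
            t = work / machines[m]["speed"]   # execution time on machine m
            loads[m] += t
            energy += t * machines[m]["power"]
        makespan = max(loads)
        g1, g2 = d_energy - energy, d_makespan - makespan
        if g1 > 0 and g2 > 0 and g1 * g2 > best_val:
            best, best_val = mapping, g1 * g2
    return best
```

Enumeration is exponential in the number of tasks; a practical scheduler would replace it with the cooperative-game optimization the abstract describes.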