Abstract: A neuron can emit spikes on an irregular time basis, and averaging over a time window discards much of this information. It is known that in the context of fast information processing there is not sufficient time to sample an average firing rate of the spiking neurons. The present work shows that spiking neurons are capable of computing radial basis functions by storing the relevant information in the neurons' delays. One of the fundamental findings of this research is that using overlapping receptive fields to encode the data patterns increases the network's clustering capacity. The clustering algorithm discussed here is interesting from both a computer science and a neuroscience perspective.
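As an illustration of the encoding mentioned above, the sketch below maps a scalar input onto spike times using overlapping Gaussian receptive fields: a field strongly activated by the input fires early, a weakly activated one fires late. The number of fields, the overlap factor, and the time scale are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def gaussian_receptive_field_encoding(value, n_fields=8, lo=0.0, hi=1.0, t_max=10.0):
    # overlapping Gaussian receptive fields spanning [lo, hi]; a strongly
    # activated field fires early, a weakly activated one fires late
    centers = np.linspace(lo, hi, n_fields)
    width = 1.5 * (hi - lo) / (n_fields - 1)  # factor 1.5 makes neighbours overlap
    activation = np.exp(-((value - centers) ** 2) / (2 * width ** 2))
    return t_max * (1.0 - activation)  # one spike time per field
```

The delay-coded distances produced this way are what lets the network evaluate a radial basis function in the time domain.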
Abstract: Acoustic-imaging-based sound localization using a microphone
array is a challenging task in digital signal processing.
Discrete Fourier transform (DFT) based near-field acoustical holography
(NAH) is an important acoustical technique for sound source
localization and provides an efficient solution to the ill-posed problem.
In practice, however, due to the use of a small curtailed aperture
and the significant spectral leakage it causes, the DFT cannot
reconstruct the active region of sound (AROS) effectively, especially
near the edges of the aperture. In this paper, we highlight the
fundamental problems of DFT-based NAH and provide a solution to
the spectral leakage effect through extrapolation based on linear
predictive coding and 2D Tukey windowing. The approach has been
tested on the localization of single and multi-point sound sources.
We observe that incorporating the extrapolation technique increases
the spatial resolution and localization accuracy and reduces spectral
leakage when a small curtailed aperture with a lower number of
sensors is used.
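A 2D Tukey (cosine-tapered) window such as the one mentioned above can be formed as the outer product of two 1D Tukey windows. The sketch below is a minimal illustration; the taper ratio alpha and the sizes are arbitrary choices, and the LPC extrapolation step is not shown.

```python
import numpy as np

def tukey(n, alpha=0.5):
    # 1D cosine-tapered (Tukey) window: flat centre, cosine tapers
    # covering a total fraction alpha of the length at the two ends
    if alpha <= 0:
        return np.ones(n)
    x = np.linspace(0.0, 1.0, n)
    w = np.ones(n)
    left = x < alpha / 2
    right = x >= 1 - alpha / 2
    w[left] = 0.5 * (1 + np.cos(np.pi * (2 * x[left] / alpha - 1)))
    w[right] = 0.5 * (1 + np.cos(np.pi * (2 * x[right] / alpha - 2 / alpha + 1)))
    return w

def tukey2d(rows, cols, alpha=0.5):
    # separable 2D window applied to the measured aperture before the DFT
    return np.outer(tukey(rows, alpha), tukey(cols, alpha))
```

Multiplying the (extrapolated) hologram data by such a window drives the aperture edges smoothly to zero, which reduces the wrap-around discontinuity that causes spectral leakage in the DFT.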
Abstract: Society has grown to rely on Internet services, and the
number of Internet users increases every day. As more and more
users become connected to the network, the window of opportunity
for malicious users to do damage grows ever larger and more
lucrative. The objective of this paper is to incorporate different
techniques into a classifier system to detect and classify intrusions
from normal network packets. Among several techniques, the Steady
State Genetic-based Machine Learning Algorithm (SSGBML) is used
to detect intrusions, and the Steady State Genetic Algorithm (SSGA),
Simple Genetic Algorithm (SGA), Modified Genetic Algorithm, and
Zeroth Level Classifier System are investigated in this research.
SSGA is used as the discovery mechanism instead of SGA, because
SGA replaces all old rules with newly produced rules, preventing
good old rules from participating in the next rule generation. The
Zeroth Level Classifier System plays the role of detector by matching
incoming environment messages against classifiers to determine
whether the current message is normal or an intrusion, and by
receiving feedback from the environment. Finally, in order to attain
the best results, the Modified SSGA enhances the discovery engine
by using fuzzy logic to optimize the crossover and mutation
probabilities. The experiments and evaluations of the proposed
method were performed with the KDD 99 intrusion detection dataset.
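The difference between SGA-style generational replacement and the steady-state replacement described above can be sketched on a generic one-max toy problem; the genome encoding, tournament size, and mutation rate are illustrative assumptions, not the paper's classifier rules.

```python
import random

def one_point_crossover(a, b, rng):
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def flip_mutation(genome, rng, p=0.05):
    return [g ^ 1 if rng.random() < p else g for g in genome]

def steady_state_step(pop, fitness, rng):
    # steady state: breed one child per step and replace only the current
    # worst individual, so good old rules survive into later generations
    # (a generational SGA would instead replace the whole population)
    parents = sorted(rng.sample(pop, 4), key=fitness)[-2:]
    child = flip_mutation(one_point_crossover(parents[0], parents[1], rng), rng)
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    if fitness(child) >= fitness(pop[worst]):
        pop[worst] = child
```

Because only the worst individual is ever overwritten, the best fitness in the population can never decrease between steps, which is exactly the property the abstract credits SSGA with over SGA.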
Abstract: In-core memory requirement is a bottleneck in solving
large three-dimensional Navier-Stokes finite element problem
formulations using sparse direct solvers. An out-of-core solution
strategy is a viable alternative to reduce the in-core memory
requirements while solving large scale problems. This study
evaluates the performance of various out-of-core sequential solvers
based on multifrontal or supernodal techniques in the context of
finite element formulations for three-dimensional problems on a
Windows platform. Here three different solvers, HSL_MA78,
MUMPS and PARDISO are compared. The performance of these
solvers is evaluated on a 64-bit machine with 16GB RAM for finite
element formulation of flow through a rectangular channel. It is
observed that using out-of-core PARDISO solver, relatively large
problems can be solved. The implementation of Newton and
modified Newton iterations is also discussed.
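The distinction between Newton and modified Newton iterations mentioned at the end can be illustrated on a scalar equation; in the finite element setting the reused quantity is the factorized Jacobian rather than a scalar derivative, and the functions below are illustrative, not the paper's flow problem.

```python
def newton(f, df, x, tol=1e-10, max_iter=50):
    # full Newton: derivative (Jacobian) recomputed every iteration
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def modified_newton(f, df, x, tol=1e-10, max_iter=200):
    # modified Newton: derivative evaluated (factorized) once and reused,
    # trading quadratic for linear convergence but saving refactorizations
    d = df(x)
    for _ in range(max_iter):
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            break
    return x
```

Modified Newton is attractive with out-of-core direct solvers precisely because each refactorization is the expensive, disk-bound step that it avoids.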
Abstract: In today's era of plasma and laser cutting, machines using an oxy-acetylene flame remain meritorious due to their simplicity and cost effectiveness. The objective of devising a computer-controlled oxy-fuel profile cutting machine arose from the increasing demand for metal cutting with respect to edge quality, circularity, and less formation of redeposited material. The system has an 8-bit microcontroller-based embedded system, which assures a stipulated time response. A new Windows-based application was devised which takes a standard CAD file (.DXF) as input and converts it into the numerical data required for the controller. It uses VB6 as the front end, with MS Access and AutoCAD as the back end. The system is designed around the AT89C51RD2, a powerful 8-bit ISP microcontroller from Atmel, and is optimized to achieve cost effectiveness while maintaining the required accuracy and reliability for complex shapes. The backbone of the system is a cleverly designed mechanical assembly which, along with the embedded system, results in an accuracy of about 10 microns while maintaining perfect linearity in the cut. This results in a substantial increase in productivity. The observed results also indicate reduced interlaminar spacing of pearlite with an increase in the hardness of the edge region.
Abstract: Sorting has received the most attention among all computational tasks over the past years because sorted data is at the heart of many computations. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. Many parallel sorting algorithms have been investigated for a variety of parallel computer architectures. In this paper, three parallel sorting algorithms have been implemented and compared in terms of their overall execution time. The algorithms implemented are odd-even transposition sort, parallel merge sort, and parallel rank sort. A cluster of workstations (Windows Compute Cluster) has been used to compare the implemented algorithms. The C# programming language is used to develop the sorting algorithms, and the MPI (Message Passing Interface) library has been selected to establish the communication and synchronization between processors. The time complexity of each parallel sorting algorithm is also stated and analyzed.
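Of the three algorithms, odd-even transposition sort is the most architecture-flavored. A sequential simulation of its compare-exchange phases (the paper's implementations are in C# with MPI; this Python sketch only shows the phase structure) looks like:

```python
def odd_even_transposition_sort(a):
    # n phases: even phases compare pairs (0,1),(2,3),...; odd phases
    # compare (1,2),(3,4),...  In the parallel version each pair is a
    # compare-exchange between neighbouring processes.
    a = list(a)
    n = len(a)
    for phase in range(n):
        for i in range(phase % 2, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a
```

With one element per process, the n phases give the O(n) parallel time that makes this sort a natural fit for linear process arrays.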
Abstract: A frequency grouping approach for multi-channel
instantaneous blind source separation (I-BSS) of convolutive
mixtures is proposed for a lower net residual inter-symbol
interference (ISI) and inter-channel interference (ICI) than the
conventional short-time Fourier transform (STFT) approach. Starting
in the time domain, STFTs are taken with overlapping windows to
convert the convolutive mixing problem into frequency domain
instantaneous mixing. Mixture samples at the same frequency but
from different STFT windows are grouped together forming unique
frequency groups.
The individual frequency group vectors are input to the I-BSS
algorithm of choice, from which the output samples are dispersed
back to their respective STFT windows. After applying the inverse
STFT, the resulting time domain signals are used to construct the
complete source estimates via the weighted overlap-add method
(WOLA). The proposed algorithm is tested for source deconvolution
given two mixtures, and simulated along with the STFT approach to
illustrate its superiority for fairly motionless sources.
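The frequency-grouping step above can be sketched with numpy: take overlapping windowed FFTs, then gather, for each frequency bin, its samples across all STFT windows into one group vector. The window type, length, and hop below are illustrative assumptions.

```python
import numpy as np

def stft_frequency_groups(x, win_len=16, hop=8):
    # Hann-windowed STFT of a real signal; returns an array of shape
    # (n_bins, n_frames): row k is the "frequency group" for bin k,
    # i.e. that bin's complex samples gathered across all STFT windows
    win = np.hanning(win_len)
    starts = range(0, len(x) - win_len + 1, hop)
    frames = np.array([np.fft.rfft(x[s:s + win_len] * win) for s in starts])
    return frames.T
```

Each row would then be fed to the instantaneous BSS algorithm of choice, and the separated samples dispersed back into their STFT windows before the inverse transform and weighted overlap-add.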
Abstract: Median filters with larger windows offer greater smoothing and are more robust than median filters with smaller windows. However, the larger median smoothers (the median filters with the larger windows) fail to track low-order polynomial trends in the signals. Due to this, constant regions are produced at signal corners, leading to the loss of fine details. In this paper, an algorithm which combines the ability of the 3-point median smoother to preserve low-order polynomial trends with the superior noise filtering characteristics of the larger median smoother is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated for its performance on a test image corrupted with different types of noise, and the results obtained are included.
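A minimal 1D sketch of the idea is given below; the sample-wise combination rule used here (taking the median of the input, the 3-point output, and the large-window output) is an illustrative assumption, not necessarily the authors' combiner.

```python
import statistics

def median_smooth(x, k):
    # running median of window k (odd); edge windows are clamped
    h = k // 2
    return [statistics.median(x[max(0, i - h):i + h + 1]) for i in range(len(x))]

def combiner(x, k_large=7):
    # marry the trend preservation of the 3-point smoother with the
    # stronger noise rejection of the larger-window smoother
    s3 = median_smooth(x, 3)
    sk = median_smooth(x, k_large)
    return [statistics.median([a, b, c]) for a, b, c in zip(x, s3, sk)]
```

On a linear ramp corrupted by an impulse, both smoothers agree on the trend value, so the combiner rejects the impulse without flattening the ramp.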
Abstract: This paper presents a portable robot to be used for the
welding process in a shipbuilding yard. It has six degrees of freedom
and a 3 kg payload capability. It weighs 21.5 kg, so human workers
can carry it to the work place. Its body is mainly made of magnesium
alloy, with aluminum alloy for the few parts that require high
strength. Since the distance between the robot and the controller can
be up to 50 m, the robot controller controls the robot through
EtherCAT. RTX and KPA are used for real-time EtherCAT control on
Windows XP. The performance of the developed robot was
satisfactory in the welding of U-type cells in a shipbuilding yard.
Abstract: This paper gives a novel approach to real-time speed estimation of multiple traffic vehicles using fuzzy logic and image processing techniques with a proper arrangement of camera parameters. The described algorithm consists of several important steps. First, the background is estimated by computing the median over a time window of specific frames. Second, the foreground is extracted using a fuzzy similarity approach (FSA) between the estimated background pixels and the current frame pixels containing foreground and background. Third, the traffic lanes are divided into two parts, one per direction of travel, for parallel processing. Finally, the speeds of the vehicles are estimated by a maximum a posteriori probability (MAP) estimator. The true ground speed is determined using infrared sensors for three different vehicles, and the results are compared to those of the proposed algorithm, which achieves an accuracy of ±0.74 km/h.
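The first step, background estimation as a per-pixel median over a temporal window of frames, can be sketched with numpy; the frame sizes and values are synthetic, and the fuzzy foreground step and MAP estimator are not shown.

```python
import numpy as np

def estimate_background(frames):
    # frames: array-like of shape (T, H, W); a pixel's background value is
    # its median over the time window, which rejects transient vehicles
    return np.median(np.asarray(frames, dtype=float), axis=0)
```

The temporal median works because any given pixel is covered by a moving vehicle in fewer than half the frames of the window, so the static road surface dominates the median.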
Abstract: Visualizing sound and noise often helps us determine
an appropriate control over the source localization. Near-field
acoustic holography (NAH) is a powerful tool for this ill-posed
problem. However, in practice, due to the small finite aperture size,
discrete Fourier transform (FFT) based NAH cannot predict the
active region of interest (AROI) over the edges of the plane. A few
approaches have been proposed in theory for solving the finite
aperture problem, but most of these methods are not well suited to
practical implementation, especially near the edges of the source. In
this paper, a zip-stuffing extrapolation approach with a 2D Kaiser
window is suggested. It operates in the complex wavenumber space
to localize the predicted sources. We numerically form a practical
environment with touch impact databases to test the localization of
the sound source. It is observed that zip-stuffing aperture
extrapolation and the 2D window with evanescent components
provide greater accuracy, especially with a small aperture and its
derivatives.
Abstract: This paper proposes an efficient method for the design
of a two-channel quadrature mirror filter (QMF) bank. To bring the
reconstruction error to a minimum value close to perfect
reconstruction, a linear optimization process is proposed. The
prototype low-pass filter is designed using the Kaiser window
function. A modified algorithm is developed to optimize the
reconstruction error using a linear objective function through an
iterative method. The results obtained show that the performance of
the proposed algorithm is better than that of existing methods.
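A windowed-sinc low-pass prototype with the Kaiser window, as used above, can be sketched as follows; the tap count, cutoff, and beta are illustrative choices, and the paper's iterative optimization of the reconstruction error is not shown.

```python
import numpy as np

def kaiser_lowpass(num_taps, cutoff, beta=8.0):
    # windowed-sinc FIR low-pass prototype; cutoff is normalized so that
    # 1.0 corresponds to the Nyquist frequency
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = cutoff * np.sinc(cutoff * n)      # ideal low-pass impulse response
    return h * np.kaiser(num_taps, beta)  # Kaiser taper controls sidelobes

def freq_resp(h, w):
    # magnitude response of FIR filter h at radian frequency w
    n = np.arange(len(h))
    return abs(np.sum(h * np.exp(-1j * w * n)))
```

In a QMF bank the high-pass analysis filter follows from the prototype as h1[n] = (-1)^n h0[n], and a design parameter such as the cutoff (as in the paper) or beta is tuned until the overall distortion |H0(w)|^2 + |H1(w)|^2 is as flat as possible.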
Abstract: Panoramic view generation has always offered
novel and distinct challenges in the field of image processing.
Panoramic view generation is nothing but the construction of a
bigger mosaic image from a set of partial images of the desired view.
The paper presents a solution to one of the problems of seascape
image formation, where some of the partial images are color and
others are grayscale. The simplest solution would be to convert all
the image parts into grayscale and fuse them to obtain a grayscale
panorama. But in a multihued world, obtaining a colored seascape
will always be preferred. This can be achieved by picking colors
from the color parts and injecting them into the grayscale parts of
the seascape. So the grayscale image parts must first be colored with
the help of the color image parts, and then these parts can be fused
to construct the seascape image.
The problem of coloring grayscale images has no exact solution.
In the proposed technique of panoramic view generation, the job of
transferring color traits from a reference color image to a grayscale
image is done by a palette-based method. In this technique, a color
palette is prepared using pixel windows of a chosen size taken from
the color image parts. The grayscale image part is then divided into
pixel windows of the same size. For every window of the grayscale
image part, the palette is searched and the equivalent color values are
found, which are used to color the grayscale window. For palette
preparation we have used the RGB color space and Kekre's LUV
color space; Kekre's LUV color space gives better coloring quality.
The search time through the color palette is improved over
exhaustive search by using Kekre's fast search technique.
After coloring the grayscale image pieces, the next job is the
fusion of all these pieces to obtain the panoramic view. For similarity
estimation between partial images, the correlation coefficient is used.
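A stripped-down version of the palette method in RGB space might look like the sketch below; only exhaustive search is shown (Kekre's LUV space and fast search technique are not reproduced), and the window size k is an arbitrary choice.

```python
import numpy as np

def build_palette(color_img, k=2):
    # palette: grayscale k x k windows (keys) paired with the color
    # windows they came from (values)
    gray = color_img.mean(axis=2)
    keys, values = [], []
    for i in range(0, color_img.shape[0] - k + 1, k):
        for j in range(0, color_img.shape[1] - k + 1, k):
            keys.append(gray[i:i + k, j:j + k].ravel())
            values.append(color_img[i:i + k, j:j + k].copy())
    return np.array(keys), values

def colorize(gray_img, keys, values, k=2):
    # for each grayscale window, exhaustively find the closest palette key
    # and paste in the corresponding color window
    out = np.zeros(gray_img.shape + (3,))
    for i in range(0, gray_img.shape[0] - k + 1, k):
        for j in range(0, gray_img.shape[1] - k + 1, k):
            win = gray_img[i:i + k, j:j + k].ravel()
            idx = int(np.argmin(((keys - win) ** 2).sum(axis=1)))
            out[i:i + k, j:j + k] = values[idx]
    return out
```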
Abstract: To strengthen the capital market, there is a need to
integrate the capital markets within the region by removing legal or
informal restrictions, specifically through stock market liberalization.
This paper therefore investigates the effects of subsequent stock
market liberalization on stock market integration in four ASEAN
countries (Malaysia, Indonesia, Thailand, Singapore) and Korea from
1997 to 2007. The correlation between stock market liberalization
and stock market integration is examined by analyzing the stock
prices and returns within the region and in comparison with the
world MSCI index. The event study method is used with windows of
±12 months and T-7 + T. The results show that subsequent stock
market liberalization generally gives minor positive effects on stock
returns, except for one or two countries. The subsequent
liberalization also integrates the markets in the short run and the
long run.
Abstract: We present the development of a system of programs designed for the compilation and execution of applications for handheld computers. In the introduction we describe the purpose of the project and its components. The next two sections present the first two components of the project (the scanner and parser generators). Then we describe the Object Pascal compiler and the virtual machines for Windows and Palm OS. In the conclusion we emphasize the ways in which the project can be extended.
Abstract: The tree-structured approach to non-uniform filterbank
(NUFB) design is normally used for perfect reconstruction (PR). PR
is not always feasible due to certain limitations, i.e., constraints in
selecting design parameters, design complexity, and the fact that the
output is sometimes severely affected by aliasing error if the
necessary and sufficient conditions of PR are not satisfied perfectly.
Therefore, researchers have shown a general interest in near perfect
reconstruction (NPR). In the proposed work, an optimized tree
structure technique is used for the design of an NPR non-uniform
filterbank. Window functions of the Blackman family are used to
design the prototype FIR filter. A single-variable linear optimization
is used to minimize the amplitude distortion. The main feature of the
proposed design is its simplicity together with the linear phase
property.
Abstract: We propose a technique to identify road traffic
congestion levels from the velocity of mobile sensors with high
accuracy and consistency with motorists' judgments. The data
collection utilized a GPS device, a webcam, and an opinion survey.
Human perceptions were used to rate the traffic congestion into three
levels: light, heavy, and jam. The ratings and velocities were then fed
into a decision tree learning model (J48). We successfully extracted
vehicle movement patterns to feed into the learning model using a
sliding window technique. The parameters capturing the vehicle
movement patterns and the window size were heuristically optimized.
The model achieved an accuracy as high as 99.68%. By implementing
the model on existing traffic report systems, the reports will cover
comprehensive areas. The proposed method can be applied to any
part of the world.
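The sliding-window extraction of movement patterns can be sketched as below; the window size, step, and the particular features (mean/min/max speed and a near-stop count) are illustrative assumptions, not the optimized values or exact feature set of the paper.

```python
def sliding_windows(series, size, step=1):
    # overlapping windows over a velocity time series
    return [series[i:i + size] for i in range(0, len(series) - size + 1, step)]

def window_features(win, stop_speed=5.0):
    # per-window features a decision tree learner such as J48 could consume
    return {
        "mean": sum(win) / len(win),
        "min": min(win),
        "max": max(win),
        "stops": sum(1 for v in win if v < stop_speed),
    }
```

Each window's feature vector, paired with the human congestion rating for that time span, forms one training instance for the classifier.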
Abstract: The paper discusses a computationally efficient
method for the design of the prototype filters required for the
implementation of an M-band cosine modulated filter bank. The
prototype filter is formulated as an optimum interpolated FIR filter.
The optimum interpolation factor requiring the minimum number of
multipliers is used. The model filter as well as the image suppressor
are designed using the Kaiser window. The method seeks to optimize
a single parameter, namely the cutoff frequency, to minimize the
distortion in the overlapping passband.
Abstract: Rapid Application Development (RAD) addresses the
ever-expanding need for the speedy development of computer
application programs that are sophisticated, reliable, and
full-featured. Visual Basic was the first RAD tool for the Windows
operating system, and many people still say it is the best. To make
Visual Basic 6 applications more visually attractive, this paper
proposes using VRML scenes within the Visual Basic environment.
Abstract: The temporal nature of negative selection is an under-exploited area. In a negative selection system, newly generated antibodies go through a maturing phase, and the survivors of that phase then wait to be activated by incoming antigens after a certain number of matches. Those without enough matches will age and die, while those with enough matches (i.e., those that are activated) become active detectors. A currently active detector may also age and die if it cannot find any match within a pre-defined (lengthy) period of time. Therefore, what matters in a negative selection system is the dynamics of the involved parties in the current time window, not the whole time duration, which may be up to eternity. This property has the potential to define the uniqueness of negative selection in comparison with other approaches. On the other hand, a negative selection system is trained only with "normal" data samples. It has to learn and discover unknown "abnormal" data patterns on the fly by itself. Consequently, it is more appropriate to utilize negative selection as a system for pattern discovery and recognition rather than just pattern recognition. In this paper, we study the potential of using negative selection in discovering unknown temporal patterns.
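The lifecycle described above (activation after enough matches, death after a matchless time window) can be sketched as follows. The r-contiguous-bits matching rule, the thresholds, and the lifespan are common illustrative choices rather than details from this paper, and the maturing-phase censoring against self samples is omitted.

```python
class Detector:
    def __init__(self, pattern, born):
        self.pattern = pattern
        self.matches = 0
        self.active = False
        self.last_match = born

    def matches_antigen(self, antigen, r=3):
        # r-contiguous-bits rule: some r consecutive positions agree
        p, a = self.pattern, antigen
        return any(p[i:i + r] == a[i:i + r] for i in range(len(p) - r + 1))

def step(detectors, antigen, t, activation_threshold=3, life_span=20):
    # one time step: match, possibly activate, and cull aged-out detectors
    survivors = []
    for d in detectors:
        if d.matches_antigen(antigen):
            d.matches += 1
            d.last_match = t
            if d.matches >= activation_threshold:
                d.active = True
        if t - d.last_match <= life_span:  # dies after a matchless window
            survivors.append(d)
    return survivors
```

Only the detectors' behaviour inside the current time window matters here: a detector's fate depends on its recent matches, not on the whole history of the run.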