Abstract: Smart Grids employ wireless sensor networks for
their control and monitoring. Sensors are characterized by limitations
in processing power, energy supply, and memory, which require
particular attention in the design of routing and data management
algorithms.
Since most routing algorithms for sensor networks focus on
finding energy-efficient paths to prolong network lifetime, the power
of sensors on those paths depletes quickly, and the network becomes
incapable of monitoring events in some parts of its target area.
Routing protocol design should therefore consider not only
energy-efficient paths, but energy-efficient algorithms in general.
In this paper we propose an energy-efficient routing protocol for
wireless sensor networks without the support of any location
information system. The reliability and efficiency of this protocol
have been demonstrated in simulation studies comparing it to legacy
protocols. Our simulation results show that the proposed algorithms
scale well with network size and density.
Abstract: There are many methods for designing and implementing
virtual laboratories, owing to their special features. The best-known
architectural designs are event-based; this model is efficient for
virtual laboratories implemented on a local network. Later,
service-oriented architecture (SOA) gave them remote-access capability,
and Peer-to-Peer architecture was employed to exchange data with
higher quality and speed. Other methods, such as agent-based
architecture, attempt to solve the problems of distributed processing
in a complicated laboratory system.
This study first reviews the general principles of designing a
virtual laboratory, and then compares methods based on EDA, SOA, and
agent-based architecture to present the weaknesses and strengths of
each. Finally, we make the best design choice based on existing
conditions and requirements.
Abstract: A gene network captures the regulatory relationships
among genes. Each gene has activators and
inhibitors that regulate its expression positively and negatively
respectively. Genes themselves are believed to act as activators and
inhibitors of other genes. They can even activate one set of genes and
inhibit another set. Identifying gene networks is one of the most
crucial and challenging problems in bioinformatics. Most work done
so far assumes either that there is no time delay in gene regulation
or that there is a constant time delay. We propose a Dynamic
Time-Lagged Correlation-Based Method (DTCBM) to learn gene networks.
It uses time-lagged correlation to find potential gene interactions,
then a post-processing stage to remove false interactions due to
common parents, and finally dynamic correlation thresholds for each
gene to construct the gene network.
DTCBM computes correlations between gene expression signals shifted in
time and therefore accounts for the multiple time-delay relationships
among genes. The method is implemented in MATLAB, and experimental
results on Saccharomyces cerevisiae gene expression data, together
with comparisons against other methods, indicate that it performs
better.
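The central step described above, scanning candidate time lags and keeping the one with the strongest correlation between two expression signals, can be sketched as follows. This is an illustrative reconstruction, not DTCBM's actual MATLAB code; the function name and the maximum-lag parameter are our own.

```python
import numpy as np

def best_time_lagged_correlation(x, y, max_lag):
    """Scan lags 0..max_lag and return (lag, corr) where the Pearson
    correlation between x[t] and y[t + lag] is strongest in magnitude."""
    best_lag, best_corr = 0, 0.0
    for lag in range(max_lag + 1):
        xs, ys = x[:len(x) - lag], y[lag:]   # align x[t] with y[t + lag]
        c = np.corrcoef(xs, ys)[0, 1]
        if abs(c) > abs(best_corr):
            best_lag, best_corr = lag, c
    return best_lag, best_corr
```

In a full pipeline, a gene pair whose best lagged correlation exceeds a per-gene, dynamically chosen threshold would become a candidate edge in the network.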
Abstract: Texture classification is an important image processing
task with a broad application range. Many different techniques for
texture classification have been explored. Using sparse approximation
as a feature extraction method for texture classification is a relatively
new approach, and Skretting et al. recently presented the Frame
Texture Classification Method (FTCM), showing very good results on
classical texture images. As an extension of that work, the FTCM is
here tested on a real-world application: detection of abnormalities
in mammograms. Some extensions to the original FTCM that are
useful in certain applications are implemented: two different smoothing
techniques and a vector augmentation technique. Both detection of
microcalcifications (as a primary detection technique and as a last
stage of a detection scheme), and soft tissue lesions in mammograms
are explored. All the results are interesting, and especially the results
using FTCM on regions of interest as the last stage in a detection
scheme for microcalcifications are promising.
Abstract: This paper describes an enhanced cookie-based
method for counting the visitors of web sites by using a web log
processing system that aims to cope with the ambitious goal of
creating countrywide statistics about the browsing practices of real
human individuals. The focus is on describing a new, more
efficient way of detecting the human beings behind web users by placing
different identifiers on the client computers. We briefly introduce our
processing system designed to handle the massive amount of data
records continuously gathered from the most important content
providers in Hungary. We conclude with statistics over different
time spans comparing the efficiency of several visitor-counting
methods to the one presented here, along with charts on content
providers and web usage based on real data recorded in 2007.
Abstract: Due to the fast development of technology, competition
among technological products is turbulent; it is therefore important
to understand market trends and consumers' demands and preferences.
As smartphones are now prevalent, the main purpose of this paper is
to use the Analytic Hierarchy Process (AHP) to analyze consumers'
purchase evaluation factors for smartphones. Through an AHP expert
questionnaire, the main functions of smartphones are classified into
five aspects, “user interface”, “mobile commerce functions”,
“hardware and software specifications”, “entertainment functions”,
and “appearance and design”, and their weights are analyzed. Four
evaluation criteria under each aspect are then evaluated to rank
their weights. The analysis shows that the factors consumers consider
when purchasing are, in order, “hardware and software
specifications”, “user interface”, “appearance and design”, “mobile
commerce functions”, and “entertainment functions”. The “hardware and
software specifications” aspect obtains a weight of 33.18%, making it
the most important factor consumers take into account. In addition,
the most important evaluation criteria are, in order, central
processing unit, operating system, touch screen, and battery
function. The results of the study can serve as reference data for
mobile phone manufacturers in future design and marketing strategy,
to satisfy the voice of the customer.
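As background on how AHP derives weights such as the 33.18% quoted above, the priority vector of a pairwise comparison matrix is commonly computed as its normalized principal eigenvector (Saaty's eigenvector method). The sketch below is a generic illustration of that step, not the questionnaire analysis performed in the paper; the example matrix is invented.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights of AHP criteria: the normalized principal
    eigenvector of the pairwise comparison matrix (Saaty's method)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # principal eigenvector
    w = np.abs(v)
    return w / w.sum()                              # normalize to sum to 1
```

For a perfectly consistent matrix the weights reproduce the stated importance ratios exactly; for real questionnaire data a consistency ratio would also be checked.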
Abstract: Polymer melt compressibility and mold surface roughness, which are generally ignored during the filling stage of conventional injection molding, may become increasingly significant in micro injection molding, where parts become smaller. Employing the 2.5D generalized Hele-Shaw model, we present here the effects of polymer compressibility and mold surface roughness on mold filling in a micro-thickness cavity. To elucidate the effects of surface roughness, numerical investigations were conducted using a flat-plate cavity whose two halves have different surface roughness, allowing comparison of the flow field on the two halves under identical processing conditions but different roughness. Results show that polymer compressibility and mold surface roughness affect mold filling in micro injection molding: shrinkage is reduced as density increases owing to polymer melt compressibility during the filling stage.
Abstract: As a structure for string-processing problems, the suffix
array is widely known and extensively studied. But if the string
access pattern follows the “90/10” rule, a suffix array cannot take
advantage of the fact that we often find something that we have just
found. Although the splay tree is an efficient data structure for
small documents when the access pattern follows the “90/10” rule, it
requires many structures and an excessive amount of pointer
manipulation to process and search large documents efficiently. In
this paper, we propose a new and conceptually powerful
data structure, called splay suffix arrays (SSA), for string search. This
data structure combines the features of splay trees and suffix arrays
into a new approach suitable for implementation on both conventional
and clustered computers.
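For context, a plain suffix array supports substring search by binary search over lexicographically sorted suffixes; this is the baseline structure that SSA augments with splay-tree-style reuse of recent accesses. A minimal sketch, with a naive construction that is for illustration only:

```python
def build_suffix_array(s):
    """Indices of the suffixes of s in lexicographic order
    (naive O(n^2 log n) construction, fine for illustration)."""
    return sorted(range(len(s)), key=lambda i: s[i:])

def sa_search(s, sa, pattern):
    """Binary-search the suffix array for pattern; return the start
    index of one occurrence, or -1 if pattern does not occur."""
    lo, hi = 0, len(sa)
    while lo < hi:                      # lower bound over sorted suffixes
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(pattern)] < pattern:
            lo = mid + 1
        else:
            hi = mid
    if lo < len(sa) and s[sa[lo]:sa[lo] + len(pattern)] == pattern:
        return sa[lo]
    return -1
```

Under a "90/10" access pattern, every query here still costs a full binary search; the paper's contribution is precisely to make repeated queries cheaper, which this baseline does not do.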
Abstract: Optical character recognition of cursive scripts
presents a number of challenging problems in both segmentation and
recognition processes in different languages, including Persian. To
overcome these problems, we use a newly developed Persian word
segmentation method together with a recognition-based segmentation
technique. The method is robust as well as flexible, and it increases
the system's tolerance to font variations. Implementation results on
a comprehensive database show a high degree of accuracy that meets
the requirements for commercial use. Extended with suitable pre- and
post-processing, the method offers a simple and fast framework for
developing a full OCR system.
Abstract: Mobile systems are powered by batteries, so reducing
system power consumption is key to increasing their autonomy. Since
such systems mostly deal with time-varying signals, we aim to achieve
power efficiency by smartly adapting the system's processing activity
to the local characteristics of the input signal. This is done by
completely rethinking the processing chain and adopting signal-driven
sampling and processing. In this context, a signal-driven filtering
technique based on level-crossing sampling is devised. It adapts the
sampling frequency and the filter order by analysing the local
variations of the input signal, thereby correlating processing
activity with signal variation. This yields a drastic computational
gain for the proposed technique compared to the classical one.
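The core idea of level-crossing-style sampling, retaining a sample only when the signal has changed by some threshold since the last retained sample, can be sketched as below. The threshold parameter `delta` is illustrative, and the paper's technique additionally adapts the filter order, which is not shown here.

```python
def level_crossing_sample(signal, delta):
    """Signal-driven sampling: keep a sample only when the input has
    moved by at least `delta` since the last retained sample, so the
    sampling activity follows the local variations of the signal."""
    kept = [(0, signal[0])]          # always keep the first sample
    last = signal[0]
    for i in range(1, len(signal)):
        if abs(signal[i] - last) >= delta:
            kept.append((i, signal[i]))
            last = signal[i]
    return kept
```

On slowly varying segments almost no samples are retained, which is where the computational gain over uniform-rate processing comes from.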
Abstract: This paper presents two novel techniques for skew
estimation in binary document images. These algorithms are based on
connected component analysis and the Hough transform. Both these
methods focus on reducing the amount of input data provided to
Hough transform. In the first method, referred to as the word centroid
approach, the centroids of selected words are used for skew detection.
In the second method, referred to as the dilate & thin approach,
selected characters are blocked and dilated to obtain word blocks, and
thinning is then applied. The final image fed to the Hough transform
contains the
thinned coordinates of word blocks in the image. The methods have
been successful in reducing the computational complexity of Hough
transform based skew estimation algorithms. Promising experimental
results are also provided to prove the effectiveness of the proposed
methods.
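As a rough illustration of the first method's principle, estimating skew from word centroids, the sketch below scores candidate angles by how tightly the centroids bunch into rows when projected at that angle. It is a simplified stand-in (an angle-axis projection score rather than a full (rho, theta) Hough accumulator), with parameter names of our own choosing.

```python
import numpy as np

def estimate_skew(points, max_angle=10.0, step=0.5):
    """Estimate document skew (degrees) from word centroids: for each
    candidate angle, project the centroids onto the row axis implied
    by that angle and score how sharply they bunch into text lines
    (a peaky projection histogram scores high)."""
    pts = np.asarray(points, dtype=float)
    best_angle, best_score = 0.0, -1.0
    for ang in np.arange(-max_angle, max_angle + step, step):
        t = np.radians(ang)
        # row coordinate of each centroid if the page were skewed by ang
        rho = pts[:, 1] * np.cos(t) - pts[:, 0] * np.sin(t)
        hist, _ = np.histogram(rho, bins=20)
        score = float(np.sum(hist.astype(float) ** 2))
        if score > best_score:
            best_angle, best_score = ang, score
    return best_angle
```

Because only one point per word enters the search, the input to the voting stage is small, which mirrors the data-reduction goal of both proposed methods.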
Abstract: The aim of this study was to extract sugar from sugarcane
using high electric field pulses (HELP) as a non-thermal
cell-permeabilization method. The results show that it is possible to
permeabilize sugarcane cells using HELP in very short times (less
than 10 s) and at room temperature. Increasing the field strength
(from 0.5 kV/cm to 2 kV/cm) and the pulse number (1 to 12) increased
the permeabilization of the sugarcane cells. The energy consumption
during HELP treatment of sugarcane (2.4 kJ/kg) was about 100 times
less than that of thermal cell disintegration at 85 °C.
Abstract: The purpose of my research proposal is to
demonstrate that there is a relationship between EEG and
endometrial cancer.
The above relationship is based on an Aristotelian Syllogism;
since it is known that the 14-3-3 protein is related to the electrical
activity of the brain via control of the flow of Na+ and K+ ions and
since it is also known that many types of cancer are associated with
the 14-3-3 protein, it is possible that there is a relationship
between EEG and cancer. This research will be carried out with
well-defined diagnostic indicators, obtained via the EEG, and
signal-processing procedures and pattern recognition tools such as
neural networks in
order to recognize the endometrial cancer type. The current research
shall compare the findings from EEG and hysteroscopy performed on
women of a wide age range. Moreover, this practice could be
expanded to other types of cancer. The implementation of this
methodology will be completed with the creation of an ontology.
This ontology shall define the concepts existing in this research's
domain and the relationships between them, and will represent the
types of relationships between hysteroscopy and EEG findings.
Abstract: This paper proposes an architecture for a dynamically
reconfigurable arithmetic circuit. Dynamic reconfiguration is a
technique for realizing required functions by changing the hardware
configuration during operation. The proposed circuit is based on a
complex number multiply-accumulation circuit which is used
frequently in the field of digital signal processing. In addition, the
proposed circuit performs real-number double-precision arithmetic
operations. The data formats are single- and double-precision
floating-point numbers based on IEEE 754. The proposed circuit is
designed in VHDL, and its correct operation has been verified by
simulations and experiments.
Abstract: Applications such as telecommunications, hands-free communication, and recording require at least one microphone, and the captured signal is usually corrupted by noise and echo. An important application is speech enhancement, which removes the noise and echo picked up by a microphone alongside the desired speech. Accordingly, the microphone signal has to be cleaned with digital signal processing (DSP) tools before it is played out, transmitted, or stored. Engineers have tried different approaches to improving speech by recovering the desired signal from noisy observations, especially in mobile communication. In this paper we reconstruct a speech signal observed in additive background noise using the Kalman filter, estimating the parameters of the autoregressive (AR) process in the state-space model; the output speech signal is obtained in MATLAB. Accurate Kalman estimation enhances the speech and reduces the noise; we then compare and discuss the results between the actual and estimated values, which produce the reconstructed signals.
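A minimal sketch of the state-space Kalman filtering step described in this abstract follows, assuming the AR coefficients and noise variances are already known (in the paper they are estimated; here they are inputs). The function name and parameter choices are illustrative, and the sketch is in Python rather than the paper's MATLAB.

```python
import numpy as np

def kalman_ar_denoise(y, a, q, r):
    """Kalman-filter a noisy observation sequence y, assuming the
    clean speech follows an AR(p) model
    x[t] = a[0]*x[t-1] + ... + a[p-1]*x[t-p] + w[t]
    (process noise variance q), observed as y[t] = x[t] + v[t]
    (measurement noise variance r). Returns the filtered estimates."""
    p = len(a)
    F = np.zeros((p, p))                 # companion-form state transition
    F[0, :] = a
    F[1:, :-1] = np.eye(p - 1)
    H = np.zeros((1, p)); H[0, 0] = 1.0  # we observe only x[t]
    Q = np.zeros((p, p)); Q[0, 0] = q
    x, P = np.zeros((p, 1)), np.eye(p)
    out = []
    for yt in y:
        x = F @ x                        # predict
        P = F @ P @ F.T + Q
        S = (H @ P @ H.T)[0, 0] + r      # innovation variance
        K = P @ H.T / S                  # Kalman gain
        x = x + K * (yt - x[0, 0])       # update with the residual
        P = (np.eye(p) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)
```

With the model matched to the signal, the filtered output has lower error against the clean signal than the raw noisy observations, which is the enhancement effect the abstract reports.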
Abstract: The paper presents the microhardness and microstructure
results of a low-carbon steel surface melted using a carbon dioxide
laser with a wavelength of 10.6 μm and a maximum output power of
2000 W. The processing parameters, such as the laser power
and the scanning rate were investigated in this study. After surface
melting, two distinct regions formed, corresponding to the melted zone
(MZ) and the heat-affected zone (HAZ). The laser-melted region
displayed fine cellular structures, while the HAZ displayed martensite
or bainite. At different processing parameters, the original
microstructure of this steel (ferrite + pearlite) was transformed
into new martensitic and bainitic phases. The
fine structure and the high microhardness are evidence of the high
cooling rates which follow the laser melting. The melting pool and
the transformed microstructure in the laser surface melted region of
carbon steel showed clear dependence on laser power and scanning
rate.
Abstract: Simultaneous saccharification and fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC *1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using response surface methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of the process variables: incubation temperature (25–45 °C) X1, pH (5.0–7.0) X2, and fermentation time (24–120 h) X3. Ethanol production data obtained from the RSM experiments were subjected to analysis of variance (ANOVA) and fitted with a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out in an online-monitored modular fermenter of 2 L capacity. The maximum response for ethanol production was reached at the optimum values of temperature (32 °C), pH (5.6), and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse under the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod model, the modified logistic model, the modified logistic model incorporating the Luedeking–Piret model, and the modified logistic model incorporating the modified Luedeking–Piret model were evaluated and their constants predicted.
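The model-fitting step of RSM, fitting a full second-order polynomial (intercept, linear, squared, and two-factor interaction terms) by least squares, can be sketched as follows. This is a generic illustration using made-up factor data, not the paper's CCD design points or ethanol measurements.

```python
import numpy as np

def rsm_design(X):
    """Design matrix of the full second-order RSM model: intercept,
    linear, squared, and two-factor interaction columns."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

def fit_rsm(X, y):
    """Least-squares coefficients of the second-order model."""
    beta, *_ = np.linalg.lstsq(rsm_design(X), np.asarray(y, float),
                               rcond=None)
    return beta
```

In an actual RSM study the fitted surface would then be examined with ANOVA and contour plots, and its stationary point taken as the optimum operating condition.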
Abstract: In this paper three different approaches for person
verification and identification, i.e. by means of fingerprints, face and
voice recognition, are studied. Face recognition uses parts-based
representation methods and a manifold learning approach. The
assessment criterion is recognition accuracy. The techniques under
investigation are: a) Local Non-negative Matrix Factorization
(LNMF); b) Independent Components Analysis (ICA); c) NMF with
sparse constraints (NMFsc); d) Locality Preserving Projections
(Laplacianfaces). Fingerprint detection was approached by classical
minutiae (small graphical patterns) matching through image
segmentation by using a structural approach and a neural network as
decision block. For voice/speaker recognition, mel-cepstral and
delta-delta mel-cepstral analysis were used as the main methods to
construct a supervised speaker-dependent voice recognition
system. The final decision (e.g. “accept-reject" for a verification
task) is taken by using a majority voting technique applied to the
three biometrics. The preliminary results, obtained for medium
databases of fingerprints, faces and voice recordings, indicate the
feasibility of our study and an overall recognition precision (about
92%) permitting the utilization of our system for a future complex
biometric card.
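The final fusion step, majority voting over the three biometric decisions, is straightforward to sketch; the function below is a generic illustration (the label strings are our own), returning None when no strict majority exists:

```python
from collections import Counter

def majority_vote(decisions):
    """Fuse per-modality decisions (e.g. "accept"/"reject" from the
    fingerprint, face, and voice classifiers) by strict majority;
    returns None if no label wins a strict majority."""
    label, freq = Counter(decisions).most_common(1)[0]
    return label if freq > len(decisions) / 2 else None
```

With three binary verification decisions a strict majority always exists, so the None case only matters for identification, where each modality may output a different identity.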
Abstract: Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance-based has emerged as the dominant solution to the face recognition problem. Many comparative studies of the performance of appearance-based methods have already been presented in the literature, not rarely with inconclusive and often with contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance-based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance-based methods: principal component analysis, linear discriminant analysis, and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance-based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
Abstract: Dill contains a range of phytochemicals, such as vitamin C and polyphenols, which contribute significantly to its total antioxidant activity. The aim of the current research was to determine the best blanching method for processing dill prior to microwave vacuum drying, based on the content of phenolic compounds, vitamin C, and free-radical scavenging activity. Two blanching media were used, water and steam, and for some of the samples a microwave pretreatment was additionally applied. Vitamin C content, phenolic content, and DPPH˙ radical scavenging activity were evaluated in the dried dill. Blanching affected all tested parameters, and the blanching conditions proved very important. After evaluation of the results, blanching at 90 °C for 30 seconds was established as the best pretreatment method for dill.