Abstract: The ability to distinguish missense nucleotide
substitutions that contribute to harmful effects from those that do
not is a difficult problem, usually addressed through functional in
vivo analyses. In this study, instead of current biochemical
methods, the effects of missense mutations on protein structure and
function were assayed by means of computational methods and
information from databases. To this end, the effects of new
missense mutations in exon 5 of the PTEN gene on protein structure
and function were examined. The gene coding for PTEN has been
identified as a tumor suppressor gene and localized to chromosome
region 10q23.3. These methods showed that the c.319G>A and
c.341T>G missense mutations, recognized in patients with breast
cancer and Cowden disease, could be pathogenic. This approach could
be used for the analysis of missense mutations in other genes.
Abstract: Particle detection in very noisy and low contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of subsurface
spherical particles, and is applied to the processing of softly
fused Au nanoparticles. Transmission Electron Microscopy (TEM) is
used for imaging the nanoparticles, and the proposed algorithm has
been tested on the two-dimensional projected TEM images obtained.
Results are compared with the data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: In this paper, the dynamic analysis of fuel storage
tanks is studied, and equations are presented for the fluid waves
created by storage tank motions. The finite element equations for
fluid-structure interaction, together with the boundary conditions
governing the structure and the fluid, are also examined. A
numerical simulation of the dynamic analysis of a storage tank
containing fluid is then performed with ANSYS software, using its
FSI (Fluid-Structure Interaction) solver, by considering the
simulated fluid motions induced by earthquake loading, based on the
velocities and movements of the structure and fluid under all
governing boundary conditions.
Abstract: Unified Speech Audio Coding (USAC), the latest MPEG standard for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish the speech and audio segments of the input signal. However, owing to shortcomings of the system, introducing an orchestra/percussion classification and modifying the subsequent processing can considerably increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system that extracts only 3 Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13 and uses an Iterative Dichotomiser 3 (ID3) decision tree rather than more complex learning methods; the proposed algorithm therefore has lower computational complexity than most existing algorithms. Considering that frequent switching of attributes may degrade the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy as high as 97%, comparable to the classical 99%.
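The classifier at the core of the approach above can be illustrated with a minimal ID3 sketch. The features and labels below are hypothetical stand-ins (coarsely quantized values, not real MFCCs), and the code follows the textbook ID3 algorithm rather than the paper's exact system:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def id3(rows, labels, attrs):
    """Recursively build an ID3 tree: pick the attribute with the
    highest information gain, split on it, and recurse."""
    if len(set(labels)) == 1:           # pure node: stop
        return labels[0]
    if not attrs:                       # no attributes left: majority vote
        return Counter(labels).most_common(1)[0][0]
    def gain(a):
        rem = 0.0
        for v in set(r[a] for r in rows):
            sub = [l for r, l in zip(rows, labels) if r[a] == v]
            rem += len(sub) / len(rows) * entropy(sub)
        return entropy(labels) - rem
    best = max(attrs, key=gain)
    node = {"attr": best, "children": {}}
    for v in set(r[best] for r in rows):
        sub_rows = [r for r in rows if r[best] == v]
        sub_lbls = [l for r, l in zip(rows, labels) if r[best] == v]
        node["children"][v] = id3(sub_rows, sub_lbls,
                                  [a for a in attrs if a != best])
    return node

def classify(node, row):
    while isinstance(node, dict):
        node = node["children"][row[node["attr"]]]
    return node

# Hypothetical coarse-quantized 3-feature vectors (stand-ins for 3 MFCCs)
rows = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0), (0, 0, 0), (1, 1, 1)]
labels = ["perc", "perc", "orch", "orch", "perc", "orch"]
tree = id3(rows, labels, [0, 1, 2])
print(classify(tree, (1, 0, 1)))  # → orch
```

With only three low-dimensional features and a shallow tree, both training and classification stay cheap, which is the complexity argument the abstract makes.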
Abstract: In this study, we survey a method, with preprocessing, for quickly finding a minimum-link path between two arbitrary points within a simple polygon, where the path may pass only through vertices of the polygon.
Abstract: In this paper, the problem of reducing switching
activity in on-chip buses at the stage of high-level synthesis is
considered, and a high-level low-power bus binding method based on
dynamic bit reordering is proposed. Whereas conventional methods
use a fixed
bit ordering between variables within a bus, the proposed method
switches a bit ordering dynamically to obtain a switching activity
reduction. As a result, the proposed method finds a binding solution
with a smaller value of total switching activity (TSA).
Experimental results show that the proposed method obtains a
binding solution with 12.0-34.9% smaller TSA than the conventional
methods.
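The quantity being minimized can be illustrated directly. The sketch below computes the total switching activity (bit flips between consecutive words) of a hypothetical bus trace carrying two time-multiplexed variables, and shows how reordering the bits of one variable can reduce it; the trace and permutation are invented for illustration:

```python
def switching(a, b, width=4):
    """Number of bit flips between two consecutive words on the bus."""
    return bin((a ^ b) & ((1 << width) - 1)).count("1")

def tsa(seq, width=4):
    """Total switching activity of a word sequence on one bus."""
    return sum(switching(x, y, width) for x, y in zip(seq, seq[1:]))

def reorder_bits(word, perm, width=4):
    """Apply a bit permutation: output bit i takes input bit perm[i]."""
    return sum(((word >> perm[i]) & 1) << i for i in range(width))

# Hypothetical value trace of two variables time-multiplexed on one bus
trace = [0b0001, 0b1000, 0b0001, 0b1000]
print(tsa(trace))  # → 6 (fixed bit ordering)

# Reordering the bits of every second word (the second variable)
# aligns its active bit with the first variable's and removes the flips.
perm = [3, 1, 2, 0]  # swap bits 0 and 3
reordered = [w if i % 2 == 0 else reorder_bits(w, perm)
             for i, w in enumerate(trace)]
print(tsa(reordered))  # → 0
```

A binding algorithm would search over such permutations per bus and per variable pair; this fragment only shows why a dynamic ordering can beat any single fixed one.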
Abstract: This paper proposes a novel frequency offset (FO) estimator for orthogonal frequency division multiplexing (OFDM). Simplicity is the most significant feature of this algorithm, which can be iterated to achieve acceptable accuracy. Moreover, the fractional and integer parts of the FO are estimated jointly with the same algorithm. To do so, instead of using conventional algorithms that usually rely on a correlation function, we use the DFT of the received signal. The complexity is therefore reduced, and the synchronization procedure can be performed by the same hardware used to demodulate the OFDM symbols. Finally, computer simulations show that the accuracy of this method is better than that of other conventional methods.
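The DFT-based estimation idea can be illustrated with a generic sketch (not the paper's exact estimator): the integer and fractional parts of a tone's frequency offset are recovered jointly by locating the peak of a zero-padded DFT. The pilot signal and padding factor below are assumptions for illustration:

```python
import cmath

def dft_peak_freq(x, pad_factor=16):
    """Estimate the frequency (in bins, integer plus fractional part) of
    the dominant tone in x from the peak of a zero-padded DFT."""
    N = len(x)
    M = N * pad_factor
    best_k, best_mag = 0, -1.0
    for k in range(M):
        X = sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / M)
                for n in range(N))
        if abs(X) > best_mag:
            best_mag, best_k = abs(X), k
    return best_k / pad_factor  # frequency in bins

# Hypothetical received pilot: a complex exponential with FO = 5.3 bins
N = 64
fo = 5.3
x = [cmath.exp(2j * cmath.pi * fo * n / N) for n in range(N)]
est = dft_peak_freq(x)
print(round(est, 2))  # → 5.31
```

Here the grid resolution is 1/16 of a subcarrier spacing; finer accuracy can be obtained by repeating the search with a larger padding factor around the current estimate, in the spirit of the iteration mentioned above.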
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization, the Genetic Algorithm (GA) and the Lipschitz
Branch-and-Bound (LBB) algorithm. These techniques have been
used to improve an agricultural system model developed by Dexcel
Limited, New Zealand, which describes a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of the Sammon mapping and parallel coordinates
visualization are described which help visualize some important
properties of the model's output topography. From this study, it is
found that the GA requires fewer function evaluations in
optimization than the LBB algorithm.
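The GA side of such a comparison can be sketched compactly. The objective below is a cheap hypothetical stand-in for the Farm Performance Index (each real FPI evaluation requires a year-long farm simulation), and the operators used here (tournament selection, blend crossover, Gaussian mutation, elitism) are one common real-coded configuration, not necessarily the study's:

```python
import random

def ga_maximize(f, lo, hi, pop_size=20, gens=60, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation, and one-individual elitism."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        elite = max(pop, key=f)              # keep the best individual
        new_pop = [elite]
        while len(new_pop) < pop_size:
            p1 = max(rng.sample(pop, 3), key=f)   # tournament selection
            p2 = max(rng.sample(pop, 3), key=f)
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2  # blend crossover
            child += rng.gauss(0.0, 0.3)           # Gaussian mutation
            new_pop.append(min(max(child, lo), hi))  # clip to bounds
        pop = new_pop
    return max(pop, key=f)

# Hypothetical cheap stand-in for the Farm Performance Index
f = lambda x: -(x - 3.0) ** 2
best = ga_maximize(f, 0.0, 10.0)
print(best)
```

The count of calls to `f` is the budget being compared between the GA and LBB; for an expensive simulator, that count dominates the wall-clock cost.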
Abstract: The Smith predictor is theoretically a good solution to the problem of controlling time-delay systems. However, it is seldom used, because it is almost impossible to find a precise mathematical model of the practical system, and it is very sensitive to uncertain systems with variable time delay. This paper is concerned with a design method for a Smith predictor for a temperature control system using the Coefficient Diagram Method (CDM). The simulation results show that the control system with the Smith predictor designed by CDM is stable and robust while giving the desired time-domain performance.
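The structure of the Smith predictor can be sketched in discrete time. The plant below is a hypothetical first-order model with a 10-sample delay, and the controller is a plain PI loop rather than a CDM design; with a perfect internal model, the delayed output cancels out of the feedback and the loop behaves like the delay-free system:

```python
# Assumed first-order plant y[k+1] = a*y[k] + b*u[k-d]; the CDM-based
# controller of the paper is replaced by a simple PI loop for illustration.
a, b, d = 0.9, 0.1, 10      # plant pole, gain, delay in samples
kp, ki = 1.0, 0.1           # PI gains (hypothetical)
r = 1.0                     # setpoint

y = 0.0                     # real (delayed) plant output
ydm = 0.0                   # internal model WITH delay
yn = 0.0                    # internal model WITHOUT delay
u_buf = [0.0] * d           # delay line for the plant input
S = 0.0                     # integral of the error

for k in range(400):
    # Smith predictor feedback: y + (model without delay - model with delay)
    fb = y + (yn - ydm)
    e = r - fb
    S += e
    u = kp * e + ki * S
    u_old = u_buf.pop(0)     # input reaching the plant d samples late
    u_buf.append(u)
    y = a * y + b * u_old    # real plant
    ydm = a * ydm + b * u_old  # delayed model (identical to the plant here)
    yn = a * yn + b * u      # delay-free model seen by the controller

print(round(y, 3))  # → 1.0
```

Because the model is perfect, `ydm` tracks `y` exactly, so the PI loop effectively controls the delay-free model `yn` and the output settles at the setpoint despite the 10-sample delay. With model mismatch, the cancellation degrades, which is the sensitivity issue the abstract mentions.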
Abstract: This paper presents an application of 5S lean technology to a production facility. Due to increased demand, high product variety, and a push production system, the plant has suffered from excessive waste, unorganized workstations, and an unhealthy work environment. This has translated into increased production cost, frequent delays, and low worker morale. Under such conditions, it has become difficult, if not impossible, to implement effective continuous improvement studies. Hence, the lean project is aimed at diagnosing the production process, streamlining the workflow, removing/reducing process waste, cleaning the production environment, improving the plant layout, and organizing workstations. 5S lean technology is utilized for achieving the project objectives. The work was a combination of both cultural changes and tangible/physical changes on the shop floor. The project has drastically changed the plant and developed the infrastructure for a successful implementation of continuous improvement as well as other best practices and quality initiatives.
Abstract: To reduce the carbon dioxide emission into the
atmosphere, adsorption is believed to be one of the most attractive
methods for post-combustion treatment of flue gas. In this work,
activated carbon (AC) was modified by polyethylenimine (PEI) via
impregnation in order to enhance CO2 adsorption capacity. The
adsorbents were produced at 0.04, 0.16, 0.22, 0.25, and 0.28 wt%
PEI/AC. The adsorption was carried out over a temperature range of
30 °C to 75 °C and at five different gas pressures up to 1 atm.
TG-DTA, FT-IR, UV-visible spectroscopy, and BET were used to characterize
the adsorbents. Effects of PEI loading on the AC for the CO2
adsorption were investigated. Effectiveness of the adsorbents on the
CO2 adsorption including CO2 adsorption capacity and adsorption
temperature was also investigated. The CO2 adsorption capacities
were enhanced as the amount of PEI increased from 0.04 to 0.22 wt%,
before decreasing from 0.25 wt% PEI onwards at 30 °C. The 0.22 wt%
PEI/AC showed a higher adsorption capacity than the unmodified AC
for adsorption at 50 °C to 75 °C.
Abstract: In a world worried about water resources with the
shadow of drought and famine looming all around, the quality of
water is as important as its quantity. The source of all these
concerns is the constant reduction of the quality water available
per capita for different uses.
With an average annual precipitation of 250 mm, compared to the
world average of 800 mm, Iran is considered a water-scarce country,
and the disparity in rainfall distribution, the limitations of
renewable resources, and the population concentration on the
margins of deserts and water-scarce areas have intensified the
problem.
The shortage of per capita renewable freshwater and its poor
quality in large areas of the country, which have saline, brackish,
or hard water resources, together with the profusion of natural and
artificial pollutants, have caused the deterioration of water
quality.
Among the methods for treating and using these waters is the
application of membrane technologies, which have come into focus in
recent years due to their great advantages. This process is quite
efficient in eliminating multivalent ions, and owing to the
possibility of production at different capacities, its application
as a treatment process at points of use, and its lower energy
requirements compared with Reverse Osmosis processes, it can
revolutionize the water and wastewater sector in the years to come.
This article studies the water resources of different capacities in
the Persian Gulf and Oman Sea watershed basins and assesses the
possibility of using the nanofiltration process to treat brackish
and non-conventional waters in these basins.
Abstract: This paper illustrates the use of a combined neural
network model for classification of electrocardiogram (ECG) beats.
We present a trainable neural network ensemble approach to
developing customized electrocardiogram beat classifiers, in an
effort to further improve the performance of ECG processing and to
offer individualized health care.
We propose a three-stage technique for the detection of premature
ventricular contraction (PVC) beats among normal beats and other
heart diseases, comprising denoising, feature extraction, and
classification. First, we investigate the application of the
stationary wavelet transform (SWT) for noise reduction of the ECG
signals. The feature extraction module then extracts 10 ECG
morphological features and one timing interval feature. Finally, a
number of multilayer perceptron (MLP) neural networks with
different topologies are designed.
The performance of the different combination methods as well as
the efficiency of the whole system is presented. Among them,
Stacked Generalization, the proposed trainable combined neural
network model, achieves the highest recognition rate, around 95%.
This network therefore proves to be a suitable candidate for ECG
signal diagnosis systems. ECG samples of the different beat types
were extracted from the MIT-BIH arrhythmia database for the study.
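The benefit of a trainable combiner such as Stacked Generalization can be seen in a deliberately simple toy example (binary features standing in for ECG features, and table-lookup rules standing in for MLPs): each base classifier alone is at chance level, yet a combiner trained on their joint predictions classifies perfectly:

```python
from collections import Counter

# Toy training set: the label is the XOR of two binary "features";
# each base classifier sees only one feature, so neither can beat
# chance alone, but a trained combiner over both predictions can.
data = [((f1, f2), f1 ^ f2) for f1 in (0, 1) for f2 in (0, 1)] * 5

base1 = lambda x: x[0]   # base classifier 1: predicts from feature 1 only
base2 = lambda x: x[1]   # base classifier 2: predicts from feature 2 only

def accuracy(pred, data):
    return sum(pred(x) == y for x, y in data) / len(data)

# Level-1 (meta) training set: base predictions -> true label.
meta_rows = [((base1(x), base2(x)), y) for x, y in data]
# Meta-learner: the majority label for each base-prediction pattern
# (the simplest possible trainable combiner).
votes = {}
for p, y in meta_rows:
    votes.setdefault(p, []).append(y)
meta = {p: Counter(ys).most_common(1)[0][0] for p, ys in votes.items()}

stacked = lambda x: meta[(base1(x), base2(x))]
print(accuracy(base1, data), accuracy(base2, data),
      accuracy(stacked, data))  # → 0.5 0.5 1.0
```

In the actual system the meta-level is itself a neural network trained on the MLPs' outputs, but the principle is the same: the combiner learns where each base model is right.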
Abstract: In this paper, the C1-conforming finite element method is analyzed for a class of nonlinear fourth-order hyperbolic partial differential equations. Some a priori bounds are derived using a Lyapunov functional, and the existence, uniqueness, and regularity of the weak solutions are proved. Optimal error estimates are derived for both semidiscrete and fully discrete schemes.
Abstract: This paper describes a complex energy signal model
that is isomorphic with digital human fingerprint images. By using
signal models, the problem of fingerprint matching is transformed
into the signal processing problem of finding a correlation between
two complex signals that differ by phase-rotation and time-scaling. A
technique for minutiae matching that is independent of image
translation, rotation and linear-scaling, and is resistant to missing
minutiae is proposed. The method was tested using random data
points. The results show that, for matching prints, the scaling and
rotation angles are closely estimated, and a stronger match yields
a higher correlation.
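The phase-rotation and scaling estimation step can be illustrated with a least-squares sketch. This is a simplified version assuming known point correspondences and no translation (assumptions the full method does not need): if the two minutiae sets are related by w ≈ a·z with a = s·e^{jθ}, the best a in the least-squares sense is Σ w·z̄ / Σ|z|²:

```python
import cmath
import math

def estimate_similarity(z, w):
    """Least-squares estimate of the complex factor a = s*exp(j*theta)
    minimizing sum |w_i - a*z_i|^2; returns (scale s, angle theta)."""
    a = sum(wi * zi.conjugate() for zi, wi in zip(z, w)) / \
        sum(abs(zi) ** 2 for zi in z)
    return abs(a), cmath.phase(a)

# Hypothetical minutiae as complex points; w is z rotated by 30 degrees
# and scaled by 1.5.
z = [1 + 2j, -0.5 + 1j, 2 - 1j, -1 - 1j]
a_true = 1.5 * cmath.exp(1j * math.pi / 6)
w = [a_true * zi for zi in z]
s, theta = estimate_similarity(z, w)
print(round(s, 3), round(math.degrees(theta), 1))  # → 1.5 30.0
```

For noisy or partially missing minutiae, the magnitude of the resulting correlation also serves as the match strength, consistent with the abstract's observation that stronger matches correlate more highly.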
Abstract: Macrobenthos distribution along the coastal waters of
Penang National Park was studied to estimate the effect of
different environmental parameters at three stations during six
sampling months, from June 2010 to April 2011. The aim of this
survey was to investigate the effects of different environmental
stresses on the soft-bottom polychaete community along Teluk
Ketapang and Pantai Acheh (Penang National Park) over a one-year
period. Variations in the polychaete community were evaluated using
univariate and multivariate methods. A total of 604 individuals
were examined, which were grouped into 23 families. The family
Nereidae was the most abundant
(22.68%), followed by Spionidae (22.02%), Hesionidae (12.58%),
Nephtylidae (9.27%) and Orbiniidae (8.61%). It is noticeable that
good results can only be obtained on the basis of good taxonomic
resolution. The maximum Shannon-Wiener diversity (H'=2.16) was
recorded at distances of 200 m and 1200 m (August 2010) in Teluk
Ketapang, and the lowest diversity was found at a distance of
1200 m (December 2010) in Teluk Ketapang.
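For reference, the Shannon-Wiener index used above is H' = -Σ p_i ln p_i over the taxon proportions; a minimal sketch with hypothetical family counts (not the study's data):

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i), natural log,
    over taxa with non-zero counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# Hypothetical family counts for one station and sampling month
counts = [30, 25, 15, 10, 10, 5, 5]
print(round(shannon_wiener(counts), 2))  # → 1.75
```

The index rises with both the number of families and the evenness of their abundances, which is why coarse taxonomic resolution (lumping families) deflates it.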
Abstract: Nowadays, the hard disk is one of the most popular storage components. In the hard disk industry, the hard disk drive must pass through various complex processes and test systems, and failures occur at each step. To reduce the waste from these failures, their root causes must be found. Conventional data analysis methods are not effective enough to analyze such a large volume of data. In this paper, we propose a Hough-transform method for straight line detection that helps detect the straight-line defect patterns that occur in hard disk drives. The proposed method helps to increase both the speed and the accuracy of failure analysis.
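The Hough voting scheme for straight lines can be sketched in a few lines. Each point votes, for every candidate normal angle θ, for ρ = x·cosθ + y·sinθ; collinear points pile their votes into a single (θ, ρ) bin. The defect map below is invented for illustration:

```python
import math
from collections import Counter

def hough_lines(points, theta_step_deg=1):
    """Accumulate votes in (theta, rho) space; the strongest bin
    corresponds to the dominant straight line through the points."""
    acc = Counter()
    for x, y in points:
        for t in range(0, 180, theta_step_deg):
            th = math.radians(t)
            rho = x * math.cos(th) + y * math.sin(th)
            acc[(t, round(rho))] += 1
    return acc

# Hypothetical defect map: 40 failing positions along the line y = x
points = [(i, i) for i in range(40)]
(theta, rho), votes = hough_lines(points).most_common(1)[0]
print(theta, rho, votes)  # → 135 0 40
```

The winning bin (θ=135°, ρ=0) is the normal form of the line y = x, and the vote count equals the number of defects on it; in a failure-analysis setting the strongest bins point directly at line-shaped defect patterns.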
Abstract: The clustering ensembles combine multiple partitions
generated by different clustering algorithms into a single clustering
solution. Clustering ensembles have emerged as a prominent method
for improving robustness, stability and accuracy of unsupervised
classification solutions. So far, many contributions have been
made toward finding a consensus clustering. One of the major
problems in clustering ensembles is the consensus function. In this
paper, we first introduce clustering ensembles, the representation
of multiple partitions, and their challenges, and present a
taxonomy of combination algorithms. Secondly, we describe consensus
functions in clustering ensembles, including hypergraph
partitioning, the voting approach, mutual information,
co-association-based functions, and the finite mixture model, and
then explain their advantages, disadvantages, and computational
complexity. Finally, we compare the characteristics of clustering
ensemble algorithms in previous techniques, such as their
computational complexity, robustness, simplicity, and accuracy on
different datasets.
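One of the consensus functions listed above, the co-association approach, is easy to sketch: count how often each pair of points is grouped together across the ensemble, then cluster the resulting similarity matrix (here by simple connected components over a 0.5 threshold; the partitions are hypothetical):

```python
from itertools import combinations

def co_association(partitions, n):
    """C[i][j] = fraction of partitions that place points i and j
    in the same cluster."""
    C = [[0.0] * n for _ in range(n)]
    for labels in partitions:
        for i, j in combinations(range(n), 2):
            if labels[i] == labels[j]:
                C[i][j] += 1
                C[j][i] += 1
    m = len(partitions)
    return [[v / m for v in row] for row in C]

def consensus(C, threshold=0.5):
    """Consensus clusters = connected components of the graph linking
    pairs whose co-association exceeds the threshold."""
    n = len(C)
    label = [-1] * n
    cur = 0
    for s in range(n):
        if label[s] != -1:
            continue
        stack, label[s] = [s], cur
        while stack:
            i = stack.pop()
            for j in range(n):
                if label[j] == -1 and C[i][j] > threshold:
                    label[j] = cur
                    stack.append(j)
        cur += 1
    return label

# Three hypothetical partitions of six points (cluster ids are arbitrary)
parts = [[0, 0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1, 1],   # one partition misplaces point 2
         [1, 1, 1, 0, 0, 0]]   # same split as the first, labels renamed
C = co_association(parts, 6)
print(consensus(C))  # → [0, 0, 0, 1, 1, 1]
```

Note that the co-association matrix is invariant to cluster label renaming, which is exactly why it sidesteps the label-correspondence problem that plagues voting-based consensus functions; its cost, however, is O(n²) per partition.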
Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) is an efficient method of data transmission for high speed
communication systems. However, the main drawback of OFDM systems
is that they suffer from a high Peak-to-Average Power Ratio (PAPR),
which causes inefficient use of the High Power Amplifier and can
limit transmission efficiency. An OFDM signal consists of a large
number of independent subcarriers, as a result of which its
amplitude can have high peak values. In this paper,
we propose an effective PAPR reduction scheme that combines the
DCT and Selected Mapping (SLM) techniques. The scheme is composed
of the DCT followed by SLM, using the Riemann matrix to obtain the
phase sequences for the SLM technique. The simulation results show
that the PAPR can be greatly reduced by applying the proposed
scheme: while plain OFDM had a high PAPR of about 10.4 dB, our
proposed method achieved a reduction of about 4.7 dB in the PAPR
with low computational complexity. This approach also avoids
randomness in phase sequence selection, which makes it simpler to
decode at the receiver. As an added benefit, the matrices can be
generated at the receiver end to obtain the data signal and hence it is
not required to transmit side information (SI).
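The PAPR metric itself is straightforward to compute. The sketch below modulates one hypothetical random QPSK OFDM symbol with a plain inverse DFT and reports its PAPR; it illustrates the metric only, not the proposed DCT-SLM scheme:

```python
import cmath
import math
import random

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def idft(X):
    """Inverse DFT: the OFDM modulator for one symbol."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N
            for n in range(N)]

# One hypothetical OFDM symbol: 64 random QPSK subcarriers
rng = random.Random(0)
qpsk = [rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(64)]
x = idft(qpsk)
# For constant-modulus subcarriers, PAPR is bounded by 10*log10(64) ≈ 18.1 dB
print(round(papr_db(x), 1))
```

SLM-type schemes generate several phase-rotated candidate symbols and transmit the one whose `papr_db` is smallest; the Riemann-matrix variant above removes the need to signal which candidate was chosen.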
Abstract: In this paper, we consider the problem of identifying the unknown source in the Poisson equation. A modified Tikhonov regularization method is presented to deal with the ill-posedness of the problem, and error estimates are obtained with an a priori strategy and an a posteriori choice rule for finding the regularization parameter. Numerical examples show that the proposed method is effective and stable.
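The basic mechanism of Tikhonov regularization (shown here in its standard form, not the paper's modified variant) can be sketched on a hypothetical ill-conditioned 2x2 system: direct inversion amplifies a small data perturbation, while the regularized normal equations keep the solution near the true one:

```python
def solve2(M, c):
    """Solve a 2x2 linear system M x = c by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(c[0] * M[1][1] - c[1] * M[0][1]) / det,
            (M[0][0] * c[1] - M[1][0] * c[0]) / det]

def tikhonov(A, b, lam):
    """Minimize ||A x - b||^2 + lam ||x||^2 via the normal equations
    (A^T A + lam I) x = A^T b."""
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    Atb = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
    M = [[AtA[0][0] + lam, AtA[0][1]],
         [AtA[1][0], AtA[1][1] + lam]]
    return solve2(M, Atb)

# Hypothetical ill-conditioned system with true solution x = (1, 1)
A = [[1.0, 1.0], [1.0, 1.0001]]
b_noisy = [2.001, 2.0001]        # exact b is (2, 2.0001); small noise added

naive = solve2(A, b_noisy)       # direct inversion blows up the noise
reg = tikhonov(A, b_noisy, 1e-3) # regularized solution stays near (1, 1)
print([round(v, 2) for v in naive], [round(v, 2) for v in reg])
# → [11.0, -9.0] [1.0, 1.0]
```

Choosing `lam` is the crux: too small and the noise amplification returns, too large and the solution is over-smoothed, which is why the paper devotes its error estimates to a priori and a posteriori parameter-choice rules.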