Abstract: In this paper we consider the problem of distributed adaptive estimation in wireless sensor networks under two different observation noise conditions. In the first case, we assume that some sensors in the network have high observation noise variance (noisy sensors). In the second case, the observation noise variance is assumed to differ among the sensors, which is closer to a real scenario. In both cases, an initial estimate of each sensor's observation noise is obtained. For the first case, we show that when such sensors are present, the performance of conventional distributed adaptive estimation algorithms such as the incremental distributed least mean square (IDLMS) algorithm degrades drastically, and that detecting and ignoring these sensors leads to better estimation performance. We then propose a simple algorithm to detect these noisy sensors and modify the IDLMS algorithm to deal with them. For the second case, we propose a new algorithm in which the step-size parameter of each sensor is adjusted according to its observation noise variance. As the simulation results show, the proposed methods outperform the IDLMS algorithm under the same conditions.
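The step-size adaptation idea in the second case can be sketched in a few lines. The following is a minimal illustration, not the paper's algorithm: an incremental LMS pass around a ring of sensors, where each sensor's step size is scaled down by an assumed inverse relation to its (known) observation noise variance; the regressors, ring size, and scaling rule are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 4, 8                                   # parameter length, number of sensors
w_true = rng.standard_normal(M)               # unknown vector to estimate
noise_var = np.array([0.01] * 6 + [1.0] * 2)  # last two sensors are "noisy"

# Hypothetical rule: shrink each sensor's step size as its noise variance grows.
mu = 0.05 / (1.0 + noise_var / noise_var.min())

w = np.zeros(M)                               # shared estimate passed around the ring
for _ in range(200):                          # incremental cycles
    for k in range(N):
        u = rng.standard_normal(M)            # regressor observed at sensor k
        d = u @ w_true + np.sqrt(noise_var[k]) * rng.standard_normal()
        w = w + mu[k] * (d - u @ w) * u       # local LMS update, then pass w on
```

With this rule the noisy sensors barely perturb the shared estimate, while the clean sensors drive convergence.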
Abstract: Despite many success stories of manufacturing safety, many organizations are still reluctant to invest in it, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading-indicator measures. This study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect before-and-after safety programme records from a tobacco company, covering 1993-2001 (pre-safety period) and 2002-2008 (safety period), for the model application. Three combinatorial alternatives A, B and C were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, and all years of the safety programme showed savings except 2004. The study provides a leading resource for planning a successful safety programme.
Abstract: The purpose of this research was to develop a biological nutrient removal (BNR) system with low energy consumption, sludge production, and land usage. These features indicate that the BNR system could be an alternative for future wastewater treatment in the ubiquitous city (U-city). Organics and nitrogen compounds could be removed by this system so that the secondary or tertiary stages of wastewater treatment satisfy their standards. The system was composed of oxic and anoxic filters filled with PVDC and POM media. The anoxic/oxic filter system was operated at an empty bed contact time of 4 hours while the recirculation ratio was increased from 0 to 100%. The system removals of total nitrogen and COD were 76.3% and 93%, respectively. To observe the internal behavior of the system, SCOD, NH3-N, and NO3-N were measured, and their removals ranged over 25-100%, 59-99%, and 70-100%, respectively.
Abstract: In this paper, the least-squares design of variable fractional-delay (VFD) finite impulse response (FIR) digital differentiators is proposed. The transfer function is formulated so that the Farrow structure can be applied to realize the designed system. The symmetric characteristics of the filter coefficients are also derived, which reduces complexity by saving almost half of the coefficients. Moreover, all the elements of the vectors and matrices involved in the optimization can be represented in closed form, which makes the design easier. A design example is presented to illustrate the effectiveness of the proposed method.
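As an illustration of the Farrow idea (not the paper's least-squares differentiator design), the sketch below builds an order-3 Lagrange variable fractional-delay filter, recovers each tap's polynomial dependence on the delay parameter to obtain the fixed Farrow subfilters, and recombines them with powers of the delay at run time; the Lagrange design and all parameters are assumptions.

```python
import numpy as np

N = 3  # polynomial / filter order

def lagrange_fd_taps(d):
    """Order-N Lagrange FIR taps realizing a fractional delay of d samples."""
    n = np.arange(N + 1)
    return np.array([np.prod((d - n[n != k]) / (k - n[n != k]))
                     for k in range(N + 1)])

# Each tap h_k(d) is a degree-N polynomial in d; solving a Vandermonde system
# at N+1 sample delays yields the fixed Farrow subfilter matrix C, so that
# taps(d) = [1, d, ..., d^N] @ C for any delay d.
ds = np.arange(N + 1, dtype=float)
H = np.stack([lagrange_fd_taps(d) for d in ds])
C = np.linalg.solve(np.vander(ds, N + 1, increasing=True), H)

def farrow_delay(x, d):
    """Delay x by d samples via Farrow recombination of the fixed subfilters."""
    taps = (d ** np.arange(N + 1)) @ C
    return np.convolve(x, taps)[: len(x)]

ramp = np.arange(20.0)
y = farrow_delay(ramp, 1.5)   # Lagrange is exact on polynomials up to degree N
```

Only the final recombination depends on the delay, which is exactly what makes the Farrow structure attractive for variable-delay filtering.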
Abstract: Noise level has critical effects on the diagnostic performance of the signal-averaged electrocardiogram (SAECG), because the true starting and end points of the QRS complex can be masked by the residual noise and are sensitive to the noise level. Several studies and commercial machines have used a fixed number of heart beats (typically between 200 and 600 beats) or a predefined noise level (typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to perform SAECG analysis. However, different criteria or methods used to perform SAECG cause discrepancies in the noise levels among study subjects. According to the recommendations of the 1991 ESC, AHA and ACC Task Force Consensus Document for the use of SAECG, the determinations of onset and offset are related closely to the mean and standard deviation of the noise sample. Hence, this study performed SAECG using consistent root-mean-square (RMS) noise levels among study subjects and analyzed the effects of noise level on SAECG. It also evaluated the differences between normal subjects and chronic renal failure (CRF) patients in the time-domain SAECG parameters.
The study subjects comprised 50 normal Taiwanese and 20 CRF patients. During signal-averaged processing, different RMS noise levels were applied to evaluate their effects on three time-domain parameters: (1) filtered total QRS duration (fQRSD), (2) RMS voltage of the last 40 ms of the QRS (RMS40), and (3) duration of the low-amplitude signals below 40 μV (LAS40). The results demonstrated that reducing the RMS noise level can increase fQRSD and LAS40, decrease RMS40, and further increase the differences in fQRSD and RMS40 between normal subjects and CRF patients. The SAECG may also become abnormal due to the reduction of the RMS noise level. In conclusion, it is essential to establish diagnostic criteria for SAECG using consistent RMS noise levels to reduce noise level effects.
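The Task Force rule mentioned above, which ties onset/offset determination to the mean and standard deviation of a noise sample, can be sketched as follows. This is a simplified illustration with hypothetical parameters (noise window length, the factor k), not the exact clinical procedure:

```python
import numpy as np

def qrs_onset_offset(vm, fs, noise_win=0.04, k=3.0):
    """Locate filtered QRS onset/offset on the vector magnitude `vm`:
    estimate noise statistics from a terminal window, then take the first
    and last samples exceeding mean + k * std of that noise sample."""
    noise = vm[-int(noise_win * fs):]
    level = noise.mean() + k * noise.std()
    above = np.flatnonzero(vm > level)
    return above[0], above[-1]

# Synthetic averaged record: 1 uV baseline with a 100 ms, 50 uV QRS bump.
fs = 1000
vm = np.full(fs, 1.0)
vm[400:500] = 50.0
onset, offset = qrs_onset_offset(vm, fs)
```

A lower residual noise level lowers the detection threshold, which is why fQRSD tends to lengthen as the RMS noise level is reduced.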
Abstract: The processing of the electrocardiogram (ECG) signal consists essentially of detecting the characteristic points of the signal, which are an important tool in the diagnosis of heart diseases; the most important of these is the detection of R waves. In this paper, we present various mathematical tools used for filtering the ECG, based on digital filtering and Discrete Wavelet Transform (DWT) filtering. In addition, this paper presents two main R-peak detection methods that apply a windowing process: the first method is based on derivative calculations, and the second is a time-frequency method based on the Dyadic Wavelet Transform (DyWT).
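A minimal sketch of the first (derivative-based) idea, not the authors' exact algorithm, differentiates the signal, squares it to emphasize the steep QRS slopes, and then searches a short window around each above-threshold slope for the R maximum; the threshold, window and refractory values are assumptions:

```python
import numpy as np

def detect_r_peaks(ecg, fs, refractory=0.25):
    """Toy derivative-based R-peak detector with a windowed maximum search."""
    energy = np.diff(ecg) ** 2                  # squared derivative
    thresh = 0.5 * energy.max()                 # crude fixed threshold
    peaks = []
    for i in np.flatnonzero(energy > thresh):
        if peaks and i - peaks[-1] < refractory * fs:
            continue                            # still inside the last beat
        lo = max(i - int(0.05 * fs), 0)         # 50 ms window around the slope
        hi = min(i + int(0.05 * fs), len(ecg))
        peaks.append(lo + int(np.argmax(ecg[lo:hi])))
    return np.array(peaks)

# Synthetic ECG: narrow Gaussian "R waves" once per second for 10 s.
fs = 250
t = np.arange(0, 10, 1 / fs)
beats = np.arange(0.5, 10, 1.0)
ecg = sum(np.exp(-((t - b) ** 2) / (2 * 0.01 ** 2)) for b in beats)
peaks = detect_r_peaks(ecg, fs)
```

The refractory window plays the role of the windowing process mentioned above: once a beat is accepted, nearby slope crossings from the same QRS are ignored.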
Abstract: Optical Burst Switching (OBS) is a relatively new optical switching paradigm. Contention and burst loss in OBS networks are major concerns. To resolve contentions, an interesting alternative to discarding the entire data burst is to drop it only partially. Partial burst dropping is based on the burst segmentation concept, whose implementation is constrained by several technical challenges, besides the complexity added to the algorithms and protocols on both edge and core nodes. In this paper, the burst segmentation concept is investigated, and an implementation scheme is proposed and evaluated. An appropriate dropping policy that effectively manages the size of the segmented data bursts is presented. The dropping policy is further supported by a new control packet format that provides constant transmission overhead.
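As a toy illustration of partial dropping (the paper's actual policy and packet format are more involved), one common segmentation policy drops just enough head segments of the contending burst to clear the overlap with the already-scheduled burst; the head-dropping choice and fixed segment duration here are assumptions:

```python
import math

def head_segments_to_drop(scheduled_end, new_start, seg_dur):
    """Number of whole head segments of the contending burst that must be
    dropped so the remainder starts after the scheduled burst ends."""
    overlap = max(0.0, scheduled_end - new_start)
    return math.ceil(overlap / seg_dur)

# A burst arriving at t=7.5 contends with one scheduled until t=10.0;
# with 1.0-unit segments, its first three segments are dropped.
dropped = head_segments_to_drop(10.0, 7.5, 1.0)
```

Only the overlapping segments are lost, instead of the entire burst.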
Abstract: The amazing development of information technology, the expansion of communications and the internet, city managers' need for new ideas to run the city, and greater participation of citizens all encourage us to complete the electronic city as soon as possible. The foundations of this electronic city lie in information technology. People's participation in metropolitan management is a crucial topic, and information technology does not impede it; rather, it can improve the populace's participation and the interactions between citizens and city managers. Citizens can offer their ideas, beliefs and votes on topical matters through digital mass media based on the internet and computer networks, and receive appropriate replies and services. They can participate in urban projects by becoming aware of the city's plans. The most significant challenges are as follows: information and communication management, changing citizens' views, and legal and administrative documents.
The obstacles to the electronic city have been identified in this research. The required data were gathered through questionnaires to identify the barriers, from a statistical community comprising specialists and practitioners of the ministry of information technology and communication and the municipality information technology organization.
The conclusions demonstrate that the prioritized barriers to electronic city application in Iran are as follows: support problems (non-financial ones); behavioral, cultural and educational difficulties; security, legal and licensing problems; hardware, terminological and infrastructural constraints; and software and fiscal problems.
Abstract: The growing world population has fundamental, and often catastrophic, impacts on natural habitats. The unmethodical consumption of energy, the destruction of forests and the extinction of plant and animal species are consequences of this trend. Urban sustainability and sustainable urban development, so widely discussed these days, should be considered as a strategy, goal and policy that goes beyond merely considering environmental issues and protection. The desert's climate has created many problems for its residents. The very hot and dry summers of the Iranian desert areas, in times when there was no access to modern energy sources and mechanical cooling systems, led Iranian architects to design a natural ventilation system into their buildings. This structure, like a tower rising above the roof, was used as a spontaneous ventilation system, besides its ornamental role in giving the building a beautiful view. In this paper, we attempt to identify the problems of the area and their inconveniences, point out some answers to solve them, and introduce the BADGIR (wind-catcher) as an alternative solution, knowing that it has long played a major role in dealing with these problems.
Abstract: In this paper we propose a robust environmental sound classification approach based on spectrogram features derived from log-Gabor filters. This approach includes two methods. In the first method, the spectrograms are passed through an appropriate log-Gabor filter bank, and the outputs are averaged and subjected to an optimal feature selection procedure based on a mutual information criterion. The second method uses the same steps, but applies them only to three patches extracted from each spectrogram.
To investigate the accuracy of the proposed methods, we conduct experiments using a large database containing 10 environmental sound classes. The classification results based on multiclass Support Vector Machines show that the second method is the most efficient, with an average classification accuracy of 89.62%.
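The first method's front end can be sketched as follows. This is a generic radial log-Gabor bank applied to a spectrogram, with averaging over time; the filter count, bandwidth parameter and frequency range are assumptions, and the mutual-information selection step is omitted:

```python
import numpy as np

def log_gabor_bank(n_freq, n_filters, f_min=0.05, f_max=0.45, sigma=0.55):
    """Radial log-Gabor magnitude responses on n_freq normalized-frequency bins."""
    f = np.linspace(1e-6, 0.5, n_freq)
    centers = np.geomspace(f_min, f_max, n_filters)
    bank = np.stack([np.exp(-np.log(f / f0) ** 2 / (2 * np.log(sigma) ** 2))
                     for f0 in centers])
    return f, centers, bank

def spectrogram_features(spec, bank):
    """One feature per filter: the filter response averaged over time frames."""
    return (bank @ spec).mean(axis=1)

f, centers, bank = log_gabor_bank(n_freq=256, n_filters=12)
spec = np.abs(np.random.default_rng(0).standard_normal((256, 40)))  # stand-in spectrogram
feats = spectrogram_features(spec, bank)
```

Each filter summarizes the energy in one logarithmically spaced frequency band, giving a compact feature vector per spectrogram.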
Abstract: The purpose of the experiments described in this article was the comparison of an integrated fixed-film activated sludge (IFAS) system and an activated sludge (AS) system. The IFAS system used cigarette filter rods (waste filters from tobacco factories) as the biofilm carrier. The comparison with activated sludge was performed in two parallel treatment lines. Organic substance, ammonia and TP removal were investigated over a four-month period. Synthetic wastewater was prepared with ordinary tap water and glucose as the main sources of carbon and energy, plus balanced macro- and micronutrients. COD removal percentages of 94.55% and 81.62% were achieved for the IFAS and activated sludge systems, respectively. Ammonia concentration also decreased significantly with increasing HRT in both systems; average ammonia removals of 97.40% and 96.34% were achieved for the IFAS and activated sludge systems, respectively. The total phosphorus (TP) removal efficiency of IFAS was 60.64%, higher than the 56.63% achieved by the AS process.
Abstract: In this paper, we apply the p-q theory with a shunt active power filter, under unbalanced and distorted power system voltage, to compensate the perturbations generated by nonlinear loads. The power factor at the source side is also improved. A PLL system is used to extract the fundamental component of the positive sequence of the power system voltage under the conditions mentioned.
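The core of the p-q (instantaneous power) theory can be sketched numerically. The conventions and scaling below are one common choice, not necessarily the paper's: Clarke-transform the three-phase voltages and currents, then compute the instantaneous real and imaginary powers, which are constant and zero, respectively, for a balanced unity-power-factor case:

```python
import numpy as np

def clarke(a, b, c):
    """Amplitude-invariant Clarke (abc -> alpha-beta) transform."""
    alpha = (2 / 3) * (a - 0.5 * b - 0.5 * c)
    beta = (2 / 3) * (np.sqrt(3) / 2) * (b - c)
    return alpha, beta

t = np.linspace(0, 0.04, 2000, endpoint=False)   # two 50 Hz periods
w = 2 * np.pi * 50
va = np.cos(w * t)
vb = np.cos(w * t - 2 * np.pi / 3)
vc = np.cos(w * t + 2 * np.pi / 3)
ia, ib, ic = va.copy(), vb.copy(), vc.copy()     # unity-power-factor load

valpha, vbeta = clarke(va, vb, vc)
ialpha, ibeta = clarke(ia, ib, ic)
p = 1.5 * (valpha * ialpha + vbeta * ibeta)      # instantaneous real power
q = 1.5 * (vbeta * ialpha - valpha * ibeta)      # instantaneous imaginary power
```

Under unbalance or distortion, p and q acquire oscillating components, and the active filter is commanded to inject the currents that cancel them.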
Abstract: This paper presents a new fingerprint coding technique based on the contourlet transform and multistage vector quantization. Wavelets have shown their ability to represent natural images that contain smooth areas separated by edges. However, wavelets cannot efficiently exploit the fact that the edges usually found in fingerprints are smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is an extension of the wavelet transform to two dimensions using nonseparable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer: in the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used to quantize each frame of spectral information. The storage requirement of multistage vector quantization is lower than that of full-search vector quantization. The coefficients of the contourlet transform are quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with those of existing wavelet-based techniques.
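The multistage idea, in which each stage quantizes the residual left by the previous one so that two small codebooks replace one large codebook, can be sketched as follows. The hand-built grid codebooks are purely illustrative (a real coder trains them, e.g. with the LBG algorithm):

```python
import numpy as np

def vq_encode(x, codebook):
    """Index of the nearest codeword (rows of `codebook`) for each row of x."""
    d = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def msvq(x, codebooks):
    """Multistage VQ: stage i quantizes the residual of stages 0..i-1."""
    recon = np.zeros_like(x)
    indices = []
    for cb in codebooks:
        idx = vq_encode(x - recon, cb)
        indices.append(idx)
        recon = recon + cb[idx]
    return indices, recon

def grid(values):
    return np.array([[a, b] for a in values for b in values])

rng = np.random.default_rng(0)
x = rng.random((200, 2))               # stand-in for contourlet coefficient vectors
cb1 = grid([0.25, 0.75])               # coarse 4-word stage
cb2 = grid([-0.125, 0.125])            # fine 4-word residual stage
_, recon1 = msvq(x, [cb1])
_, recon2 = msvq(x, [cb1, cb2])
err1 = np.mean((x - recon1) ** 2)
err2 = np.mean((x - recon2) ** 2)
```

Two 4-word stages store 8 codewords but resolve 16 combined cells, which is the storage saving over an equivalent full-search codebook that the abstract refers to.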
Abstract: Female breast cancer is the second most frequent cancer after cervical cancer. Surgery is the most common treatment for breast cancer, followed by chemotherapy as a treatment of choice. Although effective, chemotherapy causes serious side effects. Controlled-release drug delivery is an alternative method to improve the efficacy and safety of the treatment: it can keep the drug dosage between the minimum effective concentration (MEC) and minimum toxic concentration (MTC) within tumor tissue, reducing the damage to normal tissue and the side effects. Because in vivo experiments on this system can be time-consuming and labor-intensive, a mathematical model is desirable to study the effects of important parameters before the experiments are performed. Here, we describe a 3D mathematical model to predict the release of doxorubicin from pluronic gel for the treatment of human breast cancer. This model can ultimately be used to design the in vivo experiments effectively.
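A drastically simplified, one-dimensional stand-in for such a release model (the paper's model is 3D, and the diffusivity, gel thickness and discretization below are hypothetical) is an explicit finite-difference solution of Fick's second law with a perfect-sink boundary, tracking the cumulative fraction of drug released:

```python
import numpy as np

D = 1.0e-10        # assumed drug diffusivity in the gel, m^2/s
L = 1.0e-3         # assumed gel thickness, m
n = 50             # grid points
dx = L / n
dt = 0.25 * dx * dx / D          # well under the explicit stability limit dx^2/(2D)

c = np.ones(n)                   # uniform initial drug concentration (normalized)
released = []
for step in range(2000):
    c[0] = 0.0                   # perfect sink at the release surface
    c[-1] = c[-2]                # zero-flux (sealed) back boundary
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    released.append(1.0 - c.mean())

released = np.array(released)
```

Sweeping parameters such as D in a model like this is exactly the kind of pre-experiment exploration the abstract motivates.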
Abstract: The standard investigational method for obstructive sleep apnea syndrome (OSAS) diagnosis is polysomnography (PSG), which consists of a simultaneous, usually overnight, recording of multiple electro-physiological signals related to sleep and wakefulness. This is an expensive, encumbering and not readily repeated protocol, and therefore there is a need for simpler and easily implemented screening and detection techniques. Identification of apnea/hypopnea events in the screening recordings is the key factor for the diagnosis of OSAS. The analysis of a single-lead electrocardiographic (ECG) signal for OSAS diagnosis, which may be done with portable devices at the patient's home, has been the challenge of recent years. A novel artificial neural network (ANN) based approach for feature extraction and automatic identification of respiratory events in ECG signals is presented in this paper. A nonlinear principal component analysis (NLPCA) method was considered for feature extraction and a support vector machine for classification/recognition. An alternative representation of the respiratory events by means of a Kohonen-type neural network is also discussed. Our prospective study was based on OSAS patients of the Clinical Hospital of Pneumology in Iaşi, Romania, both males and females, as well as on non-OSAS investigated human subjects. Our computational analysis includes a learning phase based on cross-signal PSG annotation.
Abstract: As more people from non-technical backgrounds become directly involved with large-scale ontology development, the focal point of ontology research has shifted from the more theoretical ontology issues to problems associated with the actual use of ontologies in real-world, large-scale collaborative applications. Recently, the National Science Foundation funded a large collaborative ontology development project for which a new formal ontology model, the Ontology Abstract Machine (OAM), was developed to satisfy some unique functional and data representation requirements. This paper introduces the OAM model and the related algorithms that enable maintenance of an ontology that supports node-based user access. The successful software implementation of the OAM model and its subsequent acceptance by a large research community prove its validity and its real-world application value.
Abstract: The aim of this study was to remove the two principal noises which disturb the surface electromyography (diaphragm) signal: the electrocardiogram (ECG) artefact and the power line interference artefact. The proposed algorithm is based on the Widrow Least Mean Square (LMS) adaptive structure. Such structures require a reference signal that is correlated with the noise contaminating the signal. The noise references are extracted as follows: for the power line interference, a reference is mathematically constructed from two cosine functions, at 50 Hz (the fundamental) and 150 Hz (the third harmonic); for the ECG artefact, a matching pursuit technique combined with an LMS structure is used for the estimation. Both removal procedures are achieved without the use of supplementary electrodes. These filtering techniques are validated on real records of the surface diaphragm electromyography signal, and the performance of the proposed methods is compared with previously published results.
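A minimal sketch of the power-line branch of this scheme (not the authors' full algorithm; the EMG stand-in, filter order and step size are assumptions) builds the cosine reference and lets a Widrow LMS filter learn the amplitude and phase of the 50 Hz and 150 Hz components so they can be subtracted:

```python
import numpy as np

def lms_cancel(primary, ref, mu=0.05, order=4):
    """Widrow adaptive noise canceller: the LMS filter shapes `ref` into an
    estimate of the interference; the error output is the cleaned signal."""
    w = np.zeros(order)
    out = np.zeros(len(primary))
    for n in range(order, len(primary)):
        x = ref[n - order:n][::-1]       # reference tap vector
        e = primary[n] - w @ x           # cleaned sample = primary - estimate
        w = w + 2 * mu * e * x           # LMS weight update
        out[n] = e
    return out

fs = 1000
t = np.arange(0, 2, 1 / fs)
emg = 0.2 * np.random.default_rng(0).standard_normal(len(t))       # EMG stand-in
hum = np.cos(2 * np.pi * 50 * t + 0.3) + 0.3 * np.cos(2 * np.pi * 150 * t + 1.1)
ref = np.cos(2 * np.pi * 50 * t) + np.cos(2 * np.pi * 150 * t)     # constructed reference
clean = lms_cancel(emg + hum, ref)
```

Because the reference is constructed rather than measured, no supplementary electrode is needed, which is the practical point of this branch of the method.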
Abstract: This paper describes a new measuring algorithm for three-dimensional (3-D) braided composite materials. The braid angle is an important parameter of braided composites, and the objective of this paper is to present an automatic system for measuring it. The algorithm is implemented in Visual C++ 6.0 on a PC. An advanced filtering algorithm for images of 3-D braided composite preforms has been developed. The procedure is completely automatic and relies on the gray-scale information content of the images and their local wavelet transform modulus maxima. Experimental results show that the proposed method is feasible. The algorithm was tested on both carbon-fiber and glass-fiber preforms.
Abstract: Chronic hepatitis B can evolve into cirrhosis and liver cancer. Interferon is the only effective treatment for carefully selected patients, but it is very expensive. Some of the selection criteria are based on liver biopsy, an invasive, costly and painful medical procedure. Therefore, developing efficient non-invasive selection systems could benefit patients and also save money. We investigated the possibility of creating intelligent systems to assist the Interferon therapeutic decision, mainly by predicting the results of the biopsy with acceptable accuracy. We used knowledge discovery on integrated medical data: imaging, clinical, and laboratory data. The resulting intelligent systems, tested on 500 patients with chronic hepatitis B and based on C5.0 decision trees and boosting, predict the results of the liver biopsy with 100% accuracy. By integrating the other patient selection criteria, they also offer non-invasive support for the correct Interferon therapeutic decision. To the best of our knowledge, these decision systems outperform all similar systems published in the literature, and offer a realistic opportunity to replace liver biopsy in this medical context.
Abstract: This paper addresses the problem of multisensor data fusion under non-Gaussian channel noise. M-estimates are known to be a robust solution, but they trade off some accuracy. In order to improve the estimation accuracy while maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed, consisting of a preliminary rejection of outliers followed by an optimal linear fusion. The numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when the local estimates are correlated.
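The two-stage idea can be sketched in a few lines. This is an illustrative variant, not the paper's exact estimator: stage one gates outliers with a median/MAD rule, and stage two fuses the survivors with inverse-variance (optimal linear, uncorrelated-case) weights:

```python
import numpy as np

def robust_fuse(estimates, variances, k=3.0):
    """Stage 1: reject local estimates deviating more than k robust sigmas
    from the median. Stage 2: inverse-variance weighted linear fusion."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    med = np.median(estimates)
    sigma = np.median(np.abs(estimates - med)) / 0.6745   # MAD -> sigma estimate
    keep = np.abs(estimates - med) <= k * max(sigma, 1e-12)
    w = 1.0 / variances[keep]
    return float(np.sum(w * estimates[keep]) / np.sum(w))

# Three consistent sensors and one gross outlier (e.g. impulsive channel noise):
fused = robust_fuse([1.02, 0.98, 1.01, 8.0], [0.01, 0.01, 0.01, 0.01])
```

After the rejection stage the linear fusion can use its full efficiency on the remaining estimates, which is how the scheme recovers the accuracy that a pure M-estimate trades away.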