Abstract: There is a worldwide need to develop sustainable management strategies to control pest infestation and the development of phosphine (PH3) resistance in the lesser grain borer (Rhyzopertha dominica). Computer simulation models can provide a relatively fast, safe and inexpensive way to weigh the merits of various management options. However, the usefulness of simulation models relies on the accurate estimation of important model parameters, such as mortality. Concentration and time of exposure are both important in determining mortality in response to a toxic agent. Recent research indicated the existence of two resistance phenotypes in R. dominica in Australia, weak and strong, and revealed that the presence of resistance alleles at two loci confers strong resistance, thus motivating the construction of a two-locus model of resistance. Experimental data sets on purified pest strains, each corresponding to a single genotype of our two-locus model, were also available, so it became possible to include the mortalities of the different genotypes explicitly in the model. In this paper we describe how we used two generalized linear models (GLMs), the probit and logistic models, to fit the available experimental data sets. We used a direct algebraic approach, the generalized inverse matrix technique, rather than traditional maximum likelihood estimation, to estimate the model parameters. The results show that both the probit and logistic models fit the data sets well, but the former fits better, with smaller least-squares (numerical) errors. Meanwhile, the generalized inverse matrix technique achieved accuracy similar to maximum likelihood estimation while being less time-consuming and computationally demanding.
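The generalized inverse approach can be sketched as follows: after a probit transform, the dose-mortality model is linear, so a Moore-Penrose pseudoinverse solves for the coefficients directly. The concentrations and the probit line used here are illustrative assumptions, not the paper's experimental data.

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()
# Hypothetical dose-mortality data for one genotype: PH3 concentrations
# (mg/L) and an assumed probit line probit(p) = a + b*log10(conc).
conc = np.array([0.01, 0.02, 0.05, 0.1, 0.2])
a_true, b_true = 4.0, 1.5
mortality = [nd.cdf(a_true + b_true * np.log10(c)) for c in conc]

# The probit transform linearizes the model, so the coefficients follow
# from a Moore-Penrose (generalized) inverse instead of iterative ML.
z = np.array([nd.inv_cdf(p) for p in mortality])
A = np.column_stack([np.ones(conc.size), np.log10(conc)])
a_hat, b_hat = np.linalg.pinv(A) @ z
```

The logistic variant replaces `inv_cdf` with the logit transform; the algebra is otherwise identical.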
Abstract: Discrimination between different classes of environmental
sounds is the goal of our work. A sound recognition system offers
concrete potential for surveillance and security applications. The
paper's first contribution to this research field is a thorough
investigation of the applicability of state-of-the-art audio features
in the domain of environmental sound recognition. Additionally, a set
of novel features obtained by combining the basic parameters is
introduced. The quality of the investigated features is evaluated
with an HMM-based classifier, to which particular attention is
devoted. Specifically, we propose a multi-style training system based
on HMMs: one recognizer is trained on a database including different
levels of background noise and is used as a universal recognizer for
every environment. To enhance the system's robustness by reducing
environmental variability, we explore different adaptation
algorithms, including Maximum Likelihood Linear Regression (MLLR),
Maximum A Posteriori (MAP) and the MAP/MLLR algorithm that combines
the two. Experimental evaluation shows that a good recognition rate
can be reached, even under severe noise degradation, when the system
is fed a suitable set of features.
Abstract: In this report we present a rule-based approach to
detect anomalous telephone calls. The method described here uses
subscriber usage CDR (call detail record) data sampled over two
observation periods: study period and test period. The study period
contains call records of customers' non-anomalous behaviour.
Customers are first grouped according to similar usage
behaviour (e.g., average number of local calls per week). For
customers in each group, we develop a probabilistic model to describe
their usage. Next, we use maximum likelihood estimation (MLE) to
estimate the parameters of the calling behaviour. Then we determine
thresholds by calculating acceptable change within a group. MLE is
used on the data in the test period to estimate the parameters of the
calling behaviour. These parameters are compared against thresholds.
Any deviation beyond the threshold is used to raise an alarm. This
method has the advantage of identifying local anomalies as compared
to techniques which identify global anomalies. The method is tested
for 90 days of study data and 10 days of test data of telecom
customers. For medium to large deviations in the data in test window,
the method is able to identify 90% of anomalous usage with less than
1% false alarm rate.
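A minimal sketch of the study/test-period rule, assuming Poisson-distributed daily call counts and a 3-sigma acceptable band (both illustrative choices; the paper derives groups and thresholds from real CDR data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily local-call counts for one customer group over the
# 90-day study period (illustrative, not real CDR data).
study_calls = rng.poisson(lam=20, size=90)

# The MLE of a Poisson rate is the sample mean; the acceptable band is
# set from the within-group spread of the study period.
lam_hat = study_calls.mean()
threshold = 3 * study_calls.std(ddof=1)

# Re-estimate the rate over the 10-day test window; a deviation beyond
# the threshold raises an alarm.
test_calls = rng.poisson(lam=40, size=10)   # simulated anomalous usage
deviation = abs(test_calls.mean() - lam_hat)
alarm = deviation > threshold
```

Because the threshold is derived per group, the rule flags deviations local to that group rather than global outliers.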
Abstract: This paper describes a novel method for automatic
estimation of the contours of weld defects in radiographic images.
Contour detection is generally the first operation applied
in a visual recognition system. Our approach can be described as a
region based maximum likelihood formulation of parametric
deformable contours. This formulation provides robustness against
the poor image quality, and allows simultaneous estimation of the
contour parameters together with other parameters of the model.
Implementation is performed by a deterministic iterative algorithm
with minimal user intervention. Results demonstrate the very good
performance of the approach, especially on synthetic weld defect
images.
Abstract: The zero-inflated strict arcsine model is a newly developed model found to be appropriate for modeling overdispersed count data. In this study, the maximum likelihood estimation method is used to estimate the parameters of the zero-inflated strict arcsine model. Bootstrapping is then employed to compute confidence intervals for the estimated parameters.
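The bootstrap step can be illustrated on a simpler MLE; the strict arcsine likelihood itself requires numerical optimisation, so a Poisson rate (whose MLE is the sample mean) stands in here, with the percentile-interval mechanics unchanged:

```python
import numpy as np

rng = np.random.default_rng(1)
counts = rng.poisson(lam=3.0, size=200)     # stand-in count data

# MLE for the stand-in Poisson rate (the zero-inflated strict arcsine
# MLE would come from a numerical optimiser at this point).
lam_hat = counts.mean()

# Percentile bootstrap: re-estimate on resamples with replacement and
# take the empirical 2.5% and 97.5% quantiles.
boot = np.array([rng.choice(counts, counts.size).mean() for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
```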
Abstract: In this paper we propose a Particle Swarm
heuristic-optimized Multi-Antenna (MA) system. Efficient MA system
detection is performed using a robust stochastic evolutionary
computation algorithm based on the movement and intelligence of
swarms. This iterative particle swarm optimization (PSO) detector
significantly reduces the computational complexity of the
conventional Maximum Likelihood (ML) detection technique. The
simulation results achieved with the proposed MA-PSO detection
algorithm show near-optimal performance when compared with the
ML-MA receiver. The proposed detector performs convincingly
better for higher-order modulation schemes and large numbers of
antennas, where the conventional ML detector becomes impractical.
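The exponential search that PSO is meant to avoid can be made concrete with a tiny exhaustive ML detector (BPSK, three antennas, synthetic channel; all values illustrative):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
nt = 3                                      # transmit antennas (kept small on purpose)
H = rng.standard_normal((nt, nt))           # assumed known flat-fading channel
x_true = rng.choice([-1.0, 1.0], size=nt)   # BPSK symbol vector
y = H @ x_true + 0.05 * rng.standard_normal(nt)

# Exhaustive ML detection: minimise ||y - Hx|| over all 2^nt candidates.
# This candidate set grows exponentially with antennas and constellation
# order, which is the cost a PSO search is meant to cut down.
candidates = list(itertools.product([-1.0, 1.0], repeat=nt))
x_ml = min(candidates, key=lambda x: np.linalg.norm(y - H @ np.array(x)))
```

For 16-QAM with 8 antennas the same search has 16^8 candidates, which is where a swarm-based heuristic replaces the enumeration.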
Abstract: This study empirically examines the long-run equilibrium relationship between South Africa's exports and imports using quarterly data from 1985 to 2012. The theoretical framework is Johansen's maximum likelihood cointegration technique, which tests for both the existence and the number of cointegrating vectors. The study finds that both series are integrated of order one and are cointegrated: a statistically significant cointegrating relationship exists between exports and imports. The study models this linear and lagged relationship using a Vector Error Correction Model (VECM). The findings confirm the existence of a long-run equilibrium relationship between exports and imports.
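The adjustment mechanism a VECM captures can be sketched on synthetic cointegrated data; this single-equation least-squares version only illustrates the error-correction term, whereas Johansen's full ML procedure is available as, e.g., `coint_johansen` in statsmodels:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 400
exports = np.cumsum(rng.standard_normal(T))        # simulated I(1) series
imports = exports + 0.3 * rng.standard_normal(T)   # cointegrated with exports

# Single-equation error-correction regression: the change in imports is
# pulled back toward equilibrium by the lagged error (imports - exports).
ect = (imports - exports)[:-1]
X = np.column_stack([np.ones(T - 1), ect, np.diff(exports)])
coefs, *_ = np.linalg.lstsq(X, np.diff(imports), rcond=None)
alpha = coefs[1]                                   # adjustment speed, near -1 here
```

A significantly negative adjustment coefficient is what signals the long-run equilibrium relationship.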
Abstract: We construct a method of noise reduction for
JPEG-compressed images based on Bayesian inference using the
maximizer of the posterior marginal (MPM) estimate. In this method,
we apply the MPM estimate using two kinds of likelihood, both for
grayscale images degraded by lossy JPEG compression. One is a
deterministic model of the likelihood and the other is a probabilistic
one expressed by the Gaussian distribution. Then, using Monte
Carlo simulation for grayscale images, such as the 256-grayscale
standard image "Lena" with 256 × 256 pixels, we examine the
performance of the MPM estimate under a performance measure
based on the mean square error. We find that the MPM estimate with
the Gaussian probabilistic model of the likelihood is effective for
reducing noise, such as blocking artifacts and mosquito noise,
if the parameters are set appropriately. On the other hand, the
MPM estimate with the deterministic model of the likelihood is not
effective for noise reduction due to the low acceptance ratio of the
Metropolis algorithm.
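The role of the acceptance ratio can be illustrated with a generic random-walk Metropolis sampler on a stand-in one-dimensional posterior (not the paper's JPEG model); a likelihood that rejects most proposals, as the deterministic model does, would drive this ratio toward zero and stall the chain:

```python
import numpy as np

rng = np.random.default_rng(4)

def log_post(x):
    return -0.5 * x ** 2        # stand-in log-posterior: standard normal

x, accepted, n = 0.0, 0, 5000
samples = []
for _ in range(n):
    prop = x + rng.normal()     # symmetric random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(x):
        x, accepted = prop, accepted + 1
    samples.append(x)
acc_ratio = accepted / n        # healthy here; near zero => poor mixing
```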
Abstract: The COM-Poisson distribution is capable of modeling count responses irrespective of their mean-variance relation, and its parameters, when fitted to simple cross-sectional data, can be efficiently estimated by the maximum likelihood (ML) method. In the regression setup, however, ML estimation of the parameters of the COM-Poisson-based generalized linear model is computationally intensive. In this paper, we propose a quasi-likelihood (QL) approach to estimate the effect of covariates on COM-Poisson counts and investigate its performance relative to the ML method. QL estimates are consistent and almost as efficient as ML estimates. Simulation studies show that the efficiency loss in estimating all the parameters with the QL approach compared to the ML approach is negligible, while the QL approach is computationally less involved.
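For a log-link count model, the QL estimating equations depend only on the mean-variance relation and can be solved by iteratively reweighted least squares; the sketch below uses Poisson-generated counts and illustrative coefficients, not the COM-Poisson setup itself:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
x = rng.uniform(-1, 1, n)
beta_true = np.array([0.5, 1.0])            # illustrative coefficients
y = rng.poisson(np.exp(beta_true[0] + beta_true[1] * x))

# Quasi-likelihood for a log-link count model: only the mean-variance
# relation enters, and the estimating equations are solved by IRLS.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu            # working response
    W = mu                                  # working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
```

Each IRLS step costs one weighted least-squares solve, which is the sense in which QL is less involved than maximising the full COM-Poisson likelihood.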
Abstract: Mendelian disease genes represent a collection of single points of failure for the various systems they constitute. Such genes have been shown, on average, to encode longer proteins than 'non-disease' genes. Existing models suggest that this results from the increased likelihood of longer genes undergoing mutations. Here, we show that in saturated mutagenesis experiments performed on model organisms, where the likelihood of each gene mutating is one, a similar relationship between length and the probability of a gene being lethal is observed. We thus suggest an extended model demonstrating that the likelihood that a mutated gene produces a severe phenotype is length-dependent. Using the occurrence of conserved domains, we bring evidence that this dependency results from a correlation between protein length and the number of functions the protein performs. We propose that protein length thus serves as a proxy for protein cardinality in the different networks required for the organism's survival and well-being. We use this example to argue that the collection of Mendelian disease genes can, and should, be used to study the rules governing systems vulnerability in living organisms.
Abstract: In this paper, we evaluate the choice of suitable
quantization characteristics for both the decoder messages and the
received samples in Low Density Parity Check (LDPC) coded
systems using M-QAM (Quadrature Amplitude Modulation)
schemes. The analysis involves the demapper block that provides
initial likelihood values for the decoder, relating its quantization
strategy to that of the decoder. A mapping strategy refers to the
grouping of bits within a codeword, where each m-bit group is used to
select a 2^m-ary signal in accordance with the signal labels. We
further evaluate the system with mapping strategies such as Consecutive-Bit
(CB) and Bit-Reliability (BR). A new demapper version, based on
approximate expressions, is also presented to yield a low complexity
hardware implementation.
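The demapper-to-decoder interface can be sketched for the simplest per-axis case (BPSK-like, AWGN): exact LLRs followed by a uniform fixed-point quantizer; the 4-bit width and step size are assumed design points, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(6)
sigma2 = 0.25                               # AWGN variance (assumed)
bits = rng.integers(0, 2, 1000)
y = (1.0 - 2.0 * bits) + np.sqrt(sigma2) * rng.standard_normal(bits.size)

# Exact per-axis demapper LLRs for the Gaussian channel, then a uniform
# 4-bit quantizer standing in for the fixed-point message representation.
llr = 2.0 * y / sigma2
step, levels = 0.5, 16
q = np.clip(np.round(llr / step), -levels // 2, levels // 2 - 1) * step

hard = (q < 0).astype(int)                  # hard decision from quantized LLR
ber = np.mean(hard != bits)
```

The design question the paper studies is how coarse `step` and `levels` can be before the quantized LLRs degrade decoding.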
Abstract: In this paper some procedures for building confidence intervals for the reliability in stress-strength models are discussed and empirically compared. The particular case of a bivariate normal setup is considered. The confidence intervals suggested are obtained employing approximations or asymptotic properties of maximum likelihood estimators. The coverage and the precision of these intervals are empirically checked through a simulation study. An application to real paired data is also provided.
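A bootstrap variant of such an interval can be sketched for R = P(X < Y); independence of stress and strength is assumed here for brevity, unlike the paper's bivariate setup, and all data are simulated:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)
stress = rng.normal(5.0, 1.0, 100)          # X: simulated stress sample
strength = rng.normal(7.0, 1.5, 100)        # Y: simulated strength sample

def reliability(x, y):
    # ML plug-in for R = P(X < Y) under a normal model; independence of
    # X and Y is assumed in this sketch.
    return NormalDist().cdf((y.mean() - x.mean()) / np.hypot(x.std(), y.std()))

r_hat = reliability(stress, strength)
boot = np.array([reliability(rng.choice(stress, 100), rng.choice(strength, 100))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])   # percentile-bootstrap interval
```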
Abstract: This paper develops an unscented grid-based filter
and a smoother for accurate nonlinear modeling and analysis
of time series. The filter uses unscented deterministic sampling
during both the time and measurement updating phases, to approximate
directly the distributions of the latent state variable. A
complementary grid smoother is also derived to enable computation
of the likelihood. This allows us to formulate an expectation-maximisation
algorithm for maximum likelihood estimation of
the state noise and the observation noise. Empirical investigations
show that the proposed unscented grid filter/smoother compares
favourably to other similar filters on nonlinear estimation tasks.
Abstract: This paper examines predictability of stock returns in
developed and emerging markets by testing for long memory in stock
returns using a wavelet approach. The wavelet-based maximum
likelihood estimator of the fractional integration parameter is
superior to the conventional Hurst exponent and the Geweke and
Porter-Hudak estimator in terms of asymptotic properties and mean
squared error. We use 4-year moving windows to estimate the
fractional integration parameter. Evidence suggests that stock
returns may not be predictable in developed countries of the
Asia-Pacific region. However, predictability of stock returns in some
developing countries in this region, such as Indonesia, Malaysia and
the Philippines, cannot be ruled out. Stock returns in the Thailand
stock market appear to be unpredictable after the political crisis in
2008.
Abstract: We investigated the statistical performance of Bayesian inference using maximum entropy and MAP estimation for several models that approximate wave-fronts in remote sensing using SAR interferometry. Using Monte Carlo simulation for a set of wave-fronts generated by an assumed true prior, we found that the method of maximum entropy realized optimal performance around the Bayes-optimal conditions by using the model of the true prior and the likelihood representing optical measurement by the interferometer. We also found that MAP estimation, regarded as a deterministic limit of maximum entropy, achieved almost the same performance as the Bayes-optimal solution for the set of wave-fronts. We then clarified that MAP estimation carried out phase unwrapping perfectly without using prior information, and that it realized accurate phase unwrapping with the conjugate gradient (CG) method, provided the model of the true prior was assumed appropriately.
Abstract: Numerical analysis naturally finds applications in all
fields of engineering and the physical sciences, but in the
21st century, the life sciences and even the arts have adopted
elements of scientific computations. Numerical data analysis has
become a key process in research and development across all these
fields [6]. In this paper we attempt to analyze the specified
numerical patterns using association rule mining techniques under
minimum-confidence and minimum-support criteria. The extracted rules
and analyzed results are graphically demonstrated. Association rules
are a simple but very useful form of data mining that describes the
probabilistic co-occurrence of certain events within a database [7].
They were originally designed to analyze market-basket data, in which
the likelihood of items being purchased together within the same
transaction is analyzed.
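Support and confidence, the two mining criteria mentioned above, reduce to simple relative frequencies over transactions, as this toy market-basket sketch shows:

```python
# Toy market-basket transactions (illustrative).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
    {"bread", "milk"},
]

def support(itemset):
    # Fraction of transactions containing every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    # P(consequent | antecedent) estimated from co-occurrence counts.
    return support(antecedent | consequent) / support(antecedent)

sup = support({"bread", "milk"})            # 3 of 5 baskets -> 0.6
conf = confidence({"bread"}, {"milk"})      # 0.6 / 0.8 -> 0.75
```

A rule is kept only if `sup` and `conf` clear the chosen minimum-support and minimum-confidence thresholds.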
Abstract: In biological and biomedical research, motif finding tools are important for locating regulatory elements in DNA sequences. Many such motif finding tools are available, and they often yield position weight matrices and significance indicators. These indicators, p-values and E-values, describe the likelihood that a motif alignment is generated by the background process, and the expected number of occurrences of the motif in the data set, respectively. The various tools often estimate these indicators differently, making them not directly comparable. One approach for comparing motifs from different tools is computing the E-value as the product of the p-value and the number of possible alignments in the data set. In this paper we explore the combinatorics of the motif alignment models OOPS, ZOOPS, and ANR, and propose a generic algorithm for computing the number of possible combinations accurately. We also show that using the wrong alignment model can give E-values that diverge significantly from their true values.
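For the simpler alignment models the counts have closed forms: with one occurrence per sequence (OOPS) each sequence of length L contributes L - w + 1 placements of a width-w motif, and ZOOPS adds one "no occurrence" option per sequence. The figures below are illustrative, and ANR (and any reverse-strand doubling) needs the paper's generic algorithm:

```python
from math import prod

def alignments_oops(lengths, w):
    # OOPS: exactly one occurrence per sequence, so a sequence of length
    # L contributes L - w + 1 possible placements of a width-w motif.
    return prod(L - w + 1 for L in lengths)

def alignments_zoops(lengths, w):
    # ZOOPS: zero or one occurrence, adding a "no occurrence" choice.
    return prod(L - w + 2 for L in lengths)

lengths, w = [100, 120, 80], 8              # illustrative sequence lengths
n_oops = alignments_oops(lengths, w)
e_value = 1e-9 * n_oops                     # E-value = p-value x alignment count
```

Using the OOPS count when the tool assumed ZOOPS (or vice versa) is exactly the model mismatch that skews the resulting E-values.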
Abstract: The accuracy of stability and control derivatives of a
light aircraft estimated from flight test data was evaluated. The
light aircraft, named ChangGong-91, is the first aircraft certified
by the Korean government. The output error method, a maximum
likelihood estimation technique that considers measurement noise
only, was used to analyze the measured aircraft responses. Multi-step
control inputs were applied to excite the short-period mode for the
longitudinal motion and the Dutch-roll mode for the
lateral-directional motion. The estimated stability and control
derivatives of ChangGong-91 were analyzed for the assessment of
handling qualities by comparing them with those of similar aircraft.
The accuracy of the flight derivative estimates derived from flight
test measurements was examined in terms of engineering judgment,
scatter and the Cramér-Rao bound, and turned out to be satisfactory
with minor defects.
Abstract: In this paper, we evaluate the performance of the
Hybrid-MIMO Receiver Scheme (HMRS) in Cognitive Radio
network (CR network). We investigate the efficiency of the proposed
scheme as the energy level and the number of primary users are
varied according to the characteristics of the CR network. HMRS
allows users to transmit either Space-Time Block Code (STBC) or
Spatial-Multiplexing (SM) streams simultaneously by using
Successive Interference Cancellation (SIC) and Maximum Likelihood
Detection (MLD). Simulation results indicate that the
interference level affects the performance of HMRS. Moreover,
the exact closed-form capacity of the proposed scheme is derived and
compared with STBC scheme.
Abstract: An automatic speech recognition system for the
formal Arabic language is needed. The Quran is the most formal
spoken book in Arabic and is recited all over the world. In this
research, a speaker-independent automatic speech recognizer for
Quranic Arabic was developed and tested. The system was developed
based on the tri-phone Hidden Markov Model and Maximum
Likelihood Linear Regression (MLLR). The MLLR computes a set
of transformations which reduces the mismatch between an initial
model set and the adaptation data. It uses a regression class tree
and estimates a set of linear transformations for the mean and
variance parameters of a Gaussian mixture HMM system. The 30th
Chapter of the Quran, with five of the most famous readers of the
Quran, was used for the training and testing of the data. The chapter
includes about 2000 distinct words. The advantages of using the
Quranic verses as the database in this developed recognizer are the
uniqueness of the words and the high level of orderliness between
verses. The recognition accuracy on the test data ranged from 68% to 85%.