Abstract: This paper presents a deep-learning method for distinguishing computer-generated images from photographic images. The proposed method introduces a convolutional layer capable of automatically learning the correlation between neighbouring pixels. In its standard form, a Convolutional Neural Network (CNN) learns features based on an image's content rather than its structural features. The proposed layer is specifically designed to suppress an image's content and robustly learn the sensor pattern noise features (usually inherited from in-camera image processing) as well as the statistical properties of images. The method was evaluated on recent natural and computer-generated images and was found to outperform current state-of-the-art methods.
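The content-suppression idea above can be illustrated with a fixed high-pass residual filter; this is only a minimal sketch of the principle (the abstract does not specify the layer's actual weights or constraints, so the Laplacian-style kernel here is an assumption):

```python
import numpy as np

def highpass_residual(img):
    """Suppress image content with a zero-sum Laplacian-style kernel,
    leaving noise-like residuals (illustrative stand-in for a
    content-suppressing convolutional layer)."""
    k = np.array([[-1, -1, -1],
                  [-1,  8, -1],
                  [-1, -1, -1]], dtype=float) / 8.0
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

# A constant (pure-content) patch yields near-zero residuals, so only
# noise-like variation survives the filter.
flat = np.full((8, 8), 100.0)
print(np.allclose(highpass_residual(flat), 0.0))  # True
```

Because the kernel sums to zero, smooth content is cancelled while pixel-level noise passes through, which is the behaviour the learned layer is designed to enforce.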
Abstract: The prevalence of Alzheimer's disease is on the rise, and the disease brings problems such as discontinuation of treatment, high treatment costs, and the lack of early detection methods. The pathology of the disease causes protein deposits, called amyloid plaques, to form in patients' brains. The disease is generally diagnosed through tests such as cerebrospinal fluid analysis, CT scans, MRI, cognitive tests, and eye-tracking tests. In this paper, we use the Medial Temporal Atrophy (MTA) method and Leave-One-Out (LOO) cross-validation to extract the statistical properties of the Fz, Pz, and Cz channels of ERP signals for early diagnosis of this disease. For CT scan images, the accuracy is 81% for healthy subjects and 88% for severely affected patients. For ERP signal processing, the accuracy for healthy subjects is 81% in the delta band of the Cz channel and 90% in the alpha band of the Pz channel; for severely affected patients, it is 89% in the delta band of the Cz channel and 92% in the alpha band of the Pz channel.
Abstract: In this paper, we extend the versatility and usefulness of GIS as a methodology for hydrologic characteristics analysis (HCA) of any river basin, using the Gurara River basin in North-Central Nigeria as a case study. This is ongoing research using a spatial Digital Elevation Model (DEM) and Arc-Hydro tools to take inventory of the basin characteristics in order to predict the effect of water abstraction on the streamflow regime. One of the main concerns of hydrological modelling is the quantification of runoff from rainstorm events. In practice, the Soil Conservation Service (SCS) curve number method and the conventional rational method are still widely used; these traditional lumped hydrological models convert the statistical properties of rainfall in a river basin into observed runoff and hydrographs. However, such models give little or no spatially distributed information on rainfall and the basin's physical characteristics. Therefore, this paper synthesizes morphometric parameters in generating runoff. The expected results on basin characteristics such as size, area, shape, watershed slope, and stream distribution network analysis could be useful in estimating streamflow discharge. Water resources managers and irrigation farmers could use the tool to determine the net return from scarce available water resources where past records of land and climate data are sparse.
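For reference, the conventional rational method mentioned above reduces to a one-line formula; the sketch below uses the standard SI form with its unit-conversion factor (the example inputs are illustrative assumptions, not values from the study):

```python
def rational_runoff(c, i_mm_per_hr, area_km2):
    """Peak discharge Q (m^3/s) by the rational method: Q = 0.278 * C * i * A,
    with runoff coefficient C (dimensionless), rainfall intensity i in mm/h,
    and catchment area A in km^2 (0.278 converts the units to m^3/s)."""
    return 0.278 * c * i_mm_per_hr * area_km2

# Assumed example: C = 0.5, i = 60 mm/h, A = 10 km^2
print(round(rational_runoff(0.5, 60.0, 10.0), 2))  # 83.4
```

The lumped nature of the formula is apparent: a single coefficient and a single intensity stand in for the whole basin, which is exactly the spatial information the GIS-based approach above aims to recover.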
Abstract: In this paper, we present a quantum statistical mechanical formulation based on our recently derived analytical expressions for the partial-wave transition matrix of a three-particle system. We report the quantum reactive cross sections for three-body scattering processes 1+(2,3)→1+(2,3) as well as recombination 1+(2,3)→1+(3,1) between an atom and a weakly bound dimer. The analytical expressions for the three-particle transition matrices and their corresponding cross sections were obtained from the three-dimensional Faddeev equations subject to rank-two non-local separable potentials of the generalized Yamaguchi form. Equilibrium quantum statistical mechanical properties, such as the partition function and the equation of state, as well as non-equilibrium properties, such as the transport cross sections and their corresponding transport collision integrals, were formulated analytically. These results allow us to obtain transport properties, such as the viscosity and diffusion coefficient, of a moderately dense gas.
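As a point of reference for the quantities named above, the standard kinetic-theory (Chapman-Enskog, first approximation) relations connecting the differential cross section, transport cross sections, collision integrals, and transport coefficients can be written as follows; the notation is the conventional one and is an assumption, since the abstract does not reproduce the formulas:

```latex
% Transport cross section from the differential cross section \sigma(g,\chi):
Q^{(l)}(g) = 2\pi \int_0^{\pi} \left(1 - \cos^{l}\chi\right)\,
             \sigma(g,\chi)\, \sin\chi \, d\chi

% Transport collision integral (reduced mass \mu, relative speed g,
% \gamma^2 = \mu g^2 / 2 k_B T):
\Omega^{(l,s)} = \sqrt{\frac{k_B T}{2\pi\mu}}
                 \int_0^{\infty} e^{-\gamma^2}\, \gamma^{2s+3}\,
                 Q^{(l)}(g)\, d\gamma

% First-approximation transport coefficients (number density n):
\eta = \frac{5\, k_B T}{8\, \Omega^{(2,2)}}, \qquad
D = \frac{3\, k_B T}{16\, n\, \mu\, \Omega^{(1,1)}}
```

In the paper's setting, the analytical Faddeev transition matrices supply $\sigma(g,\chi)$, from which the chain above yields the viscosity and diffusion coefficient.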
Abstract: In this paper, we provide a literature survey on the artificial stock market (ASM). The paper begins by exploring the complexity of the stock market and the need for ASMs. An ASM aims to investigate the link between individual behaviours (micro level) and financial market dynamics (macro level). The variety of patterns at the macro level is a function of the ASM's complexity. The financial market is a complex system in which the relationship between the micro and macro levels cannot be captured analytically. Computational approaches, such as simulation, are expected to capture this connection. Agent-based simulation is the technique most commonly used to build ASMs. The paper proceeds by discussing the components of an ASM. We consider the role of behavioural finance (BF) alongside the traditional risk-aversion assumption in the construction of agents' attributes. The influence of social networks on the development of agent interactions is also addressed. Network topologies such as small-world, distance-based, and scale-free networks may be used to model economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized; these include approaches such as Genetic Algorithms, Genetic Programming, Artificial Neural Networks, and Reinforcement Learning. The most common statistical properties of stocks (the stylized facts) used for calibration and validation of ASMs are also discussed. We then review the major related studies and categorize the approaches they use. Finally, research directions and potential research questions are discussed: ASM research may focus on the macro level by analyzing market dynamics, or on the micro level by investigating the wealth distribution of the agents.
Abstract: This paper presents two techniques for improving the performance of a face recognition system: local feature extraction using the image spectrum, and low-frequency spectrum modelling using a GMM to capture the underlying statistical information. Local spectral features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by discarding high-frequency coefficients with a rectangular mask applied to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by combining several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. Recognition is performed using the maximum likelihood value computed from pre-trained GMM components. The method is tested on the FERET datasets and achieves a 92% recognition rate.
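The feature extraction step described above can be sketched with NumPy; this is a minimal illustration only, and the block size and mask size are assumed parameters (the abstract does not state them):

```python
import numpy as np

def low_freq_features(block, keep=4):
    """Transform a face sub-block to the frequency domain with the 2-D DFT
    and keep only a small central rectangle of low-frequency coefficients
    (illustrative sketch; the mask size `keep` is an assumed parameter)."""
    spec = np.fft.fft2(block)
    spec = np.fft.fftshift(spec)          # move low frequencies to the centre
    cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
    half = keep // 2
    mask = spec[cy - half:cy + half, cx - half:cx + half]
    return np.abs(mask).ravel()           # magnitude features for GMM modelling

feats = low_freq_features(np.random.rand(16, 16), keep=4)
print(feats.shape)  # (16,)
```

Each overlapping block would yield one such low-dimensional feature vector, and the collection of vectors is what the GMM models as a mixture of Gaussians.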
Abstract: We have investigated statistical properties of defect turbulence in the one-dimensional complex Ginzburg-Landau equation (CGLE), in which many-body interactions arise between local depressing waves (LDWs) and local standing waves (LSWs). It is shown that the fluctuation of the counting number of LDWs obeys sub-Poisson statistics (SUBP). The physical origin of the SUBP can be ascribed to pairwise extinction of LDWs, based on a master-equation approach. It is also shown that the probability density function (pdf) of the inter-LDW distance can be identified with the hyper-gamma distribution; assuming a superstatistics of the exponential distribution (Poisson configuration), a plausible explanation is given. It is further shown that the pdf of the LDW amplitude has a fat tail. The underlying mechanism of its fluctuation is examined by introducing a generalized fractional Poisson configuration.
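For reference, sub-Poisson statistics for a count $N$ is conventionally characterized by a Fano factor below unity:

```latex
F = \frac{\langle N^2 \rangle - \langle N \rangle^2}{\langle N \rangle} < 1
\qquad \text{(Poisson: } F = 1;\ \text{sub-Poisson: } F < 1\text{)}
```

Pairwise extinction removes LDWs two at a time, which suppresses the count variance relative to a Poisson process and drives $F$ below one.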
Abstract: An automated wood recognition system is designed to classify tropical wood species. The wood features are extracted by two feature extractors: the Basic Grey Level Aura Matrix (BGLAM) technique and the Statistical Properties of Pores Distribution (SPPD) technique. Due to the nonlinearity of the tropical wood species' separation boundaries, a pre-classification stage is proposed, consisting of K-means clustering and Kernel Discriminant Analysis (KDA). Finally, a Linear Discriminant Analysis (LDA) classifier and a K-Nearest Neighbour (KNN) classifier are implemented for comparison purposes. The study compares the system with and without pre-classification using the KNN and LDA classifiers. The results show that the inclusion of the pre-classification stage improves the accuracy of both classifiers by more than 12%.
Abstract: Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban stormwater management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record at Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models were: (i) the Rectangular Pulse Poisson Model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse Model (BLRPM), (iii) the Bartlett-Lewis model with 2 cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse Model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 obtained from the Changi meteorological station. The results indicated that a weighting scheme inversely proportional to the variance delivers more accurate outputs for fitting rainfall patterns in tropical areas, and that BLRPM performed relatively better than the other disaggregation models.
Abstract: In this paper we propose a method for modelling the correlation between the signals received by two or more antennas operating in a multipath environment. Considering the maximum excess delay of the channel being modelled, an elliptical region surrounding both the transmitter and receiver antennas is produced. A number of scatterers are randomly distributed in this region and scatter the incoming waves. The amplitude and phase of the incoming waves are computed and used to obtain the statistical properties of the received signals. This model has the distinct advantage of being applicable to any configuration of antennas. Furthermore, the joint PDF (probability density function) of the received wave amplitudes for any pair of antennas can be calculated and used to produce statistical parameters of the received signals.
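The geometry described above can be sketched numerically; every parameter below (distances, delay, carrier frequency, scatterer count) is an illustrative assumption, not a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-bounce geometry: Tx at the origin, Rx at (d, 0). A scatterer
# contributes only if its total path Tx -> scatterer -> Rx is within the
# maximum excess delay, i.e. it lies inside an ellipse with Tx and Rx
# as the foci.
d, c, tau_max = 100.0, 3e8, 0.2e-6       # metres, m/s, seconds (assumed)
path_max = d + c * tau_max               # ellipse condition: path <= path_max
tx, rx = np.array([0.0, 0.0]), np.array([d, 0.0])

pts = rng.uniform([-50.0, -60.0], [150.0, 60.0], size=(4000, 2))
plen = np.linalg.norm(pts - tx, axis=1) + np.linalg.norm(pts - rx, axis=1)
inside = plen <= path_max                # scatterers inside the ellipse

# Sum the scattered waves at the receiver: unit amplitudes, phases set by
# the path lengths at an assumed 2.4 GHz carrier.
lam = c / 2.4e9
field = np.sum(np.exp(-2j * np.pi * plen[inside] / lam))
amplitude = np.abs(field)
print(inside.sum() > 0, np.isfinite(amplitude))  # True True
```

Repeating this summation at each antenna position over many scatterer realizations yields the empirical joint amplitude statistics the paper derives the PDF for.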
Abstract: This paper analyses visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband, so the influence of the atmosphere on the optical beam is similar to its influence on visible light. Long-term visibility records represent an invaluable source of data for estimating the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.
Abstract: A novel file splitting technique for reducing the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same extension orders. These statistical properties of the resulting subfiles can be exploited to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than character-wise basis.
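A toy version of the splitting step can be written in a few lines; the fixed-width codeword assignment below is an assumption for illustration (the paper's actual assignment method is not given in the abstract):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """First-order (per-symbol) entropy of a sequence, in bits."""
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in Counter(seq).values())

def split_bits(text, width=3):
    """Map each symbol to a fixed-width binary codeword, then place bit i
    of every codeword into subfile i (a toy version of the proposed
    splitting; the codeword assignment here is an assumed stand-in)."""
    alphabet = sorted(set(text))
    code = {ch: format(k, f'0{width}b') for k, ch in enumerate(alphabet)}
    return [''.join(code[ch][i] for ch in text) for i in range(width)]

text = "mississippi river"          # 8 distinct symbols -> width 3 suffices
subs = split_bits(text)
print([round(entropy(s), 3) for s in subs], round(entropy(text), 3))
```

Each binary subfile has the same length as the text and a per-symbol entropy of at most one bit, and the paper's observation is that, unlike with ASCII, these subfiles inherit exploitable structure from the original text.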
Abstract: RC4 was used as the encryption algorithm in the WEP (Wired Equivalent Privacy) protocol, the standard for 802.11 wireless networks. Several attacks followed, indicating weaknesses in the design. In this paper, we propose a new variant of the RC4 stream cipher. The new version of the cipher not only appears to be more secure, but its keystream also has a large period, high complexity, and good statistical properties.
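For context, the baseline RC4 cipher being modified is the standard KSA + PRGA construction below; note this is the well-known original algorithm only, since the abstract does not describe the proposed variant:

```python
def rc4_keystream(key, n):
    """Standard RC4: key-scheduling algorithm (KSA) followed by the
    pseudo-random generation algorithm (PRGA). Shown as a reference
    baseline; the paper's variant is not specified in the abstract."""
    s = list(range(256))
    j = 0
    for i in range(256):                      # KSA: key-dependent permutation
        j = (j + s[i] + key[i % len(key)]) % 256
        s[i], s[j] = s[j], s[i]
    i = j = 0
    out = bytearray()
    for _ in range(n):                        # PRGA: emit keystream bytes
        i = (i + 1) % 256
        j = (j + s[i]) % 256
        s[i], s[j] = s[j], s[i]
        out.append(s[(s[i] + s[j]) % 256])
    return bytes(out)

print(rc4_keystream(b"Key", 4).hex())  # eb9f7781 (known RC4 test vector)
```

Weaknesses such as keystream biases and key-schedule correlations in this construction are what motivated both the WEP attacks and variants of the kind the paper proposes.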
Abstract: In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions, as well as the exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components equals the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
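The structural difference behind the mixture-density result can be seen by counting the products contributing to each output sample; this is a numerical illustration of the setup only, not the paper's derivation:

```python
import numpy as np

x = np.random.randn(8)   # longer signal
h = np.random.randn(3)   # shorter signal (filter)

y_lin = np.convolve(x, h)                                 # linear convolution
y_cir = np.real(np.fft.ifft(np.fft.fft(x) *
                            np.fft.fft(h, len(x))))       # circular convolution

# Linear convolution: output sample k sums between 1 and len(h) products
# (fewer at the edges), which is why the derived density is a mixture
# with len(h) components. Circular convolution always sums len(h)
# products, giving a single density.
terms = [min(k + 1, len(h), len(x) + len(h) - 1 - k) for k in range(len(y_lin))]
print(len(y_lin), max(terms), len(y_cir))  # 10 3 8
```

The varying number of summed products at the boundaries of the linear convolution is exactly what produces the distinct density components, one per possible term count up to the shortest signal's length.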
Abstract: The problem of ranking (rank regression) has become popular in the machine learning community. This theory concerns problems in which one has to predict the order between objects on the basis of vectors describing their observed features. In many ranking algorithms, a convex loss function is used instead of the 0-1 loss, which makes these procedures computationally efficient. Hence, convex risk minimizers and their statistical properties are investigated in this paper. Fast rates of convergence are obtained under conditions similar to those known from classification theory. The methods used in this paper come from the theory of U-processes as well as empirical processes.
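The convex-surrogate idea can be made concrete with a toy pairwise ranker; the data, loss choice (logistic), and learning rate below are all illustrative assumptions:

```python
import numpy as np

# Toy rank regression: learn w so that score(x) = w @ x reproduces a target
# ordering. The convex surrogate log(1 + exp(-w.(xi - xj))) replaces the
# 0-1 loss [score(xi) <= score(xj)] on pairs where xi should rank higher.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
true_w = np.array([2.0, -1.0, 0.5])       # assumed ground-truth scorer
target = X @ true_w
pairs = [(i, j) for i in range(40) for j in range(40) if target[i] > target[j]]

w = np.zeros(3)
for _ in range(200):                       # gradient descent on the convex risk
    g = np.zeros(3)
    for i, j in pairs:
        d = X[i] - X[j]
        g += -d / (1.0 + np.exp(w @ d))    # d/dw of log(1 + exp(-w @ d))
    w -= 0.01 * g / len(pairs)

learned = X @ w
agree = np.mean([learned[i] > learned[j] for i, j in pairs])
print(round(float(agree), 2))
```

Because the surrogate risk is convex in `w`, plain gradient descent finds its minimizer, and it is the statistical behaviour of exactly such minimizers that the paper analyses.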
Abstract: Breast cancer detection techniques have been reported to aid radiologists in analyzing mammograms. We note that most techniques operate on uncompressed digital mammograms, yet mammogram images are large, necessitating compression to reduce storage and transmission requirements. In this paper, we present an algorithm for the detection of microcalcifications in the JPEG2000 domain. The algorithm is based on the statistical properties of the wavelet transform that the JPEG2000 coder employs. Simulations were carried out at different compression ratios. The sensitivity of the algorithm ranges from 92% with a false positive rate of 4.7 under lossless compression down to 66% with a false positive rate of 2.1 under lossy compression at a ratio of 100:1.
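The principle of detecting microcalcifications from wavelet-domain statistics can be sketched with a simple Haar transform; note JPEG2000 actually uses the 5/3 or 9/7 wavelets, so the Haar filter and the 3-sigma rule here are stand-in assumptions:

```python
import numpy as np

def haar_level1(img):
    """One-level 2-D Haar transform (a simple stand-in for the JPEG2000
    wavelet); returns the approximation LL and detail subbands LH, HL, HH."""
    a = (img[0::2] + img[1::2]) / 2.0      # vertical average
    d = (img[0::2] - img[1::2]) / 2.0      # vertical difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

# Bright isolated spots (microcalcification-like) appear as statistical
# outliers in the detail subbands; flag coefficients beyond 3 standard
# deviations of the subband statistics.
img = np.ones((16, 16)); img[5, 5] = 30.0
_, lh, hl, hh = haar_level1(img)
det = np.concatenate([lh.ravel(), hl.ravel(), hh.ravel()])
flags = np.abs(det - det.mean()) > 3 * det.std()
print(flags.any())  # True
```

Working directly on such subband coefficients is what lets the algorithm operate inside the JPEG2000 codestream without full decompression.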
Abstract: This paper attempts to establish that multi-state network classification is essential for the performance enhancement of transport protocols over satellite-based networks. A model is developed to classify multi-state network conditions, taking into consideration both congestion and channel error. To arrive at such a model, an analysis of the impact of congestion and channel error on RTT values was carried out using ns2; the analysis results are also reported in the paper. The inference drawn from this analysis is used to develop a novel statistical RTT-based model for multi-state network classification.
An Adaptive Multi State Proactive Transport Protocol, consisting of Proactive Slow Start, State-based Error Recovery, Timeout Action, and Proactive Reduction, is proposed, which uses the multi-state network classification model. This paper also confirms, through detailed simulation and analysis, that prior knowledge of the overall characteristics of the network helps enhance the performance of the protocol over a satellite channel significantly affected by channel noise and congestion.
The ns2 simulator was augmented as necessary to simulate the multi-state network classification logic. This simulation was used in a detailed evaluation of the protocol under varied levels of congestion and channel noise. The performance enhancement of this protocol relative to the established protocols TCP SACK and Vegas is discussed. The results discussed in this paper clearly show that the proposed protocol consistently outperforms its peers and achieves a significant improvement under the very high error conditions envisaged in the design of the protocol.
Abstract: The Improved Generalized Diversity Index (IGDI) has been proposed as a tool for identifying areas of high conservation value and for measuring the ecological condition of an area. The IGDI is based on species' relative abundances. This paper is concerned with the properties and performance of various species indices, with particular attention given to comparisons involving the MacArthur model of species abundances. Both the IGDI and species richness increased with sampling area according to a power function. The IGDI was also found to be an acceptable ecological indicator of condition and consistently outperformed coefficient-of-conservatism indices.
Abstract: Through developments in the economic sector and the financial crises occurring around the world, volatility measurement has become one of the most important concepts in financial time series analysis. In this paper we therefore discuss the volatility of the Amman stock market (Jordan) over a certain period of time. Since the wavelet transform is one of the best-known filtering methods and has grown rapidly over the last decade, we compare it with the traditional technique, the Fast Fourier Transform, to decide which method is better for analyzing volatility. The comparison is made on several statistical properties using Matlab.
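The two views being compared above can be sketched side by side (in Python rather than the paper's Matlab); the simulated price series, window length, and volatility proxy are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(256)))  # toy series
returns = np.diff(np.log(prices))

# Time-domain view: rolling standard deviation of log returns as a simple
# volatility proxy.
win = 20
vol = np.array([returns[i:i + win].std()
                for i in range(len(returns) - win + 1)])

# Frequency-domain view: the FFT power spectrum of the returns, the kind of
# quantity a Fourier-versus-wavelet comparison of volatility would examine
# band by band (a wavelet transform would add time localization per band).
power = np.abs(np.fft.rfft(returns)) ** 2
print(vol.shape, power.shape)  # (236,) (128,)
```

The trade-off the paper evaluates is visible here: the FFT view gives frequency content with no time localization, whereas wavelet filtering would decompose the same returns into bands while preserving where in time volatility clusters occur.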