Abstract: Wind is among the potential energy resources that
can be harnessed to generate electrical power. Due to the
variability of wind speed with time and height, it is difficult to
predict the generated wind energy accurately. In this paper, an
attempt is made to establish a probabilistic model fitting the wind
speed data recorded at the Makambako site in Tanzania. Wind speed
and direction were measured using an anemometer (type AN1) and a
wind vane (type WD1), respectively, both supplied by Delta-T
Devices, at a measurement height of 2 m. Wind speeds were then
extrapolated to a height of 10 m using the power law equation with
an exponent of 0.47. Data were analysed using the MINITAB
statistical software to show the variability of wind speeds with
time and height, and to determine the underlying probability model
of the extrapolated wind speed data. The results show that wind
speeds at the Makambako site vary cyclically over time and conform
to the Weibull probability distribution. From these results, the
Weibull probability density function can be used to predict the
wind energy.
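As a minimal sketch of the procedure described above (power-law extrapolation followed by Weibull fitting), the following snippet uses a synthetic wind-speed sample, since the Makambako measurements themselves are not reproduced here; the exponent 0.47 and the 2 m and 10 m heights are those stated in the abstract, while all other values are illustrative:

```python
import numpy as np
from scipy import stats

# Hypothetical 2 m wind-speed sample (m/s); the actual Makambako data are not reproduced here.
rng = np.random.default_rng(0)
v2m = stats.weibull_min.rvs(c=2.0, scale=4.0, size=1000, random_state=rng)

# Power-law extrapolation from 2 m to 10 m with the exponent 0.47 used in the paper.
alpha = 0.47
v10m = v2m * (10.0 / 2.0) ** alpha

# Fit a two-parameter Weibull distribution (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(v10m, floc=0)
print(f"shape k = {shape:.2f}, scale c = {scale:.2f} m/s")
```

The fitted shape and scale parameters then fully determine the Weibull probability density function used for energy prediction.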
Abstract: In this paper, a Dynamic Economic Dispatch (DED) model is developed for a system consisting of both thermal generators and wind turbines. The inclusion of a significant amount of wind energy into power systems has resulted in additional constraints on DED to accommodate the intermittent nature of the output. The probability of stochastic wind power, based on the Weibull probability density function, is included in the model as a constraint (a here-and-now approach). The Environmental Protection Agency's hourly emission target, which gives the maximum emission during the day, is used as a constraint to reduce atmospheric pollution. A 69-bus test system with a non-smooth cost function is used to illustrate the effectiveness of the proposed model compared with a static economic dispatch model that includes wind power.
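The Weibull-based probability of stochastic wind power can be illustrated directly from the Weibull cumulative distribution function; the shape and scale parameters and the turbine cut-in, rated, and cut-out speeds below are hypothetical, not values from the paper's test system:

```python
import math

def weibull_cdf(v, k, c):
    """CDF of a Weibull-distributed wind speed with shape k and scale c."""
    return 1.0 - math.exp(-((v / c) ** k))

# Hypothetical parameters (not the paper's test-system values).
k, c = 2.0, 8.0                          # Weibull shape and scale (m/s)
v_in, v_rated, v_out = 3.0, 12.0, 25.0   # cut-in, rated, cut-out speeds

# Probability the turbine produces any power (here-and-now style chance value).
p_generating = weibull_cdf(v_out, k, c) - weibull_cdf(v_in, k, c)
# Probability of full rated output.
p_rated = weibull_cdf(v_out, k, c) - weibull_cdf(v_rated, k, c)
print(p_generating, p_rated)
```

Such probabilities are what a chance constraint on wind power injects into the dispatch model.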
Abstract: Since dealing with high dimensional data is
computationally complex and sometimes even intractable, several
feature reduction methods have recently been developed to reduce
the dimensionality of the data and thereby simplify the analysis in
applications such as text categorization, signal processing, image
retrieval, and gene expression analysis. Among feature reduction
techniques, feature selection is one of the most popular methods
because it preserves the original features.
In this paper, we propose a new unsupervised feature selection
method that removes redundant features from the original feature
space by using the probability density functions of the various
features. To show the effectiveness of the proposed method, popular
feature selection methods have been implemented and compared.
Experimental results on several datasets from the UCI repository
illustrate the effectiveness of our proposed method in comparison
with the other methods in terms of both classification accuracy and
the number of selected features.
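One way such PDF-based redundancy removal could work is to compare histogram-estimated feature densities with a divergence measure and drop near-duplicate features; the greedy sketch below (with an assumed Jensen-Shannon distance threshold) only illustrates the idea and is not the paper's exact algorithm:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def select_nonredundant(X, threshold=0.1, bins=20):
    """Greedy sketch: keep a feature only if its histogram-estimated PDF
    differs (in Jensen-Shannon distance) from every feature kept so far.
    Illustration of PDF-based redundancy removal, not the paper's method."""
    # Normalise each feature to [0, 1] so histograms share a common support.
    X = np.asarray(X, dtype=float)
    mins, maxs = X.min(axis=0), X.max(axis=0)
    Xn = (X - mins) / np.where(maxs > mins, maxs - mins, 1.0)
    pdfs = [np.histogram(Xn[:, j], bins=bins, range=(0, 1), density=True)[0]
            for j in range(X.shape[1])]
    kept = []
    for j, pj in enumerate(pdfs):
        if all(jensenshannon(pj, pdfs[k]) > threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(1)
f0 = rng.normal(0, 1, 500)
# Feature 1 is an affine copy of feature 0, so its normalised PDF is identical.
X = np.column_stack([f0, 2 * f0 + 3, rng.uniform(0, 1, 500)])
kept = select_nonredundant(X)
print(kept)
```

The affine copy is discarded while the genuinely different uniform feature survives, which is the behaviour an unsupervised, PDF-driven selector aims for.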
Abstract: This paper reports the feasibility of the ARMA model
for describing a bursty video source transmitting over an AAL5
ATM link (VBR traffic). The traffic represents the activity of the
action movie "Lethal Weapon 3" transmitted over the ATM network
using the Fore Systems AVA-200 ATM video codec with a peak rate
of 100 Mbps and a frame rate of 25 frames per second. The model
parameters were estimated for a single video source and for
independently multiplexed video sources. It was found that an
ARMA(2, 4) model is well suited to the real data in terms of the
average-rate traffic profile, probability density function,
autocorrelation function, burstiness measure, and the pole-zero
distribution of the filter model.
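The ARMA(2, 4) structure identified in the paper can be sketched by simulating such a process and inspecting its autocorrelation function; the coefficients below are illustrative and are not the parameters estimated from the video trace:

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, seed=0):
    """Simulate an ARMA(p, q) process:
    x_t = sum_i phi_i * x_{t-i} + e_t + sum_j theta_j * e_{t-j}."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    warm = max(p, q)
    e = rng.normal(0.0, sigma, n + warm)
    x = np.zeros(n + warm)
    for t in range(warm, len(x)):
        x[t] = (sum(phi[i] * x[t - 1 - i] for i in range(p))
                + e[t] + sum(theta[j] * e[t - 1 - j] for j in range(q)))
    return x[warm:]

def acf(x, lags):
    """Empirical autocorrelation at lags 1..lags."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return [np.dot(x[:-k], x[k:]) / len(x) / c0 for k in range(1, lags + 1)]

# Illustrative (not the paper's estimated) ARMA(2, 4) coefficients.
x = simulate_arma(phi=[0.5, 0.2], theta=[0.3, 0.1, 0.05, 0.02], n=5000)
ac = acf(x, 5)
print(ac)
```

In the paper's setting, the same comparison would be made between the empirical autocorrelation of the bit-rate trace and that of the fitted model.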
Abstract: Relay-based communication has gained considerable importance in recent years. In this paper we derive the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.
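Such a verification simulation can be sketched by drawing Nakagami-m envelopes per hop, forming a cascaded two-hop gain per branch, and applying selection and maximal ratio combining; this is a simplified product-channel sketch, not the paper's exact amplify-and-forward end-to-end SNR model:

```python
import numpy as np

rng = np.random.default_rng(2)

def nakagami(m, omega, size):
    """Nakagami-m envelope samples: R = sqrt(G), G ~ Gamma(m, omega/m)."""
    return np.sqrt(rng.gamma(shape=m, scale=omega / m, size=size))

n = 200_000
m, omega = 2.0, 1.0
# Non-regenerative sketch: the end-to-end gain of each two-hop branch is
# the product of the two hop envelopes; two independent branches feed the combiner.
branch1 = nakagami(m, omega, n) * nakagami(m, omega, n)
branch2 = nakagami(m, omega, n) * nakagami(m, omega, n)

sc = np.maximum(branch1, branch2)        # selection combining (strongest branch)
mrc = np.sqrt(branch1**2 + branch2**2)   # MRC envelope (coherent sum of branch powers)

# Empirical PDF of the selection-combiner output via a normalised histogram.
pdf, edges = np.histogram(sc, bins=100, density=True)
print(branch1.mean(), sc.mean(), mrc.mean())
```

Histograms like `pdf` are what the paper's closed-form density expressions would be compared against.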
Abstract: Carbon disulfide is widely used for the production of
viscose rayon, rubber, and other organic materials, and it is a
feedstock for the synthesis of sulfuric acid. The objective of this
paper is to analyze possibilities for efficient production of CS2
from sour natural gas reformation (H2SMR) (2H2S + CH4 = CS2 + 4H2).
The effect of the H2S to CH4 feed ratio and the reaction
temperature on carbon disulfide production is also investigated
numerically in a reforming reactor. The chemical reaction model is
based on an assumed Probability Density Function (PDF)
parameterized by the mean and variance of the mixture fraction and
a β-PDF shape. The results show that the major factor influencing
CS2 production is the reactor temperature. The yield of carbon
disulfide increases with increasing H2S to CH4 feed gas ratio
(H2S/CH4 ≤ 4). The yield of C(s) also increases with increasing
temperature until the temperature reaches 1000 K; then, due to the
increase of CS2 production and the consumption of C(s), the yield
of C(s) drops with further increase in temperature. The predicted
CH4 and H2S conversion and the yield of carbon disulfide are in
good agreement with the results of Huang and T-Raissi.
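The assumed β-PDF of the mixture fraction is fully determined by its mean and variance; the standard moment-matching recovery of the beta shape parameters can be sketched as follows (the mixture-fraction statistics used here are illustrative, not values from the paper):

```python
def beta_pdf_params(mean, var):
    """Shape parameters (a, b) of an assumed beta-PDF of mixture fraction,
    recovered from its mean and variance (requires var < mean * (1 - mean))."""
    factor = mean * (1.0 - mean) / var - 1.0
    return mean * factor, (1.0 - mean) * factor

# Illustrative mixture-fraction statistics (not the paper's values).
a, b = beta_pdf_params(mean=0.3, var=0.02)
print(a, b)  # the recovered Beta(a, b) reproduces the given mean and variance
```

The resulting Beta(a, b) density is what the reaction model integrates over to obtain mean reaction rates.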
Abstract: The probabilistic characteristics of the seismic
responses of partially restrained connection rotation (PRCR) and
panel zone deformation (PZD) in older steel moment frames were
investigated using statistical inference for the decision-making
process. Four-, six- and eight-story older steel moment frames
with clip angle and T-stub connections were designed and analyzed
using 2%/50 yrs ground motions in four cities of the Mid-America
earthquake region. The probability density function and cumulative
distribution function of PRCR and PZD were determined by
goodness-of-fit tests based on probabilistic parameters measured
from the results of the nonlinear time-history analyses. The
obtained probabilistic parameters and distributions can be used to
determine which performance level the PR connections and panel
zones mainly satisfy and how many PR connections and panel zones
experience serious damage under the Mid-America ground motions.
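The goodness-of-fit step can be sketched by fitting candidate distributions to a response sample and ranking them by the Kolmogorov-Smirnov statistic; the lognormal sample below merely stands in for the PRCR/PZD results of the nonlinear time-history analyses, which are not reproduced here:

```python
import numpy as np
from scipy import stats

# Hypothetical peak-response sample standing in for the PRCR/PZD data.
rng = np.random.default_rng(3)
sample = rng.lognormal(mean=-5.0, sigma=0.4, size=200)

# Fit candidate distributions and rank them by the K-S statistic, as in a
# goodness-of-fit based selection of the underlying PDF/CDF.
candidates = {"lognorm": stats.lognorm,
              "norm": stats.norm,
              "weibull_min": stats.weibull_min}
results = {}
for name, dist in candidates.items():
    params = dist.fit(sample)
    results[name] = stats.kstest(sample, dist.cdf, args=params).statistic
best = min(results, key=results.get)
print(best, results)
```

The distribution with the smallest statistic is adopted as the underlying probability model of the response quantity.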
Abstract: In high powered dense wavelength division
multiplexed (WDM) systems with low chromatic dispersion,
four-wave mixing (FWM) can prove to be a major source of noise.
The Multicanonical Monte Carlo Method (MCMC) and the Split
Step Fourier Method (SSFM) are combined to accurately evaluate the
probability density function of the decision variable of a receiver
limited by FWM. The combination of the two methods leads to more
accurate results, and offers the possibility of adding other optical
noises such as the Amplified Spontaneous Emission (ASE) noise.
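A minimal version of the Split Step Fourier Method for the scalar nonlinear Schrödinger equation can be sketched as follows; the pulse and fibre parameters are illustrative, and the multicanonical sampling layer as well as noise sources are omitted:

```python
import numpy as np

def split_step_nlse(a0, dz, nz, beta2, gamma, dt):
    """Minimal split-step Fourier propagation of the scalar NLSE
    dA/dz = -i*(beta2/2)*d2A/dt2 + i*gamma*|A|^2*A (lossless sketch)."""
    n = len(a0)
    w = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    lin = np.exp(0.5j * beta2 * w**2 * dz)  # dispersion operator per step
    a = a0.astype(complex)
    for _ in range(nz):
        a = np.fft.ifft(lin * np.fft.fft(a))             # linear (dispersion) step
        a = a * np.exp(1j * gamma * np.abs(a)**2 * dz)   # nonlinear (Kerr) step
    return a

# Gaussian input pulse; illustrative parameters, not those of the WDM system.
t = np.linspace(-20.0, 20.0, 1024)
a0 = np.exp(-t**2 / 2.0)
a = split_step_nlse(a0, dz=0.01, nz=100, beta2=-1.0, gamma=1.0, dt=t[1] - t[0])
print(np.sum(np.abs(a)**2), np.sum(np.abs(a0)**2))  # lossless: energy preserved
```

In the combined scheme, each multicanonical sample would be propagated with such an SSFM step before the receiver decision variable is formed.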
Abstract: In this paper, a new probability density function (pdf)
is proposed to model the statistics of wavelet coefficients, and a
simple Kalman filter is derived from the new pdf using Bayesian
estimation theory. Specifically, we decompose the speckled image
into wavelet subbands, apply the Kalman filter to the high
subbands, and reconstruct a despeckled image from the modified
detail coefficients. Experimental results demonstrate that our method
compares favorably to several other despeckling methods on test
synthetic aperture radar (SAR) images.
Abstract: This work presents a fusion of Log Gabor Wavelet
(LGW) and Maximum a Posteriori (MAP) estimator as a speech
enhancement tool for acoustical background noise reduction. The
probability density function (pdf) of the speech spectral amplitude is
approximated by a Generalized Laplacian Distribution (GLD).
Compared to earlier estimators, the proposed method estimates the
underlying statistical model more accurately by appropriately
choosing the model parameters of GLD. Experimental results show
that the proposed estimator yields a higher improvement in
Segmental Signal-to-Noise Ratio (S-SNR) and lower Log-Spectral
Distortion (LSD) in two different noisy environments compared to
other estimators.
Abstract: In view of their importance and usefulness in reliability theory and probability distributions, several generalizations of the inverse Gaussian distribution and the Krätzel function have been investigated in recent years. This has motivated the authors to introduce and study a new generalization of the inverse Gaussian distribution and the Krätzel function associated with a product of a Bessel function of the third kind K_Q(z) and a Fox-Wright generalized hypergeometric function introduced in this paper. The introduced function turns out to be a unified gamma-type function. Its incomplete forms are also discussed. Several properties of this gamma-type function are obtained. By means of this generalized function, we introduce a generalization of the inverse Gaussian distribution, which is useful in reliability analysis, diffusion processes, and radio techniques, etc. The inverse Gaussian distribution thus introduced also provides a generalization of the Krätzel function. Some basic statistical functions associated with this probability density function, such as the moments, the Mellin transform, the moment generating function, the hazard rate function, and the mean residual life function, are also obtained. Keywords: Fox-Wright function, inverse Gaussian distribution, Krätzel function, Bessel function of the third kind.
Abstract: In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions constituted by sets of pixels that parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account the a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame, and we also show the importance of the choice of the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
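The Mahalanobis-distance matching criterion can be illustrated with a greedy pairing that enforces the exclusion principle (each region matched at most once); the paper's SVD-based correspondence computation is not reproduced here, and the region descriptors below are toy values:

```python
import numpy as np

def mahalanobis(x, y, cov_inv):
    """Mahalanobis distance between two descriptor vectors."""
    d = x - y
    return float(np.sqrt(d @ cov_inv @ d))

def match_regions(desc_prev, desc_next, cov):
    """Greedy sketch of frame-to-frame region matching: each descriptor in
    the previous frame is paired with the nearest unused (Mahalanobis)
    descriptor in the next frame. Illustrates the distance criterion and the
    exclusion principle only; the paper uses an SVD-based correspondence."""
    cov_inv = np.linalg.inv(cov)
    used, pairs = set(), []
    for i, dp in enumerate(desc_prev):
        dists = [(mahalanobis(dp, dn, cov_inv), j)
                 for j, dn in enumerate(desc_next) if j not in used]
        d, j = min(dists)
        used.add(j)
        pairs.append((i, j, d))
    return pairs

# Toy position descriptors for three regions in two consecutive frames.
prev = np.array([[0.0, 0.0], [5.0, 5.0], [9.0, 1.0]])
nxt = np.array([[5.2, 4.9], [0.1, 0.2], [8.8, 1.1]])
cov = np.cov(np.vstack([prev, nxt]).T)
pairs = match_regions(prev, nxt, cov)
print(pairs)
```

Each region is matched to its displaced counterpart in the next frame, satisfying both proximity and exclusion.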