Abstract: This paper presents a novel statistical description of
the counterpoise effective length due to lightning surges, where the
(impulse) effective length is obtained by means of regression
formulas applied to transient simulation results. The effective
length is described in terms of a statistical distribution function, from
which median, mean, variance, and other parameters of interest could
be readily obtained. The influence of lightning current amplitude,
lightning front duration, and soil resistivity on the effective length is
accounted for by treating these parameters as random variables. A
method for determining the optimal counterpoise length, in terms of
the statistical impulse effective length, is also presented. It is based on
estimating the number of dangerous events associated with lightning
strikes. The proposed statistical description and associated method
provide valuable information that could aid the design engineer in
optimising physical lengths of counterpoises in different grounding
arrangements and soil resistivity situations.
Abstract: This paper presents a reliability-indices evaluation of the
rotor core magnetization of an induction motor operated as a
self-excited induction generator (SEIG), using a probability
distribution approach and Monte Carlo simulation. During the
experimental study, parallel capacitors with a calculated minimum
capacitance were connected across the terminals of the induction
motor operated as a SEIG with unregulated shaft speed. A three-phase,
four-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with
a DC shunt motor was tested in the electrical machine laboratory with
variable reactive loads. Based on this experimental study, it is
possible to choose a reliable induction machine to operate as a SEIG
for unregulated renewable energy applications in remote areas or
where the grid is unavailable.
The failure density function, cumulative failure distribution function,
survivor function, hazard model, probability of success, and
probability of failure for the reliability evaluation of the three-phase
induction motor operating as a SEIG are presented graphically
in this paper.
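The reliability functions named above are linked by standard identities (F = 1 − R, h = f/R). As a generic illustration only, a minimal sketch with an assumed two-parameter Weibull lifetime model (the shape and scale values here are hypothetical, not the paper's fitted parameters):

```python
import numpy as np

# Hypothetical Weibull lifetime model: shape k, scale lam (in hours).
k, lam = 1.8, 5000.0
t = np.linspace(1.0, 20000.0, 400)

F = 1.0 - np.exp(-(t / lam) ** k)        # cumulative failure distribution
R = 1.0 - F                              # survivor (reliability) function
f = (k / lam) * (t / lam) ** (k - 1) * np.exp(-(t / lam) ** k)  # failure density
h = f / R                                # hazard rate (increasing for k > 1)

# Probability of success / failure at a mission time of 1000 h:
i = np.argmin(np.abs(t - 1000.0))
print(f"R(1000 h) ~ {R[i]:.3f}, F(1000 h) ~ {F[i]:.3f}")
```

Plotting F, R, f, and h over t reproduces the kind of graphical reliability summary the abstract describes.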
Abstract: A reliability-based methodology for the assessment
and evaluation of reinforced concrete (R/C) structural elements of
concrete structures is presented herein. The results of the reliability
analysis and assessment for R/C structural elements were verified by
the results obtained through deterministic methods. The outcomes of
the reliability-based analysis were compared against currently
adopted safety limits that are incorporated in the reliability indices
β’s, according to international standards and codes. The methodology
is based on probabilistic analysis using reliability concepts and
statistics of the main random variables relevant to the subject
matter, which are used in the performance-function
equation(s) associated with the structural elements under study.
These techniques yield the reliability index β, a measure that can be
utilized to assess and evaluate the safety, human risk, and
functionality of the structural component. Also, these
methods can yield revised partial safety factor values for given
target reliability indices, which can be used to redesign the R/C
elements of the building and to assist in considering other
remedial actions to improve the safety and functionality
of the member.
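As a generic illustration of the reliability-index concept (not the paper's specific performance functions), consider a linear performance function g = R − S with independent normal resistance R and load effect S; then β = (μ_R − μ_S)/√(σ_R² + σ_S²) and the failure probability is P_f = Φ(−β), which a Monte Carlo run can confirm. All distribution parameters below are assumed for the sketch:

```python
import math
import numpy as np

# Hypothetical normal resistance R and load effect S (units arbitrary).
mu_R, sd_R = 30.0, 3.0
mu_S, sd_S = 20.0, 2.5

# Reliability index for the linear performance function g = R - S:
beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)
Pf_exact = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))  # Phi(-beta)

# Monte Carlo estimate of the failure probability P(g < 0):
rng = np.random.default_rng(1)
n = 1_000_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
Pf_mc = np.mean(g < 0.0)

print(f"beta = {beta:.3f}, Pf(exact) = {Pf_exact:.2e}, Pf(MC) = {Pf_mc:.2e}")
```

For nonlinear performance functions or non-normal variables the closed form no longer holds, which is where simulation-based reliability analysis earns its keep.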
Abstract: Estimation of a proportion has many applications in
economics and social studies. A common application is the estimation
of the low income proportion, which gives the proportion of people
classified as poor within a population. In this paper, we present this
poverty indicator and propose to use the logistic regression estimator
for the problem of estimating the low income proportion. Various
sampling designs are presented. Using a real data set obtained
from the European Survey on Income and Living Conditions, Monte
Carlo simulation studies are carried out to analyze the empirical
performance of the logistic regression estimator under the various
sampling designs considered in this paper. Results derived from
Monte Carlo simulation studies indicate that the logistic regression
estimator can be more accurate than the customary estimator under
the various sampling designs considered in this paper. The stratified
sampling design can also provide more accurate results.
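The comparison described above can be sketched generically. The snippet below is an assumption-laden toy, not the paper's estimator or data: it builds a synthetic population with a single auxiliary variable, fits a logistic model on each simple random sample, and compares a model-assisted (difference-type) logistic regression estimator of the proportion against the customary sample proportion by Monte Carlo root mean square error:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical synthetic population: auxiliary variable x and a poverty
# indicator z generated from a logistic model (an assumption of this sketch).
N = 20_000
x = rng.normal(0.0, 1.0, N)
p_true = 1.0 / (1.0 + np.exp(-(-1.0 + 1.5 * x)))
z = rng.binomial(1, p_true)
P = z.mean()  # true low income proportion in the synthetic population

def fit_logistic(xs, zs, iters=30):
    """Newton-Raphson fit of P(z = 1 | x) = expit(b0 + b1 * x)."""
    X = np.column_stack([np.ones_like(xs), xs])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        H = X.T @ (X * (p * (1.0 - p))[:, None])
        beta += np.linalg.solve(H, X.T @ (zs - p))
    return beta

n, reps = 400, 300
err_direct, err_logit = [], []
for _ in range(reps):
    s = rng.choice(N, size=n, replace=False)        # simple random sample
    b = fit_logistic(x[s], z[s])
    fitted = 1.0 / (1.0 + np.exp(-(b[0] + b[1] * x)))
    # Model-assisted (difference) estimator using population-level predictions:
    est_logit = fitted.mean() + (z[s] - fitted[s]).mean()
    err_direct.append(z[s].mean() - P)
    err_logit.append(est_logit - P)

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"RMSE direct: {rmse(err_direct):.4f}, logistic-assisted: {rmse(err_logit):.4f}")
```

When the auxiliary variable is informative, as here by construction, the assisted estimator's errors concentrate more tightly around zero than the direct proportion's.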
Abstract: Photoacoustic imaging (PAI) is a non-invasive and
non-ionizing imaging modality that combines the absorption contrast
of light with ultrasound resolution. Laser is used to deposit optical
energy into a target (i.e., optical fluence). Consequently, the target
temperature rises and thermal expansion occurs, which leads to
the generation of a PA signal. In general, most image reconstruction
algorithms for PAI assume uniform fluence within an imaging object.
However, it is known that optical fluence distribution within the
object is non-uniform. This could affect the reconstruction of PA
images. In this study, we have investigated the influence of optical
fluence distribution on PA back-propagation imaging using the finite
element method. The uniform fluence was simulated as a triangular
waveform within the object of interest. The non-uniform fluence
distribution was estimated by solving light propagation within a
tissue model via Monte Carlo method. The results show that the PA
signal in the case of non-uniform fluence is wider than the uniform
case by 23%. The frequency spectrum of the PA signal due to the
non-uniform fluence lacks some high-frequency components in
comparison to the uniform case. Consequently, the reconstructed
image with the non-uniform fluence exhibits a strong smoothing
effect.
Abstract: This paper proposes a linear mixed model (LMM) with spatial effects to forecast rice and cassava yields in Thailand simultaneously. A multivariate conditional autoregressive (MCAR) model is assumed to represent the spatial effects. A Bayesian method is used for parameter estimation via Gibbs sampling Markov chain Monte Carlo (MCMC). The model is applied to monthly rice and cassava yield data extracted from the Office of Agricultural Economics, Ministry of Agriculture and Cooperatives of Thailand. The results show that the proposed model performs better in most provinces, in both the fitting and validation parts, compared to the simple exponential smoothing and conditional autoregressive (CAR) models from our previous study.
Abstract: In this paper, we investigate the effect of a real-valued transformation of the spectral matrix of the received data on the angles-of-arrival estimation problem. In particular, the unitary transformation of the Partial Propagator (UPP) for narrowband sources is proposed and applied to a Uniform Linear Array (ULA).
Monte Carlo simulations demonstrate the performance of the UPP spectrum compared with the Forward Backward Partial Propagator (FBPP) and the Unitary Propagator (UP). The results show that when some of the sources are fully correlated and closer than the Rayleigh angular resolution limit of the broadside array, the UPP method outperforms the FBPP in both spatial resolution and complexity.
Abstract: A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outlier components in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs sampling Markov chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand. The root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison. The study reveals that the proposed model is more appropriate than the exponential smoothing method.
Abstract: This paper describes some of the results obtained through a study in the field of particle physics. The study consists of two parts: one concerns the measurement of the cross section of the decay of the Z particle into two electrons, and the other deals with the measurement of the cross section of the multi-photon absorption process using a laser beam in a Liquid Argon Time Projection Chamber.
The first part of the paper concerns the results based on the analysis of a data sample containing 8120 ee candidates used to reconstruct the mass of the Z particle for each event, where each event has an ee pair with pT(e) > 20 GeV and η(e) < 2.5. Monte Carlo templates of the reconstructed Z particle were produced as a function of the Z mass scale. The distribution of the reconstructed Z mass in the data was compared to the Monte Carlo templates, and the total cross section is calculated to be 1432 pb.
The second part concerns the Liquid Argon Time Projection Chamber (LAr TPC) and the results of the interaction of a UV laser (Nd:YAG, λ = 266 nm) with LAr, through the study of the multi-photon ionization process as part of the R&D at Bern University. The main result of this study was the cross section of the multi-photon ionization process of LAr, σe = (1.24 ± 0.10(stat) ± 0.30(sys)) × 10^-56 cm^4.
Abstract: The problem of estimating a proportion has important
applications in the field of economics, and in general, in many areas
such as social sciences. A common application in economics is
the estimation of the headcount index. In this paper, we define the
general headcount index as a proportion. Furthermore, we introduce
a new quantitative method for estimating the headcount index. In
particular, we suggest using the logistic regression estimator for the
problem of estimating the headcount index. Using a real data set,
results derived from Monte Carlo simulation studies indicate that the
logistic regression estimator can be more accurate than the traditional
estimator of the headcount index.
Abstract: Unit root tests based on a robust estimator for the first-order autoregressive process are proposed and compared with unit root tests based on the ordinary least squares (OLS) estimator. The percentiles of the null distributions of the unit root tests are also reported. The empirical probabilities of Type I error and the powers of the unit root tests are estimated via Monte Carlo simulation. Simulation results show that all unit root tests can control the probability of Type I error in all situations. The empirical powers of the unit root tests based on the robust estimator are higher than those of the tests based on the OLS estimator.
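The Monte Carlo procedure for estimating the empirical Type I error can be sketched for the OLS-based case (this is a generic Dickey-Fuller-style illustration, not the paper's robust estimator): simulate random walks under the unit root null, compute the t-statistic of the lagged level in the OLS regression of the differences, and count rejections against an approximate 5% critical value:

```python
import numpy as np

rng = np.random.default_rng(42)

def df_tstat(y):
    """OLS-based Dickey-Fuller t-statistic (no constant): regress dy on y[t-1]."""
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

T, reps, crit = 100, 4000, -1.95  # approximate 5% DF critical value (no constant)
rejections = 0
for _ in range(reps):
    y = np.cumsum(rng.standard_normal(T))  # random walk: the unit root null
    if df_tstat(y) < crit:
        rejections += 1

print(f"empirical Type I error: {rejections / reps:.3f}")
```

Replacing the OLS slope with a robust alternative inside `df_tstat`, and re-simulating under stationary alternatives, yields the power comparisons the abstract reports.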
Abstract: Risk evaluation is an important step in protecting your workers and your business, as well as in complying with the law. It helps you focus on the risks that really matter in your workplace – the ones with the potential to cause real harm. In this paper we introduce the basics of risk assessment and then describe several computer-based approaches to risk evaluation, in particular Monte Carlo simulation and Microsoft Project.
We use the Program Evaluation and Review Technique (PERT) to evaluate and assess risks in industrial facilities. Using the PERT technique in Microsoft Project via the PERT toolbar, and using the PERTMASTER program together with the Primavera program, we evaluate many hazards and perform the corresponding calculations with mathematical equations to support sound decisions. We define and calculate the risk factor and risk severity to rank each type of risk and then address it using, among other means, probability computations, curves, and tables. By introducing variables into the function equations in the computer programs, we calculate the risk in time and cost for the general case and then present some examples from the field of industrial facilities.
Abstract: The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and nonnormal distributions. This study evaluates the performance of CB-SEM and PLS-SEM under normality and nonnormality conditions via simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables, each latent variable having three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. For larger sample sizes, CB-SEM estimates have lower variability than PLS-SEM. Under nonnormality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
Abstract: This paper investigates the suitability of Latin Hypercube Sampling (LHS) for composite electric power system reliability analysis. Each sample generated in LHS is mapped into an equivalent system state and used for evaluating the annualized system and load point indices. A DC load-flow-based state evaluation model is solved for each sampled contingency state. The indices evaluated are the loss of load probability, loss of load expectation, expected demand not served, and expected energy not supplied. The application of LHS is illustrated through case studies carried out using the RBTS and IEEE-RTS test systems. The results obtained are compared with non-sequential Monte Carlo simulation and state enumeration analytical approaches. An error analysis is also carried out to check the LHS method's ability to capture the distributions of the reliability indices. It is found that the LHS approach estimates indices nearer to the actual values and gives tighter bounds on the indices than non-sequential Monte Carlo simulation.
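The variance-reduction behaviour reported above can be shown generically (the toy integrand below is an assumption of this sketch, not a power-system model): Latin Hypercube sampling places exactly one point in each of n equal strata per coordinate, which typically tightens estimates of a smooth expectation relative to plain Monte Carlo at the same sample size:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """n Latin Hypercube points in [0,1)^d: one point per stratum per coordinate."""
    u = np.empty((n, d))
    for j in range(d):
        u[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return u

# Toy expectation E[x1^2 + x2^2] over the unit square; true value is 2/3.
g = lambda u: (u ** 2).sum(axis=1)
n, d, reps = 64, 2, 300

mc_est = [g(rng.random((n, d))).mean() for _ in range(reps)]
lhs_est = [g(latin_hypercube(n, d, rng)).mean() for _ in range(reps)]

print(f"std plain MC : {np.std(mc_est):.4f}")
print(f"std LHS      : {np.std(lhs_est):.4f}")
```

The spread of the LHS estimates across repetitions is markedly smaller, which mirrors the tighter index bounds the abstract reports for the power-system indices.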
Abstract: In many practical applications in various areas, such as engineering, science, and social science, it is known that there exist bounds on the values of unknown parameters: for example, values of some measurements for controlling machines in an industrial process, weights or heights of subjects, blood pressures of patients, and retirement ages of public servants. When interval estimation is considered in a situation where the parameter to be estimated is bounded, it has been argued that the classical Neyman procedure for setting confidence intervals is unsatisfactory, because the information regarding the restriction is simply ignored. It is, therefore, of significant interest to construct confidence intervals that incorporate the additional information that the parameter values are bounded, to enhance the accuracy of the interval estimation. In this paper, we therefore propose a new confidence interval for the coefficient of variation where the population mean and standard deviation are bounded. The proposed interval is evaluated in terms of coverage probability and expected length via Monte Carlo simulation.
Abstract: In this paper, we propose two new confidence intervals for the inverse of a normal mean with a known coefficient of variation. The first is constructed based on the pivotal statistic Z, where Z has a standard normal distribution, and the second is constructed based on the generalized confidence interval presented by Weerahandi. We examine the performance of these confidence intervals in terms of coverage probabilities and average lengths via Monte Carlo simulation.
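Coverage probability and average length are the Monte Carlo figures of merit used throughout these abstracts. As a generic illustration of how they are estimated (using the textbook known-variance z-interval for a normal mean, not the intervals proposed in these papers):

```python
import numpy as np

rng = np.random.default_rng(3)

mu, sigma, n, reps = 10.0, 2.0, 25, 5000
z = 1.96  # approximate 97.5% standard normal quantile

covered, lengths = 0, []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    half = z * sigma / np.sqrt(n)          # known-sigma z-interval half-width
    lo, hi = x.mean() - half, x.mean() + half
    covered += (lo <= mu <= hi)
    lengths.append(hi - lo)

print(f"coverage probability ~ {covered / reps:.3f}")
print(f"average length       ~ {np.mean(lengths):.3f}")
```

Substituting any candidate interval construction for `lo, hi` and sweeping the parameter settings reproduces the kind of coverage/length comparison these papers report.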
Abstract: Motivated by the recent work of Herbert, Hayen, Macaskill and Walter [Interval estimation for the difference of two independent variances. Communications in Statistics, Simulation and Computation, 40: 744-758, 2011], we investigate in this paper new confidence intervals for the difference between two normal population variances, based on the generalized confidence interval of Weerahandi [Generalized Confidence Intervals. Journal of the American Statistical Association, 88(423): 899-905, 1993] and the closed-form method of variance estimation of Zou, Huo and Taleban [Simple confidence intervals for lognormal means and their differences with environmental applications. Environmetrics, 20: 172-180, 2009]. Monte Carlo simulation results indicate that our proposed confidence intervals give better coverage probability than the existing confidence interval. The two new confidence intervals also perform similarly in terms of their coverage probabilities and average lengths.
Abstract: Inferring the network structure from time series data
is a hard problem, especially if the time series is short and noisy.
DNA microarrays are a technology that allows monitoring the mRNA
concentrations of thousands of genes simultaneously and produces
data with these characteristics. In this study we investigate the
influence of the experimental design on the quality of the result.
More precisely, we investigate the influence of two different types of
random single gene perturbations on the inference of genetic networks
from time series data. To obtain an objective quality measure for
this influence we simulate gene expression values with a biologically
plausible model of a known network structure. Within this framework
we study the influence of single gene knock-outs, as opposed to
linearly controlled expression of single genes, on the quality of the
inferred network structure.
Abstract: A high-performance Monte Carlo simulation, which
simultaneously takes diffusion-controlled and chain-length-dependent
bimolecular termination reactions into account, is developed to
simulate atom transfer radical copolymerization of styrene and n-butyl
acrylate. As expected, increasing the initial feed fraction of styrene
raises the fraction of styrene-styrene dyads (fAA) and reduces that of
n-butyl acrylate dyads (fBB). The randomness parameter (fAB)
also varies significantly during the copolymerization.
Also, there is a drift in copolymer heterogeneity, and the highest drift
occurs for initial feeds containing lower percentages of styrene, i.e.,
20% and 5%.
Abstract: The many feasible alternatives and conflicting
objectives make equipment selection in materials handling a
complicated task. This paper presents the use of Monte Carlo (MC)
simulation combined with the Analytic Hierarchy Process (AHP) to
evaluate and select the most appropriate Material Handling
Equipment (MHE). The proposed hybrid model was built on the basis
of the material handling equation to identify the main and sub-criteria
to MHE selection. The criteria illustrate the properties of the material
to be moved, characteristics of the move, and the means by which the
materials will be moved. The use of MC simulation alongside the
AHP is powerful because it allows the decision maker to represent
his or her preference judgments as random variables. This reduces
the uncertainty of the single-point judgments of conventional AHP
and provides more confidence in the decision problem results. A small
business pharmaceutical company is used as an example to illustrate
the development and application of the proposed model.
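A minimal sketch of the MC-plus-AHP idea (the 3-criterion comparison matrix, the lognormal judgment noise, and the row-geometric-mean prioritization are assumptions of this illustration, not the paper's model): each pairwise judgment is drawn as a random variable around a nominal value, priorities are computed per draw, and the resulting weight distributions replace single-point priorities:

```python
import numpy as np

rng = np.random.default_rng(5)

# Nominal (hypothetical) pairwise comparison judgments for 3 criteria:
# nominal[i, j] = how strongly criterion i is preferred over criterion j.
nominal = np.array([[1.0,   3.0,   5.0],
                    [1/3.0, 1.0,   2.0],
                    [1/5.0, 1/2.0, 1.0]])

def priorities(A):
    """Row geometric mean method: a standard AHP prioritization scheme."""
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return gm / gm.sum()

draws = 2000
weights = np.empty((draws, 3))
for k in range(draws):
    A = nominal.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            # Perturb each upper-triangle judgment with lognormal noise,
            # keeping the matrix reciprocal: A[j, i] = 1 / A[i, j].
            A[i, j] = nominal[i, j] * rng.lognormal(0.0, 0.15)
            A[j, i] = 1.0 / A[i, j]
    weights[k] = priorities(A)

mean_w = weights.mean(axis=0)
print("mean priority weights:", np.round(mean_w, 3))
print("weight std deviations:", np.round(weights.std(axis=0), 3))
```

The standard deviations of the sampled weights quantify how sensitive the equipment ranking is to judgment uncertainty, which is the confidence information a single-point AHP run cannot provide.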