Abstract: Edgeworth approximation, bootstrap and Monte Carlo simulations have a considerable impact on the results obtained for a wide range of problems. In this paper, we treat a financial case: the effect that the components of the cash flow of one of the most successful businesses in the world, namely the financing, operating and investing activities, have on cash and cash equivalents at the end of each three-month period. To obtain a better view of this case, we built a Vector Autoregression model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulations and residual bootstrap, based on the standard errors of each series. The generated results showed common tendencies across the three methods, which in turn confirms their value for the optimization of a model containing many variables.
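As a toy illustration of the residual-bootstrap step, a univariate AR(1) can stand in for one equation of the VAR (all parameter values below are hypothetical, not those of the study); the impulse response and a bootstrap percentile band are obtained as:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate an AR(1) series as a stand-in for one VAR equation (hypothetical data)
T, phi_true = 300, 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.standard_normal()

# OLS estimate of the autoregressive coefficient and its residuals
x, z = y[:-1], y[1:]
phi_hat = (x @ z) / (x @ x)
resid = z - phi_hat * x
resid = resid - resid.mean()          # centre residuals before resampling

# Residual bootstrap: resample residuals, rebuild the series, re-estimate,
# and record the impulse response phi**h at horizon h
h, B = 5, 1000
irfs = np.empty(B)
for b in range(B):
    e = rng.choice(resid, size=T - 1, replace=True)
    yb = np.zeros(T)
    for t in range(1, T):
        yb[t] = phi_hat * yb[t - 1] + e[t - 1]
    xb, zb = yb[:-1], yb[1:]
    irfs[b] = ((xb @ zb) / (xb @ xb)) ** h

band_lo, band_hi = np.quantile(irfs, [0.025, 0.975])  # 95% percentile band
```

The same resample-rebuild-re-estimate loop generalizes to a full VAR, where the impulse responses are powers of the companion matrix rather than of a scalar coefficient.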
Abstract: This research investigates the effects of heteroscedasticity and periodicity in a Panel Data Regression Model (PDRM) by extending previous work on balanced panel data estimation in the context of fitting a PDRM for banks' audit fees. Estimation of the model was achieved through the derivation of a joint Lagrange Multiplier (LM) test for homoscedasticity and zero serial correlation, a conditional LM test for zero serial correlation given heteroscedasticity of varying degrees, and a conditional LM test for homoscedasticity given first-order positive serial correlation, via a two-way error component model. Monte Carlo simulations were carried out for 81 different variations, whose design assumed a uniform distribution under a linear heteroscedasticity function. Each variation was iterated 1000 times, and the three estimators considered were assessed on the variance, absolute bias (ABIAS), mean square error (MSE) and root mean square error (RMSE) of the parameter estimates. Eighteen different models were fitted at different specified conditions; the best-fitted model is that of the within estimator when heteroscedasticity is severe, at either zero or positive serial correlation. The LM test results showed that the tests have good size and power, as all three tests are significant at 5% for the specified linear form of the heteroscedasticity function, establishing that banks' operations are severely heteroscedastic in nature with little or no periodicity effect.
Abstract: Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution for sustainable sanitation with the development of an innovative toilet system, called the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. This study has determined the factors contributing most to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOx emissions, amount of ash and transportation of fertilizer. The analysis has provided the distributions and confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of the NMT system.
Last but not least, this study also yields essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
Abstract: This study examines conditional Value at Risk by applying the GJR-EVT-Copula model and finds the optimal portfolio for eight Dow Jones Islamic-conventional pairs. Our methodology consists of modeling the data with a bivariate GJR-GARCH model, from which we extract the filtered residuals, and then applying the peaks-over-threshold (POT) model to fit the residual tails in order to model the marginal distributions. We then use a pair-copula to model the dependence structure of the optimal portfolio risk. Finally, using Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results report the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we find that the optimal investment concentrates on the Islamic-conventional US market index pair, which receives a high investment proportion, whereas all other index pairs receive low proportions. These results carry practical implications for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification benefits of these markets.
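As a minimal sketch of the final Monte Carlo step (the return model below is a generic heavy-tailed placeholder, not the fitted GJR-EVT-Copula of the paper), VaR and CVaR can be read off simulated portfolio returns as:

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder: simulated portfolio returns from a heavy-tailed Student-t;
# in the paper these would come from the fitted GJR-EVT-Copula model
n = 100_000
returns = 0.0005 + 0.01 * rng.standard_t(df=4, size=n)

alpha = 0.99
var = -np.quantile(returns, 1 - alpha)       # 99% Value at Risk (loss quantile)
cvar = -returns[returns <= -var].mean()      # expected shortfall beyond the VaR
```

By construction CVaR is at least as large as VaR, since it averages the losses in the tail beyond the VaR threshold.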
Abstract: Classically, an energy detector is implemented in the time domain (TD). However, a frequency domain (FD) energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches in order to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and probability of detection (Pd) are derived for both approaches. The derived expressions lead naturally to an analytical as well as intuitive explanation of the improved Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer sizes: Pf is improved in FD, whereas Pd is enhanced in TD energy detectors. Finally, Monte Carlo simulation results corroborate the analysis based on the derived expressions.
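A rough illustration of how such Pf/Pd figures are obtained by simulation, using the TD energy statistic (the buffer size, SNR and signal model below are illustrative placeholders, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64              # samples per sensing buffer (placeholder)
snr_db = -5.0       # placeholder SNR
trials = 20000
target_pf = 0.05

# H0: noise only; H1: BPSK-like signal plus noise (unit noise variance)
noise0 = rng.standard_normal((trials, N))
amp = np.sqrt(10 ** (snr_db / 10))
sig = amp * rng.choice([-1.0, 1.0], size=(trials, N))
noise1 = rng.standard_normal((trials, N))

e0 = (noise0 ** 2).sum(axis=1)            # TD energy statistic under H0
e1 = ((sig + noise1) ** 2).sum(axis=1)    # TD energy statistic under H1

thr = np.quantile(e0, 1 - target_pf)      # empirical threshold for Pf = 5%
pf = (e0 > thr).mean()
pd = (e1 > thr).mean()
```

An FD (periodogram-based) variant would replace the energy statistic with a sum over FFT bins; comparing the resulting pf/pd pairs over buffer sizes reproduces the kind of trade-off the paper analyzes.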
Abstract: Uncertainties related to fatigue damage estimation of
non-linear systems are highly dependent on the tail behaviour
and extreme values of the stress range distribution. By using
a combination of the First Order Reliability Method (FORM)
and Monte Carlo simulations (MCS), the accuracy of the fatigue estimates may be improved for the same computational effort.
The method is applied to a bottom-fixed, monopile-supported large
offshore wind turbine, which is a non-linear and dynamically sensitive
system. Different curve fitting techniques to the fatigue damage
distribution have been used depending on the sea-state dependent
response characteristics, and the effect of a bi-linear S-N curve is
discussed. Finally, analyses are performed on several environmental
conditions to investigate the long-term applicability of this multistep
method. Wave loads are calculated using state-of-the-art theory, while
wind loads are applied with a simplified model based on rotor thrust
coefficients.
Abstract: A multilayer passive shield composed of low-activity
lead (Pb), copper (Cu), tin (Sn) and iron (Fe) was designed and
manufactured for a coaxial HPGe detector placed at a surface
laboratory for reducing background radiation and radiation dose to
the personnel. The performance of the shield was evaluated and efficiency curves of the detector were plotted using various standard sources at different distances. Monte Carlo simulations and a set of TLD chips were used for dose estimation at two distances, 20 and 40 cm. The results show that the shield reduced the background spectrum and the personnel dose by more than 95%.
Abstract: A cyclostationary Gaussian linearization method is
formulated for investigating the time average response of nonlinear
system under sinusoidal signal and white noise excitation. The
quantitative measure of cyclostationary mean, variance, spectrum of
mean amplitude, and mean power spectral density of noise are
analyzed. The qualitative response behavior of stochastic jump and
bifurcation are investigated. The validity of the present approach in
predicting the quantitative and qualitative statistical responses is
supported by Monte Carlo simulations. The present analysis imposes no restrictive analytical conditions and reduces directly to solving non-linear algebraic equations. The analytical solution gives reliable quantitative and qualitative predictions of the mean and noise response for the Duffing system subjected to combined sinusoidal signal and white noise excitation.
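The validating Monte Carlo runs can be sketched with a semi-implicit Euler–Maruyama simulation of a Duffing oscillator under combined harmonic and white-noise forcing (all coefficients below are illustrative, not those of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# Duffing: x'' + c x' + k x + a x**3 = F sin(w t) + sqrt(2 D) xi(t)
c, k, a = 0.2, 1.0, 0.1       # damping, linear and cubic stiffness (placeholders)
F, w, D = 0.3, 1.0, 0.05      # forcing amplitude/frequency, noise intensity
dt, T, n_paths = 0.005, 100.0, 100
n = int(T / dt)

x = np.zeros(n_paths)
v = np.zeros(n_paths)
xs = np.empty((n, n_paths))
for i in range(n):
    t = i * dt
    noise = np.sqrt(2 * D * dt) * rng.standard_normal(n_paths)
    v += (-c * v - k * x - a * x ** 3 + F * np.sin(w * t)) * dt + noise
    x += v * dt                # semi-implicit (symplectic) Euler step
    xs[i] = x

# Discard the transient, then form cyclostationary ensemble statistics
tail = xs[n // 2:]
mean_t = tail.mean(axis=1)    # ensemble mean vs time (oscillates at w)
var_t = tail.var(axis=1)      # ensemble variance vs time
```

Because every path shares the same deterministic forcing phase, the ensemble mean oscillates at the drive frequency while the variance reflects the noise response, which is exactly the cyclostationary structure the linearization method targets.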
Abstract: Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system.
This paper presents a statistical and probabilistic investigation on
the resistance of RC beams designed according to Eurocodes 2 and 8,
and subjected to multiple failure modes, under both the natural
variation of material properties and the uncertainty associated with
cross-section and transverse reinforcement geometry. A full
probabilistic model based on JCSS Probabilistic Model Code is
derived. Different beams are studied through material nonlinear
analysis via Monte Carlo simulations. The resistance model is
consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed.
Results show that the ultimate load behaviour of RC beams
subjected to flexural and shear failure modes seems to be mainly
influenced by the combination of the mechanical properties of both
longitudinal reinforcement and stirrups, and the tensile strength of
concrete, the latter of which appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
Abstract: Estimation of a proportion has many applications in
economics and social studies. A common application is the estimation
of the low income proportion, which gives the proportion of people
classified as poor within a population. In this paper, we present this poverty indicator and propose the logistic regression estimator for the problem of estimating the low income proportion. Various sampling designs are presented. Using a real data set obtained from the European Survey on Income and Living Conditions, Monte
Carlo simulation studies are carried out to analyze the empirical
performance of the logistic regression estimator under the various
sampling designs considered in this paper. Results derived from
Monte Carlo simulation studies indicate that the logistic regression
estimator can be more accurate than the customary estimator under
the various sampling designs considered in this paper. The stratified
sampling design can also provide more accurate results.
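A hypothetical sketch of one replication of such a study, using a synthetic population and a model-assisted "difference" form of the logistic regression estimator (the real study uses EU-SILC data and several sampling designs, none of which are reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic population: auxiliary variable x (e.g. standardized income proxy)
# and poverty indicator y; the logistic relation below is invented
N = 5000
x = rng.normal(0.0, 1.0, N)
p = 1.0 / (1.0 + np.exp(-(-1.5 - 1.2 * x)))   # poverty less likely for high x
y = (rng.random(N) < p).astype(float)
true_prop = y.mean()                           # population low income proportion

def fit_logit(xs, ys, iters=2000, lr=1.0):
    """Plain gradient ascent for a two-parameter logistic model."""
    b0 = b1 = 0.0
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-(b0 + b1 * xs)))
        b0 += lr * (ys - mu).mean()
        b1 += lr * ((ys - mu) * xs).mean()
    return b0, b1

n = 300
s = rng.choice(N, size=n, replace=False)       # simple random sample
customary = y[s].mean()                        # customary sample proportion

b0, b1 = fit_logit(x[s], y[s])
mu_all = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))  # model predictions, whole population
# Model-assisted (difference) logistic estimator: population mean of the
# predictions plus a sample-based bias correction
logistic_est = mu_all.mean() + (y[s] - mu_all[s]).mean()
```

Repeating the sample-and-estimate step many times and comparing empirical MSEs of `customary` and `logistic_est` gives the kind of Monte Carlo comparison the abstract describes; when the auxiliary variable is informative, the model-assisted estimator typically has lower variance.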
Abstract: In this paper, we investigate the effect of a real-valued transformation of the spectral matrix of the received data on the Angles of Arrival estimation problem. Specifically, the unitary transformation of the Partial Propagator (UPP) for narrowband sources is proposed and applied to a Uniform Linear Array (ULA).
Monte Carlo simulations demonstrate the performance of the UPP spectrum in comparison with the Forward Backward Partial Propagator (FBPP) and the Unitary Propagator (UP). The results show that when some of the sources are fully correlated and closer than the Rayleigh angular resolution limit of the broadside array, the UPP method outperforms the FBPP in both spatial resolution and complexity.
Abstract: In this article, we consider the estimation of P[Y < X], where the strength X and the stress Y are independent Burr Type XII random variables. The MLE of R based on a simple iterative procedure is obtained. Assuming that the common parameter is known, the maximum likelihood estimator, the uniformly minimum variance unbiased estimator and the Bayes estimator of P[Y < X] are discussed. The exact confidence interval for R is also obtained. Monte Carlo simulations are performed to compare the different proposed methods.
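When the shape parameter c is common to both variables, R = P[Y < X] has the closed form k_y / (k_x + k_y), which makes it easy to sanity-check a Monte Carlo estimate (the parameter values below are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(11)

# Burr Type XII: F(t) = 1 - (1 + t**c)**(-k), t > 0; inverse-transform sampling
def rburr12(c, k, size, rng):
    u = rng.random(size)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

c = 2.0                 # common (known) shape parameter
k_x, k_y = 2.0, 1.0     # shape parameters of strength X and stress Y
n = 200_000

X = rburr12(c, k_x, n, rng)   # strength
Y = rburr12(c, k_y, n, rng)   # stress

R_mc = (Y < X).mean()                 # Monte Carlo estimate of P[Y < X]
R_exact = k_y / (k_x + k_y)           # closed form for common c
```

The closed form follows from E[F_Y(X)] = 1 - k_x / (k_x + k_y), integrating the Burr survival function against the Burr density with the same c.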
Abstract: In this paper some procedures for building confidence intervals for the reliability in stress-strength models are discussed and empirically compared. The particular case of a bivariate normal setup is considered. The confidence intervals suggested are obtained employing approximations or asymptotic properties of maximum likelihood estimators. The coverage and the precision of these intervals are empirically checked through a simulation study. An application to real paired data is also provided.
Abstract: This paper proposes a new performance characterization for the test strategy for second-order filters denominated the Transient Analysis Method (TRAM). We evaluate the ability of this test strategy to detect deviation faults under simultaneous statistical fluctuation of the non-faulty parameters. For this purpose, we use Monte Carlo simulations and a fault model that considers as faulty only one component of the filter under test, while the other components adopt random values (within their tolerance bands) drawn from their statistical distributions. The new data reported here show (for the filters under study) the presence of hard-to-test components and relatively low fault coverage values for small deviation faults. These results suggest that the fault coverage value obtained using only nominal values for the non-faulty components (the traditional evaluation of TRAM) seems to be a poor predictor of the test performance.
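The fault-model idea can be illustrated with a deliberately simplified stand-in: a single RC stage instead of a second-order filter, with made-up tolerances, and the cutoff frequency as the observed quantity rather than TRAM's transient-based statistic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy circuit under test: f_c = 1 / (2*pi*R*C). Non-faulty components
# fluctuate uniformly within a 10% tolerance band (all values invented).
R0, C0, tol, trials = 1e3, 1e-7, 0.10, 20000

def fc(R, C):
    return 1.0 / (2.0 * np.pi * R * C)

def draw(nominal):
    return nominal * rng.uniform(1 - tol, 1 + tol, trials)

# Acceptance band from fault-free Monte Carlo runs (both components vary)
fc_good = fc(draw(R0), draw(C0))
band_lo, band_hi = np.quantile(fc_good, [0.005, 0.995])

def coverage(deviation):
    """Fraction of faulty runs flagged: R deviated, C still fluctuating."""
    fc_bad = fc(R0 * (1 + deviation), draw(C0))
    return ((fc_bad < band_lo) | (fc_bad > band_hi)).mean()

cov_small, cov_large = coverage(0.15), coverage(0.50)
```

The same pattern, applied to a realistic filter model and test statistic, reproduces the paper's observation: small deviations of the faulty component hide inside the tolerance-induced spread of the fault-free population, so their coverage is well below that of large deviations.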
Abstract: The study of non-equilibrium systems has attracted
increasing interest in recent years, mainly due to the lack of
theoretical frameworks, unlike their equilibrium counterparts.
Studying the steady state and/or simple systems is thus one of the
main interests. Hence in this work we have focused our attention on
the driven lattice gas model (DLG model) consisting of interacting
particles subject to an external field E. The dynamics of the system
are given by hopping of particles to nearby empty sites with rates
biased for jumps in the direction of E. Using small two-dimensional DLG systems, we studied the stochastic properties at the non-equilibrium steady state analytically. To understand the non-equilibrium phenomena, we applied an analytic approach via the master equation to calculate the probability function and to analyze the violation of detailed balance in terms of the fluctuation-dissipation
theorem. Monte Carlo simulations have been performed to validate
the analytic results.
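A minimal biased-hopping sketch of such a simulation (hard-core exclusion only; the nearest-neighbour interactions of the full DLG model are omitted for brevity, so the Metropolis acceptance rule contains just the field term):

```python
import numpy as np

rng = np.random.default_rng(0)
L, rho = 16, 0.3            # lattice side and particle density (illustrative)
E_field, beta = 2.0, 1.0    # drive along +x, inverse temperature
steps = 20000

# Random initial configuration at the chosen density
occ = np.zeros((L, L), dtype=bool)
filled = rng.choice(L * L, size=int(rho * L * L), replace=False)
occ.flat[filled] = True
n0 = int(occ.sum())

moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
current = 0                 # net particle displacement along the field
for _ in range(steps):
    x, y = rng.integers(0, L, size=2)
    if not occ[x, y]:
        continue
    dx, dy = moves[rng.integers(0, 4)]
    nx, ny = (x + dx) % L, (y + dy) % L   # periodic boundaries
    if occ[nx, ny]:
        continue            # hard-core exclusion: target site occupied
    # Metropolis rate biased by the work E*dx done by the field on the hop
    if rng.random() < min(1.0, np.exp(beta * E_field * dx)):
        occ[x, y], occ[nx, ny] = False, True
        current += dx
```

Even this stripped-down version exhibits the defining non-equilibrium feature: a non-zero steady-state particle current along the field, which is incompatible with detailed balance.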
Abstract: A challenging problem in radar signal processing is to
achieve reliable target detection in the presence of interferences. In
this paper, we propose a novel algorithm for automatic censoring of
radar interfering targets in log-normal clutter. The proposed
algorithm, termed the forward automatic censored cell averaging
detector (F-ACCAD), consists of two steps: removing the corrupted
reference cells (censoring) and the actual detection. Both steps are
performed dynamically by using a suitable set of ranked cells to
estimate the unknown background level and set the adaptive
thresholds accordingly. The F-ACCAD algorithm requires no prior information about the clutter parameters, nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD
algorithm is assessed by computing, using Monte Carlo simulations,
the probability of censoring and the probability of detection in
different background environments.
Abstract: In this paper, we consider a multi-user multiple-input multiple-output (MU-MIMO) based cooperative reporting system for a
cognitive radio network. In the reporting network, the secondary
users forward the primary user data to the common fusion center
(FC). The FC is equipped with linear equalizers and an energy
detector to make the decision about the spectrum. The primary user
data are considered to be a digital video broadcasting - terrestrial
(DVB-T) signal. The sensing channel and the reporting channel are
assumed to be additive white Gaussian noise and independent identically distributed Rayleigh fading, respectively. We analyze the detection probability of the MU-MIMO system with linear equalizers and arrive at a closed-form expression for the average detection probability. The system performance is also investigated under various MIMO scenarios through Monte Carlo simulations.
Abstract: This paper proposes, implements and evaluates an original discretization method for continuous random variables, in order to estimate the reliability of systems for which stress and strength are defined as complex functions and whose reliability is not derivable through analytic techniques. The method is compared with two other discretization approaches from the literature, in part through a comparative study involving four engineering applications. The results show that the proposal is very efficient in terms of the closeness of the estimates to the true (simulated) reliability. In the study we analyzed both a normal and a non-normal distribution for the random variables: the method is theoretically suitable for any parametric family.
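A crude Monte Carlo baseline of the kind such discretization methods are benchmarked against can be sketched for a hypothetical stress-strength function (a cantilever deflection limit; every distribution and limit below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stress-strength interference: the "stress" is a non-linear
# function of several random variables (cantilever tip deflection), and the
# "strength" is an allowable deflection limit.
n = 500_000
E  = rng.normal(200e9, 10e9, n)            # Young's modulus [Pa]
b  = rng.normal(0.05, 0.002, n)            # section width [m]
h  = rng.normal(0.08, 0.003, n)            # section height [m]
P  = rng.lognormal(np.log(4000), 0.2, n)   # tip load [N]
Lb = 1.0                                   # beam length [m]

I = b * h ** 3 / 12                        # second moment of area
deflection = P * Lb ** 3 / (3 * E * I)     # cantilever tip deflection
limit = 0.004                              # allowable deflection [m]

R_mc = (deflection < limit).mean()         # crude Monte Carlo reliability
se = np.sqrt(R_mc * (1 - R_mc) / n)        # binomial standard error
```

A discretization method replaces each continuous input with a small set of weighted points and sums over the resulting combinations; its accuracy is then judged by how closely it reproduces an estimate like `R_mc` at a fraction of the evaluations.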
Abstract: Wind farms (WFs) with high levels of penetration are being established in power systems worldwide more rapidly than
other renewable resources. The Independent System Operator (ISO),
as a policy maker, should propose appropriate places for WF
installation in order to maximize the benefits for the investors. There
is also a possibility of congestion relief using the new installation of
WFs which should be taken into account by the ISO when proposing
the locations for WF installation. In this context, an efficient WF placement method is proposed in order to reduce the burden on congested lines. Since the wind speed is a random variable and load
forecasts also contain uncertainties, probabilistic approaches are used
for this type of study. AC probabilistic optimal power flow (P-OPF)
is formulated and solved using Monte Carlo Simulations (MCS). In
order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to the time-demanding MCS. Subsequently, the optimal WF placement is determined using generation shift distribution factors (GSDF), considering a new parameter termed the wind availability factor (WAF). In order to obtain more
realistic results, N-1 contingency analysis is employed to find the
optimal size of WF, by means of line outage distribution factors
(LODF). The IEEE 30-bus test system is used to demonstrate and compare the accuracy of the proposed methodology.
Abstract: This paper evaluates multilevel modulation for different techniques such as multilevel amplitude shift keying (M-ASK), differential phase shift keying (M-ASK-Bipolar), Quaternary Amplitude Shift Keying (QASK) and Quaternary Polarization-ASK (QPol-ASK) at a total bit rate of 107 Gbps. The aim is to find a cost-effective very high speed transport solution. Numerical investigation was performed using Monte Carlo simulations. The obtained results indicate that some modulation formats can be operated at 100 Gbps in optical communication systems with low implementation effort and high spectral efficiency.