Reliability Analysis of Underground Pipelines Using Subset Simulation

An advanced Monte Carlo simulation method called Subset Simulation (SS) is presented in this paper for the time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of SS in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
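
For illustration, the core idea can be sketched in a few lines of Python: a small failure probability P[g(X) <= 0] is estimated as a product of conditional probabilities of intermediate events, with a Metropolis random walk regrowing the sample population at each level. The toy limit-state function, sample sizes and proposal scale below are illustrative assumptions, not the pipeline model of the paper.

    import numpy as np

    # Minimal Subset Simulation sketch for P[g(X) <= 0] with X ~ standard normal.
    # The limit-state function, sample size and proposal scale are illustrative.
    def subset_simulation(g, dim, n=1000, p0=0.1, max_levels=10, seed=0):
        rng = np.random.default_rng(seed)
        n_seeds = int(p0 * n)
        x = rng.standard_normal((n, dim))
        v = np.array([g(xi) for xi in x])
        prob = 1.0
        for _ in range(max_levels):
            order = np.argsort(v)
            x, v = x[order], v[order]
            thr = v[n_seeds - 1]                  # intermediate failure threshold
            if thr <= 0.0:                        # final level reached
                return prob * np.mean(v <= 0.0)
            prob *= p0                            # P(F_i | F_{i-1}) ~= p0
            # Regrow the population from the seeds with a Metropolis random walk
            # restricted to the current intermediate failure region {g <= thr}.
            new_x, new_v = [], []
            for xi, vi in zip(x[:n_seeds], v[:n_seeds]):
                for _ in range(n // n_seeds):
                    cand = xi + 0.5 * rng.standard_normal(dim)
                    if rng.random() < np.exp(0.5 * (xi @ xi - cand @ cand)):
                        gc = g(cand)
                        if gc <= thr:
                            xi, vi = cand, gc
                    new_x.append(xi.copy()); new_v.append(vi)
            x, v = np.array(new_x), np.array(new_v)
        return prob * np.mean(v <= 0.0)

    # Toy limit state: failure when the sum of 10 standard normals exceeds 9.
    print(subset_simulation(lambda z: 9.0 - z.sum(), dim=10))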

Influence of Noise on the Inference of Dynamic Bayesian Networks from Short Time Series

In this paper we investigate the influence of external noise on the inference of network structures. The purpose of our simulations is to gain insights into the experimental design of microarray experiments used to infer, e.g., transcription regulatory networks. Here, external noise means that the dynamics of the system under investigation, e.g., temporal changes of mRNA concentration, are affected by measurement errors. In addition to external noise, another problem occurs in the context of microarray experiments: in practice, it is not possible to monitor the mRNA concentration over an arbitrarily long time period, as demanded by the statistical methods used to learn the underlying network structure. For this reason, we use only short time series to make our simulations more biologically plausible.

Analysis of Linear Equalizers for Cooperative Multi-User MIMO Based Reporting System

In this paper, we consider a multi-user multiple-input multiple-output (MU-MIMO) based cooperative reporting system for a cognitive radio network. In the reporting network, the secondary users forward the primary user data to a common fusion center (FC). The FC is equipped with linear equalizers and an energy detector to make the decision about the spectrum. The primary user data are considered to be a digital video broadcasting - terrestrial (DVB-T) signal. The sensing channel and the reporting channel are assumed to be additive white Gaussian noise and independent identically distributed Rayleigh fading, respectively. We analyze the detection probability of the MU-MIMO system with linear equalizers and arrive at a closed-form expression for the average detection probability. The system performance is also investigated under various MIMO scenarios through Monte Carlo simulations.
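
As a simplified illustration of Monte Carlo evaluation of detection probability, the Python sketch below estimates the detection probability of a plain energy detector over a Rayleigh-faded link. It is a single-antenna toy model, not the MU-MIMO reporting system with linear equalizers analyzed in the paper; the SNR, sample count and false-alarm target are illustrative assumptions.

    import numpy as np

    # Hedged sketch: Monte Carlo estimate of the detection probability of a simple
    # energy detector over a Rayleigh-faded link. This is a single-antenna
    # simplification, not the MU-MIMO / linear-equalizer model of the paper;
    # the SNR, sample count and threshold choice are illustrative assumptions.
    rng = np.random.default_rng(1)
    N = 128                      # samples per sensing interval
    snr_db = -5.0                # average received SNR of the primary signal
    snr = 10 ** (snr_db / 10)
    trials = 20000
    target_pfa = 0.01

    # Threshold set empirically from the noise-only (H0) energy distribution.
    noise_energy = np.sum(rng.standard_normal((trials, N)) ** 2, axis=1)
    thr = np.quantile(noise_energy, 1 - target_pfa)

    # H1: primary signal modelled as Gaussian, scaled by a Rayleigh channel gain.
    h = (rng.standard_normal(trials) + 1j * rng.standard_normal(trials)) / np.sqrt(2)
    sig = np.sqrt(snr) * np.abs(h)[:, None] * rng.standard_normal((trials, N))
    energy = np.sum((sig + rng.standard_normal((trials, N))) ** 2, axis=1)
    print("estimated Pd:", np.mean(energy > thr))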

Environmental and Economic Scenario Analysis of the Redundant Golf Courses in Japan

Commercial infrastructures intended for use as leisure retreats, such as golf and ski resorts, have been extensively developed in many rural areas of Japan. However, following the burst of the economic bubble in the 1990s, several existing resorts faced tough management decisions and some were forced to close their business. In this study, six alternative management options for restructuring existing golf courses (park, cemetery, biofuel production, reforestation, pasturing and abandonment) are examined and their environmental and economic impacts are quantitatively assessed. In addition, restructuring scenarios based on these options and an ex-ante assessment model are developed. The scenario analysis by Monte Carlo simulation shows a clear trade-off between GHG savings and benefit/cost (B/C) ratios: the "Restoring Nature" scenario absorbs the most CO2 among the four scenarios considered, but its B/C ratio is the lowest. This study can be used to select or examine options and scenarios for golf course management and rural environmental management policies.

A Framework of Monte Carlo Simulation for Examining the Uncertainty-Investment Relationship

This paper argues that increased uncertainty may, in certain situations, actually encourage investment. Since earlier studies mostly base their arguments on the assumption of geometric Brownian motion, this study extends the assumption to alternative stochastic processes, such as a mixed diffusion-jump process, a mean-reverting process, and a jump amplitude process. A general Monte Carlo simulation approach is developed to derive the optimal investment trigger for situations in which a closed-form solution cannot be readily obtained under an alternative process. The main finding is that the overall effect of uncertainty on investment is interpreted through the probability of investing, and the relationship between uncertainty and investment appears to be an inverted U-shaped curve. The implication is that uncertainty does not always discourage investment, even under several sources of uncertainty. Furthermore, high-risk projects are not always dominated by low-risk projects, because high-risk projects may have a positive realization effect that encourages investment.
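
A minimal sketch of the simulation approach, assuming illustrative parameters rather than the paper's calibration, is the following Python fragment, which estimates the probability that the project value reaches a given investment trigger within a finite horizon under a GBM and under a mixed diffusion-jump process.

    import numpy as np

    # Hedged sketch: Monte Carlo probability that project value V reaches an
    # investment trigger V* within T years, under GBM and a mixed diffusion-jump
    # process. Parameters, the trigger level and the jump law are illustrative.
    rng = np.random.default_rng(2)
    V0, trigger, T, steps, paths = 1.0, 1.8, 5.0, 250, 20000
    dt = T / steps
    mu, sigma = 0.04, 0.25                     # drift and volatility
    lam, jump_mu, jump_sig = 0.5, -0.1, 0.2    # jump intensity and log-jump law

    def prob_invest(with_jumps):
        v = np.full(paths, V0)
        hit = np.zeros(paths, dtype=bool)
        for _ in range(steps):
            dz = rng.standard_normal(paths) * np.sqrt(dt)
            logret = (mu - 0.5 * sigma**2) * dt + sigma * dz
            if with_jumps:
                n_jumps = rng.poisson(lam * dt, paths)
                logret += n_jumps * jump_mu + np.sqrt(n_jumps) * jump_sig * rng.standard_normal(paths)
            v *= np.exp(logret)
            hit |= v >= trigger
        return hit.mean()

    print("P(invest within T), GBM :", prob_invest(False))
    print("P(invest within T), jump:", prob_invest(True))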

A Novel Low Power Digitally Controlled Oscillator with Improved Linear Operating Range

In this paper, an ultra-low-power, low-jitter 12-bit CMOS digitally controlled oscillator (DCO) design is presented. The design is based on a ring oscillator implemented with low-power Schmitt-trigger-based inverters. Simulation of the proposed DCO using the 32nm CMOS Predictive Transistor Model (PTM) achieves a controllable frequency range of 550MHz~830MHz with wide linearity and high resolution. Monte Carlo simulation demonstrates that the time-period jitter due to random power supply fluctuation is under 31ps, and the power consumption is 0.5677mW at 750MHz with a 1.2V power supply and 0.53-ps resolution. The proposed DCO has good robustness to voltage and temperature variations and better linearity compared to the conventional design.

Estimating the Production Potential of Wind Turbine Types Connected to the Network Using Random Number Simulation

Nowadays, energy generation by wind has become very important in power systems. The electrical energy produced by wind turbines at a site depends on several factors, such as the wind speed profile of the site and the turbine characteristics, in particular the cut-in wind speed, the rated wind speed and the cut-out wind speed. On the other hand, several different types of turbines are available on the market. Therefore, selecting a turbine whose capacity can meet the needs of electricity consumers with high efficiency is important and necessary. In this context, calculating the amount of wind power helps optimize the overall network and system operation, so determining the parameters of wind power is very important. In this article, the Monte Carlo method is used to calculate the output of a wind power plant connected to the national network in the Manjil wind region, to select the best type of turbine, and to determine a power delivery profile appropriate to the network. Wind speed data from the Manjil wind site, recorded at one-minute intervals over a year, are used. The necessary simulations, based on a random number simulation method with repetition, have been carried out using MATLAB and Excel.
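
The flavour of such a calculation can be illustrated with a short Python sketch that samples wind speeds from a Weibull model and pushes them through a generic turbine power curve; the turbine parameters and wind-speed distribution below are illustrative assumptions, not the measured Manjil data.

    import numpy as np

    # Hedged sketch: Monte Carlo estimate of the expected power output of a wind
    # turbine from its power curve. The Weibull wind-speed model and the turbine
    # parameters (cut-in, rated, cut-out speeds, rated power) are illustrative.
    rng = np.random.default_rng(3)
    v_ci, v_r, v_co, p_rated = 3.0, 12.0, 25.0, 2.0   # m/s, m/s, m/s, MW
    shape, scale = 2.0, 7.5                            # Weibull wind-speed model

    v = scale * rng.weibull(shape, size=100_000)       # sampled wind speeds
    power = np.where(
        (v >= v_ci) & (v < v_r), p_rated * ((v - v_ci) / (v_r - v_ci)) ** 3,
        np.where((v >= v_r) & (v <= v_co), p_rated, 0.0),
    )
    print("expected output (MW):", power.mean())
    print("capacity factor     :", power.mean() / p_rated)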

A Discretizing Method for Reliability Computation in Complex Stress-strength Models

This paper proposes, implements and evaluates an original discretization method for continuous random variables, in order to estimate the reliability of systems for which stress and strength are defined as complex functions and whose reliability cannot be derived through analytic techniques. The method is compared with two other discretizing approaches that have appeared in the literature, including a comparative study involving four engineering applications. The results show that the proposal is very efficient in terms of closeness of the estimates to the true (simulated) reliability. In the study we analyzed both a normal and a non-normal distribution for the random variables: the method is theoretically suitable for any parametric family.
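
A minimal sketch of the general idea, assuming normal stress and strength models and a simple equal-probability discretization (not the specific scheme proposed in the paper), is given below in Python.

    import numpy as np
    from scipy import stats

    # Hedged sketch: approximate the stress-strength reliability R = P(Y > X) by
    # discretizing both continuous variables into point masses on a grid.
    # The equal-probability discretization and the normal models are illustrative,
    # not the specific method proposed in the paper.
    def discretize(dist, k=20):
        # k points placed at the midpoints of k equal-probability slices
        probs = (np.arange(k) + 0.5) / k
        return dist.ppf(probs), np.full(k, 1.0 / k)

    stress = stats.norm(loc=50, scale=8)      # X
    strength = stats.norm(loc=70, scale=10)   # Y

    xs, px = discretize(stress)
    ys, py = discretize(strength)
    R_disc = sum(px[i] * py[j] for i in range(len(xs)) for j in range(len(ys))
                 if ys[j] > xs[i])

    # crude Monte Carlo reference ("true" simulated reliability)
    rng = np.random.default_rng(4)
    R_mc = np.mean(rng.normal(70, 10, 10**6) > rng.normal(50, 8, 10**6))
    print("discretized R:", R_disc, " simulated R:", round(R_mc, 4))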

Motion Parameter Estimation via Dopplerlet-Transform-Based Matched Field Processing

This work presents a matched field processing (MFP) algorithm based on Dopplerlet transform for estimating the motion parameters of a sound source moving along a straight line and with a constant speed by using a piecewise strategy, which can significantly reduce the computational burden. Monte Carlo simulation results and an experimental result are presented to verify the effectiveness of the algorithm advocated.

Probabilistic Method of Wind Generation Placement for Congestion Management

Wind farms (WFs) with high levels of penetration are being established in power systems worldwide more rapidly than other renewable resources. The Independent System Operator (ISO), as a policy maker, should propose appropriate places for WF installation in order to maximize the benefits for investors. There is also a possibility of congestion relief through the new installation of WFs, which should be taken into account by the ISO when proposing locations for WF installation. In this context, an efficient wind farm (WF) placement method is proposed in order to reduce the burden on congested lines. Since the wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo Simulation (MCS). In order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to the time-demanding MCS. Subsequently, the optimal WF placement is determined using generation shift distribution factors (GSDF), considering a new parameter entitled the wind availability factor (WAF). In order to obtain more realistic results, N-1 contingency analysis is employed to find the optimal size of the WF by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to demonstrate and compare the accuracy of the proposed methodology.
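
The role of the point estimate method as a cheap alternative to MCS can be illustrated with the following Python sketch of Hong's 2m-point scheme for symmetric inputs, applied to a toy function that merely stands in for the AC power-flow equations; all parameters are illustrative assumptions.

    import numpy as np

    # Hedged sketch of Hong's 2m-point estimate method (PEM) for propagating input
    # uncertainty through a function h, compared with Monte Carlo. The quadratic
    # toy function below merely stands in for the AC power-flow equations; the
    # zero-skewness (symmetric) form of the 2m scheme is used.
    def h(x):           # placeholder for a power-flow output, e.g. a line flow
        return 1.5 * x[0] + 0.8 * x[1] ** 2 + 0.3 * x[0] * x[2]

    mu = np.array([1.0, 0.5, 2.0])          # input means (e.g. wind speed, loads)
    sigma = np.array([0.2, 0.1, 0.3])       # input standard deviations
    m = len(mu)

    # 2m PEM, zero skewness: evaluate h at mu_k +/- sqrt(m)*sigma_k, weight 1/(2m)
    e1 = e2 = 0.0
    for k in range(m):
        for sign in (+1.0, -1.0):
            x = mu.copy()
            x[k] += sign * np.sqrt(m) * sigma[k]
            y = h(x)
            e1 += y / (2 * m)
            e2 += y ** 2 / (2 * m)
    print("PEM mean/std:", e1, np.sqrt(max(e2 - e1**2, 0.0)))

    # Monte Carlo reference with independent normal inputs
    rng = np.random.default_rng(5)
    samples = mu + sigma * rng.standard_normal((100_000, m))
    y = np.apply_along_axis(h, 1, samples)
    print("MC  mean/std:", y.mean(), y.std())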

A Monte Carlo Method for Data Stream Analysis

Data stream analysis is the process of computing various summaries and derived values from large amounts of data that are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and computationally approximated in its characteristics. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
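
The hybrid idea of sampling plus Monte Carlo approximation can be illustrated with the generic Python sketch below, which uses reservoir sampling for the row-wise reduction (a stand-in for, not a description of, the EMR sampling method) and a bootstrap-style Monte Carlo step to approximate a stream summary with an uncertainty interval.

    import random

    # Hedged sketch of one-pass data reduction plus Monte Carlo approximation on a
    # stream. Reservoir sampling keeps a fixed-size uniform row sample (a generic
    # stand-in, not the EMR sampling method proposed in the paper), and the
    # retained sample is then resampled to approximate a summary with an error bar.
    def reservoir(stream, k, rng):
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)
            else:
                j = rng.randint(0, i)          # replace with decreasing probability
                if j < k:
                    sample[j] = item
        return sample

    rng = random.Random(6)
    stream = (rng.gauss(10.0, 3.0) for _ in range(1_000_000))   # simulated stream
    sample = reservoir(stream, k=2000, rng=rng)

    # Monte Carlo (bootstrap) approximation of the stream mean from the sample
    means = sorted(sum(rng.choices(sample, k=len(sample))) / len(sample)
                   for _ in range(500))
    print("mean ~", sum(sample) / len(sample),
          "90% interval:", (means[25], means[474]))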

Networks with Unreliable Nodes and Edges: Monte Carlo Lifetime Estimation

Estimating the lifetime distribution of computer networks, in which nodes and links exist in time and are subject to failure, is very useful in various applications. This problem is known to be NP-hard. In this paper we present efficient combinatorial approaches to Monte Carlo estimation of the network lifetime distribution. We also present some simulation results.
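
A crude (non-combinatorial) Monte Carlo baseline for this problem can be sketched in Python as follows: nodes and edges of a small two-terminal network fail at exponential times, and each replication records the first failure instant that disconnects the terminals. The topology and failure rates are illustrative assumptions.

    import numpy as np

    # Hedged sketch: crude Monte Carlo estimation of the lifetime of a small
    # two-terminal network whose nodes and edges fail at random exponential times.
    # The topology, failure rates and terminal pair are illustrative assumptions.
    edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]      # a small bridge network
    node_rate, edge_rate, s, t, n_nodes = 0.05, 0.1, 0, 3, 4

    def connected(up_nodes, up_edges):
        # breadth-first search over surviving nodes and edges
        seen, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for (a, b), alive in zip(edges, up_edges):
                if alive and up_nodes[a] and up_nodes[b]:
                    if a == u and b not in seen:
                        seen.add(b); frontier.append(b)
                    elif b == u and a not in seen:
                        seen.add(a); frontier.append(a)
        return t in seen

    rng = np.random.default_rng(7)
    lifetimes = []
    for _ in range(5000):
        t_node = rng.exponential(1 / node_rate, n_nodes)
        t_edge = rng.exponential(1 / edge_rate, len(edges))
        # the network dies at the first failure instant that disconnects s from t
        events = sorted([(tt, 'n', i) for i, tt in enumerate(t_node) if i not in (s, t)]
                        + [(tt, 'e', i) for i, tt in enumerate(t_edge)])
        up_n, up_e, life = [True] * n_nodes, [True] * len(edges), events[-1][0]
        for time, kind, idx in events:
            (up_n if kind == 'n' else up_e)[idx] = False
            if not connected(up_n, up_e):
                life = time
                break
        lifetimes.append(life)
    print("mean lifetime:", np.mean(lifetimes))
    print("P(lifetime > 5):", np.mean(np.array(lifetimes) > 5))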

Investigations of the Water-Ethanol Mixture by the Monte Carlo Method

Energetic and structural results for ethanol-water mixtures as a function of the mole fraction were calculated using Monte Carlo methodology. Energy partitioning results obtained for the equimolar water-ethanol mixture and other organic liquids are compared. It has been shown that at xet=0.22 the RDFs for water-ethanol and ethanol-ethanol interactions indicate strong hydrophobic interactions between ethanol molecules, and the local structure of the solution is less structured at this concentration than at other ones. Results obtained for the ethanol-water mixture as a function of concentration are in good agreement with the experimental data.

Evaluation of Multilevel Modulation Formats for 100Gbps Transmission with Direct Detection

This paper evaluates multilevel modulation formats such as multilevel amplitude shift keying (M-ASK), M-ASK combined with differential phase shift keying (M-ASK-Bipolar), Quaternary Amplitude Shift Keying (QASK) and Quaternary Polarization-ASK (QPol-ASK) at a total bit rate of 107 Gbps. The aim is to find a cost-effective very high speed transport solution. The numerical investigation was performed using Monte Carlo simulations. The obtained results indicate that some modulation formats can be operated at 100Gbps in optical communication systems with low implementation effort and high spectral efficiency.
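
The kind of Monte Carlo evaluation involved can be illustrated with a baseband Python sketch that estimates the symbol error rate of a 4-level ASK format with threshold (direct) detection in Gaussian noise; it ignores the fibre and the 107 Gbps optical front end, and its SNR values are illustrative assumptions.

    import numpy as np

    # Hedged sketch: Monte Carlo symbol-error-rate estimate for 4-level ASK with
    # additive Gaussian noise and threshold (direct) detection. This baseband toy
    # model ignores fibre effects and the optical front end of the paper; the SNR
    # range and symbol count are illustrative.
    rng = np.random.default_rng(8)
    levels = np.array([0.0, 1.0, 2.0, 3.0])        # 4-ASK amplitude levels
    n_sym = 200_000
    for snr_db in (10, 14, 18):
        sigma = np.sqrt(np.mean(levels**2) / 10 ** (snr_db / 10))
        tx = rng.choice(levels, n_sym)
        rx = tx + sigma * rng.standard_normal(n_sym)
        # decide by nearest amplitude level (equivalent to mid-point thresholds)
        decided = levels[np.argmin(np.abs(rx[:, None] - levels[None, :]), axis=1)]
        print(f"SNR {snr_db} dB  SER ~ {np.mean(decided != tx):.4g}")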

Statistical Evaluation of Nonlinear Distortion using the Multi-Canonical Monte Carlo Method and the Split Step Fourier Method

In high-power dense wavelength division multiplexed (WDM) systems with low chromatic dispersion, four-wave mixing (FWM) can prove to be a major source of noise. The Multi-Canonical Monte Carlo (MCMC) method and the Split-Step Fourier Method (SSFM) are combined to accurately evaluate the probability density function of the decision variable of a receiver limited by FWM. The combination of the two methods leads to more accurate results and offers the possibility of adding other optical noises, such as Amplified Spontaneous Emission (ASE) noise.
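
A minimal sketch of the SSFM building block, assuming illustrative fibre parameters and omitting the multi-canonical Monte Carlo layer, is shown below in Python for the scalar nonlinear Schrodinger equation.

    import numpy as np

    # Hedged sketch: a minimal symmetric split-step Fourier method (SSFM) for the
    # scalar nonlinear Schrodinger equation dA/dz = -i*(beta2/2)*d^2A/dt^2
    # + i*gamma*|A|^2*A. Fibre parameters and the Gaussian input pulse are
    # illustrative; the multi-canonical Monte Carlo layer of the paper is omitted.
    nt, t_span = 2**12, 400e-12                     # samples, time window (s)
    t = np.linspace(-t_span / 2, t_span / 2, nt, endpoint=False)
    dt = t[1] - t[0]
    w = 2 * np.pi * np.fft.fftfreq(nt, dt)          # angular frequency grid

    beta2 = -21e-27        # s^2/m (anomalous dispersion)
    gamma = 1.3e-3         # 1/(W*m)
    length, nz = 50e3, 500
    dz = length / nz

    A = np.sqrt(1e-3) * np.exp(-(t / 20e-12) ** 2)  # ~1 mW Gaussian pulse
    half_disp = np.exp(1j * 0.5 * beta2 * w**2 * dz / 2)   # half-step dispersion
    for _ in range(nz):
        A = np.fft.ifft(half_disp * np.fft.fft(A))          # linear half step
        A *= np.exp(1j * gamma * np.abs(A) ** 2 * dz)       # nonlinear full step
        A = np.fft.ifft(half_disp * np.fft.fft(A))          # linear half step
    print("output peak power (W):", np.max(np.abs(A) ** 2))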

Effect of Size of the Step in the Response Surface Methodology using Nonlinear Test Functions

The response surface methodology (RSM) is a collection of mathematical and statistical techniques useful in the modeling and analysis of problems in which a dependent variable is influenced by several independent variables, with the aim of determining the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of the methodology. The numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the efficiency of the gain function, the distance to the optimum, and the number of iterations. The simulation experiments show that the efficiency of the gain function and the distance to the optimum are not affected by the step size, while the number of iterations is affected by both the step size and the type of test function used.
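
One steepest-ascent move of the RSM search, showing exactly where the step size enters, can be sketched in Python as follows; the test function, noise level, design half-width and step size are illustrative assumptions.

    import numpy as np

    # Hedged sketch: one steepest-ascent move of RSM. A first-order model is fitted
    # by least squares to noisy responses on a 2^2 factorial design around the
    # current point, and the search moves along the estimated gradient with a
    # chosen step size. The test function, noise level and step size are
    # illustrative assumptions.
    rng = np.random.default_rng(9)
    def response(x):                       # unknown process, observed with noise
        return -(x[0] - 3) ** 2 - 2 * (x[1] - 2) ** 2 + rng.normal(0, 0.2)

    def steepest_ascent_step(center, half_width, step_size):
        design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
        X = np.column_stack([np.ones(4), design])            # model: b0+b1*x1+b2*x2
        y = np.array([response(center + half_width * d) for d in design])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        direction = beta[1:] / np.linalg.norm(beta[1:])      # steepest ascent
        return center + step_size * direction

    x = np.array([0.0, 0.0])
    for it in range(8):
        x = steepest_ascent_step(x, half_width=0.5, step_size=0.8)
    print("final point:", x, "(true optimum of the toy function at [3, 2])")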

The Application of Real Options to Capital Budgeting

Real options theory suggests that managerial flexibility embedded within irreversible investments can account for a significant part of project value. Although the argument has become the dominant focus of capital investment theory over recent decades, recent survey literature in capital budgeting indicates that corporate practitioners still do not explicitly apply real options in investment decisions. In this paper, we explore how real options decision criteria can be transformed into equivalent capital budgeting criteria under uncertainty, assuming that the underlying stochastic process follows a geometric Brownian motion (GBM), a mixed diffusion-jump (MX), or a mean-reverting process (MR). These equivalent valuation techniques can be readily decomposed into conventional investment rules and "option impacts", the latter of which describe the effects on optimal investment rules when the option value is considered. Based on numerical analysis and Monte Carlo simulation, three major findings are derived. First, it is shown that real options can be successfully integrated into the mindset of conventional capital budgeting. Second, the inclusion of option impacts tends to delay investment; the delay effect is the most significant under a GBM process and the least significant under an MR process. Third, it is optimal to adopt the new capital budgeting criteria in investment decision-making, and adopting a suboptimal investment rule without considering real options could lead to a substantial loss in value.

Screened Potential in a Reverse Monte Carlo (RMC) Simulation

This is a structural study of an aqueous electrolyte for which experimental results are available: a solution of LiCl-6H2O type in the glassy state (120K), contrasted with pure water at room temperature, by means of partial distribution functions (PDFs) obtained from the neutron scattering technique. Based on these partial functions, the Reverse Monte Carlo (RMC) method computes radial and angular correlation functions which allow a number of structural features of the system to be explored. The obtained curves include some artifacts. To remedy this, we propose to introduce a screened potential as an additional constraint. The obtained results show good matching between the experimental and computed functions and a significant improvement in the PDF curves with the potential constraint, suggesting an efficient fit of the pair distribution function curves.

Diagnosing the Cause and its Timing of Changes in Multivariate Process Mean Vector from Quality Control Charts using Artificial Neural Network

Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the condition, as the change point in the process parameter(s) is usually different from the signal time. It is very important to the manufacturer to determine at what point and which parameters in the past caused the signal. Early warning of process change would expedite the search for the special causes and enhance quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known means and variance-covariance matrix, and the process means after a one-step change are assumed to remain at the new level until the special cause is identified and removed; it is also assumed that only one variable can change at a time. This research applies an artificial neural network (ANN) to identify the time the change occurred and the parameter which caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively and equally well over the whole range of shift magnitudes considered.

Development of a RAM Simulation Model for Acid Gas Removal System

A reliability, availability and maintainability (RAM) model has been built for an acid gas removal plant for system analysis that will play an important role in any process modifications required to achieve optimum performance. Due to the complexity of the plant, the model was based on a Reliability Block Diagram (RBD) with a Monte Carlo simulation engine. The model has been validated against actual plant data as well as local expert opinions, resulting in an acceptable simulation model. The results from the model show that the operation and maintenance can be further improved, resulting in a reduction of the annual production loss.
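
A minimal sketch of an RBD-plus-Monte-Carlo availability model, using an illustrative series-parallel structure and made-up failure and repair rates rather than the actual plant configuration, is given below in Python.

    import numpy as np

    # Hedged sketch: Monte Carlo availability estimate for a small reliability
    # block diagram with exponential failure and repair times. The structure
    # (two parallel absorbers in series with a pump) and the rates are illustrative
    # placeholders, not the actual acid gas removal plant configuration.
    rng = np.random.default_rng(10)
    blocks = {                  # name: (failure rate per hr, repair rate per hr)
        "pump":      (1 / 2000.0, 1 / 24.0),
        "absorberA": (1 / 5000.0, 1 / 48.0),
        "absorberB": (1 / 5000.0, 1 / 48.0),
    }
    horizon, dt, runs = 8760.0, 1.0, 100        # one year, 1-hour steps

    def system_up(state):
        # series of the pump with a 1-out-of-2 parallel pair of absorbers
        return state["pump"] and (state["absorberA"] or state["absorberB"])

    downtime = 0.0
    for _ in range(runs):
        state = {name: True for name in blocks}
        for _ in range(int(horizon / dt)):
            for name, (lam, mu) in blocks.items():
                if state[name]:
                    if rng.random() < 1 - np.exp(-lam * dt):    # failure this hour
                        state[name] = False
                elif rng.random() < 1 - np.exp(-mu * dt):       # repair completed
                    state[name] = True
            if not system_up(state):
                downtime += dt
    print("estimated availability:", 1 - downtime / (runs * horizon))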