The Impact of Upgrades on ERP System Reliability

Constant upgrading of Enterprise Resource Planning (ERP) systems is necessary but can introduce new defects. This paper attempts to model the likelihood of defects after completed upgrades with the Weibull probability density function (PDF). A case study is presented analyzing recorded defect data for one ERP subsystem. Trends in the parameters of the fitted Weibull distribution are observed over a given one-year period. As a result, the ability to predict the appearance of defects after the next upgrade is described.
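
A minimal sketch of such a fit, assuming SciPy's two-parameter Weibull with the location fixed at zero and using hypothetical days-to-defect data (not the case-study records):

```python
import numpy as np
from scipy import stats

# Hypothetical days-to-defect observed after a completed upgrade.
days_to_defect = np.array([3, 5, 8, 12, 14, 21, 25, 33, 40, 55], dtype=float)

# Fit a two-parameter Weibull (location fixed at 0) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(days_to_defect, floc=0)
print(f"shape k = {shape:.2f}, scale lambda = {scale:.1f} days")

# Probability that a new defect appears within 30 days of the next upgrade,
# assuming the fitted distribution carries over to that upgrade.
p30 = stats.weibull_min.cdf(30, shape, loc=0, scale=scale)
print(f"P(defect within 30 days) = {p30:.2f}")
```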

Network Analysis in a Natural Perturbed Ecosystem

The objective of this work is to make explicit the knowledge of the interactions between chlorophyll-a and nine meroplanktonic larvae of epibenthic fauna. The case studied is the Arraial do Cabo upwelling system, southeastern Brazil, which provides different environmental conditions. To assess this information, a network approach based on probability estimation was used. Comparisons among the generated graphs are made in light of the different water masses, the Shannon biodiversity index, and the closeness and betweenness centrality measures. Our results show the main pattern among the different water masses and how the core organisms belonging to the network skeleton are correlated with the main environmental variable. We conclude that the complex network approach is a promising tool for environmental diagnostics.
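
A brief sketch of the centrality and diversity measures named above, computed with NetworkX over a hypothetical interaction graph (node names and abundances are illustrative, not the study's data):

```python
import numpy as np
import networkx as nx

# Hypothetical interaction graph: chlorophyll-a plus a few larval taxa.
G = nx.Graph()
G.add_edges_from([
    ("chl-a", "cirripedia"), ("chl-a", "bivalvia"), ("chl-a", "gastropoda"),
    ("cirripedia", "bivalvia"), ("gastropoda", "polychaeta"),
])

print("closeness:", nx.closeness_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))

# Shannon diversity index H' from hypothetical relative abundances.
p = np.array([0.40, 0.25, 0.20, 0.10, 0.05])
H = -np.sum(p * np.log(p))
print(f"Shannon H' = {H:.2f}")
```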

Application of Pearson Parametric Distribution Model in Fatigue Life Reliability Evaluation

The aim of this paper is to introduce a parametric distribution model into fatigue life reliability analysis that deals with variation in material properties. Service loads, in the form of a response-time history signal of Belgian pave, were replicated on a multi-axial spindle-coupled road simulator, and the stress-life method was used to estimate the fatigue life of an automotive stub axle. A PSN curve was obtained by monotonic tension testing, and a two-parameter Weibull distribution function was used to acquire the mean life of the component. A Pearson system was developed to evaluate the fatigue life reliability by considering the stress range intercept and the slope of the PSN curve as random variables. Assuming a normal distribution of fatigue strength, the fatigue life of the stub axle is found to have the highest reliability between 10,000 and 15,000 cycles. Taking into account the variation of material properties associated with size effects and machining and manufacturing conditions, the method described in this study can be effectively applied to determine the probability of failure of mass-produced parts.
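
The reliability-versus-life idea can be sketched with a small Monte Carlo over a Basquin-type S-N relation, treating the stress-range intercept and slope as random variables; all numbers below are hypothetical and are not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sim = 100_000

# Basquin-type relation S = A * N**b, so N = (S / A)**(1 / b).
# Intercept A and slope b treated as normally distributed (hypothetical values).
A = rng.normal(1800.0, 90.0, n_sim)     # stress-range intercept, MPa
b = rng.normal(-0.12, 0.006, n_sim)     # slope of the PSN curve
S = 580.0                               # applied stress range, MPa (hypothetical)

N = (S / A) ** (1.0 / b)                # fatigue life in cycles for each draw

# Reliability of surviving a target life, and probability of failing in a band.
for target in (10_000, 15_000):
    print(f"R(N > {target}) = {np.mean(N > target):.3f}")
print(f"P(10k < N < 15k) = {np.mean((N > 10_000) & (N < 15_000)):.3f}")
```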

Probabilistic Model Development for Project Performance Forecasting

In this paper, a model for forecasting project cost performance is developed based on past project cost and time performance. This study presents a probabilistic project control concept to ensure an acceptable forecast of project cost performance. In this concept, project activities are classified into sub-groups called control accounts. A Stochastic S-Curve (SS-Curve) is then obtained for each sub-group, and the project SS-Curve is obtained by summing the sub-groups' SS-Curves. In this model, project cost uncertainties are represented through Beta distribution functions of the activity costs required to complete the project at every selected time section during project execution, extracted from a variety of sources. Based on this model, after a certain percentage of project progress, project performance is measured via Earned Value Management to adjust the initial cost probability distribution functions. The future project cost performance is then predicted using the Monte Carlo simulation method.
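
A compact sketch of how an SS-Curve band can be produced by Monte Carlo sampling of Beta-distributed activity costs; the control accounts, Beta parameters, and the linear spend profile below are hypothetical simplifications:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 20_000

# Hypothetical control accounts: (min cost, max cost, Beta alpha, Beta beta).
accounts = [(100, 160, 2, 3), (200, 320, 3, 2), (50, 90, 2, 2)]

# Sample total cost per account: scaled Beta between min and max.
samples = np.column_stack([
    lo + (hi - lo) * rng.beta(a, b, n_sim) for lo, hi, a, b in accounts
])
total = samples.sum(axis=1)

# Cumulative cost percentiles at selected progress points
# (simplistic linear spend assumption, for illustration only).
for p in (0.25, 0.50, 0.75, 1.00):
    lo, med, hi = np.percentile(total * p, [10, 50, 90])
    print(f"progress {p:4.0%}: P10={lo:6.1f}  P50={med:6.1f}  P90={hi:6.1f}")
```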

An Adaptive Model for Blind Image Restoration using Bayesian Approach

Image restoration involves the elimination of noise; filtering techniques have been used to restore images for the last five decades. In this paper, we consider the problem of restoring images degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based image processing algorithms and techniques are described and substantiated with experiments using MATLAB.
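
As a hedged illustration of noise reduction driven by local image statistics (a Lee-type adaptive filter, not necessarily the authors' exact algorithm), each pixel is blended with its local mean according to the ratio of local signal variance to total variance:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_stats_denoise(img, window=5, noise_var=None):
    """Adaptive (Lee-type) filter: blend pixel and local mean using local variance."""
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img * img, window)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    if noise_var is None:                      # crude noise estimate if not given
        noise_var = np.median(var)
    gain = np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12)
    return mean + gain * (img - mean)

rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + rng.normal(0, 0.2, clean.shape)
denoised = local_stats_denoise(noisy, window=5, noise_var=0.2 ** 2)
print("noise power before:", np.mean((noisy - clean) ** 2).round(4),
      "after:", np.mean((denoised - clean) ** 2).round(4))
```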

A Simulation Model for Bid Price Decision Making

In Lebanon, public construction projects are awarded to the contractor submitting the lowest bid price in a competitive bidding process. The contractor has to make a strategic decision in choosing an appropriate bid price that offers a satisfactory profit together with a high probability of winning. A simulation model for bid price decision making based on lowest-bid-price evaluation is developed. The model, built using the Crystal Ball decision-engineering software, considers two main factors affecting the bidding process: the number of qualified bidders and the size of the project. The validity of the model is tested on twelve separate projects. The study also shows how to use the model to conduct risk analysis and help a specific contractor decide on a bid price with an associated certainty level in a scientific manner.
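
The core simulation idea, estimating the probability of submitting the lowest bid against a number of qualified competitors, can be sketched with plain Monte Carlo instead of Crystal Ball; the competitor bid distribution and markups below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sim = 50_000
estimated_cost = 1_000_000        # contractor's estimated project cost (hypothetical)
n_bidders = 5                     # number of qualified competitors

for markup in (0.05, 0.10, 0.15):
    my_bid = estimated_cost * (1 + markup)
    # Competitors' bids as a multiple of the estimated cost (hypothetical lognormal spread).
    others = estimated_cost * rng.lognormal(mean=np.log(1.10), sigma=0.08,
                                            size=(n_sim, n_bidders))
    p_win = np.mean(my_bid < others.min(axis=1))
    expected_profit = p_win * (my_bid - estimated_cost)
    print(f"markup {markup:4.0%}: P(win) = {p_win:.2f}, "
          f"expected profit = {expected_profit:,.0f}")
```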

Advanced Stochastic Models for Partially Developed Speckle

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface, or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell increases, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the Central Limit Theorem.
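
The limiting behaviour noted above, intensity tending to an exponential law as the mean number of scatterers per resolution cell grows, can be checked with a short simulation (unit-amplitude scatterers and the cell counts below are simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 20_000

for mean_scatterers in (2, 10, 100):
    counts = rng.poisson(mean_scatterers, n_cells)   # scatterers per resolution cell
    intensity = np.empty(n_cells)
    for i, k in enumerate(counts):
        # Sum of unit-amplitude phasors with uniform random phases in each cell.
        phases = rng.uniform(0, 2 * np.pi, k)
        intensity[i] = np.abs(np.sum(np.exp(1j * phases))) ** 2
    m, s = intensity.mean(), intensity.std()
    # For an exponential distribution the standard deviation equals the mean.
    print(f"mean scatterers {mean_scatterers:3d}: std/mean = {s / m:.2f} "
          "(tends to 1 for fully developed speckle)")
```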

Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

The objective of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e., the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure per hour (PFH), taking into account the uncertainties in the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC), and so on. This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent in the safety function performed by such systems. This aim is in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. The first method is appropriate where representative statistical data are available (using the PDFs of the relevant parameters), while the latter applies to cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
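
A minimal sketch of the Monte Carlo half of such an assessment, using the textbook 1oo1 low-demand approximation PFDavg ≈ λDU·TI/2 with uncertain failure rate and diagnostic coverage; the parameter distributions are hypothetical and the formula is a simplification, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(5)
n_sim = 100_000

# Uncertain parameters (hypothetical distributions).
lam = rng.lognormal(mean=np.log(2e-6), sigma=0.4, size=n_sim)  # failure rate, per hour
dc = rng.uniform(0.55, 0.75, n_sim)                            # diagnostic coverage
proof_test_interval = 8760.0                                   # hours (one year)

lam_du = lam * (1.0 - dc)                     # dangerous undetected failure rate
pfd_avg = lam_du * proof_test_interval / 2.0  # simplified 1oo1 low-demand approximation

p5, p50, p95 = np.percentile(pfd_avg, [5, 50, 95])
print(f"PFDavg: median = {p50:.2e}, 90% interval = [{p5:.2e}, {p95:.2e}]")
# IEC 61508 low-demand bands: SIL 2 corresponds to 1e-3 <= PFDavg < 1e-2.
print("P(PFDavg in SIL 2 band) =", np.mean((pfd_avg >= 1e-3) & (pfd_avg < 1e-2)))
```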

Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction

Wind is among the potential energy resources that can be harnessed to generate electrical power. Due to the variability of wind speed with time and height, it is difficult to predict the generated wind energy accurately. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speed and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), respectively, both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power law equation with an exponent of 0.47. Data were analysed using the MINITAB statistical software to show the variability of wind speed with time and height and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
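
The two computational steps named above, extrapolating the 2 m measurements to 10 m with the power law and fitting a Weibull distribution, can be sketched as follows; the wind speed values are illustrative, not the Makambako record:

```python
import numpy as np
from scipy import stats

# Illustrative 2 m wind speed sample in m/s (not the Makambako data).
v_2m = np.array([1.8, 2.4, 3.1, 2.0, 2.7, 3.5, 1.5, 2.9, 3.8, 2.2])

# Power-law extrapolation v(h2) = v(h1) * (h2 / h1)**alpha, with alpha = 0.47.
alpha = 0.47
v_10m = v_2m * (10.0 / 2.0) ** alpha

# Fit a two-parameter Weibull (location fixed at 0) to the extrapolated speeds.
k, loc, c = stats.weibull_min.fit(v_10m, floc=0)
print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")

# Mean wind power density (W/m^2) from the fitted distribution,
# assuming standard air density rho = 1.225 kg/m^3: P = 0.5 * rho * E[v^3].
rho = 1.225
mean_v3 = stats.weibull_min.moment(3, k, loc=0, scale=c)
print(f"mean power density = {0.5 * rho * mean_v3:.1f} W/m^2")
```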

Bayesian Decision Approach to Protection on the Flood Event in Upper Ayeyarwady River, Myanmar

This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method. The main goal of Bayesian decision theory is to minimize the expected loss of a decision, or equivalently to minimize the expected risk. The purposes of this study are to review the decision process for flood occurrences and to suggest possible ways to improve the decision. This study examines the problem structure of flood occurrences and theoretically explicates the decision-analytic approach based on Bayesian decision theory and its application to flood occurrences in environmental engineering. We analyse flood occurrences in terms of the annual maximum water level (in cm), using the 43-year record available from 1965 to 2007 at the gauging station of Sagaing on the Ayeyarwady River, with a drainage area of 120,193 sq km, by means of the Bayesian decision method. As a result, we discuss the loss and risk of whether vast areas of agricultural land will be inundated in the coming year, based on the two standard maximum water levels observed during the 43 years, and we also forecast whether these lands will be safe from flood water during the next 10 years.
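
A hedged sketch of the expected-loss calculation behind such a decision: given a probability that next year's maximum water level exceeds a danger level, choose the action with the smaller expected loss (the probability and loss values below are hypothetical):

```python
# Minimal Bayesian decision sketch (hypothetical numbers, not the Sagaing record).
p_flood = 0.23   # probability that the danger water level is exceeded next year

# Loss table: loss[action][state], in arbitrary monetary units.
loss = {
    "protect":    {"flood": 20.0,  "no_flood": 20.0},   # cost of protection works
    "do_nothing": {"flood": 100.0, "no_flood": 0.0},    # damage to agricultural land
}

expected = {a: p_flood * l["flood"] + (1 - p_flood) * l["no_flood"]
            for a, l in loss.items()}
best = min(expected, key=expected.get)
print(expected)                 # {'protect': 20.0, 'do_nothing': 23.0}
print("Bayes decision:", best)  # action with the minimum expected loss
```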

Some Investigations on Higher Mathematics Scores for Chinese University Students

To investigate the relations between higher mathematics scores in the Chinese graduate student entrance examination and calculus (respectively, linear algebra and probability statistics) scores in the subject-completion examinations of a Chinese university, we select 20 students as a sample, take the higher mathematics score as the decision attribute, and take the calculus, linear algebra, and probability statistics scores as condition attributes. In this paper, we use rough-set theory (a logic-mathematical method proposed by Z. Pawlak which, in recent years, has been widely applied in many fields of natural and social science) to investigate the importance of the condition attributes with respect to the decision attribute and the strength with which the condition attributes support the decision attribute. The results of this investigation will help university students raise their higher mathematics scores in the Chinese graduate student entrance examination.
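
A small sketch of two standard rough-set quantities, the dependency degree γ(C, D) and the positive-region-based significance of each condition attribute, over a hypothetical discretized score table (not the 20-student data, and not necessarily the exact measures used in the paper):

```python
# Each row: (calculus, linear algebra, probability) -> higher mathematics grade.
table = [
    (("high", "high", "mid"), "high"),
    (("high", "mid",  "mid"), "high"),
    (("mid",  "mid",  "low"), "mid"),
    (("mid",  "low",  "low"), "mid"),
    (("low",  "low",  "low"), "low"),
    (("low",  "low",  "mid"), "mid"),
]

def dependency(attr_idx):
    """gamma(B, D): fraction of objects whose B-class determines the decision."""
    classes = {}
    for cond, dec in table:
        classes.setdefault(tuple(cond[i] for i in attr_idx), set()).add(dec)
    consistent = sum(1 for cond, dec in table
                     if len(classes[tuple(cond[i] for i in attr_idx)]) == 1)
    return consistent / len(table)

full = dependency((0, 1, 2))
print("gamma(C, D) =", round(full, 2))
for i, name in enumerate(("calculus", "linear algebra", "probability")):
    reduced = dependency(tuple(j for j in range(3) if j != i))
    print(f"significance of {name}: {full - reduced:.2f}")
```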

Performance Analysis of an Adaptive Threshold Hybrid Double-Dwell System with Antenna Diversity for Acquisition in DS-CDMA Systems

In this paper, we analyze the acquisition process for a hybrid double-dwell system with antenna diversity for DS-CDMA (direct-sequence code division multiple access) using an adaptive threshold. Acquisition systems with a fixed threshold value are unable to adapt to fast-varying mobile communications environments and may result in a high false alarm rate and/or low detection probability. Therefore, we propose an adaptively varying threshold scheme based on the cell-averaging constant false alarm rate (CA-CFAR) algorithm, which is well known in the field of radar detection. We derive exact expressions for the probabilities of detection and false alarm in Rayleigh fading channels. The mean acquisition time of the system under consideration is also derived. The performance of the system is analyzed and compared to that of a hybrid single-dwell system.
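
The CA-CFAR idea carried over from radar detection, setting the threshold as a scaled average of reference cells around the cell under test, can be sketched briefly; the window sizes, scaling constant, and injected peak below are hypothetical:

```python
import numpy as np

def ca_cfar(x, n_ref=8, n_guard=2, scale=8.0):
    """Cell-averaging CFAR: threshold = scale * mean of the reference cells
    on both sides of the cell under test, excluding guard cells."""
    detections = np.zeros_like(x, dtype=bool)
    half = n_ref // 2
    for i in range(half + n_guard, len(x) - half - n_guard):
        left = x[i - n_guard - half : i - n_guard]
        right = x[i + n_guard + 1 : i + n_guard + 1 + half]
        threshold = scale * np.mean(np.concatenate([left, right]))
        detections[i] = x[i] > threshold
    return detections

rng = np.random.default_rng(2)
samples = rng.exponential(1.0, 200)   # square-law detected noise samples
samples[120] += 40.0                  # injected correlation peak (hypothetical)
print("detections at indices:", np.flatnonzero(ca_cfar(samples)))
```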

Noise-Improved Signal Detection in Nonlinear Threshold Systems

We discuss signal detection with nonlinear threshold systems. The detection performance is assessed by the probability of error Per. We establish that: (1) when the signal is completely suprathreshold, noise always degrades signal detection, both in a single threshold system and in a parallel array of threshold devices; (2) when the signal is slightly subthreshold, noise degrades signal detection in the single threshold system, but in the parallel array noise can improve signal detection, i.e., stochastic resonance (SR) exists in the array; (3) when the signal is predominantly subthreshold, noise can always improve signal detection, and SR exists not only in the single threshold system but also in the parallel array; (4) the array can improve signal detection by increasing the number of threshold devices. These results further extend the applicability of SR in signal detection.
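
The subthreshold case can be illustrated by a small Monte Carlo over an array of identical threshold detectors: for a weak signal, the array's error probability first falls as the noise level rises, which is the stochastic resonance effect described above (the signal level, threshold, array size, and count-based fusion rule are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 100_000
theta = 1.0          # threshold of each device
s = 0.4              # subthreshold signal amplitude, present with probability 1/2
n_devices = 15       # size of the parallel array

for sigma in (0.05, 0.2, 0.4, 0.8, 1.6):
    present = rng.integers(0, 2, n_trials).astype(bool)
    x = np.where(present, s, 0.0)[:, None] + rng.normal(0, sigma, (n_trials, n_devices))
    votes = np.sum(x > theta, axis=1)        # number of devices that fire
    # Decide "signal present" when at least k devices fire; take the best k
    # as a simple stand-in for the optimal fusion rule.
    p_err = min(np.mean((votes >= k) != present) for k in range(n_devices + 1))
    print(f"sigma = {sigma:4.2f}: Per = {p_err:.3f}")
```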

Estimating Regression Parameters in Linear Regression Model with a Censored Response Variable

In this work we study the effect of several covariates X on a censored response variable T with unknown probability distribution. In this context, most studies in the literature fall into two general classes of regression models: models that study the effect the covariates have on the hazard function, and models that study the effect the covariates have on the censored response variable. The proposals in this paper belong to the second class of models and, more specifically, to the least-squares-based approach. Using the bootstrap estimate of the bias, we try to improve the estimation of the regression parameters by reducing their bias for small sample sizes. Simulation results presented in the paper show that, for reasonable sample sizes and censoring levels, the bias is always smaller for the new proposals.
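
A brief sketch of the bootstrap bias correction applied to a regression coefficient; ordinary uncensored least squares is used here only to show the mechanics, whereas the paper's setting adds censoring of the response:

```python
import numpy as np

rng = np.random.default_rng(6)
n, n_boot = 30, 2000

# Hypothetical small-sample data: T = 1.0 + 2.0 * X + noise.
X = rng.uniform(0, 1, n)
T = 1.0 + 2.0 * X + rng.normal(0, 0.5, n)

def ols_slope(x, y):
    return np.polyfit(x, y, 1)[0]

beta_hat = ols_slope(X, T)

# Bootstrap estimate of the bias of beta_hat: resample pairs with replacement.
boot = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, n, n)
    boot[b] = ols_slope(X[idx], T[idx])
bias_est = boot.mean() - beta_hat

beta_corrected = beta_hat - bias_est      # bias-corrected estimator
print(f"beta_hat = {beta_hat:.3f}, bootstrap bias = {bias_est:.4f}, "
      f"corrected = {beta_corrected:.3f}")
```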

Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis

This study uses simulated meta-analyses to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim and fill method in adjusting for these biases. The estimated effect sizes and standard errors were evaluated in terms of statistical bias and coverage probability. The results demonstrate that if publication bias is not adjusted for, it can lead to up to 40% bias in the treatment effect estimates. Use of the trim and fill method can reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effect in the presence of low or severe publication bias. Additionally, the trim and fill method improves the coverage probability by more than half when subjected to the same level of publication bias as the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates by this adjustment is minimal.
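
A short simulation of the first point, how suppressing non-significant studies inflates a pooled fixed-effect estimate; this is an illustrative design, not the paper's simulation protocol or the trim and fill adjustment itself:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
true_effect, n_studies, n_meta = 0.3, 30, 2000

biased_means, full_means = [], []
for _ in range(n_meta):
    se = rng.uniform(0.05, 0.4, n_studies)            # study standard errors
    est = rng.normal(true_effect, se)                  # observed study effects
    w = 1.0 / se ** 2                                  # inverse-variance weights
    full_means.append(np.sum(w * est) / np.sum(w))     # pooled estimate, all studies
    sig = est / se > norm.ppf(0.95)                    # one-sided "significant" studies
    published = sig | (rng.random(n_studies) < 0.3)    # non-significant published w.p. 0.3
    wp, ep = w[published], est[published]
    biased_means.append(np.sum(wp * ep) / np.sum(wp))  # pooled estimate, published only

print(f"true effect = {true_effect}")
print(f"all studies:        mean estimate = {np.mean(full_means):.3f}")
print(f"publication-biased: mean estimate = {np.mean(biased_means):.3f}")
```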

A Study on Mechanical Properties of Fiberboard Made of Durian Rind through Latex with Phenolic Resin as Binding Agent

This study aimed to investigate the feasibility of producing fiberboard from durian rind using latex with phenolic resin as the binding agent. The durian rind underwent a boiling process with NaOH [7], [8], and the fiber obtained from the rind was then formed into fiberboard with a heat press. This means that durian rind could replace plywood in the plywood industry by using durian fiber as a composite material with an adhesive. First, the durian rind was split, exposed to light, boiled, and steamed in order to obtain durian fiber. Fiberboards were then produced and tested at densities of 600 kg/m3 and 800 kg/m3 in order to find a suitable ratio of durian fiber to latex. Afterwards, the mechanical properties were tested according to the ASTM and JIS A5905-1994 standards. Once the suitable ratio was known, the test results were compared with medium density fiberboard (MDF) and other related studies. According to the results, for fiberboard made of durian rind with latex and phenolic resin at a density of 800 kg/m3 and a ratio of 1:1, the moisture content was measured to be 5.05%, with a specific gravity (ASTM D 2395-07a) of 0.81, a density (JIS A 5905-1994) of 0.88 g/cm3, and tensile strength, hardness (ASTM D2240), and flexibility (elongation at break) similar to those of medium density fiberboard (MDF).

Confidence Intervals for the Normal Mean with Known Coefficient of Variation

In this paper we propose two new confidence intervals for the normal population mean with known coefficient of variation. This situation commonly occurs in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. The two new confidence intervals are based on the recent work of Searls [5] and on a new method proposed here for the first time. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. Monte Carlo simulation is used to assess the performance of these intervals based on their expected lengths.
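
How coverage probability and expected length are assessed by simulation can be sketched as follows; the standard t-interval is used as a placeholder since the paper's two new intervals are not reproduced here, and all parameter values are hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
mu, cv, n, n_sim = 10.0, 0.3, 15, 20_000   # known coefficient of variation cv = sigma/mu
sigma = cv * mu
alpha = 0.05

covered, lengths = 0, []
for _ in range(n_sim):
    x = rng.normal(mu, sigma, n)
    xbar, s = x.mean(), x.std(ddof=1)
    half = stats.t.ppf(1 - alpha / 2, n - 1) * s / np.sqrt(n)
    lo, hi = xbar - half, xbar + half
    covered += (lo <= mu <= hi)
    lengths.append(hi - lo)

print(f"coverage = {covered / n_sim:.3f} (nominal {1 - alpha})")
print(f"expected length = {np.mean(lengths):.3f}")
```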

Modeling Peer-to-Peer Networks with Interest-Based Clusters

In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on top of Gnutella that transforms the connections of the nodes based on semantic information to make information retrieval more efficient. However, this transformation causes strong clustering in the network, which decreases the number of nodes reached and therefore also the probability of finding a document. In this paper we describe a mathematical model for the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposal to modify the SemPeer protocol to achieve moderate clustering. This modification is a form of link management for the individual nodes that allows the SemPeer protocol to be more efficient, because the probability of a successful query in the P2P network is appreciably increased. For the validation of the models, we ran a series of simulations that supported our results.

Real-Time Testing of Steel Strip Welds based on Bayesian Decision Theory

One of the main problems in a steel strip manufacturing line is the breakage of a weld made between steel coils, which are joined to produce the continuous strip to be processed. A weld breakage stops the manufacturing line for several hours, during which the damage caused by the breakage must be repaired; after the repair, the line must be restarted before production can continue. To minimize this problem, a human operator must inspect each weld visually and manually in order to avoid its breakage during the manufacturing process. The work presented in this paper is based on Bayesian decision theory and presents an approach to detect defective steel strip welds in real time. The approach is based on quantifying the trade-offs between various classification decisions using probability and the costs that accompany such decisions.
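
The cost-sensitive Bayes rule described above can be sketched in a few lines: classify a weld as defective whenever the expected cost of accepting it exceeds the expected cost of rejecting it; the priors, likelihood models, and costs below are hypothetical:

```python
from scipy.stats import norm

# Hypothetical setup: a scalar feature x extracted from the weld inspection signal.
prior_def = 0.05                        # prior probability of a defective weld
like_ok = norm(loc=0.0, scale=1.0)      # feature model for sound welds
like_def = norm(loc=2.5, scale=1.0)     # feature model for defective welds

# Costs cost[action][state]; missing a defective weld (line breakage) is expensive.
cost = {"accept": {"ok": 0.0, "def": 100.0},
        "reject": {"ok": 5.0, "def": 0.0}}

def decide(x):
    post_def = (prior_def * like_def.pdf(x)) / (
        prior_def * like_def.pdf(x) + (1 - prior_def) * like_ok.pdf(x))
    risk_accept = cost["accept"]["def"] * post_def
    risk_reject = cost["reject"]["ok"] * (1 - post_def)
    return ("reject" if risk_reject < risk_accept else "accept"), post_def

for x in (0.5, 1.5, 3.0):
    action, p = decide(x)
    print(f"x = {x:.1f}: P(defective|x) = {p:.3f} -> {action}")
```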

Performance Evaluation of a Limited Round-Robin System

The performance of a limited Round-Robin (RR) rule is studied in order to clarify the characteristics of a realistic processor-sharing model. Under the limited RR rule, the processor allocates to each request a fixed amount of time, called a quantum, in a fixed order. The number of requests being allocated these quanta is kept below a fixed value. Arriving requests that cannot be allocated quanta because of this restriction are queued or rejected. Practical performance measures, such as the relationships between the quantum size and the mean sojourn time, the mean number of requests, or the loss probability, are evaluated via simulation. In the evaluation, the requested service time of an arriving request is converted into a number of quanta. One of these quanta is included in each RR cycle, which is a series of quanta allocated to the requests in a fixed order. The service time of an arriving request can then be evaluated from the number of RR cycles required to complete the service, the number of requests receiving service, and the quantum size. The number of quanta still needed before a request's service is completed is re-evaluated at the arrival or departure of other requests. Tracking these events and calculations enables us to analyze the performance of the limited RR rule. In particular, we obtain the most suitable quantum size, which minimizes the mean sojourn time, for the case in which the switching time for each quantum is taken into account.