On the Analysis of IP Traffic Distribution in the Network of Suranaree University of Technology

This paper presents an analysis of IP traffic collected from the network of Suranaree University of Technology using software based on the Simple Network Management Protocol (SNMP). In particular, we analyze the distribution of the aggregated traffic during the hours of peak load and light load. Traffic profiles, including the parameters describing the traffic distributions, were derived. From a statistical analysis applying three different methods, namely the Kolmogorov-Smirnov test, the Anderson-Darling test, and the Chi-squared test, we found that the IP traffic distribution is non-normal and that the distributions during peak load and light load differ. The experimental study and analysis show that the IP traffic exhibits high uncertainty.
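As a rough illustration of the statistical step (not the authors' SNMP-based tool), the sketch below applies the three named goodness-of-fit tests to synthetic traffic samples using SciPy; the `peak` and `light` arrays are hypothetical stand-ins for the collected utilization data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
peak = rng.lognormal(3.0, 0.8, 500)    # stand-in for peak-load samples
light = rng.lognormal(1.5, 0.6, 500)   # stand-in for light-load samples

for name, x in (("peak", peak), ("light", light)):
    mu, sd = x.mean(), x.std(ddof=1)
    ks = stats.kstest((x - mu) / sd, "norm")        # Kolmogorov-Smirnov
    ad = stats.anderson(x, dist="norm")             # Anderson-Darling
    edges = np.quantile(x, np.linspace(0, 1, 11))   # 10 equiprobable bins
    obs, _ = np.histogram(x, bins=edges)
    exp = len(x) * np.diff(stats.norm.cdf(edges, mu, sd))
    chi = stats.chisquare(obs, exp * obs.sum() / exp.sum(), ddof=2)
    print(name, ks.pvalue, ad.statistic, chi.pvalue)

# Two-sample KS test: do peak-load and light-load traffic differ?
print(stats.ks_2samp(peak, light))
```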

Design and Implementation of Real-Time Automatic Censoring System on Chip for Radar Detection

The design and implementation of a novel B-ACOSD CFAR algorithm, proposed for detecting radar targets in a log-normally distributed environment, is presented in this paper. The B-ACOSD detector is capable of automatically detecting the number of interfering targets in the reference cells and of detecting the real target by means of an adaptive threshold. The detector is implemented as a System on Chip on an Altera Stratix II FPGA using parallelism and pipelining techniques. For a reference window of 16 cells, the experimental results showed that the processor works properly at a processing speed of up to 115.13 MHz with a processing time of 0.29 µs, and thus meets the real-time requirements of a typical radar system.
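For readers unfamiliar with CFAR, the sketch below shows a plain cell-averaging CFAR with a 16-cell reference window in Python. It is illustration only: the paper's B-ACOSD detector additionally censors interfering targets automatically before forming the adaptive threshold, and the actual design runs in FPGA hardware.

```python
import numpy as np

def ca_cfar(power, num_ref=16, num_guard=2, scale=4.0):
    """Boolean detections for a 1-D power profile (plain cell-averaging CFAR)."""
    half = num_ref // 2
    detections = np.zeros_like(power, dtype=bool)
    for i in range(half + num_guard, len(power) - half - num_guard):
        lead = power[i - num_guard - half : i - num_guard]
        lag = power[i + num_guard + 1 : i + num_guard + 1 + half]
        noise = np.concatenate([lead, lag]).mean()  # average of reference cells
        detections[i] = power[i] > scale * noise    # adaptive threshold
    return detections

rng = np.random.default_rng(1)
profile = rng.exponential(1.0, 200)   # noise-only range cells
profile[100] = 40.0                   # injected target
print(np.flatnonzero(ca_cfar(profile)))
```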

Fuzzy Estimation of Parameters in Statistical Models

Using a set of confidence intervals, we develop a general approach to constructing a fuzzy set as an estimator for unknown parameters in statistical models. We investigate a method for deriving the explicit and unique membership function of such fuzzy estimators. The proposed method has been used to derive fuzzy estimators of the parameters of a Normal distribution and of some functions of the parameters of two Normal distributions, as well as of the parameters of the Exponential and Poisson distributions.
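A minimal sketch of one common version of this construction, assuming the membership grade of a candidate mean equals one minus the confidence level of the smallest two-sided interval that contains it (i.e., the two-sided p-value); the paper's exact construction may differ.

```python
import numpy as np
from scipy import stats

def fuzzy_mean_membership(sample, theta):
    """Membership grade of candidate mean theta, built from nested t-intervals."""
    n = len(sample)
    xbar, s = np.mean(sample), np.std(sample, ddof=1)
    t = (theta - xbar) / (s / np.sqrt(n))
    return 2 * stats.t.sf(np.abs(t), df=n - 1)  # in [0, 1], peaks at xbar

x = np.array([9.8, 10.2, 10.1, 9.9, 10.4, 10.0])
grid = np.linspace(9.5, 10.7, 7)
print([round(fuzzy_mean_membership(x, th), 3) for th in grid])
```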

Application of Pearson Parametric Distribution Model in Fatigue Life Reliability Evaluation

The aim of this paper is to introduce a parametric distribution model into fatigue life reliability analysis that deals with variation in material properties. Service loads, in the form of a response-time history signal recorded on Belgian pave, were replicated on a multi-axial spindle-coupled road simulator, and the stress-life method was used to estimate the fatigue life of an automotive stub axle. A PSN curve was obtained by monotonic tension testing, and a two-parameter Weibull distribution function was used to acquire the mean life of the component. A Pearson system was developed to evaluate the fatigue life reliability by treating the stress range intercept and the slope of the PSN curve as random variables. Assuming a normal distribution of fatigue strength, the fatigue life of the stub axle is found to have the highest reliability between 10,000 and 15,000 cycles. By taking into account the variation in material properties associated with size effects and machining and manufacturing conditions, the method described in this study can be effectively applied to determine the probability of failure of mass-produced parts.
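A minimal sketch of the two-parameter Weibull step, with made-up fatigue lives, using SciPy; the `lives` array is a hypothetical set of cycles to failure, and the full Pearson-system analysis is not reproduced.

```python
import numpy as np
from scipy import stats

lives = np.array([8200., 9500., 11000., 12500., 13100., 14800., 16000., 18500.])
shape, loc, scale = stats.weibull_min.fit(lives, floc=0)  # two-parameter: loc = 0
mean_life = stats.weibull_min.mean(shape, loc=0, scale=scale)
R = stats.weibull_min.sf([10000, 15000], shape, loc=0, scale=scale)  # P(life > N)
print(f"shape={shape:.2f}, scale={scale:.0f}, mean life={mean_life:.0f}")
print("R(10000), R(15000) =", np.round(R, 3))
```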

Coverage Probability of Confidence Intervals for the Normal Mean and Variance with Restricted Parameter Space

Recent articles have addressed the problem of constructing confidence intervals for the mean of a normal distribution when the parameter space is restricted; see, for example, Wang [Confidence intervals for the mean of a normal distribution with restricted parameter space. Journal of Statistical Computation and Simulation, Vol. 78, No. 9, 2008, pp. 829–841]. In this paper, we derive analytic expressions for the coverage probability and the expected length of the confidence interval for the normal mean when the whole parameter space is bounded. We also construct, for the first time, a confidence interval for the normal variance with a restricted parameter space, and its coverage probability and expected length are likewise derived mathematically. As a result, one can use these criteria to assess confidence intervals for the normal mean and variance under a restricted parameter space without support from simulation experiments.
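The paper derives these quantities analytically; the sketch below merely estimates them by Monte Carlo for a standard z-interval truncated to a bounded parameter space [a, b], as a sanity check on the idea. All numerical values are illustrative.

```python
import numpy as np
from scipy import stats

a, b, mu, sigma, n, reps = -1.0, 1.0, 0.8, 1.0, 25, 100_000
z = stats.norm.ppf(0.975)                            # 95% two-sided interval
rng = np.random.default_rng(2)
xbar = rng.normal(mu, sigma / np.sqrt(n), reps)
lo = np.clip(xbar - z * sigma / np.sqrt(n), a, b)    # truncate to [a, b]
hi = np.clip(xbar + z * sigma / np.sqrt(n), a, b)
coverage = np.mean((lo <= mu) & (mu <= hi))
print(f"coverage={coverage:.4f}, expected length={np.mean(hi - lo):.4f}")
```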

Confidence Intervals for Double Exponential Distribution: A Simulation Approach

The double exponential model (DEM), or Laplace distribution, is used in various disciplines. However, there are issues related to the construction of confidence intervals (CIs) when using this distribution. In this paper, the properties of the DEM are considered with the intention of constructing CIs based on simulated data. An analysis of the pivotal equations for the model, in comparison with the pivotal equations for the normal distribution, is performed, and the results obtained from the simulated data are presented.
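A minimal sketch of the simulation approach, assuming the usual location pivot (x̄ − μ)/(s/√n): under a Laplace model its quantiles are not Student-t, so they are estimated by Monte Carlo and then inverted to form the interval.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 20, 200_000
samples = rng.laplace(0.0, 1.0, size=(reps, n))   # pivot is location-scale free
piv = samples.mean(1) / (samples.std(1, ddof=1) / np.sqrt(n))
q_lo, q_hi = np.quantile(piv, [0.025, 0.975])     # simulated pivot quantiles

x = rng.laplace(5.0, 2.0, n)                      # "observed" data
se = x.std(ddof=1) / np.sqrt(n)
ci = (x.mean() - q_hi * se, x.mean() - q_lo * se)
print("95% CI for the location parameter:", np.round(ci, 3))
```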

Framework for Spare Inventory Management

Spare parts inventory management is one of the major areas of inventory research. Analysis of the recent literature shows that an approach integrating spare parts classification, demand forecasting, and stock control policies is essential; however, adoption of this integrated approach remains limited. This work presents an integrated framework for spare parts inventory management and an Excel-based application developed to implement the proposed framework. A multi-criteria analysis is used for spare parts classification. Forecasting of the intermittent demand for spare parts is incorporated into the application using three different forecasting models, namely the normal distribution, exponential smoothing, and Croston's method. The application is also capable of running with different inventory control policies. To illustrate the performance of the proposed framework and the developed application, the framework is applied to different items at a service organization. The results achieved are presented, and possible areas for future work are highlighted.
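A minimal sketch of one of the three forecasting models named above, Croston's method, for intermittent demand: the demand size and the inter-demand interval are smoothed separately, and the per-period forecast is their ratio. The Excel application itself is not reproduced here.

```python
def croston(demand, alpha=0.1):
    """Croston forecasts (per-period demand rate) for an intermittent series."""
    z = p = None          # smoothed demand size and inter-demand interval
    q = 1                 # periods since last non-zero demand
    forecasts = []
    for d in demand:
        if z is None:                    # initialise at first non-zero demand
            if d > 0:
                z, p = float(d), float(q)
            forecasts.append(0.0 if z is None else z / p)
            q = 1 if d > 0 else q + 1
            continue
        if d > 0:
            z = alpha * d + (1 - alpha) * z
            p = alpha * q + (1 - alpha) * p
            q = 1
        else:
            q += 1
        forecasts.append(z / p)
    return forecasts

print(croston([0, 0, 5, 0, 0, 0, 7, 0, 3, 0]))
```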

Numerical Optimization within Vector of Parameters Estimation in Volatility Models

In this paper, the usefulness of a quasi-Newton iteration procedure for estimating the parameters of the conditional variance equation within the BHHH algorithm is presented. Maximizing the likelihood function analytically using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm over other optimization algorithms is that it requires no third derivatives while offering assured convergence. To simplify the optimization procedure, the BHHH algorithm approximates the matrix of second derivatives using the information identity. However, parameter estimation in symmetric and asymmetric GARCH(1,1) models under the assumption of normally distributed returns is not simple, i.e., it is difficult to solve analytically. The maximum of the likelihood function can be found by iterating until no further increase is achieved. Because the solutions of the numerical optimization are very sensitive to the initial values, starting parameters for the GARCH(1,1) model are defined; the number of iterations can be reduced by using starting values close to the global maximum. The optimization procedure is illustrated within the framework of modeling the daily volatility of the most liquid stocks on the Croatian capital market: Podravka (food industry), Petrokemija (fertilizer industry), and Ericsson Nikola Tesla (information and communications industry).
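A minimal sketch of the estimation problem on simulated returns. SciPy's quasi-Newton L-BFGS-B routine stands in here for the BHHH update (both avoid exact second and third derivatives), and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
T, (omega0, alpha0, beta0) = 2000, (0.05, 0.08, 0.90)
r, h = np.empty(T), 1.0
for t in range(T):                       # simulate a GARCH(1,1) path
    h = omega0 + alpha0 * (r[t-1]**2 if t else 0.0) + beta0 * h
    r[t] = np.sqrt(h) * rng.standard_normal()

def neg_loglik(params):
    """Negative Gaussian log-likelihood of GARCH(1,1) for returns r."""
    omega, alpha, beta = params
    h, nll = np.var(r), 0.0              # start recursion at sample variance
    for t in range(T):
        nll += 0.5 * (np.log(2 * np.pi * h) + r[t]**2 / h)
        h = omega + alpha * r[t]**2 + beta * h
    return nll

# Note: the stationarity constraint alpha + beta < 1 is not enforced here.
res = minimize(neg_loglik, x0=[0.1, 0.1, 0.8], method="L-BFGS-B",
               bounds=[(1e-6, None), (1e-6, 1), (1e-6, 1)])
print("omega, alpha, beta =", np.round(res.x, 3))
```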

Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values of the EM algorithm affect the final parameter estimates. Moreover, when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when the initial values of the parameters are chosen randomly [5]. We show the effectiveness of this method on the popular simulated waveform data set and on the real glass data set.
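A minimal sketch of the initialization sensitivity described above: fitting the same Gaussian mixture with EM from different random starts can end at different log-likelihoods, and hence different MDA classification boundaries. SOMN itself is not in scikit-learn, so only the problem, not the remedy, is shown; the data are synthetic rather than the waveform set.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (200, 2)),
               rng.normal(3, 1, (200, 2)),
               rng.normal([0, 4], 1, (200, 2))])

for seed in (0, 1, 2):
    gm = GaussianMixture(n_components=3, n_init=1, init_params="random",
                         random_state=seed, max_iter=50).fit(X)
    print(f"seed={seed}: average log-likelihood={gm.score(X):.4f}")
```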

Parametric Modeling Approach for Call Holding Times for IP based Public Safety Networks via EM Algorithm

This paper presents parametric probability density models for the call holding times (CHTs) of an emergency call center, based on actual data collected over more than a week in the public Emergency Information Network (EIN) in Mongolia. When a set of chosen candidates from the Gamma distribution family is fitted to the call holding time data, the empirical CHT histogram is underestimated over much of its range, owing to spikes of higher probability and long tails of lower probability in the histogram. Therefore, we provide a parametric model based on a mixture of lognormal distributions, with explicit analytical expressions, for modeling the CHTs of public safety networks (PSNs). Finally, we show that the CHTs of PSNs are fitted reasonably well by a mixture of lognormal distributions estimated via the expectation-maximization algorithm. This result is significant in that it provides a useful mathematical tool, in explicit form, as a mixture of lognormal distributions.
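A minimal sketch, on synthetic call holding times, of the estimation step: a lognormal mixture on the original scale is a Gaussian mixture on the log scale, so EM can be run on the log-durations (here with scikit-learn rather than the authors' implementation).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
cht = np.concatenate([rng.lognormal(3.0, 0.4, 3000),    # short calls (seconds)
                      rng.lognormal(4.5, 0.6, 1000)])   # long-tailed calls

gm = GaussianMixture(n_components=2, random_state=0).fit(np.log(cht)[:, None])
for w, mu, var in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight={w:.2f}, lognormal mu={mu:.2f}, sigma={np.sqrt(var):.2f}")
```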

A Discretizing Method for Reliability Computation in Complex Stress-strength Models

This paper proposes, implements, and evaluates an original discretization method for continuous random variables, aimed at estimating the reliability of systems for which stress and strength are defined as complex functions and whose reliability is not derivable through analytic techniques. The method is compared with two other discretization approaches that have appeared in the literature, by means of a comparative study involving four engineering applications. The results show that the proposal is very efficient in terms of the closeness of its estimates to the true (simulated) reliability. In the study we analyzed both normal and non-normal distributions for the random variables; the method is theoretically suitable for any parametric family.
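For context, a minimal sketch of the simulation benchmark that the proposed discretization is judged against: the "true" reliability R = P(strength > stress) is estimated by brute-force Monte Carlo. The stress and strength functions below are purely illustrative, not those of the paper's four applications.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1_000_000
a, b = rng.normal(50, 5, N), rng.lognormal(1.0, 0.3, N)   # strength inputs
c, d = rng.normal(30, 4, N), rng.gamma(2.0, 1.5, N)       # stress inputs

strength = a * np.sqrt(b)     # illustrative complex strength function
stress = c + d**1.5           # illustrative complex stress function
R = np.mean(strength > stress)
print(f"estimated reliability R = {R:.4f}")
```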

An AK-Chart for Non-Normal Data

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold, or may be difficult to verify, because in practice not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism based on integrating a one-class classification method with an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the type I error rate, so the chart is easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control charts.
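A minimal sketch of the monitoring idea, not the authors' full AK-chart (the adaptive part is omitted): a one-class classifier is trained on in-control, non-normal Phase I data, and new points it rejects are flagged. The `nu` parameter roughly plays the role of the allocated type I error.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(8)
in_control = rng.lognormal(0.0, 0.3, (500, 3))    # non-normal Phase I data
chart = OneClassSVM(nu=0.01, gamma="scale").fit(in_control)

new_ok = rng.lognormal(0.0, 0.3, (5, 3))          # in-control observations
shifted = rng.lognormal(0.8, 0.3, (5, 3))         # mean-shifted observations
print("in-control flagged:", (chart.predict(new_ok) == -1).tolist())
print("shifted flagged:   ", (chart.predict(shifted) == -1).tolist())
```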

Quality of Concrete of Recent Development Projects in Libya

Numerous concrete structure projects are currently running in Libya as part of US$50 billion in government funding. The quality of the concrete used in 20 different construction projects was assessed, based mainly on the concrete compressive strength achieved. The projects are scattered all over the country and are at various levels of completion. For most of these projects, the concrete compressive strength was obtained from test results on 150 mm standard cube molds. Statistical analysis of the collected compressive strengths reveals that the data generally follow a normal distribution pattern. The study covers comparison and assessment of concrete quality aspects such as quality control, strength range, standard deviation, data scatter, and the ratio of minimum strength to design strength. Site quality control for these projects ranged from very good to poor according to the ACI 214 criteria [1]. The range (Rg) of the strength (maximum strength minus minimum strength) divided by the average strength runs from 34% to 160%. Data scatter, measured as the range (Rg) divided by the standard deviation (σ), is found to lie between 1.82 and 11.04, indicating that the range is on the order of ±3σ. International construction companies working in Libya follow different assessment criteria for concrete compressive strength in the absence of a unified national procedure. The study reveals that the assessments of concrete quality conducted by these construction companies usually meet their adopted (internal) standards, but sometimes fail to meet internationally known standard requirements. The assessment of concrete presented in this paper is based on ACI standards, British standards, and proposed Libyan concrete strength assessment criteria.

Diagnosing the Cause and its Timing of Changes in Multivariate Process Mean Vector from Quality Control Charts using Artificial Neural Network

Quality control charts are very effective in detecting out-of-control signals. However, when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the condition, since the change point in the process parameter(s) usually differs from the signal time. It is very important for the manufacturer to determine at what point, and in which parameters, the change that caused the signal occurred. Early warning of process change expedites the search for the special causes and enhances quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known mean vector and variance-covariance matrix; the process means after a one-step change are assumed to remain at the new level until the special cause is identified and removed; and only one variable is assumed to change at a time. This research applies an artificial neural network (ANN) to identify the time at which the change occurred and the parameter that caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively, and equally well, across the whole range of shift magnitudes considered.

The Performance of Predictive Classification Using Empirical Bayes

This research aims to compare the percentage of correct classification of the Empirical Bayes (EB) method with that of the Classical method when the data are constructed as near-normal, short-tailed and long-tailed symmetric, and short-tailed and long-tailed asymmetric. The study is performed using a conjugate prior for a normal distribution with known mean and unknown variance. The hyper-parameter estimates obtained from the EB method are substituted into the posterior predictive probability, which is then used to predict new observations. Data are generated as training and test sets with sample sizes of 100, 200, and 500 for binary classification. The results show that the EB method exhibits improved performance over the Classical method in all situations under study.
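A minimal sketch under one simple reading of the setup: per class, a normal likelihood with known mean, an inverse-gamma prior on the variance, and classification of a test point to the class with the larger posterior predictive (Student-t) density. The fixed `a0, b0` below stand in for the hyper-parameters that the EB method would estimate from the data; all values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
train = {0: rng.normal(0.0, 1.0, 100), 1: rng.normal(1.5, 2.0, 100)}
known_mean = {0: 0.0, 1: 1.5}

def predictive(x, data, mu, a0=2.0, b0=1.0):
    # Posterior for sigma^2 is inverse-gamma(a0 + n/2, b0 + sse/2); the
    # predictive for a new x is then a scaled Student-t centred at mu.
    n, sse = len(data), np.sum((data - mu) ** 2)
    a, b = a0 + n / 2, b0 + sse / 2
    return stats.t.pdf(x, df=2 * a, loc=mu, scale=np.sqrt(b / a))

test = rng.normal(1.5, 2.0, 200)   # truly from class 1
pred = [max((0, 1), key=lambda k: predictive(x, train[k], known_mean[k]))
        for x in test]
print("percent assigned to class 1:", 100 * np.mean(np.array(pred) == 1))
```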

Analysis of Temperature Change under Global Warming Impact using Empirical Mode Decomposition

The empirical mode decomposition (EMD) represents any time series as a finite set of basis functions. These bases, termed intrinsic mode functions (IMFs), are mutually orthogonal and contain a minimal amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature records and to observe their effects on climate change under global warming. The method decomposes the original time series into intrinsic time scales, and it is capable of analyzing the nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis shows that the EMD modes exhibit seasonal variability, that most of the IMFs follow a normal distribution, and that the energy-density distribution of the IMFs satisfies a Chi-square distribution. The IMFs are effective in isolating physical processes on various time scales and are also statistically significant. The results further show that the EMD method does a good job of revealing many characteristics of interannual climate, and they suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.
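A minimal sketch of the decomposition step on a synthetic temperature record, using the third-party PyEMD package (installed as `EMD-signal`); the actual study uses multi-year observed air temperatures.

```python
import numpy as np
from PyEMD import EMD   # pip install EMD-signal

t = np.linspace(0, 10, 1000)
temp = (12 * np.sin(2 * np.pi * t)                       # seasonal cycle
        + 0.3 * t                                        # warming trend
        + np.random.default_rng(10).normal(0, 1, t.size))  # weather noise

imfs = EMD().emd(temp, t)   # IMFs ordered high to low frequency; last row is the residue
print("number of extracted components:", imfs.shape[0])
```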

Study of a BVAR(p) Process Applied to U.S. Commodity Market Data

The paper presents an applied study of a multivariate AR(p) process fitted to daily data from U.S. commodity futures markets using Bayesian statistics. The first part gives a detailed description of the methods used. In the second part, two BVAR models are chosen: one assuming a lognormal distribution of prices conditioned on the parameters, the other a normal distribution. For comparison, two simple benchmark models commonly used in today's financial mathematics are chosen. The article compares the quality of the predictions of all the models, attempts to find an adequate rate of forgetting of information, and questions the validity of the Efficient Market Hypothesis in its semi-strong form.

High Performance Communication Protocol for Wireless Ad-Hoc Sensor Networks

In order to monitor traffic traversal, sensors can be deployed to perform collaborative target detection. Such a sensor network achieves a certain level of detection performance at associated costs in deployment and routing protocol. This paper addresses both sensor deployment and the routing algorithm in situations where the absolute number of sensors, or the total energy, becomes insufficient. The discussion of the best deployment scheme concludes that two kinds of deployments, the Normal and Power-law distributions, sustain coverage 6 and 3 times longer, respectively, than the Random distribution. The discussion of routing algorithms that achieve good performance under each deployment scheme concludes that, in place of the traditional algorithm, a new algorithm can extend the coverage duration by 4 times under a Normal distribution, in the circumstance where every deployed sensor operates as a binary model.

Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

In the semiconductor manufacturing process, large amounts of data are collected from various sensors across multiple facilities. The collected sensor data have several different characteristics due to variables such as product type, preceding processes, and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data when detecting out-of-control states of processes. When data with such different characteristics are used as inputs to SQC, the variation of the data increases, wide control limits are required, and the ability to detect out-of-control states decreases. It is therefore necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree whose split algorithm is based on the Pearson distribution system, in order to handle non-normal distributions within a parametric method. The regression tree finds similar properties of the data across the different variables. Experiments using real semiconductor manufacturing process data show improved fault-detection performance.

Determination of Cd, Zn, K, pH, TNV, Organic Material and Electrical Conductivity (EC) Distribution in Agricultural Soils using Geostatistics and GIS (Case Study: South-Western Natanz, Iran)

Soil chemical and physical properties play important roles in the environment, agricultural sustainability, and human health. The objective of this research is to determine the spatial distribution patterns of Cd, Zn, K, pH, TNV, organic material, and electrical conductivity (EC) in the agricultural soils of the Natanz region in Esfehan province. Both geostatistical and non-geostatistical methods were used to predict the spatial distribution of these parameters. Sixty-four composite soil samples were taken at 0-20 cm depth. The study area is located in the south of the Natanz agricultural lands and covers 21,660 hectares. The spatial distributions of Cd, Zn, K, pH, TNV, organic material, and EC were determined using geostatistics and a geographic information system. The results showed that the Cd, pH, TNV, and K data had normal distributions, while the Zn, OC, and EC data did not. Kriging, Inverse Distance Weighting (IDW), Local Polynomial Interpolation (LPI), and Radial Basis Function (RBF) methods were used for interpolation. Trend analysis showed that organic carbon had no trend in either the north-south or the east-west direction, while K and TNV showed second-degree trends. Several error measures were used, including the mean absolute error (MAE), mean squared error (MSE), and mean biased error (MBE). Ordinary kriging (exponential model), LPI, RBF, and IDW were chosen as the best methods for interpolating the soil parameters. Prediction maps produced by disjunctive kriging showed an intensive shortage of organic matter over the whole study area, and more than 63.4 percent of the study area showed a shortage of K.
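A minimal sketch of one of the interpolators named above, inverse distance weighting (IDW), on made-up sample coordinates and potassium values; the kriging, LPI, and RBF variants are not shown.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation at the query points."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)              # avoid division by zero at sample points
    w = 1.0 / d**power
    return (w * values).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(11)
pts = rng.uniform(0, 1000, (64, 2))       # 64 hypothetical sample locations (m)
k_vals = rng.normal(250, 40, 64)          # hypothetical K measurements (mg/kg)
grid = np.array([[100., 200.], [500., 500.], [900., 800.]])
print(np.round(idw(pts, k_vals, grid), 1))
```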