Abstract: In many practical applications in engineering, science and social science, the values of unknown parameters are known to be bounded; examples include measurements used to control machines in an industrial process, the weight or height of subjects, the blood pressure of patients and the retirement ages of public servants. When interval estimation is considered for a bounded parameter, it has been argued that the classical Neyman procedure for setting confidence intervals is unsatisfactory, because the information about the restriction is simply ignored. It is therefore of significant interest to construct confidence intervals that incorporate the bounds on the parameter values in order to improve the accuracy of the interval estimation. In this paper, we propose a new confidence interval for the coefficient of variation when the population mean and standard deviation are bounded. The proposed interval is evaluated in terms of coverage probability and expected length via Monte Carlo simulation.
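The proposed interval itself is not specified in the abstract; as a minimal sketch of the Monte Carlo evaluation it describes, the following assumes a textbook normal-approximation interval for the coefficient of variation (a Miller-type asymptotic standard error) and estimates its coverage probability and expected length. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps, z = 10.0, 3.0, 50, 2000, 1.959964  # illustrative settings
true_cv = sigma / mu

hits, widths = 0, []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    cv = x.std(ddof=1) / x.mean()
    se = cv * np.sqrt((0.5 + cv ** 2) / n)   # Miller-type asymptotic standard error
    lo, hi = cv - z * se, cv + z * se
    hits += lo <= true_cv <= hi
    widths.append(hi - lo)

coverage = hits / reps                        # estimated coverage probability
expected_length = float(np.mean(widths))      # estimated expected length
```

A bound-respecting interval would additionally truncate `lo` and `hi` at the known limits on the parameter; the loop above is only the evaluation harness.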
Abstract: In this work, the sorption of nickel from aqueous solution on Hypnea valentiae, a red macroalga, was investigated. Batch experiments were carried out to determine the effect of various parameters, such as pH, temperature, sorbent dosage, metal concentration and contact time, on the sorption of nickel using Hypnea valentiae. Response surface methodology (RSM) was employed to optimize the process parameters. Based on a central composite design, a quadratic model was developed to correlate the process variables with the response. The most influential factor on each experimental design response was identified from the analysis of variance (ANOVA). The optimum conditions for the sorption of nickel were found to be pH 5.1, temperature 36.8 °C, sorbent dosage 5.1 g/L, metal concentration 100 mg/L and contact time 30 min. At these optimized conditions the maximum removal of nickel was found to be 91.97%. A coefficient of determination (R²) of 0.9548 shows the fitness of response surface methodology in this work.
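As a hedged illustration of the RSM step, the sketch below fits a full quadratic (second-order) response surface to synthetic two-factor data by least squares. The factors, coefficients and design points are invented for the example and are not the paper's actual data.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical design points for two coded factors (e.g. pH and sorbent dosage)
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
true = np.array([90.0, 2.0, 1.5, -4.0, -3.0, 0.5])   # invented model coefficients
y = (true[0] + true[1] * x1 + true[2] * x2
     + true[3] * x1 ** 2 + true[4] * x2 ** 2 + true[5] * x1 * x2)

# design matrix of the full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit of the surface
```

In practice the fitted surface is then searched (analytically or on a grid) for its stationary point, which gives the optimum operating conditions.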
Abstract: Motivated by the recent work of Herbert, Hayen, Macaskill and Walter [Interval estimation for the difference of two independent variances. Communications in Statistics, Simulation and Computation, 40: 744-758, 2011], we investigate, in this paper, new confidence intervals for the difference between two normal population variances based on the generalized confidence interval of Weerahandi [Generalized Confidence Intervals. Journal of the American Statistical Association, 88(423): 899-905, 1993] and the closed-form method of variance estimation of Zou, Huo and Taleban [Simple confidence intervals for lognormal means and their differences with environmental applications. Environmetrics, 20: 172-180, 2009]. Monte Carlo simulation results indicate that our proposed confidence intervals give better coverage probability than the existing confidence interval. Moreover, the two new confidence intervals perform similarly in terms of their coverage probabilities and average widths.
Abstract: In this article, we consider the estimation of R = P[Y < X] when the strength X and the stress Y are two independent random variables following the Burr Type XII distribution. The MLE of R is obtained via a simple iterative procedure. Assuming that the common shape parameter is known, the maximum likelihood estimator, the uniformly minimum variance unbiased estimator and the Bayes estimator of P[Y < X] are discussed. An exact confidence interval for R is also obtained. Monte Carlo simulations are performed to compare the proposed methods.
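Assuming the standard parameterization F(x) = 1 - (1 + x^c)^(-k) with a common, known shape c, the shape MLEs have the closed form k̂ = n / Σ ln(1 + x_i^c), and R = P[Y < X] reduces to k2 / (k1 + k2) for X ~ Burr XII(c, k1), Y ~ Burr XII(c, k2). The sketch below checks this on simulated data with invented parameter values.

```python
import numpy as np

rng = np.random.default_rng(2)
c, k1, k2, n = 2.0, 3.0, 1.5, 5000   # illustrative parameters; c assumed known

def rburr(k, size):
    """Sample Burr XII(c, k) by inverting F(x) = 1 - (1 + x**c) ** (-k)."""
    u = rng.uniform(size=size)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

x = rburr(k1, n)                      # strength sample
y = rburr(k2, n)                      # stress sample

# closed-form MLEs of the shape parameters when c is known
k1_hat = n / np.log1p(x ** c).sum()
k2_hat = n / np.log1p(y ** c).sum()

R_hat = k2_hat / (k1_hat + k2_hat)    # MLE of R = P[Y < X]
R_true = k2 / (k1 + k2)
```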
Abstract: German electricity European options on futures are examined using Lévy processes for the underlying asset. The evolution of implied volatility under each of the considered models is discussed after calibrating the Merton jump diffusion (MJD), variance gamma (VG), normal inverse Gaussian (NIG), Carr-Geman-Madan-Yor (CGMY) and Black-Scholes (B&S) models. Implied volatility is examined for the entire sample period, revealing some curious features of market evolution, and the data-fitting performances of the five models are compared. It is shown that variance gamma processes provide relatively better results and that implied volatility shows significant differences through time, having increased over the period. Volatility changes with changing uncertainty, or else with increasing futures prices, and there is evidence of the need to account for seasonality when modelling both electricity spot/futures prices and volatility.
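Since the abstract singles out the variance gamma model, a minimal sketch of simulating VG log-return increments (Brownian motion with drift evaluated at a gamma time change) may clarify the process. The parameter values are illustrative, not calibrated values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, sigma, nu = -0.1, 0.3, 0.2    # illustrative VG parameters, not calibrated values
dt, n = 1.0 / 252, 200_000           # one trading day, many independent increments

# gamma subordinator increments, then Brownian motion with drift at the gamma time
g = rng.gamma(shape=dt / nu, scale=nu, size=n)
dx = theta * g + sigma * np.sqrt(g) * rng.standard_normal(n)

mean_inc = float(dx.mean())          # theory: theta * dt
var_inc = float(dx.var())            # theory: (sigma**2 + theta**2 * nu) * dt
```

The heavy tails that make VG attractive for electricity options come from the random gamma clock: days with a large `g` behave like high-activity trading days.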
Abstract: Based on the smoothed rank profile (SRP) and the modified minimum description length (MMDL) principle, a method for estimating the source coherency structure (SCS) and the number of wideband sources is proposed in this paper. Instead of focusing, we first use a spatial smoothing technique to pre-process the array covariance matrix at each frequency in order to de-correlate the sources, and then use the smoothed rank profile to determine the SCS and the number of wideband sources. We demonstrate the effectiveness of the method by numerical simulations.
Abstract: Textile structures are engineered and fabricated for structural applications worldwide. Nevertheless, research on varying the textile structure of natural-fibre composite reinforcement remains very limited; most studies focus on short fibres and randomly oriented discontinuous reinforcement structures. Since natural fibre (NF) composites have been widely developed as replacements for synthetic fibre composites, this research examined the influence of woven and cross-ply laminated structures on their mechanical performance. Laminated natural fibre composites were fabricated using hand lay-up and vacuum bagging techniques. Impact and flexural strength were investigated as functions of fibre type (coir and kenaf) and reinforcement structure (imbalanced plain woven, 0°/90° cross-ply and +45°/-45° cross-ply). A multi-level full factorial design of experiments (DOE) and analysis of variance (ANOVA) were employed to determine how fibre type and reinforcement structure affect the mechanical properties of the composites. This systematic experimentation identified the factors that predominantly influence the impact and flexural properties of the textile composites. Both fibre type and reinforcement structure produced significantly different results. Overall, coir composites and woven structures exhibited better impact and flexural strength, while cross-ply composite structures demonstrated better fracture resistance.
Abstract: Deep Brain Stimulation (DBS) is a surgical treatment for Parkinson's disease with three stimulation parameters: frequency, pulse width, and voltage. These parameters must be selected appropriately to achieve effective treatment, and this selection is currently performed clinically. The aim of this research is to study the chaotic behavior of the recorded tremor of patients under DBS in order to present a computational method for recognizing the optimum stimulation voltage. We extracted several chaotic features of the tremor signal and found that its embedding space has an attractor and that its largest Lyapunov exponent is positive, which shows that the tremor signal behaves chaotically. We also found that at the optimal voltage, the entropy and the embedding-space variance of the tremor signal reach their minimum values compared with other voltages. These differences can help neurologists recognize the optimal voltage numerically, which reduces the patient's role and discomfort in optimizing the stimulation parameters and allows treatment with high accuracy.
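The entropy comparison the abstract relies on can be illustrated with a toy computation. The signals below are synthetic stand-ins, not patient data, and the histogram-based Shannon entropy on a fixed amplitude range is just one simple choice of entropy estimator.

```python
import numpy as np

def signal_entropy(x, bins=32, amp_range=(-4.0, 4.0)):
    """Shannon entropy (bits) of a signal's amplitude histogram on a fixed range."""
    p, _ = np.histogram(x, bins=bins, range=amp_range)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 4000)
regular = np.sin(2 * np.pi * 5.0 * t)                  # narrow, near-periodic signal
noisy = regular + 0.8 * rng.standard_normal(t.size)    # broadband, irregular signal
```

A suppressed, near-periodic tremor concentrates its amplitudes in a few histogram bins and therefore scores a lower entropy than a broadband, irregular one, which is the direction of the effect reported at the optimal voltage.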
Abstract: This paper presents a new histogram equalization scheme to enhance contrast in regions where pixels have similar intensities. Conventional global equalization schemes over-equalize such regions, producing overly bright or dark pixels, while local equalization schemes produce unexpected discontinuities at block boundaries. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram to a limited extent, considering its mean and variance. The final image is formed as the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on the individual sub-histograms, the over-equalization effect is eliminated. The resulting image also preserves feature information in low-density histogram regions, since these regions are equalized separately. The paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested on more than 100 images of various contrasts, and the results are compared with conventional approaches to show its superiority.
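The paper's exact weighting and variance-limited scheme is not reproduced here; the sketch below is a simplified relative (split the histogram at the mean brightness and equalize each sub-histogram only within its own intensity range), which illustrates why per-sub-histogram equalization avoids global over-equalization.

```python
import numpy as np

def split_equalize(img):
    """Sub-histogram equalization sketch: split at the mean brightness and
    equalize each sub-histogram only within its own intensity range."""
    m = int(img.mean())
    out = np.empty_like(img)
    for lo, hi, mask in [(0, m, img <= m), (m + 1, 255, img > m)]:
        vals = img[mask]
        if vals.size == 0:
            continue
        hist, _ = np.histogram(vals, bins=hi - lo + 1, range=(lo, hi + 1))
        cdf = hist.cumsum() / vals.size
        # map each pixel through its sub-histogram's CDF, staying inside [lo, hi]
        out[mask] = (lo + cdf[vals.astype(int) - lo] * (hi - lo)).astype(img.dtype)
    return out

rng = np.random.default_rng(7)
low_contrast = rng.integers(80, 121, size=(64, 64)).astype(np.uint8)  # values 80..120
enhanced = split_equalize(low_contrast)
```

Because each sub-histogram is stretched only over its own range, dark pixels can never be mapped above the split point and bright pixels can never fall below it, so the overall brightness ordering of the two regions is preserved.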
Abstract: Missing data poses many analysis challenges. With a complex survey design, researchers must account for the sampling design in addition to handling the missing data in order to achieve useful inferences. Methods for incorporating sampling weights in neural network imputation were investigated to account for complex survey designs. An estimate of variance that accounts for the imputation uncertainty as well as the sampling design using neural networks is provided. A simulation study was conducted to compare estimation results based on complete-case analysis, multiple imputation using a Markov chain Monte Carlo method, and neural network imputation. Furthermore, a public-use dataset is used as an example to illustrate neural network imputation under a complex survey design.
Abstract: This paper studied the factors related to the working behavior of employees at Pakkred Municipality, Nonthaburi Province. A questionnaire was utilized as the data-collection tool. Descriptive statistics included frequency, percentage, mean and standard deviation; independent-sample t-tests, analysis of variance and Pearson correlation were also used. The findings revealed that the majority of respondents were female, between 25 and 35 years old, married, and held a Bachelor's degree. The average monthly salary of respondents was between 8,001 and 12,000 Baht, with about 4-7 years of working experience. Regarding the overall working motivation factors, interrelationship, respect, and acceptance were ranked as highly important, whereas motivation, remuneration and welfare, career growth, and working conditions were ranked as moderately important. Overall working behavior was also ranked as high. Hypothesis testing revealed that employees of different genders differed in working behavior and in working as a team, significant at the 0.05 level. Moreover, employees with different monthly salaries differed in working behavior, problem solving and decision making, all significant at the 0.05 level. Employees with different years of working experience were found to differ in working behavior, both individually and as a team, at the 0.01 and 0.05 significance levels. Testing the relationship between motivation and overall working behavior revealed that interrelationship, respect and acceptance from others, career growth, and working conditions were related to working behavior at a moderate level, while motivation in performing duties and remuneration and welfare were related to working behavior in the same direction at a low level, with statistical significance at the 0.01 level.
Abstract: An attempt has been made to investigate the machinability of zirconia-toughened alumina (ZTA) inserts while turning AISI 4340 steel. The inserts were prepared by a powder metallurgy process route, and the machining experiments were performed based on a Response Surface Methodology (RSM) design called Central Composite Design (CCD). Mathematical models of flank wear, cutting force and surface roughness were developed using second-order regression analysis. The adequacy of the models was checked using analysis of variance (ANOVA) techniques. It can be concluded that cutting speed and feed rate are the two most influential factors for predicting flank wear and cutting force, while for surface roughness both cutting speed and depth of cut make significant contributions. The effect of the key parameters on each response is also presented as graphical contours for choosing the operating parameters precisely. An 83% desirability level was achieved under the optimized conditions.
Abstract: When the failure function is monotone, monotonic reliability methods can greatly simplify and facilitate reliability computations. However, these methods often work in a transformed iso-probabilistic space, so a monotonic simulator or transformation is needed to ensure that the transformed failure function remains monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents conditions, involving copulas and dependence concepts, under which the transformed function is still monotone in the new space. In many engineering applications, Gaussian copulas are used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. This note therefore focuses on the conditional monotonicity of the commonly used transformation from an independent random vector into a dependent random vector with a Gaussian copula.
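As a sketch of the transformation the note studies, the following maps an independent standard normal vector to a dependent vector with a Gaussian copula via the Cholesky factor of the target correlation matrix. When the Cholesky entries are nonnegative, as here, each output coordinate is a nondecreasing function of the inputs, which is the kind of monotonicity at issue. The marginals and correlation values are illustrative.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(5)
R = np.array([[1.0, 0.6], [0.6, 1.0]])    # target correlation of the Gaussian copula
L = np.linalg.cholesky(R)                  # lower-triangular; entries nonnegative here

z = rng.standard_normal((100_000, 2))      # independent standard normal vector
y = z @ L.T                                # dependent normals; componentwise
                                           # nondecreasing in z for this L

std_norm_cdf = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / 2.0 ** 0.5)))
u = std_norm_cdf(y)                        # uniforms joined by a Gaussian copula
x = -np.log1p(-u)                          # exponential(1) marginals (illustrative)

corr = float(np.corrcoef(y.T)[0, 1])       # should recover the target 0.6
```

Each later step (normal CDF, inverse marginal CDF) is itself increasing, so the whole composition inherits the componentwise monotonicity of the Cholesky map.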
Abstract: In this paper we propose a novel method for human face segmentation using the elliptical structure of the human head. It makes use of the information present in the edge map of the image. In this approach we use the fact that the eigenvalues of a covariance matrix represent the elliptical structure: the large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of an ellipse, and the other elliptical parameters are used to identify the centre and orientation of the face. Since an Elliptical Hough Transform requires a 5D Hough space, the Circular Hough Transform (CHT) is used to evaluate the elliptical parameters. A sparse matrix technique is used to perform the CHT; because it squeezes out zero elements and stores only the small number of non-zero elements, it has the advantage of less storage space and computation time. A neighborhood suppression scheme is used to identify the valid Hough peaks. The accurate positions of the circumference pixels of occluded and distorted ellipses are identified using Bresenham's raster scan algorithm, which exploits geometrical symmetry properties. The method does not require the evaluation of tangents of curvature contours, which are very sensitive to noise. The method has been evaluated on several images with different face orientations.
Abstract: This study has two aims: first, to compare the expertise level in data analysis, communication and information technologies of undergraduate psychology students; second, to verify the factor structure of the E-ETICA (Escala de Experticia en Tecnologias de la Informacion, la Comunicacion y el Análisis, or Data Analysis, Communication and Information Expertise Scale), which had previously shown excellent internal consistency (α = 0.92) as well as a simple factor structure in which three factors (Complex Technologies, Basic Information and Communication Technologies, and E-Searching and Download Abilities) explained 63% of the variance. In the present study, 260 students (119 juniors and 141 seniors) responded to the E-ETICA (a 16-item, five-point Likert scale from 1: null domain to 5: total domain). The results show that junior and senior students report very similar expertise levels; however, the E-ETICA presented a different factor structure for juniors, with four factors (Information E-Searching, Download and Processing; Data Analysis; Organization; and Communication Technologies) also explaining 63% of the variance.
Abstract: A novel feature selection strategy is proposed to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions. The technique is especially applicable in scenarios where a reliable intra-class probability distribution cannot be obtained because of the small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase; these features are invariant to the brightness and contrast of the image under consideration, a property that makes lighting-invariant face recognition possible. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented: smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features, which are arranged in order of increasing distance between the sub-regions involved in merging. The assumption behind this region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, the ratio of the between-class variance to the within-class variance of the sample set, in the PCA domain. The results indicate a large improvement in classification performance compared to baseline algorithms.
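The criterion function named in the abstract, the ratio of between-class to within-class variance, is the classical Fisher score. A minimal sketch for ranking one-dimensional feature sets (with invented toy data) might look like:

```python
import numpy as np

def fisher_score(feature, labels):
    """Ratio of between-class variance to within-class variance of a 1-D feature."""
    overall = feature.mean()
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        grp = feature[labels == c]
        between += grp.size * (grp.mean() - overall) ** 2   # class-mean spread
        within += ((grp - grp.mean()) ** 2).sum()           # spread inside classes
    return between / within

rng = np.random.default_rng(8)
labels = np.repeat([0, 1], 100)
discriminative = labels * 5.0 + rng.standard_normal(200)  # well-separated class means
pure_noise = rng.standard_normal(200)                     # no class structure
```

Feature sets are then sorted by this score, so those whose class means are far apart relative to their in-class scatter are tried first.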
Abstract: Fingerprint-based identification is one of the well-known biometric systems in the area of pattern recognition and has long been studied for its important role in forensic science, where it assists the criminal justice community. In this paper, we propose a framework for identifying individuals by means of fingerprints. Unlike most conventional fingerprint identification frameworks, the extracted geometrical element features (GEFs) go through a discretization process. The intention of discretization in this study is to obtain unique individual features that reflect inter-individual variance in order to discriminate one person from another. Previously, discretization has been shown to be particularly efficient for identification of English handwriting, with an accuracy of 99.9%, and for discrimination of twins' handwriting, with an accuracy of 98%. Due to its high discriminative power, this method is adopted into the framework as an independent method to assess the accuracy of fingerprint identification. The experimental results show that the identification accuracy of the proposed system using discretization is 100% for FVC2000, 93% for FVC2002 and 89.7% for FVC2004, much better than the conventional or existing fingerprint identification systems (72% for FVC2000, 26% for FVC2002 and 32.8% for FVC2004). The results indicate that the discretization approach boosts classification effectively and is therefore suitable for other biometric features besides handwriting and fingerprints.
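The abstract does not specify the discretization scheme used; as a generic illustration of the idea (mapping continuous feature values to discrete bin labels so that each subject's feature vector becomes a compact code), here is a simple equal-width discretization, standing in for whatever scheme the framework actually uses.

```python
import numpy as np

def discretize(values, n_bins=8):
    """Equal-width discretization: map continuous values to bin indices 0..n_bins-1."""
    edges = np.linspace(values.min(), values.max(), n_bins + 1)[1:-1]  # interior edges
    return np.digitize(values, edges)

feature_values = np.linspace(0.0, 1.0, 100)   # hypothetical continuous GEF values
codes = discretize(feature_values, n_bins=4)
```

The discrete codes absorb small measurement noise, so two impressions of the same finger tend to fall in the same bins while different fingers do not, which is the source of the discriminative power the abstract reports.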
Abstract: A new feed-forward/feedback Generalized Minimum Variance pole-placement controller, which incorporates the robustness of classical pole placement into the flexibility of a generalized minimum variance self-tuning controller for Single-Input Single-Output (SISO) systems, is proposed in this paper. The design provides the user with an adaptive mechanism that ensures the closed-loop poles are located at their pre-specified positions. In addition, the feed-forward/feedback structure of the controller overcomes certain limitations of similar pole-placement control designs while retaining the simplicity of the adaptation mechanisms used in other designs. It tracks set-point changes with the desired speed of response, penalizes excessive control action, and can be applied to non-minimum-phase systems. Moreover, at steady state, the controller is able to regulate a constant load disturbance to zero. Simulation results using both simulated and real plant models demonstrate the effectiveness of the proposed controller.
Abstract: The paper evaluates several hundred one-day-ahead VaR forecasting models over the period 2004-2009 on data from six world stock indices: DJI, GSPC, IXIC, FTSE, GDAXI and N225. The models describe the mean using ARMA processes with up to two lags and the variance with GARCH, EGARCH or TARCH processes with up to two lags. The models are estimated on data from the in-sample period and their forecasting accuracy is evaluated on the out-of-sample data, which are more volatile. The main aim of the paper is to test whether a model estimated on data with lower volatility can be used in periods of higher volatility. The evaluation is based on the conditional coverage test and is performed on each stock index separately. The primary finding is that volatility is best modelled by a GARCH process and that no ARMA pattern can be found in the analyzed time series.
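As a sketch of the mechanics behind such an evaluation, the following simulates a GARCH(1,1) return series, forms the one-day-ahead 99% VaR from the conditional variance each day, and counts violations, the raw quantity the conditional coverage test examines. The parameters are invented, not estimates from the paper's indices.

```python
import numpy as np

rng = np.random.default_rng(6)
omega, alpha, beta = 1e-6, 0.08, 0.90    # invented GARCH(1,1) parameters
n = 2000
z99 = 2.326348                            # 99% standard normal quantile

violations = 0
h = omega / (1.0 - alpha - beta)          # start at the unconditional variance
for _ in range(n):
    var99 = z99 * np.sqrt(h)              # one-day-ahead 99% VaR, as a positive loss
    r = np.sqrt(h) * rng.standard_normal()    # the day's realized return
    violations += r < -var99                  # VaR exceeded on this day?
    h = omega + alpha * r ** 2 + beta * h     # GARCH(1,1) variance recursion

violation_rate = violations / n           # near 0.01 when the model is correct
```

The conditional coverage test then checks both that the violation rate matches the nominal 1% and that violations do not cluster in time.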
Abstract: The purpose of this work is to present a method for rigid registration of medical images using 1D binary projections when part of one of the two images is missing. We use 1D binary projections and adjust the projection limits according to the reduced image in order to perform accurate registration. We use the variance of the weighted ratio as the registration function, which we have shown registers 2D and 3D images more accurately and robustly than mutual information methods. The function is computed explicitly at n = 5 Chebyshev points in a [-9, +9] interval and approximated using Chebyshev polynomials at all other points. The images used are MR scans of the head. We find that the method registers the two images with an average accuracy of 0.3 degrees for rotations and 0.2 pixels for translations for a y dimension of 156 (initial dimension 256). For a y dimension of 128/256 the accuracy decreases to 0.7 degrees for rotations and 0.6 pixels for translations.
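The Chebyshev step can be sketched as follows: sample a function at n = 5 Chebyshev points of [-9, +9] and evaluate its degree-4 Chebyshev interpolant everywhere else. The stand-in function below is hypothetical; the actual registration function (the variance of the weighted ratio) depends on the images.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# the 5 Chebyshev points of the first kind, mapped from [-1, 1] to [-9, 9]
nodes = 9.0 * np.cos((2 * np.arange(5) + 1) * np.pi / 10)

def f(x):
    """Hypothetical smooth stand-in for the registration function."""
    return 1.0 / (1.0 + 0.05 * np.asarray(x) ** 2)

coef = C.chebfit(nodes / 9.0, f(nodes), 4)   # degree-4 fit interpolates the 5 samples

def approx(x):
    """Chebyshev approximation of f at arbitrary points of [-9, 9]."""
    return C.chebval(np.asarray(x) / 9.0, coef)

grid = np.linspace(-9.0, 9.0, 200)
max_err = float(np.max(np.abs(approx(grid) - f(grid))))
```

Evaluating the registration function explicitly only at 5 points and interpolating elsewhere is what makes an exhaustive search over transformation parameters affordable.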