Abstract: The piecewise polynomial regression model is a very flexible model for fitting data. When a piecewise polynomial regression model is fitted to data, its parameters are generally unknown. This paper studies the parameter estimation problem for the piecewise polynomial regression model. The parameters are estimated with a Bayesian method. Unfortunately, the Bayes estimator cannot be found analytically, so a reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise polynomial regression model.
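For reference (generic Green formulation, not quoted from the paper): a reversible jump move between piecewise models with different numbers of pieces is typically accepted with Green's general acceptance probability. With a current state \(\theta\), a proposed state \(\theta'\) obtained from auxiliary variables \(u \sim q(u)\) through a deterministic mapping, and \(r(\cdot)\) denoting the probability of selecting the corresponding move type, it reads
\[
\alpha = \min\left\{1,\; \frac{p(y \mid \theta')\, p(\theta')\, r(\theta' \to \theta)}{p(y \mid \theta)\, p(\theta)\, r(\theta \to \theta')\, q(u)} \left| \frac{\partial \theta'}{\partial (\theta, u)} \right| \right\},
\]
so the chain can move across model dimensions while preserving the posterior as its limiting distribution.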
Abstract: An in vitro study was carried out to evaluate the feasibility of small field of view (FOV) cone beam computed tomography (CBCT) in determining endodontic working length. The objectives were to determine the accuracy of CBCT in measuring the estimated preoperative working lengths (EPWL), endodontic working lengths (EWL) and file lengths. Access cavities were prepared in 27 molars. For each root canal, the baseline electronic working length was determined using an electronic apex locator (EAL; Raypex 5). The teeth were then divided into overextended, non-modified and underextended groups and the lengths were adjusted accordingly. Imaging and measurements were made using the respective software of the RVG (Kodak RVG 6100) and CBCT (Kodak 9000 3D) units. Root apices were then shaved and the apical constrictions viewed under magnification to measure the control working lengths. The paired t-test showed a statistically significant difference between the CBCT EPWL and the control length, but the difference was too small to be clinically significant. From the Bland-Altman analysis, the CBCT method had the widest 95% limits of agreement, reflecting its greater potential for error. In measuring file lengths, RVG had wider 95% limits of agreement than CBCT. Conclusions: (1) The clinically insignificant underestimation of the preoperative working length using small FOV CBCT shows that it is acceptable for use in the estimation of preoperative working length. (2) Small FOV CBCT may be used in working length determination, but it is not as accurate as the currently practiced method of using the EAL. (3) CBCT is also more accurate than RVG in measuring file lengths.
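For context, the 95% limits of agreement in a Bland-Altman analysis are computed from the paired differences between two measurement methods in the standard way (generic formula, not specific to this study):
\[
\mathrm{LoA}_{95\%} = \bar{d} \pm 1.96\, s_d,
\]
where \(\bar{d}\) is the mean of the differences and \(s_d\) their standard deviation; a wider interval therefore reflects a greater potential for measurement error.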
Abstract: Although behavior problems in pre-school children might be considered a transitional issue that may disappear by the time they enter elementary school, the topic requires considerable attention because behavioral patterns are adopted by children especially at this age. A common issue in addressing behavior problems in pre-school children is the difference in perception of the importance and severity of the symptoms. Parents' underestimation of children's problems often results in conflicts with kindergarten teachers. As a result, the child does not get the support that his/her problems require, which can lead to school failure and negatively influence his/her future school performance and success. The research sample consisted of 4 children with behavior problems, their teachers and parents. To determine the most problematic areas in each child's behavior, the Child Behavior Checklist (CBCL) filled in by parents and the Caregiver-Teacher Report Form (C-TRF) filled in by teachers were used. Scores from the CBCL and the C-TRF were compared using the Pearson correlation coefficient in order to identify differences in the perception of behavior problems in pre-school children.
Abstract: We present in this work our model of road traffic
emissions (line sources) and dispersion of these emissions, named
DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission
Model). In its emission part, this model was designed to keep the bottom-up and top-down approaches consistent. It also allows emission inventories to be generated from a reduced set of input parameters, adapted to the conditions in Morocco and other developing countries. Although several simplifications are made, the performance of the model is preserved. A further important advantage of the model is that it allows the emission-rate uncertainty to be calculated with respect to each of the input parameters. In the
dispersion part of the model, an improved line source model has
been developed, implemented and tested against a reference solution.
It provides an improvement in accuracy over previous line-source Gaussian plume formulas, without being too demanding in terms
of computational resources. In the case study presented here, the
biggest errors were associated with the ends of line source sections;
these errors are canceled by adjacent line-source sections during the simulation of a road network. In cases where the wind is parallel to the line source, combining the discretized-source and the analytical line-source formulas markedly reduces the error. Because this combination is applied only for a small number
of wind directions, it should not excessively increase the calculation
time.
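For orientation, the baseline against which improved line-source formulas are usually judged is the Gaussian plume solution integrated along the line (generic textbook form, not the authors' improved formula):
\[
C(x,y,z) = \int_{\text{line}} \frac{q}{2\pi u \sigma_y \sigma_z}
\exp\!\left(-\frac{(y-y_s)^2}{2\sigma_y^2}\right)
\left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)+\exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]\mathrm{d}l,
\]
where \(q\) is the emission rate per unit length, \(u\) the wind speed, \(H\) the source height, and \(\sigma_y, \sigma_z\) the dispersion parameters; the discretized-source approach replaces the integral with a sum over point sources.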
Abstract: The traditional competitive newsvendor game assumes that decision makers are rational. However, people exhibit behavioral biases when making decisions, such as loss aversion, mental accounting and overconfidence. Overestimation of a subject's own performance is one type of overconfidence. The objective of this research is to analyze the impact of overestimated demand in the competitive newsvendor game with two players. This study builds a competitive newsvendor game model in which newsvendors have private information on their demands, which they overestimate. At the same time, demand forecasts for each newsvendor produced by a third-party institution are available. This research shows that overestimation leads to a demand-stealing effect, which reduces the competitor's order quantity. However, the overall supply of the product increases due to overestimation. The study identifies the boundary condition under which the overestimating newsvendor sees its equilibrium order drop due to the demand-stealing effect from the other newsvendor. A newsvendor with a higher critical fractile will see its equilibrium order decrease as the other newsvendor's estimation level drops.
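For reference, the critical fractile mentioned above is the standard newsvendor quantity (generic notation; the paper's competitive equilibrium modifies this through the demand-stealing effect):
\[
q^{*} = F^{-1}\!\left(\frac{p-c}{p-v}\right),
\]
where \(p\) is the selling price, \(c\) the unit cost, \(v\) the salvage value, and \(F\) the demand distribution, so \((p-c)/(p-v)\) is the critical fractile.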
Abstract: Relative permeabilities are practical factors used to correct the single-phase Darcy's law for application to multiphase flow. For effective characterisation of large-scale multiphase flow in hydrocarbon recovery, relative permeability and capillary pressure are used. These parameters are acquired via special core flooding experiments. The special core analysis (SCAL) module of reservoir simulation is applied by engineers to evaluate these parameters. However, core flooding experiments on shale core samples are expensive and time consuming before various flow assumptions, for instance Darcy's law, are satisfied. This makes core flooding simulations imperative, since they allow various analyses of the relative permeabilities and capillary pressures of multiphase flow to be carried out efficiently and effectively at a relatively fast pace. This paper presents a Sendra software simulation of core flooding to obtain relative permeabilities and capillary pressures using different correlations. The approach used in this study involved three steps. First, basic petrophysical parameters of the Marcellus shale sample, such as porosity, were determined using laboratory techniques. Second, core flooding was simulated for a particular injection scenario using different correlations. Third, the correlations that best fit the estimated relative permeability and capillary pressure were identified. This research approach saves cost and time and is very reliable for the computation of relative permeability and capillary pressure at steady or unsteady state, and for drainage or imbibition processes, in the oil and gas industry when compared to other methods.
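As an illustration of the kind of correlation fitted in such simulations (the abstract does not list the specific correlations used with Sendra), a Corey-type water-oil relative permeability model takes the form
\[
k_{rw} = k_{rw}^{0}\left(\frac{S_w - S_{wi}}{1 - S_{wi} - S_{or}}\right)^{n_w},\qquad
k_{ro} = k_{ro}^{0}\left(\frac{1 - S_w - S_{or}}{1 - S_{wi} - S_{or}}\right)^{n_o},
\]
where \(S_{wi}\) is the irreducible water saturation, \(S_{or}\) the residual oil saturation, and \(n_w, n_o\) the Corey exponents fitted to the flooding data.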
Abstract: Compliant foil gas-lubricated bearings are used to support light loads, of the order of a few kilograms, at high speeds, of the order of 50,000 RPM. The stiffness of the foil bearings depends
both on the stiffness of the compliant foil and on the lubricating
gas film. The stiffness of the bearings plays a crucial role in the
stable operation of the supported rotor over a range of speeds. This
paper describes a numerical approach to estimate the stiffness of the bearings using a pseudo-spectral scheme. A methodology to obtain the stiffness of the foil bearing as a function of the shaft weight is given, and the results are presented.
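As a sketch of how a static stiffness can be extracted from such a numerical solution (the abstract does not spell out the exact procedure), the bearing stiffness is often approximated by a central difference of the computed load capacity \(W\) with respect to the journal eccentricity \(e\):
\[
K \approx \frac{W(e+\Delta e) - W(e-\Delta e)}{2\,\Delta e},
\]
so that, for a given shaft weight, the equilibrium eccentricity and the corresponding stiffness follow from repeated solutions of the lubrication equations.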
Abstract: In this paper, a brief review of the corrosion mechanisms and failure modes of buried pipes is provided together with the available corrosion models. Moreover, a sensitivity analysis is performed to understand the influence of the corrosion model parameters on the remaining life estimation. Further, a probabilistic analysis is performed to propagate the uncertainty in the corrosion model to the estimate of the remaining life of the pipe. Finally, a comparison among the corrosion models on the basis of the remaining life estimation is provided to improve the renewal plan.
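A widely used growth law of the kind reviewed here (named only as "corrosion models" in the abstract) is the power-law pit depth model, from which a remaining-life estimate follows by solving for the time at which the depth reaches an allowable limit:
\[
d(t) = k\,t^{n}, \qquad t_{\text{fail}} = \left(\frac{d_{\text{allow}}}{k}\right)^{1/n},
\]
where \(k\) and \(n\) are soil- and material-dependent parameters and \(d_{\text{allow}}\) is the maximum allowable metal loss; the remaining life is \(t_{\text{fail}}\) minus the current pipe age.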
Abstract: In this paper, we propose a method that estimates a user's position based on an image database built from a single camera. Previous positioning approaches calculate distance from the arrival time of signals, as in GPS (Global Positioning System) and RF (Radio Frequency) systems. However, these methods have a weakness: they have a large error range due to signal interference. One solution is to estimate position with a camera sensor, but a single camera has difficulty obtaining relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data. First, in this research we build an image database, using a single camera, of the space in which the positioning service is to be provided. Next, we judge similarity by matching the image transmitted by the user against the database images. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we ran experiments in real environments, both indoor and outdoor. The proposed method has a wide positioning range and can determine not only the user's position but also the user's direction.
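A minimal sketch of the image-matching step (database image vs. user image) is shown below; it uses ORB features with OpenCV under assumed file paths and thresholds, and is an illustration rather than the authors' implementation.

```python
# Hypothetical sketch: pick the database image most similar to the user's query image.
# Assumes a folder of database images whose filenames encode the capture position.
import os
import cv2

def match_score(img_a, img_b, orb, matcher, ratio=0.75):
    """Number of good ORB matches between two grayscale images (Lowe ratio test)."""
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0
    matches = matcher.knnMatch(des_a, des_b, k=2)
    return sum(1 for pair in matches
               if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance)

def estimate_position(query_path, db_dir):
    """Return the name of the most similar database image and its match score."""
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    best_name, best_score = None, -1
    for name in os.listdir(db_dir):
        db_img = cv2.imread(os.path.join(db_dir, name), cv2.IMREAD_GRAYSCALE)
        if db_img is None:
            continue
        score = match_score(query, db_img, orb, matcher)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score  # position/direction tag assumed encoded in the filename

# Example (hypothetical paths):
# print(estimate_position("user_photo.jpg", "image_database/"))
```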
Abstract: This paper describes a method for AWGN (Additive White Gaussian Noise) variance estimation in noisy stochastic signals, referred to as Multiplicative-Noising Variance Estimation (MNVE). The aim was to develop an estimation algorithm with a minimal number of assumptions about the original signal structure. A MATLAB simulation and analysis of the results of the method applied to speech signals showed higher accuracy than the standard AR (autoregressive) modeling noise estimation technique. In addition, strong performance was observed at very low signal-to-noise ratios, which in general represent the worst-case scenario for signal denoising methods. A high execution time appears to be the only disadvantage of MNVE. After close examination of all the observed features of the proposed algorithm, it was concluded that it is worth exploring further and that, with some adjustments and improvements, it can become notably powerful.
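A small sketch of the evaluation setup described above, contaminating a clean signal with AWGN at a prescribed SNR so that an estimated noise variance can be compared with the true one, is given below (illustrative only; the MNVE algorithm itself is not reproduced here).

```python
# Illustrative evaluation harness: add AWGN at a target SNR and report the true noise variance.
import numpy as np

def add_awgn(signal, snr_db, rng=None):
    """Return (noisy_signal, true_noise_variance) for a target SNR in dB."""
    rng = np.random.default_rng() if rng is None else rng
    signal_power = np.mean(signal ** 2)
    noise_var = signal_power / (10 ** (snr_db / 10.0))
    noise = rng.normal(0.0, np.sqrt(noise_var), size=signal.shape)
    return signal + noise, noise_var

# Example with a synthetic speech-like signal at 0 dB SNR (a worst-case-style scenario):
t = np.linspace(0, 1, 8000, endpoint=False)
clean = np.sin(2 * np.pi * 220 * t) * np.hanning(t.size)
noisy, true_var = add_awgn(clean, snr_db=0)
# A variance estimator (e.g., MNVE or an AR-based method) would then be judged
# by how closely its output matches true_var.
```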
Abstract: Oil palm, or Elaeis guineensis, is considered the golden crop in Malaysia. However, the oil palm industry in this country now faces a most devastating disease, Ganoderma basal stem rot. The objective of this paper is to analyze the economic loss due to this disease. Three commercial oil palm sites were selected for collecting the data required for the economic analysis. The yield parameter used to measure the loss was the total weight of fresh fruit bunches over six months. The predictors included disease severity, change in disease severity, number of infected neighboring palms, palm age, planting generation, topography, and first-order interaction variables. The yield loss estimation model was identified using a backward elimination regression method. Diagnostic checking was conducted on the residuals of the best yield loss model. The mean absolute percentage error (MAPE) was used to measure the forecast performance of the model. The best yield loss model was then used to estimate the economic loss using the current monthly price of fresh fruit bunches at the mill gate.
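For reference, the forecast-performance measure cited above is the standard mean absolute percentage error,
\[
\mathrm{MAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_i - \hat{y}_i}{y_i}\right|,
\]
where \(y_i\) is the observed yield and \(\hat{y}_i\) the yield predicted by the loss model.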
Abstract: Information system risk management helps to reduce
or eliminate risk by implementing appropriate controls. In this paper,
we propose a model that quantifies the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we defined three equations based on the type and maturity of controls. For testing, the values obtained with the model were compared with the estimates given by practitioners during several working sessions, and the results are satisfactory. This model allows an optimal assessment of control maturity and facilitates the risk analysis of information systems.
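The three equations themselves are not reproduced in the abstract. As a generic reminder of what FMECA residual criticality quantifies (an illustration only, not the paper's equations), the initial criticality is commonly the product of severity, occurrence, and detection scores, and a control lowers it in proportion to its effectiveness:
\[
C_{\text{init}} = S \times O \times D, \qquad
C_{\text{res}} = C_{\text{init}}\,\bigl(1 - E\bigr),
\]
where \(E \in [0,1]\) is a control-effectiveness factor; in the proposed model this effectiveness depends on the type and maturity of the control.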
Abstract: Risk management in the banking sector is a key issue linked to financial system stability, and its importance has been heightened by technological developments and the emergence of new financial instruments. In this paper, we improve the model previously defined for quantifying the impact of internal controls on banking risks by automating the residual criticality estimation step of FMECA. For this, we defined three equations and a maturity coefficient to obtain a mathematical model, which is tested on all banking processes and risk types. The new model allows an optimal assessment of residual criticality and improves the correlation rate, which reaches 98%.
Abstract: This paper presents a multiscale information measure of
Electroencephalogram (EEG) for analysis with a short data length.
A multiscale extension of permutation entropy (MPE) is capable of
fully reflecting the dynamical characteristics of EEG across different
temporal scales. However, MPE yields an imprecise estimation due to the coarse-graining procedure at large scales. We present an improved MPE measure to estimate entropy more accurately from a short time series. Computing the entropies of all coarse-grained time series at each scale and averaging them leads to the modified MPE (MMPE), which provides enhanced accuracy compared to MPE. Simulation and experimental studies confirmed that MMPE outperforms MPE in terms of accuracy.
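A compact sketch of the modified multiscale permutation entropy described above, computing the permutation entropy of every coarse-grained series at a scale (one per starting offset) and averaging, is given below; parameter names and defaults are illustrative.

```python
# Illustrative modified multiscale permutation entropy (MMPE):
# at each scale, average the permutation entropies of all coarse-grained series
# obtained from the different starting offsets.
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D series."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    if n_patterns < 1:
        return np.nan
    counts = {}
    for i in range(n_patterns):
        pattern = tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n_patterns
    pe = -np.sum(probs * np.log(probs))
    return pe / log(factorial(order))  # normalize to [0, 1]

def coarse_grain(x, scale, offset=0):
    """Non-overlapping averages of length `scale`, starting at `offset`."""
    x = np.asarray(x, dtype=float)[offset:]
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def mmpe(x, scales=range(1, 6), order=3, delay=1):
    """Modified MPE: average PE over all offsets 0..scale-1 at each scale."""
    return [np.nanmean([permutation_entropy(coarse_grain(x, s, k), order, delay)
                        for k in range(s)]) for s in scales]

# Example on white noise (entropy should stay high across scales):
# print(mmpe(np.random.default_rng(0).standard_normal(2000)))
```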
Abstract: In the standards IEC 60076-2 and IEC 60076-7, three different hot-spot temperature estimation methods are suggested. In this study, the algorithms used in hot-spot temperature calculations are analyzed by comparing them with the results of an experimental set-up built around a Transformer Monitoring System (TMS) in service. In the tested system, the TMS uses only the top-oil temperature and the load ratio for the hot-spot temperature calculation; it also uses some constants from the standards' agreed statement tables. During the tests, it emerged that the hot-spot temperature calculation method performs only a simple calculation and does not use all the other significant variables that could affect the hot-spot temperature.
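For context, the steady-state relation in IEC 60076-7 that monitoring systems of this kind typically implement computes the hot-spot temperature from the measured top-oil temperature and the load factor (standard notation; the exact variant used by the tested TMS is not detailed in the abstract):
\[
\theta_h = \theta_o + H\, g_r\, K^{y},
\]
where \(\theta_o\) is the top-oil temperature, \(H\) the hot-spot factor, \(g_r\) the rated winding-to-oil gradient, \(K\) the load factor, and \(y\) the winding exponent, with \(H\), \(g_r\), and \(y\) taken from the agreed statement tables.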
Abstract: Numerous signal processing based speech enhancement systems have been proposed to improve intelligibility in the presence of noise. Traditionally, studies of neural vowel encoding have focused on the representation of formants (peaks in vowel spectra) in the discharge patterns of the population of auditory-nerve (AN) fibers. A method is presented for mapping high-frequency speech components into a low-frequency region to increase audibility for listeners with hearing loss. The purpose of the paper is to enhance the formants of speech based on the Kaiser window. The pitch and formants of the signal are estimated using the autocorrelation, zero-crossing and magnitude difference functions. The formant enhancement stage aims to restore the representation of formants at the level of the midbrain. MATLAB software is used for the implementation, and a low-complexity system is developed.
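A minimal sketch of the autocorrelation-based pitch estimation step mentioned above is given below (illustrative parameter values; the Kaiser-window formant-enhancement stage is not reproduced).

```python
# Illustrative autocorrelation pitch estimator for a voiced speech frame.
import numpy as np

def pitch_autocorr(frame, fs, fmin=60.0, fmax=400.0):
    """Estimate pitch (Hz) of one frame via the autocorrelation peak."""
    frame = np.asarray(frame, dtype=float)
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(fs / fmax)                    # shortest period considered
    lag_max = min(int(fs / fmin), len(ac) - 1)  # longest period considered
    if lag_max <= lag_min:
        return None
    best_lag = lag_min + int(np.argmax(ac[lag_min:lag_max + 1]))
    return fs / best_lag

# Example: a 30 ms frame of a 150 Hz synthetic vowel-like tone sampled at 8 kHz.
fs = 8000
t = np.arange(int(0.03 * fs)) / fs
frame = np.sin(2 * np.pi * 150 * t) + 0.3 * np.sin(2 * np.pi * 300 * t)
# print(pitch_autocorr(frame, fs))   # approximately 150 Hz
```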
Abstract: Up to this point business process management projects
in general and business process modelling projects in particular
could not rely on a practical and scientifically validated method to
estimate cost and effort. In particular, the model development phase is not covered by any cost estimation method or model. Further
phases of business process modelling starting with implementation
are covered by initial solutions which are discussed in the literature.
This article proposes a way of filling this gap by deriving a cost estimation method from available methods in similar domains, namely software development and software engineering. As we show, software development is closely similar to process modelling. After presenting this method, different ideas for its further analysis and validation are proposed. We derive this method from
COCOMO II and Function Point, which are established methods of effort estimation in the domain of software development. For this, we lay out the similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
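For reference, the COCOMO II post-architecture effort equation from which the proposed method is derived has the form (standard COCOMO II notation; the adaptation to process modelling is the article's contribution):
\[
\mathrm{PM} = A \times \mathrm{Size}^{E} \times \prod_{i} \mathrm{EM}_i, \qquad
E = B + 0.01 \sum_{j=1}^{5} \mathrm{SF}_j,
\]
where Size is measured in KSLOC (or converted from Function Points), \(\mathrm{EM}_i\) are effort multipliers, \(\mathrm{SF}_j\) are scale factors, and \(A \approx 2.94\), \(B \approx 0.91\) are the COCOMO II.2000 calibration constants.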
Abstract: The estimation of gear tooth stiffness is important for finding the load distribution between the gear teeth when two consecutive sets of teeth are in contact. Based on a dynamic model, a C program has been developed to compute mesh stiffness. Using this program, the position-dependent mesh stiffness of spur gear teeth for various profile shifts has been computed for a fixed center distance and altered tooth-sum gearing (a tooth sum of 100 altered by ±4%). The C program based on the dynamic model is found to be a rapid soft-computing technique that helps in the design of gears. The mesh tooth stiffness along the path of contact is studied for both 20° and 25° pressure angle gears at various profile shifts. Better tooth stiffness is observed for negatively altered tooth-sum gears than for standard and positively altered tooth-sum gears. Also, for negatively altered tooth-sum gearing, better mesh stiffness is observed at a 20° pressure angle than at 25°.
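As a reminder of how the computed mesh stiffness enters the load distribution (generic relation, not the paper's C program), when two tooth pairs are simultaneously in contact and share a common deflection, the transmitted load divides in proportion to the individual pair stiffnesses:
\[
F_1 = \frac{k_1}{k_1 + k_2}\,F, \qquad F_2 = \frac{k_2}{k_1 + k_2}\,F,
\]
where \(k_1, k_2\) are the position-dependent stiffnesses of the two engaged tooth pairs and \(F\) is the total normal load.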
Abstract: Reducing the road congestion inherent in vehicle use is an obvious priority for public authorities. Therefore, assessing an individual's willingness to pay to save trip time is akin to estimating the price change resulting from a new transport policy intended to increase network fluidity and improve social welfare. This study takes an innovative perspective: it initiates an economic calculation aimed at estimating the monetized value of time for trips made in Sfax. This research follows a multi-objective approach. The aims of this study are to (i) estimate the monetized value of an hour devoted to trips, (ii) determine whether or not consumers consider the environmental variables significant, and (iii) analyze the impact of public management of congestion through the taxation of city tolls imposed on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the “declared time preferences” of 450 drivers during rush hours. Taking careful account of the biases attributed to the applied method, we highlight the sensitivity of this approach with regard to the revelation mode and the questioning techniques, following the NOAA panel recommendations, with the exception of the valuation point, and other similar studies on the estimation of transport externalities.
Abstract: This paper reviews a number of theoretical aspects
for implementing an explicit spatial perspective in econometrics
for modelling non-continuous data, in general, and count data, in
particular. It provides an overview of the several spatial econometric approaches available to model data collected with reference to location in space, from classical spatial econometrics approaches to recent developments in spatial econometrics for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial
econometrics literature as well as from hierarchical modeling and
analysis of spatial data, in order to look for new possible directions
on the processing of count data, in a spatial hierarchical Bayesian
econometric context.
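As one illustration of the kind of specification covered by this review (generic notation; the review discusses several variants), a hierarchical Bayesian count model with a spatial lag in the latent linear predictor can be written as
\[
y_i \mid \mu_i \sim \mathrm{Poisson}(\mu_i), \qquad
\log \boldsymbol{\mu} = \rho\, W \log \boldsymbol{\mu} + X\boldsymbol{\beta} + \boldsymbol{\varepsilon},
\qquad \boldsymbol{\varepsilon} \sim N(0, \sigma^2 I),
\]
where \(W\) is the spatial weights matrix, \(\rho\) the spatial autoregressive parameter, and priors on \(\rho\), \(\boldsymbol{\beta}\), and \(\sigma^2\) complete the hierarchy.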