Abstract: Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the out-of-control condition, because the change point in the process parameter(s) is usually different from the signal time. It is very important for the manufacturer to determine at what point, and through which parameters, the change that caused the signal occurred. Early warning of process change expedites the search for special causes and enhances quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known mean vector and variance-covariance matrix; the process means after a single step change are assumed to remain at the new level until the special cause is identified and removed, and only one variable is assumed to change at a time. This research applies an artificial neural network (ANN) to identify the time the change occurred and the parameter that caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively, and equally well, across the whole range of shift magnitudes considered.
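The paper's ANN estimator is not reproduced here; as a minimal sketch of the underlying setup, the snippet below simulates a single step change in one variable of a bivariate normal process with known parameters and estimates the change point with the classical maximum-likelihood change-point estimator (a stand-in for the ANN; all numbers are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical in-control process: bivariate normal with known parameters.
mu = np.zeros(2)
cov = np.array([[1.0, 0.3], [0.3, 1.0]])

n, tau, delta = 100, 60, 1.5          # sample size, true change point, shift size
x = rng.multivariate_normal(mu, cov, size=n)
x[tau:, 0] += delta                   # single step change in variable 0 only

# ML change-point estimate for a known-parameter step-change model:
# maximize the scaled Mahalanobis norm of the post-change sample mean.
scores = []
for t in range(1, n):
    m = x[t:].mean(axis=0) - mu       # post-change mean deviation if change at t
    k = n - t
    scores.append(k * m @ np.linalg.inv(cov) @ m)
tau_hat = 1 + int(np.argmax(scores))
print(tau_hat)  # close to the true change point tau = 60
```

The same simulated data could instead be fed to a trained network, as the paper does; the likelihood score above merely illustrates the estimation problem.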
Abstract: In this paper, parallelism in the solution of Ordinary Differential Equations (ODEs) to increase the computational speed is studied. The focus is the development of a parallel algorithm for the two-point Block Backward Differentiation Formulas (PBBDF) that can take advantage of the parallel architecture of modern computers. Parallelism is obtained by using the Message Passing Interface (MPI). Numerical results are given to validate the efficiency of the parallel PBBDF implementation compared to the sequential implementation.
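The two-point PBBDF and its MPI implementation are not reproduced here; as a minimal sequential sketch of the backward-differentiation family it builds on, the snippet below applies the classical two-step BDF2 formula to the linear test problem y' = -y (illustrative parameters; the implicit step is solved in closed form because the problem is linear).

```python
import math

lam, h, N = 1.0, 0.01, 1000          # illustrative decay rate, step size, steps
y = [1.0, math.exp(-h)]              # exact start-up value for the second point
for n in range(2, N + 1):
    # BDF2: y_n - (4/3) y_{n-1} + (1/3) y_{n-2} = (2/3) h f(t_n, y_n);
    # for f = -lam * y the implicit equation is linear and solved directly.
    y.append(((4.0 / 3.0) * y[-1] - (1.0 / 3.0) * y[-2])
             / (1.0 + (2.0 / 3.0) * h * lam))
err = abs(y[N] - math.exp(-h * N))
print(err)  # global error is O(h^2) for this second-order method
```

A block method such as PBBDF computes two such solution points simultaneously per step, which is what exposes the parallelism exploited via MPI.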
Abstract: In this paper, based on the estimation of the Cauchy matrix of linear impulsive differential equations, by using Banach's fixed point theorem and the Gronwall-Bellman inequality, some sufficient conditions are obtained for the existence and exponential stability of an almost periodic solution for Cohen-Grossberg shunting inhibitory cellular neural networks (SICNNs) with continuously distributed delays and impulses. An example is given to illustrate the main results.
Abstract: Palladium-catalyzed hydrodechlorination is a
promising alternative for the treatment of environmentally relevant
water bodies, such as groundwater, contaminated with chlorinated
organic compounds (COCs). In the aqueous phase
hydrodechlorination of COCs, Pd-based catalysts were found to have
a very high catalytic activity. However, the full utilization of the catalyst's potential is impeded by the sensitivity of the catalyst to poisoning and deactivation induced by reduced sulfur compounds (e.g. sulfides). Several regenerants have previously been tested to recover the performance of sulfide-fouled Pd catalysts, but these delivered only partial success with respect to re-establishment of the catalyst activity. In this study, the deactivation behaviour of
Pd/Al2O3 in the presence of sulfide was investigated. Subsequent to
total deactivation the catalyst was regenerated in the aqueous phase
using potassium permanganate. Under neutral pH conditions, oxidative regeneration with permanganate delivered only a slow recovery of catalyst activity. However, changing the pH of the bulk solution to acidic resulted in the complete recovery of catalyst activity within a
regeneration time of about half an hour. These findings suggest the
superiority of permanganate as regenerant in re-activating Pd/Al2O3
by oxidizing Pd-bound sulfide.
Abstract: The quest for alternate fuels for a CI engine has become all the more imperative considering its importance in the economy of a nation and from the standpoint of preserving the environment. Reported in this paper are the combustion performance and P-θ characteristics of a CI engine operating on B20 biodiesel fuel derived from Jatropha oil. It is observed that the twin effect of advancing the injection timing and increasing the injector opening pressure (IOP) up to 220 bar has resulted in minimum brake specific energy consumption and higher peak pressure. It is also observed that the crank angle of occurrence of peak pressure progresses towards top dead center (TDC) as the timing is advanced and the IOP is increased.
Abstract: In this paper, to optimize the “Characteristic Straight Line Method", which is used in soil displacement analysis, a “best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of “height" itself, is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed by using the “Characteristic Straight Line Method". Its characteristic components have been defined and constructed from a “best estimate" of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available. Observational procedures have also been designed to provide the most effective method to acquire data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip which results from temperature changes; the rod scale correction ensures a uniform scale which conforms to the international length standard; and the concept of height systems is introduced, whereby all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The “Characteristic Straight Line Method" is slightly more convenient than the “Characteristic Circle Method": it permits the evaluation of displacements of very small, even infinitesimal, magnitude.
The inclination of the landslide is given by the inverse of the distance from the reference point O to the “Characteristic Straight Line". Its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A “best estimate" of the topometric observations was used to measure the elevation of carefully selected points, before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test using an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
Abstract: The development of signal compression algorithms is making impressive progress. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. The following article proposes the compression of the Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on the decomposition of the original signal by means of analysis filters, which is followed by the compression stage, and, on the other hand, on the application of linear prediction of order 5 to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which will be coded and transmitted. The decoding operation is then used to reconstitute the original signal. Thus, an adequate choice of the filter bank used in the transform is necessary to increase the compression rate while inducing a distortion that is imperceptible from an auditory point of view.
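Not the paper's exact codec; as a minimal sketch under stated assumptions (a one-level Haar analysis stage standing in for the wavelet filter bank, an order-5 predictor fitted to the approximation band, and a synthetic test signal instead of real speech), the snippet below computes the low-energy prediction residual that would be coded and transmitted.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical test signal standing in for a speech frame (not real speech).
t = np.arange(256)
x = np.sin(2 * np.pi * t / 32) + 0.05 * rng.standard_normal(256)

# One-level Haar analysis filter bank: approximation and detail bands.
approx = (x[0::2] + x[1::2]) / np.sqrt(2)
detail = (x[0::2] - x[1::2]) / np.sqrt(2)

# Order-5 linear prediction on the approximation band: solve the
# Yule-Walker normal equations built from the empirical autocorrelation.
p = 5
r = np.correlate(approx, approx, mode="full")[len(approx) - 1:]
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
a = np.linalg.solve(R, r[1:p + 1])

# Prediction residual: the low-energy signal that would be coded/transmitted.
pred = np.convolve(approx, np.concatenate(([0.0], a)))[:len(approx)]
resid = approx - pred
print(resid[p:].var() / approx.var())  # residual energy is a small fraction
```

The decoder would run the predictor in reverse and apply the synthesis filter bank to reconstitute the signal.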
Abstract: The paper focuses on the enhanced stiffness modeling of robotic manipulators, taking into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example that deals with the stiffness analysis of a parallel manipulator of the Orthoglide family.
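As a point of reference, the conventional unloaded small-displacement stiffness mapping that the abstract contrasts against can be sketched for a hypothetical planar 2R arm (all link lengths, joint angles, and spring stiffnesses below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Conventional (unloaded, small-displacement) mapping: K_x = J^{-T} K_theta J^{-1}.
l1 = l2 = 1.0                          # assumed link lengths [m]
q1, q2 = 0.3, 0.8                      # assumed joint angles [rad]
k1, k2 = 1000.0, 800.0                 # assumed virtual-spring stiffnesses [Nm/rad]

# Jacobian of the planar 2R arm end-point position w.r.t. the joint angles.
J = np.array([
    [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
    [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
])
K_theta = np.diag([k1, k2])

Jinv = np.linalg.inv(J)
K_x = Jinv.T @ K_theta @ Jinv          # Cartesian stiffness at the end point
print(np.allclose(K_x, K_x.T))         # True: Cartesian stiffness is symmetric
```

The paper's enhanced formulation departs from this by letting the external load change the posture, and hence the Jacobian, before the mapping is evaluated.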
Abstract: This research's objective is to select the most accurate model, using a neural network technique, as a way to filter potential students who enroll in the IT course by electronic learning at Suan Suanadha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The result showed that the most accurate model was the one validated with 100-fold cross-validation, which reached an accuracy of 73.58%.
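The model selection procedure named in the abstract relies on k-fold cross-validation; a minimal pure-Python sketch of the fold-splitting step (illustrative, with k = 5 rather than the 100 folds used in the study):

```python
# k-fold cross-validation index splitting: each sample appears in exactly
# one test fold, and the remaining samples form the training set.
def kfold(n, k):
    """Yield (train_indices, test_indices) pairs for k-fold CV over n samples."""
    idx = list(range(n))
    fold = n // k
    for i in range(k):
        test = idx[i * fold:(i + 1) * fold] if i < k - 1 else idx[i * fold:]
        keep = set(test)
        train = [j for j in idx if j not in keep]
        yield train, test

folds = list(kfold(10, 5))
print(len(folds))  # 5 folds of train/test index pairs
```

The candidate model is trained on each training split and scored on the held-out fold; the average score is the cross-validated accuracy reported above.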
Abstract: Solidification cracking and hydrogen cracking are defects generated in the fusion welding of ultrahigh carbon steels. However, friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate such problems encountered in traditional welding. FSW involves different process parameters that must be carefully defined prior to processing. These parameters include, but are not restricted to, tool feed, tool RPM, tool geometry, and tool tilt angle. These parameters are a key factor in avoiding wormholes and voids behind the tool and in achieving a defect-free weld. More importantly, these parameters directly affect the microstructure of the weld and hence its final mechanical properties. To this end, a 3D finite element (FE) thermo-mechanical model was developed using DEFORM 3D to simulate FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stresses, and strain rates is tracked. Typical results include the ability to simulate the different weld zones. The simulation predictions were successfully compared to experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters to obtain a desirable defect-free weld with better mechanical properties.
Abstract: MBMS is a unidirectional point-to-multipoint bearer
service in which data are transmitted from a single source entity to
multiple recipients. For a mobile to support the MBMS, MBMS client
functions as well as MBMS radio protocols should be designed and
implemented. In this paper, we analyze the MBMS client functions
and describe their implementation in our mobile test-bed. User
operations and signaling flows between protocol entities to control the
MBMS functions are designed in detail. Service announcement
utilizing the file download MBMS service and four MBMS user
services are demonstrated in the test-bed to verify the MBMS client
functions.
Abstract: Effectiveness and efficiency of food distribution are necessary to maintain food security in a region. Food supply varies among regions depending on their production capacity; therefore, it is necessary to regulate food distribution. Sea transportation can play a great role in the food distribution system. To play this role and to support transportation needs in Eastern Indonesia, sea transportation must be supported by a fleet that is adequate and reliable, both in terms of load capacity and seaworthiness. This research uses the Linear Programming (LP) method to analyze the food distribution pattern in order to determine the optimal distribution system. In this research, transshipment points have been selected for regions in one province. A comparison between the modeling result and the existing shipping routes reveals that, of the 369 existing routes, 54 routes are used for transporting rice, corn, green bean, peanut, soybean, sweet potato, and cassava.
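Not the paper's model or data; as a minimal transportation-LP sketch in the spirit of the abstract, the snippet below ships food from two hypothetical supply regions to three demand regions at minimum cost using `scipy.optimize.linprog` (all costs, supplies, and demands are illustrative).

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 2 supply regions, 3 demand regions, cost per ton per route.
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])
supply = [30.0, 40.0]
demand = [20.0, 25.0, 25.0]            # total demand equals total supply

c = cost.ravel()                       # decision variables: tons on each route
A_eq, b_eq = [], []
for i in range(2):                     # each region ships out exactly its supply
    row = np.zeros(6); row[i * 3:(i + 1) * 3] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(3):                     # each destination receives its demand
    row = np.zeros(6); row[j::3] = 1
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.fun)  # minimum total shipping cost
```

The paper's formulation additionally routes flows through selected transshipment points, which adds intermediate balance constraints of the same kind.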
Abstract: This paper deals with a novel technique for the fabrication of spiral grooves in a dynamic thrust bearing. The main scheme proposed in this paper is to fabricate the microgrooves using a desktop forming system. This process has an advantage over conventional electro-chemical machining from the viewpoint of higher productivity. For this reason, a new testing apparatus was designed and built for press-forming microgrooves on the surface of the thrust bearing. The material used in this study is a sintered Cu-Fe alloy. The effects of the forming load on the performance of micro press forming are experimentally investigated. The experimental results show that the formed depths approach the target values as the forming load increases.
Abstract: Compensating physiological motion in the context of minimally invasive cardiac surgery has become an attractive issue, since it outperforms traditional cardiac procedures, offering remarkable benefits. Owing to space restrictions, computer vision techniques have proven to be the most practical and suitable solution. However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency via exploration of the classes of ℓ1- and ℓ2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart using intensity information. The results point to the ℓ1-regularized optimization class as the best, since it offered the shortest computational cost and the smallest average error, and it proved to work even under complex deformations.
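As a minimal illustration of why an ℓ1 penalty behaves differently from an ℓ2 penalty in such optimization (not the paper's solver), the scalar proximal operators of the two regularizers can be sketched:

```python
# Proximal operators of the two penalties, in one dimension.
def prox_l1(v, lam):
    """Soft-thresholding: argmin_x 0.5*(x - v)**2 + lam*abs(x)."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def prox_l2(v, lam):
    """Shrinkage: argmin_x 0.5*(x - v)**2 + 0.5*lam*x**2."""
    return v / (1.0 + lam)

print(prox_l1(0.3, 0.5), prox_l1(2.0, 0.5))  # 0.0 1.5 -- l1 zeroes small values
print(prox_l2(0.3, 0.5), prox_l2(2.0, 0.5))  # l2 only scales values down
```

The sparsity induced by the ℓ1 operator (exact zeros for small components) is what typically makes ℓ1-regularized tracking both cheaper and more robust to outliers.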
Abstract: Fermented beverages hold a large share of the beverage market in general, and are increasingly valued when the characteristic aroma and flavor of the raw material that gave rise to them are kept after processing. This study aimed to develop a distilled beverage from passion fruit and to assess, by sensory tests and chromatographic profile, the influence of different treatments (FM1, spirit with pulp addition, and FM2, spirit with a bigger ratio of pulp in the must) on the retention of volatiles in the fruit drink, and to perform a chemical characterization taking into account the main quality parameters established by the legislation. The chromatograms and the first sensory tests indicated that sample FM1 possesses better aroma characteristics, from both the quantitative and the qualitative point of view. However, the final sensory analysis (preference test) revealed the greatest preference of the tasting panel for sample FM2-2 (score 7.93), with color and flavor being the decisive attributes in this response, confirmed by the lowest observed values of fixed and total acidity in the samples of treatment FM2.
Abstract: This paper presents an idea to improve the efficiency of security checks in airports through the active tracking and monitoring of passengers and staff using the OFDM modulation technique and fingerprint authentication. The details of the passenger are multiplexed using OFDM. To authenticate the passenger, the fingerprint along with important identification information is collected. The details of the passenger can be transmitted after the necessary modulation, received using various transceivers placed within the premises of the airport, and checked at the appropriate check points, thereby increasing the efficiency of checking. OFDM has been employed for spectral efficiency.
Abstract: In general, class complexity is measured based on any one of several factors, such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. Several new techniques, methods, and metrics with different factors have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a new complexity measure, namely Extended Weighted Class Complexity (EWCC), which is an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the derived classes. In EWCC, the cognitive weight of each attribute is considered to be 1. The main problem with the EWCC metric is that every attribute holds the same value, but in general, the cognitive load in understanding different types of attributes cannot be the same. So here, we propose a new metric, namely Attribute Weighted Class Complexity (AWCC). In AWCC, cognitive weights are assigned to the attributes based on the effort needed to understand their data types. The proposed metric has been proved to be a better measure of the complexity of a class with attributes through case studies and experiments.
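A hypothetical sketch in the spirit of AWCC: each attribute contributes a cognitive weight derived from its data type instead of the constant 1 used in EWCC. The type-to-weight table and the method weights below are illustrative assumptions, not the values proposed in the paper.

```python
# Assumed cognitive weights per attribute data type (illustrative only).
TYPE_WEIGHT = {"int": 1, "float": 1, "str": 2, "list": 3, "dict": 4}

def awcc(attribute_types, method_weights):
    """Sum of attribute cognitive weights plus method cognitive weights."""
    return sum(TYPE_WEIGHT[t] for t in attribute_types) + sum(method_weights)

# A class with an int, a str, and a dict attribute, and two methods
# whose cognitive weights are assumed to be 2 and 5.
print(awcc(["int", "str", "dict"], [2, 5]))  # 1 + 2 + 4 + 2 + 5 = 14
```

Under EWCC the same class's attributes would contribute 3 (one per attribute); the weighted variant lets a complex attribute such as a dictionary dominate, reflecting its higher comprehension effort.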
Abstract: Zero inflated models are usually used in modeling count data with excess zeros, where the excess zeros could be structural zeros or zeros which occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences, including sexual health and dental epidemiology. The most popular zero inflated models used by many researchers are the zero inflated Poisson and zero inflated negative binomial models. In addition, zero inflated generalized Poisson and zero inflated double Poisson models are also discussed and found in some literature. Recently, the zero inflated inverse trinomial model and the zero inflated strict arcsine model have been advocated and proven to serve as alternative models for overdispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review some related literature and provide a variety of examples from different disciplines on the application of zero inflated models. Different model selection methods used in model comparison are also discussed.
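The zero inflated Poisson (ZIP) model named above mixes a point mass at zero with an ordinary Poisson distribution: with probability pi an observation is a structural zero, otherwise it is Poisson(lam). A minimal sketch of its probability mass function (parameter values are illustrative):

```python
import math

def zip_pmf(k, pi, lam):
    """P(X = k) under the zero inflated Poisson model."""
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson

pi, lam = 0.3, 2.0                     # illustrative mixing weight and rate
p0 = zip_pmf(0, pi, lam)               # inflated zero probability
total = sum(zip_pmf(k, pi, lam) for k in range(50))
print(p0, total)                       # p0 exceeds the plain Poisson e^-lam; pmf sums to 1
```

The other zero inflated models in the review replace the Poisson component with a negative binomial, generalized Poisson, inverse trinomial, or strict arcsine distribution, keeping the same mixture structure.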
Abstract: A full six degrees of freedom (6-DOF) flight dynamics
model is proposed for the accurate prediction of short and long-range
trajectories of high spin and fin-stabilized projectiles via atmospheric
flight to final impact point. The projectile is assumed to be rigid (non-flexible) and rotationally symmetric about its spin axis, launched at low and high pitch angles. The mathematical model is based on the
full equations of motion set up in the no-roll body reference frame and
is integrated numerically from given initial conditions at the firing
site. The projectile's maneuvering motion depends on the most
significant force and moment variations, in addition to wind and
gravity. The computational flight analysis takes into consideration the
Mach number and total angle of attack effects by means of the
variable aerodynamic coefficients. For the purposes of the present work, linear interpolation has been applied to the tabulated database of McCoy's book. The developed computational method gives
satisfactory agreement with published data of verified experiments and
computational codes on atmospheric projectile trajectory analysis for
various initial firing flight conditions.
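The full 6-DOF model is not reproduced here; as a minimal sketch of the numerical integration of a trajectory from given initial firing conditions, the snippet below integrates a point-mass model with a constant drag coefficient using explicit Euler steps (all numbers are illustrative assumptions, and the spin, pitch, and yaw dynamics of the paper are omitted).

```python
import math

# Illustrative constants: gravity, air density, drag coefficient,
# reference area, projectile mass, muzzle velocity, pitch angle.
g, rho, cd, area, mass = 9.81, 1.225, 0.3, 0.01, 5.0
v0, pitch = 300.0, math.radians(30)

x, y = 0.0, 0.0
vx, vy = v0 * math.cos(pitch), v0 * math.sin(pitch)
dt = 0.001
while y >= 0.0:                        # integrate until impact with the ground
    v = math.hypot(vx, vy)
    drag = 0.5 * rho * cd * area * v * v / mass   # drag deceleration magnitude
    vx += -(drag * vx / v) * dt
    vy += (-g - drag * vy / v) * dt
    x += vx * dt
    y += vy * dt
print(x)  # range to the impact point; shorter than the vacuum range
```

The 6-DOF model replaces the constant drag coefficient with table look-ups in Mach number and total angle of attack, and adds the moment equations in the no-roll body frame.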
Abstract: In this paper, a method of product analysis from the recycling point of view is described. The analysis is based on a set of measures that assess a product from the point of view of the final stages of its lifecycle. It was assumed that such analysis will be performed at the design phase; in order to conduct it, a computer system that aids the designer during the design process has been developed. The structure of the computer tool, based on agent technology, together with example results, is also included in the paper.