Abstract: Amphawa is among the most popular weekend destinations for both domestic and international tourists in Thailand. More than 112 homestays and resorts have been developed along its waterways. This research aims to initiate an appropriate environmental management system for riverside tourist accommodations in Amphawa by investigating their current environmental characteristics. Eighty-eight riverside tourist accommodations were surveyed using a purpose-designed questionnaire, and GPS data were also gathered for spatial analysis. The results revealed that the accommodations are well managed with regard to some environmental aspects. To reduce economic costs, energy-efficient equipment is utilized. A substantial number of tourist accommodations encouraged waste separation, followed by transfer to the local administrative organization. Grease traps are also utilized to reduce the discharge of chemicals, grease, and oil from canteens and restaurants into the natural environment. The most notable mitigation is the introduction of environmentally friendly cleansers for tourist accommodations along the riverside in tourism destinations.
Abstract: The roll center is one of the key parameters for designing a suspension. Several driving characteristics are affected significantly by the migration of the roll center during the suspension's motion. The strut/SLA (strut/short-long-arm) suspension, which is widely used in production cars, combines the space-saving characteristics of a MacPherson strut suspension with some of the preferred handling characteristics of an SLA suspension. In this study, a front strut/SLA suspension is modeled with the ADAMS/Car software. Kinematic roll analysis is then employed to investigate how the rolling characteristics change under wheel travel and steering input. The related parameters, including the roll center height, roll camber gain, toe change, scrub radius, and wheel track width change, are analyzed and discussed. It is found that the strut/SLA suspension clearly has a higher roll center than strut and SLA suspensions do. The variations in the roll center height under roll analysis differ markedly as wheel travel displacement and steering angle are added. The results for the roll camber gain, scrub radius, and wheel track width change are considered satisfactory. However, the toe change is too large and needs fine-tuning through a sensitivity analysis.
Abstract: Water samples were collected from the river Pandu at six stations where human and animal activities were high. Composite samples were analyzed for dissolved oxygen (DO), biochemical oxygen demand (BOD), chemical oxygen demand (COD), and pH during the dry and wet seasons as well as the harmattan period. The total data points were used to establish relationships between the parameters, and the data were also subjected to statistical analysis and expressed as mean ± standard error of the mean (SEM) at a level of significance of p
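As a minimal sketch of the summary statistic the abstract reports, the mean ± SEM of a set of measurements can be computed as below. The DO readings used are hypothetical placeholders, not the study's data:

```python
import statistics

def mean_sem(values):
    """Return (mean, standard error of the mean) for a sample."""
    n = len(values)
    mean = statistics.mean(values)
    sem = statistics.stdev(values) / n ** 0.5  # sample std. dev. / sqrt(n)
    return mean, sem

# Hypothetical DO readings (mg/L) from one station, not the paper's data
do_readings = [5.2, 4.8, 5.5, 5.0, 4.9]
m, s = mean_sem(do_readings)
print(f"DO = {m:.2f} ± {s:.2f} mg/L (mean ± SEM)")
```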
Abstract: The problem of simulating incompressible steady flow around an airfoil is discussed. For some of the simplest airfoils (circular, elliptical, and Zhukovsky airfoils) the exact solution is known from complex analysis, which allows the intensity of the vortex layer that simulates the airfoil to be computed. Some modifications of the vortex element method are proposed and test computations are carried out. It is shown that these approaches are much more effective than the classical numerical scheme.
Abstract: The purpose of this study was to investigate the effectiveness of a recreational workout program for adults with disabilities over two semesters. This investigation was an action study conducted in a naturalistic setting. Participants included equal numbers of adults with severe cognitive impairments (n = 35) and adults without disabilities (n = 35). The adults with severe cognitive impairments were trained in six self-initiated workout activities over two semesters by the adults without disabilities. The numbers of task-analyzed steps of each activity performed correctly by each participant in the first and last weeks of each semester were used for data analysis. Results of the paired t-tests indicate that across the two semesters, significant differences between the first and last weeks were found on 4 of the 6 task-analyzed workout activities at a significance level of p < .05. The recreational workout program developed in this study was effective.
Abstract: The lack of an inherent “natural” dissimilarity measure between objects in categorical datasets presents special difficulties for clustering analysis. However, each categorical attribute of a given dataset provides natural probability and information in the sense of Shannon. In this paper, we propose a novel method that heuristically converts categorical attributes to numerical values by exploiting such associated information. We conduct an experimental study with a real-life categorical dataset. The experiment demonstrates the effectiveness of our approach.
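One simple way to exploit Shannon-style information for such a conversion is to map each categorical value to its self-information. This is a generic sketch in the spirit of the abstract; the paper's exact conversion rule is not reproduced here:

```python
import math
from collections import Counter

def info_encode(column):
    """Map each categorical value to its Shannon self-information,
    -log2(p), where p is the value's relative frequency in the column.
    A heuristic numeric encoding illustrating the idea; NOT the
    paper's exact method, which the abstract does not detail."""
    n = len(column)
    freq = Counter(column)
    return [-math.log2(freq[v] / n) for v in column]

colors = ["red", "red", "red", "blue"]
print(info_encode(colors))  # rarer values map to larger numbers
```

Rare categories carry more information and thus land farther from zero, which gives a clustering algorithm a numeric scale to work with.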
Abstract: The purpose of this paper is to apply the Taguchi method to the optimization of PEMFC performance, and a representative computational fluid dynamics (CFD) model is selectively performed for the statistical analysis. The factors studied in this paper are the fuel cell pressure, the operating temperature, the relative humidity of the anode and cathode, the porosity of the gas diffusion electrode (GDE), and the conductivity of the GDE. The optimal combination for maximum power density is obtained using a three-level statistical method. The results confirm that the robust optimum design parameters influencing fuel cell performance are a fuel cell pressure of 3 atm, an operating temperature of 353 K, an anode relative humidity of 50%, and a GDE conductivity of 1000 S/m, whereas the relative humidity of the cathode and the porosity of the GDE are pooled as error due to their small sums of squares. The simulation results give designers ideas that ratify the effectiveness of the proposed robust design methodology for fuel cell performance.
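The Taguchi ranking of factor levels for a maximized response is typically driven by the larger-the-better signal-to-noise ratio. A minimal sketch of that statistic follows; the power-density values are hypothetical, not the paper's data:

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-the-better signal-to-noise ratio,
    S/N = -10 * log10(mean(1 / y^2)),
    the statistic used to rank factor levels when maximizing a
    response such as power density."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

# Hypothetical power densities (W/cm^2) from repeated runs at one setting
print(round(sn_larger_is_better([0.62, 0.65, 0.60]), 2))
```

Higher S/N is better: the factor-level combination with the largest average S/N across the orthogonal-array runs is selected as the robust optimum.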
Abstract: Image mosaicing techniques are usually employed to offer researchers a wider field of view of microscopic images of biological samples. A mosaic is commonly achieved using automated microscopes and often with one “color” channel, whether it refers to natural or fluorescent analysis. In this work we present a method to achieve three subsequent mosaics of the same part of a stem cell culture analyzed in phase contrast and in fluorescence, with a common non-automated inverted microscope. The mosaics obtained are then merged to mark, in the original phase-contrast images, the nuclei and cytoplasm of the cells with reference to a mosaic of the culture rather than to single images. The experiments carried out prove the effectiveness of our approach with cultures of cells stained with calcein (green/cytoplasm and nuclei) and Hoechst (blue/nuclei) probes.
Abstract: This study aimed to verify the hypothesis that a sense of fulfillment in student life and perceived stress during training in facilities could affect vocational identity among social welfare university students, in order to derive implications for enhancing vocational consciousness. A questionnaire survey was conducted with 388 third- and fourth-year students in training courses for certified social workers at three universities in A prefecture in Japan. The questionnaire was returned by 338 students, and 288 responses (85.2%) were valid and used for the analysis. As a result of structural equation modeling (SEM), the hypothesized model proved to fit the data. The path coefficient from sense of fulfillment in student life to vocational identity was statistically significant and positive, and the path coefficient from training stress to vocational identity was statistically significant and negative.
Abstract: The performance of a dual maximal ratio combining receiver is analyzed for M-ary coherent and non-coherent modulations over correlated Nakagami-m fading channels with non-identical and arbitrary fading parameters. The classical probability density function (PDF) based approach is used for the analysis. Expressions for the outage probability and average symbol error performance for M-ary coherent and non-coherent modulations are obtained. The results are verified against published special-case results and found to match. The effects of unequal fading parameters, branch correlation, and unequal input average SNR on the receiver performance are studied.
Abstract: Most of the well-known methods for generating Gaussian variables require at least one standard uniformly distributed value for each Gaussian variable generated. The period of the random number generator therefore limits the number of independent Gaussian-distributed variables that can be generated, while the statistical simulation of complex systems requires a large number of random numbers for its statistical analysis. We propose a simple alternative method of generating an almost unlimited number of Gaussian-distributed variables from a limited number of standard uniformly distributed random numbers.
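For context, the classical Box-Muller transform illustrates the one-uniform-per-Gaussian coupling the abstract sets out to break. This is the standard baseline technique, not the paper's proposed method:

```python
import math
import random

def box_muller(u1, u2):
    """Classical Box-Muller transform: two independent U(0,1) variates
    in, two independent N(0,1) variates out. One uniform is consumed
    per Gaussian produced - exactly the coupling between generator
    period and output count that the abstract's method aims to relax."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

random.seed(0)
z1, z2 = box_muller(random.random(), random.random())
```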
Abstract: Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination, observation angles, and variations in atmospheric effects.
This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS-III sensor. The objective of this study is to detect and remove cloud cover and normalize the images radiometrically. Cloud detection is achieved using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time; i.e., the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R² value and mean square error (MSE) between each pair of analogous bands.
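Relative radiometric normalization of this kind typically fits a per-band gain and offset on the invariant pixels and then judges the fit with R² and MSE. The sketch below shows that generic scheme with hypothetical digital numbers; the paper's frequency-based pixel selection is not reproduced:

```python
def linear_fit(x, y):
    """Least-squares gain and offset mapping subject-band values x
    to reference-band values y (per-band relative normalization)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    gain = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    offset = my - gain * mx
    return gain, offset

def r2_mse(y, y_hat):
    """Coefficient of determination and mean square error of a fit."""
    n = len(y)
    my = sum(y) / n
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - my) ** 2 for a in y)
    return 1 - ss_res / ss_tot, ss_res / n

# Hypothetical invariant-pixel DNs from subject and reference images
subject = [40, 55, 70, 90, 120]
reference = [50, 68, 86, 110, 146]
g, o = linear_fit(subject, reference)
normalized = [g * v + o for v in subject]
r2, mse = r2_mse(reference, normalized)
```

Applying the fitted gain and offset to the whole subject band brings it onto the radiometric scale of the reference image.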
Abstract: Since the polymerase chain reaction (PCR) was invented, it has emerged as a powerful tool in genetic analysis. The PCR product yield is closely linked to the thermal cycles. Therefore, to reduce the reaction time and make the temperature distribution in the reaction chamber uniform, a novel oscillatory thermal cycler is designed. The sample is placed in a fixed chamber, and three constant isothermal zones are established and aligned in the system. The sample is oscillated into contact with the three different isothermal zones to complete the thermal cycles. This study presents the design of the geometric characteristics of the chamber. The commercial software CFD-ACE+TM is utilized to investigate the influences of various materials, heating times, chamber volumes, and chamber moving speeds on the temperature distributions inside the chamber. The chamber moves at a specific velocity, and the time-varying boundary conditions are related to the moving speed: while the chamber moves, the boundary is specified as either a convection or a uniform-temperature condition. User subroutines compiled in FORTRAN are used to make the numerical results realistic. Results show that when the reaction chamber, a rectangular prism, is heated on six faces, the temperature inside the chamber becomes non-uniform once the moving speed exceeds 0.01 m/s, as seen from the temperature profiles and the standard deviation of the temperature at the Y-cut cross-section. By reducing the heated faces to four, the standard deviation of the temperature of the reaction chamber stays under 1.4×10⁻³ K over the range of velocities between 0.0001 m/s and 1 m/s. When natural convection boundary conditions are set at all boundaries while the chamber moves between two heaters, the effects of the various chamber moving velocities on the temperature distributions are negligible over the assigned time duration.
Abstract: The automatic discrimination of seismic signals is an important practical goal for earth-science observatories due to the large amount of information that they receive continuously. An essential discrimination task is to allocate the incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, two classes of seismic signals recorded routinely in the geophysical laboratory of the National Center for Scientific and Technical Research in Morocco are considered. They correspond to signals associated with local earthquakes and chemical explosions. The approach adopted for the development of an automatic discrimination system is a modular system composed of three blocks: 1) representation, 2) dimensionality reduction, and 3) classification. The originality of our work lies in the use of a new wavelet called the "modified Mexican hat wavelet" in the representation stage. For the dimensionality reduction, we propose a new algorithm based on random projection and principal component analysis.
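The random-projection step of such a dimensionality-reduction stage can be sketched as below. This is the generic Johnson-Lindenstrauss-style technique, not the authors' specific algorithm or its combination with PCA:

```python
import random

def random_projection(data, k, seed=0):
    """Project d-dimensional feature vectors down to k dimensions by
    multiplying with a random Gaussian matrix (Johnson-Lindenstrauss
    style), which approximately preserves pairwise distances. A
    generic sketch of the dimensionality-reduction step only."""
    rng = random.Random(seed)
    d = len(data[0])
    # k x d matrix with N(0, 1/k) entries
    R = [[rng.gauss(0.0, 1.0) / k ** 0.5 for _ in range(d)]
         for _ in range(k)]
    return [[sum(r[j] * x[j] for j in range(d)) for r in R] for x in data]

# Toy 64-dimensional "signal feature vectors" (hypothetical values)
signals = [[float(i + j) for j in range(64)] for i in range(5)]
reduced = random_projection(signals, k=8)
```

The reduced vectors can then be passed to PCA or directly to a classifier, at a fraction of the original dimensionality.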
Abstract: This research presents the development of a simulation model for WIP management in semiconductor fabrication. Manufacturing simulation modeling is needed for productivity optimization analysis because of the complex process flows, in which more than 35 percent of the processing steps are re-entrant, revisiting the same equipment more than 15 times. Furthermore, semiconductor fabrication is required to produce a high product mix, with total processing steps varying from 300 to 800 and cycle times between 30 and 70 days. Besides this complexity, the high wafer cost, which can hurt the company's profit margin once a due date is missed, is another motivation to explore options for carrying out such analyses with simulation modeling. In this paper, the simulation model is developed on the existing commercial software platform AutoSched AP, with customized integration with the Manufacturing Execution System (MES) and Advanced Productivity Family (APF) for the data collection used to configure the model parameters and data sources. Model parameters such as processing step cycle times, equipment performance, handling time, and operator efficiency are collected through this customization. Once the parameters are validated, a few customizations are made to ensure the model executes properly. The accuracy of the simulation model is validated against the actual daily output of all equipment; comparison of the simulation results with actual output achieved 95 percent accuracy over 30 days. The model was then used to perform various what-if analyses to understand impacts on cycle time and overall output. With this simulation model, a complex manufacturing environment such as a semiconductor fabrication plant (fab) now has an alternative source of validation for the impact analysis of any new requirements.
Abstract: A method is presented for obtaining the error probability of block codes. The method is based on the eigenvalue-eigenvector properties of the code correlation matrix. It is found that, under a unitary transformation and in an additive white Gaussian noise environment, the performance evaluation of a block code becomes a one-dimensional problem in which only one eigenvalue and its corresponding eigenvector are needed in the computation. The obtained error rate results show remarkable agreement between simulations and analysis.
Abstract: Document retrieval in Information Retrieval Systems (IRS) is generally about understanding the information in the documents concerned. The better the system is able to understand the contents of documents, the more effective the retrieval outcomes will be. But understanding the contents is a very complex task. Conventional IRS apply algorithms that can only approximate the meaning of document contents through a keyword approach using the vector space model. Keywords may be unstemmed or stemmed. When keywords are stemmed and conflated in the retrieval process, we are a step forward in applying semantic technology in IRS. Word stemming is a process in morphological analysis under natural language processing, preceding syntactic and semantic analysis. We have developed stemming algorithms for Malay and Arabic and incorporated stemming into our experimental systems in order to measure retrieval effectiveness. The results show that retrieval effectiveness increases when stemming is used in the systems.
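The conflation effect of stemming can be shown with a deliberately tiny English suffix-stripper. This toy is NOT the authors' Malay or Arabic algorithm, which the abstract does not detail; it only illustrates how stemming collapses surface forms onto one index term:

```python
def toy_stem(word, suffixes=("ing", "ed", "s")):
    """A deliberately tiny suffix-stripping stemmer: remove the first
    matching suffix if the remaining stem keeps at least 3 characters.
    Illustrative only - real stemmers (Porter and the authors' Malay
    and Arabic algorithms) use many more rules."""
    for suf in suffixes:
        if word.endswith(suf) and len(word) - len(suf) >= 3:
            return word[: -len(suf)]
    return word

# Conflation: different surface forms index to one stem
terms = ["connect", "connects", "connected", "connecting"]
print({toy_stem(t) for t in terms})  # all four conflate to "connect"
```

At retrieval time the query terms are stemmed with the same rules, so a query containing "connecting" matches documents indexed under "connect".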
Abstract: The study of proteomics has reached unexpected levels of interest, as a direct consequence of its discovered influence over complex biological phenomena, such as problematic diseases like cancer. This paper presents the authors' latest achievements regarding the analysis of networks of proteins (interactome networks) by computing the betweenness centrality measure more efficiently. The paper introduces the concept of betweenness centrality and then describes how betweenness computation can help interactome network analysis. Current sequential implementations of betweenness computation do not perform satisfactorily in terms of execution times. The paper's main contribution is a speedup technique for betweenness computation, based on modified shortest-path algorithms for sparse graphs. Three optimized generic algorithms for betweenness computation are described and implemented, and their performance is tested against real biological data from the IntAct dataset.
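The standard sequential baseline for this computation is Brandes' algorithm; the paper's optimized variants build on shortest-path computations of this kind. A minimal sketch for unweighted, undirected graphs (the baseline, not the authors' sped-up implementation):

```python
from collections import deque

def brandes_betweenness(adj):
    """Brandes' algorithm for betweenness centrality on an unweighted,
    undirected graph given as {node: [neighbors]}."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        pred = {v: [] for v in adj}   # predecessors on shortest paths
        sigma = {v: 0 for v in adj}   # number of shortest paths from s
        dist = {v: -1 for v in adj}
        sigma[s], dist[s] = 1, 0
        queue = deque([s])
        while queue:                  # BFS phase
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                  # dependency accumulation
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    for v in bc:                      # undirected: each pair counted twice
        bc[v] /= 2.0
    return bc

# A path a-b-c: only b lies between the endpoints
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(brandes_betweenness(path))  # b gets betweenness 1.0
```

Its running time is O(VE) on unweighted graphs, which is why sparse-graph shortest-path optimizations translate directly into betweenness speedups.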
Abstract: In this work, the primary compressive strength components of human femur trabecular bone are qualitatively assessed using image processing and wavelet analysis. The primary compressive (PC) component in planar radiographic femur trabecular images (N = 50) is delineated by a semi-automatic image processing procedure. An auto-threshold binarization algorithm is employed to recognize the presence of mineralization in the digitized images. Qualitative parameters such as the apparent mineralization and the total area associated with the PC region are derived for normal and abnormal images. The two-dimensional discrete wavelet transform is utilized to obtain appropriate features that quantify texture changes in the medical images. The normal and abnormal samples of the human femur are comprehensively analyzed using the Haar wavelet. Six statistical parameters, namely mean, median, mode, standard deviation, mean absolute deviation, and median absolute deviation, are derived at level-4 decomposition for both the approximation and horizontal wavelet coefficients. The correlation coefficients of the various wavelet-derived parameters with the normal and abnormal classes are estimated for both the approximation and horizontal coefficients. It is seen that in almost all cases the abnormal images show a higher degree of correlation than the normal ones. Further, the parameters derived from the approximation coefficients show more correlation than those derived from the horizontal coefficients. The mean and median computed at the output of the level-4 Haar wavelet channel were found to be useful predictors for delineating the normal and abnormal groups.
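One decomposition level of the 2D Haar transform, producing the approximation and horizontal-detail sub-bands whose statistics the abstract analyzes, can be sketched as follows. This is a generic, mean-normalized Haar step (repeated on the approximation band to reach level 4), not the authors' pipeline:

```python
def haar2d_level(img):
    """One level of a 2D Haar transform on a 2n x 2m grayscale array,
    returning (approximation, horizontal-detail) sub-bands at half
    resolution. Normalization here uses the 2x2 block mean; other
    conventions scale by 1/2 instead of 1/4."""
    h, w = len(img) // 2, len(img[0]) // 2
    approx = [[0.0] * w for _ in range(h)]
    horiz = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            approx[i][j] = (a + b + c + d) / 4.0  # block average
            horiz[i][j] = (a + b - c - d) / 4.0   # vertical-edge (row-difference) detail
    return approx, horiz

# A flat (constant) image yields zero detail and an unchanged average
flat = [[4.0] * 4 for _ in range(4)]
A, H = haar2d_level(flat)
```

Statistics such as the mean and median are then computed over the level-4 `approx` and `horiz` coefficient arrays.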
Abstract: The clinical usefulness of heart rate variability is limited by the range of Holter monitoring software available. These software algorithms require a normal sinus rhythm to accurately acquire heart rate variability (HRV) measures in the frequency domain. Premature ventricular contractions (PVCs), more commonly referred to as ectopic beats and frequent in heart failure, hinder this analysis and introduce ambiguity. This investigation demonstrates an algorithm to automatically detect ectopic beats by analyzing discrete wavelet transform coefficients. Two techniques for filtering and replacing the ectopic beats in the RR signal are compared: one applies wavelet hard thresholding, and the other applies linear interpolation to replace the ectopic cycles. The results demonstrate, through simulation and signals acquired from a 24-hour ambulatory recorder, that these techniques can accurately detect PVCs and remove the noise and leakage effects produced by ectopic cycles, retaining smooth spectra with minimal error.
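The linear-interpolation replacement technique can be sketched as below. This assumes detection has already flagged the ectopic indices (the wavelet-based detection and the hard-thresholding variant are not shown), and the RR values are hypothetical:

```python
def interpolate_ectopic(rr, ectopic_idx):
    """Replace RR intervals flagged as ectopic with linear interpolation
    between the nearest non-ectopic neighbors - a sketch of the second
    replacement technique the abstract compares. Detection is assumed
    to have been done upstream."""
    rr = list(rr)
    bad = set(ectopic_idx)
    good = [i for i in range(len(rr)) if i not in bad]
    for i in sorted(bad):
        left = max((g for g in good if g < i), default=None)
        right = min((g for g in good if g > i), default=None)
        if left is None or right is None:
            continue  # ectopic beats at the edges are left untouched here
        t = (i - left) / (right - left)
        rr[i] = rr[left] + t * (rr[right] - rr[left])
    return rr

# A compensatory-pause pattern around indices 3-4 (hypothetical values, ms)
rr_series = [800, 810, 805, 450, 1150, 812, 808]
clean = interpolate_ectopic(rr_series, [3, 4])
```

Removing the short-long ectopic pair this way suppresses the spectral leakage such cycles would otherwise inject into the frequency-domain HRV measures.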