Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), for the time-dependent reliability prediction of underground pipelines is presented in this paper. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute the probabilistic performance of the statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of SS in pipe reliability assessment. It is hoped that this development work will promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
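The core idea, expressing a rare failure event as a chain of more frequent intermediate events, can be sketched in a few lines. The implementation below is a generic illustration, not the paper's code: the limit-state function `g`, the level probability `p0`, and the unit-variance Gaussian proposal are all assumptions.

```python
import math
import random

def subset_simulation(g, dim, n=1000, p0=0.1, seed=0):
    """Estimate P(g(x) <= 0) for i.i.d. standard-normal inputs x via
    Subset Simulation: intermediate thresholds b are chosen adaptively so
    each conditional event {g <= b} has probability ~p0, and the failure
    probability is the product of the conditional level probabilities.
    Generic illustration only (unit-variance Gaussian proposals)."""
    rng = random.Random(seed)
    samples = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(20):                              # cap on number of levels
        scored = sorted(((g(x), x) for x in samples), key=lambda t: t[0])
        nc = int(p0 * n)                             # seeds per level
        b = scored[nc - 1][0]                        # adaptive threshold
        if b <= 0.0:                                 # failure level reached
            return prob * sum(1 for v, _ in scored if v <= 0.0) / n
        prob *= p0
        seeds = [x for _, x in scored[:nc]]
        new_samples = []
        for i in range(n):                           # modified Metropolis
            x = list(seeds[i % nc])
            cand = []
            for j in range(dim):
                c = x[j] + rng.gauss(0.0, 1.0)
                # component-wise accept/reject vs. standard-normal density
                if rng.random() < math.exp(0.5 * (x[j] * x[j] - c * c)):
                    cand.append(c)
                else:
                    cand.append(x[j])
            new_samples.append(cand if g(cand) <= b else x)
        samples = new_samples
    return prob
```

For g(x) = 2 - x[0], say, the true failure probability is P(X > 2), a rare event that a crude Monte Carlo run of the same size resolves far less reliably.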
Abstract: In Peer-to-Peer service networks, where peers offer any kind of publicly available services or applications, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments solely based on Peer-to-Peer technologies. Aside from browsing, users shall have the possibility to emphasize services of interest using their own semantic criteria. The appearance of the virtual world shall intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load- and traffic-balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented and the next steps towards a prototypical implementation are discussed.
Abstract: The purpose of this research is to study and develop an algorithm for Thai word spoonerism using a semi-automatic computer program: in the data input stage, syllables are already separated, and in the spoonerism stage, the developed algorithm is applied. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllabic words by analysing the elements of the syllables, namely the initial consonant (including clusters), vowel, tone mark, and final consonant. The study found that bi-syllabic Thai spoonerism has one spoonerism mechanism: transposition of the vowel, tone mark, and final consonant between the two syllables, while keeping each syllable's initial consonant (and consonant cluster, if any).
The rules and mechanisms identified in the study were then implemented as Thai spoonerism software written in PHP. A performance test showed that the program performs bi-syllabic Thai spoonerism correctly for 99% of the words used in the test. The remaining 1% of faults arose because words obtained from spoonerism may not be spelled in conformity with Thai grammar, and a Thai spoonerism may have more than one correct answer.
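The single transposition rule described above can be sketched as follows. This is an illustrative re-implementation (the paper's software is written in PHP), and the tuple representation with romanized placeholders instead of real Thai orthography is our simplification:

```python
def spoonerize(syl1, syl2):
    """Bi-syllabic Thai spoonerism as described in the study: swap the
    vowel, tone mark and final consonant of the two syllables while keeping
    each syllable's initial consonant (and consonant cluster, if any).
    Syllables are (onset, vowel, tone, coda) tuples; romanized placeholder
    strings stand in for real Thai orthography in this illustration."""
    o1, v1, t1, c1 = syl1
    o2, v2, t2, c2 = syl2
    return (o1, v2, t2, c2), (o2, v1, t1, c1)
```

For example, `spoonerize(("kr", "a", "mid", ""), ("d", "uu", "low", "k"))` keeps the onsets "kr" and "d" in place and exchanges everything else between the two syllables.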
Abstract: Empty Fruit Bunches (EFB) and Palm Oil Mill Effluent (POME) are two main wastes from oil palm industries that are rich in lignocellulose. Degradation of EFB and POME by microorganisms produces hydrolytic enzymes that degrade cellulose and hemicellulose during the composting process. However, normal composting takes about four to six months to reach maturity; applying fungi to the compost can shorten the composting period. This study identifies the effect of xylanase and cellulase produced by Aspergillus niger and Trichoderma virens on the composting of EFB and POME. The degradation of EFB and POME indicates the lignocellulolytic capacity of Aspergillus niger and Trichoderma virens, with more than a 7% decrease in hemicellulose and more than a 25% decrease in cellulose for both inoculated composts. Inoculation with Aspergillus niger and Trichoderma virens also increased enzyme activities during the composting period, compared to the control compost, by 21% for both xylanase and cellulase. A rapid rise in cellulase and xylanase activities was observed for Aspergillus niger, with the highest activities of 14.41 FPU/mg and 3.89 IU/mg, respectively. Increased cellulase and xylanase activities also occurred with Trichoderma virens inoculation, with the highest activities of 13.21 FPU/mg and 4.43 IU/mg, respectively. Therefore, it is evident that inoculation with fungi can increase enzyme activities and hence effectively degrade EFB and POME.
Abstract: Pretreatment of lignocellulosic biomass materials from poplar, acacia, oak, and fir with different ionic liquids (ILs) containing 1-alkyl-3-methyl-imidazolium cations and various anions has been carried out. The dissolved cellulose from the biomass was precipitated by adding anti-solvents into the solution with vigorous stirring. The commercial cellulases Celluclast 1.5L and Accelerase 1000 were used for hydrolysis of untreated and pretreated lignocellulosic biomass. Among the tested ILs, [Emim]COOCH3 showed the best efficiency, resulting in the highest amount of liberated reducing sugars. Combined glycerol-ionic liquid pretreatment and combined dilute acid-ionic liquid pretreatment of lignocellulosic biomass were also evaluated and compared with glycerol pretreatment, ionic liquid pretreatment, and dilute acid pretreatment alone.
Abstract: The IVE toolkit has been created to facilitate research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as AI middleware without any changes. A key feature of IVE is that it supports prototyping of both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work on this platform, including an educational game.
Abstract: Historical monuments, as architectural heritage, are considered economically and culturally one of the key assets of modern communities. Cultural heritage represents a country's national identity and pride and maintains and enriches that country's culture. Therefore, conservation of the monuments that remain from our ancestors requires everybody's serious and unremitting effort. Conservation, renewal, restoration, and the technical study of cultural and historical artefacts have a special status among the various forms of art and science in the present century, and this is due to two reasons: firstly, the progress of humankind in this century has created a factor called environmental pollution, which has not only caused new destructive processes in cultural and historical monuments but has also accelerated the previous destructive processes several times over; and secondly, the rapid advance of various sciences, especially chemistry, has led to the contribution of new methods and materials to this significant issue.
Abstract: In this paper, we present a preconditioned AOR-type iterative method for solving linear systems Ax = b, where A is a Z-matrix, and give comparison theorems showing that the preconditioned AOR-type iterative method converges faster than the original AOR-type iterative method.
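For reference, the basic (unpreconditioned) AOR sweep that the paper builds on can be sketched as follows. This is a generic illustration of the standard AOR splitting A = D - L - U, not the paper's preconditioned variant; the test matrix and the factors w and r are arbitrary choices.

```python
def aor_solve(A, b, w=1.0, r=0.5, iters=200):
    """AOR iteration for A x = b with the standard splitting
    A = D - L - U (D diagonal, L/U strictly lower/upper):
        (D - r L) x_new = [(1 - w) D + (w - r) L + w U] x_old + w b
    w is the relaxation factor and r the acceleration factor
    (w = r gives SOR; w = r = 1 gives Gauss-Seidel). Illustrates the
    basic method that the paper's preconditioning accelerates."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x_new = x[:]
        for i in range(n):
            s = w * b[i] + (1.0 - w) * A[i][i] * x[i]
            for j in range(i):          # lower part: L[i][j] = -A[i][j]
                s += r * (-A[i][j]) * x_new[j] + (w - r) * (-A[i][j]) * x[j]
            for j in range(i + 1, n):   # upper part: U[i][j] = -A[i][j]
                s += w * (-A[i][j]) * x[j]
            x_new[i] = s / A[i][i]
        x = x_new
    return x
```

For the strictly diagonally dominant Z-matrix [[4,-1,0],[-1,4,-1],[0,-1,4]] with right-hand side [2,4,10], the iteration converges to the exact solution (1, 2, 3).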
Abstract: This paper deals with an optimized model to investigate the effects of peak current, pulse-on time and pulse-off time on EDM performance, specifically the material removal rate (MRR) of a titanium alloy, using a copper-tungsten electrode with positive polarity. The experiments are carried out on Ti6Al4V by varying the peak current, pulse-on time and pulse-off time. A mathematical model is developed to correlate the influences of these variables with the material removal rate of the workpiece. Design of experiments (DOE) and response surface methodology (RSM) techniques are implemented. The validity of the fit and the adequacy of the proposed models are tested through analysis of variance (ANOVA). The obtained results show that the material removal rate increases as peak current and pulse-on time increase, while the effect of pulse-off time on MRR changes with peak current. The optimum machining conditions for material removal rate are estimated and verified against the proposed optimized results. The developed model is within the limits of acceptable error (about 4%) when compared to experimental results, leading to a desirable material removal rate and economical industrial machining through optimization of the input parameters.
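The kind of second-order RSM model used to correlate the machining variables with MRR can be fitted by ordinary least squares. The sketch below is a generic illustration with synthetic data; the actual factor levels, coded units and model coefficients of the paper are not reproduced here.

```python
def fit_quadratic_model(X, y):
    """Least-squares fit of a second-order RSM-style model
        y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    via the normal equations and Gaussian elimination (pure Python).
    X holds (x1, x2) factor settings (e.g. peak current, pulse-on time),
    y the measured response (e.g. material removal rate)."""
    rows = [[1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in X]
    m = len(rows[0])
    # normal equations (R^T R) beta = R^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
    v = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(m)]
    for c in range(m):                  # elimination with partial pivoting
        p = max(range(c, m), key=lambda k: abs(A[k][c]))
        A[c], A[p] = A[p], A[c]
        v[c], v[p] = v[p], v[c]
        for k in range(c + 1, m):
            f = A[k][c] / A[c][c]
            A[k] = [a - f * b for a, b in zip(A[k], A[c])]
            v[k] -= f * v[c]
    beta = [0.0] * m
    for i in range(m - 1, -1, -1):      # back substitution
        beta[i] = (v[i] - sum(A[i][j] * beta[j] for j in range(i + 1, m))) / A[i][i]
    return beta
```

A 3x3 full factorial (three levels per factor) is the minimal design on which all six coefficients of this model are estimable.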
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
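The two record-extension techniques differ only in the slope of the fitted line: OLS uses r*sy/sx, while LOC uses sign(r)*sy/sx, which is why LOC preserves the variance of the extended record. A minimal sketch (synthetic data, population variances) follows:

```python
import math

def ols_and_loc(x, y):
    """Fit y = a + b*x by OLS and by the Line of Organic Correlation (LOC).
    OLS slope = r*sy/sx (best individual predictions, but the extended
    record's variance shrinks by a factor r^2); LOC slope = sign(r)*sy/sx,
    which preserves the variance of the extended record. Population
    (1/n) variances are used in this sketch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x) / n)
    sy = math.sqrt(sum((v - my) ** 2 for v in y) / n)
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n * sx * sy)
    b_ols = r * sy / sx
    b_loc = math.copysign(sy / sx, r)
    return (my - b_ols * mx, b_ols), (my - b_loc * mx, b_loc)
```

On any imperfectly correlated record (|r| < 1), predictions from the OLS line have variance r^2 * sy^2 < sy^2, whereas the LOC line reproduces sy^2 exactly, which is the behaviour the abstract reports.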
Abstract: The fatty acid composition of the lipid fractions of 16 microalgae strains isolated from different basins of Kazakhstan and characterized by stable, active growth in the laboratory was analyzed. Three species of green microalgae (Oocystis rhomboideus, Chlorococcum infusionum, Dictyochlorella globosa) and three species of diatoms (Synedra sp., Nitzschia sp., Pleurosigma attenuatum) are characterized by a high lipid content and are promising for further study as a source of polyunsaturated fatty acids.
Abstract: This paper presents a detailed investigation of the thermal-hydraulic characteristics of the flow field in a fuel rod model, especially near the spacer. The investigated area provides information on the velocity flow field, vortices, and the amount of heat transferred into the coolant, all of which are critical for the design and improvement of fuel rods in nuclear power plants.
The flow field investigation uses three-dimensional Computational
Fluid Dynamics (CFD) with the Reynolds stresses turbulence model
(RSM). The fuel rod model incorporates a vertical annular channel
where three different shapes of spacers are used; each spacer shape is
addressed individually. These spacers are mutually compared in
consideration of heat transfer capabilities between the coolant and
the fuel rod model. The results are complemented with the calculated
heat transfer coefficient in the location of the spacer and along the
stainless-steel pipe.
Abstract: According to the statistics, the prevalence of congenital hearing loss in Taiwan is approximately six per thousand; furthermore, one per thousand of infants have severe hearing impairment. Hearing ability during infancy has a significant impact on the development of children's oral expression, language maturity, cognitive performance, educational ability and social behavior later in life. Although most children born with hearing impairment have sensorineural hearing loss, almost every child retains at least some residual hearing. If provided with a hearing aid or cochlear implant (a bionic ear) in time, in addition to hearing and speech training, even severely hearing-impaired children can still learn to talk. On the other hand, those who are not diagnosed, and thus unable to begin hearing and speech rehabilitation in a timely manner, may lose an important opportunity to live a complete and healthy life. Eventually, the lack of hearing and speaking ability will affect the development of both mental and physical functions, intelligence, and social adaptability. Not only does this problem result in irreparable regret for the hearing-impaired child for a lifetime, but it also creates a heavy burden for the family and society. Therefore, it is necessary to establish a computer-assisted predictive model that can accurately detect and help diagnose newborn hearing loss, so that early interventions can be provided in time and waste of medical resources can be eliminated. This study uses information from the neonatal database of the case hospital as its subjects, adopting two different analysis methods: using a support vector machine (SVM) for model prediction directly, and using logistic regression to conduct factor screening prior to SVM model prediction. The results indicate that prediction accuracy is as high as 96.43% when the factors are screened and selected through logistic regression.
Hence, the model constructed in this study offers real help to physicians in clinical diagnosis and genuinely benefits early intervention for newborn hearing impairment.
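The two-stage pipeline, logistic regression for factor screening followed by an SVM for prediction, can be sketched in plain Python. Everything below is an illustrative stand-in: a tiny synthetic data set, gradient-descent logistic regression, and a Pegasos-style linear SVM rather than the (unspecified) kernel and solver used in the study.

```python
import math
import random

def logistic_screen(X, y, keep, iters=500, lr=0.1):
    """Stage 1: rank factors by |coefficient| from a plain gradient-descent
    logistic regression and keep the `keep` strongest ones (a simple
    stand-in for the paper's logistic-regression factor screening)."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            for j in range(d):
                grad[j] += (p - yi) * xi[j]
        w = [wj - lr * gj / n for wj, gj in zip(w, grad)]
    return sorted(range(d), key=lambda j: -abs(w[j]))[:keep]

def train_linear_svm(X, y, iters=2000, lam=0.01, seed=0):
    """Stage 2: Pegasos-style stochastic subgradient training of a linear
    SVM (labels in {-1, +1}), returning the averaged weight vector. The
    study's SVM configuration is not given in the abstract, so this
    linear variant is purely illustrative."""
    rng = random.Random(seed)
    d = len(X[0])
    w = [0.0] * d
    w_avg = [0.0] * d
    for t in range(1, iters + 1):
        i = rng.randrange(len(X))
        eta = 1.0 / (lam * t)
        margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
        w = [(1.0 - eta * lam) * wj for wj in w]     # regularization shrink
        if margin < 1.0:                             # hinge subgradient step
            w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
        w_avg = [a + wj / iters for a, wj in zip(w_avg, w)]
    return w_avg
```

Screening first and then training the SVM on the retained factors mirrors the comparison the study makes between direct SVM prediction and logistic-regression-screened SVM prediction.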
Abstract: The major problem that wireless communication
systems undergo is multipath fading caused by scattering of the
transmitted signal. However, we can treat multipath propagation as
multiple channels between the transmitter and receiver to improve
the signal-to-scattering-noise ratio. While using Single Input
Multiple Output (SIMO) systems, the diversity receivers extract
multiple signal branches or copies of the same signal received from
different channels and apply gain combining schemes such as Root
Mean Square Gain Combining (RMSGC). RMSGC asymptotically
yields an identical performance to that of the theoretically optimal
Maximum Ratio Combining (MRC) for values of mean Signal-to-
Noise-Ratio (SNR) above a certain threshold value without the need
for SNR estimation. This paper introduces an improvement to RMSGC based on two modifications: we found that post-detection combining and de-noising of the received signals improve the performance of RMSGC and lower the threshold SNR.
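One way to read RMS gain combining is that each branch is weighted by the root mean square of its own received samples, so no explicit SNR estimate is needed. The sketch below follows that reading; the weighting rule, the toy signals, and the omission of the paper's post-detection and de-noising stages are all our assumptions.

```python
import math

def rms_gain_combine(branches):
    """Combine diversity branches by weighting each branch with the root
    mean square of its own received samples, so no explicit SNR estimate
    is needed. This weighting rule is one plausible reading of RMSGC, not
    taken verbatim from the paper, and the post-detection and de-noising
    improvements the paper proposes are not modelled here."""
    weights = [math.sqrt(sum(s * s for s in b) / len(b)) for b in branches]
    return [sum(w * b[k] for w, b in zip(weights, branches))
            for k in range(len(branches[0]))]
```

A branch carrying a stronger copy of the signal thus contributes proportionally more to the combined output, which is the qualitative behaviour that lets RMSGC approach MRC above the threshold SNR.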
Abstract: This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and the knee movement (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages
terms of scope, noise immunity and speed of data transmission and
reception. The operating principle used by this system is the detection
and transmission of joint movement by mechanical elements and
their respective measurement by optical ones (in this case infrared).
Likewise, Visual Basic software is used for reception, analysis and
signal processing of data acquired by the device, generating a 3D
graphical representation of each movement in real time. The result is a boot that captures the movement, a transmission module (implementing XBee technology), and a receiver module that receives the information and sends it to the PC for processing.
The main aim of this device is to contribute to fields such as bioengineering and medicine by helping to improve quality of life and movement analysis.
Abstract: It is known that the heart interacts with and adapts to
its venous and arterial loading conditions. Various experimental
studies and modeling approaches have been developed to investigate
the underlying mechanisms. This paper presents a model of the left
ventricle derived based on nonlinear stress-length myocardial
characteristics integrated over truncated ellipsoidal geometry, and
second-order dynamic mechanism for the excitation-contraction
coupling system. The results of the model presented here describe the
effects of the viscoelastic damping element of the electromechanical
coupling system on the hemodynamic response. Different heart rates
are considered to study the pacing effects on the performance of the
left ventricle against constant preload and afterload conditions under
various damping conditions. The results indicate that the pacing
process of the left ventricle has to take into account, among other
things, the viscoelastic damping conditions of the myofilament
excitation-contraction process.
Abstract: Retention in the IT profession is critical for
organizations to stay competitive and operate reliably in the dynamic
business environment. Most organizations rely on compensation and
rewards as primary tools to enhance retention of employees. In this
quantitative survey-based study conducted at a large global bank, we
analyze the perceptions of 575 information technology (IT) software
professionals in India and Malaysia and find that fairness of rewards
has very little impact on retention likelihood; actively involving employees in organizational activities matters far more. In addition, our findings indicate that involvement outweighs information flow, the typical organizational communication used to keep employees informed.
Abstract: In this paper, in order to investigate the effects of introducing photovoltaic systems to detached houses in Japan, two analyses were carried out. Firstly, the hourly generation of a 4.2 kW photovoltaic system was simulated in 46 cities, using a simulation model of the photovoltaic system, to investigate the potential of the system in different regions of Japan. Secondly, based on the simulated electricity generation, the energy-saving, environmental and economic effects of the photovoltaic system were examined on timescales from hourly to annual, based upon calculations of typical electricity, heating, cooling and hot water supply load profiles for Japanese dwellings. The above analysis was carried out using a standard year's hourly weather data for each city, provided by the Expanded AMeDAS Weather Data issued by the Architectural Institute of Japan (AIJ).
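An hourly simulation of this kind scales the system's rated output by the normalized irradiance for each hour, with losses folded into a derate factor. The one-liner below is a deliberately simplified stand-in for the paper's simulation model, whose details the abstract does not give; the 0.8 derate factor is an assumption, while the 4.2 kW rating matches the system studied.

```python
def pv_hourly_output(irradiance_wm2, rated_kw=4.2, derate=0.8):
    """Rough hourly AC output (kWh) of a photovoltaic system from global
    irradiance on the array plane (W/m^2). The rated power is scaled by
    irradiance normalized to 1000 W/m^2, with all losses folded into a
    single derate factor. The 0.8 derate is an assumed value; only the
    4.2 kW rating comes from the study."""
    return rated_kw * (irradiance_wm2 / 1000.0) * derate
```

Summing such hourly outputs over a standard year's irradiance record for a city gives the annual generation figure used in the energy-saving and economic comparisons.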
Abstract: In the quality control of freeze-dried durian, crispiness is a key quality index of the product. Generally, crispiness testing has to be done by a destructive method; a nondestructive test of crispiness is desirable because the samples can then be reused for other kinds of testing. This paper proposes a crispiness classification method for freeze-dried durians that uses fuzzy logic for decision making. The physical changes of a freeze-dried durian include the pores appearing in its images. Three physical features are considered to contribute to crispiness: (1) the diameters of the pores, (2) the ratio of the pore area to the remaining area, and (3) the distribution of the pores. Fuzzy logic is applied to make the decision. Experimental results, compared with food expert opinion, show that the accuracy of the proposed classification method is 83.33 percent.
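A fuzzy decision over the three pore features can be sketched with triangular memberships and a min (AND) rule. The membership shapes, feature ranges and single rule below are illustrative guesses, not the tuned system evaluated in the paper:

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def crispiness_score(pore_diameter, area_ratio, spread):
    """Fuzzy score from the three pore features the paper names: pore
    diameter, pore-to-remaining-area ratio, and pore distribution, each
    assumed normalized to [0, 1]. The membership shapes and the single
    min (AND) rule are illustrative guesses, not the rule base that was
    compared against the food experts."""
    big_pores = tri(pore_diameter, 0.2, 0.7, 1.2)
    high_ratio = tri(area_ratio, 0.2, 0.7, 1.2)
    even_spread = tri(spread, 0.2, 0.7, 1.2)
    return min(big_pores, high_ratio, even_spread)   # AND of the three terms
```

A threshold on the score (say, "crispy" above 0.5) turns this inference into the kind of classifier whose output can be compared with expert opinion.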
Abstract: Data mining is the process of sifting through large volumes of data, analyzing the data from different perspectives and summarizing it into useful information. One of the widely used desktop applications for data mining is the Weka tool, a collection of machine learning algorithms implemented in Java and open-sourced under the GNU General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver and access, and does not occupy storage on the client system. Keeping in mind the advantages of a web service over a desktop application, in this paper we demonstrate how this Java-based desktop data mining application can be implemented as a web service to support data mining across the internet.