Abstract: This study investigated the sampling and analysis of water quality in the water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water quality data were collected for correlation analysis with the frequency of reservoir sanitization, by questioning building managers about the inspection charts recorded for the reservoir equipment. A statistical software package (SPSS) was applied to the two groups of data (cleaning frequency and water quality) for regression analysis to determine the optimal sanitization frequency. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking water users, this study found that the frequency of reservoir sanitization significantly influenced the quality of the drinking water. A higher frequency of sanitization (more than four times per year) implied a higher quality of drinking water. The results indicated that water reservoirs and water towers should be sanitized at least twice annually to ensure the safety of the drinking water.
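The kind of correlation and regression analysis described above can be sketched as follows; the survey numbers and the composite quality score are entirely hypothetical, and the study itself used SPSS rather than Python.

```python
import numpy as np

# Hypothetical survey data: annual sanitization frequency reported by each
# building manager, and a composite water-quality score (higher = better).
cleaning_freq = np.array([1, 1, 2, 2, 3, 4, 4, 6], dtype=float)
quality_score = np.array([62, 58, 70, 74, 78, 85, 88, 93], dtype=float)

# Pearson correlation coefficient R, ranging from +1 to -1
r = np.corrcoef(cleaning_freq, quality_score)[0, 1]

# Least-squares linear regression of quality on cleaning frequency
slope, intercept = np.polyfit(cleaning_freq, quality_score, 1)
```

With these made-up numbers, R comes out strongly positive and the regression slope is positive, mirroring the study's finding that more frequent sanitization goes with better water quality.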
Abstract: Ethanol is generally used worldwide as a therapeutic reagent against hepatocellular carcinoma (HCC or hepatoma), as it can induce HCC cell apoptosis at low concentration through a multifactorial process regulated by several unknown proteins. This paper provides a simple and accessible proteomic strategy for exploring differentially expressed proteins in the apoptotic pathway. The appropriate concentrations of ethanol required to induce HepG2 cell apoptosis were first assessed by MTT assay, Giemsa staining, and fluorescence staining. Next, the central proteins involved in the apoptosis pathway were determined using 2D-PAGE, SDS-PAGE, and bio-software analysis. Finally, the downregulation of two proteins, AFP and survivin, was determined by immunocytochemistry and reverse transcriptase PCR (RT-PCR). The simple, useful method demonstrated here provides a new approach to proteomic analysis of key bio-regulatory processes including proliferation, differentiation, apoptosis, immunity, and metastasis.
Abstract: We have measured the pressure drop and convective heat transfer coefficient of water-based Al (25 nm), Al2O3 (30 nm), and CuO (50 nm) nanofluids flowing through a uniformly heated circular tube in the fully developed laminar flow regime. The experimental friction factor data for the nanofluids show good agreement with the analytical prediction of the Darcy equation for single-phase flow. After reducing the experimental results to the form of Reynolds, Rayleigh, and Nusselt numbers, the results show that the local Nusselt number and temperature vary with the non-dimensional axial distance from the tube entry. The study established that the nanofluids behave as Newtonian fluids by demonstrating a linear relationship between shear stress and strain rate for three series of nanofluids, Al, Al2O3, and CuO in water, with concentrations ranging from 0.25 to 2.5 vol%. In addition, the thermophysical properties of the nanofluids (viscosity, specific heat, and density) were measured experimentally in order to validate the property equations developed by other researchers in this area; the difference between the experimental equations and the measured values did not exceed 3.5%. The study also demonstrated the increase in heat transfer coefficient for the three nanofluids, Al, Al2O3, and CuO in water: 45%, 32%, and 25%, respectively, with insulation, and 36%, 23%, and 19% without insulation. Comparing the cases shows that using insulation gives a better increase in heat transfer than not using it. Three types of nanoparticles were used, one metallic and two oxides, to determine which gives the best increase in heat transfer.
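The laminar friction-factor comparison mentioned above can be sketched for a single hypothetical test point; all property values and dimensions below are assumptions for illustration, not the paper's data.

```python
# Assumed nanofluid properties and tube geometry (illustrative only)
rho = 1050.0    # density, kg/m^3
mu = 1.1e-3     # dynamic viscosity, Pa.s
v = 0.12        # mean velocity, m/s
D = 0.004       # tube inner diameter, m
L = 1.5         # heated tube length, m

Re = rho * v * D / mu      # Reynolds number; laminar if below ~2300
f_darcy = 64.0 / Re        # Darcy friction factor, laminar single-phase prediction

# Pressure drop implied by the Darcy-Weisbach equation
dp = f_darcy * (L / D) * 0.5 * rho * v ** 2
```

Comparing a friction factor computed from a measured pressure drop against the `64/Re` line is the standard check that the nanofluid still behaves as a single-phase laminar fluid.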
Abstract: This paper presents an advance in the monitoring and process control of surface roughness in CNC machines for the turning and milling processes. An integration of the in-process monitoring
and process control of the surface roughness is proposed and
developed during the machining process by using the cutting force
ratio. The surface roughness models for the turning and milling processes previously developed by the author are adopted to predict the in-process surface roughness; these models consist of the cutting speed, the feed rate, the tool nose radius, the depth of cut, the rake angle, and the cutting force ratio. The cutting force ratios obtained from the
turning and the milling are utilized to estimate the in-process surface
roughness. The dynamometers are installed on the tool turret of CNC
turning machine and the table of 5-axis machining center to monitor
the cutting forces. The in-process control of the surface roughness
has been developed and proposed to control the predicted surface
roughness. Cutting tests have proved that the proposed integrated system of in-process monitoring and process control can be used to check the surface roughness during cutting by utilizing the cutting force ratio.
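To illustrate how such a model is used for in-process prediction, a power-form regression of the kind common in machining research can be sketched; the coefficient and exponents below are invented, not the author's fitted model, and the rake-angle term is omitted in this sketch.

```python
# Hypothetical fitted coefficients of a power-form surface roughness model:
# Ra = C * V^a * f^b * r^c * d^e * Fr^g   (all values invented)
C, a, b, c, e, g = 1.2, -0.15, 0.8, -0.3, 0.1, 0.5

def predicted_Ra(V, f, r, d, Fr):
    """Predicted in-process surface roughness (um) from cutting speed V (m/min),
    feed rate f (mm/rev), tool nose radius r (mm), depth of cut d (mm), and
    the monitored cutting force ratio Fr (dimensionless)."""
    return C * V**a * f**b * r**c * d**e * Fr**g
```

Because the cutting force ratio `Fr` is measured by the dynamometer during cutting, the prediction can be updated continuously, which is what makes in-process control of the roughness possible.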
Abstract: In this paper, a novel approach for the multidisciplinary design optimization (MDO) of complex mechatronic systems is presented. This approach is part of a global project aiming to include the MDO aspect within an innovative design process. As a first step, the paper considers MDO as a redesign approach limited to parametric optimization. After defining and introducing the relevant keywords, the paper presents the proposed method, which is based on the V-Model commonly used in mechatronics.
Abstract: One of the main advantages of the LO paradigm is to
allow the availability of good quality, shareable learning material
through the Web. The effectiveness of the retrieval process requires a formal description of the resources (metadata) that closely fits the user's search criteria; in spite of the huge international efforts in this field, educational metadata schemata often fail to fulfil this requirement. This work aims to improve the situation through the definition of a metadata model capturing specific didactic features of shareable learning resources. It classifies LOs into "teacher-oriented" and "student-oriented" categories, in order to describe the role an LO is to play when it is integrated into the educational process. This
article describes the model and a first experimental validation process
that has been carried out in a controlled environment.
Abstract: The effect of extraction solvent upon properties
of carrageenan from Eucheuma cottonii was studied. The
distilled water and KOH solutions (0.1-0.5 N) were used as the solvents. The extraction process was carried out in a water bath equipped with a stirrer at a constant speed of 275 rpm, with a constant ratio of seaweed weight to solvent volume (1:50 g/mL), at 86 °C for 45 minutes. The extract was then precipitated in 3 volumes of 90% ethanol and oven-dried at 60 °C. Based on the experimental data, alkali significantly influenced the yield and properties of the extracted carrageenan. The extracted carrageenan was found to have essentially identical FTIR spectra to the reference samples of kappa-carrageenan. Increasing the KOH concentration led to carrageenan with lower sulfate content and lower intrinsic viscosity. The gel strength increased with increasing KOH concentration. The decrease in intrinsic viscosity indicates that polymer degradation occurs during alkali extraction.
Abstract: In single trial analysis, when using Principal
Component Analysis (PCA) to extract Visual Evoked Potential
(VEP) signals, the selection of principal components (PCs) is an
important issue. We propose a new method here that selects only
the appropriate PCs. We denote the method as selective eigen-rate
(SER). In the method, the VEP is reconstructed based on the rate
of the eigenvalues of the PCs. When this technique is applied to emulated VEP signals with added background electroencephalogram (EEG), focusing on extraction of the evoked P3 component, it is found to be feasible. The improvement in signal-to-noise ratio (SNR) is superior to that of two other existing methods of PC selection: Kaiser (KSR) and Residual Power (RP). Although another PC selection method, Spectral Power Ratio (SPR), gives a comparable SNR, SER gives better results in cases with high noise factors (i.e., strong EEG). Next, we applied the SER method to real VEP signals to analyse the P3 responses for matched and non-matched stimuli. The P3 parameters extracted through our proposed SER method showed a higher P3 response for the matched stimulus, which conforms to existing neuroscience knowledge. Single-trial PCA using the KSR and RP methods failed to indicate any difference between the stimuli.
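The eigen-rate idea can be sketched on an emulated ensemble as follows; the signal shape, noise level, and threshold value are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Emulated ensemble: 50 trials of a P3-like waveform with varying amplitude,
# buried in random background "EEG" noise (all parameters are assumed).
t = np.linspace(0.0, 1.0, 200)
p3 = np.exp(-((t - 0.3) / 0.05) ** 2)
amps = 2.0 + 0.3 * rng.standard_normal(50)
signal = amps[:, None] * p3
X = signal + 0.1 * rng.standard_normal((50, 200))

# PCA via eigendecomposition of the sample covariance across trials
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (X.shape[0] - 1)
evals, evecs = np.linalg.eigh(cov)
evals, evecs = evals[::-1], evecs[:, ::-1]        # sort descending

# Selective eigen-rate: keep only PCs whose eigenvalue fraction is large
rate = evals / evals.sum()
keep = rate > 0.10                                 # assumed threshold
X_rec = X.mean(axis=0) + (Xc @ evecs[:, keep]) @ evecs[:, keep].T
```

Reconstructing from only the high-rate components removes most of the background noise in this emulation, which is the SNR improvement the method aims at.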
Abstract: Chronic conditions carry with them strong emotions
and often lead to charged relationships between patients and their
health providers and, by extension, patients and health researchers.
Persons are both autonomous and relational and a purely cognitive
model of autonomy neglects the social and relational basis of chronic
illness. Ensuring genuine informed consent in research requires a
thorough understanding of how participants perceive a study and
their reasons for participation. Surveys may not capture the
complexities of reasoning that underlies study participation.
Contradictory reasons for participation, for instance an initial claim
of altruism as rationale and a subsequent claim of personal benefit
(therapeutic misconception), affect the quality of informed consent.
Individuals apply principles through the filter of personal values and
lived experience. Authentic autonomy, and hence authentic consent to research, occurs within the context of patients' unique life narratives and illness experiences.
Abstract: In Korea, load-follow technology for nuclear power plants has been under development. An automatic controller able to control the reactor temperature and axial power distribution was developed; it consists of an identification algorithm and a model predictive controller. The former numerically transforms the nuclear reactor status into model parameters, and the latter uses them to generate manipulated values, such as the positions of two kinds of control rods. With this automatic controller, the performance of a load-follow operation was evaluated. As a result, the automatic controller, using the generated model parameters of the nuclear reactor, controlled the reactor average temperature and axial power distribution to the desired targets during a daily load follow.
Abstract: The separation of dissolved gas, including dissolved oxygen, can be used for human breathing under water. When one is suddenly shipwrecked or meets a tsunami, one is instantly submerged and cannot breathe under water. To survive such a crisis, dissolved gas separated from the water by the waves can be used for breathing while submerged, while air can be used for breathing when escaping from the water. In this study, we investigated the separation characteristics of dissolved gas using a pipe-type hollow fiber membrane made of polypropylene and a nude-type one made of polysulfone. Hollow fiber membranes with good characteristics under water are used to separate the dissolved gas, while those with good characteristics in air are used to transfer air. The combination of a membrane with good separation characteristics under water and one with good transfer characteristics in air makes it possible to breathe instantly under water and stay alive in a crisis. The results showed that polypropylene performed better than polysulfone under both air and water conditions.
Abstract: Recent years have seen a growing trend towards the
integration of multiple information sources to support large-scale
prediction of protein-protein interaction (PPI) networks in model
organisms. Despite advances in computational approaches, the
combination of multiple “omic" datasets representing the same type
of data, e.g. different gene expression datasets, has not been
rigorously studied. Furthermore, there is a need to further investigate
the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of the prediction of PPI
networks. This paper addresses these limitations by proposing a
Bayesian approach to integrate multiple datasets, some of which
encode the same type of “omic" data to support the identification of
PPI networks. The case study reported involved the combination of
three gene expression datasets relevant to human heart failure (HF).
In comparison with two traditional methods, the Naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
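For contrast with the proposed approach, the Naive Bayesian baseline mentioned above can be sketched as a product of per-dataset likelihood ratios; the ratios and the prior below are invented for illustration.

```python
# Hypothetical evidence for one candidate protein pair from three expression
# datasets. Each likelihood ratio is P(evidence | interaction) divided by
# P(evidence | no interaction); all numbers are made up.
likelihood_ratios = [3.0, 1.8, 0.9]
prior_odds = 0.01 / 0.99              # assumed prior odds of interaction

# Naive Bayes assumption: datasets are conditionally independent, so the
# posterior odds are the prior odds times the product of likelihood ratios.
posterior_odds = prior_odds
for lr in likelihood_ratios:
    posterior_odds *= lr
posterior_prob = posterior_odds / (1.0 + posterior_odds)
```

The conditional-independence assumption is exactly what breaks down when several datasets encode the same type of "omic" data, which motivates the paper's fuller Bayesian treatment.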
Abstract: An active RC filter with 880/1760 MHz dual-bandwidth tuning ability is presented for 60 GHz unlicensed band applications. The third-order Butterworth low-pass filter utilizes two Cherry-Hooper amplifiers to satisfy the very high bandwidth requirement of the amplifiers. The low-pass filter is fabricated in a 90 nm standard CMOS process. Drawing 6.7 mW from a 1.2 V power supply, the low-frequency gains of the filter are -2.5 and -4.1 dB, and the output third-order intercept points (OIP3) are +2.2 and +1.9 dBm for the single-channel and channel-bonding conditions, respectively.
Abstract: This paper presents an algorithm combining ant colony optimization with dynamic programming for solving a dynamic facility layout problem. The problem is separated into two phases, a static phase and a dynamic phase. In the static phase, ant colony optimization is used to find the best-ranked layouts for each period. Then a dynamic programming (DP) procedure is performed in the dynamic phase to evaluate the layout set over the multi-period planning horizon. The proposed algorithm is tested on many problems, with sizes ranging from 9 to 49 departments and with 2 and 4 periods. The experimental results show that the proposed method is an alternative way for the plant layout designer to determine the layouts over a multi-period planning horizon.
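The dynamic phase described above can be sketched as a shortest-path DP over periods; the short ranked candidate lists stand in for the static ACO phase's output, and all costs below are invented for illustration.

```python
# flow_cost[t][k]: material handling cost of candidate layout k in period t
# (made-up numbers; three candidates per period, three periods)
flow_cost = [
    [100, 105, 120],
    [110, 95, 125],
    [98, 102, 90],
]

def rearrange_cost(a, b):
    return 0 if a == b else 15   # assumed fixed cost to switch layouts

T, K = len(flow_cost), len(flow_cost[0])
best = list(flow_cost[0])                     # best[k]: min cost ending in layout k
parent = [[None] * K for _ in range(T)]
for t in range(1, T):
    new = []
    for k in range(K):
        costs = [best[j] + rearrange_cost(j, k) for j in range(K)]
        j = min(range(K), key=costs.__getitem__)
        parent[t][k] = j
        new.append(costs[j] + flow_cost[t][k])
    best = new

# Backtrack the optimal layout sequence over the planning horizon
k = min(range(K), key=best.__getitem__)
plan, total = [], best[k]
for t in range(T - 1, -1, -1):
    plan.append(k)
    k = parent[t][k] if t > 0 else k
plan.reverse()
```

In this toy instance the DP keeps layout 1 in every period, trading a slightly higher flow cost in some periods against the rearrangement cost of switching.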
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth with increased availability of material
goods, especially in developing economies with very low
living standards, because faster economic growth was seen
as the route to poverty eradication.
2. in the second stage, economists made a distinction between
growth and development. Development was seen as going
beyond economic growth, and as bringing certain changes in
the structure of the economy with more equitable
distribution of the benefits of growth, with the growth
becoming automatic and sustained.
3. the third stage has now been reached. Our concern is now with
"sustainable development", that is, development not only
for the present but also for the future.
Thus the focus has changed from "sustained growth" to "sustainable
development". Sustainable development brings to the fore the long-term
relationship between ecology and economic development.
Since its creation in 1972, UNEP has worked for development
without destruction, for environmentally sound and sustained
development. It was realised that the environment cannot be viewed in
a vacuum; it is not separate from development, nor is it in competition
with it. UNEP suggested the integration of the environment with
development, whereby ecological factors enter development planning,
socio-economic policies, cost-benefit analysis, trade, technology
transfer, waste management, education and other specific areas.
Industrialisation has contributed to the growth of the economies of
several countries. It has improved the standards of living of their
people and provided benefits to society. In the process, it has also
created great environmental problems such as climate change, forest
destruction and denudation, soil erosion, and desertification.
On the other hand, industry has provided jobs and improved the
prospects of wealth for the industrialists. Working-class communities
have simply had to put up with high levels of pollution in order to
keep their jobs and protect their incomes.
The environmental problem has many roots, in the political,
economic, cultural and technological conditions of modern society.
Experts concede, however, that industrial growth lies somewhere close
to the heart of the matter. Therefore, the objective of this paper is
not to document all the roots of the environmental crisis but rather
to discuss the effects of industrial growth and development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is necessary
when firms or consumers pollute.
Abstract: Linear control systems with multiple delays, described by differential-difference equations, are often studied in modern control theory. In this paper, algebraic criteria for delay-independent stabilization and a theorem on delay-independent stabilization for linear systems with multiple time delays are established by using a Lyapunov functional and the Riccati algebraic matrix equation from matrix theory. An illustrative example and its simulation results show that the approach to linear systems with multiple time delays is effective.
Abstract: The present study concentrates on solving the along-wind oscillation problem of a tall square building from first principles, and the across-wind oscillation problem of the same building from empirical relations obtained by experiments. The criterion of human comfort under the worst condition, at the top floor of the building, is considered, and a limiting value of the height of a building for a given cross-section is predicted. Numerical integrations are carried out as and when required. The results show the severity of across-wind oscillations in comparison to along-wind oscillations. The comfort criterion is combined with the across-wind oscillation results to determine the maximum allowable height of a building for a given square cross-section.
Abstract: Most decision support systems (DSS) constructed for waste management (WM) are not widely marketed and lack practical applications. This is due to the number of variables and the complexity of the mathematical models, which include the assumptions and constraints required in decision making. The approach taken by many researchers in DSS modelling is to isolate a few key factors that have a significant influence on the DSS. This segmented approach does not provide a thorough understanding of the complex relationships among the many elements involved. The various elements in constructing a DSS must be integrated and optimized in order to produce a viable model that is marketable and has practical application. A DSS model used to assist decision makers should be integrated with GIS, should give robust predictions despite the inherent uncertainties of waste generation and the plethora of waste characteristics, and should give optimal allocation of the waste stream among recycling, incineration, landfill, and composting.
Abstract: In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language such as Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to the finite-state approaches, which cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar as the nucleus of our strategy, which takes into consideration word representations based on affixation and on [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the use of the unification device, and phonological rules modeling the correspondence between lexical and surface forms apply at morpheme boundaries. In the paper, examples from Modern Greek illustrate our approach: morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
Abstract: Mathematical programming has been applied to various
problems. For many actual problems, the assumption that the parameters
involved are deterministic known data is often unjustified. In
such cases, these data contain uncertainty and are thus represented
as random variables, since they represent information about the
future. Decision-making under uncertainty involves potential risk.
Stochastic programming is a commonly used method for optimization
under uncertainty. A stochastic programming problem with recourse
is referred to as a two-stage stochastic problem. In this study, we
consider a stochastic programming problem with simple integer
recourse in which the value of the recourse variable is restricted to a
multiple of a nonnegative integer. The algorithm of a dynamic slope
scaling procedure for solving this problem is developed by using a
property of the expected recourse function. Numerical experiments
demonstrate that the proposed algorithm is quite efficient. The
stochastic programming model defined in this paper is quite useful
for a variety of design and operational problems.
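A one-dimensional sketch of the simple-integer-recourse structure described above: the recourse decision is the smallest nonnegative integer covering the shortfall left by the first-stage decision, and its expectation over a discrete scenario set can be evaluated directly. The scenario data and cost are invented for illustration.

```python
import math

# Assumed discrete scenarios (probability, demand h) and per-unit recourse cost
scenarios = [(0.25, 3.2), (0.50, 5.7), (0.25, 8.1)]
q = 4.0

def expected_recourse(x):
    """E[ q * ceil(max(h - x, 0)) ]: expected cost of the integer recourse
    variable covering the shortfall left by first-stage decision x."""
    return sum(p * q * math.ceil(max(h - x, 0.0)) for p, h in scenarios)
```

For example, at x = 5.0 the middle scenario needs one recourse unit and the high-demand scenario needs four; the resulting expected recourse function is piecewise constant and nonincreasing in x, which is the kind of structural property a slope scaling procedure exploits.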