Abstract: The objective of this research was to investigate the biodegradation of water hyacinth (Eichhornia crassipes) for bioethanol production, using dilute-acid pretreatment (1% sulfuric acid), which results in high hemicellulose decomposition, and the yeast Pachysolen tannophilus as the bioethanol-producing strain. A maximum ethanol yield of 1.14 g/L (yield coefficient 0.24 g g-1; productivity 0.015 g l-1 h-1) was comparable to the predicted value of 32.05 g/L obtained by Central Composite Design (CCD). The maximum ethanol yield coefficient was comparable to those obtained through enzymatic saccharification and fermentation of acid hydrolysate in a fully equipped fermentor. Although the maximum ethanol concentration was low at laboratory scale, improvement of the lignocellulosic ethanol yield is necessary for large-scale production.
Abstract: This study was carried out to reveal the bacterial composition of aerosols in the studied abattoirs. The bacteria isolated were characterized according to microbiological standards. Factors such as temperature and distance were considered as variables in this study. Isolation was carried out at different temperatures (27 °C, 31 °C and 29 °C) and at distances of 100 m and 200 m from the slaughter sites. The results showed that strains of Staphylococcus aureus, Escherichia coli, Bacillus subtilis, Lactobacillus alimentarius and Micrococcus sp. were identified. The total viable counts showed that more microorganisms were present in the morning, while the lowest viable count of 388 CFU was recorded in the evening period of this study. The study also showed that higher microbial loads were recorded at greater distances from the slaughter site. In conclusion, the array of bacteria isolated suggests that abattoir sites may be a potential source of pathogenic organisms to commuters if located within residential environments.
Abstract: This paper presents a new method for reading out piezoresistive accelerometer sensors. The circuit is based on an instrumentation amplifier and is useful for reducing the offset in the Wheatstone bridge. The obtained gain is 645, with an equivalent drift of 1 μV/°C and a power consumption of 1.58 mW. A Schmitt trigger and a multiplexer circuit control the output node, and a high-speed counter is also designed in this work. The proposed circuit is designed and simulated in 0.18 μm CMOS technology with a 1.8 V power supply.
Abstract: Flash floods are natural disasters that can
cause casualties and destroy infrastructure. The problem is
that flash floods, particularly in arid and semi-arid zones, develop
in a very short time. It is therefore important to forecast flash floods
ahead of their occurrence, with a lead time of up to 48 hours, to give
early warning alerts that avoid or minimize disasters. The flash flood
that took place over Wadi Watier, Sinai Peninsula, on October 24, 2008,
has been simulated, investigated and analyzed using a state-of-the-art
regional weather model. The Weather Research and Forecasting (WRF)
model, which is a reliable short-term forecasting tool for precipitation
events, has been applied over the study area. The model results have
been calibrated against real data, for the same date and time, from the
rainfall measurements recorded at the Sorah gauging station. The WRF
model forecasted a total rainfall of 11.6 mm, while the measured value
was 10.8 mm. The calibration shows significant consistency between the
WRF model results and the real measurements.
Abstract: This paper addresses the problem of determining the current 3D location of a moving object and robustly tracking it from a sequence of camera images. The approach presented here uses a particle filter and does not perform any explicit triangulation. Only the color of the object to be tracked is required, but no precise motion model. The observation model we have developed avoids color filtering of the entire image. That, together with the Monte Carlo techniques inside the particle filter, provides real-time performance. Experiments with two real cameras are presented and lessons learned are discussed. The approach scales easily to more than two cameras and to new sensor cues.
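The core loop of such a tracker, a bootstrap particle filter with predict/weight/resample steps, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the color-based observation model is replaced by a simple Gaussian likelihood around a noisy position measurement, and the "motion model" is only a random walk, mirroring the abstract's claim that no precise motion model is needed.

```python
import random
import math

N = 500          # number of particles (illustrative)
MOTION_STD = 0.5 # random-walk spread: a deliberately weak motion model
OBS_STD = 1.0    # assumed observation noise

def step(particles, obs):
    # Predict: diffuse each particle with the random-walk motion model.
    particles = [(x + random.gauss(0, MOTION_STD),
                  y + random.gauss(0, MOTION_STD)) for x, y in particles]
    # Weight: Gaussian likelihood of the observation given each particle.
    weights = [math.exp(-((x - obs[0])**2 + (y - obs[1])**2) / (2 * OBS_STD**2))
               for x, y in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample (multinomial) to concentrate particles on likely states.
    return random.choices(particles, weights=weights, k=N)

random.seed(0)
particles = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(N)]
true_pos = (3.0, -2.0)
for _ in range(20):
    obs = (true_pos[0] + random.gauss(0, OBS_STD),
           true_pos[1] + random.gauss(0, OBS_STD))
    particles = step(particles, obs)

# Point estimate: the mean of the particle cloud.
est = (sum(p[0] for p in particles) / N, sum(p[1] for p in particles) / N)
print(est)  # converges near the true position (3.0, -2.0)
```

Because weighting and resampling only ever touch the particle set, the per-frame cost is independent of image size, which is the property the abstract exploits by avoiding color filtering of the entire image.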
Abstract: Using Internet communication, new home electronics
provide remote monitoring and control functions. However, in
many cases these electronics work standalone, and older appliances
are not supported. We therefore developed a total remote system that
includes not only new electronics but also old ones. The node of this
system is a power-plug adapter that embeds a relay switch and several
sensors, and these nodes communicate with each other. The system server
is built on the Internet, and users access the system from web browsers.
To reduce the setup cost of this system, communication between
adapters uses a ZigBee wireless network instead of wired LAN
cables [3]. From the RSSI (received signal strength indicator)
measured between nodes, the system can roughly estimate which
room each adapter is mounted in, and where within the room, which
also reduces the cost of mapping nodes. Energy saving and house
monitoring are expected applications of this system.
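The kind of coarse RSSI-based placement estimation described above is commonly done with a log-distance path-loss model. The sketch below is illustrative only: the calibration constant, path-loss exponent, anchor names and room labels are all assumptions, not values from the paper.

```python
import math

RSSI_1M = -40.0   # assumed RSSI measured at 1 m from a node (calibration)
PATH_LOSS_N = 2.5 # assumed indoor path-loss exponent

def distance_from_rssi(rssi_dbm):
    """Invert the log-distance model: rssi = RSSI_1M - 10*n*log10(d)."""
    return 10 ** ((RSSI_1M - rssi_dbm) / (10 * PATH_LOSS_N))

def nearest_room(adapter_rssi_by_anchor, room_anchors):
    """Assign the adapter to the room of the anchor heard most strongly."""
    anchor = max(adapter_rssi_by_anchor, key=adapter_rssi_by_anchor.get)
    return room_anchors[anchor]

# An RSSI of -65 dBm maps to roughly 10 m under these constants.
print(round(distance_from_rssi(-65.0), 2))
# An adapter hearing anchor "A" louder than "B" is placed in A's room.
print(nearest_room({"A": -50, "B": -70}, {"A": "kitchen", "B": "bedroom"}))
```

Real indoor RSSI is noisy, so a deployed system would average many readings and combine several anchors; the point here is only that relative signal strengths suffice for room-level placement without manual mapping.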
Abstract: A series of experimental tests was conducted on a
section of a 660 kW wind turbine blade to measure the pressure
distribution of this model oscillating in plunging motion. In order to
minimize the amount of data required to predict the aerodynamic loads
of the airfoil, a General Regression Neural Network (GRNN) was
trained using the measured experimental data. Once the network
proved to be accurate enough, it was used to predict the flow behavior
of the airfoil for the desired conditions.
Results showed that, using only a few of the acquired data points, the
trained neural network was able to predict accurate results with
minimal errors when compared with the corresponding measured
values. Therefore, by employing this trained network, the
aerodynamic coefficients of the plunging airfoil are predicted
accurately at different oscillation frequencies, amplitudes, and angles
of attack, reducing the cost of tests while achieving acceptable
accuracy.
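A GRNN of the kind named above is simple enough to sketch in a few lines: "training" just stores the samples, and a prediction is a Gaussian-kernel weighted average of the stored outputs. The data and bandwidth below are illustrative, not the paper's measured pressure data.

```python
import math

def grnn_predict(x, train_x, train_y, sigma=0.3):
    """GRNN regression: kernel-weighted average of training outputs."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Store a few samples of a known function, then interpolate between them.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.sin(x) for x in xs]
print(grnn_predict(0.75, xs, ys))  # smoothed estimate near sin(0.75)
```

This is why the abstract's claim of "using only a few of the acquired data points" is plausible: a GRNN needs no iterative training, only stored samples and a bandwidth (sigma) choice, and it interpolates smoothly between them.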
Abstract: This research aims to describe the application of robust regression and its advantages over the least-squares regression method in analyzing financial data. To this end, the relationship between earnings per share, book value of equity per share and share price (the price model), and between earnings per share, annual change of earnings per share and stock return (the return model), is examined using both robust and least-squares regression, and the outcomes are compared. Comparing the results of the two methods shows that robust regression can provide a better and more realistic analysis by eliminating or reducing the contribution of outliers and influential data. Therefore, robust regression is recommended for obtaining more precise results in financial data analysis.
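The effect the abstract describes, outliers pulling a least-squares fit away from the bulk of the data while a robust fit resists them, can be shown with a small sketch. This uses a Huber-type fit via iteratively reweighted least squares on synthetic data with one outlier; it is an illustration of the idea, not the paper's price or return models.

```python
def ols(x, y, w=None):
    """(Weighted) least-squares line fit; returns (intercept, slope)."""
    w = w or [1.0] * len(x)
    sw = sum(w); sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)  # slope
    a = (sy - b * sx) / sw                           # intercept
    return a, b

def huber_fit(x, y, delta=1.0, iters=20):
    """Robust fit: reweight residuals with Huber weights and refit."""
    a, b = ols(x, y)
    for _ in range(iters):
        resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        # Weight 1 inside delta, delta/|r| outside: outliers are downweighted.
        w = [1.0 if abs(r) <= delta else delta / abs(r) for r in resid]
        a, b = ols(x, y, w)
    return a, b

x = [1, 2, 3, 4, 5, 6]
y = [2.1, 4.0, 6.2, 7.9, 10.1, 30.0]   # last point is an outlier
print(ols(x, y))        # slope pulled well above 2 by the outlier
print(huber_fit(x, y))  # slope stays near the underlying trend (~2)
```

The contrast in slopes is exactly the advantage claimed in the abstract: the robust estimate tracks the majority of the observations instead of the influential point.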
Abstract: This essay presents applicable methods to reduce human exposure levels in the area around base transceiver stations in an environment with multiple sources, based on ITU-T Recommendation K.70. An example is presented to illustrate the mitigation techniques and their results and to show how they can be applied, especially in developing countries where there is little research on non-ionizing radiation.
Abstract: Workflow Management Systems (WfMS) allow organizations to streamline and automate business processes and reengineer their structure. One important requirement for this type of system is the management and computation of the Quality of Service (QoS) of processes and workflows. Currently, a range of Web process and workflow languages exist. Each language can be characterized by the set of patterns it supports. Developing and implementing a suitable, generic algorithm to compute the QoS of processes that have been designed using different languages is a difficult task, because some patterns are specific to particular process languages and new patterns may be introduced in future versions of a language. In this paper, we describe an adaptive algorithm implemented to cope with these two problems. The algorithm is called adaptive since it can be dynamically changed as the patterns of a process language change.
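One way to picture such an adaptive, pattern-driven QoS computation is a registry of per-pattern reduction functions over a workflow tree: supporting a new pattern of a new language means registering a function, without changing the core algorithm. This sketch computes only execution time, and the pattern names and rules are illustrative assumptions, not the paper's algorithm.

```python
# Registry of reduction rules: pattern name -> aggregation of child times.
PATTERN_RULES = {
    "sequence": lambda times: sum(times),   # tasks run one after another
    "and-split": lambda times: max(times),  # parallel branches: slowest wins
}

def register_pattern(name, rule):
    """Extend the algorithm for a pattern of another process language."""
    PATTERN_RULES[name] = rule

def qos_time(node):
    # A node is either a leaf ("task", time) or ("pattern-name", [children]).
    kind, payload = node
    if kind == "task":
        return payload
    return PATTERN_RULES[kind]([qos_time(c) for c in payload])

wf = ("sequence", [("task", 2.0),
                   ("and-split", [("task", 3.0), ("task", 5.0)]),
                   ("task", 1.0)])
print(qos_time(wf))  # 2.0 + max(3.0, 5.0) + 1.0 = 8.0

# A pattern from a different language is plugged in without touching qos_time:
register_pattern("xor-split", lambda times: max(times))  # worst-case branch
print(qos_time(("xor-split", [("task", 1.0), ("task", 4.0)])))  # 4.0
```

A full QoS model would also aggregate cost and reliability and weight exclusive branches by probability; the registry structure stays the same.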
Abstract: Malay Folk Literature in early childhood education
serves as an important agent in child development, involving
emotional, thinking and language aspects. To date, not
much research has been carried out in Malaysia, particularly on
the teaching and learning aspects, nor has there been an effort to
publish "big books." Hence this article discusses the stance taken by
university undergraduate students, teachers and parents in evaluating
Malay Folk Literature in early childhood education for use as big
books. The data collated and analyzed were taken from 646
respondents, comprising 347 undergraduates and 299 teachers. Results
of the study indicated that Malay Folk Literature can be absorbed into
teaching and learning for early childhood, with a mean of 4.25, and that
it can appear in big books, with a mean of 4.14. Meanwhile, the highest
mean value for placing a Malay Folk Literature genre as big books in
early childhood education rests on exemplary stories for
undergraduates, with a mean of 4.47, and animal fables for teachers,
with a mean of 4.38. The lowest mean value of 3.57 is given to lipurlara
stories. The most popular Malay Folk Literature found suitable for
young children is Sang Kancil and the Crocodile, followed by Bawang
Putih Bawang Merah; Pak Padir, the Legend of Mahsuri, the Origin of
Malacca, and the Origin of the Rainbow are among the popular stories as
well. Overall, the undergraduates show a more positive attitude toward
all the items than the teachers. The t-test analysis revealed a
non-significant difference between the undergraduate students and
teachers on all the items for the teaching and learning of Malay Folk
Literature.
Abstract: A novel idea presented in this paper is to combine
multihop routing with single-frequency networks (SFNs) for a
broadcasting scenario. An SFN is a set of multiple nodes that transmit
the same data simultaneously, resulting in transmitter macrodiversity.
Two of the most important performance factors of multihop
networks, node reachability and routing robustness, are analyzed.
Simulation results show that our proposed SFN-D routing algorithm
improves the node reachability by 37 percentage points as compared
to non-SFN multihop routing. It shows a diversity gain of 3.7 dB,
meaning that 3.7 dB lower transmission powers are required for the
same reachability. Even better results are possible for larger
networks. If an important node becomes inactive, this algorithm can
find new routes that a non-SFN scheme would not be able to find.
Thus, two of the major problems in multihopping are addressed:
achieving robust routing, and improving node reachability or
reducing transmission power.
Abstract: The public sector holds large amounts of data of
various areas such as social affairs, economy, or tourism. Various
initiatives such as Open Government Data or the EU Directive on
public sector information aim to make these data available for public
and private service providers. Requirements for the provision of
public sector data are defined by legal and organizational
frameworks. Surprisingly, the defined requirements hardly cover
security aspects such as integrity or authenticity.
In this paper we discuss the importance of these missing
requirements and present a concept to assure the integrity and
authenticity of provided data based on electronic signatures. We
show that our concept is well suited to the provisioning of
unaltered data. We also show that the concept can be extended
to data that must be anonymized before provisioning by
incorporating redactable signatures. Our proposed concept enhances
the trust and reliability of provided public sector data.
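The hash-based idea underlying redactable signatures can be sketched briefly: the authority signs the list of per-record hashes, so a record can later be removed (anonymized) while the remaining records stay verifiable against the original signature. This is a simplified illustration, not the paper's concept; an HMAC stands in for the asymmetric electronic signature a real deployment would use, and the records are invented.

```python
import hashlib
import hmac

KEY = b"authority-secret"  # illustrative stand-in for a real signing key

def h(record: bytes) -> bytes:
    return hashlib.sha256(record).digest()

def sign(records):
    """Sign the concatenated per-record hashes, not the records themselves."""
    digests = [h(r) for r in records]
    sig = hmac.new(KEY, b"".join(digests), hashlib.sha256).hexdigest()
    return digests, sig

def verify(published, digests, sig):
    """published[i] is the record bytes, or None if it was redacted."""
    for rec, d in zip(published, digests):
        if rec is not None and h(rec) != d:
            return False  # a disclosed record was altered
    expected = hmac.new(KEY, b"".join(digests), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

records = [b"region=A;cases=10", b"name=Jane Doe;cases=1", b"region=B;cases=7"]
digests, sig = sign(records)
redacted = [records[0], None, records[2]]  # anonymize the personal record
print(verify(redacted, digests, sig))      # True: the rest stays authentic
```

True redactable signature schemes go further (e.g. hiding even the number and positions of redactions), but the sketch shows why redaction need not invalidate the original signature.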
Abstract: This paper compares six approaches to object serialization
from qualitative and quantitative aspects: object
serialization in Java, IDL, XStream, Protocol Buffers, Apache Avro,
and MessagePack. Using each approach, a common example is
serialized to a file and the size of the file is measured. The qualitative
comparison examines whether a schema definition is required,
whether a schema compiler is required, whether the serialization is
ASCII- or binary-based, and which programming languages are
supported. It is clear that there is no single best solution; each
performs well in the context for which it was developed.
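The quantitative side of such a comparison is easy to reproduce in miniature. The paper's measurements are in Java; this Python sketch mirrors the idea with three standard-library approaches: JSON (ASCII, schema-free), pickle (binary, schema-free), and struct (binary, with a format string playing the role of a schema). The example record is illustrative.

```python
import json
import pickle
import struct

record = {"id": 12345, "score": 3.14, "name": "avro"}

json_bytes = json.dumps(record).encode()
pickle_bytes = pickle.dumps(record)
# struct needs an agreed "schema": 4-byte int, 8-byte double, 4-byte string.
struct_bytes = struct.pack("!id4s", record["id"], record["score"],
                           record["name"].encode())

for label, b in [("json", json_bytes), ("pickle", pickle_bytes),
                 ("struct", struct_bytes)]:
    print(label, len(b))
# The schema-based binary encoding is the smallest (16 bytes here):
# compactness is traded for schema tooling, the same trend the paper reports.
```

The field names disappear entirely from the struct output because the schema carries them, which is exactly why schema-based binary formats such as Protocol Buffers and Avro tend to win on size.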
Abstract: Recently, many researchers have been attracted to retrieving
multimedia databases using impression words and their values.
Ikezoe's research is one representative and uses eight pairs of
opposite impression words. We modified its retrieval interface and
proposed '2D-RIB' in previous work. The aim of the present paper
is to improve the user's satisfaction with the retrieval results in
2D-RIB. Our method is to extend 2D-RIB. One extension is
to define and introduce the following two measures: 'melody
goodness' and 'general acceptance'. Another extension is three types
of customization menus. The results of an evaluation using a pilot
system are as follows. Both measures, 'melody goodness'
and 'general acceptance', can contribute to the improvement.
Moreover, it is effective to introduce the customization menu
that enables a searcher to relax the strictness of the
retrieval condition for an impression pair according to his or her needs.
Abstract: This paper discusses optimisation work on a method of processing ceramic/metal composite coatings for various applications, based on preliminary work on processing anodes for solid oxide fuel cells (SOFCs). The composite coating is manufactured by the simultaneous electroless co-deposition of nickel and yttria-stabilised zirconia (YSZ) on to a ceramic substrate. The effects on coating characteristics of substrate surface treatments and of electroless nickel bath parameters such as pH and agitation method are also investigated. Characterisation of the resulting deposit by scanning electron microscopy (SEM) and energy-dispersive X-ray analysis (EDXA) is also discussed.
Abstract: Coherent and incoherent scattering cross section measurements have been carried out with an HPGe detector on elements in the range Z = 13-50 using 241Am gamma rays. The cross sections have been derived by comparing the net count rate obtained from the Compton peak of aluminium with the corresponding peak of the target. The measured cross sections for the coherent and incoherent processes are compared with theoretical values and earlier reported values. Our results are in agreement with the theoretical values.
Abstract: This paper introduces a novel design for a boring bar with enhanced damping capability. The principle followed in the design phase was to enhance the damping capability while minimizing the loss in static stiffness through the implementation of composite material interfaces. The newly designed tool has been compared to a conventional tool. The evaluation criteria were the dynamic characteristics, frequency and damping ratio, of the machining system, as well as the surface roughness of the machined workpieces. The use of composite material in the design of the damped tool has been demonstrated to be effective. Furthermore, the autoregressive moving average (ARMA) models presented in this paper take into consideration the interaction between the elastic structure of the machine tool and the cutting process, and can therefore be used to characterize the machining system under operational conditions.
Abstract: In the Equivalent Transformation (ET) computation
model, a program is constructed by the successive accumulation of
ET rules. A meta-computation method by which correct ET
rules are generated has been proposed. Although the method covers a
broad range of ET rule generation, not all important ET rules
are necessarily generated. Generation of more ET rules can be
achieved by supplementing it with generation methods specialized
for important ET rules. A Specialization-by-Equation (Speq) rule is
one of those important rules. A Speq rule describes a procedure in
which two variables included in an atom conjunction are equalized
due to predicate constraints. In this paper, we propose an algorithm
that systematically and recursively generates Speq rules and discuss
its effectiveness in the synthesis of ET programs. A Speq rule is
generated based on the proof of a logical formula consisting of a given
atom set and a disequality. The proof is carried out by utilizing some
ET rules, and the rules ultimately obtained are used in generating Speq rules.
Abstract: The link between Gröbner bases and linear algebra was
described by Lazard [4,5], who realized that the Gröbner basis
computation could be achieved by applying Gaussian elimination to
Macaulay's matrix.
In this paper, we indicate how the same technique may be applied to
SAGBI-Gröbner basis computations in invariant rings.
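The linear-algebra view named above can be pictured with a tiny example: polynomials become coefficient rows of a Macaulay-style matrix over an ordered monomial basis, and Gaussian elimination performs the reductions. This sketch is purely illustrative; a real Gröbner or SAGBI-Gröbner computation multiplies the generators by monomials to build far larger matrices.

```python
from fractions import Fraction

# Monomial basis under a graded order on x, y (illustrative choice):
monomials = ["x^2", "x*y", "y^2", "x", "y", "1"]

# Rows = coefficient vectors of the input polynomials in that basis.
rows = [
    [Fraction(c) for c in (1, 1, 0, 0, 0, -1)],  # x^2 + x*y - 1
    [Fraction(c) for c in (1, 0, 1, 0, 0, 0)],   # x^2 + y^2
]

def row_reduce(rows):
    """Fraction-exact Gauss-Jordan elimination (reduced row echelon form)."""
    rows = [r[:] for r in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((j for j in range(r, len(rows)) if rows[j][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        inv = rows[r][col]
        rows[r] = [c / inv for c in rows[r]]   # normalize the pivot row
        for j in range(len(rows)):
            if j != r and rows[j][col] != 0:   # eliminate the column elsewhere
                f = rows[j][col]
                rows[j] = [c - f * p for c, p in zip(rows[j], rows[r])]
        r += 1
        if r == len(rows):
            break
    return rows

for row in row_reduce(rows):
    print([str(c) for c in row])
# Second row encodes the reduction x*y - y^2 - 1: the leading x^2 terms cancel,
# exactly the S-polynomial-style cancellation that elimination performs.
```

Working over Fractions keeps the arithmetic exact, which matters because Gröbner-type computations are carried out over exact coefficient fields.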