Abstract: The objective of this paper was to design a
ventilation system that enhances the performance of a roof solar collector
(RSC) to reduce heat accumulation inside a house. The RSC has a
1.8 m² surface area, made of CPAC Monier roof tiles on the upper part
and gypsum board on the lower part; the space between the CPAC
Monier tiles and the gypsum board was fixed at 14 cm.
The ventilation system of the modified roof solar collector (modified
RSC) consists of nine tubes of 0.15 m diameter installed in the
lower part of the RSC. Experimental results showed reductions in both
room and attic temperatures. The average room temperature
reduction in the house with the modified RSC is about 2 °C, and the
percentage reduction in room temperature varied between 0 and 10%.
Therefore, the modified RSC is an interesting option in that it
promotes the use of solar energy and conserves energy.
Abstract: Concept maps can be generated manually or
automatically. It is important to recognize the differences between the two
types of concept maps. The automatically generated concept maps
are dynamic, interactive, and full of associations between the terms
on the maps and the underlying documents. Through a specific
concept mapping system, Visual Concept Explorer (VCE), this paper
discusses how automatically generated concept maps are different
from manually generated concept maps and how different
applications and learning opportunities might be created with the
automatically generated concept maps. The paper presents several
examples of learning strategies that take advantage of the
automatically generated concept maps for concept learning and
exploration.
Abstract: The main objective of this paper is to contribute to the
existing knowledge transfer and IT outsourcing literature,
specifically in the context of Malaysia, by reviewing the current
practices of e-government IT outsourcing in Malaysia, including the
issues and challenges faced by public agencies in transferring
knowledge during the engagement. This paper discusses various
factors and different theoretical models of knowledge transfer, from
the traditional models to recent models suggested by
scholars. The present paper attempts to align organizational
knowledge through the knowledge-based view (KBV) and
organizational learning (OL) lenses. This review could help shape the
direction of both future theoretical and empirical studies on inter-firm
knowledge transfer, specifically on how the KBV and OL perspectives
could play a significant role in explaining the complex relationships
between client and vendor in inter-firm knowledge transfer, and on
the role of the organizational management information system and
the Transactive Memory System (TMS) in facilitating the organizational
knowledge transfer process. Conclusions are drawn and further
research is suggested.
Abstract: An end-member selection method for spectral unmixing based on Particle Swarm Optimization (PSO) is developed in this paper. The algorithm uses the K-means clustering algorithm and a method of dynamic selection of end-member subsets to find the appropriate set of end-members for a given set of multispectral images. The proposed algorithm has been successfully applied to test image sets from various platforms, such as LANDSAT 5 MSS and NOAA's AVHRR. The experimental results of the proposed algorithm are encouraging. The influence of different values of the algorithm's control parameters on performance is studied. Furthermore, the performance of different versions of PSO is also investigated.
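The abstract does not give the PSO update equations, so as a hedged illustration, here is a minimal particle swarm optimizer in Python using the standard inertia-weight velocity update. The objective function, bounds, and coefficient values are assumptions for demonstration, not the paper's actual end-member selection objective.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal PSO sketch: minimize `objective` over a box-bounded search space."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm's global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In an end-member selection setting, the objective would score a candidate end-member subset by its spectral reconstruction error; here any callable works.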
Abstract: Enzymatic saccharification of biomass for reducing
sugar production is one of the crucial processes in biofuel production
through biochemical conversion. In this study, enzymatic
saccharification of dilute potassium hydroxide (KOH) pre-treated
Tetraselmis suecica biomass was carried out by using cellulase
enzyme obtained from Trichoderma longibrachiatum. Initially, the
pre-treatment conditions were optimised by changing alkali reagent
concentration, retention time for reaction, and temperature. The T.
suecica biomass after pre-treatment was also characterized using
Fourier transform infrared spectroscopy and scanning electron
microscopy. These analyses revealed that functional groups such
as acetyl and hydroxyl groups, as well as the structure and surface of the T. suecica
biomass, were changed by pre-treatment, which favours the
enzymatic saccharification process. Comparison of enzymatic
saccharification of untreated and pre-treated microalgal biomass
indicated that a higher level of reducing sugar can be obtained from
pre-treated T. suecica. Enzymatic saccharification of pre-treated T.
suecica biomass was optimised by changing temperature, pH, and
enzyme concentration to solid ratio ([E]/[S]). The highest conversion of
carbohydrate into reducing sugar (95%), corresponding to a reducing sugar
yield of 20 wt% from pre-treated T. suecica, was obtained
at 40 °C, pH 4.5, and [E]/[S] of 0.1
after 72 h of incubation. Hydrolysate obtained from enzymatic
saccharification of pretreated T. suecica biomass was further
fermented into biobutanol using Clostridium saccharoperbutyliticum
as a biocatalyst. The results of this study demonstrate the positive
prospect of applying dilute alkaline pre-treatment to enhance
enzymatic saccharification and biobutanol production from
microalgal biomass.
Abstract: Data envelopment analysis (DEA) has gained great popularity in environmental performance measurement because it can provide a synthetic, standardized environmental performance index when pollutants are suitably incorporated into the traditional DEA framework. Since some environmental performance indicators cannot be controlled by company managers, it is necessary to develop the model so that it can be applied when discretionary and/or non-discretionary factors are involved. In this paper, we present a semi-radial DEA approach to measuring environmental performance that incorporates non-discretionary factors. The model is then applied to a real case.
Abstract: The trends in the design and development of information systems have undergone a variety of ongoing phases and stages. These variations have evolved due to rapid changes in user requirements and business needs. To meet these requirements and needs, a flexible and agile business solution was required to keep up with the latest business trends and styles. Another obstacle to the agility of information systems was the typically different treatment given to the same disease in two patients: business processes and information services. After the emergence of information technology, business processes and information systems became counterparts, yet these two business halves have been treated under totally different standards. There is a need to streamline the boundaries of these two pillars, which equally share an information system's burdens and liabilities. In the last decade, object orientation has evolved into one of the major solutions for modern business needs; now, SOA is the solution for shifting business onto an electronic platform, and BPM is another modern business solution that helps regularize the optimization of business processes. This paper discusses how object orientation can be used to incorporate or embed SOA in BPM for improved information systems.
Abstract: Every organization is continually subject to new damages and threats, which can result from its operations or the pursuit of its goals. Methods of securing the workspace and its tools have changed considerably with the increasing application and development of information technology (IT). From this viewpoint, information security management systems evolved to build on proven methods rather than repeatedly reinvent them. In general, a correct response in information security management systems requires correct decision making, which in turn requires the comprehensive effort of managers and everyone involved in each plan or decision. Obviously, not all aspects of a task or decision are defined under all decision-making conditions; therefore, possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence decisions. Investigation of different approaches in the field of risk management demonstrates their progress from quantitative to qualitative methods with a process approach.
Abstract: Knowledge sharing in general and the contextual
access to knowledge in particular, still represent a key challenge in
the knowledge management framework. Researchers in the semantic
web and human-machine interface fields study techniques to enhance this
access. For instance, in the semantic web, information retrieval is
based on domain ontologies. In human-machine interfaces, keeping
track of the user's activity provides some elements of the context that can
guide the access to information. We suggest an approach based on
these two key guidelines, whilst avoiding some of their weaknesses.
The approach permits a representation of both the context and the
design rationale of a project for an efficient access to knowledge. In
fact, the method consists of an information retrieval environment
that, on the one hand, can infer knowledge, modeled as a semantic
network, and on the other hand, is based on the context and the
objectives of a specific activity (the design). The environment we
defined can also be used to gather similar project elements in order to
build classifications of tasks, problems, arguments, etc. produced in a
company. These classifications can show the evolution of design
strategies in the company.
Abstract: Results of Chilean wine classification based on the
information provided by an electronic nose are reported in this paper.
The classification scheme consists of two stages: first,
Principal Component Analysis is used as a feature extraction method to
reduce the dimensionality of the original information; then, a Radial
Basis Function Neural Network is used as the pattern recognition
technique to perform the classification. The objective of this study is
to classify different Cabernet Sauvignon, Merlot and Carménère wine
samples from different years, valleys and vineyards of Chile.
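As a rough sketch of this two-stage pipeline (not the authors' implementation; the 2-D data, kernel width, and class prototypes are invented for illustration), the following Python code projects 2-D sensor features onto their first principal component and then classifies the projected value with Gaussian radial basis functions:

```python
import math

def pca_1d(points):
    """Project 2-D points onto their first principal component (largest-variance axis).
    Uses the closed-form eigen-direction of a 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n          # var(x)
    c = sum((p[1] - my) ** 2 for p in points) / n          # var(y)
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n # cov(x, y)
    theta = 0.5 * math.atan2(2 * b, a - c)                 # principal-axis angle
    ux, uy = math.cos(theta), math.sin(theta)
    return [(p[0] - mx) * ux + (p[1] - my) * uy for p in points]

def rbf_classify(x, centers, labels, sigma=1.0):
    """Assign x to the class whose Gaussian basis function responds most strongly."""
    scores = {}
    for center, lab in zip(centers, labels):
        scores[lab] = scores.get(lab, 0.0) + math.exp(-((x - center) ** 2) / (2 * sigma ** 2))
    return max(scores, key=scores.get)
```

A real electronic-nose pipeline would keep several principal components and train the RBF centers and weights; this sketch keeps one component and fixed prototypes to show the flow of data through the two stages.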
Abstract: The aim of this study was to investigate the
environmental conservation behavior of the Applied Health Science
students of Suranaree University of Technology, a green and clean
university. The sample group was 184 Applied Health Science
students (medical, nursing, and public health). A questionnaire was
used to collect information.
The study found that the students had more negative
than positive behaviors towards energy, water, and forest
conservation. This result can be used as baseline information for
designing long-term behavior modification activities or research
projects on environmental conservation. Thus, Applied Health
Science students will be encouraged to become environmentally
conscious and to set a good example of environmental conservation behavior.
Abstract: This study reports the implementation of Good
Manufacturing Practice (GMP) in a polycarbonate film processing
plant. The implementation of GMP took place with the creation of a
multidisciplinary team. It was carried out in four steps: conduct gap
assessment, create gap closure plan, close gaps, and follow up the
GMP implementation. The basis for the gap assessment is the
guideline for GMP for plastic materials and articles intended for Food
Contact Material (FCM), edited by Plastic Europe. The
results of the GMP implementation in this study showed
100% completion of the gap assessment. The key success factors for
implementing GMP in a production process are the commitment,
intention, and support of top management.
Abstract: Previous algorithms for generating and mapping 3D model textures from multi-view images have issues in texture chart generation, namely self-intersection and concentration of the texture in texture space. They may also suffer from problems in occluded areas, such as the inner parts of the thighs. In this paper we propose a texture mapping technique for 3D models that uses multi-view images on the GPU. We perform texture mapping directly in the GPU fragment shader, per pixel, without generating a texture map, and we resolve occluded areas using the 3D model's depth information. Our method requires more GPU computation than previous works, but it achieves real-time performance and the previously mentioned problems do not occur.
Abstract: The present study was designed to investigate the
cardioprotective role of chronic oral administration of an alcoholic
extract of Terminalia arjuna in in-vivo ischemic reperfusion injury
and the induction of HSP72. Rabbits, divided into three groups,
were administered the alcoholic extract of the bark powder of
Terminalia arjuna (TAAE) by oral gavage [6.75 mg/kg (T1) and
9.75 mg/kg (T2), 6 days/week for 12 weeks]. In open-chest
Ketamine pentobarbitone anaesthetized rabbits, the left anterior
descending coronary artery was occluded for 15 min of ischemia
followed by 60 min of reperfusion. In the vehicle-treated group,
ischemic-reperfusion injury (IRI) was evidenced by depression of
global hemodynamic function (MAP, HR, LVEDP, peak LV (+) & (-
) (dP/dt) along with depletion of HEP compounds. Oxidative stress
in IRI was evidenced by raised levels of myocardial TBARS and
depletion of endogenous myocardial antioxidants GSH, SOD and
catalase. Western blot analysis showed a single band corresponding
to 72 kDa in homogenates of hearts from rabbits treated with both the
doses. In the TAAE treatment groups, both doses showed better recovery of
myocardial hemodynamic function, with a significant reduction in
TBARS and a rise in SOD, GSH, and catalase. The results
of the present study suggest that the alcoholic extract of the bark
powder of Terminalia arjuna in rabbits induces myocardial HSP72,
augments myocardial endogenous antioxidants without causing
any cellular injury, and offers better cardioprotection against the
oxidative stress associated with myocardial IR injury.
Abstract: A key to success of high quality software development
is to define a valid and feasible requirements specification. We have
proposed a method of model-driven requirements analysis using
Unified Modeling Language (UML). The main feature of our method
is to automatically generate a Web user interface mock-up from UML
requirements analysis model so that we can confirm the validity of
input/output data for each page and page transition on the system by
directly operating the mock-up. This paper proposes a support method
to check the validity of a data life cycle by using the model checking tool
"UPPAAL", focusing on CRUD (Create, Read, Update, and Delete) operations.
Exhaustive checking improves the quality of the requirements analysis
model, which is validated by the customers through the automatically
generated mock-up. The effectiveness of our method is discussed through a
case study of requirements modeling for two small projects: a
library management system and a textbook sales support system
at a university.
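The CRUD life-cycle property being verified can be illustrated, in a much simplified form, by the following Python sketch. The event format and rules here are assumptions for illustration; UPPAAL itself verifies such properties over timed automata models, not raw traces.

```python
def check_crud_lifecycle(trace):
    """Toy life-cycle check over a sequence of (operation, entity) events:
    every entity must be Created before it is Read/Updated, and never
    touched again after Delete. Returns a list of violation messages."""
    state = {}    # entity -> 'live' or 'deleted'
    errors = []
    for op, entity in trace:
        if op == "C":
            if state.get(entity) == "live":
                errors.append(f"{entity}: created twice")
            state[entity] = "live"
        elif op in ("R", "U"):
            if state.get(entity) != "live":
                errors.append(f"{entity}: {op} before create or after delete")
        elif op == "D":
            if state.get(entity) != "live":
                errors.append(f"{entity}: delete without live instance")
            else:
                state[entity] = "deleted"
    return errors
```

A model checker goes further than this trace check: it explores every reachable interleaving of page transitions exhaustively, which is what makes the UPPAAL-based approach stronger than testing individual scenarios.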
Abstract: In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Consequently, protocols that minimize sensor power consumption receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, a technique that combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Because processing information consumes less power than transmitting it, data aggregation is of great importance, and for this reason the technique is used in many protocols [5]. One data aggregation technique is to use a data aggregation tree, but finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined at intermediate nodes into a single packet, so fewer packets are transmitted in the network, less energy is consumed, and the longevity of the network improves. Heuristic methods are used to tackle this NP-hard problem; one such optimization method is Simulated Annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using a Simulated Annealing algorithm, and we evaluate its efficiency against a Genetic Algorithm.
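As a hedged sketch of the idea (not the authors' algorithm; the cost model, move operator, and cooling schedule are assumptions), the following Python code uses Simulated Annealing to search for a low-cost aggregation tree rooted at the sink, with total parent-link distance standing in for transmission energy:

```python
import math
import random

def sa_aggregation_tree(nodes, sink=0, iters=5000, t0=1.0, alpha=0.999):
    """SA sketch: search over parent assignments (a tree rooted at `sink`)
    minimizing total link distance, a proxy for transmission energy."""
    n = len(nodes)

    def dist(i, j):
        return math.hypot(nodes[i][0] - nodes[j][0], nodes[i][1] - nodes[j][1])

    def cost(parent):
        return sum(dist(i, parent[i]) for i in range(n) if i != sink)

    def creates_cycle(parent, child, new_parent):
        # walk up from the proposed parent; reaching `child` would close a cycle
        v = new_parent
        while v != sink:
            if v == child:
                return True
            v = parent[v]
        return False

    parent = [sink] * n                 # initial star: every node talks to the sink
    best, cur_c = parent[:], cost(parent)
    best_c, t = cur_c, t0
    for _ in range(iters):
        i = random.randrange(n)
        if i == sink:
            continue
        j = random.randrange(n)         # candidate new parent for node i
        if j == i or creates_cycle(parent, i, j):
            continue
        old = parent[i]
        parent[i] = j
        new_c = cost(parent)
        # accept improvements always, worse moves with Boltzmann probability
        if new_c < cur_c or random.random() < math.exp((cur_c - new_c) / t):
            cur_c = new_c
            if new_c < best_c:
                best, best_c = parent[:], new_c
        else:
            parent[i] = old             # reject: undo the move
        t *= alpha                      # geometric cooling
    return best, best_c
```

The move operator (reparent one node, rejecting cycles) keeps every candidate a valid tree, so the annealing chain only ever visits feasible aggregation trees.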
Abstract: We investigated oxidative DNA damage caused by
radio frequency radiation using 8-oxo-7, 8-dihydro-2'-
deoxyguanosine (8-oxodG) generated in mice tissues after exposure
to 900 MHz mobile phone radio frequency in three independent
experiments. The RF was generated by a Global System for Mobile
Communication (GSM) signal generator. The radio frequency field
was adjusted to 25 V/m. The whole body specific absorption rate
(SAR) was 1.0 W/kg. Animals were exposed to this field for 30 min
daily for 30 days. 24 h post-exposure, blood serum, brain and spleen
were removed and DNA was isolated. Enzyme-linked
immunosorbent assay (ELISA) was used to measure 8-oxodG
concentration. All animals survived the whole experimental period.
The body weight of animals did not change significantly at the end of
the experiment. No statistically significant differences were observed in
the levels of oxidative stress. Our results are not in favor of the
hypothesis that 900 MHz RF induces oxidative damage.
Abstract: Researchers have been applying artificial and computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results on the comparison of three evolutionary algorithms, evolution strategy (ES), genetic algorithm (GA), and their hybrid, applied to evolving controller agents for the CIG 2007 Simulated Car Racing competition. Our experimental results show that premature convergence of solutions was observed in the case of ES, and that GA outperformed ES in the last half of the generations. In addition, a hybrid that uses GA first and ES next evolved the best solution among all solutions generated. This result demonstrates the ability of GA to globally search promising areas in the early stage and the ability of ES to locally search the focused area (fine-tuning solutions).
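The GA-first, ES-next hybrid described above can be sketched as follows. This is a toy version on a generic objective; the population sizes, operators, and step-size schedule are illustrative assumptions, not the competition's controller representation.

```python
import random

def ga_then_es(objective, dim, pop=20, ga_gens=30, es_gens=30, bounds=(-5.0, 5.0)):
    """Hybrid sketch: a GA explores globally first, then a (mu + lambda)
    evolution strategy fine-tunes the best GA individuals."""
    lo, hi = bounds

    def tournament(P):
        a, b = random.sample(P, 2)
        return a if objective(a) < objective(b) else b

    P = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    # --- GA phase: tournament selection, uniform crossover, reset mutation ---
    for _ in range(ga_gens):
        Q = []
        for _ in range(pop):
            p1, p2 = tournament(P), tournament(P)
            child = [p1[d] if random.random() < 0.5 else p2[d] for d in range(dim)]
            if random.random() < 0.1:                   # occasional gene reset
                child[random.randrange(dim)] = random.uniform(lo, hi)
            Q.append(child)
        P = Q
    # --- ES phase: (mu + lambda) selection with shrinking Gaussian steps ---
    mu, sigma = max(2, pop // 4), 0.5
    P.sort(key=objective)
    for _ in range(es_gens):
        parents = P[:mu]
        offspring = [[min(hi, max(lo, x + random.gauss(0.0, sigma))) for x in p]
                     for p in [random.choice(parents) for _ in range(pop)]]
        P = sorted(parents + offspring, key=objective)[:pop]   # elitist survival
        sigma *= 0.95                                          # focus the search
    return P[0], objective(P[0])
```

The division of labor mirrors the abstract's finding: crossover-driven GA sampling covers the space early, while the elitist ES with a decaying mutation step performs the local fine-tuning.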
Abstract: A ten-year grazing study was conducted at the
Agriculture and Agri-Food Canada Brandon Research Centre in
Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P,
K, and S) addition on economics and efficiency of non-renewable
energy use in meadow brome grass-based pasture systems for beef
production. Fertilizing grass-only or alfalfa-grass pastures to full soil
test recommendations improved pasture productivity, but did not
improve profitability compared to unfertilized pastures. Fertilizing
grass-only pastures resulted in the highest net loss of any pasture
management strategy in this study. Adding alfalfa at the time of
seeding, with no added fertilizer, was economically the best pasture
improvement strategy in this study. Because of moisture limitations,
adding commercial fertilizer to full soil test recommendations is
probably not economically justifiable in most years, especially with
the rising cost of fertilizer. Improving grass-only pastures by adding
fertilizer and/or alfalfa required additional non-renewable energy
inputs; however, the additional energy required for unfertilized
alfalfa-grass pastures was minimal compared to the fertilized
pastures. Of the four pasture management strategies, adding alfalfa
to grass pastures without adding fertilizer had the highest efficiency
of energy use. Based on energy use and economic performance, the
unfertilized alfalfa-grass pasture was the most efficient and
sustainable pasture system.
Abstract: Image Compression using Artificial Neural Networks
is a topic where research is being carried out in various directions
towards achieving a generalized and economical network.
Feedforward networks trained with the back-propagation algorithm,
which uses steepest descent for error minimization, are popular,
widely adopted, and directly applied to image compression.
Various research works are directed towards achieving quick
convergence of the network without loss of quality of the restored
image. In general the images used for compression are of different
types, such as dark images and high-intensity images. When these images
are compressed using a back-propagation network, it takes a longer
time to converge. The reason is that the given image may
contain a number of distinct gray levels with only narrow differences
from their neighboring pixels. If the gray levels of the pixels in an image
are mapped in such a way that the difference between each pixel
and its neighbors is minimized, then the
compression ratio as well as the convergence of the network can be
improved. To achieve this, a cumulative distribution function is
estimated for the image and used to map the image pixels. When
the mapped image pixels are used, the back-propagation neural
network yields a high compression ratio and converges quickly.
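The CDF-based pixel mapping described above resembles histogram equalization. As an illustrative sketch (the gray-level range and rounding rule are assumptions, not necessarily the authors' exact mapping), the following Python function estimates the image's CDF and remaps each pixel through it:

```python
def cdf_map(pixels, levels=256):
    """Histogram-equalization-style mapping: estimate the cumulative
    distribution function of the gray levels and remap each pixel through
    it, spreading the occupied levels across the full output range."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                      # running sum = empirical CDF (unnormalized)
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(pixels)
    scale = (levels - 1) / max(1, n - cdf_min)
    return [round((cdf[p] - cdf_min) * scale) for p in pixels]
```

After this remapping, neighboring pixels that differed by only a few crowded gray levels are pushed further apart on the output scale, which is the property the abstract credits for the faster back-propagation convergence.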