Abstract: An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors, namely the tanning, textile finishing and food processing industries. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling and quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluents from all three industrial sectors studied were found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. KL values (quoted to base e) were greater than 0.50 day-1, compared with 0.39 day-1 for typical municipal wastewater.
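The kL values quoted above enter the standard first-order BOD model BOD_t = L0(1 - e^(-k t)). The short sketch below is purely illustrative (it is not part of the study, and the ultimate BOD value is a made-up figure); it shows how a rate constant above 0.50 day-1, versus the typical municipal 0.39 day-1, changes the fraction of ultimate BOD exerted after five days.

```python
# Illustrative sketch only: first-order BOD decay with the quoted rate
# constants kL (base e).  The ultimate BOD L0 is a hypothetical value.
import math

def bod_exerted(l0, k_per_day, t_days):
    """BOD exerted after t days: BOD_t = L0 * (1 - exp(-k * t))."""
    return l0 * (1.0 - math.exp(-k_per_day * t_days))

L0 = 100.0  # hypothetical ultimate BOD, mg/L
for k in (0.39, 0.50):  # typical municipal vs. lower bound of the measured kL, day^-1
    print(f"k = {k:.2f} day^-1 -> BOD5 = {bod_exerted(L0, k, 5):.1f} mg/L")
```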
Abstract: In the highly competitive and rapidly changing global marketplace, independent organizations and enterprises often come together and form a temporary alliance, a virtual enterprise within a supply chain, to better provide products or services. As firms adopt the systems approach implicit in supply chain management, they must manage quality through both internal process control and external control of supplier quality and customer requirements. How to incorporate the quality management of upstream and downstream supply chain partners into their own quality management system has recently received a great deal of attention from both academia and practice. This paper investigates the collaborative features and the entity relationships in a supply chain, and presents an ontology of the collaborative supply chain built by aligning a service-oriented framework with service-dominant logic. This perspective facilitates the segregation of material flow management from manufacturing capability management, which provides a foundation for the coordination and integration of business processes to measure, analyze, and continually improve the quality of products, services, and processes. Further, this approach characterizes the different interests of supply chain partners, providing an innovative way to analyze the collaborative features of a supply chain. Furthermore, this ontology is the foundation for developing a quality management system which internalizes the quality management of upstream and downstream supply chain partners and manages quality across the supply chain systematically.
Abstract: In this paper we will develop a sequential life test approach applied to a modified low alloy-high strength steel part used in highway overpasses in Brazil. We will consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life will be considered equal to zero. We will use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low alloy-high strength steel part has been changed, there is little information available about the possible values that the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions could have. To estimate the shape and the scale parameters of these two sampling models we will use a maximum likelihood approach for censored failure data. We will also develop a truncation mechanism for the Inverse Weibull and Normal models. We will provide rules to truncate a sequential life testing situation, making one of the two possible decisions at the moment of truncation; that is, accept or reject the null hypothesis H0. An example will develop the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
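The maximum-likelihood step referred to above can be sketched as follows, assuming the usual Frechet parameterization of the Inverse Weibull, F(t) = exp(-(theta/t)^beta), and right-censored run-outs; the failure and run-out times below are hypothetical, and this is not the paper's own code.

```python
# Minimal sketch (assumed form, not the paper's code): maximum-likelihood
# estimation of the Inverse Weibull (Frechet) shape and scale parameters
# from right-censored fatigue-life data.  All data values are made up.
import numpy as np
from scipy.optimize import minimize

failures = np.array([120.0, 151.0, 203.0, 265.0])   # observed failure times (hypothetical)
censored = np.array([300.0, 300.0, 300.0])           # right-censored run-outs (hypothetical)

def neg_log_lik(params):
    beta, theta = np.exp(params)      # optimize on the log scale to keep both > 0
    # Inverse Weibull: F(t) = exp(-(theta/t)**beta)
    log_pdf = (np.log(beta / theta)
               + (beta + 1) * np.log(theta / failures)
               - (theta / failures) ** beta)
    log_surv = np.log1p(-np.exp(-(theta / censored) ** beta))
    return -(log_pdf.sum() + log_surv.sum())

res = minimize(neg_log_lik, x0=np.log([1.5, 200.0]), method="Nelder-Mead")
beta_hat, theta_hat = np.exp(res.x)
print(f"shape = {beta_hat:.3f}, scale = {theta_hat:.3f}")
```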
Abstract: Over the years, there has been a growing trend towards
quality-based specifications in highway construction. In many
Quality Control/Quality Assurance (QC/QA) specifications, the
contractor is primarily responsible for quality control of the process,
whereas the highway agency is responsible for testing the acceptance
of the product. A cooperative investigation was conducted in Illinois
over several years to develop a prototype End-Result Specification
(ERS) for asphalt pavement construction. The final characteristics of
the product are stipulated in the ERS and the contractor is given
considerable freedom in achieving those characteristics. The risk for
the contractor or agency depends on how the acceptance limits and
processes are specified. Stochastic simulation models are very useful
in estimating and analyzing payment risk in ERS systems and these
form an integral part of the Illinois prototype ERS system. This
paper describes the development of an innovative methodology to
estimate the variability components in in-situ density, air voids and
asphalt content data from ERS projects. The information gained from
this would be crucial in simulating these ERS projects for estimation
and analysis of payment risks associated with asphalt pavement
construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.
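A minimal sketch of the two ideas combined above, separating testing variability from process variability using two parties' results on split samples and then feeding the components into a Monte Carlo estimate of acceptance risk, is given below. It is an assumed, generic formulation with made-up numbers, not the Illinois ERS methodology itself.

```python
# Minimal sketch (assumption, not the paper's exact methodology): split-sample
# variance components plus a simple Monte Carlo payment-risk estimate.
# All numbers and acceptance limits are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical air-void results (%) on the same split samples.
contractor = np.array([4.1, 3.8, 4.5, 4.0, 4.3, 3.9])
agency     = np.array([4.3, 3.7, 4.6, 4.2, 4.1, 4.0])

diff = contractor - agency
var_testing = diff.var(ddof=1) / 2.0                 # per-test testing variance
var_total   = np.concatenate([contractor, agency]).var(ddof=1)
var_process = max(var_total - var_testing, 0.0)      # materials/process component

# Monte Carlo risk: fraction of simulated lots whose sample mean falls
# outside hypothetical acceptance limits of 3.0-5.0 % air voids.
n_tests_per_lot, n_lots = 4, 100_000
lot_means = rng.normal(4.0, np.sqrt(var_process), n_lots)
samples = rng.normal(lot_means[:, None], np.sqrt(var_testing), (n_lots, n_tests_per_lot))
means = samples.mean(axis=1)
risk = np.mean((means < 3.0) | (means > 5.0))
print(f"testing var = {var_testing:.3f}, process var = {var_process:.3f}, "
      f"rejection risk = {risk:.3%}")
```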
Abstract: Deoxyribonucleic Acid or DNA computing has
emerged as an interdisciplinary field that draws together chemistry,
molecular biology, computer science and mathematics. Thus, in this
paper, the possibility of using DNA-based computing to solve the absolute 1-center problem through molecular manipulations is presented. This is the first attempt to solve such a problem by a DNA-based computing approach. Since part of the procedure involves shortest path computation, research on DNA computing for the shortest path and Traveling Salesman Problem (TSP) is reviewed. These approaches are studied and only the most appropriate one is adapted in designing the computation procedures. The DNA-based computation is designed in such a way that every path is encoded by oligonucleotides and the path's length is directly proportional to the length of the oligonucleotides. Using these properties, gel electrophoresis is performed in order to separate the respective DNA molecules according to their length. One expectation arising from this paper is that it is possible to verify an instance of the absolute 1-center problem using DNA computing through laboratory experiments.
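For orientation, the combinatorial core of the problem the oligonucleotide lengths encode can be sketched in silicon as below. This is an illustration only, restricted to vertex candidates for simplicity (the absolute 1-center also admits points along edges), with hypothetical edge weights; it is not the DNA procedure itself.

```python
# Illustration only: the 1-center objective on a small weighted graph,
# restricted to vertex candidates.  Edge weights are hypothetical.
from itertools import product

INF = float("inf")
vertices = ["A", "B", "C", "D"]
weights = {("A", "B"): 2, ("B", "C"): 3, ("C", "D"): 1, ("A", "D"): 5}

# All-pairs shortest paths (Floyd-Warshall) on the undirected graph.
dist = {(u, v): 0 if u == v else INF for u, v in product(vertices, repeat=2)}
for (u, v), w in weights.items():
    dist[(u, v)] = dist[(v, u)] = min(dist[(u, v)], w)
for k, i, j in product(vertices, repeat=3):      # k varies slowest, as required
    dist[(i, j)] = min(dist[(i, j)], dist[(i, k)] + dist[(k, j)])

# The (vertex) 1-center minimizes the maximum shortest-path distance to
# every other vertex -- the quantity the DNA strand lengths represent.
eccentricity = {v: max(dist[(v, u)] for u in vertices) for v in vertices}
center = min(eccentricity, key=eccentricity.get)
print(center, eccentricity[center])
```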
Abstract: This paper focuses on designing a control system for a wind turbine which can control the speed and output power according to an arbitrary algorithm. The Reference Tracking Method is used to control the turbine spinning speed in order to increase its output energy.
Abstract: This study aims to investigate empirically the value-relevance of accounting information to domestic investors in the Tehran Stock Exchange from 1999 to 2006. In the present research the impacts of two factors, positive vs. negative earnings and firm size, are considered as well. The authors used earnings per share and the annual change in earnings per share as the income statement indices, and the book value of equity per share as the balance sheet index. Return and price models estimated through regression analysis are deployed in order to test the research hypotheses. Results showed that accounting information is value-relevant to domestic investors in the Tehran Stock Exchange according to both studied models. However, income statement information has more value-relevance than balance sheet information. Furthermore, positive vs. negative earnings and firm size seem to have a significant impact on the value-relevance of accounting information.
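The price model referred to above is conventionally specified as a linear regression of share price on earnings per share and book value of equity per share. A minimal sketch under that assumption follows; the figures are fabricated purely for illustration and are not the study's data.

```python
# Minimal sketch (assumed specification, not the paper's exact models):
# the standard price model  P = a0 + a1*EPS + a2*BVPS + e, fitted by OLS.
# All figures are made up for illustration.
import numpy as np

eps   = np.array([1.2, 0.8, -0.3, 2.1, 1.5, 0.4])    # earnings per share
bvps  = np.array([6.0, 5.1, 4.2, 9.3, 7.8, 4.9])     # book value of equity per share
price = np.array([14.0, 10.2, 5.1, 24.5, 18.3, 7.9])

X = np.column_stack([np.ones_like(eps), eps, bvps])
coef, residuals, *_ = np.linalg.lstsq(X, price, rcond=None)
fitted = X @ coef
r2 = 1 - np.sum((price - fitted) ** 2) / np.sum((price - price.mean()) ** 2)
print("a0, a1 (EPS), a2 (BVPS) =", np.round(coef, 3), " R^2 =", round(r2, 3))
```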
Abstract: In recent years, several security architectures have been proposed for sensor networks [2][4]. One of these, TinySec, proposed by Chris Karlof, Naveen Sastry and David Wagner, is a link-layer security architecture that takes into account the constraints of sensor networks (i.e., energy, bandwidth, computation capability, etc.). TinySec employs CBC-mode encryption and CBC-MAC for authentication based on the Skipjack block cipher. Currently, TinySec is incorporated into TinyOS for sensor network security.
This paper introduces TinyHash, which is based on general-purpose hash algorithms. TinyHash is a module intended to replace the authentication and integrity parts of TinySec; that is, it applies hash algorithms to the TinySec architecture. For compatibility with TinySec, the components in TinyHash are constructed with a structure similar to that of TinySec. TinyHash implements an HMAC component for authentication and a Digest component for message integrity. Additionally, we define interfaces for the services associated with the hash algorithms.
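For reference, the construction such an HMAC component implements is HMAC(K, m) = H((K xor opad) || H((K xor ipad) || m)). A minimal desktop sketch using Python's standard library is shown below; it illustrates the role of the HMAC component only and is not the paper's TinyOS/nesC implementation, and the key and message are hypothetical.

```python
# Standard-library illustration (not the paper's nesC code): computing and
# verifying an HMAC message authentication code, the role the paper's HMAC
# component plays on the sensor node.
import hashlib
import hmac

key = b"shared-link-layer-key"          # hypothetical pairwise key
message = b"sensor reading: 23.5C"      # hypothetical payload

tag = hmac.new(key, message, hashlib.sha1).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, message, hashlib.sha1).digest()
    return hmac.compare_digest(expected, tag)   # constant-time comparison

print(len(tag), verify(key, message, tag))      # 20 bytes, True
```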
Abstract: A serious problem on the WWW is finding reliable
information. Not everything found on the Web is true and the
Semantic Web does not change that in any way. The problem will be
even more crucial for the Semantic Web, where agents will be
integrating and using information from multiple sources. Thus, if an
incorrect premise is used due to a single faulty source, then any
conclusions drawn may be in error. Hence, statements published on the Semantic Web have to be seen as claims rather than as facts, and there should be a way to decide which among many possibly inconsistent sources is most reliable. In this work, we propose a trust model for the Semantic Web. The proposed model is inspired by the use of trust in human society. Trust is a type of social knowledge that encodes evaluations about which agents can be taken as reliable sources of information or services. Our proposed model allows agents to decide which among different sources of information to trust and thus act rationally on the Semantic Web.
Abstract: Recently, the business environment and customer needs have been changing rapidly; hence, it is very difficult to fulfill sophisticated customer needs by product or service innovation only. In practice, to cope with this problem, various manufacturing companies have developed services to combine with their products. Along with this, many academic studies on PSS (Product-Service System), the integrated system of products and services, have been conducted
from the viewpoint of manufacturers. On the other hand, service
providers are also attempting to develop service-supporting products
to increase their service competitiveness and provide differentiated
value. However, there is a lack of research based on the service-centric
point of view. Accordingly, this paper proposes a concept generation
method for service-supporting product development from the
service-centric point of view. This method is designed to be executed
in five consecutive steps: situation analysis, problem definition,
problem resolution, solution evaluation, and concept generation. In
the proposed approach, some tools of TRIZ (Theory of Inventive Problem Solving) such as the ISQ (Innovative Situation Questionnaire)
and 40 inventive principles are employed in order to define problems
of the current services and solve them by generating
service-supporting product concepts. This research contributes to the
development of service-supporting products and service-centric PSSs.
Abstract: Bioinformatics methods for predicting T cell coreceptor usage from the membrane protein of HIV-1 are investigated. In this study, we aim to propose an effective prediction method for dealing with the three-class classification problem of CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4). We address the coreceptor prediction problem as follows: 1) proposing a feature set of informative physicochemical properties which, used with an SVM, achieves a high prediction test accuracy of 81.48%, compared with the existing method with an accuracy of 70.00%; 2) establishing a large, up-to-date data set by increasing the size from 159 to 1225 sequences to verify the proposed prediction method, where the mean test accuracy is 88.59%; and 3) analyzing the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.
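A minimal sketch of the classification setup described above is given below, assuming scikit-learn tooling and random placeholder feature vectors in place of the paper's physicochemical property encodings; it is not the paper's pipeline and will not reproduce its accuracies.

```python
# Minimal sketch (assumed tooling, not the paper's pipeline): a three-class
# SVM over fixed-length physicochemical feature vectors with cross-validated
# test accuracy.  The feature matrix is random placeholder data, not real
# HIV-1 sequence encodings.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_sequences, n_properties = 300, 14            # 14 informative properties, as in the abstract
X = rng.normal(size=(n_sequences, n_properties))
y = rng.integers(0, 3, size=n_sequences)       # 0 = X4, 1 = R5, 2 = R5X4

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2%}")
```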
Abstract: This research aimed to develop an online learning activities kit and determine its quality, as well as to examine the learning achievement of students and their satisfaction towards the kit through authentic assessment. The tools in this research comprised an online learning activities kit on plants in Thai literature in compliance
with the School Botanical Garden of Plant Genetic Conservation
Project under the Royal Initiative of Her Royal Highness Princess
Maha Chakri Sirindhorn, the assessment form, the learning
achievement test, the satisfaction form and the authentic assessment
form. The population consisted of 40 students in the second range of
primary years (Prathomsuksa 4 to 6) at Ban Khao Rak School,
Suratthani Province, Thailand. The research results showed that the
content quality of the developed online learning activities kit as
assessed by the experts was 4.70 on average, or at a very high level.
The pre-test and post-test comparison was made to examine the
learning achievement and it revealed that the post-test score was
higher than the pre-test score with statistical significance at the .01
level. The satisfaction of the sampling group towards the online
learning activities kit was 4.74, or at the highest level. The authentic assessment showed an average of 1.69, or at a good level. Therefore, the online learning activities kit on plants in Thai literature in
compliance with the School Botanical Garden of Plant Genetic
Conservation Project under the Royal Initiative of Her Royal
Highness Princess Maha Chakri Sirindhorn could be used in real
classroom situations.
Abstract: An image compression method has been developed using a fuzzy edge image together with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image was validated against classical edge detectors, using the results of the well-known Canny edge detector as a reference, prior to being applied in the proposed method. The
bit plane generated by the conventional BTC method is replaced with
the fuzzy bit plane generated by the logical OR operation between
the fuzzy edge image and the corresponding conventional BTC bit
plane. The input image is encoded with the block mean and standard
deviation and the fuzzy bit plane. The proposed method has been tested with 8 bits/pixel test images of size 512×512 and found to be superior, with a better Peak Signal-to-Noise Ratio (PSNR), when compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness and jagged appearance, and
the ringing artifacts at sharp edges are greatly reduced in
reconstructed images by the proposed method with the fuzzy bit
plane.
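The conventional BTC step the method builds on encodes each block by its mean and standard deviation plus a one-bit-per-pixel plane. A minimal sketch of that baseline is given below; it illustrates the conventional encoder only (the proposed method would replace the bit plane with the fuzzy bit plane obtained by OR-ing in the fuzzy edge image), and the test block is random data.

```python
# Minimal sketch of the conventional BTC baseline the paper modifies.
# 4x4 blocks, random test data; not the proposed fuzzy variant.
import numpy as np

def btc_encode_block(block):
    mean, std = block.mean(), block.std()
    bit_plane = block >= mean                  # 1 bit per pixel
    return mean, std, bit_plane

def btc_decode_block(mean, std, bit_plane):
    n, q = bit_plane.size, bit_plane.sum()     # total pixels / pixels above the mean
    if q in (0, n):
        return np.full(bit_plane.shape, mean)
    low  = mean - std * np.sqrt(q / (n - q))   # moment-preserving reconstruction levels
    high = mean + std * np.sqrt((n - q) / q)
    return np.where(bit_plane, high, low)

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (4, 4)).astype(float)
rec = btc_decode_block(*btc_encode_block(block))
print(np.abs(block - rec).mean())              # mean absolute reconstruction error
```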
Abstract: Most Question Answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems: if this module does not work properly, it will cause problems for the other modules. Moreover, answer processing is an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic classification.
This paper discusses a new model for question answering which improves two main modules, question processing and answer processing.
Two important components form the basis of question processing. The first component is question classification, which specifies the types of the question and the answer. The second is reformulation, which converts the user's question into a form understandable by the QA system in a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and also has a validation section for interacting with the user. This module makes it easier to find the exact answer. In this paper we describe the question and answer processing modules, together with the modeling, implementation and evaluation of the system. The system was implemented in two versions. Results show that 'Version No. 1' gave correct answers to 70% of the questions (30 correct answers to 50 asked questions) and 'Version No. 2' gave correct answers to 94% of the questions (47 correct answers to 50 asked questions).
Abstract: In this paper, we propose a new method to describe fractal shapes using parametric L-systems. First, we introduce scaling factors in the production rules of the parametric L-system grammars. Then we analyze these grammars with scaling factors using turtle algebra to show the mathematical relation between L-systems and iterated function systems (IFS). We demonstrate that, with specific values of the scaling factors, we recover the exact relationship established by Prusinkiewicz and Hammel between L-systems and IFS.
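A minimal sketch of parametric L-system rewriting with a scaling factor in the production rule is shown below. The rule and the scaling value are toy examples chosen for illustration, not one of the paper's grammars; the point is that each derivation step contracts segment lengths by the factor s, the property that links the grammar to an IFS contraction.

```python
# Toy example (not one of the paper's grammars): parametric L-system rewriting
# where the production carries a scaling factor s, so each derivation step
# shrinks the segment length of every F symbol.
S = 0.5  # hypothetical scaling factor

def produce(symbol, length):
    """F(l) -> F(s*l) + F(s*l) - F(s*l)   (toy parametric production)."""
    if symbol == "F":
        l = S * length
        return [("F", l), ("+", None), ("F", l), ("-", None), ("F", l)]
    return [(symbol, length)]

def derive(word, steps):
    for _ in range(steps):
        word = [out for sym, par in word for out in produce(sym, par)]
    return word

axiom = [("F", 1.0)]
for step in range(3):
    word = derive(axiom, step)
    lengths = [p for s, p in word if s == "F"]
    # segment count grows threefold per step; length contracts by S per step
    print(step, len(lengths), lengths[0])
```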
Abstract: To help overcome the limits on the density of conventional SRAMs and the leakage current of the SRAM cell in nanoscaled CMOS technology, we have developed a four-transistor SRAM cell. The newly developed CMOS four-transistor SRAM cell uses one word-line and one bit-line during read/write operations. This cell retains its data through leakage current and positive feedback, without a refresh cycle. The new cell is 19% smaller than a conventional six-transistor cell using the same design rules, and its leakage current is 60% smaller than that of a conventional six-transistor SRAM cell. Simulation results in 65 nm CMOS technology show that the new cell operates correctly during read/write operations and in idle mode.
Abstract: Calcite and aragonite are the two common polymorphs of CaCO3 observed as biominerals. Sea water universally contains a high concentration of Mg2+ (50 mM) relative to Ca2+ (10 mM). In in vivo crystallization, Mg2+ inhibits calcite formation; for this reason, stony coral skeletons may be formed only of aragonite crystals during biocalcification. Soft corals are a special case in that they form only calcite crystals. This interesting phenomenon, still uncharacterized in the marine environment, has been explored in this study using newly purified cell-free proteins isolated from the endoskeletal sclerites of soft coral. By recording the decline of pH in vitro, the control of CaCO3 nucleation and crystal growth by the cell-free proteins was revealed. Using an Atomic Force Microscope, we find that these endoskeletal cell-free proteins significantly shape the morphology and the molecular-scale kinetics of crystal formation, and that those proteins act as surfactants to promote ion attachment at calcite steps.
Keywords: Biomineralization, Calcite, Cell-free protein, Soft coral
Abstract: Water level forecasting using records of past time series is of importance in water resources engineering and management. For example, water level affects groundwater tables in low-lying coastal areas, as well as the hydrological regimes of some coastal rivers. A reliable prediction of sea-level variations is therefore required in coastal engineering and hydrologic studies. During the past two decades, approaches based on Genetic Programming (GP) and Artificial Neural Networks (ANN) have been developed. In the present study, GP is used to forecast daily water level variations for a set of time intervals using observed water levels. The measurements from a single tide gauge at Urmia Lake, Northwest Iran, were used to train and validate the GP approach for the period from January 1997 to July 2008. Two statistics, the root mean square error and the correlation coefficient, are used to verify the model by comparing its outputs with the corresponding outputs from an Artificial Neural Network model. The results show that both of these artificial intelligence methodologies are satisfactory and can be considered as alternatives to conventional harmonic analysis.
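For reference, the two verification statistics relied on above are computed as in the short sketch below; the observed and forecast level values are made up for illustration and are not the study's data.

```python
# Illustration only: root mean square error and correlation coefficient
# between observed and forecast water levels.  Values are hypothetical.
import numpy as np

observed = np.array([1272.30, 1272.28, 1272.25, 1272.27, 1272.31])  # hypothetical levels (m)
forecast = np.array([1272.32, 1272.27, 1272.26, 1272.24, 1272.30])

rmse = np.sqrt(np.mean((observed - forecast) ** 2))
r = np.corrcoef(observed, forecast)[0, 1]
print(f"RMSE = {rmse:.4f} m, r = {r:.3f}")
```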
Abstract: Mosques have been present in Thailand since the Ayutthaya Kingdom (1350 to 1767 A.D.). Today, more than 400 years later, there are many styles of art behind their structures. This research intended to identify Islamic Art in Thai mosques. A framework was applied using qualitative research methods; Thai Muslims with dynamic roles in Islamic culture were interviewed. In addition, a field survey of 40 mosques selected from 175 Thai mosques was conducted. Data were analyzed according to the pattern of each period. The identified forms of Islamic Art in Thai mosques are: 1) the image of Thai identity, with traditional Thai art style and government policy; 2) the image of ethnological identity, with the traditional culture of Asian Muslims in Thailand; 3) the image of nostalgia identity, with Islamic and Arabian conservative style; 4) the image of Neo-Classic identity, with Neo-Classic and contemporary art; and 5) the image of the new identity, with Post-Modern and Deconstruction art.
Abstract: The structural interpretation of a part of eastern Potwar
(Missa Keswal) has been carried out with available seismological,
seismic and well data. The seismological data contain both the source parameters and fault plane solution (FPS) parameters, while the seismic data comprise ten seismic lines that were re-interpreted using well data. Structural interpretation reveals two broad types of fault sets, namely thrust and back-thrust faults. Together, these faults give rise to pop-up structures in the study area and are also responsible for many structural traps and for seismicity. Seismic interpretation includes time and depth contour maps of the Chorgali Formation, while seismological interpretation includes focal mechanism solution (FMS), depth, frequency and magnitude bar graphs, and an updated seismotectonic map. The Focal Mechanism Solutions (FMS) that surround the study area are
correlated with the different geological and structural maps of the area
for the determination of the nature of subsurface faults. Results of
structural interpretation from both seismic and seismological data
show good correlation. It is hoped that the present work will help in better understanding the variations in subsurface structure and can be a useful tool for earthquake prediction, oil field planning and reservoir monitoring.