Abstract: The main objective of incorporating natural fibers such as Henequen microfibers (NF) into a high-density polyethylene (HDPE) polymer matrix is to reduce cost and to enhance the mechanical as well as other properties. The Henequen microfibers were chopped manually to 5-7 mm in length and added to the polymer matrix at the optimized concentration of 8 wt%. To facilitate bonding between the Henequen microfibers (NF) and the HDPE matrix, a coupling agent, Glycidoxy (Epoxy) Functional Methoxy Silane (GPTS), was added at concentrations of 0.1%, 0.3%, 0.5%, 0.7%, 0.9%, and 1% by weight of the total fibers. The tensile strength of the composite increased marginally, while the % elongation at break decreased with increasing silane loading. The tensile modulus and stiffness were observed to increase at 0.9 wt% GPTS loading. The flexural as well as impact strength of the composite decreased with increasing GPTS loading. The dielectric strength of the composite was also found to increase marginally up to 0.5 wt% silane loading and remained constant thereafter.
Abstract: Testability modeling is a commonly used method in the
testability design and analysis of systems. A dependency matrix is
obtained from testability modeling, from which a quantitative
evaluation of fault detection and isolation can be made.
Based on the dependency matrix, a diagnosis tree can be obtained.
The tree provides the procedures for fault detection and isolation.
In practice, however, the dependency matrix usually includes both
built-in tests (BIT) and manual tests. BIT runs its tests automatically
and is not limited by the procedures, so the method above cannot
exploit the advantages of BIT to give a more efficient diagnosis.
A comprehensive method of fault detection and isolation is
proposed. This method combines the advantages of BIT and
manual tests by splitting the matrix. The result of a case study shows
that the method is effective.
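The matrix-splitting idea can be illustrated with a minimal sketch (the binary dependency matrix, test names, and fault names below are invented for illustration): BIT rows are evaluated automatically first, and their pass/fail outcomes prune the candidate fault set before any manual procedure is applied.

```python
# Hypothetical dependency matrix: rows are tests, columns are faults,
# 1 means the test detects the fault.

def split_matrix(matrix, bit_tests):
    """Partition test rows into BIT and manual-test sub-matrices."""
    bit = {t: r for t, r in matrix.items() if t in bit_tests}
    manual = {t: r for t, r in matrix.items() if t not in bit_tests}
    return bit, manual

def isolate(matrix, outcomes, candidates):
    """Keep only faults consistent with the observed pass/fail outcomes."""
    for test, failed in outcomes.items():
        row = matrix[test]
        candidates = {f for f in candidates if (row[f] == 1) == failed}
    return candidates

D = {  # 3 tests x 3 faults
    "t1": {"f1": 1, "f2": 0, "f3": 0},
    "t2": {"f1": 1, "f2": 1, "f3": 0},
    "t3": {"f1": 0, "f2": 1, "f3": 1},
}
bit, manual = split_matrix(D, bit_tests={"t1", "t2"})
# BIT runs automatically first: t1 passes, t2 fails.
cands = isolate(bit, {"t1": False, "t2": True}, {"f1", "f2", "f3"})
# Only f2 is consistent (missed by t1, detected by t2); a manual
# procedure over `manual` would then confirm the remaining candidate.
```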
Abstract: Most people today are aware that global climate
change is not just a scientific theory but also a fact with worldwide
consequences. Global climate change is due to rapid urbanization,
industrialization, high population growth and current vulnerability of
the climatic condition. Water is becoming scarce as a result of global
climate change. To mitigate the problem arising due to global climate
change and its drought effect, harvesting rainwater from green roofs,
an environmentally-friendly and versatile technology, is becoming
one of the best assessment criteria and gaining attention in Malaysia.
This paper addresses the sustainability of green roofs and examines
the quality of water harvested from green roofs in comparison to
rainwater. The factors that affect the quality of such water are
considered, including roofing materials, climatic conditions, rainfall
frequency, and the first flush. A green roof was installed at the
Humid Tropic Centre (HTC), the study site of the monitoring program
for the urban Stormwater Management Manual for Malaysia (MSMA)
Eco-Hydrological Project in Kuala Lumpur, and the rainwater was
harvested and evaluated on the basis of four parameters, i.e.,
conductivity, dissolved oxygen (DO), pH, and temperature. These
parameters were found to fall between Class I and Class III of the
Interim National Water Quality Standards (INWQS) and the Water
Quality Index (WQI). Some preliminary treatment, such as
disinfection and filtration, would likely improve these parameters to
Class I. This review paper clearly indicates that more research is
needed to address other microbiological and chemical quality
parameters to ensure that the harvested water is suitable for use as
potable water for domestic purposes. The change in
all physical, chemical and microbiological parameters with respect to
storage time will be a major focus of future studies in this field.
Abstract: Quantification of cardiac function is performed by
calculating blood volume and ejection fraction in routine clinical
practice. However, this has typically been done by manual
contouring, which is labor-intensive and varies between observers.
In this paper, an automatic left ventricle segmentation
algorithm for cardiac magnetic resonance images (MRI) is presented.
Using prior knowledge of cardiac MRI, a K-means clustering
technique is applied to segment the blood region on a coil-sensitivity
corrected image. Then, a graph-searching technique is used to correct
segmentation errors caused by coil distortion and noise. Finally,
blood volume and ejection fraction are calculated. Using cardiac MRI
from 15 subjects, the presented algorithm is tested and compared
with manual contouring by experts, showing outstanding performance.
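The clustering step can be illustrated with a minimal 1-D K-means sketch (not the authors' implementation; the toy intensity values below are invented), which separates the bright blood pool from the darker myocardium by intensity.

```python
# Minimal 1-D K-means on pixel intensities (illustrative sketch only).

def kmeans_1d(values, k=2, iters=20):
    # Initialize centroids spread evenly across the intensity range.
    lo, hi = min(values), max(values)
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# Toy "image": dark myocardium (~50) and bright blood pool (~200).
pixels = [48, 52, 50, 49, 198, 201, 205, 199]
c = sorted(kmeans_1d(pixels))
# Threshold halfway between the two centroids yields a blood mask.
blood_mask = [v > sum(c) / 2 for v in pixels]
```

In the paper's pipeline this mask would then be refined by the graph-searching step; here it simply marks the four bright pixels.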
Abstract: This paper describes the tradeoffs and the design from
scratch of a self-contained, easy-to-use health dashboard software
system that provides customizable data tracking for patients in smart
homes. The system is made up of different software modules and
comprises a front-end and a back-end component. Built with HTML,
CSS, and JavaScript, the front-end allows adding users, logging into
the system, selecting metrics, and specifying health goals. The back-end
consists of a NoSQL Mongo database, a Python script, and a
SimpleHTTPServer written in Python. The database stores user
profiles and health data in JSON format. The Python script makes use
of the PyMongo driver library to query the database and displays
formatted data as a daily snapshot of user health metrics against
target goals. Any number of standard and custom metrics can be
added to the system, and corresponding health data can be fed
automatically, via sensor APIs or manually, as text or picture data
files. A real-time METAR request API permits correlating weather
data with patient health, and an advanced query system is
implemented to allow trend analysis of selected health metrics over
custom time intervals. Available on GitHub, the project is free to use
for academic purposes, such as learning and experimentation, or for
practical purposes by building on it.
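The data flow described above can be sketched as follows (all field names and values are assumptions for illustration, not the project's actual schema); the tiny in-memory `matches` helper stands in for a live PyMongo `collection.find(query)` call against the Mongo database.

```python
import json
from datetime import date

# Hypothetical document shapes in the JSON style the abstract describes.
profile = {"user": "alice", "metrics": ["weight", "steps"],
           "goals": {"steps": 8000}}

reading = {"user": "alice", "metric": "steps",
           "value": 9200, "date": str(date(2024, 1, 5))}

# A PyMongo-style filter for a daily snapshot: one user's readings on
# one day. With a live database this dict would be passed to find().
query = {"user": "alice", "date": str(date(2024, 1, 5))}

def matches(doc, flt):
    """Tiny stand-in for server-side equality filtering."""
    return all(doc.get(k) == v for k, v in flt.items())

snapshot = [d for d in [reading] if matches(d, query)]
met_goal = snapshot[0]["value"] >= profile["goals"]["steps"]
print(json.dumps({"met_goal": met_goal}))
```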
Abstract: Construction cost estimation is one of the most
important aspects of construction project design. For generations, the
process of cost estimating has been manual, time-consuming and
error-prone. This has partly led to most cost estimates being unclear
and riddled with inaccuracies that at times lead to over- or
under-estimation of construction cost. The development of standard
sets of measurement rules understandable by all those involved in a
construction project has not fully solved these challenges. Emerging
Building Information Modelling (BIM) technologies can exploit
standard measurement methods to automate the cost estimation process
and improve accuracy. This requires standard measurement
methods to be structured in an ontological and machine-readable
format so that BIM software packages can easily read them. Most standard
measurement methods are still text-based in textbooks and require
manual editing into tables or spreadsheets during cost estimation. The
aim of this study is to explore the development of an ontology based
on New Rules of Measurement (NRM) commonly used in the UK for
cost estimation. The methodology adopted is Methontology, one of
the most widely used ontology engineering methodologies. The
challenges in this exploratory study are also reported and
recommendations for future studies proposed.
Abstract: Bacterial strains capable of degradation of malathion
from the domestic sewage were isolated by an enrichment culture
technique. Three bacterial strains were screened and identified as
Acinetobacter baumannii (AFA), Pseudomonas aeruginosa (PS1),
and Pseudomonas mendocina (PS2) based on morphological,
biochemical identification and 16S rRNA sequence analysis.
Acinetobacter baumannii AFA was the most efficient malathion-degrading
bacterium and was therefore used for further biodegradation study. AFA
was able to grow in mineral salt medium (MSM) supplemented with
malathion (100 mg/l) as a sole carbon source, and within 14 days,
84% of the initial dose was degraded by the isolate, as measured by
high-performance liquid chromatography. Strain AFA could also degrade
other organophosphorus compounds including diazinon, chlorpyrifos
and fenitrothion. The effects of different culture conditions on the
degradation of malathion, such as inoculum density, additional carbon
or nitrogen sources, temperature, and shaking, were examined.
Degradation of malathion and bacterial cell growth were accelerated
when culture media were supplemented with yeast extract, glucose
and citrate. The optimum conditions for malathion degradation by
strain AFA were an inoculum density of 1.5×10^12 CFU/ml at 30°C
with shaking. Specific polymerase chain reaction (PCR) primers were
designed manually using a multiple sequence alignment of the
corresponding carboxylesterase enzymes of Acinetobacter species.
Sequencing of the amplified PCR product and phylogenetic
analysis showed a low degree of homology with the other
carboxylesterase enzymes of Acinetobacter strains, suggesting
that this enzyme is a novel esterase. The isolated bacterial strains
may have a potential role in the bioremediation of malathion-contaminated
environments.
Abstract: In medical imaging, segmentation of different areas of
the human body, such as bones, organs, and tissues, is an important task.
Image segmentation isolates the object of interest for further
processing, which can lead, for example, to 3D model reconstruction of
whole organs. The difficulty of this procedure ranges from trivial for
bones to quite challenging for organs such as the liver, which is
considered one of the most difficult human organs to segment,
mainly because of its complexity, shape variability, and proximity to
other organs and tissues. Due to these facts, substantial user
effort usually has to be applied to obtain satisfactory segmentation
results, and the process deteriorates from
automatic or semi-automatic to a fairly manual one. In this paper, an
overview of selected available software applications that can handle
semi-automatic image segmentation with subsequent 3D volume
reconstruction of the human liver is presented. The applications are
evaluated based on the segmentation results of several consecutive
DICOM images covering the abdominal area of the human body.
Abstract: This paper identifies the limitations of two existing e-
Governance services in India, viz. railway ticket booking and the
passport service. A comparison is made between how these
two citizen services used to operate manually in the past and how they
have been taken online via e-Governance. Different e-Governance projects,
investment aspects, and the role of corporations are discussed. For Indian
Railways online ticketing, a comparison is made between the state-run
booking website and popular booking websites run by private firms.
For the passport service, observations from a personal visit to a passport
center are described. Suggestions are made to further improve these
services and enhance the citizen service experience.
Abstract: e-Service has moved public service delivery from the usual
manual, traditional way of rendering services to electronic service
provision, and there are several reasons for implementing such
services. Airline ticketing, for example, has gone from its traditional
manual process to an intelligent, web-driven purchasing service. Many
companies have seen their profits double through the use of online
services in their operations; a typical example is Hewlett-Packard (HP),
which is rapidly transforming its after-sales business into a
profit-generating e-service business unit.
This paper examines the various challenges confronting e-
Service adoption and implementation in Nigeria and also analyses
lessons learnt from e-Service adoption and implementation in Asia to
see how they could be useful in Nigeria, a lower-middle-income
country. From the analysis of the online survey data, it was
identified that the public in Nigeria is well aware of e-Services, but
successful adoption and implementation remain the main problems
faced.
Abstract: Nowadays, social media information, such as news,
links, images, or videos, is shared extensively. However, the
effectiveness of disseminating information through social media is
undermined by poor quality: little fact checking, more bias, and
numerous rumors. Many researchers have investigated credibility on
Twitter, but there are no research reports on information credibility
on Facebook. This paper proposes features for measuring the
credibility of Facebook information, and we developed a system for
this purpose. First, we developed an FB credibility evaluator for
measuring the credibility of each post via manual human labelling.
We then collected the training data for creating a model using a Support
Vector Machine (SVM). Secondly, we developed a Chrome extension
of FB credibility that lets Facebook users evaluate the credibility of
each post. Based on the usage analysis of our FB credibility Chrome
extension, about 81% of users' responses agree with the suggested
credibility automatically computed by the proposed system.
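The kind of per-post feature vector that might feed such an SVM can be sketched as follows (the features and field names here are hypothetical illustrations, not the paper's actual feature set):

```python
# Hypothetical credibility features extracted from one Facebook post.

def extract_features(post):
    words = post["text"].split()
    return {
        "length": len(words),                     # word count
        "has_link": int("http" in post["text"]),  # contains a URL
        "num_likes": post.get("likes", 0),        # engagement signal
        "exclamations": post["text"].count("!"),  # sensationalism cue
    }

post = {"text": "SHOCKING cure!!! click http://example.com", "likes": 3}
feats = extract_features(post)
```

With labelled posts, vectors like `feats` would be the training inputs to the SVM classifier.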
Abstract: Creating a database scheme is essentially a manual
process. From a requirement specification the information contained
within has to be analyzed and reduced into a set of tables, attributes
and relationships. This is a time consuming process that has to go
through several stages before an acceptable database schema is
achieved. The purpose of this paper is to implement a Natural
Language Processing (NLP) based tool to produce a relational
database from a requirement specification. The Stanford CoreNLP
version 3.3.1 and the Java programming language were used to implement the
proposed model. The outcome of this study indicates that a first draft
of a relational database schema can be extracted from a requirement
specification by using NLP tools and techniques with minimum user
intervention. Therefore this method is a step forward in finding a
solution that requires little or no user intervention.
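The extraction idea can be sketched, far more crudely than with Stanford CoreNLP, as a toy rule: treat the subject noun of "A(n) X has a, b and c." as a table and the listed nouns as its columns (the sentence pattern and names below are invented for illustration):

```python
import re

# Toy rule-based schema extraction from one requirement sentence.
# Assumed pattern: "A <entity> has <attr>, <attr> and <attr>."

def extract_schema(sentence):
    m = re.match(r"An? (\w+) has (.+)\.", sentence)
    if not m:
        return None
    entity, attrs = m.group(1), m.group(2)
    # Split the attribute list on commas and the word "and".
    columns = [a.strip() for a in re.split(r",| and ", attrs) if a.strip()]
    return {"table": entity.capitalize(), "columns": columns}

schema = extract_schema("A customer has name, address and phone.")
# {'table': 'Customer', 'columns': ['name', 'address', 'phone']}
```

A real NLP pipeline would additionally use parse trees and dependency relations to find entities and relationships, which is what CoreNLP provides.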
Abstract: The purpose of this work is to examine a multi-product,
multi-stage battery production line and to improve the
performance of the assembly line by determining the
efficiency of each workstation. Data were collected from every
workstation: the throughput rate, the number of operators, and the
number of parts that arrive and leave during processing. At least ten
samples of the arrival and departure counts were collected so that the
data could be analyzed with the Chi-squared goodness-of-fit test and
queuing theory. The measures of this model served as a comparison
with the standard data available in the company, and the task time
values were validated by comparing them with the task time values in
the company database. Some performance factors for the multi-product,
multi-stage battery production line are shown in this work.
The efficiency of each workstation is also shown. The total
production time to produce each part can be determined by adding
the total task times across the workstations. To reduce queuing time
and increase efficiency, improvements should be made based on the
analysis; one possible action is to increase the number of operators
who manually operate a workstation.
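As an illustration of the queuing-theory step, a single workstation can be approximated as an M/M/1 queue once the Chi-squared test supports Poisson arrivals and exponential service times; the rates below are invented, not the company's data.

```python
# Illustrative M/M/1 workstation metrics (made-up rates).

def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate  # utilization (efficiency proxy)
    lq = rho ** 2 / (1 - rho)          # average number waiting in queue
    wq = lq / arrival_rate             # average waiting time in queue
    return {"utilization": rho, "queue_length": lq, "wait_time": wq}

m = mm1_metrics(arrival_rate=8.0, service_rate=10.0)  # parts per hour
# utilization 0.8, queue length ~3.2 parts, wait ~0.4 h
```

Adding an operator raises the effective service rate, which is how the suggested improvement would cut both utilization and queuing time.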
Abstract: The development of the agricultural sector in Ghana
has been reliant on the use of irrigation systems to ensure food
security. However, the manual operation of these systems has not
facilitated their maximum efficiency due to human limitations.
This paper seeks to address this problem by designing and
implementing an efficient, cost effective automated system which
monitors and controls the water flow of irrigation through
communication with an authorized operator via text messages. The
automatic control component of the system is timer based with an
Atmega32 microcontroller and a real time clock from the SM5100B
cellular module. For monitoring purposes, the system sends periodic
notifications of its performance via SMS to the
authorized person(s). Moreover, the GSM-based Irrigation
Monitoring and Control System saves time and labour and reduces
cost of operating irrigation systems by saving electricity usage and
conserving water.
Field tests conducted have proven its operational efficiency and
ease of assessment of farm irrigation equipment due to its
cost-effectiveness and data logging capabilities.
Abstract: Different strategies and tools are available at the oil
and gas industry for detecting and analyzing tension and possible
fractures in borehole walls. Most of these techniques are based on
manual observation of the captured borehole images. While this
strategy may be feasible and convenient with small images and little
data, it becomes difficult and error-prone when large
databases of images must be treated. Moreover, the patterns may differ
across the image area, depending on many characteristics (drilling
strategy, rock components, rock strength, etc.). In this work we
propose the inclusion of data-mining classification strategies in order
to create a knowledge database of the segmented curves. With these
classifiers, after some time of use, with the parts of borehole images
that correspond to tension regions and breakout areas being pointed
out manually, the system will automatically indicate and suggest
new candidate regions with higher accuracy. We suggest the use of
different classification methods in order to achieve different knowledge
dataset configurations.
Abstract: The effect of trucks on the level of service is
determined by considering passenger car equivalents (PCE) of trucks.
The current version of Highway Capacity Manual (HCM) uses a
single PCE value for all tucks combined. However, the composition
of truck traffic varies from location to location; therefore, a single
PCE value for all trucks may not correctly represent the impact of
truck traffic at specific locations. Consequently, the present study
developed separate PCE values for single-unit and combination
trucks to replace the single value provided in the HCM for different
freeways. Site-specific PCE values were developed using the concept of
spatial lagging headways (that is, the distance between the rear bumpers
of two successive vehicles in a traffic stream) measured from field traffic data.
The study used data from four locations on a single urban freeway
and three different rural freeways in Indiana. Three-stage least
squares (3SLS) regression techniques were used to generate models
that predicted lagging headways for passenger cars, single unit trucks
(SUT), and combination trucks (CT). The estimated PCE values for
single-unit and combination trucks for basic urban freeways (level
terrain) were 1.35 and 1.60, respectively. For rural freeways, the
estimated PCE values for single-unit and combination trucks were
1.30 and 1.45, respectively. As expected, traffic variables such as
vehicle flow rates and speed have significant impacts on vehicle
headways. Study results revealed that the use of separate PCE values
for different truck classes can have significant influence on the LOS
estimation.
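The core idea behind headway-based PCE estimation can be sketched as a simple ratio. This is a simplification of the study's 3SLS approach, where the headways are model-predicted; the headway values below are invented so that the ratios reproduce the reported urban-freeway PCEs.

```python
# PCE as the ratio of a truck type's mean spatial lagging headway to
# that of passenger cars (illustrative values, not the Indiana data).

def pce(truck_headway_m, car_headway_m):
    return truck_headway_m / car_headway_m

car_h = 40.0                 # mean car lagging headway (m), assumed
sut_pce = pce(54.0, car_h)   # single-unit trucks -> 1.35
ct_pce = pce(64.0, car_h)    # combination trucks -> 1.60
```

The intuition is that a vehicle type occupying 1.35 times the roadway space of a car contributes 1.35 car-equivalents to flow.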
Abstract: A blood pressure monitor or sphygmomanometer can
be either manual or automatic, employing respectively either the
auscultatory method or the oscillometric method.
The manual version of the sphygmomanometer involves an
inflatable cuff and a stethoscope used to detect the sounds
generated by the arterial walls, in order to measure the blood pressure in an artery.
An automatic sphygmomanometer can be effectively used to
monitor blood pressure through a pressure sensor, which detects
vibrations provoked by oscillations of the arterial walls.
The pressure sensor implemented in this device improves the
accuracy of the measurements taken.
Abstract: This article deals with a new approach to the airport
emergency plans, which are the basic documents and manuals for
dealing with events with impact on safety or security. The article
describes the identified areas in which current airport emergency
plans do not fulfill their role and which should therefore be
addressed when creating corrective measures. All these issues
have been identified at airports in the Czech Republic and confirmed
at airports in neighboring countries.
Abstract: In the Knowledge and Data Engineering field, the relational
database is the best repository for storing real-world data. It has
been in use around the world for more than eight decades. Normalization
is the most important process for the analysis and design of relational
databases. It aims at creating a set of relational tables with minimum
data redundancy that preserve consistency and facilitate correct
insertion, deletion, and modification. Normalization is a major task in
the design of relational databases. Despite its importance, very few
algorithms have been developed to be used in the design of
commercial automatic normalization tools. Normalization is also
rarely done automatically rather than manually; moreover, for the large
and complex databases of today, it is even harder to do manually.
This paper presents a new, fully automated relational database
normalization method. It first produces a directed graph and a spanning
tree, and then proceeds to generate the 2NF, 3NF, and
BCNF normal forms. The benefit of this new algorithm is that it can
cope with a large set of complex functional dependencies.
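As a concrete building block, the textbook attribute-closure routine below is the standard way to reason about keys and normal forms from a set of functional dependencies (a generic illustration, not the paper's graph-and-spanning-tree algorithm):

```python
# Attribute closure under a set of functional dependencies (FDs).

def closure(attrs, fds):
    """Return the closure of `attrs` under `fds`.

    fds: list of (lhs, rhs) pairs, each a set of attribute names.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If the left-hand side is contained in the closure so far,
            # the right-hand side is functionally determined too.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
c = closure({"A"}, fds)  # {'A', 'B', 'C'}: A determines all of R(A, B, C)
```

Because the closure of {A} covers every attribute, A is a candidate key here, which is exactly the kind of test a BCNF decomposition repeats for each dependency.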
Abstract: The molding process in IC manufacturing protects chips against damage from heat, moisture, or other external forces. While a chip is being molded, defects such as cracks, dilapidation, or voids may become embedded in the molding surface. The molding surfaces this study aims to treat differ from those on the market, however, in that texture similar to defects is present everywhere on the surface. Manual inspection usually misses low-contrast cracks or voids; hence, an automatic optical inspection system for the molding surface is necessary. The proposed system consists of a CCD camera, a coaxial light, a back light, and a motion control unit. Based on the statistical texture properties of the molding surface, a series of digital image processing and classification procedures is carried out. After training the parameters associated with the above algorithm, the experimental results suggest that the accuracy rate is up to 93.75%, contributing to the inspection quality of IC molding surfaces.