Abstract: Missing data is a persistent problem in almost all
areas of empirical research. The missing data must be treated very
carefully, as data plays a fundamental role in every analysis.
Improper treatment can distort the analysis or generate biased results.
In this paper, we compare and contrast various imputation techniques
on missing data sets and make an empirical evaluation of these
methods so as to construct quality software models. Our empirical
study is based on two public NASA datasets, KC4 and KC1. The
actual datasets, of 125 cases and 2107 cases respectively and
without any missing values, were considered. These datasets were
used to create Missing at Random (MAR) data. Listwise Deletion
(LD), Mean Substitution (MS), Interpolation, Regression with an
error term, and Expectation-Maximization (EM) approaches were
used to compare the effects of the various techniques.
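Two of the simpler techniques compared above can be sketched in a few lines; this is an illustrative sketch on a made-up toy dataset, not the paper's actual implementation or the NASA data.

```python
from statistics import mean

def listwise_deletion(rows):
    """Drop every case that has at least one missing value."""
    return [r for r in rows if all(v is not None for v in r)]

def mean_substitution(rows):
    """Replace each missing value with the mean of its column."""
    cols = list(zip(*rows))
    col_means = [mean(v for v in c if v is not None) for c in cols]
    return [[v if v is not None else col_means[j]
             for j, v in enumerate(r)] for r in rows]

# Toy dataset (hypothetical): rows are cases, None marks a MAR value.
data = [[1.0, 2.0], [None, 4.0], [3.0, None], [5.0, 6.0]]
complete = listwise_deletion(data)   # 2 of 4 cases survive
imputed = mean_substitution(data)    # all 4 cases kept
```

Listwise deletion discards incomplete cases (shrinking the sample), while mean substitution preserves the sample size at the cost of reduced variance.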
Abstract: ZnO nanostructures including nanowires, nanorods,
and nanoneedles were successfully deposited on GaAs substrates
by a simple two-step chemical method for the first time. A
ZnO seed layer was first pre-coated on the O2-plasma-treated
substrate by a sol-gel process, followed by the nucleation of ZnO
nanostructures through hydrothermal synthesis. Nanostructures with
different average diameters (15-250 nm), lengths (0.9-1.8 μm), and
densities (0.9-16×10⁹ cm⁻²) were obtained by adjusting the growth
time and the concentration of precursors. From the reflectivity
spectra, we concluded that ordered and tapered nanostructures were
preferable for photovoltaic applications. ZnO nanoneedles with an
average diameter of 106 nm, a moderate length of 2.4 μm, and a
density of 7.2×10⁹ cm⁻² could be synthesized at a precursor
concentration of 0.04 M for 18 h. Integrated with the nanoneedle
array, the power conversion efficiency of the single-junction solar
cell increased from 7.3% to 12.2%, corresponding to a 67%
improvement.
Abstract: Present wireless communication demands compact and intelligent devices with multitasking capabilities at affordable cost. The focus of this paper is a dual-band antenna for wireless communication, capable of operating at two frequency bands with the same structure. Two resonance frequencies are observed, with the second operating band at 4.2 GHz, approximately three times the first resonance frequency of 1.5 GHz. The structure is a simple loop of microstrip line with a characteristic impedance of 50 ohms. The proposed antenna is designed using a defected ground structure (DGS) and shows nearly a one-third reduction in size compared to the design without DGS. The antenna was simulated with electromagnetic (EM) simulation software and fabricated using microwave integrated circuit techniques on an RT-Duroid dielectric substrate (εr = 2.22) of thickness H = 15 mils. The designed antenna was tested on an automatic network analyzer and shows good agreement with the simulated results. The proposed structure is modeled as an equivalent electrical circuit and simulated on a circuit simulator. Subsequently, theoretical analysis was carried out and simulated. The simulated, measured, equivalent-circuit, and theoretical results show good resemblance. The bands of operation draw many potential applications in today's wireless communication.
Abstract: Design has recently become important in product development. Technology in which Japan is strong is quickly caught up with by other countries, and price competition begins. Companies therefore tend to differentiate their products through design or color. The purpose of this work was to determine the optimal colors for use in product development, which required clarifying what leads to color preference. Two kinds of investigations were conducted. The first revealed that a geographical difference exists in color preference. A second investigation, which treated the difference as one of latitude, was then conducted. However, the result expected from the difference in latitude was not obtained. It appears necessary either to use a considerably larger difference in latitude or to re-examine other geographical factors.
Abstract: In the current Grid environment, efficient workload
management presents a significant challenge, for which there is a
plethora of de facto standards encompassing resource discovery,
brokerage, and data transfer, among others. In addition, the real-time
resource status, essential for an optimal resource allocation strategy,
is often not readily accessible. To address these issues and provide a
cleaner abstraction of the Grid with the potential of generalizing to
arbitrary resource-sharing environments, this paper proposes a new
Condor-based pilot mechanism applied in the PanDA architecture,
PanDA-PF WMS, with the goal of providing a more generic yet
efficient resource allocation strategy. In this architecture, the PanDA
server primarily acts as a repository of user jobs, responding to pilot
requests from distributed, remote resources. Scheduling decisions are
subsequently made according to the real-time resource information
reported by pilots. Pilot Factory is a Condor-inspired solution for
scalable pilot dissemination and effectively functions as a resource
provisioning mechanism through which the user-job server, PanDA,
reaches out to the candidate resources only on demand.
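The late-binding pattern described above can be illustrated schematically; every name and data structure here is invented for illustration and is not the actual PanDA or Pilot Factory API.

```python
# Hypothetical job queue held by the user-job server
job_queue = [{"id": 1, "mem_mb": 2000}, {"id": 2, "mem_mb": 8000}]

def serve_pilot_request(resource_status):
    """Server side: bind a queued job to a pilot's reported resources."""
    for job in job_queue:
        if job["mem_mb"] <= resource_status["free_mem_mb"]:
            job_queue.remove(job)
            return job      # the job is bound to a concrete resource only now
    return None             # nothing fits; the pilot exits idle

# A pilot on a remote worker reports its real-time status and asks for work
job = serve_pilot_request({"free_mem_mb": 4000})
```

The point of the pattern is that scheduling uses the resource status observed by the pilot at request time, rather than stale, centrally cached information.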
Abstract: This study conducts a preliminary investigation to determine the topic to be focused on in developing a Virtual Laboratory for Biology (VLab-Bio). The samples who answered the questionnaire are Form Five students (equivalent to A-Level) and biology teachers. The time and economic resources needed for setting up and constructing scientific laboratories can be reduced by adopting virtual laboratories as an educational tool. Thus, it is hoped that the proposed virtual laboratory will help students to learn the abstract concepts in biology. Findings show that the difficult topic chosen is Cell Division and the learning objective to be focused on in developing the virtual lab is "Describe the application of knowledge on mitosis in cloning".
Abstract: Finding the shortest path between two positions is a
fundamental problem in transportation, routing, and communications
applications. In robot motion planning, the robot should pass around
the obstacles touching none of them, i.e. the goal is to find a
collision-free path from a starting to a target position. This task has
many specific formulations depending on the shape of obstacles,
allowable directions of movements, knowledge of the scene, etc.
Research on path planning has yielded many fundamentally different
approaches to its solution, mainly based on various decomposition
and roadmap methods. In this paper, we show a possible use of
visibility graphs in point-to-point motion planning in the Euclidean
plane and an alternative approach using Voronoi diagrams that
decreases the probability of collisions with obstacles. The second
application area, investigated here, is focused on problems of finding
minimal networks connecting a set of given points in the plane using
either only straight connections between pairs of points (minimum
spanning tree) or allowing the addition of auxiliary points to the set
to obtain shorter spanning networks (minimum Steiner tree).
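As an illustration of the minimum-spanning-tree variant mentioned above, a straightforward Prim's-algorithm sketch over points in the plane (the point set is made up; this is not the paper's code):

```python
from math import dist

def euclidean_mst(points):
    """Prim's algorithm: return MST edges (index pairs) over plane points."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # pick the shortest straight segment leaving the current tree
        i, j = min(((i, j) for i in in_tree
                    for j in range(n) if j not in in_tree),
                   key=lambda e: dist(points[e[0]], points[e[1]]))
        edges.append((i, j))
        in_tree.add(j)
    return edges

pts = [(0, 0), (1, 0), (0, 1), (5, 5)]
tree = euclidean_mst(pts)
total = sum(dist(pts[i], pts[j]) for i, j in tree)
```

A minimum Steiner tree over the same points may be shorter, since auxiliary points are allowed, but computing it exactly is NP-hard, which is why the straight-connection MST is the common baseline.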
Abstract: Proprietary sensor network systems are typically expensive, rigid, and difficult to integrate with technologies from other vendors. When competing, incompatible technologies are used, a non-proprietary system is complex to create because it requires significant technical expertise and effort, which can be more expensive than a proprietary product. This paper presents the Sensor Abstraction Layer (SAL), which provides middleware architectures with a consistent and uniform view of heterogeneous sensor networks, regardless of the technologies involved. SAL abstracts and hides the hardware disparities and specificities related to accessing, controlling, probing and piloting heterogeneous sensors. SAL is a single software library containing a stable hardware-independent interface with consistent access and control functions to remotely manage the network. The end user has near-real-time access to the collected data via the network, which results in a cost-effective, flexible and simplified system suitable for novice users. SAL has been used to successfully implement several low-cost sensor network systems.
Abstract: The purpose of this study is to investigate the chemical
degradation of the organophosphorus pesticide parathion and the
carbamate insecticide methomyl in the aqueous phase through the
Fenton process. With the employment of a batch Fenton process, the
degradation of the two selected pesticides at different pH, initial
concentration, humic acid concentration, and Fenton reagent dosages
was explored. The Fenton process was found effective in degrading
parathion and methomyl. The optimal dosage of Fenton reagents (i.e.,
the molar concentration ratio of H2O2 to Fe2+) at pH 7 for parathion
degradation was equal to 3, which resulted in 50% removal of
parathion. Similarly, the optimal dosage for methomyl degradation
was 1, resulting in 80% removal of methomyl. This study also found
that the presence of humic substances significantly enhanced
pesticide degradation by the Fenton process. The mass spectrometry
results showed that the hydroxyl free radical may attack the single
bonds with the least energy in the investigated pesticides to form
smaller molecules that are more easily degraded through either
physico-chemical or biological processes.
Abstract: Machine-understandable data, when strongly
interlinked, constitutes the basis for the Semantic Web. Annotating
web documents is one of the major techniques for creating metadata
on the Web. Annotating websites defines the contained data in a
form which is suitable for interpretation by machines. In this paper,
we present a new approach to annotate websites and documents by
promoting the abstraction level of the annotation process to a
conceptual level. By this means, we hope to solve some of the
problems of the current annotation solutions.
Abstract: The present study addresses problems and solutions
related to new functional food production. Wheat (Triticum aestivum
L.) bran obtained from the industrial mill company “Dobeles
dzirnavieks" was investigated as a raw nutrient material for
Bifidobacterium lactis Bb-12. Enzymatic hydrolysis of wheat bran
starch was carried out with α-amylase from Bacillus
amyloliquefaciens (Sigma-Aldrich). Viscozyme L (Sigma-Aldrich)
was used to release reducing sugars. Bifidobacterium lactis Bb-12
(Probio-Tec®, Chr. Hansen) was cultivated in enzymatically
hydrolysed wheat bran mash. All procedures ensured that the number
of active Bifidobacterium lactis Bb-12 in the final product reached
10⁵ CFU g⁻¹. After the enzymatic and bacterial fermentations,
samples were freeze-dried for analysis of chemical compounds. All
experiments were performed at the Faculty of Food Technology of
the Latvia University of Agriculture in January-March 2013. The
obtained results show that both types of wheat bran (enzymatically
treated and non-treated) influenced the fermentative activity and the
number of viable Bifidobacterium lactis Bb-12 in the wheat bran
mash. Acidity increased strongly during the wheat bran mash
fermentation. The main objective of this work was to
create low-energy functional enzymatically and bacterially treated
food from wheat bran using enzymatic hydrolysis of carbohydrates
and following cultivation of Bifidobacterium lactis Bb-12.
Abstract: Agricultural residue such as oil palm fronds (OPF) is
cheap, widespread and available throughout the year. Hemicelluloses
extracted from OPF can be hydrolyzed to their monomers and used in
production of xylooligosaccharides (XOs). The objective of the
present study was to optimize the enzymatic hydrolysis process of
OPF hemicellulose by varying pH, temperature, enzyme and substrate
concentration for the production of XOs. Hemicellulose was extracted
from OPF using 3 M potassium hydroxide (KOH) at a temperature of
40°C for 4 h with stirring at 400 rpm. The hemicellulose was then
hydrolyzed using Trichoderma longibrachiatum xylanase at different
pH values, temperatures, and enzyme and substrate concentrations.
XOs were characterized based on reducing sugar determination. The
optimum conditions to produce XOs from OPF hemicellulose were
obtained at pH 4.6, a temperature of 40°C, an enzyme concentration
of 2 U/mL and a 2% substrate concentration. The results established the suitability of
oil palm fronds as raw material for production of XOs.
Abstract: In the present investigation, H13 tool steel has been
deposited on a copper alloy substrate using both CO2 and diode lasers.
A detailed parametric analysis has been carried out in order to find
the optimum processing zone for coating defect-free H13 tool steel
on the copper alloy substrate. Following parametric optimization, the
microstructure and microhardness of the deposited clads have been
evaluated. SEM micrographs revealed dendritic microstructure in
both clads. However, the microhardness of CO2 laser deposited clad
was much higher compared to diode laser deposited clad.
Abstract: Perceptions of quality from both the designers' and users'
perspectives have now stretched beyond traditional usability,
incorporating abstract and subjective concepts. This has led to a shift
in the focus of human-computer interaction research communities: a
shift toward achieving user experience (UX) by not only fulfilling
conventional usability needs but also those that go beyond them. The
term UX, although widely spread and given significant importance,
lacks consensus in its unified definition. In this paper, we survey
various UX definitions and modeling frameworks and examine them
as the foundation for proposing a UX evolution lifecycle framework
for understanding UX in detail. In the proposed framework we identify
the building blocks of UX and discuss how UX evolves in various
phases. The framework can be used as a tool to understand experience
requirements and evaluate them, resulting in better UX design and
hence improved user satisfaction.
Abstract: A separation-kernel-based operating system (OS) has been designed for use in secure embedded systems by applying formal methods to the design of the separation-kernel part. The separation kernel is a small OS kernel that provides an abstract distributed environment on a single CPU. The design of the separation kernel was verified using two formal methods, the B method and the Spin model checker. A newly designed semi-formal method, the extended state transition method, was also applied. An OS comprising the separation-kernel part and additional OS services on top of the separation kernel was prototyped on the Intel IA-32 architecture. The development and testing of a prototype embedded application, a point-of-sale application, on the prototype OS demonstrated that the proposed architecture and the use of formal methods to design its kernel part are effective for achieving a secure embedded system having a high-assurance separation kernel.
Abstract: This paper presents an efficient emission constrained
economic dispatch algorithm that deals with nonlinear cost function
and constraints. It is then incorporated into the dynamic
programming based hydrothermal coordination program. The
program has been tested on a practical utility system having 32
thermal and 12 hydro generating units. Test results show that a slight
increase in production cost causes a substantial reduction in
emission.
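For context, classic economic dispatch without the emission constraint can be sketched by lambda iteration (equal incremental cost) for quadratic fuel costs; the unit data below are invented, and the paper's emission handling and hydrothermal coordination are not reproduced here.

```python
def dispatch(units, demand, tol=1e-9):
    """units: list of (b, c, pmin, pmax) for cost C(P) = a + b*P + c*P^2.
    Bisect on the incremental cost lambda until outputs meet demand."""
    lo, hi = 0.0, 1000.0
    p = []
    while hi - lo > tol:
        lam = (lo + hi) / 2
        # at marginal cost lam, each unit produces (lam - b) / (2c), clipped
        p = [min(max((lam - b) / (2 * c), pmin), pmax)
             for b, c, pmin, pmax in units]
        if sum(p) < demand:
            lo = lam
        else:
            hi = lam
    return p

# Invented two-unit system: (b, c, pmin, pmax)
units = [(2.0, 0.01, 10.0, 300.0), (3.0, 0.02, 10.0, 200.0)]
p = dispatch(units, 250.0)   # both units end at equal incremental cost
```

Adding an emission constraint typically modifies the objective (e.g., a weighted cost-plus-emission function), which shifts the dispatch away from this pure least-cost solution.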
Abstract: Product Lead Time (PLT) is the period of time from
receiving a customer's order to delivering the final product. PLT is an
indicator of the manufacturing controllability, efficiency and
performance. Due to the explosion in the rate of technological
innovations and the rapid changes in the nature of manufacturing
processes, manufacturing firms can bring the new products to market
quicker only if they can reduce their PLT and speed up the rate at
which they can design, plan, control, and manufacture. Although
there is a substantial body of manufacturing research relating to
cost and quality issues, little specific research has been conducted
on the formulation of PLT, despite its significance and importance.
This paper analyzes and formulates PLT in a way that can be used as
a guideline for achieving a shorter PLT. Furthermore, this paper
identifies the causes of delay and the factors that contribute to
increased product lead time.
Abstract: Detecting protein-protein interactions is a central problem in computational biology, and aberrant interactions have been implicated in a number of neurological disorders. As a result, the prediction of protein-protein interactions has recently received considerable attention from biologists around the globe. Computational tools that are capable of effectively identifying protein-protein interactions are much needed. In this paper, we propose a method to detect protein-protein interactions based on a substring similarity measure. Two protein sequences may interact by means of the similarities of the substrings they contain. When applied to the currently available protein-protein interaction data for the yeast Saccharomyces cerevisiae, the proposed method delivered reasonable improvement over existing methods.
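The general idea of a substring-based similarity can be sketched as follows; the k-mer/Jaccard formulation and the toy sequences are assumptions for illustration, not necessarily the exact measure proposed in the paper.

```python
def kmers(seq, k=3):
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def substring_similarity(a, b, k=3):
    """Jaccard similarity of the length-k substrings of two sequences."""
    ka, kb = kmers(a, k), kmers(b, k)
    return len(ka & kb) / len(ka | kb) if ka or kb else 0.0

# Hypothetical short peptide sequences differing in one residue
score = substring_similarity("MKTAYIAKQR", "MKTAYIAKQL")
```

A score close to 1 means the two sequences share most of their short substrings; a predictor would then threshold such scores (or feed them to a classifier) to call an interaction.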
Abstract: Knowledge modelling, a main activity in the development of Knowledge Based Systems, has no set standards and is mostly done in an ad hoc way. There is a lack of support for the transition from the abstract level to implementation. In this paper, a methodology for the development of the knowledge model, inspired by both Software and Knowledge Engineering, is proposed. The use of UML, the de facto standard for modelling in the software engineering arena, is explored for knowledge modelling. The proposed methodology is used to develop a knowledge model of a knowledge based system for recommending suitable hotels for tourists visiting Mauritius.
Abstract: The scale, complexity and worldwide geographical
spread of the LHC computing and data analysis problems are
unprecedented in scientific research. The complexity of processing
and accessing this data is increased substantially by the size and
global span of the major experiments, combined with the limited
wide area network bandwidth available. We present the latest
generation of the MONARC (MOdels of Networked Analysis at
Regional Centers) simulation framework, as a design and modeling
tool for large scale distributed systems applied to HEP experiments.
We present simulation experiments designed to evaluate the
capabilities of the current real-world distributed infrastructure to
support existing physics analysis processes and the means by which
the experiments band together to meet the technical challenges
posed by the storage, access and computing requirements of LHC
data analysis within the CMS experiment.