Abstract: The simultaneous recovery of copper and DCA from a
simulated MEUF concentrate stream was investigated. The effects of
surfactant (DCA) and metal (copper) concentrations, the surfactant-to-metal
molar ratio (S/M ratio), electroplating voltage, EDTA
concentration, solution pH, and salt concentration on metal recovery
and current efficiency were studied. An applied voltage of -0.5 V was
shown to be the optimum operating condition in terms of Cu recovery,
current efficiency, and surfactant recovery. Cu recovery and
current efficiency both increased with increasing Cu concentration
when the DCA concentration was held constant. However, increasing
both the Cu and DCA concentrations while keeping the S/M ratio constant at
2.5 had a detrimental effect on Cu recovery at DCA concentrations
higher than 15 mM. Cu recovery decreased with increasing pH, while
current efficiency showed the opposite trend. Conductivity is believed
to be the main cause of the discrepancy between Cu recovery and
current efficiency observed at different pH values. Finally, it was shown that
EDTA had an adverse effect on both Cu recovery and current efficiency,
while the addition of NaCl had a negative impact on current efficiency
at concentrations higher than 8000 mg/L.
Abstract: A data warehouse (DW) is a system whose value lies in supporting decision-making through queries. Queries to a DW are critical because of their complexity and length: they often access millions of tuples and involve joins between relations as well as aggregations. Materialized views can substantially improve the performance of DW queries. However, these views incur a maintenance cost, so materializing all views is not feasible. An important challenge in a DW environment is therefore materialized view selection, since we must manage the trade-off between query performance and view maintenance cost. In this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), together with the Multiple View Processing Plan (MVPP). Our experiments show that our method provides a further improvement in terms of both query processing cost and view maintenance cost.
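The 2PO control flow itself can be sketched in a few lines. The instance below is a toy with hypothetical cost numbers and an independent per-query view model, not the paper's MVPP cost model; it only illustrates the II-then-SA search and the trade-off between query performance and view maintenance cost.

```python
import math
import random

# Toy instance (hypothetical numbers): query q can be answered from base
# tables or, more cheaply, from a materialized view; each candidate view
# has a maintenance cost. A real MVPP cost model would share subexpressions.
QUERY_BASE = [100, 80, 120, 90]   # cost of answering query q from base tables
QUERY_VIEW = [10, 8, 15, 9]       # cost of answering query q from its view
MAINT = [30, 90, 40, 20]          # maintenance cost of materializing each view

def total_cost(state):
    # Total cost = query processing cost + maintenance cost of chosen views
    query = sum(QUERY_VIEW[q] if state[q] else QUERY_BASE[q]
                for q in range(len(state)))
    maint = sum(MAINT[v] for v in range(len(state)) if state[v])
    return query + maint

def iterative_improvement(state):
    # Phase 1 (II): hill-climb by flipping one view at a time until no
    # single flip lowers the total cost (a local minimum).
    improved = True
    while improved:
        improved = False
        for v in range(len(state)):
            cand = state.copy()
            cand[v] = not cand[v]
            if total_cost(cand) < total_cost(state):
                state, improved = cand, True
    return state

def two_phase_optimization(seed=1, temp=20.0, cooling=0.9, steps=100):
    rng = random.Random(seed)
    # Phase 1: iterative improvement from a random starting state
    state = iterative_improvement([rng.random() < 0.5 for _ in MAINT])
    best = state
    # Phase 2: simulated annealing with a low initial temperature,
    # occasionally accepting uphill moves to escape the local minimum.
    for _ in range(steps):
        cand = state.copy()
        v = rng.randrange(len(cand))
        cand[v] = not cand[v]
        delta = total_cost(cand) - total_cost(state)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            state = cand
        if total_cost(state) < total_cost(best):
            best = state
        temp *= cooling
    return best

best = two_phase_optimization()   # -> materialize views 0, 2, 3 but not 1
```

On this toy instance the trade-off is visible: view 1 would save 72 in query cost but costs 90 to maintain, so the search leaves it unmaterialized.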
Abstract: Fisheries management all around the world is
hampered by the lack, or poor quality, of critical data on fish
resources and fishing operations. The main reasons for the chronic
inability to collect good-quality data during fishing operations are the
culture of secrecy common among fishers and the lack of modern
data-gathering technology onboard most fishing vessels. In response,
OLRAC-SPS, a South African company, developed fisheries data-logging
software (eLog for short) and named it Olrac. The Olrac eLog
solution is capable of collecting, analysing, plotting, mapping,
reporting, tracing and transmitting all data related to fishing
operations. Olrac can be used by skippers, fleet/company managers,
offshore mariculture farmers, scientists, observers, compliance
inspectors and fisheries management authorities. The authors believe
that using eLog onboard fishing vessels has the potential to
revolutionise the entire process of data collection and reporting
during fishing operations and, if properly deployed and utilised,
could transform the entire commercial fleet into a provider of good-quality
data and forever change the way fish resources are managed.
In addition, it would make it possible to trace catches back to the
individual fishing operation, to improve fishing efficiency, and to
dramatically improve the control of fishing operations and the
enforcement of fishing regulations.
Abstract: We have defined two suites of metrics, which cover
static and dynamic aspects of component assembly. The static
metrics measure complexity and criticality of component assembly,
wherein complexity is measured using Component Packing Density
and Component Interaction Density metrics. Further, four criticality
conditions, namely Link, Bridge, Inheritance and Size criticality,
have been identified and quantified. The complexity and criticality
metrics are combined to form a Triangular Metric, which can be used
to classify the type and nature of applications. Dynamic metrics are
collected during the runtime of a complete application. Dynamic
metrics are useful for identifying super-components and for evaluating the
degree of utilisation of various components. In this paper, both static
and dynamic metrics are evaluated using Weyuker's set of properties.
The results show that the metrics provide a valid means to measure
issues in component assembly. We relate our metrics suite to
McCall's Quality Model and illustrate its impact on product
quality and on the management of component-based product
development.
Abstract: Faced with social and health system capacity
constraints and rising and changing demand for welfare services,
governments and welfare providers are increasingly relying on
innovation to help support and enhance services. However, the
evidence reported by several studies indicates that the realization of
that potential is not an easy task. Innovations can be deemed
inherently complex to implement and operate, because many of them
involve a combination of technological and organizational renewal
within an environment featuring a diversity of stakeholders. Many
public welfare service innovations are markedly systemic in their
nature, which means that they emerge from, and must address, the
complex interplay between political, administrative, technological,
institutional and legal issues. This paper suggests that stakeholders
dealing with systemic innovation in welfare services must deal with
ambiguous and incomplete information in circumstances of
uncertainty. Employing a literature review and a case
study, this paper identifies, categorizes and discusses different
aspects of the uncertainty of systemic innovation in public welfare
services, and argues that uncertainty can be classified into eight
categories: technological uncertainty, market uncertainty,
regulatory/institutional uncertainty, social/political uncertainty,
acceptance/legitimacy uncertainty, managerial uncertainty, timing
uncertainty and consequence uncertainty.
Abstract: Hydrogen is an important chemical in many industries
and it is expected to become one of the major fuels for energy
generation in the future. Unfortunately, hydrogen does not exist in its
elemental form in nature and therefore has to be produced from
hydrocarbons, hydrogen-containing compounds or water.
Above its critical point (374.8 °C and 22.1 MPa), water has a lower
density and viscosity, and a higher heat capacity, than
ambient water. Mass transfer in supercritical water (SCW) is
enhanced due to its increased diffusivity and transport ability. The
reduced dielectric constant makes supercritical water a better solvent
for organic compounds and gases. Hence, owing to these
desirable properties, there is growing interest in studies of the
gasification of biomass-containing organic matter and
model biomass solutions in supercritical water.
In this study, hydrogen and biofuel production by the catalytic
gasification of 2-propanol under supercritical water conditions was
investigated. Pt/Al2O3 and Ni/Al2O3 were the catalysts used in the
gasification reactions. All of the experiments were performed under a
constant pressure of 25 MPa. The effects of five reaction temperatures
(400, 450, 500, 550 and 600°C) and five reaction times (10, 15, 20,
25 and 30 s) on the gasification yield and flammable component
content were investigated.
Abstract: In this paper we have proposed a methodology to
develop an amperometric biosensor for the analysis of glucose
concentration using a simple microcontroller based data acquisition
system. The work involves the development of a Detachable
Membrane Unit (an enzyme-based biomembrane) with glucose oxidase
immobilized on the membrane, and its interfacing to the
signal conditioning system. The current generated by the biosensor
for different glucose concentrations was signal conditioned, then
acquired and computed by a simple AT89C51 microcontroller. The
optimum operating parameters for best performance were determined
and are reported. A detailed performance evaluation of the biosensor
has been carried out. The proposed microcontroller-based biosensor
system has a sensitivity of 0.04 V/(g/dl) and a resolution of
50 mg/dl. It exhibited very good inter-day stability over a period of
30 days. Compared with a reference method (HPLC), the
accuracy of the proposed biosensor system is within ± 1.5%.
The system can be used for real-time analysis of glucose
concentration in fields such as food and fermentation, as well as in
clinical (in vitro) applications.
Abstract: Water hyacinth has been used in aquatic systems for
wastewater purification for many years worldwide. The role of the
water hyacinth (Eichhornia crassipes) in polishing nitrate and
phosphorus from municipal wastewater treatment plant
effluent by phytoremediation was evaluated. The objective
of this project was to determine the removal efficiency of water
hyacinth in polishing nitrate and phosphorus, as well as chemical
oxygen demand (COD) and ammonia. Water hyacinth is considered
one of the most efficient aquatic plants for removing a wide range of
pollutants such as organic matter, nutrients and heavy metals. The water
hyacinth, a macrophyte, was cultivated in a
treatment house in a reactor tank of approximately 90 (L) x 40 (W) x
25 (H) in dimension, built with three compartments. Three water
hyacinths were placed in each compartment, and water samples from
each compartment were collected every two days. Plant
observation comprised weight measurement, plant uptake and
new young shoot development. Water hyacinth effectively removed
approximately 49% of COD, 81% of ammonia, 67% of phosphorus
and 92% of nitrate. It also showed a significant growth rate, starting
from day 6 at 0.33 shoots/day and increasing to 0.38
shoots/day by the end of day 24. The study
showed that water hyacinth is capable of polishing the effluent of
municipal wastewater containing undesirable amounts of nitrate
and phosphorus.
Abstract: Based on the homotopy perturbation method (HPM)
and Padé approximants (PA), approximate and exact solutions are
obtained for cubic Boussinesq and modified Boussinesq equations.
The obtained solutions include solitary-wave and rational solutions.
HPM is used for the analytic treatment of these equations, and PA for
enlarging the convergence region of the HPM analytical solution.
The results reveal that HPM enhanced with PA is
very effective, convenient and quite accurate for such types of partial
differential equations.
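The effect PA has on a series solution can be illustrated numerically. The sketch below constructs a generic [L/M] Padé approximant from Taylor coefficients (the Boussinesq solutions themselves are too long for a short example); even for exp(x), the [2/2] approximant built from five series coefficients lies closer to the true value at x = 1 than the truncated series using the same coefficients.

```python
import numpy as np

def pade(a, L, M):
    """[L/M] Pade approximant from Taylor coefficients a[0..L+M].

    Returns numerator and denominator coefficients in ascending powers,
    with the denominator normalized so that q[0] = 1.
    """
    a = np.asarray(a, dtype=float)
    # Denominator: solve sum_{j=1}^{M} q_j a_{L+k-j} = -a_{L+k}, k = 1..M
    A = np.array([[a[L + k - j] if L + k - j >= 0 else 0.0
                   for j in range(1, M + 1)]
                  for k in range(1, M + 1)])
    q = np.concatenate(([1.0], np.linalg.solve(A, -a[L + 1:L + M + 1])))
    # Numerator: p_k = sum_{j=0}^{min(k,M)} q_j a_{k-j}, k = 0..L
    p = np.array([sum(q[j] * a[k - j] for j in range(min(k, M) + 1))
                  for k in range(L + 1)])
    return p, q

# Taylor coefficients of exp(x): 1, 1, 1/2!, 1/3!, 1/4!
a = [1.0, 1.0, 1/2, 1/6, 1/24]
p, q = pade(a, 2, 2)
x = 1.0
pade_val = np.polyval(p[::-1], x) / np.polyval(q[::-1], x)   # ~2.7143
taylor_val = np.polyval(a[::-1], x)                          # ~2.7083
```

With the same five coefficients, the rational form is noticeably closer to e ≈ 2.71828; enlarging the useful range of a truncated expansion in exactly this way is what the HPM-PA combination exploits.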
Abstract: Collaborative working environments for distance
education can be considered as a more generic form of contemporary
remote labs. At present, the majority of existing real laboratories are
not constructed to allow the involved participants to collaborate in
real time. To make this revolutionary learning environment possible
we must allow the different users to carry out an experiment
simultaneously. In recent times, multi-user environments have been
successfully applied in many domains, such as air traffic control
systems, team-oriented military systems, text-chat tools, multi-player
games, etc. Understanding the ideas and techniques behind these
systems can therefore contribute valuable ideas to
our e-learning environment for collaborative working. In this
investigation, collaborative working environments from theoretical
and practical perspectives are considered in order to build an
effective collaborative real laboratory, which allows two students or
more to conduct remote experiments at the same time as a team. To
achieve this goal, we have implemented a distributed system
architecture, enabling students to obtain automated help from either
a human tutor or a rule-based e-tutor.
Abstract: Partial discharge (PD) detection is an important
method to evaluate the insulation condition of metal-clad apparatus.
Non-intrusive sensors, which are easy to install and do not
interrupt operation, are preferred in onsite PD detection.
However, such detection often lacks accuracy due to interference in the PD
signals. In this paper a novel PD extraction method that uses frequency
analysis and entropy-based time-frequency (TF) analysis is introduced.
The repetitive pulses from the converter are first removed via frequency
analysis. Then, the relative entropy and relative peak frequency of
each pulse (i.e. its time-indexed TF spectrum vector) are calculated, and
all pulses with similar parameters are grouped. According to the
characteristics of non-intrusive sensor and the frequency distribution
of PDs, the PD pulses are separated from the interference. Finally, the
PD signal and the interference are recovered via the inverse TF transform.
The de-noising results on noisy PD data demonstrate that the
combination of frequency and time-frequency techniques can
discriminate PDs from interference with various frequency
distributions.
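The two grouping parameters can be sketched as follows. This is an illustrative reconstruction rather than the authors' implementation: the spectrum normalization, the choice of reference spectrum, and reporting the peak as a fraction of the band are assumptions made for the example.

```python
import numpy as np

def pulse_features(pulse, ref_spectrum):
    """Relative entropy and relative peak frequency of one pulse."""
    # Normalized magnitude spectrum, treated as a probability distribution
    s = np.abs(np.fft.rfft(pulse))
    s = s / s.sum()
    # Relative entropy (KL divergence) against a reference spectrum;
    # a small epsilon avoids log(0) on empty bins
    eps = 1e-12
    rel_entropy = float(np.sum(s * np.log((s + eps) / (ref_spectrum + eps))))
    # Peak frequency as a fraction of the analysed band
    peak_freq = int(np.argmax(s)) / len(s)
    return rel_entropy, peak_freq

# Two synthetic pulses at the same carrier frequency -> similar features;
# a pulse at another frequency -> clearly different features.
n = 256
t = np.arange(n)
window = np.hanning(n)
p1 = window * np.sin(2 * np.pi * 0.10 * t)
p2 = window * 0.7 * np.sin(2 * np.pi * 0.10 * t)
p3 = window * np.sin(2 * np.pi * 0.30 * t)

ref = np.abs(np.fft.rfft(p1))
ref = ref / ref.sum()
f1, f2, f3 = (pulse_features(p, ref) for p in (p1, p2, p3))
```

Pulses whose (entropy, peak-frequency) pairs fall close together would be placed in one group; the groups are then labelled as PD or interference using the sensor's frequency response and the expected PD frequency distribution, as described above.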
Abstract: When cars are released from the factory, strut noises are very small and therefore difficult to perceive. As usage time and travel distance increase, however, strut noises grow large enough to cause users considerable discomfort. Noises generated in the field include engine and flow noises, which make it difficult to clearly discern the noises generated by the struts. This study developed a test method which can reproduce field strut noises in the lab. Using the newly developed noise evaluation test, this study analyzed the effects that insulator performance degradation and failure can have on car noises. The study also confirmed that an insulator durability test based on simple back-and-forth motion cannot fully reproduce the part failures observed in the field. Based on this, the study further confirmed that field noises can be reproduced through a durability test that takes heat aging into account.
Abstract: A new digital watermarking technique for images that
are sensitive to blocking artifacts is presented. Experimental results
show that the proposed MDCT based approach produces highly
imperceptible watermarked images and is robust to attacks such as
compression, noise, filtering and geometric transformations. The
proposed MDCT watermarking technique is applied to fingerprints
for ensuring security. The face image and demographic text data of
an individual are used as multiple watermarks. An automated fingerprint
identification system (AFIS) was used to quantitatively evaluate the
matching performance of the MDCT-based watermarked fingerprints. The high fingerprint
matching scores show that the MDCT approach is resilient to
blocking artifacts. The quality of the extracted face and extracted text
images was computed using two human visual system metrics and
the results show that the image quality was high.
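The transform-domain embedding idea can be illustrated compactly. The sketch below is not the paper's method: it uses a plain orthonormal DCT-II (rather than the MDCT) with a quantization-index-modulation rule, and the step size and coefficient positions are hypothetical; it only shows how watermark bits survive a round trip through a transform.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix: rows are cosine basis vectors
    k = np.arange(n)[:, None]   # frequency index (rows)
    m = np.arange(n)[None, :]   # sample index (columns)
    M = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    M[0] /= np.sqrt(2.0)
    return M

def embed_bits(block, bits, coeff_idx, step=20.0):
    """Quantization index modulation on selected transform coefficients:
    each coefficient is snapped to an even or odd multiple of `step`
    according to the bit it carries."""
    M = dct_matrix(len(block))
    c = M @ block
    for bit, i in zip(bits, coeff_idx):
        q = int(np.round(c[i] / step))
        if q % 2 != bit:
            q += 1
        c[i] = q * step
    return M.T @ c            # inverse of an orthonormal transform

def extract_bits(block, coeff_idx, step=20.0):
    c = dct_matrix(len(block)) @ block
    return [int(np.round(c[i] / step)) % 2 for i in coeff_idx]

# Hypothetical "image row" and mid-frequency coefficient positions
rng = np.random.default_rng(0)
row = rng.uniform(0.0, 255.0, size=64)
positions = [10, 15, 20]
marked = embed_bits(row, [1, 0, 1], positions)
```

Because the embedding only nudges a few mid-band coefficients, the marked signal stays close to the original while the bits remain exactly recoverable, which is the property the imperceptibility and robustness results above rely on.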
Abstract: In digital signal processing it is important to
approximate multi-dimensional data by the method called rank
reduction, in which the rank of the data is reduced from
higher to lower. For 2-dimensional data, the singular value
decomposition (SVD) is one of the best-known rank reduction
techniques. In addition, an outer product expansion extending the SVD
to multi-dimensional data was proposed and implemented, and has
been widely applied to image processing and pattern recognition.
However, the multi-dimensional outer product expansion is
computationally expensive and lacks orthogonality between the
expansion terms. We have therefore proposed an alternative method,
the Third-order Orthogonal Tensor Product Expansion (3-OTPE for short).
3-OTPE uses the power method instead of a nonlinear optimization
method in order to reduce computing time. Around the same time, the group
of De Lathauwer proposed the Higher-Order SVD (HOSVD), which is
also developed as an SVD extension for multi-dimensional data.
3-OTPE and HOSVD are similar in their approach to the rank reduction of
multi-dimensional data. The two methods produce results that are
sometimes identical and sometimes slightly different. In this paper, we
compare 3-OTPE to HOSVD in terms of calculation accuracy and
computing time, and clarify the differences between the two methods.
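For 2-dimensional data, the rank reduction that both 3-OTPE and HOSVD generalize is the truncated SVD, which can be stated in a few lines (the tensor methods themselves are not reproduced here):

```python
import numpy as np

def rank_reduce(X, r):
    """Best rank-r approximation of a matrix in the least-squares
    sense (Eckart-Young), via the truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# A rank-2 matrix corrupted by small noise is recovered well at r = 2
rng = np.random.default_rng(0)
X_true = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
X_noisy = X_true + 0.01 * rng.standard_normal((50, 40))
X_hat = rank_reduce(X_noisy, 2)
```

HOSVD applies the same idea mode by mode, truncating an orthonormal basis for each unfolding of the tensor, while 3-OTPE builds the expansion term by term using the power method.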
Abstract: Ontologies are widely used as a tool for organizing
information, creating relations between subjects within a
defined knowledge domain. Various fields such as civil engineering,
biology and management have successfully integrated ontologies into
decision support systems for managing domain knowledge and
assisting their decision makers. Gross pollutant traps (GPTs) are devices
used to trap large items or hazardous particles and prevent them from
polluting and entering our waterways. However, selecting a
suitable GPT is a challenge in Malaysia, as there are inadequate
GPT data repositories being captured and shared. Hence an ontology is
needed to capture, organize and represent this knowledge as
meaningful information, which can contribute to the efficiency of
GPT selection in Malaysian urbanization. A GPT ontology framework
is therefore built as the first step towards capturing GPT knowledge, which
will then be integrated into the decision support system. This paper
provides several examples of the GPT ontology and explains how
it is constructed using the Protégé tool.
Abstract: The circle grid space-filling plate is a flow conditioner with a fractal pattern, used to eliminate turbulence originating from pipe fittings in experimental fluid flow applications. In this paper, steady-state, incompressible, swirling turbulent flow through a circle grid space-filling plate has been studied. The solution and the analysis were carried out using the finite volume CFD solver FLUENT 6.2. Three turbulence models were used in the numerical investigation and their results were compared with the pressure drop correlation of BS EN ISO 5167-2:2003. The turbulence models investigated here are the standard k-ε, the realizable k-ε, and the Reynolds Stress Model (RSM). The results showed that the RSM model gave the best agreement with the ISO pressure drop correlation. The effects of circle grid space-filling plate thickness and Reynolds number on the flow characteristics have been investigated as well.
Abstract: The development of shape and size of a crack in a
pressure vessel under uniaxial and biaxial loadings is important in
fitness-for-service evaluations such as leak-before-break. In this
work finite element modelling was used to evaluate the mean stress
and the J-integral around the front of a surface-breaking crack. A
procedure based on the ductile tearing resistance curves of high- and
low-constraint fracture mechanics geometries was developed to
estimate the amount of ductile crack extension for surface-breaking
cracks and to show the evolution of the initial crack shape. The
results showed non-uniform constraint levels and crack driving forces
around the crack front at large deformation levels. It was also shown
that initially semi-elliptical surface cracks under biaxial load
developed higher constraint levels around the crack front than in
uniaxial tension. However, similar crack shapes were observed, with
more extension associated with cracks under biaxial loading.
Abstract: The third phase of the web, the Semantic Web, requires many web pages annotated with metadata. Thus, a crucial question is where to acquire these metadata. In this paper we propose a semi-automatic method to annotate the text of documents and web pages, employing a fairly comprehensive knowledge base to categorize instances with respect to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools offering similar functionality. The approach is implemented in the .NET framework and uses WordNet as its knowledge base, yielding an annotation tool for the Semantic Web.
Abstract: Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. Among methods for finding essential features, one approach works via the decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision-tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but use a pruning condition as the threshold mechanism for choosing features. This paper proposes a threshold measure based on the Manhattan hierarchical cluster distance, to be used in feature ranking for choosing relevant features as part of the feature selection process. The results are promising, and the method can be improved in future work by including test cases with larger numbers of attributes.
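A decision-tree-style feature ranking can be sketched as follows. This is a generic illustration, not the proposed method: the single-split Gini gain stands in for a full tree inducer, and a fixed placeholder cutoff replaces the Manhattan hierarchical-cluster-distance threshold introduced in the paper.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label vector
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split_gain(x, y):
    # Best single-threshold Gini gain achievable on one feature
    base, best = gini(y), 0.0
    values = np.unique(x)
    for t in (values[:-1] + values[1:]) / 2:   # midpoints between values
        left, right = y[x <= t], y[x > t]
        w = len(left) / len(y)
        best = max(best, base - w * gini(left) - (1 - w) * gini(right))
    return best

def rank_features(X, y, cutoff=0.1):
    # Rank features by gain; keep those above a placeholder cutoff
    gains = np.array([best_split_gain(X[:, j], y) for j in range(X.shape[1])])
    order = np.argsort(gains)[::-1]
    selected = [j for j in order if gains[j] > cutoff]
    return order, gains, selected

# One informative feature (separates the classes) and one noise feature
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 20)
X = np.column_stack([y + 0.1 * rng.standard_normal(40),
                     rng.standard_normal(40)])
order, gains, selected = rank_features(X, y)
```

The informative feature receives the maximum possible gain and is ranked first; the role of the threshold measure proposed in the paper is precisely to decide, in a principled way, where to cut such a ranking.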
Abstract: The creation of a sustainable future depends on the knowledge and involvement of people, as well as on an understanding of the consequences of individual actions. The construction industry has long been associated with detrimental effects on the environment. In Malaysia, the government, professional bodies and private companies are beginning to take heed of the necessity to reduce this environmental problem without restraining the need for development. This paper focuses on the actions undertaken by the Malaysian government, non-governmental organizations and construction players to promote sustainability in construction. To ensure that those concerted efforts are not merely skin deep in their impact, a survey was conducted to investigate the awareness of developers regarding this issue and whether they have absorbed the concept of sustainable construction into their current practices. The survey revealed that although developers are aware of the rising issues surrounding sustainability, little effort has been made by them to implement it. More effort is necessary to boost this application and to further stimulate actions and strategies towards a sustainable built environment.