Abstract: When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. We consider stereo vision as a method that reconstructs a scene using only a pair of images of the scene and some computation. From several images of a scene, captured from different positions, stereo vision can recover the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, by analogy with the mammalian visual system. In this work we employ only one camera, which is translated along a path and captures images at regular intervals. Since we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses "flies" to reconstruct the principal characteristics of the world according to certain evolutionary rules.
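As an illustration of the evolutionary reconstruction idea described above, the following minimal Python sketch evolves a population of 3D points ("flies") towards photo-consistency between two calibrated views. The function names, parameters and the simple patch-based fitness measure are our own assumptions for illustration; this is not the authors' implementation of the fly algorithm.

```python
# Minimal fly-algorithm-style sketch: evolve 3D points so their projections
# into two images of the scene look locally similar (photo-consistent).
# The projection matrices P1, P2 and the two images are assumed to be given.
import numpy as np

def project(P, X):
    """Project a 3D point X (length-3 array) with a 3x4 camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def fitness(X, img1, img2, P1, P2, half=2):
    """Photo-consistency: negative SSD between small patches around the
    projections of X in both images (higher is better)."""
    def patch(img, uv):
        u, v = int(round(uv[0])), int(round(uv[1]))
        if (v - half < 0 or u - half < 0 or
                v + half + 1 > img.shape[0] or u + half + 1 > img.shape[1]):
            return None
        return img[v - half:v + half + 1, u - half:u + half + 1].astype(float)
    p1, p2 = patch(img1, project(P1, X)), patch(img2, project(P2, X))
    if p1 is None or p2 is None:
        return -np.inf                      # fly projects outside an image
    return -np.sum((p1 - p2) ** 2)

def evolve_flies(img1, img2, P1, P2, bounds, n=300, generations=50, sigma=0.05):
    rng = np.random.default_rng(0)
    lo, hi = np.asarray(bounds[0]), np.asarray(bounds[1])
    flies = rng.uniform(lo, hi, size=(n, 3))    # random initial 3D points
    for _ in range(generations):
        scores = np.array([fitness(f, img1, img2, P1, P2) for f in flies])
        order = np.argsort(scores)[::-1]
        parents = flies[order[:n // 2]]          # keep the fittest half
        children = parents + rng.normal(0, sigma, parents.shape) * (hi - lo)
        flies = np.clip(np.vstack([parents, children]), lo, hi)
    return flies                                 # flies cluster on scene surfaces
```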
Abstract: We present the development of a new underwater laser
cutting process in which a water-jet has been used along with the
laser beam to remove the molten material through the kerf. The
conventional underwater laser cutting usually utilizes a high pressure
gas jet along with laser beam to create a dry condition in the cutting
zone and also to eject the molten material. This causes considerable gas bubbling and turbulence in the water and produces aerosols and waste gas, which may contaminate the surrounding atmosphere
while cutting radioactive components like burnt nuclear fuel. The
water-jet assisted underwater laser cutting process produces much
less turbulence and aerosols in the atmosphere. Some water vapor bubbles are formed at the laser-metal-water interface; however, they tend to condense as they rise through the
surrounding water. We present the design and development of a
water-jet assisted underwater laser cutting head and the parametric
study of the cutting of AISI 304 stainless steel sheets with a 2 kW
CW fiber laser. The cutting performance is similar to that of the gas
assist laser cutting; however, the process efficiency is reduced due to
heat convection by water-jet and laser beam scattering by vapor. This
process may be attractive for underwater cutting of nuclear reactor
components.
Abstract: The performance and the plasma created by a pulsed
magnetoplasmadynamic thruster for small satellite applications are studied to better understand the ablation and plasma propagation
processes occurring during the short-time discharge. The results can
be applied to improve the quality of the thruster in terms of efficiency,
and to tune the propulsion system to the requirements of the satellite
mission. Therefore, plasma measurements with a high-speed camera
and induction probes, and performance measurements of mass bit
and impulse bit were conducted. Values for current sheet propagation
speed, mean exhaust velocity and thrust efficiency were derived from
these experimental data. A maximum in current sheet propagation speed was found in the high-speed camera measurements for a medium energy input and was confirmed by the induction probes. A quasi-linear tendency was found between the mass bit and the energy input (or, equivalently, the current action integral), as well as a linear tendency
between the created impulse and the discharge energy. The highest
mean exhaust velocity and thrust efficiency were found for the highest
energy input.
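For reference, the derived quantities mentioned above are commonly computed from the measured impulse bit, mass bit and stored discharge energy with the standard pulsed-thruster relations below (our notation; the abstract does not state which exact definitions were used):

```latex
% Standard pulsed-thruster relations (assumed definitions, our notation):
% mean exhaust velocity and thrust efficiency from the impulse bit I_bit,
% the ablated mass bit m_bit and the discharge energy E_0.
\bar{c}_e = \frac{I_\mathrm{bit}}{m_\mathrm{bit}}, \qquad
\eta_t    = \frac{I_\mathrm{bit}^{\,2}}{2\, m_\mathrm{bit}\, E_0}
```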
Abstract: This paper proposes a method combining color and layout features for identifying documents captured with low-resolution handheld devices. On one hand, the document image color density surface is estimated and represented by an equivalent ellipse; on the other hand, the document's shallow layout structure is computed and represented hierarchically. Our identification method first uses the color information in the documents to narrow the search space to documents with a similar color distribution, and then selects the document with the most similar layout structure from the remainder of the search space.
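The two-stage identification strategy described above can be illustrated by a short, hypothetical sketch: a color feature first narrows the candidate set, then a layout distance picks the final match. The feature extractors are abstracted as placeholder callables and are not the paper's actual color-ellipse or layout descriptors.

```python
# Hypothetical coarse-to-fine matcher: color features narrow the candidates,
# layout similarity selects the final document.
import numpy as np

def identify(query, database, color_feat, layout_dist, keep_ratio=0.1):
    """database: list of documents; color_feat(doc) -> feature vector,
    layout_dist(a, b) -> structural distance (smaller is more similar)."""
    qc = color_feat(query)
    # Stage 1: keep the documents with the most similar color distribution.
    color_d = [np.linalg.norm(color_feat(d) - qc) for d in database]
    k = max(1, int(keep_ratio * len(database)))
    candidates = [database[i] for i in np.argsort(color_d)[:k]]
    # Stage 2: among those, return the document with the closest layout.
    return min(candidates, key=lambda d: layout_dist(query, d))
```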
Abstract: Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high-priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started in the 1960s. Failure can occur in a transformer for different reasons. Some failures can be limited or prevented by maintenance. Oil filtration is one of the methods to remove the dissolved gases and prevent the deterioration of the oil. In this paper we analyse the DGA data by regression methods and predict the future gas concentrations in the oil. We present a comparative study of different traditional regression methods and the errors in their predictions. With the help of these data we can deduce the health of the transformer by identifying the type of fault that has occurred or may occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
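To make the regression-based forecasting idea concrete, here is a small illustrative sketch (not the paper's exact procedure) that fits simple polynomial trends to a historical dissolved-gas series and extrapolates it; the gas values used in the example are invented.

```python
# Illustrative DGA trend forecasting: fit a polynomial regression to past gas
# concentrations and extrapolate beyond the last sample.
import numpy as np

def forecast_gas(days, ppm, horizon_days, degree=1):
    """Fit a polynomial trend of the given degree to (days, ppm) history and
    predict the concentration 'horizon_days' after the last sample."""
    coeffs = np.polyfit(days, ppm, degree)
    return float(np.polyval(coeffs, days[-1] + horizon_days))

# Example: hypothetical acetylene (C2H2) history in ppm, sampled every 90 days.
days = np.array([0, 90, 180, 270, 360])
ppm  = np.array([3.0, 3.4, 4.1, 4.9, 5.8])
for deg in (1, 2):                        # compare linear vs quadratic trends
    pred = forecast_gas(days, ppm, horizon_days=90, degree=deg)
    print(f"degree {deg}: predicted C2H2 after 90 more days = {pred:.2f} ppm")
```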
Abstract: In this paper we present the semantic assistant agent (SAA), an open source digital library agent which takes a user query for finding information in the digital library, retrieves resource metadata, and stores it semantically. SAA uses the Semantic Web to improve browsing and searching for resources in the digital library. All metadata stored in the library are available in RDF format for querying and processing by SemanSreach, which is part of the SAA architecture. The architecture includes a generic RDF-based model
that represents relationships among objects and their components.
Queries against these relationships are supported by an RDF triple
store.
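As a small illustration of the kind of RDF storage and SPARQL querying described above, the following sketch uses the rdflib library; the vocabulary, resource URIs and sample metadata are invented for the example and are not part of the SAA architecture.

```python
# Store resource metadata as RDF triples and query the triple store.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DC

g = Graph()

# One library item with Dublin Core title and creator metadata.
item = URIRef("http://example.org/library/item/42")
g.add((item, DC.title, Literal("Digital Libraries and the Semantic Web")))
g.add((item, DC.creator, Literal("A. Author")))

# SPARQL query against the triple store: items by a given creator.
results = g.query("""
    PREFIX dc: <http://purl.org/dc/elements/1.1/>
    SELECT ?item ?title WHERE {
        ?item dc:creator "A. Author" .
        ?item dc:title   ?title .
    }
""")
for row in results:
    print(row.item, row.title)
```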
Abstract: Appropriate description of business processes through
standard notations has become one of the most important assets for
organizations. Organizations must therefore deal with quality faults
in business process models such as the lack of understandability and
modifiability. These quality faults may be exacerbated if business
process models are mined by reverse engineering, e.g., from existing
information systems that support those business processes. Hence,
business process refactoring is often used, which changes the internal structure of business processes whilst preserving their external behavior. This paper aims to choose the most appropriate set of
refactoring operators through the quality assessment concerning
understandability and modifiability. These quality features are
assessed through well-proven measures proposed in the literature.
Additionally, a set of measure thresholds is heuristically established
for applying the most promising refactoring operators, i.e., those that
achieve the highest quality improvement according to the selected
measures in each case.
Abstract: The paper presents a simple and accurate formula that has been developed for the conduction angle (δ) of a single-phase half-wave or full-wave controlled rectifier with an RL load. This
formula can be also used for calculating the conduction angle (δ) in
case of A.C. voltage regulator with inductive load under
discontinuous current mode. The simulation results show that the conduction angle calculated from the developed formula agrees very well with that obtained from the exact solution arrived at by the iterative method. Applying the developed formula reduces the computational time as well as the time for manual classroom calculation. In addition, the proposed formula is attractive for real-time implementations.
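The closed-form expression itself is not reproduced in the abstract, so the following Python sketch only illustrates the textbook iterative baseline that such a formula is typically compared against: locating the extinction angle β as the first zero of the load current after the firing angle α, with the conduction angle δ = β − α. This is a generic numerical illustration, not the formula proposed in the paper.

```python
# Exact (iterative) conduction angle for a single-phase half-wave controlled
# rectifier with an RL load fired at alpha: the normalized current is
#   i(wt) ~ sin(wt - phi) - sin(alpha - phi) * exp(-(wt - alpha) / tan(phi)),
# and the extinction angle beta is its first zero, giving delta = beta - alpha.
import math

def conduction_angle(alpha, phi, steps=20000):
    """Return delta = beta - alpha (radians) for firing angle alpha and
    load angle phi = atan(wL/R), with phi > 0."""
    def current(wt):
        return (math.sin(wt - phi)
                - math.sin(alpha - phi) * math.exp(-(wt - alpha) / math.tan(phi)))
    dx = 2.0 * math.pi / steps
    wt = alpha + 1e-6 + dx
    while wt < alpha + 2.0 * math.pi:            # scan for the first sign change
        if current(wt) <= 0.0:
            a, b = wt - dx, wt
            for _ in range(60):                  # bisection refinement
                m = 0.5 * (a + b)
                if current(m) > 0.0:
                    a = m
                else:
                    b = m
            return 0.5 * (a + b) - alpha
        wt += dx
    return 2.0 * math.pi                         # continuous conduction fallback

# Example: alpha = 60 deg, phi = 45 deg (RL load with wL/R = 1).
delta = conduction_angle(math.radians(60), math.radians(45))
print(f"conduction angle = {math.degrees(delta):.1f} deg")
```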
Abstract: Group-III nitride material, particularly AlxGa1-xN, is one of the promising optoelectronic materials required for short-wavelength devices. To achieve high-quality AlxGa1-xN films for a high performance of such devices, AlN nucleation layers are an important factor. To improve the AlN nucleation layers by varying the Ga addition, XRD measurements were conducted to analyze the crystalline quality of the subsequent Al0.1Ga0.9N layers, with minimum ω-FWHMs of the (0002) and (10-10) reflections of 425 arcsec and 750 arcsec, respectively. SEM and AFM measurements were
performed to observe the surface morphology and TEM
measurements to identify the microstructures and orientations.
Results showed that the optimized Ga atoms in the Al(Ga)N nucleation layers improved the surface diffusion, forming crystallites that were more uniform in structure and size, better aligned with each other, and more homogeneously distributed as islands. This in turn improves the orientation of the epilayers on the Si surface and finally improves the crystalline quality and reduces the residual strain of the subsequent Al0.1Ga0.9N layers.
Abstract: An accurate and efficient artificial neural network (ANN) optimized by a genetic algorithm (GA) is developed for predicting the viscosity of nanofluids. The genetic algorithm is used to optimize the neural network parameters so as to minimize the error between the predicted viscosity and the experimental one. Experimental viscosity data for two nanofluids, Al2O3-H2O and CuO-H2O, from 278.15 to 343.15 K and for volume fractions up to 15%, were taken from the literature. The results of this study reveal that the GA-NN model outperforms conventional neural networks in predicting the viscosity of nanofluids, with mean absolute relative errors of 1.22% and 1.77% for Al2O3-H2O and CuO-H2O, respectively. Furthermore, the results of this work have also been compared with other models. The findings demonstrate that the GA-NN model is an effective method for predicting the viscosity of nanofluids, with better accuracy and simplicity than the other models.
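As a generic illustration of coupling a GA with a small neural network for this kind of regression (not the authors' GA-NN model or data), the following sketch evolves the weights of a one-hidden-layer network on synthetic temperature/volume-fraction data:

```python
# Simple GA-trained neural network mapping (temperature, volume fraction) to
# relative viscosity.  The training data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)

def nn_predict(w, X, hidden=6):
    """One-hidden-layer network; w is a flat weight vector."""
    n_in = X.shape[1]
    W1 = w[:n_in * hidden].reshape(n_in, hidden)
    b1 = w[n_in * hidden:n_in * hidden + hidden]
    W2 = w[n_in * hidden + hidden:n_in * hidden + 2 * hidden]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w, X, y):
    return np.mean((nn_predict(w, X) - y) ** 2)

def ga_train(X, y, n_weights, pop=80, gens=200, sigma=0.1):
    """Simple GA: truncation selection plus Gaussian mutation of weight vectors."""
    pop_w = rng.normal(0, 1, size=(pop, n_weights))
    for _ in range(gens):
        fit = np.array([mse(w, X, y) for w in pop_w])
        parents = pop_w[np.argsort(fit)[:pop // 2]]        # lowest error wins
        children = parents + rng.normal(0, sigma, parents.shape)
        pop_w = np.vstack([parents, children])
    return min(pop_w, key=lambda w: mse(w, X, y))

# Synthetic example: viscosity ratio rising with volume fraction, falling with T.
T = rng.uniform(278.15, 343.15, 200)            # temperature, K
phi = rng.uniform(0.0, 0.15, 200)               # particle volume fraction
mu_rel = 1 + 10 * phi - 0.002 * (T - 278.15) + rng.normal(0, 0.02, 200)
X = np.column_stack([(T - 278.15) / 65.0, phi / 0.15])    # scaled inputs
n_w = X.shape[1] * 6 + 6 + 6 + 1
best = ga_train(X, mu_rel, n_w)
print("training MSE:", mse(best, X, mu_rel))
```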
Abstract: The belief K-modes method (BKM) approach is a new
clustering technique handling uncertainty in the attribute values of
objects in both the cluster construction task and the classification one.
Like the standard version of this method, the BKM results depend on the chosen initial modes. Therefore, a method for selecting the initial modes is developed in this paper, aiming at improving the performance of the BKM approach. Experiments with several real data sets show that, with the developed initial-mode selection method, the clustering algorithm produces more accurate results.
Abstract: The performance of schedules released to a shop floor may be greatly affected by unexpected disruptions. Thus, this paper considers the flexible job shop scheduling problem when the processing times of some operations are represented by a uniform distribution with given lower and upper bounds. The objective is to find a predictive schedule that can deal with this uncertainty. The paper compares two genetic approaches to obtaining a predictive schedule. To determine the performance of the predictive schedules obtained by both approaches, an experimental study is conducted on a number of benchmark problems.
Abstract: The main aim of this research is to develop a methodology to raise people's awareness, knowledge and understanding of participation in flood management for cultural heritage, through cooperation and interaction among the government sector, the private sector, and the public, using role-play gaming simulation theory. The approach of this research is to develop a role-play gaming simulation from existing documents, games or role-plays from several sources, and existing data on the research site. We found that role-play gaming simulation can be implemented to help improve understanding of the existing problem and of the impact of flooding on cultural heritage. The role-play game can be developed into a tool to improve people's knowledge, understanding and awareness of participation in flood management for cultural heritage; moreover, cooperation among the government, the private sector and the public can be improved through role-play gaming simulation.
Abstract: This paper discusses the landscape design that could
increase energy efficiency in a house. By planting trees in a house compound, the tree shade prevents direct sunlight from heating up the building and helps cool the surrounding air. The
requirement for air-conditioning could be minimized and the air
quality could be improved. During the lifetime of a tree, the cost savings from these benefits could be up to US$200 per tree. The project intends to visually describe the landscape design in
a house compound that could enhance energy efficiency and
consequently lead to energy saving. The house compound model was
developed in three dimensions using AutoCAD 2005, and the animation was programmed using the LightWave 3D tools, i.e. Modeler and Layout, to display the tree shading on the wall. The visualization was executed on a VRML Pad platform and implemented in a web environment.
Abstract: To investigate the energy performance of solar shading devices, this paper carried out a survey on the current status of solar shading utilization in buildings in Ningbo and performed building simulations to evaluate the energy savings potential of adopting different solar shading devices. Results show that solar shading utilization in this area is neither widespread nor effective and should be considered first in the design stage, since the potential for energy savings is up to 6.8% for residential buildings and 9.4% for commercial buildings.
Abstract: The approach of subset selection in polynomial
regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset that describes the target relation sufficiently well. However, in most cases
the necessary set of basis functions is not known and needs to be
guessed – a potentially non-trivial (and long) trial and error process.
In our research we consider a potentially more efficient approach –
Adaptive Basis Function Construction (ABFC). It lets the model
building method itself construct the basis functions necessary for
creating a model of arbitrary complexity with adequate predictive
performance. However, there are two issues that to some extent
plague the methods of both the subset selection and the ABFC,
especially when working with relatively small data samples: the
selection bias and the selection instability. We try to correct these
issues by model post-evaluation using Cross-Validation and model
ensembling. To evaluate the proposed method, we empirically
compare it to ABFC methods without ensembling, to a widely used
method of subset selection, as well as to some other well-known
regression modeling methods, using publicly available data sets.
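The post-evaluation idea can be illustrated with a small generic sketch: candidate models (here ordinary 1-D polynomials standing in for different constructed basis-function sets) are scored by cross-validation and combined into a CV-weighted ensemble. This is only a sketch of the principle, not the ABFC implementation evaluated in the paper.

```python
# Cross-validation-weighted model ensembling on synthetic 1-D data.
import numpy as np

def cv_mse(x, y, degree, k=5):
    """k-fold cross-validation MSE of a polynomial model of a given degree."""
    idx = np.arange(len(x))
    folds = np.array_split(idx, k)
    errs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        coeffs = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coeffs, x[f]) - y[f]) ** 2))
    return float(np.mean(errs))

def cv_weighted_ensemble(x, y, degrees=(1, 2, 3, 4, 5)):
    """Return a predictor averaging the candidate models with weights 1/CV-MSE."""
    models = [(np.polyfit(x, y, d), 1.0 / cv_mse(x, y, d)) for d in degrees]
    total = sum(w for _, w in models)
    return lambda xq: sum(w * np.polyval(c, xq) for c, w in models) / total

# Synthetic demo data with noise.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 60)
y = np.sin(2 * x) + rng.normal(0, 0.1, x.size)
predict = cv_weighted_ensemble(x, y)
print("prediction at x=0.5:", float(predict(0.5)))
```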
Abstract: For more than 120 years, gold mining formed the backbone of South Africa's economy. The consequences of mine closure have been large-scale land degradation and widespread pollution of surface water and groundwater. This paper investigates
the feasibility of using natural zeolite in removing heavy metals
contaminating the Wonderfonteinspruit Catchment Area (WCA), a
water stream with high levels of heavy metals and radionuclide
pollution. Batch experiments were conducted to study the adsorption
behavior of natural zeolite with respect to Fe2+, Mn2+, Ni2+, and Zn2+.
The data were analysed using the Langmuir and Freundlich isotherms. The Langmuir isotherm was found to correlate the adsorption of Fe2+, Mn2+, Ni2+, and Zn2+ better, with adsorption capacities of 11.9 mg/g, 1.2 mg/g, 1.3 mg/g, and 14.7 mg/g, respectively. Two kinetic models, namely the pseudo-first-order and pseudo-second-order models, were also tested to fit the data. The pseudo-second-order equation was found to give the best fit for the adsorption of heavy metals by natural zeolite. Zeolite
functionalization with humic acid increased its uptake ability.
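For reference, the standard textbook forms of the models mentioned above are given below, where q_e is the equilibrium uptake (mg/g), C_e the equilibrium concentration, q_m and K_L the Langmuir capacity and constant, K_F and n the Freundlich constants, q_t the uptake at time t, and k_2 the pseudo-second-order rate constant (our notation):

```latex
% Langmuir isotherm, Freundlich isotherm, and linearized pseudo-second-order
% kinetics (standard textbook forms):
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}, \qquad
q_e = K_F\, C_e^{1/n}, \qquad
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
```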
Abstract: Due to the stringent legislation on diesel engine emissions and the increasing demands on fuel consumption, the importance of detailed 3D simulation of fuel injection, mixing and combustion has increased in recent years. In the present
work, FIRE code has been used to study the detailed modeling of
spray and mixture formation in a Caterpillar heavy-duty diesel
engine. The paper provides an overview of the submodels
implemented, which account for liquid spray atomization, droplet
secondary break-up, droplet collision, impingement, turbulent
dispersion and evaporation. The simulation was performed from
intake valve closing (IVC) to exhaust valve opening (EVO). The
predicted in-cylinder pressure is validated by comparing with
existing experimental data. Good agreement between the predicted and experimental values confirms the accuracy of the numerical predictions obtained in the present work. Predictions of engine emissions were also performed, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained using the Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper
illustrate that the numerical simulation can be one of the most
powerful and beneficial tools for the internal combustion engine
design, optimization and performance analysis.
Abstract: With the gradual increase in enterprise scale, firms may possess many manufacturing plants located in geographically different places. This change results in multi-site production planning problems in an environment of multiple plants or production resources. Our research proposes a structural
framework to analyze the multi-site planning problems. The analytical
framework is composed of six elements: multi-site conceptual model,
product structure (bill of manufacturing), production strategy,
manufacturing capability and characteristics, production planning
constraints, and key performance indicators. In addition to discussing these six elements, we also review the related literature in this paper to match our analytical framework. Finally, we use a real-world example of a TFT-LCD manufacturer in Taiwan to illustrate our proposed analytical framework for the multi-site production planning problem.
Abstract: Clustering in high dimensional space is a difficult
problem which is recurrent in many fields of science and
engineering, e.g., bioinformatics, image processing, pattern recognition and data mining. In high dimensional space some of
the dimensions are likely to be irrelevant, thus hiding the possible
clustering. In very high dimensions it is common for all the objects in
a dataset to be nearly equidistant from each other, completely
masking the clusters. Hence, performance of the clustering algorithm
decreases.
In this paper, we propose an algorithmic framework which
combines the (reduct) concept of rough set theory with the k-means
algorithm to remove the irrelevant dimensions in a high dimensional
space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
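A simplified sketch of the framework's idea is shown below (our own stand-in for the reduct step, not the authors' exact algorithm): attributes are discretized, attributes whose removal leaves the rough-set indiscernibility partition of the objects unchanged are greedily dropped as an approximate reduct, and k-means is then run on the remaining dimensions.

```python
# Approximate reduct (indiscernibility-preserving attribute removal) followed
# by k-means on the retained dimensions.
import numpy as np
from sklearn.cluster import KMeans

def discretize(X, bins=5):
    """Equal-width binning of each attribute into integer codes."""
    codes = np.empty_like(X, dtype=int)
    for j in range(X.shape[1]):
        edges = np.linspace(X[:, j].min(), X[:, j].max(), bins + 1)
        codes[:, j] = np.clip(np.digitize(X[:, j], edges[1:-1]), 0, bins - 1)
    return codes

def n_classes(codes, attrs):
    """Number of indiscernibility classes induced by the given attributes."""
    return len({tuple(row) for row in codes[:, attrs]})

def approximate_reduct(codes):
    attrs = list(range(codes.shape[1]))
    full = n_classes(codes, attrs)
    for j in sorted(attrs, reverse=True):
        remaining = [a for a in attrs if a != j]
        if remaining and n_classes(codes, remaining) == full:
            attrs = remaining           # attribute j is dispensable
    return attrs

def reduct_kmeans(X, k, bins=5):
    kept = approximate_reduct(discretize(X, bins))
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X[:, kept])
    return labels, kept

# Tiny demo: 2 informative dimensions plus 8 noisy ones.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
X = np.hstack([X, rng.uniform(0, 1, (100, 8))])
labels, kept = reduct_kmeans(X, k=2)
print("kept dimensions:", kept)
```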