Abstract: Thin film deposition technology is of interest in
many engineering fields, from electronics manufacturing to
corrosion-protective coatings. A typical deposition process, like
that developed at the University of Eindhoven, deposits a thin,
amorphous film of C:H or Si:H on the substrate using the
Expanding Thermal arc Plasma technique. In this paper a computing
procedure is proposed to simulate the flow field in a deposition
chamber similar to that at the University of Eindhoven, and a
sensitivity analysis is carried out in terms of the precursor mass
flow rate, the electrical power supplied to the torch, and the
fluid-dynamic characteristics of the plasma jet obtained with
different nozzles. For this purpose, a deposition chamber similar
in shape, dimensions and operating parameters to the above-mentioned
chamber is considered. Furthermore, a method is proposed for a very
preliminary evaluation of the film thickness distribution on the
substrate. The computing procedure relies on two codes working in
tandem; the output of the first code is the input to the second.
The first code simulates the flow field in the torch, where argon is
ionized according to the Saha equation, and in the nozzle. The second
code simulates the flow field in the chamber; due to the high
rarefaction level, this is a (commercial) Direct Simulation Monte
Carlo code. The gas is a mixture of 21 chemical species, and 24
chemical reactions of the argon/acetylene plasma are implemented in
both codes. The effects of the above-mentioned operating parameters
are evaluated and discussed by means of 2-D maps and profiles of
important thermo-fluid-dynamic quantities such as Mach number,
velocity and temperature. The intensity, position and extent of the
shock wave are evaluated, and the influence of the test conditions
on film thickness and uniformity of distribution is also assessed.
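The degree of argon ionization mentioned above follows from the Saha equation, which balances ionization against recombination at a given temperature and density. A minimal sketch of that balance is below; the physical constants are standard, but the statistical-weight ratio of 6 for Ar+/Ar and the sample temperatures and density are illustrative assumptions, not values from the paper's torch model.

```python
import math

# Physical constants (SI units).
M_E = 9.109e-31        # electron mass [kg]
K_B = 1.381e-23        # Boltzmann constant [J/K]
H = 6.626e-34          # Planck constant [J*s]
EV = 1.602e-19         # 1 eV in joules
CHI_AR = 15.76 * EV    # first ionization energy of argon [J]
G_RATIO = 6.0          # assumed Ar+/Ar statistical-weight ratio

def saha_ionization_fraction(T, n_total):
    """Fraction of argon nuclei ionized at temperature T [K] and total
    nuclei density n_total [m^-3], assuming single ionization and
    quasi-neutrality (n_e = n_i)."""
    s = 2.0 * G_RATIO * (2.0 * math.pi * M_E * K_B * T / H**2) ** 1.5 \
        * math.exp(-CHI_AR / (K_B * T))
    # Saha gives x^2 / (1 - x) * n_total = s; solve the quadratic for x.
    return (-s + math.sqrt(s * s + 4.0 * s * n_total)) / (2.0 * n_total)

x_cool = saha_ionization_fraction(8000.0, 1e22)    # weakly ionized
x_hot = saha_ionization_fraction(15000.0, 1e22)    # almost fully ionized
```

The steep temperature dependence of the exponential term is what makes the torch region (hot, highly ionized) so different from the expanding jet downstream.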
Abstract: The purpose of this paper is to develop models for predicting student success. Such models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey was carried out among the final-year undergraduate student population using random sampling. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have proved to be a good method for classifying student success, and they could be further improved by increasing the survey sample and by developing specialized decision trees for each type of college. Methods of this kind have great potential for use in decision support systems.
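The core of building such a decision tree is choosing the split that best separates successful from unsuccessful students. The sketch below finds one Gini-optimal threshold in pure Python; the "hours studied" feature and the toy survey data are hypothetical, standing in for the paper's real survey attributes.

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 * p1 - (1.0 - p1) ** 2

def best_threshold(xs, ys):
    """Exhaustively pick the split threshold minimizing weighted Gini."""
    best_t, best_score = None, float("inf")
    cand = sorted(set(xs))
    for a, b in zip(cand, cand[1:]):
        t = (a + b) / 2.0               # midpoint between adjacent values
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Toy survey data: weekly study hours vs. on-time graduation (1 = success).
hours = [1, 2, 3, 8, 9, 10]
success = [0, 0, 0, 1, 1, 1]
t = best_threshold(hours, success)
predict = lambda h: 1 if h > t else 0
```

A full tree repeats this split search recursively on each side; the paper's trees additionally compare GPA-based and time-to-degree-based target variables.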
Abstract: Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied powerfully in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need for a mechanism to translate the formal models into executable software in a simple and transparent way. This paper introduces the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
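An X-machine extends a finite state machine with a memory and with processing functions attached to the transitions. The sketch below drives a tiny stream X-machine in Python; the "forager" agent, its states, inputs and functions are invented for illustration, not taken from the paper's models.

```python
# Processing functions phi: (memory, input) -> (output, new memory).
def see_food(mem, inp):
    return "approach", mem

def eat(mem, inp):
    return "eating", mem + 1        # memory counts food items consumed

def rest(mem, inp):
    return "idle", mem

# Next-state relation plus the processing function labelling each arc.
TRANSITIONS = {
    ("searching", "food_seen"): ("feeding", see_food),
    ("feeding", "food_item"): ("feeding", eat),
    ("feeding", "no_food"): ("searching", rest),
}

def run(inputs, state="searching", mem=0):
    """Drive the X-machine over an input stream, collecting outputs."""
    outputs = []
    for inp in inputs:
        next_state, phi = TRANSITIONS[(state, inp)]
        out, mem = phi(mem, inp)
        outputs.append(out)
        state = next_state
    return state, mem, outputs

final_state, food_count, trace = run(
    ["food_seen", "food_item", "food_item", "no_food"])
```

Because behaviour is split into a small control structure plus explicit memory updates, test sets can exercise states and processing functions separately, which is the testing leverage the abstract alludes to.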
Abstract: This paper describes futures trading and aims to
design a speculator's trading strategy. The problem is formulated as
a decision-making task and solved as such. The exact solution of the
task leads to complex mathematical problems, so approximations of the
decision making are required. Two kinds of approximation are used in
the paper: Monte Carlo simulation for the multi-step prediction and
iterations spread in time for the optimization. The solution is
applied to real-market data, and the results of off-line experiments
are presented.
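The Monte Carlo multi-step prediction amounts to simulating many future price paths and averaging them. A minimal sketch follows; the geometric-random-walk price model and the drift/volatility numbers are illustrative assumptions, not the paper's market model.

```python
import math
import random

def predict_terminal_price(price0, drift, sigma, steps, n_paths, rng):
    """Average terminal price over simulated multi-step price paths."""
    total = 0.0
    for _ in range(n_paths):
        p = price0
        for _ in range(steps):
            # One-step log-return: deterministic drift plus Gaussian noise.
            p *= math.exp(drift + sigma * rng.gauss(0.0, 1.0))
        total += p
    return total / n_paths

rng = random.Random(0)                  # fixed seed for reproducibility
expected = predict_terminal_price(100.0, 0.01, 0.05, 5, 2000, rng)
# A speculator could go long when the predicted price exceeds today's.
decision = "long" if expected > 100.0 else "flat"
```

Increasing the number of paths tightens the estimate at the cost of computation, which is exactly the trade-off that makes this approximation practical for the multi-step case.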
Abstract: This paper presents a generalized form of the
mechanistic deconvolution technique (GMD) for modeling image sensors
in various pan–tilt planes of view. The mechanistic deconvolution
technique (UMD) is modified with the given angles of a pan–tilt plane
of view to formulate constraint parameters and characterize distortion
effects, and thereby determine the corrected image data. As a result,
no experimental setup or calibration is required. Due to the
mechanistic nature of the sensor model, the image plane need not be
orthogonal to the sensor's z-axis, and the dependency on image data is
reduced. An experiment was constructed to evaluate the accuracy of a
model created by GMD and its insensitivity to changes in sensor
properties and in pan and tilt angles. It was compared with a
pre-calibrated model and a model created by UMD using two sensors with
different specifications. The GMD model achieved similar accuracy with
one-seventh the number of iterations and a mean error lower by a
factor of 2.4, compared to the pre-calibrated and UMD models
respectively. The model has also shown itself to be robust and, in
comparison with the pre-calibrated and UMD models, improved the
accuracy significantly.
Abstract: A laboratory study of the influence of compactive
effort on expansive black cotton soil specimens treated with up to 8%
ordinary Portland cement (OPC) admixed with up to 8% bagasse ash
(BA) by dry weight of soil, and compacted using the energies of the
standard Proctor (SP), West African Standard (WAS) or
"intermediate", and modified Proctor (MP), was undertaken. The
expansive black cotton soil was classified as A-7-6 (16) or CL using
the American Association of State Highway and Transportation
Officials (AASHTO) and Unified Soil Classification System (USCS)
schemes, respectively. The 7-day unconfined compressive strength
(UCS) values of the natural soil for the SP, WAS and MP compactive
efforts were 286, 401 and 515 kN/m², respectively, while the peak
values of 1019, 1328 and 1420 kN/m², recorded at 8% OPC/6% BA,
8% OPC/2% BA and 6% OPC/4% BA treatments, respectively, were below
the UCS value of 1710 kN/m² conventionally used as the criterion for
adequate cement stabilization. The soaked California bearing ratio
(CBR) values of the OPC/BA-stabilized soil increased with higher
energy level, from 2, 4 and 10% for the natural soil to peak values
of 55, 18 and 8%, recorded at 8% OPC/4% BA, 8% OPC/2% BA
and 8% OPC/4% BA treatments when the SP, WAS and MP compactive
efforts were used, respectively. The durability of the specimens was
determined by immersion in water. Soil treated with the 8% OPC/4%
BA blend gave a resistance to loss in strength of 50%, which is
acceptable given the harsh test condition of a 7-day soaking period
to which the specimens were subjected, instead of the 4-day soaking
period for which a minimum resistance to loss in strength of 80% is
specified. Finally, an optimal blend of 8% OPC/4% BA is
recommended for the treatment of expansive black cotton soil for use
as a sub-base material.
Abstract: Knowledge Management (KM) criteria are an
essential foundation for evaluating KM outcomes. Different sets of
criteria have been developed and tailored by many researchers to
determine the results of KM initiatives. However, the literature
offers only incomplete sets of criteria for evaluating KM
outcomes. Hence, this paper addresses the problem of
determining the criteria for measuring knowledge management
outcomes among different types of Malaysian organizations.
Accordingly, this paper aims to develop widely accepted
criteria to measure the success of knowledge management efforts in
Malaysian organizations. Our analysis approach was based on the
ANOVA procedure, comparing a set of criteria, drawn from the
literature, among different types of organizations. It is hoped that
this study provides a better picture for different types of Malaysian
organizations seeking to establish a comprehensive set of criteria
for measuring the results of KM programs.
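The ANOVA comparison above boils down to an F statistic: the ratio of between-group to within-group variance. A pure-Python sketch is below; the three lists of "KM-outcome scores" for three organization types are made-up numbers for illustration.

```python
def anova_f(groups):
    """One-way ANOVA F statistic for a list of samples (one per group)."""
    all_x = [x for g in groups for x in g]
    grand = sum(all_x) / len(all_x)
    # Between-group sum of squares: how far group means sit from the
    # grand mean, weighted by group size.
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: scatter of observations around their
    # own group mean.
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_x) - len(groups)
    return (ssb / df_between) / (ssw / df_within)

# Hypothetical scores from three organization types; the clearly
# different group means yield a large F.
f_stat = anova_f([[5, 6, 7], [1, 2, 3], [9, 10, 11]])
```

A large F relative to the critical value for (2, 6) degrees of freedom would indicate that the organization types genuinely differ on that criterion.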
Abstract: The upgrading of low-quality crude natural gas (NG) is attracting interest due to the high demand for pipeline-grade gas in recent years. Membrane processes are a commercially proven technology for the removal of impurities such as carbon dioxide from NG. In this work, a cross-flow mathematical model is incorporated into ASPEN HYSYS as a user-defined unit operation in order to design membrane systems for CO2/CH4 separation. The effect of operating conditions (such as feed composition and pressure) and membrane selectivity on the design parameters (methane recovery and total membrane area required for the separation) has been studied for different design configurations. These configurations include single-stage (with and without recycle) and double-stage membrane systems (with and without permeate or retentate recycle). It is shown that methane recovery can be improved by recycling the permeate or retentate stream, as well as by using double-stage membrane systems. The ASPEN HYSYS user-defined unit operation proposed in this study has the potential to be applied to complex membrane system design and optimization.
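As a flavour of the underlying membrane mathematics, the sketch below solves the simpler complete-mixing stage model (not the paper's cross-flow model) for the permeate CO2 fraction, given the feed fraction, the CO2/CH4 selectivity and the permeate-to-feed pressure ratio; the numerical values are illustrative.

```python
import math

def permeate_co2_fraction(x, alpha, r):
    """Permeate CO2 mole fraction y from the complete-mixing relation
    y/(1-y) = alpha*(x - r*y) / ((1-x) - r*(1-y)),
    with feed fraction x, selectivity alpha and pressure ratio r."""
    # Cross-multiplying gives a quadratic a*y^2 + b*y + c = 0.
    a = r * (1.0 - alpha)
    b = (1.0 - x) - r + alpha * r + alpha * x
    c = -alpha * x
    roots = [(-b + s * math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
             for s in (1.0, -1.0)]
    # Keep the physically meaningful root: a mole fraction in (0, 1).
    return next(y for y in roots if 0.0 < y < 1.0)

# 10% CO2 feed, selectivity 20, permeate at one tenth of feed pressure.
y = permeate_co2_fraction(x=0.10, alpha=20.0, r=0.10)
```

Even this simplest model reproduces the qualitative behaviour the paper exploits: the permeate is strongly enriched in CO2, so recycling streams shifts recovery.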
Abstract: The provision of urban public transport in Indonesia is not free of problems. Some of the problems include an overall lack of capacity, lack of quality and choice, severe traffic congestion, and insufficient funds to renew and repair vehicles. Generally, the comfort and quality of the city buses are poor, and many of the vehicles are dilapidated and dirty. Surveys were carried out in the city of Yogyakarta by counting city bus vehicles and occupancies and by interviewing bus passengers, drivers and the institutional staff involved in public transport management. This paper then analyzes possible plans to make the public transport system more attractive and to improve public transport management. Short-, medium- and long-term plans are analyzed to find the best solutions. Constraints such as social and financial impacts are also taken into account.
Abstract: A low-bit-rate still image compression scheme is
proposed that compresses the indices of Vector Quantization (VQ) and
generates a residual codebook. The VQ indices are compressed by
exploiting the correlation among image blocks, which reduces the bits
per index. A residual codebook, similar to the VQ codebook, is
generated to represent the distortion produced by VQ. Using this
residual codebook, the distortion in the reconstructed image is
removed, thereby increasing image quality. Our scheme combines these
two methods. Experimental results on the standard image Lena show
that our scheme yields a reconstructed image with a PSNR of 31.6 dB
at 0.396 bits per pixel. Our scheme is also faster than existing VQ
variants.
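The idea of a residual codebook can be shown on a toy example: quantize vectors against a codebook, then store a per-index correction equal to the mean residual. The 2-D vectors and the two-word codebook below are invented; the paper works on image blocks with full-size codebooks.

```python
def nearest(codebook, v):
    """Index of the codeword closest to v (squared distance)."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(codebook[i], v)))

def mse(data, recon):
    """Mean squared error between two lists of vectors."""
    return sum(sum((a - b) ** 2 for a, b in zip(x, y))
               for x, y in zip(data, recon)) / len(data)

codebook = [(0.0, 0.0), (10.0, 10.0)]
data = [(1.0, 1.0), (2.0, 2.0), (11.0, 11.0), (12.0, 12.0)]

idx = [nearest(codebook, v) for v in data]
plain = [codebook[i] for i in idx]              # plain VQ decoding

# Residual codebook: one mean residual vector per VQ index.
residual_cb = []
for i in range(len(codebook)):
    res = [tuple(a - b for a, b in zip(v, codebook[i]))
           for v, j in zip(data, idx) if j == i]
    residual_cb.append(tuple(sum(c) / len(c) for c in zip(*res)))

# Corrected decoding: codeword plus its stored mean residual.
corrected = [tuple(a + b for a, b in zip(codebook[i], residual_cb[i]))
             for i in idx]

mse_plain = mse(data, plain)
mse_corrected = mse(data, corrected)
```

Adding the mean residual per cluster can never increase the in-cluster error, which is why the residual codebook recovers quality at a small side-information cost.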
Abstract: When the results of the total element concentrations obtained using USEPA method 3051A are compared to the sequential extraction analyses (i.e. the sum of fractions BCR1, BCR2 and BCR3), the recovery values of the elements vary between 56.8% and 69.4% in the bottom ash, and between 11.3% and 70.9% in the fly ash. This indicates that most of the elements in the ashes do not occur in readily soluble forms.
Abstract: In this paper the problem of face recognition under variable illumination conditions is considered. Most works in the literature exhibit good performance under strictly controlled acquisition conditions, but performance drops drastically when changes in pose and illumination occur; consequently, a number of approaches have recently been proposed to deal with such variability. The aim of this work is to introduce an efficient local appearance feature extraction method based on the steerable pyramid (SP) for face recognition. Local information is extracted from SP sub-bands using the Local Binary Pattern (LBP). The underlying statistics allow us to reduce the amount of data to be stored. Experiments carried out on different face databases confirm the effectiveness of the proposed approach.
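The LBP descriptor mentioned above thresholds each pixel's neighbours against the pixel itself and packs the results into a byte. A minimal pure-Python sketch on a 3x3 patch follows; the neighbour ordering/weighting is one common convention, and in the paper the operator is applied to SP sub-band coefficients rather than raw pixels.

```python
# Clockwise neighbour offsets, starting at the top-left pixel.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
           (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_code(patch):
    """patch: 3x3 list of lists of grey levels; returns an int in 0..255.
    Each neighbour >= centre contributes one bit to the code."""
    c = patch[1][1]
    code = 0
    for bit, (dy, dx) in enumerate(OFFSETS):
        if patch[1 + dy][1 + dx] >= c:
            code |= 1 << bit
    return code

flat = [[7, 7, 7], [7, 7, 7], [7, 7, 7]]      # uniform region
spot = [[1, 1, 1], [1, 9, 1], [1, 1, 1]]      # bright centre pixel
code_flat, code_spot = lbp_code(flat), lbp_code(spot)
```

Because the code depends only on sign comparisons, it is invariant to monotonic grey-level changes, which is what makes LBP attractive under varying illumination.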
Abstract: To overcome the product overload faced by Internet
shoppers, we introduce a semantic recommendation procedure which
is more efficient when applied to Internet shopping malls. The
suggested procedure recommends semantically related products to
customers and is based on Web usage mining, product
classification, association rule mining, and frequent purchase
patterns. We applied the procedure to the MovieLens data set for
performance evaluation, and some experimental results are provided.
The experimental results show superior performance in
terms of coverage and precision.
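Association rule mining, one ingredient of the procedure above, rests on support and confidence computed over transactions. A pure-Python sketch follows; the grocery-style transactions and the 0.6 confidence threshold are illustrative stand-ins for the purchase data.

```python
TRANSACTIONS = [
    {"milk", "bread"},
    {"milk", "bread", "cheese"},
    {"bread", "cheese"},
    {"milk", "cheese"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    hits = sum(1 for t in TRANSACTIONS if itemset <= t)
    return hits / len(TRANSACTIONS)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    return support(antecedent | consequent) / support(antecedent)

# Evaluate the candidate rule {milk} -> {bread}.
conf_milk_bread = confidence({"milk"}, {"bread"})
rule_accepted = conf_milk_bread >= 0.6
```

Rules clearing the support and confidence thresholds become recommendation candidates; the semantic layer in the paper then filters them through the product classification.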
Abstract: Implicit equations play a crucial role in engineering.
Given this importance, several techniques have been applied to
solve this particular class of equations. In practical
applications, iterative procedures are generally used.
On the other hand, with the improvement of computers, other
numerical methods have been developed to provide a more
straightforward methodology of solution. Exact analytical approaches
seem to have been continuously neglected due to the difficulty
inherent in their application; notwithstanding, they are indispensable
for validating numerical routines. Lagrange's Inversion Theorem is a
simple mathematical tool which has proved to be widely applicable to
engineering problems. In short, it provides the solution to implicit
equations by means of an infinite series. To show the validity of this
method, the three-parameter infiltration equation is, for the first
time, analytically and exactly solved. After manipulating these
series, closed-form solutions are presented as H-functions.
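A classic illustration of Lagrange's Inversion Theorem (not the paper's infiltration equation) is the implicit equation w*exp(w) = z, whose inversion yields the series w(z) = Σ_{n≥1} (-n)^(n-1)/n! · z^n, i.e. the Lambert W function. The sketch below checks that a partial sum of this series does solve the implicit equation.

```python
import math

def lambert_w_series(z, n_terms=20):
    """Partial sum of the Lagrange-inversion series for W(z),
    valid for |z| < 1/e."""
    return sum((-n) ** (n - 1) / math.factorial(n) * z ** n
               for n in range(1, n_terms + 1))

z = 0.1
w = lambert_w_series(z)
# Residual of the original implicit equation w * e^w = z.
residual = abs(w * math.exp(w) - z)
```

This is exactly the role the abstract assigns to analytical solutions: the truncated series gives a machine-precision reference against which iterative root-finders can be validated.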
Abstract: The aim of this paper is to rank the impact of
Object-Oriented (OO) metrics in fault prediction modeling using
Artificial Neural Networks (ANNs). Past studies on the empirical
validation of object-oriented metrics as fault predictors using ANNs
have focused on the predictive quality of neural networks versus
standard statistical techniques. In this empirical study we turn our
attention to the capability of ANNs to rank the impact of these
explanatory metrics on fault proneness. In the ANN data analysis
approach, there is no clear method of ranking the impact of
individual metrics. Five ANN-based techniques that rank
object-oriented metrics in predicting the fault proneness of classes
are studied: i) the overall connection weights method; ii) Garson's
method; iii) the partial derivatives method; iv) the input
perturbation method; v) the classical stepwise method. We develop and
evaluate different prediction models based on the rankings of the
metrics by the individual techniques. The models based on the overall
connection weights and partial derivatives methods have been found to
be the most accurate.
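Garson's method, one of the five techniques listed, attributes importance to each input by distributing the absolute connection weights through the hidden layer. A pure-Python sketch for a one-hidden-layer network follows; the 2-input/2-hidden weight values are made-up, standing in for a trained fault-proneness network whose inputs would be OO metrics.

```python
def garson_importance(w_ih, w_ho):
    """Garson importances. w_ih[i][h]: input->hidden weights;
    w_ho[h]: hidden->output weights. Returns one share per input,
    summing to 1."""
    n_in, n_hid = len(w_ih), len(w_ho)
    # Absolute contribution of input i through hidden unit h.
    contrib = [[abs(w_ih[i][h]) * abs(w_ho[h]) for h in range(n_hid)]
               for i in range(n_in)]
    # Share of each input within every hidden unit, summed over units.
    share = [sum(contrib[i][h] / sum(contrib[k][h] for k in range(n_in))
                 for h in range(n_hid))
             for i in range(n_in)]
    total = sum(share)
    return [s / total for s in share]

#            hidden0  hidden1
W_IH = [[1.0, 2.0],          # input 0 (e.g. a coupling metric)
        [0.1, 0.2]]          # input 1 (e.g. a size metric)
W_HO = [1.0, 1.0]
imp = garson_importance(W_IH, W_HO)
```

Here input 0, with the larger weights, receives almost all of the importance, which is the kind of ranking the paper feeds into its reduced prediction models.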
Abstract: We demonstrate a nonfaradaic electrochemical impedance spectroscopy measurement of biochemically modified gold-plated electrodes using a two-electrode system. The absence of any redox indicator in the impedance measurements provides a more precise and accurate characterization of the measured bioanalyte at molecular resolution. An equivalent electrical circuit of the electrode–electrolyte interface was deduced from the observed impedance data of saline solution at low and high concentrations. The detection of biomolecular interactions was fundamentally correlated to the variation of the electrical double layer at the modified interface. The investigations were done using 20-mer deoxyribonucleic acid (DNA) strands without any label. Surface modification was performed by creating a mixed monolayer of the thiol-modified single-stranded DNA and a spacer thiol (mercaptohexanol) by a two-step self-assembly method. The results clearly distinguish between noncomplementary and complementary hybridization of DNA in the low-frequency region, below several hundred hertz.
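The simplest nonfaradaic equivalent circuit consistent with the description above is a solution resistance in series with the double-layer capacitance. The sketch below evaluates its complex impedance over frequency; the component values are illustrative, not fitted to the paper's data.

```python
import cmath
import math

R_S = 100.0       # solution resistance [ohm], illustrative value
C_DL = 1e-6       # double-layer capacitance [F], illustrative value

def impedance(f):
    """Complex impedance of R_s in series with C_dl at frequency f [Hz]."""
    omega = 2.0 * math.pi * f
    return R_S + 1.0 / (1j * omega * C_DL)

z_low, z_high = impedance(1.0), impedance(1e6)
# At low frequency the capacitor dominates |Z|; at high frequency the
# impedance collapses onto the solution resistance.
```

This frequency behaviour is why hybridization, which perturbs the double-layer capacitance, shows up most clearly in the low-frequency region the abstract highlights.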
Abstract: This paper studies the impact of OO design on software
quality characteristics, such as defect density and rework, by means
of experimental validation. Encapsulation, inheritance, polymorphism,
reusability, data hiding and message passing are the major attributes
of an object-oriented system. In order to evaluate the quality of an
object-oriented system, these attributes can act as indicators.
Metrics are the well-known quantifiable approach to express any
attribute. Hence, in this paper we formulate a framework of
metrics representing the attributes of an object-oriented system.
Empirical data are collected from three different projects based on
the object-oriented paradigm to calculate the metrics.
Abstract: Computer-aided design relies on parametric
software in the design of machine components as well as
of other parts of interest. The complexity of the element under
study sometimes poses difficulties for computer design, or may even
generate mistakes in the final body conception. Reverse
engineering techniques are based on the transformation of images of
an already conceived body into a matrix of points which can be
visualized by the design software. The literature presents several
techniques to obtain the dimensional fields of machine components,
such as contact instruments (MMC), calipers and optical methods such
as laser scanning, holography and moiré methods. The objective of
this research was to analyze the moiré technique as an instrument of
reverse engineering, applied to bodies of non-complex geometry such
as simple solid figures, creating matrices of points. These matrices
were forwarded to the parametric software SolidWorks to generate
the virtual object. The volume obtained by mechanical means, i.e.,
by caliper, the volume obtained through the moiré method and the
volume generated by the SolidWorks software were compared and
found to be in close agreement. This work suggests the
application of phase-shifting moiré methods as an instrument of
reverse engineering, also supporting the design of farm machinery
elements.
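The phase-shifting step at the heart of such moiré methods can be shown in a few lines: four intensity images with phase shifts of 0, π/2, π and 3π/2 allow the surface phase to be recovered pointwise. The sketch below synthesizes the four intensities for a known phase and recovers it; the bias/modulation values are arbitrary illustrative numbers.

```python
import math

def recover_phase(i1, i2, i3, i4):
    """Phase from four pi/2-shifted intensity samples at one point:
    i4 - i2 = 2B*sin(phi), i1 - i3 = 2B*cos(phi)."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthesize the four shifted intensities for a known phase.
A, B, phi_true = 5.0, 2.0, 0.7          # bias, modulation, phase [rad]
samples = [A + B * math.cos(phi_true + k * math.pi / 2) for k in range(4)]
phi_rec = recover_phase(*samples)
```

Applied at every pixel, the recovered (and then unwrapped) phase map is proportional to surface height, which is what gets converted into the point matrix sent to the CAD software.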
Abstract: This paper reports the results of an experimental and
numerical study of nonstationary swirling flow in an isothermal
model of a vortex burner. It was identified that the main source of
the instability is related to the precessing vortex core (PVC)
phenomenon. The PVC-induced flow pulsation characteristics, such as
the precession frequency and its variation as a function of flowrate
and swirl number, were explored using acoustic probes. Additionally,
pressure transducers were used to measure the pressure drops across
the working chamber and across the vortex flow. The experiments also
included mean velocity measurements using laser-Doppler anemometry.
The features of the instantaneous flowfield generated by the PVC were
analyzed employing a commercial CFD code (Star-CCM+) based on the
Detached Eddy Simulation (DES) approach. The validity of the
numerical code was checked by comparing the calculated flowfield data
with the experimental results. In particular, it was confirmed that
the CFD code correctly reproduces the flow features.
Abstract: The travelling salesman problem (TSP) is a combinatorial
optimization problem whose solution approaches have been applied to
many real-world problems. The pure TSP assumes that the cities to
visit are fixed in time, and solutions are thus created to find the
shortest path through these points. In practice, however, some of the
points may be cancelled over time. If the problem is not
time-critical, it is not important to determine a new routing plan
quickly; but if the points change rapidly and a new route plan must
be decided in time, a new approach is needed in such cases. We
developed a route plan transfer method based on transfer learning,
and it achieved high performance compared with
determining a new model from scratch at every change.
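The benefit of reusing an existing plan rather than re-solving can be sketched simply: build a greedy route once, then, when a point is cancelled, repair the route by deleting that point and reconnecting its neighbours. The coordinates, the nearest-neighbour construction and the "delete and reconnect" repair below are illustrative of the idea, not the paper's transfer-learning model.

```python
import math

CITIES = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 6), "E": (2, 2)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def route_length(route):
    return sum(dist(CITIES[a], CITIES[b]) for a, b in zip(route, route[1:]))

def nearest_neighbour(start="A"):
    """Greedy tour construction over all cities, starting at `start`."""
    route, left = [start], set(CITIES) - {start}
    while left:
        nxt = min(left, key=lambda c: dist(CITIES[route[-1]], CITIES[c]))
        route.append(nxt)
        left.remove(nxt)
    return route

route = nearest_neighbour()
# A point is cancelled: repair the old plan by deleting it, keeping the
# visiting order of the remaining points, instead of replanning.
cancelled = "C"
repaired = [c for c in route if c != cancelled]
len_before, len_after = route_length(route), route_length(repaired)
```

For Euclidean distances the repaired route is never longer than the original (triangle inequality), and the repair costs O(n) versus a full re-solve, which mirrors the abstract's argument for transferring a plan instead of starting from scratch.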