Abstract: Hand vein recognition has attracted increasing attention in biometric identification systems. Hand vein images are generally acquired with low contrast and irregular illumination. Consequently, with good preprocessing of the hand vein image, features can be extracted easily, even with simple binarization. In this paper, an approach is proposed to improve the quality of hand vein images. First, a brief survey of existing enhancement methods is presented. Then a Radon-Like Features method is applied to preprocess the hand vein image. Finally, experimental results show that the proposed method is effective and reliable in improving hand vein images.
Abstract: This research investigates the suitability of fuel oil for
improving gypseous soil. Detailed laboratory tests were carried out
on two soils (soil I with 51.6% gypsum content, and soil II with
26.55%), both obtained from the Al-Therthar site
(Al-Anbar Province, Iraq).
The study examines the improvement of the properties of this
locally available gypseous soil, aiming to minimize the effect of
moisture on such soils at low cost by using fuel oil. The first
soil is sandy with a gypsum content of 51.6%, and the second is
clayey with a gypsum content of 26.55%.
The test program included measurements of the permeability,
compressibility, and collapse properties of the soils. The shear
strength of the soils and the weight loss of fuel oil due to
drying were also determined. These tests were conducted on
treated and untreated soils to observe the effect of treatment on
the engineering properties when the soil is mixed with varying
amounts of fuel oil equivalent to the water content.
The results showed that fuel oil is a good material for modifying
the collapsibility and permeability of the gypseous soil, which
are the main problems of this soil, while retaining an appropriate
amount of cohesion suitable for carrying structural loads.
Abstract: Excilamps are new UV sources with great potential
for application in wastewater treatment. In the present work, a XeBr
excilamp emitting radiation at 283 nm has been used for the
photodegradation of 4-chlorophenol within a range of concentrations
from 50 to 500 mg L-1. Total removal of 4-chlorophenol was
achieved for all concentrations assayed. The two main photoproduct
intermediates formed during the photodegradation process,
benzoquinone and hydroquinone, although not completely
removed, remain at very low residual concentrations. These
concentrations are insignificant compared to the initial
4-chlorophenol concentrations and are non-toxic. In order to simulate the process and its scale-up,
a kinetic model has been developed and validated from the
experimental data.
Abstract: In the present work, we propose a new projection method for solving the matrix equation AXB = F. To implement the new method, generalized forms of the block Krylov subspace and the global Arnoldi process are presented. The new method can be considered an extended form of the well-known global generalized minimum residual (Gl-GMRES) method for solving multiple linear systems, and it will be called the extended Gl-GMRES (EGl-GMRES) method. Some new theoretical results are established for the proposed method by employing the Schur complement. Finally, some numerical results are given to illustrate the efficiency of the new method.
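For orientation, the equation AXB = F can always be recast as an ordinary linear system through the Kronecker "vec" identity. The dense sketch below is not the paper's EGl-GMRES (which projects onto a Krylov subspace without ever forming the Kronecker matrix); it only illustrates the problem being solved:

```python
import numpy as np

def solve_axb(A, B, F):
    """Solve A X B = F densely via the Kronecker 'vec' identity.

    With row-major flattening, (A kron B^T) vec(X) = vec(F).
    Forming the Kronecker matrix needs (m*n)^2 storage, which is
    exactly what Krylov projection methods avoid.
    """
    m, n = F.shape
    K = np.kron(A, B.T)                  # (m*n) x (m*n) system matrix
    x = np.linalg.solve(K, F.ravel())
    return x.reshape(m, n)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
X_true = rng.standard_normal((4, 3))
F = A @ X_true @ B
X = solve_axb(A, B, F)                   # recovers X_true
```

For large A and B the Kronecker system becomes intractable, which motivates projection methods such as the Gl-GMRES family.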
Abstract: Virtual Assembly (VA) is one of the key technologies
in advanced manufacturing field. It is a promising application of
virtual reality in design and manufacturing field. It has drawn much
interest from industries and research institutes in the last two decades.
This paper describes a process for integrating an interactive Virtual
Reality-based assembly simulation of a digital mockup with the
CAD/CAM infrastructure. The necessary hardware and software
preconditions for the process are explained so that it can easily be
adopted by non-VR experts. The article outlines how assembly
simulation can improve the CAD/CAM procedures and structures;
how CAD model preparations have to be carried out and which
virtual environment requirements have to be fulfilled. The issue of
data transfer is also explained in the paper. Other challenges and
requirements, such as anti-aliasing and collision detection, are also
explained. Finally, a VA simulation has been carried out for a ball
valve assembly and a car door assembly with the help of Vizard
virtual reality toolkit in a semi-immersive environment and their
performance analysis has been done on different workstations to
evaluate the importance of graphical processing unit (GPU) in the
field of VA.
Abstract: We study the typical domain size and configuration
character of a randomly perturbed system exhibiting continuous
symmetry breaking. As a model system we use rod-like objects
within a cubic lattice interacting via a Lebwohl–Lasher-type
interaction. We describe their local direction with a headless unit
director field. Examples of such systems include nematic LCs and
nanotubes. We further introduce impurities of concentration p, which
impose the random anisotropy field-type disorder to directors. We
study the domain-type pattern of molecules as a function of p,
the anchoring strength w between a neighboring director and an
impurity, the temperature, and the sample history. In the simulations we quenched the
directors either from the random or homogeneous initial
configuration. Our results show that the history of the system strongly
influences: i) the average domain coherence length; and ii) the range
of ordering in the system. In the random case the obtained order is
always short ranged (SR). On the contrary, in the homogeneous case,
SR is obtained only for strong enough anchoring and large enough
concentration p. In other cases, the ordering is either of quasi long
range (QLR) or of long range (LR). We further studied memory
effects for the random initial configuration. With an increasing external
ordering field B, either QLR or LR ordering is realized.
Abstract: The flow field and the motion of the free surface in an
oscillating container are simulated numerically to assess the numerical
approach for studying two-phase flows under oscillating conditions.
Two numerical methods are compared: one is to model the oscillating
container directly using the moving grid of the ALE method, and the
other is to simulate the effect of container motion using the oscillating
body force acting on the fluid in the stationary container. The
two-phase flow field in the container is simulated using the level set
method in both cases. It is found that the results calculated by the body
force method coincide with those of the moving grid method, and the
sloshing behavior is predicted well by both methods. The theoretical
background and limitations of the body force method are discussed,
and the effects of oscillation amplitude and frequency are shown.
Abstract: The policies governing the business of any
organization are well reflected in its business rules. Business
rules are implemented by data validation techniques, coded during
the software development process. Any change in business
policies results in a change in the code written for data validation
used to enforce those policies. Implementing a change in
business rules without changing the code is the objective of this
paper. The proposed approach enables users to create rule sets at
run time once the software has been developed. The newly defined
rule sets by end users are associated with the data variables for
which validation is required. The proposed approach allows
end users to define business rules using all the comparison
and Boolean operators. Multithreading is used to
validate the data entered by the end user against the business rules
applied. The evaluation of the data is performed by a newly
created thread using an enhanced form of the RPN (Reverse Polish
Notation) algorithm.
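As a rough illustration of the evaluation step, the sketch below converts an infix rule to RPN with the standard shunting-yard algorithm and then evaluates it against user-entered data. The token set, precedences, and example rule are illustrative assumptions; the paper's multithreaded dispatch and its specific RPN enhancements are not reproduced:

```python
# Illustrative operator set; the paper's enhanced RPN algorithm and
# threading are omitted in this sketch.
PREC = {'or': 1, 'and': 2,
        '<': 3, '<=': 3, '>': 3, '>=': 3, '==': 3, '!=': 3}
OPS = {'<':  lambda a, b: a < b,   '<=': lambda a, b: a <= b,
       '>':  lambda a, b: a > b,   '>=': lambda a, b: a >= b,
       '==': lambda a, b: a == b,  '!=': lambda a, b: a != b,
       'and': lambda a, b: bool(a and b),
       'or':  lambda a, b: bool(a or b)}

def to_rpn(tokens):
    """Shunting-yard: infix token list -> RPN token list."""
    out, stack = [], []
    for t in tokens:
        if t in PREC:
            while stack and stack[-1] in PREC and PREC[stack[-1]] >= PREC[t]:
                out.append(stack.pop())
            stack.append(t)
        elif t == '(':
            stack.append(t)
        elif t == ')':
            while stack[-1] != '(':
                out.append(stack.pop())
            stack.pop()                  # discard the '('
        else:
            out.append(t)                # operand
    return out + stack[::-1]

def eval_rpn(rpn, env):
    """Evaluate an RPN rule; names resolve to user-entered data in env."""
    stack = []
    for t in rpn:
        if t in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[t](a, b))
        else:
            stack.append(env[t] if t in env else float(t))
    return stack[0]

rule = ['(', 'age', '>=', '18', ')', 'and', '(', 'salary', '>', '3000', ')']
```

Associating such a rule with a data variable then amounts to running `eval_rpn(to_rpn(rule), entered_data)` whenever that variable changes.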
Abstract: In this paper we introduce a novel kernel classifier
based on an iterative shrinkage algorithm developed for compressive
sensing. We have adopted Bregman iteration with soft and hard
shrinkage functions and a generalized hinge loss to solve the l1-norm
minimization problem for classification. Our experimental results
on face recognition and digit classification, using SVM as the
benchmark, show that our method achieves an error rate close to
that of SVM but does not outperform it. We have
found that the soft shrinkage method gives higher accuracy and, in some
situations, more sparseness than the hard shrinkage method.
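For reference, the soft and hard shrinkage (thresholding) operators compared in the abstract have standard scalar forms, sketched below; the surrounding Bregman iteration and the generalized hinge loss are not reproduced:

```python
def soft_shrink(x, t):
    """Soft thresholding: zero out |x| <= t and shrink the rest by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def hard_shrink(x, t):
    """Hard thresholding: zero out |x| <= t, keep the rest unchanged."""
    return x if abs(x) > t else 0.0
```

Soft shrinkage both zeroes small coefficients and biases the surviving ones toward zero, while hard shrinkage only zeroes; this difference underlies the accuracy/sparseness trade-off reported above.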
Abstract: This paper presents the result of three senior capstone
projects at the Department of Computer Engineering, Prince of
Songkla University, Thailand. These projects focus on developing an
examination management system for the Faculty of Engineering in
order to manage both the examination room assignments and the
proctor assignments in each room.
The current version of the software is a web-based application. The
developed software allows the examination proctors to select their
scheduled time online while each subject is assigned to each available
examination room according to its type and the room capacity. The
developed system is evaluated using real data by prospective users of
the system. Several suggestions for further improvements are given
by the testers. Even though the features of the developed software are
not superior, the development process can serve as a case study for a
project-based teaching style. Furthermore, the process of developing this
software can show several issues in developing an educational
support application.
Abstract: This paper presents an optimal design of linear phase
digital high pass finite impulse response (FIR) filter using Improved
Particle Swarm Optimization (IPSO). In the design process, the filter
length, pass band and stop band frequencies, feasible pass band and
stop band ripple sizes are specified. FIR filter design is a multi-modal
optimization problem. An iterative method is introduced to find the
optimal solution of the FIR filter design problem. Evolutionary
algorithms such as the real-coded genetic algorithm (RGA), particle swarm
optimization (PSO), and improved particle swarm optimization (IPSO)
have been used in this work for the design of linear phase high pass
FIR filter. IPSO is an improved PSO that proposes a new definition
for the velocity vector and swarm updating and hence the solution
quality is improved. A comparison of simulation results reveals the
optimization efficacy of the algorithm over the prevailing
optimization techniques for the solution of this multimodal,
non-differentiable, highly non-linear, and constrained FIR filter
design problem.
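A generic global-best PSO loop of the kind these variants build on is sketched below on a stand-in objective; the paper's actual objective (the weighted pass/stop-band ripple error of the FIR frequency response) and IPSO's modified velocity and swarm updates are not reproduced here:

```python
import random

def pso(objective, dim, n_particles=30, iters=200, seed=1):
    """Plain global-best PSO; IPSO's redefined velocity vector and
    swarm update are not modeled in this sketch."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration weights
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = objective(pos[i])
            if v < pbest_val[i]:         # update personal and global bests
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy stand-in objective; in the paper the objective would be the
# filter's frequency-response error over pass and stop bands.
def sphere(x):
    return sum(xi * xi for xi in x)

best, best_val = pso(sphere, dim=5)
```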
Abstract: An inadequate curriculum for software engineering is considered one of the most common software risks. A number of solutions for improving Software Engineering Education (SEE) have been reported in the literature, but there is a need to present these solutions collectively in one place. We have performed a mapping study to present a broad view of the literature published on improving the current state of SEE. Our aim is to give academicians, practitioners, and researchers an international view of the current state of SEE. Our study identified 70 primary studies that met our selection criteria, which we further classified and categorized within a well-defined software engineering educational framework. We found that the most researched category within this framework is Innovative Teaching Methods, whereas the least research was found in the Student Learning and Assessment category. Our future work is to conduct a systematic literature review on SEE.
Abstract: We study how the outcome of evolutionary dynamics on
graphs depends on the randomness of the graph structure. We gradually
change the underlying graph from completely regular (e.g. a square lattice) to completely random. We find that the fixation probability increases as the randomness increases; nevertheless, the increase is
not significant, and thus the fixation probability can be estimated by the known formulas for the underlying regular graphs.
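The "known formula" for regular graphs is, by the isothermal theorem, the fixation probability of the well-mixed Moran process; a minimal implementation:

```python
def moran_fixation_probability(r, n):
    """Fixation probability of one mutant of relative fitness r in a
    well-mixed Moran process with n individuals.  By the isothermal
    theorem the same formula holds on regular graphs such as lattices."""
    if r == 1.0:
        return 1.0 / n                   # neutral drift limit
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** n)
```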
Abstract: In this paper, we present a novel objective no-reference performance assessment algorithm for image fusion. It takes into account local measurements to estimate how well the important information in the source images is represented by the fused image. The metric is based on the Universal Image Quality Index and uses the similarity between blocks of pixels in the input images and the fused image as the weighting factor for the metric. Experimental results confirm that the values of the proposed metric correlate well with the subjective quality of the fused images, giving a significant improvement over standard measures based on mean squared error and mutual information.
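The Universal Image Quality Index (Wang and Bovik) underlying such metrics combines correlation, luminance, and contrast comparison into a single value per block; a per-block sketch (the blockwise weighting scheme of the full fusion metric is not reproduced):

```python
import numpy as np

def uiqi(x, y):
    """Universal Image Quality Index (Wang & Bovik) for two equal-size
    blocks; equals 1.0 exactly when the blocks are identical."""
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()   # cross-covariance
    # single expression combining correlation, luminance and contrast
    return 4.0 * cxy * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

rng = np.random.default_rng(0)
block = rng.uniform(10.0, 200.0, size=(8, 8))
```

Any distortion (here, a brightness shift) drives the index below 1, which is what makes it usable as a local similarity weight.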
Abstract: In this paper we present discretization and decomposition methods for a multi-component transport model of a chemical vapor deposition (CVD) process. CVD processes are used to manufacture deposition layers or bulk materials. In our transport model we simulate the deposition of thin layers. The microscopic model is based on heavy particles, whose behavior is derived by approximately solving a linearized multi-component Boltzmann equation. For the drift process of the particles we propose diffusion-reaction equations, as well as for the effects of heat conduction. We concentrate on solving the diffusion-reaction equation with analytical and numerical methods. For the chemical processes, modelled with reaction equations, we propose decomposition methods and decouple the multi-component models into simpler systems of differential equations. In the numerical experiments we present the computational results of our proposed models.
Abstract: The purpose of this study was to examine viewpoints in terms of changing distances and levels and thereby comparatively analyze the visual sensitivity to the elements of natural views. A questionnaire survey was conducted separately for experts and non-experts. In summary, it was confirmed that the visual sensitivity to the elements of the same natural views differed significantly depending on the subjects' professionalism and on changes of the viewpoint levels and distances, while the visual sensitivity to 'openness of visual/view axes' did not differ significantly when only the distances of the viewpoints were varied. In addition, the visual sensitivity to visual/view axes differed between experts and non-experts when the levels of the viewpoints were varied, while the visual sensitivity to 'damaged natural view resources' differed between the two groups when the distances of the viewpoints were varied.
Abstract: We have developed a computer program consisting of
6 subtests assessing children's hand dexterity, applicable in
rehabilitation medicine. We have carried out a normative study on a
representative sample of 285 children aged from 7 to 15 (mean age
11.3) and we have proposed clinical standards for three age groups
(7-9, 9-11, 12-15 years). We have shown the statistical significance of
differences among the corresponding mean values of the task
completion time. We have also found a strong correlation between the
task completion time and the age of the subjects. In addition, we have
performed test-retest reliability checks on a sample of 84
children, giving high values of the Pearson coefficients for the
dominant and non-dominant hand in the ranges 0.74-0.97 and
0.62-0.93, respectively.
A new MATLAB-based programming tool aimed at the analysis of
cardiologic RR intervals and blood pressure descriptors has also been
developed. For each set of data, ten different parameters are extracted: 2
in the time domain, 4 in the frequency domain, and 4 in Poincaré plot
analysis. In addition, twelve different parameters of baroreflex
sensitivity are calculated. All these data sets can be visualized in the
time domain together with their power spectra and Poincaré plots. If
available, the respiratory oscillation curves can also be plotted for
comparison. Another application processes biological data obtained
from BLAST analysis.
Abstract: Key performance indicators (KPIs) are used for
post-result evaluation in the construction industry, and they normally do
not have provisions for change. This paper proposes a set of
dynamic key performance indicators (d-KPIs) which predict the
future performance of the activity being measured and present the
opportunity to change practice accordingly. Critical to the
predictability of a construction project is the ability to achieve
automated data collection. This paper proposes an effective way to
collect the process and engineering management data from an
integrated construction management system. The d-KPI matrix
developed from this study, consisting of various indicators under
seven categories, can be applied to the close monitoring of
development projects for aged-care facilities. The d-KPI matrix also
enables performance measurement and comparison at both project
and organization levels.
Abstract: In this paper, an improved PDLZW implementation
with a new dictionary-updating technique is proposed. A
unique dictionary is partitioned into hierarchical variable word-width
dictionaries. This allows us to search through dictionaries in parallel.
Moreover, the barrel shifter is adopted for loading a new input string
into the shift register in order to achieve a faster speed. However,
the original PDLZW uses a simple FIFO update strategy, which is
not efficient. Therefore, a new window based updating technique
is implemented to better classify the difference in how often each
particular address in the window is referred. The freezing policy
is applied to the address most often referred, which would not be
updated until all the other addresses in the window have the same
priority. This guarantees that the more often referred addresses would
not be updated until their time comes. This updating policy leads
to an improvement on the compression efficiency of the proposed
algorithm while still keep the architecture low complexity and easy
to implement.
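For grounding, PDLZW parallelizes the classic LZW scheme. A minimal software LZW compressor is sketched below; the partitioned variable word-width dictionaries, the window bookkeeping, and the freezing policy of the proposed hardware design are not modeled:

```python
def lzw_compress(data):
    """Textbook LZW: emit a code for the longest dictionary match and
    add that match extended by one byte as a new dictionary entry."""
    dictionary = {bytes([i]): i for i in range(256)}  # single-byte seeds
    next_code = 256
    w = b''
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                        # keep extending the match
        else:
            out.append(dictionary[w])     # emit code for longest match
            dictionary[wc] = next_code    # learn the new string
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out
```

The update step (`dictionary[wc] = next_code`) is the point where replacement policies differ: plain FIFO recycles entries blindly, whereas the window-based freezing policy above protects the most frequently referenced addresses.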
Abstract: This paper reports the influence of sucrose on the
preservation of CO2 hydrate crystal samples. The particle diameters of
the hydrate samples were 1.0 and 5.6-8.0 mm. The mass fraction of
sucrose in each sample was 0.16. The samples were stored under aerated
conditions at atmospheric pressure and at a temperature of 253 or 258 K.
The results indicated that the mass fractions of CO2 hydrate in the
samples with sucrose were 0.10 ± 0.03 at the end of the 3-week
preservation, regardless of temperature and particle diameter. The mass
fraction of CO2 hydrate in the samples with sucrose was higher than
that of pure CO2 hydrate for the 1.0 mm particle diameter, while it was
lower than that of pure CO2 hydrate for the 5.6-8.0 mm particle diameter.
The influence of sucrose on the dissociation of CO2 hydrate and the
resulting ice formation is discussed.