Abstract: Large-scale systems such as computational Grids are distributed computing infrastructures that provide globally available network resources. Information processing in Data Grids is characterized by a strong decentralization of data across several sites, with the objective of ensuring the availability and reliability of the data in order to provide fault tolerance and scalability, which is only possible through replication techniques. Unfortunately, these techniques come at a high cost, because consistency must be maintained between the distributed replicas. Nevertheless, agreeing to live with certain imperfections can improve system performance by increasing concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, designed for data consistency maintenance in large-scale systems. Our approach is based on a hierarchical representation model with three layers and serves a double purpose: it reduces response times compared to a fully pessimistic approach, and it improves quality of service compared to a fully optimistic approach.
Abstract: In this research, CaO-ZnO catalysts (with various Ca:Zn atomic ratios of 1:5, 1:3, 1:1, and 3:1) prepared by incipient-wetness impregnation (IWI) and co-precipitation (CP) methods were used as catalysts in the transesterification of palm oil with methanol for biodiesel production. The catalysts were characterized by several techniques, including the BET method, CO2-TPD, and the Hammett indicator. The effects of precursor concentration and calcination temperature on the catalytic performance were studied under reaction conditions of a 15:1 methanol-to-oil molar ratio, 6 wt% catalyst, a reaction temperature of 60°C, and a reaction time of 8 h. The catalyst with a Ca:Zn atomic ratio of 1:3 gave the highest FAME value, owing to the basic properties and surface area of the prepared catalyst.
Abstract: This paper reports the results of an experimental work conducted to investigate the effect of curing conditions on the compressive strength of self-compacting geopolymer concrete prepared using fly ash as the base material and a combination of sodium hydroxide and sodium silicate as the alkaline activator. The experiments were conducted by varying the curing time and curing temperature in the ranges of 24-96 hours and 60-90°C, respectively. The essential workability properties of freshly prepared self-compacting geopolymer concrete, such as filling ability, passing ability and segregation resistance, were evaluated using the slump flow, V-funnel, L-box and J-ring test methods. The fundamental requirements of high flowability and resistance to segregation, as specified by the EFNARC guidelines on self-compacting concrete, were satisfied. Test results indicate that longer curing times and curing the concrete specimens at higher temperatures result in higher compressive strength. Compressive strength increased with curing time; however, the increase after 48 hours was not significant. Concrete specimens cured at 70°C produced the highest compressive strength compared to specimens cured at 60°C, 80°C and 90°C.
Abstract: Ensemble learning algorithms such as AdaBoost and Bagging have been in active research and have shown improvements in classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to classify gamma signals against overwhelming hadron and muon signals, representing a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. The WEKA machine learning toolkit was used to conduct the experiments.
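The bagging idea the abstract builds on can be illustrated with a minimal sketch: bootstrap-resample the training set, fit a weak learner on each resample, and classify by majority vote. This is not the paper's WEKA setup; the one-dimensional decision stumps and the toy data below are purely illustrative assumptions.

```python
import random
from collections import Counter

def train_stump(data):
    # Exhaustively pick the threshold and orientation with the fewest
    # training errors; data is a list of (x, label) with scalar x, label in {0, 1}.
    best = None
    for t in sorted(set(x for x, _ in data)):
        for sign in (1, -1):
            errs = sum((1 if sign * (x - t) >= 0 else 0) != y for x, y in data)
            if best is None or errs < best[0]:
                best = (errs, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) >= 0 else 0

def bagging(data, n_models=15, seed=0):
    # Train each weak learner on a bootstrap resample (sampling with replacement).
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        boot = [rng.choice(data) for _ in data]
        models.append(train_stump(boot))
    def predict(x):
        # Classify by majority vote over the ensemble.
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict
```

With real data one would substitute random forests, neural networks or SVMs for the stumps, as the paper does within WEKA.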
Abstract: In developing a text-to-speech system, it is well known that the accuracy of the information extracted from a text is crucial to producing high quality synthesized speech. In this paper, a new scheme for converting text into its equivalent phonetic spelling is introduced and developed. This method is applicable to many applications in text-to-speech systems and has many advantages over other methods; it can also complement other methods to improve their performance. The proposed method is a probabilistic model based on a Smooth Ergodic Hidden Markov Model, which can be considered an extension of the HMM. The proposed method is applied to the Persian language and its accuracy in converting text to phonetics is evaluated using simulations.
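The Smooth Ergodic HMM itself is not detailed in the abstract, but the decoding step common to HMM-based grapheme-to-phoneme conversion is a Viterbi search for the most likely phoneme sequence given the letters. A minimal sketch follows; the two-state letter-to-phoneme model used in the example is a hypothetical toy, not the paper's Persian model.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s] = (probability of the best path ending in state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p][0] * trans_p[p][s])
            V[t][s] = (V[t - 1][prev][0] * trans_p[prev][s] * emit_p[s][obs[t]], prev)
    # Backtrack from the most probable final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        last = V[t][last][1]
        path.append(last)
    return path[::-1]
```

In a grapheme-to-phoneme setting, `obs` is the letter sequence and the hidden states are phonemes; the transition and emission tables would be estimated from a pronunciation corpus.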
Abstract: The hot deformation behavior of high strength low alloy (HSLA) steels with different chemical compositions under hot working conditions in the temperature range of 900 to 1100°C and the strain rate range of 0.1 to 10 s-1 has been studied by performing a series of hot compression tests. The dynamic materials model has been employed for developing the processing maps, which show the variation of the efficiency of power dissipation with temperature and strain rate. Kumar's model has also been used for developing the instability map, which shows the variation of plastic deformation instability with temperature and strain rate. The efficiency of power dissipation increased with decreasing strain rate and increasing temperature in the steel with higher Cr and Ti content. A high efficiency of power dissipation, over 20%, was obtained at a finite strain level of 0.1 under conditions of strain rate lower than 1 s-1 and temperature higher than 1050°C. Plastic instability was expected in the regime of temperatures lower than 1000°C and strain rates lower than 0.3 s-1. The steel with lower Cr and Ti contents showed high efficiency of power dissipation at higher strain rates and lower temperatures.
Abstract: Fair share is one of the scheduling objectives supported on many production systems. However, fair share has been shown to cause performance problems for some users, especially users with difficult jobs. This work focuses on extending goal-oriented parallel computer job scheduling policies to cover the fair share objective. Goal-oriented parallel computer job scheduling policies have been shown to achieve good scheduling performance when conflicting objectives are required; they do so by using anytime combinatorial search techniques to find a good compromise schedule within a time limit. The experimental results show that the proposed goal-oriented parallel computer job scheduling policy, namely Tradeofffs(Tw:avgX), achieves good scheduling performance and also provides good fair share performance.
Abstract: The purpose of this work is to measure the system presampling MTF of a variable resolution x-ray (VRX) CT scanner. In this paper, we used the parameters of an actual VRX CT scanner to simulate and study the effect of different focal spot sizes on the system presampling MTF by the Monte Carlo method (GATE simulation software). A focal spot size of 0.6 mm limited the spatial resolution of the system to 5.5 cy/mm at incident angles below 17° for cell #1. With a focal spot size of 0.3 mm, the spatial resolution increased up to 11 cy/mm, and the limiting effect of the focal spot size appeared at incident angles below 9°. The focal spot size of 0.3 mm could therefore improve the spatial resolution to some extent, but because of magnification non-uniformity there is a 10 cy/mm difference between the spatial resolution of cell #1 and cell #256. The focal spot size of 0.1 mm acted as an ideal point source for this system: the spatial resolution increased to more than 35 cy/mm and, at all incident angles, remained a function of the incident angle. Moreover, the focal spot size of 0.1 mm minimized the effect of magnification non-uniformity.
Abstract: As seen in the literature, about 70% of improvement initiatives fail, and a significant number do not even get started. This paper analyses the problem of failing initiatives in Software Process Improvement (SPI) and proposes good practices, supported by motivational tools, that can help minimize failures. It elaborates on the hypothesis that human factors are poorly addressed by deployers, especially because implementation guides usually emphasize only technical factors. This research was conducted with SPI deployers and analyses 32 SPI initiatives. The results indicate that although human factors are not commonly highlighted in guidelines, successful initiatives usually address human factors implicitly. This research shows that practices based on human factors indeed play a crucial role in successful implementations of SPI, proposes change management as a theoretical framework to introduce those practices in the SPI context, and suggests some motivational tools, based on SPI deployers' experience, to support it.
Abstract: This paper first examines three sets of bivariate cointegrations between any two of current accounts, stock markets, and currency exchange markets in ten Asian countries. Furthermore, we examine the effect of country characteristics on this bivariate cointegration. Our findings suggest that, for the three sets of cointegration tests, at least one cointegration exists for each sample country. India consistently exhibited a bi-directional causal relationship between any two of the three indicators. Unlike Pan et al. (2007) and Phylaktis and Ravazzolo (2005), we found that such cointegration is influenced by three characteristics: capital controls; flexibility in foreign exchange rates; and the ratio of trade to GDP. These characteristics are the result of liberalization in each Asian country. This implies that liberalization policies are effective in improving the cointegration between any two of the financial markets and the current account for the ten Asian countries.
Abstract: Aluminothermic rail welding was a great success from the beginning because of its low price, even in 1895 in Germany. This method is now widely used all over the world for railway construction, maintenance and modernization.
After 1989, the welding needs of the potential beneficiaries (Romanian Railways, Urban Transportation Companies) kept rising because of the necessity of railway maintenance and modernization. The main materials that determine the Thermit (T) composition result from manufacturing scraps from all over the country, which can help the environment by consuming these scraps.
The Romanian need for alumino-thermic welds is now about 11,300 per year, and in a favourable economic environment this amount can reach 30,000 units.
This paper tries to show the effect of two types of modifiers introduced in the T composition on the structure and properties of an alumino-thermic weld.
Abstract: The dorsal hand vein pattern is an emerging biometric which has been attracting the attention of researchers of late. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to the face biometric, is modified using Cholesky decomposition and the Lanczos algorithm to extract dorsal hand vein features. This modified technique decreases the number of computations and hence the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images, using a threshold value of 0.9, to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). The modified algorithm is desirable when developing a biometric security system since it significantly decreases the matching time.
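The paper accelerates PCA with Cholesky decomposition and the Lanczos algorithm; those details are not reproduced here. The sketch below shows only the standard "snapshot" eigendecomposition that eigenface-style methods start from, computing eigenveins from the small Gram matrix and projecting an image into the vein space; the function names and data shapes are illustrative assumptions.

```python
import numpy as np

def eigenveins(images, k=2):
    # Snapshot method: eigendecompose the small n x n Gram matrix instead of
    # the large pixel-by-pixel covariance (as in classical eigenfaces).
    # images: (n_samples, n_pixels) array, one flattened vein image per row.
    X = images - images.mean(axis=0)
    vals, vecs = np.linalg.eigh(X @ X.T)      # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]        # keep the top-k components
    comps = X.T @ vecs[:, order]              # back-project to pixel space
    comps /= np.linalg.norm(comps, axis=0)    # normalize each eigenvein
    return comps                              # (n_pixels, k) orthonormal basis

def project(image, mean, comps):
    # Coordinates of one image in the "vein space" spanned by the eigenveins.
    return comps.T @ (image - mean)
```

Matching would then compare these projected coordinates against enrolled templates under the chosen threshold.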
Abstract: Nowadays, hand vein recognition has attracted increasing attention in biometric identification systems. Generally, the hand vein image is acquired with low contrast and irregular illumination. Accordingly, with good preprocessing of the hand vein image, features can be extracted easily even with simple binarization. In this paper, an approach is proposed to improve the quality of the hand vein image. First, a brief survey of existing enhancement methods is presented. Then a Radon-like features method is applied to preprocess the hand vein image. Finally, experimental results show that the proposed method is more effective and reliable in improving hand vein images.
Abstract: This research investigates the suitability of fuel oil for improving gypseous soil. Detailed laboratory tests were carried out on two soils obtained from the Al-Therthar site (Al-Anbar Province, Iraq): soil I, a sandy soil with 51.6% gypsum content, and soil II, a clayey soil with 26.55% gypsum content. The study examines the improvement of the properties of these locally available gypseous soils, using fuel oil to minimize the effect of moisture on them. The program included tests measuring the permeability, compressibility and collapse properties of the soils. The shear strength of the soil and the amount of weight loss of fuel oil due to drying were also determined. These tests were conducted on treated and untreated soils to observe the effect on the engineering properties of mixing in varying amounts of fuel oil, equivalent to the water content. The results showed that fuel oil is a good material for modifying the basic properties of gypseous soil, collapsibility and permeability, which are the main problems of this soil, while retaining an amount of cohesion appropriate for carrying the loads from the structure.
Abstract: Virtual Assembly (VA) is one of the key technologies
in advanced manufacturing field. It is a promising application of
virtual reality in design and manufacturing field. It has drawn much
interest from industries and research institutes in the last two decades.
This paper describes a process for integrating an interactive Virtual
Reality-based assembly simulation of a digital mockup with the
CAD/CAM infrastructure. The necessary hardware and software preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly simulation can improve CAD/CAM procedures and structures, how CAD model preparations have to be carried out, and which virtual environment requirements have to be fulfilled. The issue of data transfer is also explained, as are other challenges and requirements such as anti-aliasing and collision detection. Finally, a VA simulation has been carried out for a ball valve assembly and a car door assembly with the help of the Vizard virtual reality toolkit in a semi-immersive environment, and their performance has been analyzed on different workstations to evaluate the importance of the graphics processing unit (GPU) in the field of VA.
Abstract: In this paper we introduce a novel kernel classifier based on an iterative shrinkage algorithm developed for compressive sensing. We have adopted Bregman iteration with soft and hard shrinkage functions and a generalized hinge loss for solving the l1-norm minimization problem for classification. Our experimental results on face recognition and digit classification, using SVM as the benchmark, show that our method has an error rate close to that of SVM but does not perform better than SVM. We have found that the soft shrinkage method gives higher accuracy and, in some situations, more sparseness than the hard shrinkage method.
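The soft and hard shrinkage functions referred to above, and the kind of l1 minimization they drive, can be sketched as follows. This is plain ISTA rather than the paper's Bregman iteration with generalized hinge loss, and the least-squares objective is an illustrative stand-in for the classification loss.

```python
import numpy as np

def soft_shrink(x, t):
    # Soft-thresholding: the proximal operator of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def hard_shrink(x, t):
    # Hard-thresholding: keep only entries whose magnitude exceeds t.
    return np.where(np.abs(x) > t, x, 0.0)

def ista(A, b, lam=0.1, n_iter=200):
    # Iterative shrinkage-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    # a gradient step on the smooth term followed by soft shrinkage.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_shrink(x - A.T @ (A @ x - b) / L, lam / L)
    return x
```

Soft shrinkage moves every coefficient toward zero (biasing but stabilizing the solution), while hard shrinkage keeps surviving coefficients unchanged, which matches the accuracy/sparseness trade-off the abstract reports.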
Abstract: This paper presents the results of three senior capstone projects at the Department of Computer Engineering, Prince of Songkla University, Thailand. These projects focus on developing an examination management system for the Faculty of Engineering in order to manage both the examination room assignments and the examination proctor assignments in each room. The current version of the software is a web-based application. The developed software allows the examination proctors to select their scheduled time online, while each subject is assigned to an available examination room according to its type and the room capacity. The developed system was evaluated using real data by prospective users of the system, who gave several suggestions for further improvements. Even though the features of the developed software are not superior, the development process can serve as a case study for a project-based teaching style. Furthermore, the process of developing this software reveals several issues in developing an educational support application.
Abstract: This paper presents an optimal design of linear phase
digital high pass finite impulse response (FIR) filter using Improved
Particle Swarm Optimization (IPSO). In the design process, the filter
length, pass band and stop band frequencies, feasible pass band and
stop band ripple sizes are specified. FIR filter design is a multi-modal
optimization problem. An iterative method is introduced to find the
optimal solution of FIR filter design problem. Evolutionary
algorithms like real code genetic algorithm (RGA), particle swarm
optimization (PSO), improved particle swarm optimization (IPSO)
have been used in this work for the design of linear phase high pass
FIR filter. IPSO is an improved PSO that proposes a new definition
for the velocity vector and swarm updating and hence the solution
quality is improved. A comparison of simulation results reveals the
optimization efficacy of the algorithm over the prevailing
optimization techniques for the solution of the multimodal, non-differentiable, highly non-linear, and constrained FIR filter design problem.
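The baseline PSO update that IPSO refines can be sketched as follows (IPSO's redefined velocity vector and swarm updating are not reproduced here). The inertia and acceleration coefficients are conventional choices, not the paper's settings, and the sphere function in the usage example stands in for the FIR filter error objective.

```python
import random

def pso(f, dim, n_particles=20, iters=100, seed=1):
    # Conventional inertia-weight PSO minimizing f over the box [-5, 5]^dim.
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity blends inertia, attraction to the particle's own
                # best position, and attraction to the swarm's best position.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

For FIR design, each particle would encode the filter coefficients and f would measure deviation from the desired pass band and stop band response.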
Abstract: An inadequate curriculum for software engineering is considered to be one of the most common software risks. A number of solutions for improving Software Engineering Education (SEE) have been reported in the literature, but there is a need to present these solutions collectively in one place. We have performed a mapping study to present a broad view of the literature published on improving the current state of SEE. Our aim is to give academicians, practitioners and researchers an international view of the current state of SEE. Our study identified 70 primary studies that met our selection criteria, which we further classified and categorized in a well-defined software engineering educational framework. We found that the most researched category within the SE educational framework is Innovative Teaching Methods, whereas the least amount of research was found in the Student Learning and Assessment category. Our future work is to conduct a Systematic Literature Review on SEE.
Abstract: Modeling a manufacturing system enables one to identify the effects of key design parameters on the system performance and, as a result, to make correct decisions. This paper proposes a manufacturing system modeling approach using a spreadsheet model based on queuing network theory, in which a static capacity planning model and a stochastic queuing model are integrated. The model was used to improve the existing system utilization in relation to product design. The model incorporates parameters such as utilization, cycle time, throughput, and batch size. The study also showed that the validity of the developed model is good enough for practical application: the maximum relative error is 10%, far below the limit value of 32%. Therefore, the model developed in this study is a valuable alternative model for evaluating a manufacturing system.
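The kind of relationship such a spreadsheet queuing model tabulates can be illustrated with the basic steady-state M/M/1 formulas; the function below is a generic sketch under that single-server assumption, not the paper's integrated model.

```python
def mm1_metrics(arrival_rate, service_rate):
    # Steady-state M/M/1 formulas linking the parameters the spreadsheet
    # model tracks: utilization, WIP, cycle time, and throughput.
    if arrival_rate >= service_rate:
        raise ValueError("unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate                 # utilization
    wip = rho / (1.0 - rho)                           # average jobs in system, L
    cycle_time = 1.0 / (service_rate - arrival_rate)  # average time in system, W
    return {"utilization": rho, "wip": wip,
            "cycle_time": cycle_time, "throughput": arrival_rate}
```

Little's law, L = lambda * W, ties the outputs together, which makes the formulas easy to cross-check cell by cell in a spreadsheet.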