Abstract: This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan when scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and implemented in a Matlab® program to solve this problem. The paper explains the steps needed to apply the Ant Colony approach to minimizing the makespan of the worker assignment and job scheduling problem in a parallel machine model, and aims to evaluate the strength of ACO compared with conventional approaches. One data set containing 100 problems (12 jobs, 3 machines and 10 workers), available on the Internet, has been solved with this ACO algorithm. Our ACO-based algorithm shows drastically improved results, especially in terms of the negligible CPU effort needed to reach the optimal solution. In our case, the time taken to solve all 100 problems is less than the average time taken to solve a single problem in the data set by conventional approaches such as a genetic algorithm (GA) and the SPT-A/LMC heuristics.
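The core of such an ACO scheduler can be sketched as follows. This is a minimal illustration, not the authors' implementation: the job processing times, machine speeds and all parameter values are hypothetical, and the worker-assignment dimension is omitted for brevity.

```python
import random

def aco_schedule(jobs, speeds, n_ants=20, n_iters=50, rho=0.1, seed=1):
    """Assign each job to one of the related machines, minimizing makespan.

    jobs   -- processing requirement of each job
    speeds -- speed of each (uniformly related) machine
    """
    random.seed(seed)
    n_jobs, n_mach = len(jobs), len(speeds)
    # tau[j][m]: pheromone, i.e. learned desirability of job j on machine m
    tau = [[1.0] * n_mach for _ in range(n_jobs)]
    best_assign, best_mk = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            loads = [0.0] * n_mach
            assign = []
            for j in range(n_jobs):
                # pheromone-biased random machine choice
                m = random.choices(range(n_mach), weights=tau[j])[0]
                loads[m] += jobs[j] / speeds[m]  # time on a related machine
                assign.append(m)
            mk = max(loads)  # makespan of this ant's schedule
            if mk < best_mk:
                best_mk, best_assign = mk, assign
        # evaporate, then reinforce the best-so-far assignment
        for j in range(n_jobs):
            for m in range(n_mach):
                tau[j][m] *= (1 - rho)
            tau[j][best_assign[j]] += 1.0 / best_mk
    return best_assign, best_mk
```

The makespan returned can never fall below the theoretical lower bound of total work divided by total speed, which gives a quick sanity check on the search.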
Abstract: In this paper, novel techniques for increasing the accuracy and speed of convergence of a Feed-forward Back-propagation Artificial Neural Network (FFBPNN) with a polynomial activation function, as reported in the literature, are presented. These techniques were subsequently used to determine the coefficients of Autoregressive Moving Average (ARMA) and Autoregressive (AR) systems. The results obtained by introducing sequential and batch methods of weight initialization, a batch method of weight and coefficient update, and adaptive momentum and learning-rate techniques give more accurate results and a significant reduction in convergence time when compared to the traditional back-propagation algorithm, thereby making the FFBPNN an appropriate technique for online ARMA coefficient determination.
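As a minimal illustration of the momentum-based batch update the abstract refers to, the sketch below estimates a single AR(1) coefficient by batch gradient descent with a momentum term. The one-parameter model, the data and the parameter values are hypothetical, standing in for the paper's full FFBPNN.

```python
def fit_ar1(series, lr=0.05, momentum=0.9, epochs=200):
    """Estimate the coefficient a in x[t] ~ a * x[t-1] by batch gradient
    descent on the mean squared error, with a momentum term."""
    a, velocity = 0.0, 0.0
    pairs = list(zip(series[:-1], series[1:]))  # (x[t-1], x[t]) pairs
    n = len(pairs)
    for _ in range(epochs):
        # batch gradient of the MSE with respect to a
        grad = sum(2 * (a * x - y) * x for x, y in pairs) / n
        # momentum update: accumulate velocity, then move the weight
        velocity = momentum * velocity - lr * grad
        a += velocity
    return a
```

On a noiseless series generated with a = 0.5, the estimate converges close to 0.5 within the 200 batch epochs.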
Abstract: Modeling the seismic behavior of the Panel Zone (PZ), because of its role in the overall ductility and lateral stiffness of steel moment frames, has been considered a challenge for years. There are some studies regarding the effects of different doubler plate thicknesses and geometric properties of the PZ on its seismic behavior. However, there is not much investigation of the effect of the number of provided continuity plates in the presence of one triangular haunch, two triangular haunches, or a rectangular haunch (T-shaped haunches) for exterior columns. In this research, detailed finite element models of 12 tested connections of the SAC joint venture were first created and analyzed; the cyclic-behavior backbone curves obtained from these models, together with other FE models of similar tests, were then used for neural network training. The seismic behavior of these data is categorized according to the continuity plate arrangements and differences in the type of haunch. PZs with one-sided haunches have little plastic rotation. As the number of continuity plates increases due to the presence of two triangular haunches (four continuity plates), there is no plastic rotation; in other words, the PZ behaves in its elastic range. In the case of a rectangular haunch, the PZ shows more plastic rotation in comparison with a one-sided triangular haunch and especially with double-sided triangular haunches. Moreover, the models presented in this study for triangular one-sided and double-sided haunches and for rectangular haunches appear to give a proper estimation of PZ seismic behavior.
Abstract: The Siemens Healthcare Sector is one of the world's
largest suppliers to the healthcare industry and a trendsetter in
medical imaging and therapy, laboratory diagnostics, medical
information technology, and hearing aids.
Siemens offers its customers products and solutions for the entire
range of patient care from a single source – from prevention and
early detection to diagnosis, and on to treatment and aftercare. By
optimizing clinical workflows for the most common diseases,
Siemens also makes healthcare faster, better, and more cost effective.
The optimization of clinical workflows requires a multidisciplinary focus and a collaborative approach involving, for example, medical advisors, researchers and scientists as well as healthcare economists.
This new form of collaboration brings together experts with deep
technical experience, physicians with specialized medical knowledge
as well as people with comprehensive knowledge about health
economics.
As Charles Darwin is often quoted as saying, “It is neither the strongest of the species that survive, nor the most intelligent, but the one most responsive to change.” We believe that those who can successfully manage this change will emerge as winners, with a valuable competitive advantage.
Current medical information and knowledge are some of the core
assets in the healthcare industry. The main issue is to connect
knowledge holders and knowledge recipients from various
disciplines efficiently in order to spread and distribute knowledge.
Abstract: The objective of this research is to select the most accurate model, using a Neural Network technique, as a way to filter potential students who enroll in an IT course by electronic learning at Suan Suanadha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The results showed that the most accurate model was 100-fold cross-validation, which reached an accuracy of 73.58%.
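The k-fold cross-validation used for this kind of model assessment can be sketched as follows. The classifier here is a deliberately trivial stand-in (a mean threshold), not the neural network used in the study, and the data and fold count are hypothetical.

```python
def kfold_accuracy(xs, ys, k, fit, predict):
    """Estimate classifier accuracy by k-fold cross-validation."""
    n = len(xs)
    # distribute the n samples over k folds as evenly as possible
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    accuracies, start = [], 0
    for size in fold_sizes:
        test_set = set(range(start, start + size))
        train_idx = [i for i in range(n) if i not in test_set]
        # train on k-1 folds, score on the held-out fold
        model = fit([xs[i] for i in train_idx], [ys[i] for i in train_idx])
        hits = sum(predict(model, xs[i]) == ys[i] for i in test_set)
        accuracies.append(hits / size)
        start += size
    return sum(accuracies) / k

# toy stand-in classifier: threshold at the mean of the training inputs
def fit_threshold(train_x, train_y):
    return sum(train_x) / len(train_x)

def predict_threshold(threshold, x):
    return int(x > threshold)
```

On a cleanly separable toy data set, the estimated accuracy is 1.0; the study's 73.58% figure reflects its real, noisier data.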
Abstract: Workload and resource management are two essential functions provided at the service level of the grid software infrastructure. To improve the global throughput of these software environments, workloads have to be evenly scheduled among the available resources. To realize this goal, several load balancing strategies and algorithms have been proposed. Most strategies were developed with a homogeneous set of sites linked by homogeneous and fast networks in mind. For computational grids, however, we must address important new issues, namely heterogeneity, scalability and adaptability. In this paper, we propose a layered algorithm which achieves dynamic load balancing in grid computing. Based on a tree model, our algorithm has the following main features: (i) it is layered; (ii) it supports heterogeneity and scalability; and (iii) it is totally independent of any physical architecture of a grid.
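One intra-level balancing step of such a tree-based scheme might look like the sketch below, which pairs overloaded nodes with underloaded ones so that every node in a group converges to the group average. This is an illustrative sketch only; the paper's layered algorithm, its tree model and its actual migration policy are not reproduced here.

```python
def transfer_plan(loads):
    """Build a list of (src, dst, amount) moves that brings every node in a
    sibling group to the group's average load (one balancing step)."""
    avg = sum(loads) / len(loads)
    surplus = [[i, l - avg] for i, l in enumerate(loads) if l > avg]
    deficit = [[i, avg - l] for i, l in enumerate(loads) if l < avg]
    plan = []
    while surplus and deficit:
        moved = min(surplus[0][1], deficit[0][1])
        plan.append((surplus[0][0], deficit[0][0], moved))
        surplus[0][1] -= moved
        deficit[0][1] -= moved
        # drop nodes that have reached the average (with float tolerance)
        if surplus[0][1] <= 1e-9:
            surplus.pop(0)
        if deficit[0][1] <= 1e-9:
            deficit.pop(0)
    return plan
```

Applying the returned moves to the load vector leaves every node at the average, which is the invariant each level of the hierarchy would maintain.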
Abstract: This paper examines several mathematical methods for modeling the hourly price forward curve (HPFC); the model is constructed by numerous regression methods, such as polynomial regression, radial basis function neural networks and Fourier series. The goodness of fit of the models is examined by means of statistical and graphical tools. The criterion for choosing the model is minimizing the Root Mean Squared Error (RMSE); using the correlation analysis approach for the regression analysis, the optimal model is identified, which is robust against model misspecification. A supervised learning technique is employed to determine the optimal parameters corresponding to each measure of overall loss. Using all the numerical methods mentioned previously, explicit expressions for the optimal model are derived and the optimal designs are implemented.
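Selecting among fitted candidate curves by minimum RMSE can be sketched as below. The candidate functions and data points are hypothetical, standing in for the polynomial, RBF-network and Fourier-series fits described above.

```python
import math

def rmse(model, xs, ys):
    """Root Mean Squared Error of a fitted curve on observed (x, y) data."""
    return math.sqrt(sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

def select_model(candidates, xs, ys):
    """Return the name of the candidate curve with the smallest RMSE."""
    return min(candidates, key=lambda name: rmse(candidates[name], xs, ys))
```

For data generated by y = x², a quadratic candidate attains zero RMSE and is chosen over a linear one.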
Abstract: Focusing on environmental issues, including the reduction of scrap and consumer residuals, along with benefiting from the economic value created during the life cycle of goods/products, gives companies an important competitive approach. The aim of this paper is to present a new mixed nonlinear facility location-allocation model for recycling collection networks that considers multi-echelon, multi-supplier, multi-collection-center and multi-facility structures in the recycling network. To make appropriate decisions in reality, demands, returns, capacities, costs and distances are regarded as uncertain in our model. For this purpose, a fuzzy mathematical programming-based possibilistic approach from the recent literature is introduced as a solution methodology to solve the proposed mixed nonlinear programming (MNLP) model. Computational experiments are provided to illustrate the applicability of the designed model in a supply chain environment and to help decision makers facilitate their analysis.
Abstract: In a recent major industry-supported research and development study, a novel framework was developed and applied for the assessment of reliability and quality performance levels in real-life power systems of practical large-scale size. The new assessment methodology is based on three metaphors (dimensions) representing the relationship between available generation capacities and required demand levels. The paper shares the results of the successfully completed study and describes the implementation of the new methodology on practical zones in the Saudi electricity system.
Abstract: Wireless ad hoc nodes self-organize freely and dynamically in communicating with one another. Each node can act as a host or a router. However, this actually depends on the capability of the nodes in terms of current power level, signal strength, number of hops, routing protocol, interference and other factors. In this research, a study was conducted to observe the effect of hop count over different network topologies that contributes to TCP congestion control performance degradation. To achieve this objective, simulations using NS-2 with different topologies were evaluated. A comparative analysis is discussed based on standard observation metrics: throughput, delay and packet loss ratio. As a result, there is a relationship between the type of topology and hop count on the one hand and the performance of the ad hoc network on the other. In future work, an extended study will be carried out to investigate the effect of different error rates and background traffic over the same topologies.
Abstract: In this work, we present a comparison between two techniques of image compression. In the first case, the image is divided into blocks which are collected according to a zig-zag scan. In the second, we apply the Fast Cosine Transform to the image, and then the transformed image is divided into blocks which are likewise collected according to a zig-zag scan. Afterwards, in both cases, the Karhunen-Loève transform is applied to these blocks. In addition, we present three new metrics based on eigenvalues for a better comparative evaluation of the techniques. Simulations show that the combined version is the best, with lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), higher Peak Signal to Noise Ratio (PSNR) and better image quality. Finally, the new technique was far superior to JPEG and JPEG2000.
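The zig-zag block collection used in both techniques is the standard scan over an N×N block, which can be sketched as follows (the block contents in the example are hypothetical):

```python
def zigzag(block):
    """Collect the entries of an N x N block in zig-zag scan order."""
    n = len(block)
    out = []
    for s in range(2 * n - 1):  # each anti-diagonal has constant i + j = s
        idx = range(max(0, s - n + 1), min(s, n - 1) + 1)
        # alternate the traversal direction on successive anti-diagonals
        for i in (idx if s % 2 else reversed(idx)):
            out.append(block[i][s - i])
    return out
```

The scan orders coefficients from the low-frequency corner outward, which is what makes the subsequent blockwise transform coding effective.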
Abstract: This paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. The working of the product is described, showing how it helps people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control; they make use of image processing, in which light is the primary source. Electro-oculography (EOG) is a new technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured by sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.
Abstract: As wireless sensor networks are energy-constrained networks, the energy efficiency of sensor nodes is the main design issue. Clustering of nodes is an energy-efficient approach; it prolongs the lifetime of wireless sensor networks by avoiding long-distance communication. Clustering algorithms operate in rounds, and their performance depends upon the round time. A large round time consumes more energy of the cluster heads, while a small round time causes frequent re-clustering. Existing clustering algorithms therefore apply a trade-off to the round time and calculate it from the initial parameters of the network. But it is not appropriate to use a round-time value based on initial parameters throughout the network lifetime, because wireless sensor networks are dynamic in nature (nodes can be added to the network or run out of energy). In this paper, a variable round time approach is proposed that calculates the round time depending upon the number of active nodes remaining in the field. The proposed approach makes the clustering algorithm adaptive to network dynamics. For simulation, the approach is implemented with LEACH in NS-2, and the results show a 6% increase in network lifetime, a 7% increase in 50% node death time and a 5% improvement in the data units gathered at the base station.
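A variable round time of the kind described might be computed as in the sketch below. The linear scaling with the fraction of active nodes and the floor value are assumptions for illustration, not the paper's exact formula.

```python
def round_time(initial_round_time, initial_nodes, active_nodes, floor=1.0):
    """Recompute the clustering round time from the current number of
    active nodes, instead of fixing it from initial parameters alone.
    The linear scaling and the minimum round time `floor` are hypothetical."""
    fraction = active_nodes / initial_nodes
    return max(floor, initial_round_time * fraction)
```

As nodes die, the round time shrinks, so re-clustering happens more often exactly when cluster-head loads change fastest; the floor prevents degenerate, near-zero rounds late in the network lifetime.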
Abstract: The objective of this work is to study the influence of the properties of the substrate on the retrofit (thin repair) of damaged concrete elements with self-compacting concrete (SCC). Fluidity, the principal characteristic of SCC, should enable it to cover and adhere to the concrete to be repaired. Two aspects of repair are considered: the bond (adhesion), and the tensile strength and cracking. The investigation is experimental; it was conducted on test specimens made of ordinary concrete prepared and hardened in advance (the material to be repaired), over which a self-compacting concrete layer was cast. Three alternatives of SC concrete and one ordinary concrete (for comparison) were tested. It appears that self-compacting concrete constitutes a good repair material: it follows the surface forms to be repaired perfectly and allows a perfect bond. Fracture tests made on specimens of self-compacting concrete show brittle behaviour. However, when a small percentage of fibres is added, the resistance to cracking is very much improved.
Abstract: Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes which readily produce a smooth and work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters. These were then ranked in order of significance, followed by setting the levels of the factors for minimizing surface roughness and/or maximizing surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. It was observed that the LPB process improved surface hardness by 167%, while the DCR process improved it by 442%. It was also found that the force, ball diameter, number of tool passes and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece surface during deep cold rolling and low plasticity burnishing.
Abstract: Compensating for physiological motion in the context of minimally invasive cardiac surgery has become an attractive research issue, since such surgery outperforms traditional cardiac procedures and offers remarkable benefits. Owing to space restrictions, computer vision techniques have proven to be the most practical and suitable solution. However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency by exploring the classes of ℓ1- and ℓ2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart and use intensity information. The results point to the ℓ1-regularized optimization class as the best, since it offered the shortest computational cost and the smallest average error, and it proved to work even under complex deformations.
Abstract: Several methodologies were compared for producing erosion maps of surface, rill and gully erosion features, in research which took place in the Varamin sub-basin, north-east of Tehran, Iran. A photomorphic unit map was produced from processed satellite images, and four other maps were prepared by the integration of different data layers, including slope, plant cover, geology, land use, rock erodibility and land units. Comparison of ground-truth maps of erosion types with the working unit maps indicated that the integration of the land use, land unit and rock erodibility layers with the satellite-image photomorphic unit map provides the best method for producing erosion type maps.
Abstract: This paper is based on a study conducted in 2006 to assess the impact of computer usage on the health of National Institute for Medical Research (NIMR) staff. NIMR being a research institute, most of its staff spend a substantial part of their working time on computers. There was a notion among NIMR staff about the possible health hazards of prolonged computer usage. Hence, a study was conducted to establish the facts and possible mitigation measures. A total of 144 NIMR staff were involved in the study, of whom 63.2% were male and 36.8% female, aged between 20 and 59 years. All staff cadres were included in the sample. The functions performed by Institute staff using computers include data management, proposal development and report writing, research activities, secretarial duties, accounting and administrative duties, online information retrieval and online communication through e-mail services. The interviewed staff had been using computers for 1-8 hours a day and for periods ranging from 1 to 20 years. The study indicated ergonomic hazards of various kinds, ranging from backache to eyesight-related problems, for a significant proportion of interviewees (63%). The authors highlight major measures which are substantially applicable in preventing the occurrence of computer-related problems, and urge NIMR Management and/or the Government of Tanzania to adopt them where practicable.
Abstract: The gasoline octane number is the standard measure of the anti-knock properties of a motor fuel in platforming processes, which constitute one of the important unit operations in oil refineries; it can be determined by online measurement or with CFR (Cooperative Fuel Research) engines. Online measurement of the octane number can be done using direct octane number analyzers, but these are too expensive, so a feasible alternative analyzer, such as an ANFIS estimator, must be found.
ANFIS is a system in which a neural network is incorporated into a fuzzy system, using data automatically via the learning algorithms of NNs. ANFIS constructs an input-output mapping based both on human knowledge and on generated input-output data pairs.
In this research, 31 industrial data sets are used (21 for training and the rest for generalization). The results show that, according to this simulation, the hybrid training algorithm in ANFIS gives good agreement between the industrial data and the simulated results.
Abstract: Computer programming is considered a very difficult course by many computer science students. The reasons for the difficulty include the cognitive load involved in programming, the different learning styles of students, the instructional methodology and the choice of programming language. To reduce the difficulties, approaches such as pair programming, program visualization and accommodating different learning styles have been tried. However, these efforts have produced limited success. This paper reviews the problem and proposes a framework to help students overcome the difficulties involved.