Abstract: Hazardous Material transportation by road is coupled
with inherent risk of accidents causing loss of lives, grievous injuries,
property losses and environmental damages. The most common type
of hazmat road accident is the release of hazardous substances
(78%), followed by fires (28%), explosions (14%) and
vapour/gas clouds (6%).
The paper first discusses the probable 'Impact Zones'
likely to be caused by one flammable (LPG) and one toxic (ethylene
oxide) chemical being transported through a sizable segment of a
State Highway connecting three notified Industrial zones in Surat
district in Western India housing 26 MAH industrial units. Three
'hotspots' were identified along the highway segment depending on
the particular chemical traffic and the population distribution within
500 meters on either side. The thermal radiation and explosion
overpressure have been calculated for LPG / Ethylene Oxide BLEVE
scenarios along with toxic release scenario for ethylene oxide.
Besides, the dispersion calculations for ethylene oxide toxic release
have been made for each 'hotspot' location and the impact zones
have been mapped for the LOC concentrations. Subsequently, the
maximum Initial Isolation and the protective zones were calculated
based on the ERPG-3 and ERPG-2 values of ethylene oxide respectively,
which are estimated for the worst-case scenario under the worst
weather conditions. The data analysis will be helpful to the local
administration in capacity building with respect to rescue /
evacuation and medical preparedness, and will provide quantitative
inputs to augment the District Offsite Emergency Plan document.
Abstract: The demand on high-voltage (HV) infrastructure is growing due to the corresponding growth in industries and population. Many areas are being developed and therefore require additional electrical power to meet the demand. Upgrading a substation is one of the quickest solutions to ensure a continuous supply of power to customers. Such an upgrade requires civil modifications to structures and fences, and the civil work involves excavation and steel works that may create unsafe touch conditions. This paper presents a brief theoretical overview of the touch voltage inside and around substations and uses the CDEGS software to simulate a case study.
Abstract: Warranty is a powerful marketing tool for the
manufacturer and a good protection for both the manufacturer and the
customer. However, warranty always involves additional costs to the
manufacturer, which depend on product reliability characteristics and
warranty parameters. This paper presents an approach to optimisation
of warranty parameters for known product failure distribution to
reduce the warranty costs to the manufacturer while retaining the
promotional function of the warranty. A combined free-replacement
and pro-rata warranty policy is chosen as the model, and the lengths of
the free-replacement period and the pro-rata policy period are varied,
as well as the coefficients that define the pro-rata cost function.
Multiparametric warranty optimisation is performed using a genetic
algorithm. The obtained results are a guideline for the manufacturer in
choosing the warranty policy that minimises the costs and maximises the profit.
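As an illustration of how such a multiparametric search can be set up, the sketch below minimises a toy expected-warranty-cost function over the free-replacement period w1 and the pro-rata period w2 with a minimal real-coded genetic algorithm. The cost model (failure rate lam, repair cost c_rep, price and the promotional-benefit term) is an assumed stand-in for illustration, not the paper's model.

```python
import random

def warranty_cost(w1, w2, lam=0.5, c_rep=100.0, price=400.0, benefit=60.0):
    """Toy expected warranty cost (assumed stand-in, not the paper's
    model): repairs during the free-replacement period w1, an average
    50% pro-rata refund over the following period w2, minus a
    promotional benefit that grows with total warranty length."""
    frw = c_rep * lam * w1            # expected free-replacement cost
    prw = price * lam * w2 / 2.0      # expected pro-rata refund cost
    promo = benefit * (w1 + w2) ** 0.5
    return frw + prw - promo

def genetic_minimise(fitness, bounds, pop=30, gens=60, seed=1):
    """Minimal real-coded GA over two parameters: binary-tournament
    selection, blend crossover and Gaussian mutation."""
    rnd = random.Random(seed)
    lo, hi = bounds
    population = [[rnd.uniform(lo, hi), rnd.uniform(lo, hi)]
                  for _ in range(pop)]
    for _ in range(gens):
        def tournament():
            a, b = rnd.sample(population, 2)
            return min(a, b, key=lambda ind: fitness(*ind))
        children = []
        while len(children) < pop:
            p, q = tournament(), tournament()
            child = [(x + y) / 2.0 + rnd.gauss(0, 0.1)
                     for x, y in zip(p, q)]
            children.append([min(max(g, lo), hi) for g in child])
        population = children
    return min(population, key=lambda ind: fitness(*ind))
```

Calling `genetic_minimise(warranty_cost, bounds=(0.0, 3.0))` returns a (w1, w2) pair that trades repair and refund costs against the promotional benefit.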
Abstract: In this paper, a framework for the simplification and
standardization of metaheuristic related parameter-tuning by applying
a four phase methodology, utilizing Design of Experiments and
Artificial Neural Networks, is presented. Metaheuristics are multipurpose
problem solvers that are utilized on computational optimization
problems for which no efficient problem-specific algorithm
exists. Their successful application to concrete problems requires the
finding of a good initial parameter setting, which is a tedious and
time-consuming task. Recent research reveals the lack of a systematic
approach when it comes to this so-called parameter-tuning process. In the
majority of publications, researchers give only weak motivation for
their respective choices, if any. Because initial parameter settings
have a significant impact on the solution quality, this course of
action can lead to suboptimal experimental results, and thereby
an unsound basis for the drawing of conclusions.
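The screening step of such a Design-of-Experiments-based tuning methodology can be sketched as a full-factorial experiment over candidate parameter levels. The parameter names, levels and objective below are illustrative assumptions, not the paper's four-phase framework.

```python
from itertools import product

def factorial_screen(evaluate, levels, replicates=3):
    """Full-factorial screening: run every combination of parameter
    levels, average the objective over replicates, and rank the
    settings (lower objective is better)."""
    ranked = []
    for combo in product(*levels.values()):
        setting = dict(zip(levels.keys(), combo))
        scores = [evaluate(setting, rep) for rep in range(replicates)]
        ranked.append((sum(scores) / replicates, setting))
    ranked.sort(key=lambda r: r[0])
    return ranked
```

For a metaheuristic, `evaluate` would run the solver with the given setting and return the achieved objective value; the top-ranked settings then seed the later refinement phases.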
Abstract: We propose a technique to identify road traffic
congestion levels from the velocity of mobile sensors with high accuracy
and consistency with motorists' judgments. The data collection utilized
a GPS device, a webcam, and an opinion survey. Human perceptions
were used to rate the traffic congestion levels into three levels: light,
heavy, and jam. Then the ratings and velocity were fed into a
decision tree learning model (J48). We successfully extracted vehicle
movement patterns to feed into the learning model using a sliding
windows technique. The parameters capturing the vehicle moving
patterns and the window size were heuristically optimized. The
model achieved accuracy as high as 99.68%. By implementing the
model on the existing traffic report systems, the reports will cover
comprehensive areas. The proposed method can be applied to any
part of the world.
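A minimal sketch of the sliding-window feature extraction described above: each window of the velocity series is reduced to summary features that a classifier can consume. The chosen features (mean speed, speed deviation, fraction of near-stopped samples) and the threshold rule standing in for the trained J48 tree are illustrative assumptions, not the paper's tuned parameters.

```python
def window_features(velocities, size, step=1):
    """Slide a fixed-size window over a velocity series (km/h) and
    compute summary features for each window position."""
    feats = []
    for start in range(0, len(velocities) - size + 1, step):
        w = velocities[start:start + size]
        mean_v = sum(w) / size
        var_v = sum((v - mean_v) ** 2 for v in w) / size
        stopped = sum(1 for v in w if v < 5) / size  # fraction nearly stopped
        feats.append({"mean": mean_v, "std": var_v ** 0.5,
                      "stopped": stopped})
    return feats

def rate_congestion(f):
    """Toy stand-in for the trained decision tree: map window features
    to the three human-rated levels (light / heavy / jam)."""
    if f["stopped"] > 0.5 or f["mean"] < 10:
        return "jam"
    if f["mean"] < 30:
        return "heavy"
    return "light"
```

In the paper's setting, the feature vectors would instead be fed to the learned J48 model together with the human congestion ratings.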
Abstract: The structure of the retinal vessels is a prominent feature
that reveals information on the state of disease, reflected in
the form of measurable abnormalities in thickness and colour.
An analysis of the vascular structure of the retina, for implementation
of a clinical diabetic retinopathy decision-making system, is presented
in this paper. The retinal vascular structure consists of thin blood
vessels, so the accuracy of the analysis is highly dependent upon the
vessel segmentation. In this paper the
blood vessel thickness is automatically detected using preprocessing
techniques and a vessel segmentation algorithm. First the captured
image is binarized to extract the blood vessel structure clearly, then it is
skeletonised to get the overall structure of all the terminal and
branching nodes of the blood vessels. By identifying the terminal
nodes and the branching points automatically, the thicknesses of the
main and branching blood vessels are estimated. Results are presented and
compared with those provided by clinical classification on 50 vessels
collected from Bejan Singh Eye Hospital.
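The node identification on the skeletonised image can be sketched as a neighbour count on the binary skeleton: a skeleton pixel with exactly one 8-connected neighbour is a terminal node, and one with three or more is a branching node. This is a generic sketch of the idea, not the paper's exact algorithm.

```python
def classify_skeleton_nodes(skel):
    """Classify pixels of a binary skeleton (list of rows of 0/1):
    a set pixel with exactly one 8-connected neighbour is a terminal
    node; one with three or more neighbours is a branching node."""
    rows, cols = len(skel), len(skel[0])
    terminals, branches = [], []
    for r in range(rows):
        for c in range(cols):
            if not skel[r][c]:
                continue
            # count set pixels in the 8-neighbourhood
            n = sum(skel[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))
                    if (rr, cc) != (r, c))
            if n == 1:
                terminals.append((r, c))
            elif n >= 3:
                branches.append((r, c))
    return terminals, branches
```

With the nodes located, vessel thickness can then be estimated along the segments between them, as the abstract describes.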
Abstract: Data Envelopment Analysis (DEA) is a methodology
that computes efficiency values for decision making units (DMU) in a
given period by comparing the outputs with the inputs. In many cases,
there is a time lag between the consumption of inputs and the
production of outputs. For a long-term research project, it is hard to
avoid this production lead-time phenomenon. This time lag effect
should be considered in evaluating the performance of organizations.
This paper suggests a model to calculate efficiency values for the
performance evaluation problem with time lag. In the experimental
part, the proposed methods are compared with the CCR model and an
existing time-lag model using the data set of the 21st Century Frontier
R&D Program, a long-term national R&D program of Korea.
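For reference, the CCR benchmark mentioned above is the standard input-oriented CCR multiplier model; for a DMU $o$ with inputs $x_{io}$ and outputs $y_{ro}$ it reads (a textbook statement, not the paper's time-lag formulation):

```latex
\begin{aligned}
\max_{u,v}\ & \theta_o = \sum_{r} u_r\, y_{ro} \\
\text{s.t. } & \sum_{i} v_i\, x_{io} = 1, \\
& \sum_{r} u_r\, y_{rj} - \sum_{i} v_i\, x_{ij} \le 0 \quad \forall j, \\
& u_r \ge 0,\ v_i \ge 0 .
\end{aligned}
```

A time-lag variant changes which period's outputs $y_{rj}$ are credited against the inputs consumed in earlier periods.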
Abstract: This paper examined the influence of matching
students' learning preferences with the teaching methodology
adopted, on their academic performance in an accounting course in
two types of learning environment in one university in Lebanon:
classes with PowerPoint (PPT) vs. conventional classes. Learning
preferences were either for PPT or for the conventional methodology. A
statistically significant increase in academic achievement is found in
the conventionally instructed group as compared to the group taught
with PPT. This low effectiveness of PPT might be attributed to the
learning preferences of Lebanese students. In the PPT group, better
academic performance was found among students with
learning/teaching match as compared with students with
learning/teaching mismatch. Since the majority of students display a
preference for the conventional methodology, the result might
suggest that Lebanese students' performance is not optimized by PPT
in the accounting classrooms, not because of PPT itself, but because
it does not match the Lebanese students' learning preferences in such
a quantitative course.
Abstract: This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), depends on keyword features to discover association rules amongst the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur and instead focuses on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and uses a data mining technique for association rule discovery. It consists of three phases: a Text Preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our designed algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a Visualization phase (visualization of results). Experiments were applied to web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news contained in the document collection. The performance of the EART system was compared with that of a system using the Apriori algorithm, in terms of execution time and the quality of the extracted association rules.
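The TF-IDF keyword selection step can be sketched as follows. This is generic TF-IDF over tokenised documents (down-weighting terms that occur everywhere, up-weighting locally frequent but globally rare ones), not the GARW weighting scheme itself.

```python
import math

def tf_idf(docs):
    """Compute TF-IDF weights for tokenised documents (lists of
    keywords): term frequency within the document times the log
    inverse document frequency across the collection."""
    n = len(docs)
    df = {}                      # document frequency per term
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        tf = {}
        for term in doc:
            tf[term] = tf.get(term, 0) + 1
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights
```

Terms whose weight survives a threshold would then label the documents and feed the association rule generation.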
Abstract: Different problems may cause distortion of the rotor,
and hence vibration, which is the most severe damage to turbine
rotors. Over the years, different techniques have been developed for
the straightening of bent rotors. The straightening method can be
selected according to initial information from preliminary inspections
and tests such as nondestructive tests, chemical analysis, run out tests
and also a knowledge of the shaft material. This article covers the
various causes of excessive bends and then some applicable common
straightening methods are reviewed. Finally, hot spotting is selected
for a particular bent rotor. A 325 MW steam turbine rotor is modeled and
finite element analyses are performed to investigate this straightening
process. Results of experimental data show that performing the exact
hot spot straightening process reduced the bending of the rotor
significantly.
Abstract: In this paper an algorithm is used to detect the color defects of ceramic tiles. First the image of a normal tile is clustered using GCMA, a Genetic C-means Clustering Algorithm, which yields the best cluster centers. C-means is a common clustering algorithm which optimizes an objective function based on a distance measure between the data points and the cluster centers in the data space; here the objective function describes the mean square error. After finding the best centers, each pixel of the image is assigned to the cluster with the closest cluster center. Then, the maximum error of each cluster is computed: for each cluster, the max error is the maximum distance between its center and all the pixels which belong to it. After computing the errors, all the pixels of the defective tile image are clustered based on the centers obtained from the normal tile image in the previous stage. Pixels whose distance from their cluster center is more than the maximum error of that cluster are considered defective pixels.
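The two stages described above (learning per-cluster maximum errors from the normal tile, then flagging out-of-range pixels on the test tile) can be sketched as follows. Scalar grey-level pixel values and the cluster centers are illustrative, and the GCMA step that finds the centers is omitted.

```python
def assign_and_max_error(pixels, centers):
    """Assign each normal-tile pixel value to its closest cluster
    center and record, per cluster, the maximum distance observed
    (the cluster's 'max error')."""
    dist = lambda p, c: abs(p - c)   # grey-level distance; use
                                     # Euclidean distance for RGB tuples
    max_err = [0.0] * len(centers)
    labels = []
    for p in pixels:
        k = min(range(len(centers)), key=lambda i: dist(p, centers[i]))
        labels.append(k)
        max_err[k] = max(max_err[k], dist(p, centers[k]))
    return labels, max_err

def defective_pixels(pixels, centers, max_err):
    """Flag test-tile pixels whose distance to their closest
    (normal-tile) center exceeds that cluster's max error."""
    dist = lambda p, c: abs(p - c)
    flagged = []
    for idx, p in enumerate(pixels):
        k = min(range(len(centers)), key=lambda i: dist(p, centers[i]))
        if dist(p, centers[k]) > max_err[k]:
            flagged.append(idx)
    return flagged
```

The per-cluster threshold is what distinguishes normal colour variation (inside the learned error radius) from defects.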
Abstract: Background: The widespread use of chemotherapeutic
drugs in the treatment of cancer has led to higher health hazards
among employees who handle and administer such drugs, so nurses
should know how to protect themselves, their patients and their work
environment against the toxic effects of chemotherapy. Aim: This study
was carried out to examine the effect of a chemotherapy safety protocol
for oncology nurses on their protective measure practices. Design: A
quasi experimental research design was utilized. Setting: The study
was carried out in oncology department of Menoufia university
hospital and Tanta oncology treatment center. Sample: A
convenience sample of forty-five nurses in Tanta oncology treatment
center and eighteen nurses in the Menoufia oncology department.
Tools: I. An interviewing questionnaire covering sociodemographic
data and assessing the unit and nurses' knowledge about
chemotherapy. II. An observational checklist to assess nurses' actual
practices of handling and administration of chemotherapy. Baseline
data were assessed before implementing the chemotherapy safety
protocol; the protocol was then implemented, and after 2 months the
nurses were assessed again. Results: revealed that 88.9%
of study group I and 55.6% of study group II improved to good total
knowledge scores after educating on the safety protocol, also 95.6%
of study group I and 88.9% of study group II had good total practice
score after educating on the safety protocol. Moreover, less than half
of group I (44.4%) reported that heavy workload was the main barrier
for them, while the majority of group II (94.4%) reported many
barriers to adhering to the safety protocol, such as not knowing the
protocol, the heavy workload and inadequate equipment.
Conclusions: The safety protocol for oncology nurses seemed to have
a positive effect on improving nurses' knowledge and practice.
Recommendation: A chemotherapy safety protocol should be instituted
for all oncology nurses who are working in any oncology unit and/or
center to enhance compliance, and this protocol should be repeated at
frequent intervals.
Abstract: Quality control charts are very effective in detecting
out-of-control signals, but when a control chart signals an
out-of-control condition of the process mean, searching for a special
cause in the vicinity of the signal time does not always lead to prompt
identification of the source(s) of the out-of-control condition, as the
change point in the process parameter(s) is usually different from the
signal time. It is very important for the manufacturer to determine at
what point, and which parameters, in the past caused the signal. Early
warning of process change would expedite the search for the special
causes and enhance quality at lower cost. In this paper the quality
variables under investigation are assumed to follow a multivariate
normal distribution with known means and variance-covariance
matrix, the process means after a one-step change remain at the new
level until the special cause is identified and removed, and it is
supposed that only one variable changes at a time.
This research applies artificial neural network (ANN) to identify the
time the change occurred and the parameter which caused the change
or shift. The performance of the approach was assessed through a
computer simulation experiment. The results show that the neural
network performs effectively, and equally well, across the whole range
of shift magnitudes considered.
Abstract: Clean air in a subway station is important to passengers. Platform Screen Doors (PSDs) can improve indoor air quality in the subway station; however, the air quality in the subway tunnel is degraded. The subway tunnel has a high CO2 concentration and indoor particulate matter (PM) value. The Indoor Air Quality (IAQ) level in the subway environment degrades as the frequency of train operation and the number of trains increase. The ventilation systems of the subway tunnel need improvements to achieve better air quality. Numerical analyses can be effective tools to analyze the performance of subway twin-track tunnel ventilation systems. An existing twin-track tunnel in the metropolitan Seoul subway system is chosen for the numerical simulations. The ANSYS CFX software is used for unsteady computations of the airflow inside the twin-track tunnel when a train moves. The airflow inside the tunnel is simulated when one train runs and when two trains run at the same time in the tunnel. The piston effect inside the tunnel is analyzed when all shafts function as natural ventilation shafts. The air supplied through the shafts mixes with the polluted air in the tunnel, and the polluted air is exhausted by the mechanical ventilation shafts. The supplied and discharged air volumes are balanced when only one train runs in the twin-track tunnel. The pollutant level in the tunnel is high when two trains run simultaneously in opposite directions and all shafts function as natural shafts, i.e., when there are no electrical power supplies in the shafts. The remaining polluted air inside the tunnel enters the station platform when the doors are opened.
Abstract: The role of entrepreneurs in growing the economy is
very important. Thus, nurturing entrepreneurship skills in
society is crucial and should start from an early age. One of the
methods is to teach through games, such as board games. A game
provides a fun and interactive platform for players to learn and play.
Besides that, today's world is moving towards an Islamic approach in
terms of finance, banking and entertainment, but Islamic-based games
are still hard to find in the market, especially games on
entrepreneurship. Therefore, there is a gap in this segment that can be
filled by learning entrepreneurship through a game. The objective of
this paper is to develop an entrepreneurship digital-based game
entitled "Catur Bistari" that is based on an Islamic business approach.
Knowledge and skill of entrepreneurship and Islamic business
approach will be learned through the tasks that are incorporated
inside the game.
Abstract: In this paper, based on an estimation of the Cauchy matrix of linear impulsive differential equations, using the Banach fixed point theorem and the Gronwall-Bellman inequality, some sufficient conditions are obtained for the existence and exponential stability of almost periodic solutions of Cohen-Grossberg shunting inhibitory cellular neural networks (SICNNs) with continuously distributed delays and impulses. An example is given to illustrate the main results.
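For context, a typical form of the impulsive SICNN with continuously distributed delays studied in this setting is the following; the paper's exact assumptions on the coefficients, delay kernels and impulse operators are not reproduced here:

```latex
\begin{aligned}
x'_{ij}(t) &= -a_{ij}(t)\,x_{ij}(t)
  - \sum_{C_{kl}\in N_r(i,j)} C_{ij}^{kl}(t)\, x_{ij}(t)
    \int_0^{\infty} K_{ij}(u)\, f\!\big(x_{kl}(t-u)\big)\,du
  + L_{ij}(t), && t \ne t_k, \\
\Delta x_{ij}(t_k) &= x_{ij}(t_k^{+}) - x_{ij}(t_k^{-})
  = I_k\!\big(x_{ij}(t_k)\big), && k \in \mathbb{Z}^{+}.
\end{aligned}
```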
Abstract: Airport capacity has always been perceived in the
traditional sense as the number of aircraft operations during a
specified time corresponding to a tolerable level of average delay and
it mostly depends on the airside characteristics, on the fleet mix
variability and on the ATM. The adoption of Directive
2002/30/EC in the EU countries, however, drives the stakeholders to
conceive of airport capacity in a different way. Airport capacity in
this sense is fundamentally driven by environmental criteria, and since
acoustical externalities represent the most important factors, they are
the ones that could pose a serious threat to the growth of airports and
of the aviation market itself in the short-to-medium term. The importance of
the regional airports in the deregulated market grew fast during the
last decade since they represent spokes for network carriers and a
preferential destination for low-fares carriers. Not only have regional
airports witnessed a fast and unexpected growth in traffic, but
also a fast growth in complaints about the nuisance from the people
living near those airports. In this paper the results of a study
conducted in cooperation with the airport of Bologna G. Marconi are
presented in order to investigate airport acoustical capacity as a de
facto constraint on airport growth.
Abstract: Overcurrent (OC) relays are the major protection
devices in a distribution system. The operating times of the OC relays
are to be coordinated properly to avoid the mal-operation of the
backup relays. The OC relay time coordination in ring fed
distribution networks is a highly constrained optimization problem
which can be stated as a linear programming problem (LPP). The
purpose is to find an optimum relay setting to minimize the time of
operation of relays and at the same time, to keep the relays properly
coordinated to avoid the mal-operation of relays.
This paper presents a two-phase simplex method for optimum time
coordination of OC relays. The method is based on the simplex
algorithm, which is used to find the optimum solution of the LPP. The
method introduces artificial variables to get an initial basic feasible
solution (IBFS). Artificial variables are removed using iterative
process of first phase which minimizes the auxiliary objective
function. The second phase minimizes the original objective function
and gives the optimum time coordination of OC relays.
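The LPP solved by the two-phase method can be stated in a common, illustrative form (the paper's exact constraint set is not reproduced): for inverse-time OC relays the operating time is linear in the time-dial setting, so with $t_i = \mathrm{TDS}_i\, K / (M_i^{\alpha} - 1)$ the problem reads

```latex
\begin{aligned}
\min\ & z = \sum_{i} t_i \\
\text{s.t. } & t_b - t_p \ \ge\ \mathrm{CTI}
  && \text{for every primary/backup relay pair } (p,\, b), \\
& \mathrm{TDS}_i^{\min} \le \mathrm{TDS}_i \le \mathrm{TDS}_i^{\max}
  && \text{for every relay } i .
\end{aligned}
```

Phase one appends artificial variables to the constraints and minimises their sum to obtain an initial basic feasible solution; phase two then minimises $z$ starting from that basis.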
Abstract: Recently, global concerns about energy security have
steadily been on the increase and are expected to become a major
issue over the next few decades. Energy security refers to a resilient
energy system. This resilient system would be capable of
withstanding threats through a combination of active, direct security
measures and passive or more indirect measures such as redundancy,
duplication of critical equipment, diversity in fuel, other sources of
energy, and reliance on less vulnerable infrastructure. Threats and
disruptions (disturbances) to one part of the energy system affect
another. The paper presents a methodology, with a theoretical
background, that treats the energy system as an interconnected network
and models the impact of energy supply disturbances on the network.
The proposed methodology uses
a network flow approach to develop mathematical model of the
energy system network as the system of nodes and arcs with energy
flowing from node to node along paths in the network.
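The node-and-arc flow model described above can be sketched with a standard maximum-flow computation (Edmonds-Karp); the node names and arc capacities below are illustrative. Reducing an arc's capacity then models a supply disturbance and the resulting drop in deliverable energy.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow on a network given as
    {node: {neighbour: arc capacity}}; returns the largest amount of
    energy the network can carry from source to sink."""
    # build residual capacities, including zero-capacity reverse arcs
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow                 # no augmenting path remains
        # recover the path and push its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

Comparing the flow before and after shrinking an arc quantifies how a disturbance to one part of the network affects the whole system.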