Abstract: Landslides are slope processes that occur every year in Iran and other parts of the world, causing many casualties and financial losses. There are many methods for stabilizing landslides in soil and rock slopes, and selecting the best method, at the least cost and in the shortest time, is important for researchers. In this research, the best stabilization method is determined using a Decision Support System (DSS). A DSS was built for this purpose and applied to the Hasan Salaran area in Kurdistan. Field study data on topography, slope, geology, landslide geometry and related features were used. These data were entered into the decision-making management program (DSS, ALES). Analysis of mass stability indicated a present potential for instability. The research results show that surface and subsurface drainage is the best method of stabilization, and stability analysis shows that drainage yields an acceptable increase in the safety factor.
Abstract: The objective of this research is to develop operational-level performance indicators (PIs) for the Pre-hospital Emergency Medical Service (EMS) system employed in Thailand. This research started with ascertaining the current pre-hospital care system. The team analyzed the strategies of Narerthorn, a government unit under the Ministry of Public Health, and the existing PIs of pre-hospital care. Afterwards, the current National Strategic Plan of EMS development (2008-2012) of the Emergency Medical Institute of Thailand (EMIT) was examined using strategic analysis to develop a Strategy Map (SM) and identify the Success Factors (SFs). The analysis results from the strategy map and the SFs were used to develop the Performance Indicators (PIs). To verify the set of PIs, the team interviewed the relevant practitioners about the feasibility of implementing them. The aim was to ascertain that all the developed PIs support the objectives of the strategic plan. Nevertheless, the results showed that the operational-level PIs suited only the first dimension of the National Strategic Plan (infrastructure and information technology development). The corresponding SF was infrastructure development (delivering a standard, efficient EMS system to the whole population in both normal and disaster conditions). Finally, twenty-nine indicators were developed from the analysis results of the SM and SFs.
Abstract: This is an applied research study presenting the improvement of production quality using six sigma solutions and benefit-cost ratio analysis. The case of interest is the production of concrete tiles. The production had faced a high rate of nonconforming products caused by inappropriate surface coating, and had low process capability with respect to tile strength. Surface coating and tile strength are the characteristics most critical to the quality of this product. The improvement followed the five stages of six sigma. After the improvement, the production yield reached the 80% target, and the defective rate of the coating process was markedly reduced from 29.40% to 4.09%. The process capability with respect to strength increased from 0.87 to 1.08, meeting the customer requirement. The improvement saved material losses of 3.24 million baht (about 0.11 million dollars). The benefits of the improvement were analyzed from (1) the reduction in the number of nonconforming tiles, valued at factory price, for the surface coating improvement and (2) the materials saved through the increase in process capability. The benefit-cost ratio of the overall improvement was as high as 7.03. The investment showed no return during the define, measure, analyze and early improve stages, after which the ratio kept increasing. This is because the define, measure and analyze stages of six sigma mainly determine the causes of the problem and its effects rather than improve the process, and so yield no direct benefit. The benefit-cost ratio first appears in the improve stage and grows thereafter. Within each stage, the individual benefit-cost ratio was much higher than the cumulative one, since costs accumulate from the first stage of six sigma onward. Considering the benefit-cost ratio during an improvement project helps in making cost-saving decisions for similar activities during the improvement and for new projects. In conclusion, tracking the behavior of the benefit-cost ratio throughout the six sigma implementation period provides useful data for managing quality improvement with optimal effectiveness. This is an additional outcome beyond the regular practice of six sigma.
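The stage-wise behavior described above can be sketched numerically. In this minimal sketch the stage costs and benefits are hypothetical placeholders, not the case-study figures; only the qualitative pattern from the abstract is reproduced: costs accumulate from the define stage, benefits appear only from the improve stage onward, so the cumulative benefit-cost ratio stays at zero through analyze and then climbs.

```python
# Cumulative benefit-cost ratio across six sigma stages.
# Stage costs and benefits are hypothetical illustrative values.

def cumulative_bcr(stages):
    """Return {stage: cumulative benefit / cumulative cost}."""
    total_cost = 0.0
    total_benefit = 0.0
    ratios = {}
    for name, cost, benefit in stages:
        total_cost += cost
        total_benefit += benefit
        ratios[name] = total_benefit / total_cost
    return ratios

stages = [
    # (stage, cost, benefit) -- hypothetical values
    ("define",  1.0, 0.0),
    ("measure", 2.0, 0.0),
    ("analyze", 2.0, 0.0),
    ("improve", 3.0, 20.0),
    ("control", 1.0, 15.0),
]

ratios = cumulative_bcr(stages)
```

Because the denominator grows from the first stage while the numerator starts only at improve, the per-stage ratio within improve or control is always higher than the cumulative one, matching the behavior the abstract reports.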
Abstract: The objective of this paper is to study the discrepancy in electrical resistivity between field and laboratory measurements, in order to improve the effectiveness of data interpretation for geophysical ground resistivity surveys. A geological outcrop in Penang, Malaysia with an obvious layering contact was chosen as the study site. Two-dimensional geoelectrical resistivity imaging was used to map the resistivity distribution of the subsurface, and a few subsurface samples were obtained for laboratory analysis. The resistivity of the samples in their original condition was measured in the laboratory using the time-domain low-voltage technique for the granite core samples and a soil resistivity measuring set for the soil samples. The experimental results from both schemes are studied, analyzed, calibrated and verified, including their basis and correlation, degree of tolerance, and the characteristics of the substances. Consequently, the significant difference between the two schemes is explained comprehensively within this paper.
Abstract: Parallel programming models exist as an abstraction of hardware and memory architectures. Several parallel programming models are in common use: the shared memory model, thread model, message passing model, data parallel model, hybrid model, Flynn's models, the embarrassingly parallel computations model, and the pipelined computations model. These models are not specific to a particular type of machine or memory architecture. This paper presents a model program for a concurrent approach to the data parallel model using Java programming.
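The core of the data parallel model, applying the same operation concurrently to disjoint partitions of a data set and combining the partial results, can be sketched briefly. The paper's implementation is in Java; the sketch below uses Python for brevity, and the chunking scheme and sum-of-squares operation are illustrative choices, not the paper's program.

```python
# Minimal data-parallel sketch: one operation applied concurrently to
# disjoint chunks of the data, with partial results combined at the end.
from concurrent.futures import ThreadPoolExecutor

def chunk(data, n):
    """Split data into n roughly equal contiguous chunks."""
    k, m = divmod(len(data), n)
    out, start = [], 0
    for i in range(n):
        end = start + k + (1 if i < m else 0)
        out.append(data[start:end])
        start = end
    return out

def partial_sum_of_squares(part):
    # The "same operation" each worker applies to its own partition.
    return sum(x * x for x in part)

def parallel_sum_of_squares(data, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunk(data, workers)))
```

The defining property is that each worker runs the identical operation on its own partition with no data shared between partitions, which is what distinguishes the data parallel model from the task-parallel models listed above.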
Abstract: This paper presents a compact thermoelectric power generator system based on the temperature difference across its elements. The system transfers heat energy from burning directly into electric energy. The proposed system has a thermoelectric generator and a power control box. The generator contains 4 thermoelectric modules (TEMs), each of which uses 2 thermoelectric chips (TEs) and 2 cold sinks, 1 thermal absorber, and 1 thermal conduction flat board. The power control box contains 1 energy storage device, 1 converter, and 1 inverter. The total net generating power is about 11 W. The system uses commercial portable gas stoves, or burns timber or coal, as its heat source, which is easily obtained. It adopts solid-state thermoelectric chips as the heat-to-electricity conversion parts. The system has the advantages of being light-weight, quiet, and mobile, requiring no maintenance, and having an easily supplied heat source. The system can be used for as long as burning is allowed. It works well in highly mobile outdoor situations by providing power for illumination, entertainment equipment or wireless equipment at a refuge. Under heavy storms such as typhoons, when solar panels become ineffective and wind-powered machines malfunction, the thermoelectric power generator can continue providing vital power.
Abstract: This study reveals that anti-immigrant policies in
Europe result from a process of securitization, and that, within this
process, radical right parties have been formulating discourses and
approaches through a construction process by using some common
security themes. These security themes can be classified as national
security, economic security, cultural security and internal security.
The frequency with which radical right parties use these themes may
vary according to the specific historical, social and cultural
characteristics of a particular country.
Abstract: This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual Perfect Object, the one having the greatest outputs and smallest inputs. This allows for an explicit analytical solution and is a step toward an absolute measure of efficiency. This paper develops the approach further and introduces a DEA model with Partially Perfect Objects. DEA PPO consecutively eliminates the smallest relative inputs or greatest relative outputs, and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are combined to obtain a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking into account all of the inputs and outputs of each actual object. Firm evaluation is considered as an example.
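The Perfect Object construction can be illustrated with a small sketch. This is not the paper's analytical DEA PO solution: it simply builds the virtual object from the componentwise maximum outputs and minimum inputs and scores each actual object against it with equal indicator weights, an assumption made here purely for illustration.

```python
# Illustrative Perfect Object scoring (equal indicator weights assumed;
# not the paper's exact analytical DEA PO solution).

def dea_po_scores(inputs, outputs):
    """inputs, outputs: per-object lists of indicator values.
    Returns each object's equal-weight efficiency relative to the
    virtual Perfect Object (greatest outputs, smallest inputs)."""
    n_in = len(inputs[0])
    n_out = len(outputs[0])
    perfect_in = [min(x[i] for x in inputs) for i in range(n_in)]
    perfect_out = [max(y[r] for y in outputs) for r in range(n_out)]
    perfect_eff = sum(perfect_out) / sum(perfect_in)
    return [(sum(y) / sum(x)) / perfect_eff
            for x, y in zip(inputs, outputs)]
```

By construction the Perfect Object scores 1 and every actual object scores at most 1, which is what makes the Perfect Object a fixed benchmark for an absolute (rather than purely relative) efficiency measure.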
Abstract: A conventional binding method for low power in high-level synthesis mainly focuses on finding an optimal binding for assumed input data, and obtains only one binding table. In this paper, we show that a binding method using multiple binding tables obtains a better solution than the conventional methods using a single binding table, and we propose a dynamic bus binding scheme for low power using multiple binding tables. The proposed method finds multiple binding tables for the proper partitions of the input data, and switches binding tables dynamically to produce the minimum total switching activity. Experimental results show that the proposed method obtains a binding solution with 12.6-28.9% smaller total switching activity compared with the conventional methods.
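The selection criterion the scheme optimizes can be sketched as follows. The sketch counts bus switching activity as the number of bit toggles between consecutive words and, for each partition of the input stream, picks the binding table with the smallest activity; the word width, the example data, and the table encodings are all hypothetical, not the paper's benchmark circuits.

```python
# Switching activity = bit toggles between consecutive bus words; each
# partition gets the binding table minimizing that activity.
# Data and binding-table encodings below are hypothetical.

def switching_activity(words, width=8):
    """Total number of bit toggles between consecutive bus words."""
    mask = (1 << width) - 1
    return sum(bin((prev ^ cur) & mask).count("1")
               for prev, cur in zip(words, words[1:]))

def best_binding_per_partition(partitions, binding_tables):
    """binding_tables: dict name -> encoding applied to each word.
    Returns the chosen table name for every partition."""
    choices = []
    for part in partitions:
        name = min(binding_tables,
                   key=lambda n: switching_activity(
                       [binding_tables[n](w) for w in part]))
        choices.append(name)
    return choices
```

Because the best encoding depends on the data actually driven onto the bus, a partition whose values step sequentially favors a Gray-style encoding, while another partition may favor a different table, which is exactly why switching tables dynamically can beat any single fixed binding.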
Abstract: The blood pulse is an important human physiological signal commonly used for understanding an individual's physical health. Current methods of non-invasive blood pulse sensing require direct contact with or access to the human skin. As such, the performance of these devices tends to vary with time and is subject to human body fluids (e.g. blood, perspiration and skin oil) and environmental contaminants (e.g. mud, water, etc.). This paper proposes a simulation model for a novel method of non-invasive blood pulse acquisition using the disturbance created by blood flowing through a localized magnetic field. The simulation model geometry represents a blood vessel, a permanent magnet, a magnetic sensor, the surrounding tissues and air in two dimensions. In this model, the velocity and pressure fields in the blood stream are described by the Navier-Stokes equations, and the walls of the blood vessel are assumed to have a no-slip condition. The blood flow assumes a parabolic profile, corresponding to laminar flow in a major artery near the skin, and the inlet velocity follows a sinusoidal equation. This allows the computational software to compute the interactions between the magnetic vector potential generated by the permanent magnet and the magnetic nanoparticles in the blood. These interactions are simulated based on the Maxwell equations at the location where the magnetic sensor is placed. The simulated magnetic field at the sensor location is found to assume sinusoidal waveform characteristics similar to the inlet velocity of the blood. The amplitudes of the simulated waveforms at the sensor location are compared with physical measurements on human subjects and found to be highly correlated.
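The inlet boundary condition described above, a parabolic (laminar) profile across the vessel radius modulated by a sinusoidal pulse in time, can be written out directly. The peak velocity, vessel radius and pulse rate below are hypothetical placeholder values, not the study's simulation settings.

```python
# Inlet velocity: parabolic laminar profile times a sinusoidal pulse.
# v_peak, radius and pulse_hz are hypothetical placeholder values.
import math

def inlet_velocity(r, t, v_peak=0.3, radius=2e-3, pulse_hz=1.2):
    """Axial velocity (m/s) at radial position r (m) and time t (s)."""
    if abs(r) >= radius:
        return 0.0                             # no-slip at the vessel wall
    parabola = 1.0 - (r / radius) ** 2         # laminar (Poiseuille) shape
    pulse = 0.5 * (1.0 + math.sin(2.0 * math.pi * pulse_hz * t))
    return v_peak * parabola * pulse
```

Since the magnetic disturbance is driven by this inlet velocity, the simulated field at the sensor inherits the same sinusoidal time signature, which is the waveform similarity the abstract reports.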
Abstract: The objective of this paper is to estimate realistic principal extrusion process parameters by means of an artificial neural network. Conventionally, finite element analysis is used to derive process parameters. However, finite element analysis of the extrusion model does not consider the manufacturing process constraints in its modeling. Therefore, the process parameters obtained through such an analysis remain highly theoretical. Alternatively, process development in industrial extrusion is to a great extent based on trial and error and often involves full-size experiments, which are both expensive and time-consuming. Artificial neural network-based estimation of the extrusion process parameters prior to plant execution helps to make the actual extrusion operation more efficient, because more realistic parameters can be obtained. It thus bridges the gap between simulation and the real manufacturing execution system. In this work, a suitable neural network is designed and trained using an appropriate learning algorithm. The trained network is then used to predict the manufacturing process parameters.
Abstract: Sharing a manufacturing facility through remote operation and monitoring of a machining process is a challenge for effective use of the production facility. Several automation tools, in terms of both hardware and software, are necessary for successful remote operation of a machine. This paper presents a prototype workpiece-holding attachment for remote operation of the milling process through self-configuration of the workpiece setup. The prototype is designed with a mechanism that reorients the work surface into the machining spindle direction with high positioning accuracy. A variety of part geometries can be held by the attachment to perform single-setup machining. Pins in an array pattern additionally clamp the workpiece surface from two opposite directions to increase machining rigidity. The optimum pin configuration for conforming to the workpiece geometry with minimum deformation is determined through hybrid algorithms: Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The prototype with this intelligent optimization technique can hold a variety of workpiece geometries, which makes it suitable for machining low-volume repetitive production in remote operation.
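The hybrid GA/PSO idea, PSO velocity updates for swarm movement with a GA-style mutation for diversity, can be sketched on a toy problem. The objective below is the standard sphere function standing in for the paper's workpiece-deformation objective; the pin-placement encoding, coefficients and bounds are all illustrative assumptions, not the paper's formulation.

```python
# Toy hybrid GA/PSO sketch: PSO personal/global-best velocity updates
# plus a GA-style Gaussian mutation, minimizing a placeholder objective.
import random

def sphere(x):
    # Stand-in objective (a real setup would score pin-induced deformation).
    return sum(v * v for v in x)

def hybrid_ga_pso(dim=3, particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                if rng.random() < 0.05:        # GA-style mutation step
                    pos[i][d] += rng.gauss(0.0, 0.1)
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]           # personal best (elitist)
        gbest = min(pbest, key=sphere)[:]      # global best
    return gbest
```

The PSO update exploits the best configurations found so far, while the mutation step keeps the swarm from collapsing prematurely, the usual motivation for hybridizing the two algorithms.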
Abstract: Statistical learning theory was developed by Vapnik. It is a learning theory based on the Vapnik-Chervonenkis dimension, and it has also served as a good analytical tool in learning models. In general, learning models suffer from several problems, among them local optima and over-fitting. Statistical learning theory has the same problems, because the kernel type, kernel parameters, and regularization constant C are determined subjectively by the researcher's skill. We therefore propose an evolutionary statistical learning theory to settle these problems of the original statistical learning theory. Our theory is constructed by combining evolutionary computing with statistical learning theory. We verify the improved performance of the evolutionary statistical learning theory using data sets from the KDD Cup.
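The evolutionary search over the hand-tuned quantities, C and a kernel parameter, can be sketched generically. The fitness function below is a synthetic stand-in (a smooth peak at C = 10, gamma = 0.1) replacing the cross-validated accuracy of an actual trained model, and the evolutionary operators are generic choices, not the paper's specific algorithm.

```python
# Evolutionary search over (C, gamma): selection keeps the top half,
# children are averaged-crossover parents plus Gaussian mutation.
# fitness() is a synthetic stand-in for cross-validated accuracy.
import random

def fitness(c, gamma):
    # Hypothetical stand-in, peaked at C = 10, gamma = 0.1.
    return 1.0 / (1.0 + (c - 10.0) ** 2 + 100.0 * (gamma - 0.1) ** 2)

def evolve_params(pop_size=30, generations=60, seed=7):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 100.0), rng.uniform(0.001, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p), reverse=True)
        parents = pop[: pop_size // 2]         # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            (c1, g1), (c2, g2) = rng.sample(parents, 2)
            c = 0.5 * (c1 + c2) + rng.gauss(0.0, 1.0)   # crossover + mutation
            g = 0.5 * (g1 + g2) + rng.gauss(0.0, 0.01)
            children.append((max(c, 0.001), max(g, 1e-4)))
        pop = parents + children
    return max(pop, key=lambda p: fitness(*p))
```

Replacing the subjective choice of kernel settings with this kind of search is what turns the "art of the researcher" into an automated, repeatable procedure.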
Abstract: This paper reports a case study on how a conceptual and analytical thinking approach was used in the Art and Design Department at Multimedia University (Malaysia) to address the issues of one nation and their impact on society through artworks. The art project was designed for students to increase their know-how and develop creative thinking in design and communication. The goals of the design project were: (1) to develop creative thinking in design and communication, (2) to increase student understanding of the process of problem solving for design work, and (3) to use design elements and principles to generate interest, attention and emotional responses. An exhibition entitled "One Nation" was showcased to local and international viewers consisting of the general public, professionals, academics, artists and students. Findings indicate that the project supported several visual art standards, as well as generating awareness in society. This project may be of interest to current and future art educators and others interested in the potential of utilizing global issues as content for art, community and environment studies for the purpose of art education.
Abstract: Software Development Risks Identification (SDRI), using Fault Tree Analysis (FTA), is a proposed technique to identify not only the risk factors but also the causes of the appearance of risk factors in the software development life cycle. The method is based on analyzing the probable causes of software development failures before they become problems and adversely affect a project. It uses Fault Tree Analysis to determine the probability of particular system-level failures, as defined by the Taxonomy for Sources of Software Development Risk, deducing how an undesired system state arises from a series of lower-level events combined using Boolean logic. The major purpose of this paper is to use the probabilistic calculations of the Fault Tree Analysis approach to determine all possible causes that lead to the occurrence of software development risk.
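The probabilistic calculation at the heart of FTA can be shown in miniature: lower-level event probabilities are combined upward through Boolean gates, assuming the basic events are independent. The event names and probabilities in the example tree are illustrative, not entries from the taxonomy.

```python
# Fault-tree gate combinators for independent basic events.

def and_gate(*probs):
    """All child events must occur: product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """At least one child event occurs: complement of none occurring."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: a schedule-slip failure occurs if (unstable
# requirements AND weak change control) OR inadequate staffing.
p_top = or_gate(and_gate(0.4, 0.5), 0.1)
```

Evaluating the gates bottom-up like this yields the top-event probability directly, which is the quantity SDRI uses to rank the causes of software development risk.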
Abstract: The design of a modern aircraft rests on three pillars: theoretical results, experimental tests and computational simulations. As a result, Computational Fluid Dynamics (CFD) solvers are widely used in the aeronautical field. These solvers require the correct selection of many parameters in order to obtain successful results, and the computational time spent in the simulation depends on the proper choice of these parameters. In this paper we create an expert system capable of making an accurate prediction of the number of iterations and the time required for the convergence of a CFD solver. An artificial neural network (ANN) has been used to design the expert system. It is shown that the developed expert system is capable of accurately predicting the number of iterations and the time required for the convergence of a CFD solver.
Abstract: Software estimation accuracy is among the greatest challenges for software developers. This study aimed at building and evaluating a neuro-fuzzy model to estimate software project development time. Forty-one modules developed from ten programs were used as the dataset. Our proposed approach is compared with fuzzy logic and neural network models, and the results show that the MMRE (Mean Magnitude of Relative Error) obtained with the neuro-fuzzy model was substantially lower than the MMRE obtained with fuzzy logic and the neural network.
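The evaluation metric the comparison rests on is compact enough to state exactly: MMRE is the mean over projects of |actual − estimated| / actual. The effort values in the example are hypothetical, not the study's dataset.

```python
# MMRE: Mean Magnitude of Relative Error of effort estimates.

def mmre(actual, estimated):
    """Mean over projects of |actual - estimated| / actual."""
    return sum(abs(a - e) / a
               for a, e in zip(actual, estimated)) / len(actual)

actual = [10.0, 20.0, 40.0]      # hypothetical actual development times
estimated = [12.0, 18.0, 30.0]   # hypothetical model estimates
```

A lower MMRE means the model's relative errors are smaller on average, which is the sense in which the neuro-fuzzy model outperforms the fuzzy logic and neural network baselines.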
Abstract: Conventionally, the selection of parameters depends heavily on the operator's experience or on conservative technological data provided by the EDM equipment manufacturers, which yields inconsistent machining performance. The parameter settings given by the manufacturers are relevant only to common steel grades, and a single parameter change influences the process in a complex way. Hence, the present research proposes artificial neural network (ANN) models for the prediction of surface roughness in the electrical discharge machining (EDM) process, applied for the first time to Ti-15-3 alloy. The proposed models use peak current, pulse-on time, pulse-off time and servo voltage as input parameters. Multilayer perceptron (MLP) feedforward networks with three hidden layers are applied, and an assessment is carried out across models with distinct hidden layers. The models are trained with data from an extensive series of experiments utilizing a copper electrode with positive polarity. The predictions of the developed models have been verified against another set of experiments and are found to be in good agreement with the experimental results. Besides this, they can serve as valuable tools for process planning in EDM.
Abstract: In this paper a back-propagation artificial neural network (BPANN) is employed to predict the deformation in the upsetting process. To prepare a training set for the BPANN, a number of finite element simulations were carried out. The input data for the artificial neural network are a set of randomly generated parameters (aspect ratio d/h, material properties, temperature and coefficient of friction). The output data are the coefficients of the polynomial fitted to the barreling curves. The neural network was trained using barreling curves generated by finite element simulations of the upsetting and the corresponding material parameters. The technique was tested on three different specimens and can be successfully employed to predict the deformation in the upsetting process.
Abstract: The foundation of a tower crane serves to ensure stability against vertical and horizontal forces. If the foundation strength is not sufficient, the tower crane may be subject to overturning, shearing or foundation settlement. Therefore, the engineering review of stable support is a highly critical part of foundation design. However, there are not many professionals who can conduct an engineering review of a tower crane foundation and, where they exist, they have information only on the small number of cranes with which they have hands-on experience. It is also customary to rely on empirical knowledge and the tower crane renter's recommendations rather than designing the foundation on the basis of engineering knowledge. Therefore, a foundation design automation system considering not only lifting conditions but also overturning risk, shearing and vertical force may facilitate the production of foolproof foundation designs by experts and enable even non-experts to utilize professional knowledge that only experts can access now. This study proposes an automatic design algorithm for tower crane foundations considering vertical load and horizontal force.