Abstract: Model Predictive Control has previously been applied to supply chain problems with promising results; however, the systems proposed so far possessed no information on future demand. A forecasting methodology can improve the efficiency of control actions by providing insight into the future. This paper presents a complete supply chain management framework based on Model Predictive Control (MPC) and time series forecasting. The proposed framework is tested on industrial data in order to assess the efficiency of the method and the impact of forecast accuracy on the overall control performance of the supply chain. To this end, forecasting methodologies with different characteristics are applied to the test data to generate forecasts that serve as input to the MPC module.
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment such as pumps and turbines is used. This subsequently results in wasted water, electricity and further costs. Therefore, it is prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, can face certain limitations and challenges. Some of these include: limitations in equipment and facilities, space constraints, equipment errors including lack of adequate precision or mal-operation, and finally, human error. Research has shown that in order to achieve the ultimate goal of intake structure design – which is to design long-lasting and proficient structures – the best combination of sediment control structures (such as sills and submerged vanes) along with parameters that increase their performance (such as diversion angle and location) should be determined. Cost, difficulty of execution and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Consequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. Therefore, it is important that influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal performance of the intake, the advantages and disadvantages, and the efficiency of a given design are studied. Then, a multi-criteria decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.
Abstract: Loop detectors report traffic characteristics in real time and are at the core of the traffic control process. Intuitively,
one would expect that as density of detection increases, so would
the quality of estimates derived from detector data. However, as
detector deployment increases, the associated operating and
maintenance cost increases. Thus, traffic agencies often need to
decide where to add new detectors and which detectors should
continue receiving maintenance, given their resource constraints.
This paper evaluates the effect of detector spacing on freeway
travel time estimation. A freeway section (Interstate 15) in the Salt Lake City metropolitan region is examined. The research reveals
that travel time accuracy does not necessarily deteriorate with
increased detector spacing. Rather, the actual location of detectors
has far greater influence on the quality of travel time estimates.
The study presents an innovative computational approach that delivers optimal detector locations through a process based on a Genetic Algorithm formulation.
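The kind of Genetic Algorithm formulation referred to above can be sketched in miniature. Everything below is an illustrative assumption: 20 hypothetical candidate mileposts on a 10-mile section, 5 detectors to place, and a toy surrogate error (the largest uncovered gap) standing in for the real travel-time estimation error.

```python
import random

SITES = [0.5 * i for i in range(20)]   # hypothetical candidate mileposts
K = 5                                  # number of detectors to place

def coverage_error(subset):
    """Toy surrogate for travel-time error: the largest gap between
    consecutive detectors (section endpoints included)."""
    pts = sorted(subset)
    gaps = [b - a for a, b in zip([0.0] + pts, pts + [10.0])]
    return max(gaps)

def crossover(a, b, rng):
    """Child draws K distinct sites from the union of both parents."""
    return rng.sample(list(set(a) | set(b)), K)

def mutate(ind, rng):
    """Replace one site with an unused candidate site."""
    ind = ind[:]
    ind[rng.randrange(K)] = rng.choice([s for s in SITES if s not in ind])
    return ind

def optimize(pop_size=30, generations=80, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(SITES, K) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=coverage_error)           # elitist selection
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(crossover(*rng.sample(elite, 2), rng), rng)
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=coverage_error)

best = optimize()
print(sorted(best), coverage_error(best))
```

In the paper's setting, the fitness would instead score each candidate layout by the travel-time estimation error it yields on the observed freeway data.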
Abstract: Due to their high power-to-weight ratio and low cost,
pneumatic actuators are attractive for robotics and automation
applications; however, achieving fast and accurate position control has long been recognized as a complex control problem. A
methodology for obtaining high position accuracy with a linear
pneumatic actuator is presented. During experimentation with a number of classical PID control approaches over many operations of
the pneumatic system, the need for frequent manual re-tuning of the
controller could not be eliminated. The reason for this problem is
thermal and energy losses inside the cylinder body due to the
complex friction forces developed by the piston displacements.
Although PD controllers performed very well over short periods, it
was necessary in our research project to introduce some form of
automatic gain-scheduling to achieve good long-term performance.
We chose a fuzzy logic system to do this, which proved to be an
easily designed and robust approach. Since the PD approach showed
very good behaviour in terms of position accuracy and settling time,
it was incorporated into a modified form of the first-order Takagi-Sugeno fuzzy method to build an overall controller. This fuzzy gain-scheduler uses an input variable that automatically changes the PD gain values of the controller according to the frequency of repeated
system operations. Performance of the new controller was
significantly improved and the need for manual re-tuning was
eliminated without a decrease in performance. The controller operating with the above method will next be tested over a high-speed web network (GRID) for research purposes.
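A first-order Takagi-Sugeno gain scheduler of the kind described above can be sketched as a firing-strength-weighted blend of PD gain sets. The membership shapes, the gain values, and the use of an operation-frequency input (operations per minute) are assumptions for illustration, not the authors' tuned controller.

```python
def mu_low(f):
    """Membership of "low operation frequency" (assumed ramp over 0-30 ops/min)."""
    return max(0.0, min(1.0, (30.0 - f) / 30.0))

def mu_high(f):
    """Complementary membership of "high operation frequency"."""
    return 1.0 - mu_low(f)

def scheduled_gains(f):
    """Takagi-Sugeno blend: (Kp, Kd) weighted by normalized firing strengths."""
    rules = [(mu_low(f),  (8.0, 0.40)),   # assumed gains for a cold cylinder
             (mu_high(f), (5.0, 0.25))]   # assumed gains for a warmed-up cylinder
    w = sum(m for m, _ in rules)
    kp = sum(m * g[0] for m, g in rules) / w
    kd = sum(m * g[1] for m, g in rules) / w
    return kp, kd

def pd_control(error, d_error, f):
    """PD law with gains re-scheduled from the operation frequency f."""
    kp, kd = scheduled_gains(f)
    return kp * error + kd * d_error
```

With only two rules the scheduler reduces to a smooth interpolation between the two gain sets; more rules would partition the operating range more finely.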
Abstract: In this paper, we propose a new image segmentation approach for colour textured images. The proposed method consists of two stages. In the first stage, textural features using the gray level co-occurrence matrix (GLCM) are computed for the regions of interest (ROI) considered for each class; the ROI act as ground truth for the classes. The Ohta model (I1, I2, I3) is the colour model used for segmentation. The statistical mean feature at a certain inter-pixel distance (IPD) of the I2 component was found to be the optimal textural feature for further segmentation. In the second stage, the feature matrix obtained is assumed to be a degraded version of the image labels, and the unknown image labels are modeled as a Markov Random Field (MRF). The labels are estimated through the maximum a posteriori (MAP) estimation criterion using the ICM algorithm. The performance of the proposed approach is compared with that of existing schemes: JSEG and another scheme which uses GLCM and MRF in RGB colour space. The proposed method is found to outperform the existing ones in terms of segmentation accuracy with an acceptable rate of convergence. The results are validated on synthetic and real textured images.
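The GLCM mean feature named in the abstract above can be sketched with NumPy. The quantization to 8 gray levels, the horizontal offset direction, and the random test patch are illustrative assumptions; the paper applies the feature to the I2 component of the Ohta colour space.

```python
import numpy as np

def glcm(img, d=1, levels=8):
    """Normalized horizontal gray-level co-occurrence matrix at
    inter-pixel distance d."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)  # quantize
    m = np.zeros((levels, levels))
    for i, j in zip(q[:, :-d].ravel(), q[:, d:].ravel()):
        m[i, j] += 1                 # count co-occurring level pairs
    return m / m.sum()

def glcm_mean(p):
    """Mean feature: the sum over (i, j) of i * p(i, j)."""
    i = np.arange(p.shape[0])[:, None]
    return float((i * p).sum())

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(16, 16))   # stand-in for an ROI of I2 values
p = glcm(patch, d=2)                          # IPD = 2
print(glcm_mean(p))
```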
Abstract: The design of a gravity dam is performed through an iterative process involving a preliminary layout of the structure followed by a stability and stress analysis. This study presents a method to determine the optimal top width of a gravity dam with a genetic algorithm. To solve the optimization task (minimizing the cost of the
dam), an optimization routine based on genetic algorithms (GAs) was
implemented into an Excel spreadsheet. It was found to perform well
and GA parameters were optimized in a parametric study. Using the
parameters found in the parametric study, the top width of gravity
dam optimization was performed and compared to a gradient-based
optimization method (classic method). The results of the two methods were in close agreement. In the optimum dam cross section, the ratio of dam base width to dam height is almost equal to 0.85, and the ratio of dam top width to dam height is almost equal to 0.13. The computerized methodology may assist in computing the optimal top width for a wide range of gravity dam heights.
Abstract: For complete support of Quality of Service, it is desirable that the environment itself predict the resource requirements of a job using special methods in Grid computing. Exact and correct prediction enables exact matching of required resources with available resources. After the execution of each job, the resources used are saved in an active database named "History". First, some attributes are extracted from the submitted job and, according to a defined similarity algorithm, the most similar executed jobs are retrieved from "History"; using statistical measures such as linear regression or averaging, the resource requirements are then predicted. The new idea in this research is the use of an active database and centralized history maintenance. Implementation and testing of the proposed architecture yield prediction accuracies of 96.68% for the CPU usage of jobs, 91.29% for memory usage, and 89.80% for bandwidth usage.
Abstract: This paper shows that the application of probability-statistical methods is unfounded at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Studies of the changes in the skewness and kurtosis coefficient values show that the distributions of GTE operating parameters have a fuzzy character; consideration of fuzzy skewness and kurtosis coefficients is therefore expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for the preliminary identification of the engines' technical condition. Studies of the changes in correlation coefficient values also show their fuzzy character; therefore, the results of Fuzzy Correlation Analysis are proposed for model selection. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When sufficient information is available, a recurrent algorithm for identifying the technical condition of an aviation GTE (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition. As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
Abstract: Time series analysis often requires data that represents
the evolution of an observed variable in equidistant time steps. In
order to collect these data, sampling is applied. While continuous signals may be sampled, analyzed, and reconstructed by applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article we consider the discrete-event
simulation (DES) of job-shop-systems and study the effects of
different sampling rates on data quality regarding completeness and
accuracy of reconstructed inventory evolutions. In this context we discuss deterministic as well as non-deterministic behavior of system variables. Error curves are deployed to illustrate and discuss the sampling rate's impact and to derive recommendations for its well-founded choice.
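The sampling question studied above can be illustrated in miniature: a piecewise-constant inventory trajectory, generated by hypothetical discrete events, is sampled with a zero-order hold at different rates, and the worst reconstruction deviation is measured. All event data below are invented for illustration.

```python
import bisect

# Toy inventory evolution from discrete events: (time, new inventory level).
EVENTS = [(0.0, 5), (1.3, 6), (2.1, 4), (4.7, 7), (6.2, 3), (9.5, 8)]

def level_at(t):
    """Piecewise-constant inventory: the level set by the last event <= t."""
    i = bisect.bisect_right([e[0] for e in EVENTS], t) - 1
    return EVENTS[i][1]

def sample(dt, horizon=10.0):
    """Equidistant samples of the inventory level over [0, horizon]."""
    n = int(horizon / dt)
    return [(k * dt, level_at(k * dt)) for k in range(n + 1)]

def max_error(dt, probe=0.01, horizon=10.0):
    """Worst deviation between the true level and its zero-order-hold
    reconstruction from samples taken every dt time units."""
    samples = sample(dt, horizon)
    times = [t for t, _ in samples]
    err, t = 0.0, 0.0
    while t <= horizon:
        i = bisect.bisect_right(times, t) - 1   # last sample before t
        err = max(err, abs(level_at(t) - samples[i][1]))
        t += probe
    return err

print(max_error(0.5), max_error(2.0))
```

Note how the coarser rate can hide a level change entirely until the next sample, which is exactly the completeness issue the article examines.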
Abstract: A serial hierarchical support vector machine (SHSVM) is proposed to discriminate three brain tissues: white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF). SHSVM takes a novel classification approach by repeating the hierarchical classification on the data set iteratively. It uses a Radial Basis Function (RBF) kernel with different tunings to obtain accurate results. As a second approach, segmentation was also performed with the DAGSVM method. In this article, eight univariate features are extracted from the raw DTI data and all possible 2-D feature sets are examined within the segmentation process. SHSVM succeeded in obtaining DSI values higher than 0.95 for all three tissues, which are higher than the DAGSVM results.
Abstract: In Korean-English machine translation, it is an important task to classify the gender of names correctly. When a sentence is composed of two or more clauses and only one subject is given as a proper noun, it is important to find the gender of the proper noun for correct translation of the sentence. This is because a singular pronoun has a gender in English while it does not in Korean. Thus, in Korean-English machine translation, the gender of a proper noun should be determined. More generally, this task can be expanded into the classification of general Korean names. This paper proposes a statistical method for this problem. By considering a name as just a sequence of syllables, it is possible to gather statistics for each name from a collection of names. An evaluation of the proposed method shows an improvement in accuracy over simple lookup in the collection: while the accuracy of the lookup method is 64.11%, that of the proposed method is 81.49%. This implies that the proposed method is more plausible for the gender classification of Korean names.
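The syllable-statistics idea described above can be sketched as a naive-Bayes-style scorer over syllable counts. The romanized toy names, labels, and add-alpha smoothing below are illustrative assumptions; the actual system operates on Hangul syllable sequences drawn from a large name collection.

```python
import math
from collections import Counter

# Toy collection of (syllable sequence, gender) pairs -- invented examples.
NAMES = [(("mi", "suk"), "F"), (("ji", "suk"), "F"), (("mi", "ra"), "F"),
         (("cheol", "su"), "M"), (("min", "su"), "M"), (("yeong", "cheol"), "M")]

def train(names):
    """Per-gender syllable counts and gender priors."""
    syl = {"F": Counter(), "M": Counter()}
    prior = Counter()
    for syllables, g in names:
        syl[g].update(syllables)
        prior[g] += 1
    return syl, prior

def classify(syllables, syl, prior, alpha=1.0):
    """Score each gender by smoothed per-syllable log-probabilities."""
    vocab = {s for c in syl.values() for s in c}
    best = None
    for g in syl:
        n = sum(syl[g].values())
        score = math.log(prior[g] / sum(prior.values()))
        for s in syllables:
            score += math.log((syl[g][s] + alpha) / (n + alpha * len(vocab)))
        if best is None or score > best[1]:
            best = (g, score)
    return best[0]

syl, prior = train(NAMES)
print(classify(("mi", "ra"), syl, prior))
```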
Abstract: The internet has become an attractive avenue for
global e-business, e-learning, knowledge sharing, etc. Due to
continuous increase in the volume of web content, it is not practically
possible for a user to extract information by browsing and integrating
data from a huge amount of web sources retrieved by the existing
search engines. The semantic web technology enables advancement
in information extraction by providing a suite of tools to integrate
data from different sources. To take full advantage of semantic web,
it is necessary to annotate existing web pages into semantic web
pages. This research develops a tool, named OWIE (Ontology-based Web Information Extraction), for semantic web annotation using domain-specific ontologies. The tool automatically extracts information from HTML pages with the help of pre-defined ontologies and gives them a semantic representation. Two case studies have been
conducted to analyze the accuracy of OWIE.
Abstract: In real-field applications, correct determination of voice segments greatly improves overall system accuracy and minimises the total computation time. This paper presents reliable measures of speech compression by detecting the end points of the speech signals prior to compressing them. The two different compression schemes used are the Global threshold and the Level-Dependent threshold techniques. The performance of the proposed method is tested with the Signal-to-Noise Ratio, Peak Signal-to-Noise Ratio, and Normalized Root Mean Square Error measures.
Abstract: Facial features are frequently used to represent local
properties of a human face image in computer vision applications. In
this paper, we present a fast algorithm that can extract the facial
features online such that they can give a satisfying representation of a
face image. It includes one step for a coarse detection of each facial
feature by AdaBoost and another one to increase the accuracy of the
found points by Active Shape Models (ASM) in the regions of interest.
The resulting facial features are evaluated by matching with artificial face models in physiognomy applications. The distance between the features and those in the face models from the database is measured by means of the Hausdorff distance. In the experiments, the proposed method shows efficient performance in facial feature extraction and in the online physiognomy system.
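The Hausdorff distance used above for matching feature sets against face models is, by its standard definition, the larger of the two directed distances between point sets. A minimal sketch:

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two 2-D point sets."""
    def directed(P, Q):
        # For each point of P, the distance to its nearest point of Q;
        # then the worst such distance.
        return max(min(math.dist(p, q) for q in Q) for p in P)
    return max(directed(A, B), directed(B, A))

print(hausdorff([(0, 0), (10, 0)], [(0, 0)]))  # 10.0: (10, 0) is far from B
```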
Abstract: Square pipes (pipes with square cross sections) are used for various industrial purposes, such as machine
structure components and housing/building elements. Their use is expanding rapidly and widely; hence, the output of these pipes is increasing and new application fields are continually developing.
Due to various recent demands, the products have to satisfy demanding specifications with high dimensional accuracy. The design of the reshaping process for pipes with square cross sections, however, is still performed by trial and error, based on experts' experience.
In this paper, a computer-aided simulation is developed based on
the 2-D elastic-plastic method with consideration of the shear
deformation to analyze the reshaping process. The effect of various parameters, such as the diameter of the circular pipe and the mechanical properties of the metal, on product dimensions and quality can be evaluated using this simulation. Moreover, aspects of the reshaping process design, including determination of the cross-section shrinkage, the necessary number of stands, the roll radius, and the pipe height at each stand, are investigated. Further, it is shown that there is good agreement between the results of the design method and the experimental results.
Abstract: Since most aviation products face the problem of fatigue fracture in vibration environments, this paper makes use of the test results of a bracket, analyses the structure with ANSYS Workbench, predicts the life of the bracket in different ways, and compares the predictions with the test results. Through this study of analysis methods, simulation analysis and testing are organically combined, which not only ensures the accuracy of the simulation analysis and life prediction, but also enables dynamic supervision of the product life process and promotes the application of finite element simulation analysis in engineering practice.
Abstract: Text categorization is the problem of classifying text
documents into a set of predefined classes. In this paper, we
investigated three approaches to build a meta-classifier in order to
increase the classification accuracy. The basic idea is to learn a meta-classifier that optimally selects the best component classifier for each
data point. The experimental results show that combining classifiers
can significantly improve the accuracy of classification and that our
meta-classification strategy gives better results than each individual
classifier. For 7083 Reuters text documents, we obtained classification accuracies of up to 92.04%.
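The idea of a meta-classifier that picks the best component classifier per data point can be sketched on toy one-dimensional data. The two rule-based component classifiers and the nearest-neighbour selection below are illustrative assumptions, not the paper's Reuters setup or its actual meta-learning approaches.

```python
# Toy labelled data: class "A" below 5, class "B" from 5 upward.
TRAIN = [(x, "A" if x < 5 else "B") for x in range(10)]

def clf_low(x):   # component 1: reliable on small x, wrong near the boundary
    return "A" if x < 7 else "B"

def clf_high(x):  # component 2: reliable on large x, wrong near the boundary
    return "A" if x < 3 else "B"

COMPONENTS = [clf_low, clf_high]

def meta_select(x, k=3):
    """Pick the component with the best accuracy on the k nearest
    training points -- a nearest-neighbour meta-classifier."""
    neigh = sorted(TRAIN, key=lambda p: abs(p[0] - x))[:k]
    def local_acc(clf):
        return sum(clf(px) == py for px, py in neigh)
    return max(COMPONENTS, key=local_acc)

def predict(x):
    return meta_select(x)(x)

print(predict(6), predict(2))
```

Here each component alone is only 80% accurate on the training set, but the meta-selector routes each point to the component that is reliable in that region and classifies every training point correctly.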
Abstract: Sharing a manufacturing facility through remote operation and monitoring of a machining process is a challenge for effective use of the production facility. Several automation tools, in terms of both hardware and software, are necessary for successful remote operation of a machine. This paper presents a prototype workpiece-holding attachment for remote operation of a milling process through self-configuration of the workpiece setup. The prototype is designed with a mechanism to reorient the work surface toward the machining spindle direction with high positioning accuracy. A variety of part geometries can be held by the attachment to perform single-setup machining. Pins in an array pattern additionally clamp the workpiece surface from two opposite directions to increase machining rigidity. The optimum pin configuration for conforming to the workpiece geometry with minimum deformation is determined through hybrid algorithms, Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The prototype with this intelligent optimization technique can hold a wide variety of workpiece geometries, which makes it suitable for machining low-volume repetitive production in remote operation.
Abstract: Software estimation accuracy is among the greatest
challenges for software developers. This study aimed at building and
evaluating a neuro-fuzzy model to estimate software projects
development time. The forty-one modules developed from ten programs were used as the dataset. Our proposed approach is compared with fuzzy logic and neural network models, and the results show that the MMRE (Mean Magnitude of Relative Error) value obtained with the neuro-fuzzy model was substantially lower than the MMRE obtained with fuzzy logic and with the neural network.
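MMRE, the evaluation metric named above, is simply the mean of the magnitudes of relative error over the estimated modules. A one-function sketch (the numbers are made up):

```python
def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |actual - predicted| / actual."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

print(mmre([10.0, 20.0, 40.0], [12.0, 18.0, 40.0]))  # relative errors 0.2, 0.1, 0.0
```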
Abstract: In this paper, an improved ant colony optimization
(ACO) algorithm is proposed to enhance the performance of global
optimum search. The strategy of the proposed algorithm has the capabilities of fuzzy pheromone updating, adaptive parameter tuning, and a resetting mechanism. The proposed method is utilized to tune the
parameters of the fuzzy controller for a real beam and ball system.
Simulation and experimental results indicate that better performance can be achieved compared with conventional ACO algorithms in terms of convergence speed and accuracy.