Abstract: Recent research in neural networks and neuroscience on
modeling complex time series and statistical learning has focused
mostly on learning from high-dimensional input signals. Local linear
models are a strong choice for modeling local nonlinearity in data
series, and locally weighted projection regression (LWPR) is a
flexible and powerful algorithm for nonlinear approximation in
high-dimensional signal spaces. In this paper, different learning
scenarios for one- and two-dimensional data series with different
distributions are investigated by simulation; noise is then added to
the data to produce differently disordered time series distributions
and to evaluate the algorithm's ability to predict local nonlinearity.
The performance of the algorithm is simulated, and its sensitivity to
the data distribution, when the data are widely dispersed or when few
data are available, is examined together with the influence of the
key locality-validity parameter under different data distributions.
Abstract: Company managers are constantly looking for new
opportunities to succeed in today's fiercely competitive market.
Maintaining a place among the successful companies on the market
today, or coming up with a revolutionary business idea, is much more
difficult than before. Every new or improved method, tool, or
approach that can improve the functioning of business processes, or
even of the entire system, is worth checking and verifying. The use
of simulation in the design of manufacturing systems and their
management in practice is one way, without increased risk, to find
the optimal parameters of manufacturing processes and systems. The
paper presents an example of the use of simulation to solve a
bottleneck problem in a specific company.
Abstract: Job scheduling plays an important role in the efficient
utilization of grid resources available across different domains and
geographical zones. Scheduling of jobs is challenging and NP-complete.
Evolutionary and swarm intelligence algorithms have been
extensively used to address this NP-complete problem in grid scheduling.
Artificial Bee Colony (ABC) has been proposed for optimization
problems based on foraging behaviour of bees. This work proposes a
modified ABC algorithm, Cluster Heterogeneous Earliest First Min-
Min Artificial Bee Colony (CHMM-ABC), to optimally schedule
jobs for the available resources. The proposed model utilizes a novel
Heterogeneous Earliest Finish Time (HEFT) Heuristic Algorithm
along with Min-Min algorithm to identify the initial food source.
Simulation results show the performance improvement of the
proposed algorithm over other swarm intelligence techniques.
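To illustrate the Min-Min seeding step mentioned above, a minimal sketch of the classical Min-Min heuristic follows; the expected-time-to-compute (ETC) matrix and the scheduling model are assumptions of this illustration, not the paper's CHMM-ABC implementation.

```python
# Hypothetical illustration of the classical Min-Min heuristic used
# (with HEFT) to seed an initial food source for ABC-style search.
# etc[i][j] = expected time to compute job i on resource j (assumed data).

def min_min(etc):
    """Assign each job to a resource using the Min-Min heuristic."""
    n_jobs, n_res = len(etc), len(etc[0])
    ready = [0.0] * n_res          # when each resource next becomes free
    unscheduled = set(range(n_jobs))
    schedule = {}                  # job -> (resource, completion time)
    while unscheduled:
        best = None                # (completion time, job, resource)
        for j in unscheduled:
            # best-case completion time of job j over all resources
            c, r = min((ready[r] + etc[j][r], r) for r in range(n_res))
            if best is None or c < best[0]:
                best = (c, j, r)
        c, j, r = best             # commit the job with the minimum minimum
        ready[r] = c
        schedule[j] = (r, c)
        unscheduled.remove(j)
    return schedule, max(ready)    # assignment and resulting makespan
```

Each iteration commits the job whose best-case completion time is smallest, which tends to keep the makespan low when many short jobs are available.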
Abstract: The development, operation and maintenance of
Integrated Waste Management Systems (IWMS) essentially affect
the sustainability of every region. The features of such systems
have a great influence on all of the components of sustainability. In
order to optimize the processes, a comprehensive mapping of the
variables affecting the future efficiency of the system is needed,
including analysis of the interconnections among the components and
modeling of their interactions. The planning of an IWMS is based
fundamentally on technical and economic opportunities and the legal
framework. Modeling the sustainability
and operation effectiveness of a certain IWMS is not in the scope of
the present research. The complexity of the systems and the large
number of the variables require the utilization of a complex approach
to model the outcomes and future risks. This complex method should
be able to evaluate the logical framework of the factors composing
the system and the interconnections between them. The authors of
this paper studied the usability of the Fuzzy Cognitive Map (FCM)
approach for modeling the future operation of IWMSs. The approach
requires two input data sets. One is the connection matrix, containing
all the factors affecting the system in focus together with all their
interconnections. The other input data set is a time series: a
retrospective reconstruction of the weights and roles of the factors.
This paper introduces a novel method to develop time series by
content analysis.
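To make the FCM machinery concrete, a toy sketch of a synchronous FCM update follows; the sigmoid squashing function, the self-memory term, and any weights supplied to it are assumptions of this illustration rather than the paper's model.

```python
import math

def fcm_step(state, W, lam=1.0):
    """One synchronous FCM update with a sigmoid squashing function."""
    n = len(state)
    nxt = []
    for i in range(n):
        # weighted influence of all other factors on factor i,
        # plus the factor's own previous value (memory term)
        a = state[i] + sum(W[j][i] * state[j] for j in range(n) if j != i)
        nxt.append(1.0 / (1.0 + math.exp(-lam * a)))   # squash to (0, 1)
    return nxt

def fcm_run(state, W, steps=50, tol=1e-6):
    """Iterate the map until it settles or a step budget is exhausted."""
    for _ in range(steps):
        nxt = fcm_step(state, W)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            break
        state = nxt
    return state
```

Given a connection matrix W, repeated updates drive the factor activations toward a fixed point (or a cycle), which is the quantity such an analysis would inspect.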
Abstract: In this paper, Bayesian online inference models for
data series are constructed by a change-point algorithm, which
separates the observed time series into independent segments and
studies changes of regime in the data through the associated
statistical characteristics. Variations in the statistical
characteristics of time series data often mark distinct phenomena in
the same dynamical system, such as a change of brain state reflected
in measured EEG signals or an important regime change in many other
dynamical systems. In this paper, a prediction algorithm for locating
change points in time series data is simulated. It is verified that
the assumed distribution of the data is an important factor in
obtaining simpler and smoother fluctuations of the hazard-rate
parameter and in better identifying change-point locations. Finally,
the conditions under which the time series distribution affects the
factors in this approach are explained and validated on different
time series databases from several dynamical systems.
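As a rough illustration of the kind of online change-point machinery the abstract describes, the sketch below implements a compact run-length recursion with a constant hazard rate and a Gaussian observation model with known variance; these modeling choices are assumptions of the illustration, not the paper's algorithm.

```python
import math

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bocpd(data, hazard=1 / 50, obs_var=1.0, prior_var=10.0):
    """Return the most probable run length after each observation."""
    # r[k] = posterior probability that the current run has length k
    r = [1.0]
    mus, vars_ = [0.0], [prior_var]   # per-run-length posterior over the mean
    map_runs = []
    for x in data:
        # predictive density of x under each run-length hypothesis
        pred = [gauss_pdf(x, m, v + obs_var) for m, v in zip(mus, vars_)]
        growth = [r[k] * pred[k] * (1 - hazard) for k in range(len(r))]
        cp = sum(r[k] * pred[k] * hazard for k in range(len(r)))
        r = [cp] + growth
        z = sum(r)
        r = [p / z for p in r]
        # conjugate Normal update of the mean for every run length;
        # run length 0 restarts from the prior
        new_mus, new_vars = [0.0], [prior_var]
        for m, v in zip(mus, vars_):
            k = v / (v + obs_var)
            new_mus.append(m + k * (x - m))
            new_vars.append(v * (1 - k))
        mus, vars_ = new_mus, new_vars
        map_runs.append(max(range(len(r)), key=r.__getitem__))
    return map_runs
```

On a series that jumps from one level to another, the maximum a posteriori run length collapses right after the jump, which is how change-point locations are read off.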
Abstract: This paper presents a novel statistical description of
the counterpoise effective length due to lightning surges, where the
(impulse) effective length is obtained by means of regression
formulas applied to transient simulation results. The effective
length is described in terms of a statistical distribution function, from
which median, mean, variance, and other parameters of interest could
be readily obtained. The influence of lightning current amplitude,
lightning front duration, and soil resistivity on the effective length has
been accounted for, assuming statistical nature of these parameters. A
method for determining the optimal counterpoise length, in terms of
the statistical impulse effective length, is also presented. It is based on
estimating the number of dangerous events associated with lightning
strikes. The proposed statistical description and the associated method
provide valuable information that can aid the design engineer in
optimising the physical lengths of counterpoises for different grounding
arrangements and soil resistivity conditions.
Abstract: Text mining techniques are generally applied to
classify text and to find fuzzy relations and structures in data
sets. This research draws on a range of text mining capabilities. One
common application is text classification and event extraction,
which encompasses deducing specific knowledge about incidents
referred to in texts. The main contribution of this paper is the
clarification of a concept graph generation mechanism, which is based
on a text classification and optimal fuzzy relationship extraction.
Furthermore, the work presented in this paper explains the application
of fuzzy relationship extraction and branch and bound (BB) method
to simplify the texts.
Abstract: Chemical Reaction Optimization (CRO) is an
optimization metaheuristic inspired by the nature of chemical
reactions as a natural process of transforming the substances from
unstable to stable states. Starting with some unstable molecules with
excessive energy, a sequence of interactions takes the set to a state of
minimum energy. Researchers reported successful application of the
algorithm in solving some engineering problems, like the quadratic
assignment problem, with superior performance when compared with
other optimization algorithms. We adapted this optimization
algorithm to the Printed Circuit Board Drilling Problem (PCBDP)
towards reducing the drilling time and hence improving the PCB
manufacturing throughput. Although the PCBDP can be viewed as an
instance of the popular Traveling Salesman Problem (TSP), it has
some characteristics that would require special attention to the
transitions that explore the solution landscape. Experimental test
results using the standard CROToolBox are not promising for
practically sized problems, although it could find optimal solutions for
artificial problems and small benchmarks as a proof of concept.
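Since the abstract frames PCB drilling as a TSP over hole coordinates, a tiny baseline tour builder (nearest neighbour followed by 2-opt improvement) is sketched below for orientation; it is a generic TSP heuristic, not the CRO algorithm the paper adapts.

```python
import math

def tour_length(pts, tour):
    """Closed-tour length over a list of (x, y) drill points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(pts):
    """Greedy construction: always drill the closest unvisited hole."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(pts[last], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(pts, tour):
    """Repeatedly reverse segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(pts, cand) < tour_length(pts, tour):
                    tour, improved = cand, True
    return tour
```

Metaheuristics such as CRO are typically benchmarked against simple baselines of this kind on small instances.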
Abstract: Environmental impacts of six 3D printers using
various materials were compared to determine if material choice
drove sustainability, or if other factors such as machine type, machine
size, or machine utilization dominate. Cradle-to-grave life-cycle
assessments were performed, comparing a commercial-scale FDM
machine printing in ABS plastic, a desktop FDM machine printing in
ABS, a desktop FDM machine printing in PET and PLA plastics, a
polyjet machine printing in its proprietary polymer, an SLA machine
printing in its polymer, and an inkjet machine hacked to print in salt
and dextrose. All scenarios were scored using ReCiPe Endpoint H
methodology to combine multiple impact categories, comparing
environmental impacts per part made for several scenarios per
machine. Results showed that most printers’ ecological impacts were
dominated by electricity use, not materials, and that the changes in
electricity use due to different plastics were not significant compared
to the variation from one machine to another. Variation in machine idle
time most strongly determined impacts per part. However, material
impacts were quite important for the inkjet printer hacked to print in
salt: in its optimal scenario, it had as little as 1/38th the impacts per
part of the worst-performing machine in the same scenario. If salt
parts were infused with epoxy to make them more physically robust,
then much of this advantage disappeared, and material impacts
actually dominated or equaled electricity use. Future studies should
also measure DMLS and SLS processes / materials.
Abstract: A feeding experiment was conducted to determine the
optimum dietary protein and lipid levels for juvenile fancy carp. Eight
experimental diets were formulated to contain four protein levels (200,
300, 400 and 500 g kg-1) with two lipid levels (70 and 140 g kg-1).
Triplicate groups of fish (initial weight, 12.1±0.2 g fish-1) were
hand-fed the diets to apparent satiation for 8 weeks. Fish growth
performance, feed utilization and feed intake were significantly
(P<0.05) affected by dietary protein level. Weight gain and feed
efficiency ratio tended to increase as dietary protein level increased
up to 400 and 500 g kg-1, respectively. Daily feed intake decreased
with increasing dietary protein level, and that of fish fed the diet
containing 500 g kg-1 protein was significantly lower than that of the
other fish groups. The protein
efficiency ratio of fish fed 400 and 500 g kg-1 protein was lower than
that of fish fed 200 and 300 g kg-1 protein. Moisture, crude protein and
crude lipid contents of muscle and liver were significantly affected by
dietary protein, but not by dietary lipid level (P>0.05). The increase in
dietary lipid level resulted in an increase in linoleic acid in liver and
muscle, paralleled by a decrease in n-3 highly unsaturated fatty acid
content in the muscle of fish. Considering these results, it was concluded
that the diet containing 400 g kg-1 protein with 70 g kg-1 lipid level is
optimal for growth and efficient feed utilization of juvenile fancy carp.
Abstract: This paper presents optimization of makespan for ‘n’
jobs and ‘m’ machines flexible job shop scheduling problem with
sequence dependent setup time using genetic algorithm (GA)
approach. A restart scheme has also been applied to prevent the
premature convergence. Two case studies are taken into
consideration. Results are obtained by considering crossover
probability (pc = 0.85) and mutation probability (pm = 0.15). Five
simulation runs are performed for each case study, and the minimum
value among them is taken as the optimal makespan. Results indicate that
optimal makespan can be achieved with more than one sequence of
jobs in a production order.
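A toy version of the GA loop described above is sketched below, with order crossover, swap mutation and the stated pc = 0.85 and pm = 0.15; the fitness used here is a simplified single-machine makespan with sequence-dependent setup times, which is an assumption of the sketch and much simpler than the flexible job shop model in the paper.

```python
import random

def makespan(seq, proc, setup):
    """Toy single-machine makespan with sequence-dependent setup times."""
    t, prev = 0.0, None
    for j in seq:
        t += (setup[prev][j] if prev is not None else 0) + proc[j]
        prev = j
    return t

def ga_schedule(proc, setup, pop_size=30, gens=100, pc=0.85, pm=0.15, seed=1):
    rng = random.Random(seed)
    n = len(proc)
    fit = lambda s: makespan(s, proc, setup)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=fit)
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection of two parents
            a, b = (min(rng.sample(pop, 3), key=fit) for _ in range(2))
            child = a[:]
            if rng.random() < pc:        # order crossover (OX)
                i, j = sorted(rng.sample(range(n), 2))
                mid = a[i:j]
                rest = [g for g in b if g not in mid]
                child = rest[:i] + mid + rest[i:]
            if rng.random() < pm:        # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
        cand = min(pop, key=fit)
        if fit(cand) < fit(best):        # keep the best sequence ever seen
            best = cand
    return best, fit(best)
```

Because several job sequences can share the same makespan, repeated runs of such a GA can return different optimal orders, which matches the abstract's observation.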
Abstract: Operations research (OR) has had considerable success
in developing and applying scientific methods for problem solving
and decision-making. By using OR techniques, we can enhance the use
of computerized decision support systems to achieve optimal
management of institutions. OR applies comprehensive analysis,
including all factors that affect the problem, and builds mathematical
models to solve business or organizational problems. In addition, it
improves decision-making and uses available resources efficiently.
The adoption of OR by universities would definitely contribute to
the development and enhancement of the performance of OR
techniques. This paper provides an understanding of the structures,
approaches and models of OR in problem solving and decision-making.
Abstract: This paper provides a comparative study on the
performances of standard PID and adaptive PID controllers tested on
travel angle of a 3-Degree-of-Freedom (3-DOF) Quanser bench-top
helicopter. Quanser, a well-known manufacturer of educational
bench-top helicopters, has developed a Proportional-Integral-
Derivative (PID) controller with a Linear Quadratic Regulator (LQR)
for the travel, pitch and yaw angles of the bench-top helicopter. The
performance of the PID controller is relatively good; however, it
could be improved further if the controller is combined with an
adaptive element. The objective of this research is to design an
adaptive PID controller and then compare the performance of the
adaptive PID with that of the standard PID. The controller design and
testing focus on travel angle control only. The adaptive method used
in this project is a self-tuning controller, in which the controller’s
parameters are updated online. Two adaptive algorithms, pole-placement
and deadbeat, have been chosen as the methods for obtaining optimal
controller parameters. Performance comparisons have shown that
the adaptive (deadbeat) PID controller produces more desirable
performance than the standard PID and adaptive (pole-placement)
controllers. The adaptive (deadbeat) PID controller attained a very
fast settling time (5 seconds) and a very small overshoot (5%
to 7.5%) for 10° to 30° step changes of the travel angle.
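For orientation, a discrete PID loop of the standard (non-adaptive) kind compared in the paper is sketched below against a toy first-order plant; the gains, plant dynamics and sample time are invented and do not reproduce the Quanser 3-DOF helicopter model or the self-tuning algorithms.

```python
def simulate_pid(kp, ki, kd, setpoint=10.0, dt=0.01, steps=2000):
    """Run a PID loop against the toy plant y' = -a*y + b*u; return y over time."""
    a, b = 1.0, 0.5                # assumed plant parameters
    y, integ, prev_err = 0.0, 0.0, setpoint
    trace = []
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt                      # integral of the error
        deriv = (err - prev_err) / dt          # finite-difference derivative
        u = kp * err + ki * integ + kd * deriv # PID control law
        prev_err = err
        y += (-a * y + b * u) * dt             # explicit Euler step of the plant
        trace.append(y)
    return trace
```

A self-tuning adaptive PID would re-estimate kp, ki and kd online from the measured response instead of keeping them fixed as this sketch does.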
Abstract: In this work, MLP-type neural network methods were
applied to a database from an array of six sensors for the detection of
three toxic gases. The choice of the number of hidden layers and the
weight values strongly influences the convergence of the learning
algorithm. In this article, we propose a mathematical formula to
determine the optimal number of hidden layers and suitable weight
values based on the method of back-propagation of errors. The results
of this modeling improved the discrimination of these gases and
optimized the computation time. The model presented here has
proven to be an effective application for the fast identification of
toxic gases.
Abstract: Taking the design tolerance into account, this paper
presents a novel efficient approach to generate iso-scallop tool path for
five-axis strip machining with a barrel cutter. The cutter location is
first determined on the scallop surface instead of the design surface,
and then the cutter is adjusted to the optimal tool position, based
on a differential rotation of the tool axis, while simultaneously
satisfying the design tolerance. The machining strip width and error are
calculated with the aid of the grazing curve of the cutter. Based on the
proposed tool positioning algorithm, the tool paths are generated by
keeping the scallop height formed by adjacent tool paths constant. An
example is conducted to confirm the validity of the proposed method.
Abstract: Real-time image and video processing is in demand in
many computer vision applications, e.g. video surveillance, traffic
management and medical imaging. The processing of such video
applications requires high computational power. Thus, the optimal
solution is the collaboration of CPU and hardware accelerators. In
this paper, a Canny edge detection hardware accelerator is proposed.
Edge detection is one of the basic building blocks of video and image
processing applications. It is a common block in the pre-processing
phase of image and video processing pipeline. Our presented
approach targets offloading the Canny edge detection algorithm from
processing system (PS) to programmable logic (PL), taking
advantage of the High-Level Synthesis (HLS) tool flow to accelerate the
implementation on the Zynq platform. The resulting implementation
enables up to a 100x performance improvement through hardware
acceleration. CPU utilization drops, and the frame rate reaches
60 fps for a 1080p full-HD input video stream.
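Canny's pipeline begins with gradient estimation; the sketch below shows only that stage (Sobel filtering plus magnitude thresholding) in plain Python as a software reference point, omitting the smoothing, non-maximum suppression and hysteresis steps that the full hardware accelerator also performs.

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def convolve3(img, k, y, x):
    """Apply a 3x3 kernel k centred at pixel (y, x)."""
    return sum(k[i][j] * img[y + i - 1][x + j - 1]
               for i in range(3) for j in range(3))

def sobel_edges(img, threshold=100.0):
    """Return a binary edge map from gradient magnitude thresholding."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):          # skip the one-pixel border
        for x in range(1, w - 1):
            gx = convolve3(img, SOBEL_X, y, x)
            gy = convolve3(img, SOBEL_Y, y, x)
            if math.hypot(gx, gy) >= threshold:
                edges[y][x] = 1
    return edges
```

The nested per-pixel loops are exactly the regular, data-parallel workload that HLS maps well onto programmable logic, which is why offloading this stage pays off.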
Abstract: Near infrared (NIR) spectroscopy has always been of
great interest in the food and agriculture industries. The development
of prediction models has facilitated the estimation process in recent
years. In this study, 110 crude palm oil (CPO) samples were used to
build a free fatty acid (FFA) prediction model. 60% of the collected
data were used for training purposes and the remaining 40% for
testing. The visible peaks on the NIR spectrum were at 1725 nm and
1760 nm, indicating the existence of the first overtone of C-H bands.
Principal component regression (PCR) was applied to the data in
order to build this mathematical prediction model. The optimal
number of principal components was 10. The results showed
R2=0.7147 for the training set and R2=0.6404 for the testing set.
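A minimal PCR sketch follows to illustrate the method's two stages (projection, then regression); for simplicity it extracts a single principal component by power iteration, whereas the paper retains ten, and the data it would run on are synthetic rather than NIR spectra.

```python
def first_pc(X, iters=200):
    """Leading principal component of centred data X via power iteration."""
    n, p = len(X), len(X[0])
    v = [1.0] * p
    for _ in range(iters):
        # w = (X^T X) v without forming the covariance matrix explicitly
        s = [sum(X[i][j] * v[j] for j in range(p)) for i in range(n)]
        w = [sum(X[i][k] * s[i] for i in range(n)) for k in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def pcr_fit(X, y):
    """Centre X, project onto the first PC, and fit y = a*score + b."""
    n, p = len(X), len(X[0])
    mx = [sum(row[j] for row in X) / n for j in range(p)]
    Xc = [[row[j] - mx[j] for j in range(p)] for row in X]
    v = first_pc(Xc)
    t = [sum(Xc[i][j] * v[j] for j in range(p)) for i in range(n)]  # scores
    my = sum(y) / n
    # least-squares slope of y against the component scores
    a = sum(ti * (yi - my) for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return lambda row: my + a * sum((row[j] - mx[j]) * v[j] for j in range(p))
```

Retaining more components, as the paper's choice of ten does, simply repeats the projection step on the residual directions before the regression.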
Abstract: Electricity spot prices are highly volatile under
optimal generation capacity scenarios due to factors such as the
non-storability of electricity, peak demand at certain periods, generator
outages, fuel uncertainty for renewable energy generators, and the
huge investments and time needed for generation capacity expansion.
As a result, market participants are exposed to price and volume risk,
which has led to the development of risk management practices. This
paper provides an overview of risk management practices by market
participants in electricity markets using financial derivatives.
Abstract: In this work, we propose and analyze a model of
Phytoplankton-Zooplankton interaction with harvesting considering
that some species are exploited commercially for food. Criteria for
local stability, instability and global stability are derived and some
threshold harvesting levels are explored to maintain the population
at an appropriate equilibrium level even if the species are exploited
continuously. Further, biological and bionomic equilibria of the system
are obtained, and an optimal harvesting policy is also analysed using
Pontryagin's Maximum Principle. Finally, the analytical findings are
supported by numerical simulations.
Abstract: Nowadays, autonomous mobile robots have found
applications in diverse fields. An autonomous robot system must be
able to behave intelligently in complex and changing environments.
This work evaluates the performance of path planning and navigation
of an autonomous mobile robot using Gravitational Search Algorithm
(GSA), Simulated Annealing (SA) and Particle Swarm Optimization
(PSO) based intelligent controllers in an unstructured environment.
The approach finds not only a valid collision-free path but also an
optimal one. The main aim of the work is to minimize the length of
the path and the duration of travel from a starting point to a target
while moving in an unknown environment with obstacles, without
collision. Finally, a comparison is made between the three
controllers; it is found that the path length and travel time achieved
by the robot using GSA are better than those of the SA and PSO based
controllers for the same task.