Abstract: Throughout history, people have made estimates and inferences about the future from their past experience. Advances in information technology and database management systems now make it possible to extract useful information from the knowledge at hand for strategic decisions, and various methods have been developed for this purpose. Association rule learning is one such data mining method. The Apriori algorithm, one of the best-known association rule learning algorithms, is not commonly applied to spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and so make Apriori a suitable technique for mining spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin, so the volume of the lake rises or falls with changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, rainfall and temperature measurements recorded in the Lake Van region over the years are processed with the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring along the coast of Lake Van. Identifying the likely causes of overflows and underflows may alert experts to take precautions and make the necessary investments.
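The frequent-itemset core of Apriori is compact enough to sketch. The version below is a generic textbook miner, not the authors' implementation, and the weather-style items in the test are invented for illustration.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return frequent itemsets (as frozensets) mapped to their support."""
    n = len(transactions)
    sets = [set(t) for t in transactions]

    def support(itemset):
        return sum(1 for t in sets if itemset <= t) / n

    # frequent 1-itemsets
    items = {i for t in sets for i in t}
    current = [c for c in (frozenset([i]) for i in items)
               if support(c) >= min_support]
    frequent = {c: support(c) for c in current}

    k = 2
    while current:
        # join step: combine frequent (k-1)-itemsets into k-candidates
        candidates = {a | b for a in current for b in current
                      if len(a | b) == k}
        # prune step: every (k-1)-subset must itself be frequent
        candidates = [c for c in candidates
                      if all(frozenset(s) in frequent
                             for s in combinations(c, k - 1))]
        current = [c for c in candidates if support(c) >= min_support]
        for c in current:
            frequent[c] = support(c)
        k += 1
    return frequent
```

With transactions such as {rain, overflow}, a minimum support of 0.6 keeps only itemsets that appear in at least 60% of the records.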
Abstract: This paper evaluates accrual-based scheduling for the cloud in single- and multi-resource systems. Numerous organizations benefit from cloud computing by hosting their applications in it. The cloud model provides on-demand access to computing with potentially unlimited resources. Scheduling is the mapping of tasks to resources toward some optimality goal: tasks are assigned to virtual machines over adaptable time slots, in sequence, under transaction logic constraints. A good scheduling algorithm improves CPU utilization, turnaround time, and throughput. In this paper, three real-time cloud service scheduling algorithms for single and multiple resources are investigated. Experimental results show that the resource matching algorithm outperforms the benefit-first scheduling, migration, and checkpoint algorithms for both single- and multi-resource scheduling.
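The abstract does not detail the resource matching algorithm, so as a hedged illustration only, a best-fit matching of task demands to VM capacities (task and VM names invented) might look like:

```python
def match_tasks(tasks, vms):
    """Greedy best-fit sketch: assign each task, largest demand first,
    to the VM with the smallest remaining capacity that still fits it."""
    free = dict(vms)  # vm name -> remaining capacity
    assignment = {}
    for name, demand in sorted(tasks.items(), key=lambda kv: -kv[1]):
        fitting = [(cap, vm) for vm, cap in free.items() if cap >= demand]
        if not fitting:
            continue  # no VM can host this task in this round
        _, best = min(fitting)          # tightest fit
        assignment[name] = best
        free[best] -= demand
    return assignment
```

Best-fit keeps large residual capacities free for later tasks, which is one plausible reading of "resource matching".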
Abstract: The Imperialist Competitive Algorithm (ICA) is a recent meta-heuristic, inspired by social evolution, for solving NP-hard problems. ICA is a population-based algorithm that has achieved strong performance in comparison to other meta-heuristics. This study develops an enhanced ICA approach to solve the Cell Formation Problem (CFP) using sequence data. In addition to the conventional ICA, an enhanced version, named EICA, applies local search techniques to add intensification aptitude and to embed the features of exploration and intensification more effectively. Suitable performance measures are used to compare the proposed algorithms with other powerful solution approaches in the literature. To check the proficiency of the algorithms, forty test problems are presented: five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. Computational results demonstrate the efficiency of EICA in solving CFPs.
Abstract: This paper develops a multiple channel assignment model that exploits spectrum opportunities in cognitive radio networks in the most efficient way. The developed scheme allows several assignments of available, frequency-adjacent channels, which require a larger bandwidth, under fairness conditions. The hybrid assignment model consists of two algorithms: one ranks and selects available frequency channels, while the other establishes Max-Min Fairness so as not to restrict the spectrum opportunities of the other secondary users who also seek to transmit. Measurements were made of average bandwidth and average delay, together with fairness computations for several channel assignments. The results were evaluated against experimental spectrum occupancy data captured from the GSM frequency band. The developed model shows improved use of spectrum opportunities and a wider average transmission bandwidth for each secondary user, while maintaining fairness in channel assignment.
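Max-Min Fairness is classically computed by progressive filling; the sketch below is that textbook procedure, not the paper's own algorithm, with capacity and demands in arbitrary bandwidth units.

```python
def max_min_fair(capacity, demands):
    """Progressive-filling max-min fair allocation of `capacity`
    among users with the given demands."""
    alloc = {u: 0.0 for u in demands}
    active = {u for u, d in demands.items() if d > 0}
    remaining = capacity
    while active and remaining > 1e-12:
        share = remaining / len(active)      # equal share of what is left
        for u in list(active):
            take = min(share, demands[u] - alloc[u])
            alloc[u] += take
            remaining -= take
            if alloc[u] >= demands[u] - 1e-12:
                active.discard(u)            # demand satisfied
    return alloc
```

Users with small demands are fully served; the rest split the remainder equally, which is exactly the "do not restrict the other secondary users" property the abstract describes.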
Abstract: This paper presents an extensive review of the literature on modelling techniques for sediment yield and hydrological modelling. Several studies relating to sediment yield are discussed, and research on sedimentation in rivers, runoff and reservoirs is presented. Different types of hydrological models, and the methods employed in selecting appropriate models for different case studies, are analysed. Applications of evolutionary algorithms and artificial intelligence techniques are discussed and compared, especially in water resources management and modelling. The review concentrates on Genetic Programming (GP) and fully discusses its theory and applications; successful applications of GP as a soft computing technique in sediment modelling are reviewed. Fundamental issues such as benchmarking, generalization ability, bloat, over-fitting and other open questions about the working principles of GP are highlighted. The paper concludes by identifying research gaps in hydrological modelling and sediment yield.
Abstract: This paper integrates the Octagon and Square Search pattern (OCTSS) motion estimation algorithm into the H.264/AVC (Advanced Video Coding) video codec in Adaptive Group of Pictures (AGOP) mode. The AGOP structure is computed from scene changes in the video sequence. The octagon and square search pattern block-based motion estimation method is implemented in the inter-prediction process of H.264/AVC. Together these methods reduce bit rate and computational complexity while maintaining the quality of the video sequence. Experiments are conducted on different types of video sequences. The results show that the bit rate, computation time and PSNR gain achieved by the proposed method are better than those of the existing H.264/AVC with fixed GOP and AGOP. With a marginal quality gain of 0.28 dB and an average bit-rate gain of 132.87 kbps, the proposed method reduces the average computation time by 27.31 minutes compared to the existing state-of-the-art H.264/AVC video codec.
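Block-based motion estimation of the kind OCTSS accelerates can be illustrated with a brute-force SAD search. The octagon/square pattern would restrict which (dy, dx) offsets are probed; an exhaustive small window is shown here only for clarity, and the frame data in the test is invented.

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def block(frame, y, x, n):
    """Extract the n-by-n block whose top-left corner is (y, x)."""
    return [row[x:x + n] for row in frame[y:y + n]]

def best_motion_vector(ref, cur, y, x, n=2, search=1):
    """Full-search block matching: the (dy, dx) offset inside the search
    window that minimises SAD against the reference frame."""
    target = block(cur, y, x, n)
    best, best_mv = float("inf"), (0, 0)
    h, w = len(ref), len(ref[0])
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy and yy + n <= h and 0 <= xx and xx + n <= w:
                cost = sad(block(ref, yy, xx, n), target)
                if cost < best:
                    best, best_mv = cost, (dy, dx)
    return best_mv
```

A pattern search such as OCTSS evaluates far fewer candidate offsets than this exhaustive loop, which is where the reported computation-time savings come from.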
Abstract: This paper describes a new approach to the problem of corona discharge in transmission systems. The study and field test comparisons were carried out on the Algerian Derguna-Setif transmission system. The transmission line, of nominal voltage 225 kV and 65 km length, uses twin-bundle conductors protected by two shield wires of transposed galvanized steel. An iterative finite-element method is used to solve Poisson's equation, and two algorithms are proposed for satisfying the current continuity condition and updating the space-charge density. The effect of varying the configuration and the number of wires is also investigated. This steady-state analysis is important in the design of HVDC transmission lines. The potential and electric field are calculated at singular points of the system.
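The iterative solution of Poisson's equation can be illustrated in one dimension with finite differences as a stand-in for the paper's finite-element scheme; this is a generic Gauss-Seidel sketch, and the grid size and boundary potentials below are arbitrary.

```python
def solve_poisson_1d(rho, eps0, h, v_left, v_right, iters=5000):
    """Gauss-Seidel iteration for d2V/dx2 = -rho/eps0 on a uniform grid
    of spacing h, with fixed (Dirichlet) boundary potentials."""
    n = len(rho)                       # number of interior nodes
    v = [v_left] + [0.0] * n + [v_right]
    for _ in range(iters):
        for i in range(1, n + 1):
            # discretised: (v[i-1] - 2 v[i] + v[i+1]) / h^2 = -rho/eps0
            v[i] = 0.5 * (v[i - 1] + v[i + 1] + h * h * rho[i - 1] / eps0)
    return v
```

With zero space charge this reduces to Laplace's equation, so the converged potential is a straight line between the boundary values, a quick sanity check for the iteration.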
Abstract: Recently, numerous documents containing large volumes of unstructured text have been created because of the rapid increase in the use of social media and the Internet. Usually, these documents are categorized for the convenience of users, but the accuracy of manual categorization is not guaranteed, and such categorization requires a large amount of time and incurs huge costs. Many studies on automatic categorization have been conducted to mitigate the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to complex documents with multiple topics, because they assume that each document can be assigned to a single category only. To overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, the learning process employed in these studies requires training on a multi-categorized document set, so these methods cannot be applied to the multi-categorization of most documents unless multi-categorized training sets built with traditional multi-categorization algorithms are provided. To overcome this limitation, in this study we review our methodology for extending the category of a single-categorized document to multiple categories, and then introduce a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Abstract: Clustering is the process of grouping data objects into clusters so that objects within the same cluster are similar to each other. Clustering is one of the main areas of data mining, and its algorithms can be classified as partitioning, hierarchical, density-based and grid-based. In this paper we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems as well as in deriving more robust and scalable clustering algorithms.
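The four surveyed algorithms all refine basic agglomerative clustering; a naive single-link version (not CURE, ROCK, CHAMELEON or BIRCH themselves, which add sampling, link counts, dynamic modelling and CF-trees respectively) can be sketched as:

```python
def single_link(points, k):
    """Naive agglomerative clustering: repeatedly merge the two clusters
    whose closest members are nearest, until k clusters remain."""
    clusters = [[p] for p in points]

    def dist(a, b):
        # single-link distance: closest pair across the two clusters
        return min(sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
                   for p in a for q in b)

    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters
```

This brute-force version is O(n^3); the scalability concerns the survey raises are precisely about avoiding such costs on large data sets.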
Abstract: Owing to rapid technological innovation, a tremendous amount of data is accumulating worldwide in every domain, including pattern recognition, machine learning, spatial data mining, image analysis, fraud analysis and the World Wide Web. This makes it essential to develop tools for the various data mining tasks. The major aim of this paper is to analyse the tools that are used to build resourceful analytical or descriptive models for handling large amounts of information efficiently and in a user-friendly way. In this survey, the various tools are described along with their technical paradigms, graphical interfaces and built-in algorithms, which make them useful for handling significant amounts of data.
Abstract: In recent years, new techniques have been proposed for solving complex engineering problems. One of these techniques is the JPSO algorithm. With innovative changes to the nature of the jump in JPSO, it is possible to construct a graph-based variant called G-JPSO. In this paper, this new algorithm is evaluated on the Fletcher-Powell optimal control problem and on the optimal control of pumps in a water distribution network. Optimal pump control consists of finding the optimal operating timetable (on/off status) for each pump over the desired time interval, with the maximum number of on/off switches for each pump imposed on the objective function as an additional constraint. To determine the optimal operation of the pumps, a model-based optimization-simulation algorithm was developed based on the G-JPSO and JPSO algorithms. The results of the proposed algorithm compare well with those of the ant colony, genetic and JPSO algorithms, which shows the robustness of the proposed algorithm in finding near-optimal solutions at reasonable computational cost.
Abstract: This paper introduces the concept and principles of data cleaning, analyses the types and causes of dirty data, and identifies the key steps of a typical cleaning process. It puts forward a scalable and versatile data cleaning framework and, for data with attribute dependency relations, designs several violation-discovery algorithms expressed as formal formulas, which can detect data inconsistent with a conditional attribute dependency over all target columns, whether the data is structured (SQL) or unstructured (NoSQL). Six data cleaning methods based on these algorithms are given.
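One way to read "violation data discovery" for an attribute dependency X -> A is: group rows on X and flag groups whose A values disagree. The sketch below is an illustrative reading with an invented zip -> city dependency, not the paper's formal algorithms.

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Rows violating the dependency lhs -> rhs: tuples that agree on
    the lhs attributes but disagree on the rhs attribute."""
    groups = defaultdict(set)
    for row in rows:
        key = tuple(row[a] for a in lhs)
        groups[key].add(row[rhs])
    # keys whose group carries more than one rhs value are inconsistent
    bad_keys = {k for k, vals in groups.items() if len(vals) > 1}
    return [row for row in rows
            if tuple(row[a] for a in lhs) in bad_keys]
```

The flagged rows are the candidates a cleaning method would then repair, e.g. by majority vote within each group or by consulting a reference table.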
Abstract: Evolution strategy (ES) is a well-known instance of evolutionary algorithms, and there have been many studies on ES. In this paper, the author proposes an extended ES for solving fuzzy-valued optimization problems. In the proposed ES, genotype values are not real numbers but fuzzy numbers, and the evolutionary processes are extended so that they can handle genotype instances with fuzzy numbers. The proposed method is experimentally applied to the evolution of neural networks with fuzzy weights and biases. Results reveal that fuzzy neural networks evolved using the proposed ES with fuzzy genotype values can model hidden target fuzzy functions even though no training data are explicitly provided. Next, the proposed method is evaluated with respect to variations in how fuzzy numbers are specified as genotype values. One of the most widely adopted fuzzy numbers is a symmetric triangular one, which can be specified either by its lower and upper bounds (LU) or by its center and width (CW). Experimental results reveal that the LU model contributes more to the fuzzy ES than the CW model, which indicates that the LU model should be adopted in future applications of the proposed method.
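The LU-versus-CW distinction concerns only how a symmetric triangular fuzzy number is encoded in the genotype. A minimal sketch of the two encodings, with an invented Gaussian mutation scale rather than the author's operator:

```python
import random

def mutate_lu(lower, upper, sigma=0.1):
    """Mutate a symmetric triangular fuzzy number in lower/upper (LU) form:
    perturb both bounds with Gaussian noise, then reorder so lower <= upper."""
    a = lower + random.gauss(0.0, sigma)
    b = upper + random.gauss(0.0, sigma)
    return (min(a, b), max(a, b))

def lu_to_cw(lower, upper):
    """Convert LU form to the equivalent center/width (CW) form."""
    return ((lower + upper) / 2.0, (upper - lower) / 2.0)
```

The repair step in `mutate_lu` (reordering the bounds) is one place the two encodings behave differently under mutation, which is plausibly related to the reported advantage of the LU model.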
Abstract: Careful scheduling of pump operations can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analysed, where the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than the maximum. The optimal operation of the pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm so as to minimise energy cost. The model integrates the JPSO optimizer with the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those from genetic and ant colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm is a versatile management model for the operation of real-world water distribution systems.
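An explicit schedule can be encoded as a binary on/off vector per pump. The sketch below, with invented tariff numbers, shows the switch-count constraint and the energy-cost term such an optimizer would evaluate; actual hydraulic feasibility would come from the EPANET solver, which is not modelled here.

```python
def pump_switches(schedule):
    """Number of on/off status changes in a binary pump schedule."""
    return sum(1 for a, b in zip(schedule, schedule[1:]) if a != b)

def energy_cost(schedule, tariff, power_kw):
    """Cost of running one pump: tariff summed over the hours it is on."""
    return sum(t for on, t in zip(schedule, tariff) if on) * power_kw

def feasible(schedule, max_switches):
    """Time-controlled trigger constraint: at most max_switches changes."""
    return pump_switches(schedule) <= max_switches
```

Fixing the maximum number of switches in advance keeps the search space small and avoids schedules that cycle pumps destructively often.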
Abstract: This paper addresses minimizing the makespan of the distributed permutation flow shop scheduling problem. In this problem there are several identical parallel factories, or flowshops, each with a series of similar machines. Each job must be allocated to one of the factories, and all of the operations of a job must be performed in the allocated factory. The problem has recently gained attention, and owing to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The main drawback of the majority of the proposed algorithms is their large computational time. In this study, a general variable neighborhood search (GVNS) algorithm is proposed into which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method of changing the shaking (perturbation) procedure depending on the progress of the incumbent solution, to prevent stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
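The GVNS loop itself is generic. A hedged skeleton, with toy neighborhoods and shaking rather than the paper's distributed-flowshop operators, might look like:

```python
import random

def gvns(x0, cost, neighborhoods, shake, iters=200, seed=1):
    """General VNS skeleton: shake in neighborhood k, run first-improvement
    descent over each neighborhood, move and reset k on improvement,
    otherwise widen k."""
    rng = random.Random(seed)
    x, k = x0, 0
    for _ in range(iters):
        y = shake(x, k, rng)             # perturb in the k-th neighborhood
        for move in neighborhoods:       # variable neighborhood descent
            improved = True
            while improved:
                improved = False
                for z in move(y):
                    if cost(z) < cost(y):
                        y, improved = z, True
                        break
        if cost(y) < cost(x):
            x, k = y, 0                  # success: restart from nearest shake
        else:
            k = min(k + 1, len(neighborhoods) - 1)
    return x
```

For the flowshop problem, `neighborhoods` would be job insertions and swaps within and across factories, and `shake` would adapt its strength with the incumbent's progress, as the abstract describes.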
Abstract: In applications such as image recognition and compression, segmentation refers to the process of partitioning a digital image into multiple segments, and is typically used to locate objects and boundaries (lines, curves, etc.) in images. Image segmentation classifies or clusters an image into several parts (regions) according to image features such as pixel value or frequency response. More precisely, it is the process of assigning a label to every pixel in an image such that pixels with the same label share certain visual characteristics. The result is a set of segments that collectively cover the entire image, or a set of contours extracted from the image. Many image segmentation algorithms have been proposed, for example to segment an image before recognition or compression, and they are extensively applied in science and daily life. By segmentation method, they can be roughly categorized into region-based segmentation, data clustering, and edge-based segmentation. In this paper, we present a study of several popular image segmentation algorithms.
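A minimal region-based segmenter, thresholding followed by 4-connected component labelling, illustrates the pixel-labelling definition above; the threshold and the toy image in the test are invented.

```python
def segment(image, threshold):
    """Region-based segmentation sketch: threshold the image, then label
    4-connected foreground components via iterative flood fill."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]   # 0 = background / unlabelled
    next_label = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and labels[y][x] == 0:
                next_label += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and image[cy][cx] >= threshold
                            and labels[cy][cx] == 0):
                        labels[cy][cx] = next_label
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, next_label
```

Every pixel receives a label, connected bright regions share one, and the labelled regions jointly cover the image, matching the definition of segmentation given above.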
Abstract: This study presents a hybrid metaheuristic algorithm for obtaining optimum designs of steel space buildings. The optimum design problem of three-dimensional steel frames is mathematically formulated according to the provisions of LRFD-AISC (Load and Resistance Factor Design of the American Institute of Steel Construction). Design constraints, such as the strength requirements of structural members, displacement limitations, inter-storey drift and other structural constraints, are derived from the LRFD-AISC specification. A hybrid algorithm combining teaching-learning based optimization (TLBO) and harmony search (HS) is employed to solve the stated optimum design problem. These two algorithms are among the recent additions to the metaheuristic techniques of numerical optimization and have proved efficient tools for solving discrete programming problems. Using them in collaboration creates a more powerful tool that mitigates each algorithm's weaknesses. To demonstrate the performance of the presented hybrid algorithm, the optimum design of a large-scale steel building is presented and the results are compared to those previously reported in the literature.
Abstract: Ecological systems are exposed to and influenced by various natural and anthropogenic disturbances, which produce various effects and states as the systems seek a symmetric response: a state of global phase coherence, or stability and balance of their food webs. This research project addresses the development of a computational methodology for modeling plankton food webs. The use of algorithms to establish connections, the generation of representative fuzzy multigraphs and the application of complex-network analysis techniques provide a set of tools for defining, analyzing and evaluating the community structure of coastal aquatic ecosystems, beyond the estimation of possible external impacts on the networks. This study thus aims to develop computational systems and data models to assess how these ecological networks are structurally and functionally organized, and to analyze the types and degree of compartmentalization and synchronization between the oscillatory, interconnected elements of the network, as well as the influence of disturbances on the overall pattern of rhythmicity of the system.
Abstract: In this paper we present a classification of the various technologies applied to the portfolio selection problem, organised by discipline and methodological framework. We provide a concise presentation of the resulting categories and try to identify which methods are considered obsolete and which lie at the heart of the current debate. On top of that, we provide a comparative study of the different technologies applied to efficient portfolio construction, and we suggest potential paths for future work that lie at the intersection of the presented techniques.
Abstract: Localization information is crucial for the operation of wireless sensor networks (WSNs). There are principally two types of localization algorithm. Range-based localization imposes strict requirements on the hardware and is thus expensive to implement in practice. Range-free localization reduces the hardware cost, but it achieves high accuracy only in ideal scenarios. In this paper, we locate unknown nodes by combining the advantages of these two types of method. The proposed algorithm has each unknown node select the nearest anchor using the Received Signal Strength Indicator (RSSI) and then choose the two other anchors that are most accurate for estimating its location. Simulation results demonstrate that our algorithm improves localization accuracy compared with previous algorithms.