Abstract: We propose an information filtering system that selects index words from a document set based on the topics the set contains. The method narrows the vocabulary down to the words that are particularly characteristic of the document set, and the topics are obtained by Sparse Non-negative Matrix Factorization. In information filtering, a document is often represented as a vector whose elements are the weights of the index words, and the dimension of this vector grows as the number of documents increases; as a result, words that are useless as index words for information filtering may be included. To address this problem, the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in the document set, which we obtain by applying Sparse Non-negative Matrix Factorization to the set. Filtering is carried out using the centroid of the learning document set, which is regarded as the user's interest; the centroid is represented as a document vector whose elements are the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: when an appropriate number of index words is selected, it improves recommendation accuracy over previous methods. We also examined the index words selected by our proposal and found that it is able to select index words covering some minor topics included in the document set.
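The selection pipeline above (factor the term-document matrix into topics, then keep each topic's strongest words as the index vocabulary) can be sketched in a few lines. This is a minimal illustration with plain multiplicative-update NMF on a toy matrix; the paper uses a sparse NMF variant and a real collection (MEDLINE), and all names and sizes here are invented.

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    """Plain multiplicative-update NMF: V (terms x docs) ~ W @ H."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 1e-3
    H = rng.random((k, V.shape[1])) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def select_index_words(V, vocab, k, words_per_topic):
    """Keep the highest-weighted words of each topic as index words."""
    W, _ = nmf(V, k)
    selected = set()
    for t in range(k):
        top = np.argsort(W[:, t])[::-1][:words_per_topic]
        selected.update(vocab[i] for i in top)
    return sorted(selected)

# Toy term-document matrix: six words, four documents, two topics.
vocab = ["gene", "protein", "cell", "graph", "vertex", "edge"]
V = np.array([[3, 2, 0, 0],   # gene
              [2, 3, 0, 0],   # protein
              [1, 1, 0, 0],   # cell
              [0, 0, 3, 2],   # graph
              [0, 0, 2, 3],   # vertex
              [0, 0, 1, 1]],  # edge
             dtype=float)
index_words = select_index_words(V, vocab, k=2, words_per_topic=2)
print(index_words)
```

Documents (and the interest centroid) would then be re-expressed over `index_words` only, shrinking the vector dimension.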
Abstract: The authors present an algorithm for order reduction of linear time-invariant dynamic systems using the combined advantages of eigen spectrum analysis and error minimization by the particle swarm optimization technique. The pole centroid and system stiffness of both the original and reduced-order systems remain the same in this method to determine the poles, whereas the zeros are synthesized by minimizing the integral square error between the transient responses of the original and reduced-order models, pertaining to a unit step input, using the particle swarm optimization technique. It is shown that the algorithm has several advantages, e.g. the reduced-order models retain the steady-state value and stability of the original system. The algorithm is illustrated with the help of two numerical examples and the results are compared with other existing techniques.
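The error-minimization half of this scheme (fix the reduced model's poles, then search its numerator by minimizing the integral square error between unit-step responses) can be illustrated with a toy reduction. This is a sketch, not the paper's algorithm: the pole is fixed by hand rather than via eigen-spectrum analysis, and the second-order original system is an arbitrary example.

```python
import numpy as np

# Original 2nd-order system G(s) = (s+4)/((s+1)(s+2)); its unit-step
# response, by partial fractions, is y(t) = 2 - 3e^-t + e^-2t.
t = np.linspace(0.0, 10.0, 1001)
dt = t[1] - t[0]
y_full = 2.0 - 3.0 * np.exp(-t) + np.exp(-2.0 * t)

def ise(k):
    """Integral square error between full and reduced step responses,
    with reduced model G_r(s) = k/(s+1) (pole fixed by hand here)."""
    y_red = k * (1.0 - np.exp(-t))
    return float(np.sum((y_full - y_red) ** 2) * dt)

def pso(cost, lo, hi, n=20, iters=60, seed=1):
    """Minimal 1-D particle swarm optimiser."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)          # particle positions
    v = np.zeros(n)                     # velocities
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pcost)]         # global best
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        g = pbest[np.argmin(pcost)]
    return float(g)

k_opt = pso(ise, 0.0, 5.0)
print(round(k_opt, 3))  # near 2, so the DC gain (steady state) is retained
```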
Abstract: In this paper, we deal with the Steiner tree problem
(STP) on a graph in which a fuzzy number, instead of a real number,
is assigned to each edge. We propose a modification of the shortest
paths approximation based on the fuzzy shortest paths (FSP)
evaluations. Since a fuzzy min operation using the extension
principle leads to nondominated solutions, we propose another
approach to solving the FSP using Cheng's centroid point fuzzy
ranking method.
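A centroid-ranked fuzzy shortest path can be sketched with Dijkstra's algorithm on triangular fuzzy edge weights, which are added component-wise and compared through a centroid value. This is a simplified stand-in: Cheng's method ranks by the distance from the origin to the centroid point, which we approximate here for triangular numbers by the x-coordinate (a+b+c)/3; the graph and weights are invented.

```python
import heapq

def centroid(tfn):
    """Centroid x-coordinate of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def add(u, v):
    """Component-wise addition of triangular fuzzy numbers."""
    return (u[0] + v[0], u[1] + v[1], u[2] + v[2])

def fuzzy_dijkstra(graph, src):
    """Dijkstra where fuzzy path lengths are ordered by their centroids."""
    dist = {src: (0.0, 0.0, 0.0)}
    pq = [(0.0, src)]
    done = set()
    while pq:
        _, node = heapq.heappop(pq)
        if node in done:
            continue
        done.add(node)
        for nxt, w in graph.get(node, []):
            cand = add(dist[node], w)
            if nxt not in dist or centroid(cand) < centroid(dist[nxt]):
                dist[nxt] = cand
                heapq.heappush(pq, (centroid(cand), nxt))
    return dist

# Toy graph: each edge weight is a triangular fuzzy number (a, b, c).
graph = {
    "s": [("a", (1, 2, 3)), ("b", (2, 4, 6))],
    "a": [("t", (1, 2, 3))],
    "b": [("t", (0, 1, 2))],
}
dist = fuzzy_dijkstra(graph, "s")
print(dist["t"])
```

Replacing the fuzzy min of the extension principle with a crisp ranking of centroids is exactly what makes the comparison in the relaxation step well defined.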
Abstract: Nejad and Mashinchi (2011) proposed a revision for ranking fuzzy numbers based on the areas of the left and right sides of a fuzzy number. However, this method still has some shortcomings, such as a lack of discriminative power in ranking similar fuzzy numbers and no guarantee of consistency between the ranking of fuzzy numbers and the ranking of their images. To overcome these drawbacks, we propose an epsilon-deviation degree method based on the left and right areas of a fuzzy number and the concept of the centroid point. The main advantage of the new approach is the development of an innovative index value which can be used to consistently evaluate and rank fuzzy numbers. Numerical examples are presented to illustrate the efficiency and superiority of the proposed method.
Abstract: Various intelligences and inspirations have been adopted into iterative searching processes called meta-heuristics. They intelligently perform exploration and exploitation in the solution space, aiming to efficiently seek near-optimal solutions. In this work, the bee algorithm, inspired by the natural foraging behaviour of honey bees, was adapted to find near-optimal solutions of a transportation management problem, dynamic multi-zone dispatching. This problem allows for uncertain and changing customer demand. In striving to remain competitive, a transportation system should therefore be flexible in order to cope with changes in customer demand in terms of in-bound and out-bound goods and technological innovations. To maintain a high service level at low management cost via a minimal-imbalance scenario, the rearrangement penalty of the area in each zone, across time periods, is also included. However, the performance of the algorithm depends on appropriate parameter settings, which need to be determined and analysed before implementation. The BEE parameters are determined through the linear constrained response surface optimisation method (LCRSOM) and the weighted centroid modified simplex method (WCMSM). Experimental results were analysed in terms of the best solutions found so far, and the mean and standard deviation of the imbalance values, including the convergence of the solutions obtained. It was found that the results obtained using the LCRSOM were better than those using the WCMSM; however, the average execution time of an experimental run using the LCRSOM was longer than that using the WCMSM. Finally, a recommendation of proper level settings of the BEE parameters for some selected problem sizes is given as a guideline for future applications.
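The foraging loop of the bee algorithm (scouts sample the space, the best sites attract recruit bees that search shrinking neighbourhoods, the remaining bees keep scouting) can be sketched on a stand-in objective. The dispatching objective, zones, and penalties from the abstract are not modeled; a simple quadratic function and all parameter values below are illustrative.

```python
import random

def bees_algorithm(f, lo, hi, n=30, m=5, e=2, nep=7, nsp=3, ngh=1.0,
                   iters=60, seed=7):
    """Minimal bees algorithm for minimising f on [lo, hi]^2."""
    rng = random.Random(seed)
    scout = lambda: [rng.uniform(lo, hi), rng.uniform(lo, hi)]
    sites = sorted((scout() for _ in range(n)), key=f)
    for _ in range(iters):
        new_sites = []
        for i, site in enumerate(sites[:m]):
            bees = nep if i < e else nsp       # more bees on elite sites
            patch = [[min(hi, max(lo, c + rng.uniform(-ngh, ngh)))
                      for c in site] for _ in range(bees)]
            new_sites.append(min(patch + [site], key=f))
        # remaining bees scout new random sites
        new_sites += [scout() for _ in range(n - m)]
        sites = sorted(new_sites, key=f)
        ngh *= 0.95                            # shrink the neighbourhood
    return sites[0]

best = bees_algorithm(lambda p: p[0] ** 2 + p[1] ** 2, -5.0, 5.0)
print(best)
```

The parameters `n, m, e, nep, nsp, ngh` are exactly the kind of settings the abstract tunes via LCRSOM and WCMSM.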
Abstract: By taking advantage of both k-NN, which is highly accurate, and K-means clustering, which is able to reduce classification time, we introduce Cluster-k-Nearest Neighbor as a "variable k"-NN dealing with the centroid or mean point of each subclass generated by the clustering algorithm. In general, the K-means clustering algorithm is not stable in terms of accuracy; for that reason we develop another algorithm for clustering our space which gives higher accuracy than K-means clustering, a smaller number of subclasses, stability, and bounded classification time with respect to the data size. We find between 96% and 99.7% accuracy in the classification of 6 different types of time series using the K-means clustering algorithm, and 99.7% using the new clustering algorithm.
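The Cluster-k-NN idea (cluster the training data, label each centroid, then classify new points by their nearest centroid) can be sketched as follows, assuming plain K-means with one cluster per class on toy 2-D features; the paper's improved clustering algorithm and real time-series data are not reproduced.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means; returns centroids and point assignments."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def classify(x, centroids, centroid_class):
    """Assign x to the class of its nearest cluster centroid."""
    d = np.linalg.norm(centroids - x, axis=1)
    return centroid_class[int(d.argmin())]

# Two well-separated classes (toy stand-ins for time-series features).
rng = np.random.default_rng(42)
X = np.vstack([rng.normal([0, 0], 0.3, (20, 2)),
               rng.normal([5, 5], 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
centroids, labels = kmeans(X, k=2)
# The majority class of each cluster becomes the centroid's label.
centroid_class = [int(np.round(y[labels == j].mean())) for j in range(2)]
print(classify(np.array([4.8, 5.1]), centroids, centroid_class))
```

Classification cost is then proportional to the number of centroids rather than the number of training points, which is the source of the bounded classification time.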
Abstract: In order to accelerate similarity search in high-dimensional databases, we propose a new hierarchical indexing method composed of offline and online phases; our contribution concerns both. In the offline phase, after gathering the data into clusters and constructing a hierarchical index, the main originality of our contribution consists in a method for constructing bounding forms of clusters that avoid overlapping. For the online phase, our idea considerably improves the performance of similarity search, and we have also developed an adapted search algorithm. Our method, named NOHIS (Non-Overlapping Hierarchical Index Structure), uses Principal Direction Divisive Partitioning (PDDP) as its clustering algorithm. The principle of PDDP is to divide data recursively into two sub-clusters; the division is done using the hyper-plane orthogonal to the principal direction derived from the covariance matrix and passing through the centroid of the cluster to be divided. The data of each of the two resulting sub-clusters are enclosed by a minimum bounding rectangle (MBR), and the two MBRs are oriented along the principal direction; consequently, non-overlapping between the two forms is ensured. Experiments use databases containing image descriptors. Results show that the proposed method outperforms sequential scan and the SR-tree in processing k-nearest neighbor queries.
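The PDDP split at the heart of NOHIS can be sketched directly: compute the cluster centroid and covariance, take the principal eigenvector, and cut with the hyperplane through the centroid orthogonal to that direction. The data and sizes below are invented, and the MBR construction and index traversal are omitted.

```python
import numpy as np

def pddp_split(X):
    """One PDDP step: split X by the hyperplane through its centroid,
    orthogonal to the principal direction of the covariance matrix."""
    centroid = X.mean(axis=0)
    cov = np.cov((X - centroid).T)
    _, eigvecs = np.linalg.eigh(cov)
    principal = eigvecs[:, -1]            # eigenvector of largest eigenvalue
    side = (X - centroid) @ principal     # signed offset along that direction
    return X[side <= 0], X[side > 0]

# Two synthetic blobs; the principal direction runs between them,
# so one split separates them cleanly.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-4, 0], 0.5, (30, 2)),
               rng.normal([4, 0], 0.5, (30, 2))])
A, B = pddp_split(X)
print(len(A), len(B))
```

Each sub-cluster would then get an MBR aligned with the principal direction (e.g. `sub.min(axis=0)`, `sub.max(axis=0)` in the rotated frame), and the split recurses.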
Abstract: Interaction effects of xanthan gum (XG), carboxymethyl
cellulose (CMC), and locust bean gum (LBG) on the flow properties
of oil-in-water emulsions were investigated by a mixture design
experiment. Blends of XG, CMC and LBG were prepared according
to an augmented simplex-centroid mixture design (10 points) and used
at 0.5% (wt/wt) in the emulsion formulations. An appropriate
mathematical model was fitted to express each response as a function of the proportions of the blend components, able to empirically predict the response for any combination of the
components. The synergistic interaction effect of the ternary
XG:CMC:LBG blends at approximately 33-67% XG levels was
shown to be much stronger than that of the binary XG:LBG blend at
50% XG level (p < 0.05). Nevertheless, an antagonistic interaction
effect became significant as CMC level in blends was more than 33%
(p < 0.05). Yield stress and apparent viscosity (at 10 s⁻¹) responses were successfully fitted with a special quartic model, while flow behaviour index and consistency coefficient were fitted with a full quartic model (adjusted R² ≥ 0.90). This study found that a mixture design approach could serve as a valuable tool in better elucidating and predicting interaction effects beyond conventional two-component blends.
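For reference, the 10 blend points of an augmented simplex-centroid design for three components follow directly from the design's definition: 3 vertices, 3 binary midpoints, the overall centroid, and 3 axial points. Mapping the coordinates to XG, CMC, and LBG in that order is an assumption.

```python
from fractions import Fraction as F

def augmented_simplex_centroid():
    """10-point augmented simplex-centroid design for a 3-component
    mixture: vertices, binary midpoints, overall centroid, axial points."""
    pts = [(F(1), F(0), F(0)), (F(0), F(1), F(0)), (F(0), F(0), F(1))]
    h = F(1, 2)
    pts += [(h, h, F(0)), (h, F(0), h), (F(0), h, h)]
    pts.append((F(1, 3), F(1, 3), F(1, 3)))
    for i in range(3):                    # axial check blends
        p = [F(1, 6)] * 3
        p[i] = F(2, 3)
        pts.append(tuple(p))
    return pts

design = augmented_simplex_centroid()
print(len(design), all(sum(p) == 1 for p in design))
```

Every row sums to 1, as mixture proportions must; scaled by the 0.5% (wt/wt) total gum level, each row gives one emulsion formulation.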
Abstract: This paper presents two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected
characters are blocked and dilated to get word blocks and later
thinning is applied. The final image fed to Hough transform has the
thinned coordinates of word blocks in the image. The methods have
been successful in reducing the computational complexity of Hough
transform based skew estimation algorithms. Promising experimental
results are also provided to prove the effectiveness of the proposed
methods.
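The word-centroid idea can be sketched with a crude Hough-style vote: for each candidate skew angle, project the centroids onto the direction normal to lines tilted by that angle; at the true skew, the projections collapse into a few tight clusters, one per text line. This is a simplified stand-in for a full Hough accumulator, run on synthetic centroids; bin counts and thresholds are illustrative.

```python
import numpy as np

def estimate_skew(centroids, angles_deg):
    """Pick the angle whose normal projection most concentrates the
    word centroids into text lines (a coarse Hough-style vote)."""
    best_angle, best_score = 0.0, np.inf
    for a in angles_deg:
        theta = np.radians(a)
        # rho is constant along a line tilted by theta
        rho = centroids[:, 1] * np.cos(theta) - centroids[:, 0] * np.sin(theta)
        hist, _ = np.histogram(rho, bins=20)
        score = -np.sum((hist / len(rho)) ** 2)   # lower = more concentrated
        if score < best_score:
            best_angle, best_score = float(a), score
    return best_angle

# Synthetic page: word centroids on three text lines skewed by 5 degrees.
rng = np.random.default_rng(3)
xs = rng.uniform(0, 200, (3, 25))
tilt = np.tan(np.radians(5.0))
pts = np.vstack([
    np.column_stack([x, 40 * i + x * tilt + rng.normal(0, 0.2, 25)])
    for i, x in enumerate(xs)
])
angle = estimate_skew(pts, np.arange(-10.0, 10.5, 0.5))
print(angle)
```

The saving the abstract describes comes from feeding only one centroid per word (rather than every foreground pixel) into this voting stage.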
Abstract: This paper focuses on the development of bond graph
dynamic model of the mechanical dynamics of an excavating mechanism
previously designed to be used with small tractors, which are
fabricated in the Engineering Workshops of Jomo Kenyatta University
of Agriculture and Technology. To develop a mechanical dynamics
model of the manipulator, forward recursive equations similar to
those applied in iterative Newton-Euler method were used to obtain
kinematic relationships between the time rates of joint variables
and the generalized Cartesian velocities for the centroids of the links. Representing the obtained kinematic relationships in bond-graphic form, while considering the link weights and momenta as
the elements led to a detailed bond graph model of the manipulator.
The bond graph method was found to significantly reduce the number of recursive computations required on a 3-DOF manipulator to arrive at a mechanical dynamic model, indicating that the bond graph method is more computationally efficient than the Newton-Euler method in developing dynamic models of 3-DOF planar manipulators.
The model was verified by comparing the joint torque expressions
of a two link planar manipulator to those obtained using Newton-
Euler and Lagrangian methods as analyzed in robotic textbooks. The
expressions were found to agree indicating that the model captures
the aspects of rigid body dynamics of the manipulator. Based on
the model developed, actuator sizing and valve sizing methodologies
were developed and used to obtain the optimal sizes of the pistons
and spool valve ports, respectively. It was found that, using a pump with the sized flow rate capacity, the tractor engine is able to power the excavating mechanism in digging sandy-loam soil.
Abstract: This paper presents three new methodologies for the
basic operations, which aim at finding new ways of computing union
(maximum) and intersection (minimum) membership values by
taking into account all the membership values in a fuzzy set. The new methodologies are conceptually simple and easy from the application point of view, and are illustrated with a variety of problems such as the Cartesian product of two fuzzy sets, the max-min composition of two fuzzy sets in different product spaces, and an application to an inverted pendulum to determine the impact of the
new methodologies. The results clearly indicate a difference based on
the nature of the fuzzy sets under consideration and hence will be
highly useful in quite a few applications where different values have
significant impact on the behavior of the system.
Abstract: Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
Abstract: A new deployment of the multiple criteria decision
making (MCDM) techniques: the Simple Additive Weighting
(SAW), and the Technique for Order Preference by Similarity to
Ideal Solution (TOPSIS) for portfolio allocation, is demonstrated in
this paper. Rather than exclusive reference to mean and variance as in
the traditional mean-variance method, the criteria used in this
demonstration are the first four moments of the portfolio distribution.
Each asset is evaluated based on its marginal impacts to portfolio
higher moments that are characterized by trapezoidal fuzzy numbers.
Then centroid-based defuzzification is applied to convert fuzzy
numbers to the crisp numbers by which SAW and TOPSIS can be
deployed. Experimental results suggest that these MCDM approaches are similarly efficient in selecting dominant assets for an optimal portfolio under higher moments. The proposed approaches allow investors to flexibly adjust their risk preferences regarding higher moments via different schemes, adapting to various kinds of investors (from conservative to risky). The other significant
advantage is that, compared to the mean-variance analysis, the
portfolio weights obtained by SAW and TOPSIS are consistently
well-diversified.
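The defuzzify-then-rank pipeline can be sketched as follows: trapezoidal fuzzy marginal impacts on the four moments are collapsed to crisp scores via the centroid, and TOPSIS ranks the assets. The asset data, weights, and benefit/cost labels below are invented for illustration, and the SAW branch is omitted.

```python
import numpy as np

def trap_centroid(a, b, c, d):
    """Centroid x-coordinate of a trapezoidal fuzzy number (a, b, c, d)."""
    if a == d:                                    # degenerate crisp number
        return float(a)
    num = (d ** 2 + c ** 2 + c * d) - (a ** 2 + b ** 2 + a * b)
    return num / (3.0 * ((c + d) - (a + b)))

def topsis(M, weights, benefit):
    """TOPSIS on a crisp decision matrix M (alternatives x criteria)."""
    V = M / np.linalg.norm(M, axis=0) * weights   # normalise, then weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                # closeness to the ideal

# Toy marginal impacts on mean, variance, skewness, kurtosis as
# trapezoidal fuzzy numbers (all values invented).
fuzzy = {
    "A": [(0.05, 0.08, 0.10, 0.12), (0.01, 0.02, 0.02, 0.03),
          (0.1, 0.2, 0.3, 0.4), (2.5, 3.0, 3.0, 3.5)],
    "B": [(0.02, 0.04, 0.05, 0.07), (0.005, 0.01, 0.01, 0.02),
          (0.0, 0.1, 0.1, 0.2), (2.0, 2.5, 2.5, 3.0)],
    "C": [(0.08, 0.10, 0.12, 0.15), (0.02, 0.04, 0.05, 0.08),
          (-0.2, -0.1, 0.0, 0.1), (3.0, 3.5, 4.0, 4.5)],
}
M = np.array([[trap_centroid(*t) for t in rows] for rows in fuzzy.values()])
weights = np.array([0.4, 0.3, 0.2, 0.1])
benefit = np.array([True, False, True, False])    # mean/skew up, var/kurt down
scores = topsis(M, weights, benefit)
print(sorted(zip(fuzzy, scores), key=lambda p: -p[1])[0][0])  # top-ranked asset
```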
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which reduces a great deal of
computation as compared to the full codebook search. The algorithm
is based on sorting and centroid technique for search. The results
table shows the effectiveness of the proposed algorithm in terms of
computational complexity. In this paper we also introduce a new performance parameter, named average fractional change in pixel value, as we feel it gives a better understanding of the closeness of two images since it is related to perception. This new performance parameter takes into consideration the average fractional change in each pixel value.
Abstract: This paper suggests ranking alternatives under fuzzy MCDM (multiple criteria decision making) via a centroid-based ranking approach, where criteria are classified into benefit qualitative, benefit quantitative, and cost quantitative ones. The ratings of
alternatives versus qualitative criteria and the importance weights of
all criteria are assessed in linguistic values represented by fuzzy
numbers. The membership function for the final fuzzy evaluation
value of each alternative can be developed through α-cuts and
interval arithmetic of fuzzy numbers. The distance between the origin and the relative centroid is applied to defuzzify the final fuzzy evaluation values in order to rank the alternatives. Finally, a
numerical example demonstrates the computation procedure of the
proposed model.
Abstract: In this paper, we present a new and effective image indexing technique that extracts features directly from the DCT domain. Our proposed approach is object-based image indexing. For each 8x8 block in the DCT domain a feature vector is extracted; the feature vectors of all blocks of an image are then clustered into groups using a k-means algorithm, where each cluster represents a distinct object in the image. We then select the clusters with the largest membership after clustering. The centroids of the selected clusters are taken as image feature vectors and indexed into the database. We also propose an approach for using the proposed image indexing method in automatic image classification. Experimental results for automatic image classification on a database of 800 images from 8 semantic groups are reported.
Abstract: Content-based Image Retrieval (CBIR) aims at searching image databases for specific images that are similar to a given query image, based on matching of features derived from the image content. This paper focuses on a low-dimensional color-based indexing technique for achieving efficient and effective retrieval performance. In our approach, the color features are extracted using the mean shift algorithm, a robust clustering technique, and the cluster (region) mode is used as the representative of the image in 3-D color space. The feature descriptor consists of the representative color of a region and is indexed using a spatial indexing method based on the R*-tree, thus avoiding the high-dimensional indexing problems associated with the traditional color histogram. Alternatively, the images in the database are clustered based on region feature similarity using Euclidean distance, and only the representative (centroid) features of these clusters are indexed using the R*-tree, thus improving efficiency. For similarity retrieval, each representative color in the query image or region is used independently to find regions containing that color. The results of these methods are compared. A JAVA-based query engine supporting query-by-example is built to retrieve images by color.
Abstract: Wireless Sensor Networks (WSNs) are used to monitor vast, inaccessible regions through the deployment of a large number of sensor nodes in the sensing area. For the majority of WSN applications, the collected data needs to be combined with the geographic information of its origin to make it useful for the user; information received from remote Sensor Nodes (SNs) that are several hops away from the base station/sink is meaningless without knowledge of its source. In addition, the location information of SNs can be used to develop new network protocols for WSNs that improve their energy efficiency and lifetime. In this paper, range-free localization protocols for WSNs are proposed. The proposed protocols are based on the weighted centroid localization technique, where the edge weights of SNs are decided by fuzzy logic inference applied to the received signal strength and link quality between the nodes. The fuzzification is carried out using (i) Mamdani, (ii) Sugeno, and (iii) combined Mamdani-Sugeno fuzzy logic inference. Simulation results demonstrate that the proposed protocols provide better accuracy in node localization than conventional centroid-based localization protocols, despite the presence of unintentional radio frequency interference from RF sources operating in the same frequency band.
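The weighted centroid step itself can be sketched as below: the position estimate is the weight-normalized sum of anchor positions. The protocols above derive the weights with Mamdani/Sugeno fuzzy inference over RSSI and link quality; here a plain monotone RSSI-to-weight mapping stands in, and all values are toy numbers.

```python
def weighted_centroid(anchors, rssi):
    """Weighted centroid localisation: stronger signal -> larger weight."""
    # Map RSSI (dBm, more negative = farther) to positive weights.
    w = [10 ** (r / 20.0) for r in rssi]
    s = sum(w)
    x = sum(wi * a[0] for wi, a in zip(w, anchors)) / s
    y = sum(wi * a[1] for wi, a in zip(w, anchors)) / s
    return x, y

# Four anchor nodes at the corners of a 10 m x 10 m field; the unknown
# node is near the (0, 0) corner, so that anchor hears it loudest.
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
rssi = [-50, -70, -70, -80]            # dBm, toy values
print(weighted_centroid(anchors, rssi))
```

The estimate is pulled toward the loudest anchor; the quality of the weight mapping (here crude, in the paper fuzzy-inferred) is what determines localization accuracy.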
Abstract: There are several approaches for handling multiclass classification. Aside from one-against-one (OAO) and one-against-all (OAA), hierarchical classification is also commonly used. A binary classification tree is a hierarchical classification structure that breaks a k-class problem down into binary sub-problems, each solved by a binary classifier. In each node, a set of classes is divided into two subsets, and a good class partition should group similar classes together. Many algorithms measure similarity in terms of the distance between class centroids: classes are grouped together by a clustering algorithm when the distances between their centroids are small. In this paper, we present a binary classification tree with tuned observation-based clustering (BCT-TOB) that finds a class partition by performing clustering on observations instead of class centroids. A merging step is introduced to merge any insignificant class split. Experiments show that the performance of BCT-TOB is comparable to that of other algorithms.
Abstract: In this study, a fuzzy-logic based control system was
designed to ensure that time and energy are saved during the operation
of load elevators which are used during the construction of tall
buildings. In the control system that was devised, for the load
elevators to work more efficiently, the energy interval where the
motor worked was taken as the output variable whereas the amount
of load and the building height were taken as input variables. The
most appropriate working intervals depending on the characteristics
of these variables were defined by the help of an expert. Fuzzy expert
system software was formed using Delphi programming language. In
this design, the Mamdani max-min inference mechanism was used and the centroid method was employed in the defuzzification procedure. In conclusion, the designed system is observed to be feasible, and this is supported by statistical analyses.
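The centroid (centre-of-gravity) defuzzification used by such a controller can be sketched on a toy aggregated output: two clipped membership sets are combined by max, then the crisp value is x* = Σ μ(x)·x / Σ μ(x). The membership shapes and firing strengths below are illustrative, not the elevator system's actual rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function evaluated on grid x."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def centroid_defuzzify(x, mu):
    """Centre of gravity: x* = sum(mu * x) / sum(mu)."""
    return float(np.sum(x * mu) / np.sum(mu))

# Aggregated controller output: two clipped sets combined by max, as a
# Mamdani max-min controller would produce (toy firing strengths).
x = np.linspace(0.0, 10.0, 1001)
low = np.minimum(tri(x, 0.0, 2.0, 4.0), 0.8)    # 'low energy' fired at 0.8
high = np.minimum(tri(x, 6.0, 8.0, 10.0), 0.3)  # 'high energy' fired at 0.3
crisp = centroid_defuzzify(x, np.maximum(low, high))
print(round(crisp, 2))
```

Because the 'low' set fires much more strongly, the crisp output lands well below the midpoint of the universe, as expected for a lightly loaded elevator.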