Abstract: In this paper, a neural tree (NT) classifier having a
simple perceptron at each node is considered. A new concept for
building a balanced tree is applied in the learning algorithm of
the tree. At each node, if the perceptron's classification is
inaccurate and unbalanced, the perceptron is replaced by a new
one that separates the training set so that nearly equal numbers
of patterns fall into each class. Moreover, each perceptron is
trained only on the classes present at its node, ignoring the
other classes. Splitting nodes are incorporated into the neural
tree architecture to divide the training set when the current
perceptron node repeats the classification of its parent node. A
new error function based on the depth of the tree is introduced
to reduce the computational time for training a perceptron.
Experiments confirm the efficiency of the approach, with
encouraging results in terms of accuracy and computational cost.
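The balance test described above can be sketched as follows. The routine names, the perceptron's weight representation, and the `tolerance` parameter are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of the balanced-split criterion: a node's
# perceptron would be replaced if its split leaves the two children
# with very uneven numbers of training patterns.

def perceptron_split(patterns, weights, bias):
    """Route each pattern to the left/right child by the perceptron's sign."""
    left, right = [], []
    for x in patterns:
        activation = sum(w * xi for w, xi in zip(weights, x)) + bias
        (left if activation >= 0 else right).append(x)
    return left, right

def is_balanced(left, right, tolerance=0.2):
    """A split is 'balanced' if neither child holds far more than half
    of the patterns (tolerance is a fraction of the total count)."""
    total = len(left) + len(right)
    if total == 0:
        return True
    return abs(len(left) - len(right)) <= tolerance * total
```

A node whose split fails `is_balanced` would, per the abstract, be retrained or replaced before the tree grows further.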
Abstract: Node Gain Scores (NGSs), determined from optimized bandwidth and latency discrepancy ratios, are used as the basis for shaping a max-heap overlay. Each NGS is computed as a synergy of the discrepancy ratio of the requested bandwidth with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The resulting tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the synergy of these parameters when placing nodes on the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the best latency suggested by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 - α), respectively, with α chosen arbitrarily between 0 and 1,000 so that the NGS values, used as node IDs, remain likely to be unique while balancing the relative importance of the BDR and the LDR. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS less than that of the source node. To maintain load balance, children of each level's siblings are distributed evenly: a node cannot accept a second child until all of its siblings able to do so have acquired the same number of children, proceeding logically from left to right in the conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured at individual nodes by the pathChirp scheme, are maintained.
Evaluations comparing the scheme with Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) have been conducted. The new scheme generally performs better in the trade-off among packet delivery ratio, link stress, control overhead, and end-to-end delay.
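Under the definitions above, a minimal NGS computation can be sketched. The exact discrepancy-ratio expressions (Br/Ba and Lb/Lp below) are assumptions read off the abstract, not the paper's verified formulas:

```python
# Sketch of the Node Gain Score: a weighted synergy of the bandwidth
# discrepancy ratio (BDR) and the latency discrepancy ratio (LDR),
# with alpha ranging over 0..1000 as described in the abstract.

def node_gain_score(Ba, Br, Lp, Lb, alpha=500):
    """NGS = alpha * BDR + (1000 - alpha) * LDR."""
    assert 0 <= alpha <= 1000
    bdr = Br / Ba          # requested vs. estimated available bandwidth
    ldr = Lb / Lp          # advised best latency vs. proposed latency
    return alpha * bdr + (1000 - alpha) * ldr
```

Setting alpha toward 1,000 makes bandwidth the dominant factor; toward 0, latency dominates, which is the balancing role the abstract assigns to alpha.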
Abstract: In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Protocols that minimize the power consumption of sensors are therefore of particular interest in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, a technique that combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Since processing information consumes less power than transmitting it, data aggregation is of great importance, and for this reason the technique is used in many protocols [5]. One data aggregation technique is the data aggregation tree; however, finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In this technique, related information packets are combined at intermediate nodes to form one packet, so the number of packets transmitted in the network is reduced, less energy is consumed, and the lifetime of the network ultimately improves. Heuristic methods are used to tackle this NP-hard problem; one such optimization method is Simulated Annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using the Simulated Annealing algorithm and evaluate its efficiency against a Genetic Algorithm.
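A compact Simulated Annealing sketch for building an aggregation tree rooted at the sink is given below. To keep every candidate state acyclic, each node may only choose a parent with a smaller index; this neighbourhood structure and the distance-based energy proxy are simplifying assumptions of the sketch, not the article's formulation:

```python
import math
import random

def tree_cost(parent, pos):
    """Energy proxy: sum of squared Euclidean lengths of the tree edges."""
    return sum((pos[i][0] - pos[p][0]) ** 2 + (pos[i][1] - pos[p][1]) ** 2
               for i, p in enumerate(parent) if p is not None)

def anneal_tree(pos, t0=10.0, cooling=0.95, steps=2000, seed=1):
    """Anneal over parent assignments; node 0 is the sink (root)."""
    rng = random.Random(seed)
    n = len(pos)
    parent = [None] + [i - 1 for i in range(1, n)]   # initial chain to sink 0
    best, best_cost, t = parent[:], tree_cost(parent, pos), t0
    for _ in range(steps):
        i = rng.randrange(1, n)
        cand = parent[:]
        cand[i] = rng.randrange(0, i)                # smaller index => no cycle
        delta = tree_cost(cand, pos) - tree_cost(parent, pos)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            parent = cand                            # Metropolis acceptance
            if tree_cost(parent, pos) < best_cost:
                best, best_cost = parent[:], tree_cost(parent, pos)
        t *= cooling
    return best, best_cost
```

The accept-worse-moves-early, freeze-later behaviour is what lets the search escape poor initial trees before settling on a low-energy one.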
Abstract: Multicast network technology has pervaded our lives
through many networking techniques and improvements to the
routing devices we use. Multicast data technology offers many
applications to the user, such as high-speed voice and
high-speed data services, areas presently dominated by
conventional networking, cable systems, and digital subscriber
line (DSL) technologies, and multicast broadcast has advantages
over other routing techniques. QoS (Quality of Service)
guarantees are usually required in most multicast applications.
For the bandwidth-delay constrained optimization, we use a
multi-objective model and a routing approach based on a genetic
algorithm (GA) that optimizes multiple QoS parameters
simultaneously. The proposed approach yields non-dominated
routes, and the high efficiency and optimization quality of the
GA have been verified. We have also correlated the results of
the multicast GA with broadband wireless to minimize the delay
on the path.
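The multi-objective selection step that keeps only non-dominated routes can be sketched as a generic Pareto filter over objective vectors such as (delay, 1/bandwidth), both minimised. This is a standard dominance test, not the paper's exact GA operators:

```python
def dominates(a, b):
    """Route a dominates b if it is no worse in every objective and
    strictly better in at least one (all objectives minimised)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(routes):
    """Filter a population of objective vectors down to the Pareto front."""
    return [r for r in routes
            if not any(dominates(o, r) for o in routes if o != r)]
```

In a GA generation, the survivors of this filter are the candidate trade-off routes presented to the user or carried into the next population.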
Abstract: In this paper, acoustic techniques are used to detect hidden insect infestations of date palm trees (Phoenix dactylifera L.). In particular, we use an acoustic instrument for early discovery of the presence of a destructive insect pest commonly known as the Red Date Palm Weevil (RDPW) and scientifically as Rhynchophorus ferrugineus (Olivier). This type of insect attacks date palm trees and causes irreversible damage at late stages, so the infected trees must be destroyed. Early detection of its presence is therefore a major part of controlling the spread and economic damage caused by this type of infestation. Furthermore, monitoring and early detection of the disease can assist in taking appropriate measures such as isolating or treating the infected trees. The acoustic system is evaluated in terms of its ability to discover hidden pests inside the tested tree at an early stage. When signal acquisition is completed for a number of date palms, a signal processing technique known as time-frequency analysis is evaluated in terms of providing an estimate that can be used visually to recognize the acoustic signature of the RDPW. The instrument was tested first in the laboratory; then it was used on suspected or infested trees in the field. The final results indicate that the acoustic monitoring approach, along with signal processing techniques, is very promising for the early detection of the presence of the larvae as well as the adult pest in date palms.
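A bare-bones time-frequency analysis of the kind described above is a framed-DFT magnitude (a spectrogram), in which a pest's acoustic signature would appear as energy concentrated in particular frequency bins over time. The frame length and hop size below are illustrative choices, not the paper's settings:

```python
import math

def spectrogram(signal, frame=64, hop=32):
    """Return a list of magnitude spectra, one per overlapping frame."""
    frames = []
    for start in range(0, len(signal) - frame + 1, hop):
        window = signal[start:start + frame]
        spectrum = []
        for k in range(frame // 2):                 # non-negative bins only
            re = sum(x * math.cos(2 * math.pi * k * n / frame)
                     for n, x in enumerate(window))
            im = -sum(x * math.sin(2 * math.pi * k * n / frame)
                      for n, x in enumerate(window))
            spectrum.append(math.hypot(re, im))
        frames.append(spectrum)
    return frames
```

A pure tone at 4 cycles per frame, for example, produces its peak in bin 4 of every frame, which is the sort of stable spectral ridge one would look for visually.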
Abstract: In this paper, a new algorithm for generating a codebook is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to an inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to recover the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm yields a considerable reduction in computation time and provides better reconstructed picture quality.
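The encoder's tree walk described above can be sketched as follows; the node fields are illustrative, and the Orthogonal Polynomials feature-extraction transform itself is not reproduced:

```python
# Minimal binary-tree codebook encoder: each internal node compares a
# single feature of the input vector to a threshold and routes it to
# one of two children; leaves hold codewords.

class Node:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, codeword=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.codeword = left, right, codeword

def encode(vector, node, path=""):
    """Walk the tree; the bit string of left/right moves is the code."""
    while node.codeword is None:
        if vector[node.feature] <= node.threshold:
            node, path = node.left, path + "0"
        else:
            node, path = node.right, path + "1"
    return path, node.codeword
```

Because each step tests only one scalar against one threshold, encoding costs O(depth) comparisons rather than a full nearest-neighbour search over the codebook, which is the source of the computation-time reduction claimed.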
Abstract: Renewable and non-renewable resource constraints have been studied extensively in theoretical work on project scheduling problems. However, although cumulative resources are widespread in practical cases, the literature on project scheduling problems subject to these resources is scant. To study this type of resource further, in this paper we use the framework of the resource-constrained project scheduling problem (RCPSP) with finish-start precedence relations between activities, subject to cumulative resources in addition to the renewable resources. We develop a branch-and-bound algorithm for this problem by customizing the precedence tree algorithm for the RCPSP. We perform extensive experimental analysis of the algorithm to check its effectiveness and performance in solving different instances of the problem in question.
Abstract: The purpose of determining impact significance is to
place value on impacts. Environmental impact assessment review is a
process that judges whether impact significance is acceptable or not in
accordance with the scientific facts regarding environmental,
ecological and socio-economic impacts described in environmental
impact statements (EIS) or environmental impact assessment reports
(EIAR). The first aim of this paper is to summarize the criteria of
significance evaluation from the past review results and accordingly
utilize fuzzy logic to incorporate these criteria into scientific facts. The
second aim is to employ a data mining technique to construct an
EIS or EIAR review-result prediction model that can assist
developers in preparing and revising better environmental
management plans in advance. The validity of the previous
prediction model proposed by the authors in 2009 is 92.7%. The
enhanced validity in this
study can attain 100.0%.
Abstract: Network layer multicast, i.e. IP multicast, even after
many years of research, development and standardization, is not
deployed at large scale due to both technical (e.g. upgrading of
routers) and political (e.g. policy making and negotiation) issues.
Researchers looked for alternatives and proposed application/overlay
multicast where multicast functions are handled by end hosts, not
network layer routers. Member hosts wishing to receive multicast
data form a multicast delivery tree. The intermediate hosts in the tree
act as routers also, i.e. they forward data to the lower hosts in the
tree. Unlike IP multicast, where a router cannot leave the tree until all
members below it leave, in overlay multicast any member can leave
the tree at any time thus disjoining the tree and disrupting the data
dissemination. All the disrupted hosts have to rejoin the tree. This
characteristic of overlay multicast makes the multicast tree
unstable and causes data loss and rejoin overhead. In this
paper, we propose that each node sets its leaving time from the
tree and sends a join request to a number
of nodes in the tree. The nodes in the tree will reject the request if
their leaving time is earlier than that of the requesting node; otherwise they
will accept the request. The node can join at one of the accepting
nodes. This makes the tree more stable as the nodes will join the tree
according to their leaving time, the node with the earliest
leaving time being at a leaf of the tree. Some intermediate
nodes may not honor their leaving time and may leave earlier,
thus disrupting the tree.
For this, we propose a proactive recovery mechanism so that disrupted
nodes can rejoin the tree at predetermined nodes immediately. We
have shown by simulation that there is less overhead when joining
the multicast tree and the recovery time of the disrupted nodes is
much less than in previous works.
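The accept/reject rule proposed above can be sketched directly: a node in the tree accepts a join request only if its own announced leaving time is no earlier than the requester's, so longer-lived nodes end up nearer the root. The class shape and the tie-breaking choice of accepter are illustrative:

```python
class OverlayNode:
    def __init__(self, name, leaving_time):
        self.name, self.leaving_time = name, leaving_time

    def accepts(self, requester):
        """Reject if this node would leave before the requesting node."""
        return self.leaving_time >= requester.leaving_time

def choose_parent(requester, candidates):
    """Join at an accepting node; here, the accepter that leaves latest."""
    accepting = [c for c in candidates if c.accepts(requester)]
    return max(accepting, key=lambda c: c.leaving_time) if accepting else None
```

Applied repeatedly, this sorts members by leaving time down the tree, which is the stability property the abstract claims.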
Abstract: A spanning tree of a connected graph is a tree that
consists of all of the vertices and some or perhaps all of the
edges of the connected graph. In this paper, a model for the
spanning tree transformation of connected graphs into single-row
networks, namely Spanning Tree of Connected Graph Modeling
(STCGM), is introduced. The model contains a Path-Growing
Tree-Forming algorithm applied with Vertex-Prioritized to
produce the spanning
tree from the connected graph. Paths are produced by Path-Growing
and they are combined into a spanning tree by Tree-Forming. The
spanning tree that is produced from the connected graph is then
transformed into single-row network using Tree Sequence Modeling
(TSM). Finally, the single-row routing problem is solved using a
method called Enhanced Simulated Annealing for Single-Row
Routing (ESSR).
Abstract: Multiple sequence alignment is a fundamental part in
many bioinformatics applications such as phylogenetic analysis.
Many alignment methods have been proposed. Each method gives a
different result for the same data set, and consequently generates a
different phylogenetic tree. Hence, the chosen alignment method
affects the resulting tree. However, in the literature there is no
evaluation of multiple alignment methods based on the comparison of
their phylogenetic trees. This work evaluates the following eight
aligners: ClustalX, T-Coffee, SAGA, MUSCLE, MAFFT, DIALIGN,
ProbCons and Align-m, based on their phylogenetic trees (test trees)
produced on a given data set. The Neighbor-Joining method is used
to estimate trees. Three criteria, namely, the dNNI, the dRF and the
Id_Tree, are established to test the ability of the different
alignment methods to produce a test tree closer to the reference one
(true tree). Results show that the method which produces the most
accurate alignment gives the nearest test tree to the reference tree.
MUSCLE outperforms all aligners with respect to the three
criteria and for all datasets, performing particularly well when
sequence identities are within 10-20%. It is followed by
T-Coffee at lower sequence identity; at around 30% identity and
above, the tree scores of all methods become similar.
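Assuming the dRF criterion above refers to the standard Robinson-Foulds distance, it counts the bipartitions (clades) present in one tree but not the other. Representing each tree by its set of clades (frozensets of leaf names) keeps the sketch independent of any particular tree library:

```python
def robinson_foulds(clades_a, clades_b):
    """Robinson-Foulds distance as the symmetric-difference count
    of the two trees' clade sets."""
    return len(clades_a ^ clades_b)
```

A test tree whose clade set matches the reference tree exactly scores 0; each disagreeing clade adds one to the distance, so smaller values mean a test tree nearer the true tree.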
Abstract: The major goal in defining and examining game
scenarios is to find good strategies as solutions to the game. A
plausible solution is a recommendation to the players on how to play
the game, which is represented as strategies guided by the various
choices available to the players. These choices invariably compel the
players (decision makers) to execute an action following some
conscious tactics. In this paper, we propose a refinement-based
heuristic as a machine learning technique for human-like
decision making in playing the Ayo game. The results show that
our machine learning technique is more adaptable and more
responsive in making decisions than human intelligence. The
technique has the advantage that a search is astutely conducted
in a shallow horizon game tree. Our simulation was tested
against the Awale shareware, and appealing results were obtained.
Abstract: This paper presents an effective framework for Chinese syntactic parsing, which includes two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm, which integrates the beam search strategy of the N-best algorithm and the heuristic function of the A* algorithm for pruning, yielding multiple parsing trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parsing tree. Finally, contrasting experimental results are given. Keywords: syntactic parsing, PCFG, pruning, evaluation model.
Abstract: Graph decompositions are vital in the study of
combinatorial design theory. A decomposition of a graph G is a
partition of its edge set. An n-sun graph is a cycle Cn with an edge
terminating in a vertex of degree one attached to each vertex. In this
paper, we define n-sun decomposition of some even order graphs
with a perfect matching. We have proved that the complete graph
K2n, complete bipartite graph K2n, 2n and the Harary graph H4, 2n have
n-sun decompositions. A labeling scheme is used to construct the n-suns.
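Following the definition above, an n-sun is a cycle C_n with one pendant (degree-one) vertex attached to every cycle vertex, giving 2n vertices and 2n edges. A small generator makes the structure concrete; the vertex numbering is illustrative, not the paper's labeling scheme:

```python
def n_sun_edges(n):
    """Edges of the n-sun: cycle on vertices 0..n-1, pendants n..2n-1."""
    cycle = [(i, (i + 1) % n) for i in range(n)]          # the cycle C_n
    pendants = [(i, n + i) for i in range(n)]             # one per vertex
    return cycle + pendants
```

Every cycle vertex ends up with degree 3 (two cycle neighbours plus its pendant) and every pendant with degree 1, which is the degree signature a decomposition of K2n into n-suns must account for.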
Abstract: With the growth of modern civilization and
industrialization worldwide, the demand for energy is increasing
day by day. The majority of the world's energy needs are met
through fossil fuels and natural gas, and as a result the stock
of fossil fuels is diminishing from year to year. Since fossil
fuel is non-renewable, fuel prices are soaring as a consequence
of spiraling demand and diminishing supply. At present, the
power generation of our country depends mainly on imported
fossil fuels. To reduce the dependency on imported fuel, the use
of renewable sources has become more popular. In Bangladesh the
coconut is a widely grown tree; especially in the southern part
of the country, large areas can be found where the coconut tree
is considered a natural asset. Our endeavor, therefore, was to
use coconut oil as a renewable and alternative fuel. This
article shows the prospect of coconut oil as a renewable
alternative to diesel fuel. Since the diesel engine has
versatile uses, including small-scale electricity generation, an
experimental setup was made to study the performance of a small
diesel engine using different blends of biodiesel converted from
coconut oil. It is found that biodiesel has slightly different
properties than diesel; with biodiesel the engine is capable of
running without difficulty. Different blends of biodiesel (i.e.,
B80, B60, and B50) have been used to avoid complicated
modification of the engine or the fuel supply system. Finally, a
comparison of engine performance for different blends of
biodiesel has been carried out to determine the optimum blend
for different operating conditions.
Abstract: Graph decompositions are vital in the study of combinatorial design theory. Given two graphs G and H, an H-decomposition of G is a partition of the edge set of G into disjoint isomorphic copies of H. An n-sun is a cycle Cn with an edge terminating in a vertex of degree one attached to each vertex. In this paper we have proved that the complete graph of order 2n, K2n can be decomposed into n-2 n-suns, a Hamilton cycle and a perfect matching, when n is even and for odd case, the decomposition is n-1 n-suns and a perfect matching. For an odd order complete graph K2n+1, delete the star subgraph K1, 2n and the resultant graph K2n is decomposed as in the case of even order. The method of building n-suns uses Walecki's construction for the Hamilton decomposition of complete graphs. A spanning tree decomposition of even order complete graphs is also discussed using the labeling scheme of n-sun decomposition. A complete bipartite graph Kn, n can be decomposed into n/2 n-suns when n/2 is even. When n/2 is odd, Kn, n can be decomposed into (n-2)/2 n-suns and a Hamilton cycle.
Abstract: Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. One method of searching for essential features is via the decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision-tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but use a pruning condition that acts as a threshold mechanism for choosing features. This paper proposes a threshold measure using the Manhattan Hierarchical Cluster distance, to be used in feature ranking in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a higher number of attributes.
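The final selection step described above reduces to a threshold cut over ranked features. How the paper derives that threshold from the Manhattan Hierarchical Cluster distance is its contribution and is not reproduced here; the function below only sketches the generic cut, and the scores in the usage example are made up:

```python
def select_features(scores, threshold):
    """Keep features whose ranking score meets the threshold,
    ordered from highest to lowest score."""
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [name for name, s in ranked if s >= threshold]
```

Whatever procedure produces the threshold, the cut itself stays the same: features scoring below it are dropped before classification.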
Abstract: XML has become a popular standard for information exchange via the Web. Each XML document can be represented as a rooted, ordered, labeled tree, in which a node's label shows the exact position of the node in the original document. Region and Dewey encoding are two well-known methods of labeling trees. In this paper, we propose a new insert-friendly labeling method named IFDewey, based on the recently proposed Extended Dewey scheme. In Extended Dewey, many labels must be modified when a new node is inserted into the XML tree. Our method eliminates this problem by reserving even numbers for future insertions: whereas numbers generated by Extended Dewey may be even or odd, IFDewey modifies Extended Dewey so that only odd numbers are generated, and even numbers can then be used for much easier insertion of nodes.
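The odd/even idea above can be sketched in a few lines: sibling positions are labelled with odd components only, so the even number between two adjacent siblings is free for a later insertion without relabelling. The dot-separated label syntax follows Dewey-style schemes; the helper names are illustrative:

```python
def initial_labels(parent, n_children):
    """Label n children of `parent` with odd components 1, 3, 5, ..."""
    return [f"{parent}.{2 * i + 1}" for i in range(n_children)]

def insert_between(left, right):
    """Reuse the reserved even number between two adjacent siblings."""
    head, a = left.rsplit(".", 1)
    _, b = right.rsplit(".", 1)
    gap = (int(a) + int(b)) // 2   # the even value between two odds
    return f"{head}.{gap}"
```

Inserting between "1.1" and "1.3" yields "1.2" while every existing label stays untouched, which is exactly the relabelling cost Extended Dewey would otherwise pay.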
Abstract: In this paper, we present a new learning algorithm for
anomaly-based network intrusion detection using an improved
self-adaptive naïve Bayesian tree (NBTree), which induces a
hybrid of a decision tree and a naïve Bayesian classifier. The
proposed approach balances detection rates across different
attack types and keeps false positives at an acceptable level in
intrusion detection. On complex and dynamic large intrusion
detection datasets, the detection accuracy of the naïve Bayesian
classifier does not scale up as well as that of the decision
tree, and it has been successfully shown in other problem
domains that the naïve Bayesian tree improves classification
rates on large datasets. In a naïve Bayesian tree, nodes contain
and split attributes as in regular decision trees, but the
leaves contain naïve Bayesian classifiers. The experimental
results on the KDD99 benchmark network intrusion detection
dataset demonstrate that this new approach scales up the
detection rates for different attack types and reduces false
positives in network intrusion detection.
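A classification-time sketch of the NBTree hybrid follows: internal nodes split on an attribute value as in a decision tree, while each leaf holds a naïve Bayesian classifier over the remaining features. The tiny categorical NB, the dict-based node layout, and the smoothing constant are illustrative, not the paper's implementation:

```python
import math

def nb_classify(leaf, sample):
    """Pick the class maximising log prior + sum of log likelihoods."""
    best, best_score = None, -math.inf
    for cls, prior in leaf["priors"].items():
        score = math.log(prior)
        for feat, value in sample.items():
            probs = leaf["likelihood"].get(feat, {}).get(cls, {})
            score += math.log(probs.get(value, 1e-6))   # crude smoothing
        if score > best_score:
            best, best_score = cls, score
    return best

def nbtree_classify(node, sample):
    """Descend decision-tree splits, then hand off to the leaf's NB model."""
    while "split" in node:
        feat, value = node["split"]
        node = node["left"] if sample.get(feat) == value else node["right"]
    return nb_classify(node, sample)
```

The split layers carve the dataset into regions, and the per-leaf NB models handle the residual attribute interactions, which is the mechanism behind the balanced detection rates reported.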
Abstract: In this paper we introduce a new data-oriented model
of the uniform random variable that is well matched to computing
systems. Owing to this conformity with the structure of current
computers, the model can be used efficiently in statistical
inference.