Learning and Evaluating Possibilistic Decision Trees using Information Affinity

This paper investigates the problem of building decision trees from data with imprecise class values, where the imprecision is encoded in the form of possibility distributions. The Information Affinity similarity measure is introduced into the well-known gain ratio criterion in order to assess the homogeneity of the set of possibility distributions representing the instances' classes in a given training partition. For the experimental study, we propose an information-affinity-based performance criterion and use it to evaluate the approach on well-known benchmarks.
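As a rough illustration of the idea, the sketch below computes an affinity score between two possibilistic labels and a simple homogeneity score for a set of such labels. It assumes the common definition of Information Affinity as one minus the average of a normalised Manhattan distance and an inconsistency degree with equal weights; the function partition_affinity is a hypothetical illustration of using affinity to rate the homogeneity of a training partition, not the exact gain-ratio-based criterion of the paper.

```python
import numpy as np

def information_affinity(pi1, pi2):
    """Affinity between two possibility distributions on the same label set,
    assumed here as: 1 - (normalised Manhattan distance + inconsistency) / 2."""
    pi1 = np.asarray(pi1, dtype=float)
    pi2 = np.asarray(pi2, dtype=float)
    d = np.abs(pi1 - pi2).mean()            # normalised Manhattan distance
    inc = 1.0 - np.minimum(pi1, pi2).max()  # inconsistency of the conjunction
    return 1.0 - (d + inc) / 2.0

def partition_affinity(distributions):
    """Hypothetical homogeneity score for a training partition: the mean
    pairwise affinity of its possibilistic labels (1 = perfectly homogeneous)."""
    dists = [np.asarray(p, dtype=float) for p in distributions]
    if len(dists) < 2:
        return 1.0
    scores = [information_affinity(dists[i], dists[j])
              for i in range(len(dists)) for j in range(i + 1, len(dists))]
    return float(np.mean(scores))

# Example: three instances labelled by possibility distributions over two classes.
labels = [[1.0, 0.2], [1.0, 0.4], [0.3, 1.0]]
print(information_affinity(labels[0], labels[1]))  # high affinity: similar labels
print(partition_affinity(labels))                  # homogeneity of the whole set
```

In an attribute-selection step, such a homogeneity score could play the role that class-frequency entropy plays in classical gain ratio, with the most affine (most homogeneous) split preferred.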
