HIERARCHICAL CLASSIFICATOR: A COGNITIVE APPROACH TO DECISION TREE BUILDING

Authors

  • Jurgita Kapočiūtė-Dzikienė, Vytautas Magnus University
  • Arimantas Raškinis, Vytautas Magnus University

Abstract

We present a new algorithm that follows the “divide and conquer” machine learning approach and exhibits a few interesting cognitive properties. The algorithm aims to build a decision tree with only one terminal node per class. Splits of tree nodes are constrained to functions that take identical values (true or false) for every instance within the same class. Appropriate splits are found through an exhaustive search of the attribute-value-based function space. Simple single-attribute functions are considered before complex multi-attribute k-DNF type ones. Redundant functions are also incorporated into the decision tree. The unique structure of the decision tree means that a semantic interpretation can be attached to both terminal and non-terminal nodes, that the task-specific set of classes is structured within a hierarchy of similarity relationships, and that sources of recognition errors can be traced back (localized) to a particular function/split in the decision tree. The new algorithm was implemented, experimentally evaluated, and compared with the well-known machine learning techniques Ripper and C4.5. Though limited in scope, the experiments showed that the new algorithm can perform at least as well as Ripper and C4.5. The redundant knowledge incorporated into the decision tree helped to improve recognition accuracy.
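The split constraint described above (a candidate boolean function must evaluate identically for every instance of the same class) can be illustrated with a short sketch. The code below is not the authors' implementation; the dataset representation, function names, and the example attribute test are assumptions made purely for illustration.

    # Minimal sketch (assumed representation, not the authors' code):
    # checking whether a candidate boolean split function takes the same
    # value (True or False) for every instance within each class.

    from typing import Callable, Dict, List, Tuple

    Instance = Dict[str, str]          # attribute -> value
    Example = Tuple[Instance, str]     # (instance, class label)


    def is_class_consistent(split: Callable[[Instance], bool],
                            examples: List[Example]) -> bool:
        """Return True if `split` evaluates identically for all instances
        that share the same class label."""
        value_per_class: Dict[str, bool] = {}
        for instance, label in examples:
            value = split(instance)
            if label in value_per_class and value_per_class[label] != value:
                return False           # split separates members of one class
            value_per_class[label] = value
        return True


    # Hypothetical candidate split: a simple single-attribute test, of the
    # kind considered before complex multi-attribute k-DNF functions.
    def color_is_red(instance: Instance) -> bool:
        return instance.get("color") == "red"


    if __name__ == "__main__":
        data: List[Example] = [
            ({"color": "red", "shape": "round"}, "apple"),
            ({"color": "red", "shape": "round"}, "apple"),
            ({"color": "yellow", "shape": "long"}, "banana"),
        ]
        print(is_class_consistent(color_is_red, data))  # True: admissible split

In the exhaustive search sketched by the abstract, only candidate functions passing such a consistency check would be admissible as node splits, with single-attribute tests tried before more complex k-DNF combinations.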

Published

2008-04-03

Issue

Section

Articles