Abstract Decision trees are highly favoured classifiers because their understandable, branched structure resembles human reasoning. However, the comprehensible rationality of these trees can be severely affected by bias in the selection of the split attribute: traditional heuristic measures tend to favour multi-valued attributes. This paper proposes an attribute selection method for nodes based on a concept model of decision trees, with the aim of avoiding the heuristic bias of attribute measurement and improving decision tree performance. A probabilistic-statistical form is used to define and express the concept model, which is extracted from the given data and built from the associated certainty of the class distribution and the branch distribution, so as to provide a certainty description of the tree. Class constraint uncertainty (CCE) then serves as the heuristic measure for selecting the split attribute during tree induction, while the handling of missing branches serves as an auxiliary leaf measure, yielding a novel decision tree learning algorithm. Experimental findings show that CCE is effective as a heuristic measure that avoids the bias toward multi-valued attributes on all datasets and improves the performance and stability of the resulting decision trees.
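The multi-value bias mentioned above is the well-known tendency of entropy-based split criteria such as information gain to prefer attributes with many distinct values, since finer partitions trivially produce purer subsets. The abstract does not give the CCE formula, so the sketch below does not implement it; it only illustrates the bias that CCE is designed to avoid, using plain information gain on a hypothetical toy dataset where a useless ID-like attribute outscores a genuinely informative binary one.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """Entropy reduction from partitioning `labels` by attribute `values`."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Hypothetical toy dataset: 8 samples, balanced binary class.
labels = [0, 0, 0, 0, 1, 1, 1, 1]
# An ID-like attribute with 8 distinct values vs. a moderately useful binary one.
id_attr = [1, 2, 3, 4, 5, 6, 7, 8]
binary_attr = [0, 0, 0, 1, 0, 1, 1, 1]

# Each singleton branch of the ID attribute is trivially pure, so its
# information gain is maximal (1.0 bit) despite carrying no real signal.
print(information_gain(id_attr, labels))      # 1.0
print(information_gain(binary_attr, labels))  # ~0.19
```

Information gain ranks the spurious ID attribute first, which is exactly the selection bias the paper's heuristic is meant to counteract (classically mitigated by measures such as C4.5's gain ratio).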
               