Journal
ENTROPY
Volume 24, Issue 5
Publisher
MDPI
DOI: 10.3390/e24050605
Keywords
decision trees; uncertain data; belief entropy; belief function; random forest; evidential likelihood
Funding
- National Natural Science Foundation of China [61973291]
In this paper, a new decision tree method based on belief entropy is proposed and then extended to a random forest. The method handles continuous attribute values directly, without discretization preprocessing, and exhibits good classification accuracy on UCI datasets, especially in situations with high uncertainty.
As well-known machine learning methods, decision trees are widely applied in classification and recognition tasks. In this paper, with the uncertainty of labels handled by belief functions, a new decision tree method based on belief entropy is proposed and then extended to a random forest. Using a Gaussian mixture model, the tree method is able to deal with continuous attribute values directly, without discretization pretreatment. Specifically, the method adopts belief entropy, an uncertainty measure based on the basic belief assignment, as a new attribute selection tool. To improve classification performance, we construct a random forest from the basic trees and discuss different prediction combination strategies. Numerical experiments on UCI machine learning data sets were conducted, which indicate the good classification accuracy of the proposed method in different situations, especially on data with high uncertainty.
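The abstract describes belief entropy as an uncertainty measure computed on a basic belief assignment (BBA). In the evidential literature this usually refers to Deng entropy, E_d(m) = -Σ m(A) · log2(m(A) / (2^|A| − 1)); the sketch below assumes that definition and uses an illustrative function name, not code from the paper:

```python
import math

def belief_entropy(bba):
    """Deng (belief) entropy of a basic belief assignment.

    bba maps each focal element (a frozenset of class labels)
    to its mass; masses are assumed to sum to 1.
    E_d(m) = -sum over focal A of m(A) * log2(m(A) / (2^|A| - 1))
    """
    total = 0.0
    for focal, mass in bba.items():
        if mass > 0:
            total -= mass * math.log2(mass / (2 ** len(focal) - 1))
    return total

# A fully certain assignment (all mass on a singleton) has zero entropy.
certain = {frozenset({"yes"}): 1.0}

# Total ignorance over two classes (all mass on the full frame)
# yields log2(3), since 2^2 - 1 = 3.
vacuous = {frozenset({"yes", "no"}): 1.0}
```

In the proposed tree method, a measure of this kind would score candidate attributes: splits that leave child nodes with lower belief entropy are preferred, generalizing the classical information-gain criterion to mass functions over sets of labels.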