Article

Improving random forests by neighborhood projection for effective text classification

Journal

INFORMATION SYSTEMS
Volume 77, Pages 1-21

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.is.2018.05.006

Keywords

Classification; Random forests; Lazy learning; Nearest neighbors

Funding

  1. CNPq
  2. CAPES
  3. FINEP
  4. FAPEMIG
  5. INWEB

Abstract

In this article, we propose a lazy version of the traditional random forest (RF) classifier (called LazyNN_RF), specially designed for highly dimensional, noisy classification tasks. The LazyNN_RF localized training projection is composed of examples that best resemble the examples to be classified, obtained through a nearest-neighborhood projection of the training set. Such projection filters out irrelevant data, ultimately avoiding some of the drawbacks of traditional random forests, such as overfitting due to overly complex trees, especially in high-dimensional noisy datasets. In sum, our main contributions are: (i) the proposal and implementation of a novel lazy learner, based on the random forest classifier and nearest-neighborhood projection of the training set, that excels in automatic text classification tasks; and (ii) a thorough and detailed experimental analysis that sheds light on the behavior, effectiveness, and feasibility of our solution. By means of an extensive experimental evaluation, performed over two text classification domains and a large set of baseline algorithms, we show that our approach is highly effective and feasible, making it a strong candidate for automatic text classification tasks when compared to state-of-the-art classifiers. (C) 2018 Elsevier Ltd. All rights reserved.
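The abstract describes a two-step lazy pipeline: for each test example, project the training set onto that example's nearest neighborhood, then fit a forest-style ensemble on the projection alone. Below is a minimal pure-Python sketch of that idea, not the authors' implementation: it uses Euclidean distance for the neighborhood and substitutes bootstrap-sampled decision stumps for full random-forest trees to stay short. All names (`knn_projection`, `lazy_rf_predict`) and the defaults for `k` and `n_trees` are hypothetical.

```python
import random
from collections import Counter

def knn_projection(X_train, y_train, x, k):
    """Return the k training examples closest to x (squared Euclidean distance)."""
    order = sorted(range(len(X_train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(X_train[i], x)))
    idx = order[:k]
    return [X_train[i] for i in idx], [y_train[i] for i in idx]

def train_stump(X, y, rng):
    """Fit a one-split stump on a random feature with a randomly chosen threshold."""
    f = rng.randrange(len(X[0]))
    t = (X[rng.randrange(len(X))][f] + X[rng.randrange(len(X))][f]) / 2
    left = [y[i] for i in range(len(X)) if X[i][f] <= t]
    right = [y[i] for i in range(len(X)) if X[i][f] > t]
    fallback = Counter(y).most_common(1)[0][0]
    left_lab = Counter(left).most_common(1)[0][0] if left else fallback
    right_lab = Counter(right).most_common(1)[0][0] if right else fallback
    return f, t, left_lab, right_lab

def lazy_rf_predict(X_train, y_train, x, k=5, n_trees=25, seed=0):
    """Lazy prediction: build the ensemble only on x's neighborhood projection."""
    rng = random.Random(seed)
    Xp, yp = knn_projection(X_train, y_train, x, k)
    votes = []
    for _ in range(n_trees):
        # Bootstrap sample of the projected neighborhood, as in bagging.
        idx = [rng.randrange(len(Xp)) for _ in range(len(Xp))]
        Xb, yb = [Xp[i] for i in idx], [yp[i] for i in idx]
        f, t, left_lab, right_lab = train_stump(Xb, yb, rng)
        votes.append(left_lab if x[f] <= t else right_lab)
    return Counter(votes).most_common(1)[0][0]
```

Because the ensemble is rebuilt per test example, irrelevant training data outside the neighborhood never influences the trees, which is the filtering effect the abstract attributes to the projection step.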
