Article

Negative Correlation Ensemble Learning for Ordinal Regression

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2013.2268279

Keywords

Negative correlation learning (NCL); neural network ensembles; ordinal regression; threshold methods

Funding

  1. Spanish Inter-Ministerial Commission of Science and Technology [TIN2011-22794]
  2. European Regional Development Fund
  3. Junta de Andalucia (Spain) [P2011-TIC-7508]
  4. European Commission [270428]

Abstract

In this paper, two neural network threshold ensemble models are proposed for ordinal regression problems. In the first ensemble method, the thresholds are fixed a priori and are not modified during training. The second one treats the thresholds of each member of the ensemble as free parameters, allowing their modification during the training process. This is achieved through a reformulation of these tunable thresholds that avoids the ordering constraints they must fulfill in the ordinal regression problem. During training, the diversity existing in the different projections generated by each member is taken into account when updating the parameters. This diversity is promoted explicitly by means of a diversity-encouraging error function, extending the well-known negative correlation learning framework to the area of ordinal regression and inheriting many of its good properties. Experimental results demonstrate that the proposed algorithms achieve competitive generalization performance under four ordinal regression metrics.
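
To make the two ideas above concrete, the sketch below (plain NumPy, with hypothetical function names; a minimal illustration under common conventions, not necessarily the authors' exact formulation) shows one standard way to reparameterize the tunable thresholds so that the ordering b_1 < b_2 < ... holds automatically, together with the usual negative correlation learning penalty that is added to each member's error to encourage diverse projections.

    import numpy as np

    def ordered_thresholds(b1, deltas):
        # Reparameterization: b_k = b_1 + sum_{j<=k} delta_j**2. Squaring the
        # increments keeps the thresholds ordered without imposing explicit
        # constraints during optimization.
        return b1 + np.concatenate(([0.0], np.cumsum(np.square(deltas))))

    def ncl_errors(member_outputs, target, lam=0.5):
        # Negative correlation learning error for each ensemble member:
        #   e_i = (f_i - y)^2 + lam * p_i,
        #   p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar),
        # where f_bar is the simple-average ensemble output.
        f = np.asarray(member_outputs, dtype=float)
        f_bar = f.mean()
        dev = f - f_bar
        penalty = dev * (dev.sum() - dev)
        return (f - target) ** 2 + lam * penalty

    # Toy usage: three members projecting one sample, and four ordered
    # thresholds built from one free threshold plus three squared increments.
    print(ncl_errors([0.8, 1.1, 0.9], target=1.0))
    print(ordered_thresholds(b1=-1.0, deltas=[0.7, 0.4, 0.9]))

With lam = 0 the members are trained independently on their own errors; increasing lam trades some individual accuracy for diversity across the projections, which is the trade-off negative correlation learning is designed to manage.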
