Article

Constraint-weighted support vector ordinal regression to resist constraint noises

Journal

INFORMATION SCIENCES
Volume 649

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2023.119644

Keywords

Ordinal regression; Constraint-weighted support vector ordinal regression; Constraint weight vector; Constraint noise


This paper proposes a method called Constraint-weighted Support Vector Ordinal Regression (CWSVOR) to address the problem of constraint noises in ordinal regression. By introducing a constraint weight vector to control the influence of constraints on parallel hyperplanes, CWSVOR aims to mitigate the detrimental effects of constraint noises and shows superior performance on training sets corrupted by noises.
Ordinal regression (OR) is a crucial task in machine learning. The usual assumption is that all training instances are correctly labeled; when this assumption does not hold, performance degrades significantly. As a widely used ordinal regression model, support vector ordinal regression (SVOR) identifies r-1 parallel hyperplanes to separate r ranks, where each instance is associated with r-1 constraints, one for each parallel hyperplane. Unlike the traditional classification problem, an instance with an incorrect label may have no influence on certain parallel hyperplanes during SVOR learning. If a constraint induces the deviation of one or more parallel hyperplanes, it is termed a constraint noise. To address constraint noises, this paper proposes constraint-weighted support vector ordinal regression (CWSVOR), which introduces a constraint weight vector of length r-1 to control the influence of the r-1 constraints on the parallel hyperplanes for each instance. When an instance is labeled with an incorrect rank, the elements of the weight vector corresponding to constraint noises are close to 0, while the others remain close to 1. The proposed constraint-weighted strategy mitigates the detrimental effects of constraint noises while retaining the useful constraints during SVOR learning. Experiments on several datasets demonstrate that CWSVOR outperforms KDLOR, ELMOP, NNOP, SVOR and NPSVOR when the training set is corrupted by noises, and it shows comparable performance to pin-SVOR.
