Article

Solving large-scale support vector ordinal regression with asynchronous parallel coordinate descent algorithms

Journal

PATTERN RECOGNITION
Volume 109, Issue -, Pages -

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2020.107592

Keywords

Asynchronous parallel; Coordinate descent; Support vector; Ordinal regression

Funding

  1. Natural Science Foundation of Jiangsu Province [BK20161534]
  2. Six Talent Peaks Project in Jiangsu Province [XYDXX-042]
  3. 333 Project in Jiangsu Province [BRA2017455]
  4. Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions


Ordinal regression is a significant task in supervised learning, and traditional support vector ordinal regression (SVOR) methods are inefficient for large-scale training because of their complex formulation and high kernel-computation cost. This paper highlights a special SVOR formulation with implicit thresholds and introduces two novel asynchronous parallel coordinate descent algorithms, AsyACGD and AsyORGCD, to accelerate SVOR training. Experimental results demonstrate the superiority of the proposed algorithms on several large-scale ordinal regression datasets.
Ordinal regression is one of the most influential tasks in supervised learning. Support vector ordinal regression (SVOR) is an appealing method for tackling ordinal regression problems. However, due to the complexity of the SVOR formulation and the high cost of kernel computation, traditional SVOR solvers are inefficient for large-scale training. To address this problem, in this paper we first highlight a special SVOR formulation whose thresholds are described implicitly, so that its concise dual formulation admits state-of-the-art asynchronous parallel coordinate descent algorithms such as AsyGCD. To further accelerate SVOR training, we propose two novel asynchronous parallel coordinate descent algorithms, called AsyACGD and AsyORGCD. AsyACGD is an accelerated extension of AsyGCD that uses an active-set strategy. AsyORGCD is designed specifically for SVOR: it keeps the thresholds ordered during training and therefore obtains good performance in less time. Experimental results on several large-scale ordinal regression datasets demonstrate the superiority of our proposed algorithms.
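
The abstract describes the contribution at a high level. To make the core technique concrete, below is a minimal sketch of asynchronous greedy coordinate descent in the spirit of AsyGCD, applied to a generic box-constrained QP dual of the form min_a 0.5 aᵀQa − eᵀa subject to 0 ≤ a_i ≤ C (the shape of an SVM-style dual). Everything here (the random stand-in for Q, the exact coordinate step, the thread counts) is an illustrative assumption rather than the authors' implementation, and because of CPython's GIL the threads only emulate the lock-free update pattern rather than delivering real parallel speedup.

```python
# A minimal sketch, assuming a box-constrained QP dual
#     min_a  0.5 * a^T Q a - e^T a   s.t.  0 <= a_i <= C,
# solved by asynchronous *greedy* coordinate descent: each worker
# repeatedly picks the coordinate with the largest projected gradient
# (computed from a possibly stale view of `alpha`) and updates it lock-free.
import threading

import numpy as np

rng = np.random.default_rng(0)
n, C, n_threads, steps = 200, 1.0, 4, 500

A = rng.standard_normal((n, n))
Q = A @ A.T / n                 # random PSD stand-in for a kernel matrix
alpha = np.zeros(n)             # shared iterate, written without locks


def worker() -> None:
    for _ in range(steps):
        g = Q @ alpha - 1.0     # gradient; may be stale under concurrency
        # Projected gradient: zero out descent directions blocked by the box.
        pg = np.where(alpha <= 0.0, np.minimum(g, 0.0),
                      np.where(alpha >= C, np.maximum(g, 0.0), g))
        i = int(np.argmax(np.abs(pg)))          # greedy coordinate choice
        # Exact 1-D minimizer along coordinate i, clipped back to [0, C].
        alpha[i] = np.clip(alpha[i] - g[i] / Q[i, i], 0.0, C)


threads = [threading.Thread(target=worker) for _ in range(n_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"dual objective: {0.5 * alpha @ Q @ alpha - alpha.sum():.4f}")
```

The asynchronous aspect is that each worker reads the shared iterate without locking, so its gradient may be stale by the time it writes; AsyGCD-style analyses establish convergence under bounded staleness. Per the abstract, AsyORGCD additionally keeps the SVOR thresholds ordered during such updates; that SVOR-specific step is not shown in this generic sketch.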

