Article

Hierarchical Bayesian Inference and Recursive Regularization for Large-Scale Classification

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/2629585

Keywords

Design; Algorithms; Experimentation; Large-scale optimization; hierarchical classification; Bayesian methods

Funding

  1. National Science Foundation (NSF) [IIS-1216282]
  2. NSF [CCF-1019104]
  3. Gordon and Betty Moore Foundation in the eScience project

Abstract

In this article, we address open challenges in large-scale classification, focusing on how to effectively leverage the dependency structures (hierarchical or graphical) among class labels, and how to make the inference scalable in jointly optimizing all model parameters. We propose two main approaches, namely the hierarchical Bayesian inference framework and the recursive regularization scheme. The key idea in both approaches is to reinforce the similarity among parameters across the nodes in a hierarchy or network based on the proximity and connectivity of the nodes. For scalability, we develop hierarchical variational inference algorithms and fast dual coordinate descent training procedures with parallelization. In our experiments on classification problems with hundreds of thousands of classes, millions of training instances, and terabytes of model parameters, the proposed methods show consistent and statistically significant improvements over competing approaches, and achieve the best results on multiple benchmark datasets for large-scale classification.
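To illustrate the key idea stated in the abstract, the short Python sketch below computes a recursive-regularization-style penalty that ties each node's weight vector to its parent's, so that nodes that are close in the label hierarchy are encouraged to have similar parameters. This is a minimal illustration under assumed data structures (the names recursive_penalty, W, and parent are hypothetical), not the authors' implementation.

import numpy as np

def recursive_penalty(W, parent):
    """Sum of squared differences between each node's weights and its parent's.

    W      : dict mapping node id -> weight vector (np.ndarray)
    parent : dict mapping node id -> parent node id (root omitted)
    """
    penalty = 0.0
    for node, par in parent.items():
        diff = W[node] - W[par]          # pull child weights toward the parent
        penalty += 0.5 * float(diff @ diff)
    return penalty

# Tiny example: a root with two child classes sharing the root as parent.
rng = np.random.default_rng(0)
W = {"root": rng.normal(size=5),
     "sports": rng.normal(size=5),
     "politics": rng.normal(size=5)}
parent = {"sports": "root", "politics": "root"}

print(recursive_penalty(W, parent))

In a full training objective, a penalty of this form would be added to the classification loss at the leaf nodes and minimized jointly over all node parameters, for example with the parallelized dual coordinate descent training mentioned in the abstract.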
