Article

Hierarchical Bayesian Inference and Recursive Regularization for Large-Scale Classification

Journal

Publisher

Association for Computing Machinery
DOI: 10.1145/2629585

Keywords

Design; Algorithms; Experimentation; Large-scale optimization; hierarchical classification; Bayesian methods

Funding

  1. National Science Foundation (NSF) [IIS-1216282]
  2. NSF [CCF-1019104]
  3. Gordon and Betty Moore Foundation in the eScience project


In this article, we address open challenges in large-scale classification, focusing on how to effectively leverage the dependency structures (hierarchical or graphical) among class labels, and how to make inference scalable when jointly optimizing all model parameters. We propose two main approaches: a hierarchical Bayesian inference framework and a recursive regularization scheme. The key idea in both is to reinforce the similarity among parameters across the nodes of a hierarchy or network, based on the proximity and connectivity of those nodes. For scalability, we develop hierarchical variational inference algorithms and fast dual coordinate descent training procedures with parallelization. In our experiments on classification problems with hundreds of thousands of classes, millions of training instances, and terabytes of parameters, the proposed methods show consistent and statistically significant improvements over competing approaches, and achieve the best results on multiple benchmark datasets for large-scale classification.
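The recursive-regularization idea sketched in the abstract can be illustrated with a toy penalty term. The sketch below is an assumption-laden illustration, not the paper's code: the function name `recursive_reg`, the dictionary-based hierarchy encoding, and the three-node toy tree are all invented for exposition. It shows the core idea of pulling each node's parameter vector toward its parent's, so that nearby classes in the hierarchy share similar models.

```python
import numpy as np

# Illustrative sketch (assumed names/structure, not the authors' implementation).
# Each node n in the class hierarchy has a weight vector w_n; the regularizer
# penalizes parent-child differences:
#   R(W) = (lambda / 2) * sum_n ||w_n - w_parent(n)||^2
# with the root regularized toward zero.

def recursive_reg(weights, parent, lam=1.0):
    """Sum of squared parent-child parameter differences over a hierarchy."""
    total = 0.0
    for node, par in parent.items():
        if par is None:
            # Root node: shrink toward the zero vector.
            total += np.dot(weights[node], weights[node])
        else:
            diff = weights[node] - weights[par]
            total += np.dot(diff, diff)
    return 0.5 * lam * total

# Toy 3-node hierarchy: root -> {a, b}
parent = {"root": None, "a": "root", "b": "root"}
weights = {
    "root": np.zeros(4),
    "a": np.ones(4),
    "b": np.ones(4),
}
print(recursive_reg(weights, parent))  # each child differs from root by 4.0
```

Minimizing such a penalty jointly with per-class losses is what couples the parameters across the hierarchy; the paper's contribution is making that joint optimization scale via parallel dual coordinate descent, which this sketch does not attempt.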

