Article

A classification-oriented dictionary learning model: Explicitly learning the particularity and commonality across categories

Journal

PATTERN RECOGNITION
Volume 47, Issue 2, Pages 885-898

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.08.004

Keywords

Dictionary learning; Sparse coding; Image classification; Particularity; Commonality

Funding

  1. Natural Science Foundation [61071218]
  2. 973 Program of China [2010CB327904]

Abstract

Empirically, we find that although each object category possesses its own exclusively discriminative features, the various classes of objects usually share some common patterns that do not contribute to discriminating among them. Building on this observation, and motivated by the success of the dictionary learning (DL) framework, we propose to explicitly learn a class-specific dictionary (called the particularity) for each category, which captures the most discriminative features of that category, and to simultaneously learn a common pattern pool (called the commonality), whose atoms are shared by all categories and contribute only to the representation of the data rather than to discrimination. In this way, the particularity differentiates the categories while the commonality provides the essential reconstruction of the objects, so a simple reconstruction-based scheme can be adopted for classification. A review of existing DL-based classification methods shows that our approach simultaneously learns a classification-oriented dictionary and drives the sparse coefficients to be as discriminative as possible, which leads to better classification performance. To evaluate our method, we conduct extensive experiments on both synthetic data and real-world benchmarks, comparing against existing DL-based classification algorithms; the results demonstrate the effectiveness of our method. (C) 2013 Elsevier Ltd. All rights reserved.
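The classification rule sketched in the abstract (code a test sample over each class's particularity dictionary concatenated with the shared commonality dictionary, then pick the class with the smallest reconstruction residual) can be illustrated with a minimal Python sketch. This is not the authors' released code: the dictionary shapes and names (`D_list`, `D_common`) are hypothetical, and plain least-squares coding stands in for the sparse coding used in the paper, purely to keep the example self-contained.

```python
import numpy as np

def classify(y, D_list, D_common):
    """Assign y to the class whose [particularity | commonality] dictionary
    reconstructs it with the smallest residual (reconstruction-based scheme)."""
    residuals = []
    for D_c in D_list:
        D = np.hstack([D_c, D_common])              # class-specific + shared atoms
        x, *_ = np.linalg.lstsq(D, y, rcond=None)   # stand-in for sparse coding
        residuals.append(np.linalg.norm(y - D @ x))
    return int(np.argmin(residuals))

# Toy usage with random "dictionaries" (hypothetical sizes): 3 classes,
# 64-dimensional signals, 10 particularity atoms per class, 5 shared atoms.
rng = np.random.default_rng(0)
D_list = [rng.standard_normal((64, 10)) for _ in range(3)]
D_common = rng.standard_normal((64, 5))
y = D_list[1] @ rng.standard_normal(10) + 0.01 * rng.standard_normal(64)
print(classify(y, D_list, D_common))  # expected: 1
```

In the actual method the dictionaries are learned jointly so that the commonality absorbs patterns shared across classes, leaving the class-specific residual differences to drive the decision.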

