Article

Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes

Journal

NEURAL COMPUTATION
Volume 28, Issue 10, Pages 2213-2249

Publisher

MIT PRESS
DOI: 10.1162/NECO_a_00872


Funding

  1. Australian Research Council [DP-140102164, FT-130101457]
  2. Faculty of Engineering and Information Technologies, University of Sydney, under the Faculty Research Cluster Program

Abstract

The k-dimensional coding schemes refer to a collection of methods that attempt to represent data using a set of representative k-dimensional vectors and include nonnegative matrix factorization, dictionary learning, sparse coding, k-means clustering, and vector quantization as special cases. Previous generalization bounds for the reconstruction error of k-dimensional coding schemes are mainly dimensionality-independent. A major advantage of these bounds is that they can be used to analyze the generalization error when data are mapped into an infinite- or high-dimensional feature space. However, many applications use finite-dimensional data features. Can we obtain dimensionality-dependent generalization bounds for k-dimensional coding schemes that are tighter than dimensionality-independent bounds when data lie in a finite-dimensional feature space? Yes. In this letter, we address this problem and derive a dimensionality-dependent generalization bound for k-dimensional coding schemes by bounding the covering number of the loss function class induced by the reconstruction error. The bound is of order O((mk ln(mkn)/n)^(lambda_n)), where m is the dimension of the features, k is the number of columns in the linear implementation of the coding schemes, n is the sample size, and lambda_n > 0.5 when n is finite and lambda_n = 0.5 when n is infinite. We show that our bound can be tighter than previous results because it avoids inducing the worst-case upper bound on k of the loss function. The proposed generalization bound is also applied to some specific coding schemes to demonstrate that the dimensionality-dependent bound is an indispensable complement to the dimensionality-independent generalization bounds.
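In display notation, the rate stated in the abstract can be written as follows (a sketch of the abstract's formula only; the exact constants and the precise dependence of lambda_n on n are given in the paper itself):

```latex
% Dimensionality-dependent generalization bound from the abstract:
% m = feature dimension, k = number of columns in the linear
% implementation of the coding scheme, n = sample size.
\[
  O\!\left( \left( \frac{mk \ln(mkn)}{n} \right)^{\lambda_n} \right),
  \qquad
  \lambda_n > \tfrac{1}{2} \ \text{for finite } n,
  \qquad
  \lambda_n = \tfrac{1}{2} \ \text{as } n \to \infty.
\]
```

Since the exponent approaches 1/2 as n grows, the bound recovers the familiar O(n^{-1/2}) rate of dimensionality-independent analyses in the large-sample limit, while the explicit m and k factors capture the finite-dimensional dependence.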

