Article

A Simple Derivation of a Bound on the Perceptron Margin Using Singular Value Decomposition

Journal

NEURAL COMPUTATION
Volume 23, Issue 8, Pages 1935-1943

Publisher

MIT PRESS
DOI: 10.1162/NECO_a_00152

Funding

  1. DARPA [SyNAPSE HR0011-09-C-0002]
  2. Swartz Foundation
  3. Gatsby Foundation
  4. Rothschild Fellowship
  5. Brainpower for Israel Foundation
  6. Swiss National Science Foundation (SNF) [PBSKP3_133357]

Abstract

The perceptron is a simple supervised algorithm for training a linear classifier, and it has been analyzed and used extensively. The classifier separates the data into two groups with a decision hyperplane; the margin between the data and the hyperplane determines the classifier's ability to generalize and its robustness to input noise. Exact results for the maximal size of the separating margin are known for specific input distributions, and bounds exist for arbitrary distributions, but both rely on lengthy statistical-mechanics calculations carried out in the limit of infinite input size. Here we present a short analysis of perceptron classification using singular value decomposition. We provide a simple derivation of a lower bound on the margin and an explicit formula for the perceptron weights that converges to the optimal result for large separating margins.
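As a concrete illustration of the quantities discussed in the abstract, the NumPy sketch below trains a classic perceptron on linearly separable data, measures the margin of the resulting hyperplane, and compares it with the margin of SVD-based pseudo-inverse weights. This is a minimal sketch under stated assumptions, not the paper's derivation: the pseudo-inverse formula w = X⁺y is used as an illustrative stand-in for the paper's explicit weight formula, and all variable names (X, y, w) are assumptions of this example.

```python
# Minimal sketch (not the paper's derivation): compare the margin reached
# by classic perceptron learning with the margin of pseudo-inverse weights
# computed via SVD. All names and the teacher construction are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 30                       # input dimension N, number of patterns P
X = rng.standard_normal((P, N))     # patterns as rows of X
w_teacher = rng.standard_normal(N)
y = np.sign(X @ w_teacher)          # labels from a teacher => data is separable

def margin(w, X, y):
    """Smallest signed distance of any pattern to the hyperplane w.x = 0:
    kappa = min_i y_i (w . x_i) / ||w||."""
    return np.min(y * (X @ w)) / np.linalg.norm(w)

# Classic perceptron learning rule: update on the first misclassified pattern.
w = np.zeros(N)
for _ in range(10_000):
    errs = y * (X @ w) <= 0
    if not errs.any():
        break                       # converged: all patterns classified correctly
    i = np.flatnonzero(errs)[0]
    w += y[i] * X[i]

# Pseudo-inverse weights w = X^+ y, computed via SVD inside np.linalg.pinv.
# Used here only to illustrate an explicit, training-free weight formula;
# see the paper for the actual formula and the lower bound on the margin.
w_pinv = np.linalg.pinv(X) @ y

print(f"perceptron margin:     {margin(w, X, y):.4f}")
print(f"pseudo-inverse margin: {margin(w_pinv, X, y):.4f}")
```

Run as-is, both margins come out positive, since the teacher construction guarantees separability; the paper's result concerns how large a margin an explicit weight formula can guarantee, which this toy comparison only hints at.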
