Journal: Neural Computation
Volume 17, Issue 1, Pages 177-204
Publisher: MIT Press
DOI: 10.1162/0899766052530802
Funding
- Engineering and Physical Sciences Research Council [GR/T18707/01]
Abstract
In this letter, we provide a study of learning in a Hilbert space of vector-valued functions. We motivate the need for extending the learning theory of scalar-valued functions by practical considerations and establish some basic results for learning vector-valued functions that should prove useful in applications. Specifically, we allow the output space Γ to be a Hilbert space, and we consider a reproducing kernel Hilbert space of functions whose values lie in Γ. In this setting, we derive the form of the minimal norm interpolant to a finite set of data and apply it to study some regularization functionals that are important in learning theory. We consider specific examples of such functionals corresponding to multiple-output regularization networks and support vector machines, for both regression and classification. Finally, we provide classes of operator-valued kernels of the dot product and translation-invariant type.
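To make the abstract's setting concrete, the following is a minimal sketch (not from the paper) of the regularized interpolant for a vector-valued RKHS with a *separable* operator-valued kernel K(x, x') = k(x, x')·B, where k is a scalar Gaussian kernel and B is a symmetric positive-definite matrix coupling the outputs. By the representer theorem the solution has the form f(x) = Σ_j k(x, x_j) B c_j, and the coefficients solve a linear system involving the Kronecker product k(X, X) ⊗ B. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Scalar Gaussian kernel matrix k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_vector_valued(X, Y, B, lam=1e-3, sigma=1.0):
    """Regularized interpolant in the RKHS of the separable
    operator-valued kernel K(x, x') = k(x, x') B.

    X: (n, d) inputs; Y: (n, m) vector-valued outputs;
    B: (m, m) symmetric positive-definite output-coupling matrix.
    Returns C: (n, m) coefficient matrix, one coefficient vector per sample.
    """
    n, m = Y.shape
    Kx = gaussian_kernel(X, X, sigma)              # (n, n) scalar Gram matrix
    G = np.kron(Kx, B)                             # (n*m, n*m) operator-valued Gram
    # Solve (G + lam I) vec(C) = vec(Y); lam -> 0 gives the minimal norm interpolant.
    c = np.linalg.solve(G + lam * np.eye(n * m), Y.reshape(-1))
    return c.reshape(n, m)

def predict(Xnew, X, C, B, sigma=1.0):
    # f(x) = sum_j k(x, x_j) B c_j, stacked row-wise for each new input.
    Kx = gaussian_kernel(Xnew, X, sigma)           # (n_new, n)
    return Kx @ C @ B.T
```

With a small regularization parameter the fit reproduces the training outputs, illustrating the minimal norm interpolation the abstract refers to; choosing B ≠ I is what distinguishes the genuinely vector-valued case from m independent scalar problems.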