Journal
ADVANCES IN INTELLIGENT DATA ANALYSIS XX, IDA 2022
Volume 13205, Pages 354-364
Publisher
SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-01333-1_28
Keywords
Transfer learning; Learning vector quantization; Multiple source learning; Null-space evaluation
Funding
- European Social Fund (ESF)
We present a method that allows training a Generalized Matrix Learning Vector Quantization (GMLVQ) model for classification using data from several, possibly non-calibrated, sources without explicit transfer learning. This is achieved by a siamese-like GMLVQ architecture comprising different sets of prototypes for the target classification and for the separation learning of the sources. In this architecture, a linear map is trained by means of GMLVQ for source distinction in the mapping space, in parallel to the classification task learning. The respective null-space projection provides a common data representation of the different source data for joint classification learning.
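The null-space projection step can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the source-discrimination map (called `omega` here, a hypothetical 1x3 matrix) has already been learned, and shows how projecting data onto its null space removes the source-distinguishing directions so the remaining representation is common across sources.

```python
import numpy as np

def null_space_projector(omega, tol=1e-10):
    """Build the projector onto the null space of the learned map omega.

    Rows of Vt whose singular values are (numerically) zero span the
    null space; the projector is P = N N^T for that orthonormal basis N.
    """
    _, s, vt = np.linalg.svd(omega)
    rank = int(np.sum(s > tol))
    null_basis = vt[rank:].T          # shape (d, d - rank)
    return null_basis @ null_basis.T  # symmetric idempotent projector

# Toy example: omega separates sources along the first coordinate only.
omega = np.array([[1.0, 0.0, 0.0]])
P = null_space_projector(omega)

x = np.array([2.0, 3.0, -1.0])
x_common = P @ x  # source-discriminative component (1st coord) removed
```

In this sketch the projected samples `P @ x` would then feed the ordinary GMLVQ classification prototypes, since directions useful for telling sources apart have been zeroed out.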