Journal
ADVANCES IN INTELLIGENT DATA ANALYSIS XX, IDA 2022
Volume 13205, Pages 354-364
Publisher
SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-01333-1_28
Keywords
Transfer learning; Learning vector quantization; Multiple source learning; Null-space evaluation
Categories
Funding
- European Social Fund (ESF)
We present a method which allows training a Generalized Matrix Learning Vector Quantization (GMLVQ) model for classification using data from several, possibly non-calibrated, sources without explicit transfer learning. This is achieved by a siamese-like GMLVQ architecture that comprises different sets of prototypes for the target classification and for the separation learning of the sources. In this architecture, a linear map is trained by means of GMLVQ for source distinction in the mapping space, in parallel to the classification task learning. The respective null-space projection provides a common data representation of the different source data for joint classification learning.
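The null-space idea can be illustrated outside the GMLVQ training loop: once a linear map has been learned to discriminate between sources, projecting the data onto the null space of that map removes the source-discriminative directions, yielding a representation on which the sources are indistinguishable. The following is a minimal sketch of that projection step only; the function name and the toy map `omega` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def null_space_projection(omega, X, rtol=1e-10):
    """Project data onto the null space of a learned linear map.

    omega : (k, d) matrix trained (e.g., via GMLVQ) to separate sources.
    X     : (n, d) data pooled from multiple sources.
    Returns the (n, d) projection of X onto null(omega), i.e. the data
    with source-discriminative directions removed.
    """
    # Right-singular vectors with (near-)zero singular values span null(omega).
    _, s, vt = np.linalg.svd(omega)
    rank = int(np.sum(s > rtol * s.max()))
    null_basis = vt[rank:]               # (d - rank, d) orthonormal basis
    # Orthogonal projector onto the null space: P = N^T N
    P = null_basis.T @ null_basis
    return X @ P

# Toy example: a map that discriminates sources along the first coordinate.
omega = np.array([[1.0, 0.0, 0.0]])
X = np.random.default_rng(0).normal(size=(5, 3))
Z = null_space_projection(omega, X)
# After projection, omega can no longer distinguish the projected data:
assert np.allclose(Z @ omega.T, 0.0)
```

After this projection, a single classifier can be trained on the pooled representation `Z`, since the directions along which the sources differed have been zeroed out.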
Authors