Article

Eigen-guided deep metric learning

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 203

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2022.117367

Keywords

Deep learning; Metric learning; Mathematical optimization; Computer vision

Funding

  1. Korea Research Institute of Chemical Technology (KRICT) [SI2151-10]


This paper proposes a new approach to improve the optimization performance of gradient-based optimizers for training neural networks in metric learning. In metric learning, the neural network uses a linear activation in its output layer to generate real-valued data embeddings. Hence, the optimal model parameters with respect to the output layer can be computed deterministically via eigenvalue decomposition; these parameters are referred to as the eigen-guidance. The training performance of a gradient-based optimizer is boosted by replacing the current model parameters with the eigen-guidance after each pre-defined number of epochs. In addition, the optimal number of embedding dimensions is determined directly while computing the eigen-guidance, without additional complexity. In the experiments, gradient-based optimizers with the eigen-guidance converged significantly faster than gradient-based optimizers alone on metric learning tasks.
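The abstract describes computing output-layer parameters by eigenvalue decomposition and reading the embedding dimension off the eigenvalue spectrum. A minimal NumPy sketch of that general idea, assuming a PCA-style eigendecomposition of the feature covariance stands in for the paper's eigen-guidance computation (the function name, the variance threshold, and the dimension-selection rule are illustrative assumptions, not the authors' exact method):

```python
import numpy as np

def eigen_guidance(features, var_threshold=0.95):
    """Hypothetical sketch: eigendecompose the feature covariance and
    keep the leading eigenvectors as candidate output-layer weights."""
    cov = np.cov(features, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]          # reorder to descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Pick the embedding dimension directly from the eigenvalue spectrum:
    # smallest k whose leading eigenvalues explain var_threshold of variance.
    ratio = np.cumsum(eigvals) / np.sum(eigvals)
    k = int(np.searchsorted(ratio, var_threshold) + 1)
    return eigvecs[:, :k], k

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                 # stand-in penultimate features
W, k = eigen_guidance(X)
print(W.shape, k)
```

In a training loop, the periodic "replacement" step from the abstract would correspond to overwriting the linear output layer's weights with `W` every fixed number of epochs, then resuming gradient-based training.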
