Article

Metric and non-metric proximity transformations at linear costs

Journal

NEUROCOMPUTING
Volume 167, Pages 643-657

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2015.04.017

Keywords

Dissimilarity learning; Linear eigenvalue correction; Nyström approximation; Double centering; Pseudo-Euclidean; Indefinite kernel

Funding

  1. Cluster of Excellence 277 Cognitive Interaction Technology - German Excellence Initiative
  2. Marie Curie Intra-European Fellowship (IEF): FP7-PEOPLE-IEF [FP7-327791-ProMoS]

Abstract

Domain-specific (dis-)similarity or proximity measures, used e.g. in alignment algorithms for sequence data, are popular for analyzing complicated data objects and capturing domain-specific data properties. Without an underlying vector space, such data are given as pairwise (dis-)similarities only. The few available methods for such data focus mostly on similarities and do not scale to large datasets. Kernel methods are very effective for metric similarity matrices, also at large scale, but costly transformations are necessary when starting from non-metric (dis-)similarities. We propose an integrative combination of Nyström approximation, potential double centering, and eigenvalue correction to obtain valid kernel matrices at linear cost in the number of samples. The proposed approach makes effective kernel methods accessible for such data. Experiments with several larger (dis-)similarity datasets show that the proposed method achieves much better runtime performance than the standard strategy while keeping competitive model accuracy. The main contribution is an efficient and accurate technique to convert (potentially non-metric) large-scale dissimilarity matrices into approximated positive semi-definite kernel matrices at linear cost.
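The pipeline the abstract describes (landmark-based Nyström approximation, double centering of the dissimilarities, and an eigenvalue correction of the resulting indefinite similarity) can be sketched compactly. Below is a minimal NumPy sketch, assuming a symmetric dissimilarity matrix and using eigenvalue clipping as the correction; all function and variable names are illustrative and not taken from the paper's implementation, and the standard double centering S = -0.5 * J D J is applied implicitly to the Nyström reconstruction so that no n x n matrix is ever formed.

import numpy as np

def dissimilarity_to_psd_factor(D, m, seed=0):
    """Nystroem + double centering + eigenvalue clipping for a symmetric,
    possibly non-metric (n x n) dissimilarity matrix D.  Returns landmark
    indices and a factor L (n x r) with K ~= L @ L.T positive semi-definite.
    Cost is O(n m^2 + m^3), i.e. linear in n.  Illustrative sketch only;
    the clip correction is one of several possible corrections."""
    n = D.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=False)

    C = D[:, idx]                      # n x m landmark columns of D
    W = D[np.ix_(idx, idx)]            # m x m landmark block

    # Double centering applied to the reconstruction D^ = C W^+ C^T:
    # S = -0.5 * J D^ J = -0.5 * (J C) W^+ (J C)^T, J = I - (1/n) 11^T.
    B = C - C.mean(axis=0, keepdims=True)

    # Move the nonzero spectrum of S to a small symmetric eigenproblem.
    G = B.T @ B                              # m x m Gram matrix of B
    eg, Vg = np.linalg.eigh(G)
    keep = eg > 1e-10 * eg.max()
    F = Vg[:, keep] * np.sqrt(eg[keep])      # G = F F^T
    P = Vg[:, keep] / np.sqrt(eg[keep])      # P^T F = I

    Q = F.T @ np.linalg.pinv(W) @ F          # shares nonzero spectrum with B W^+ B^T
    sig, V = np.linalg.eigh(Q)

    lam = -0.5 * sig                         # eigenvalues of S (possibly negative)
    lam_clip = np.maximum(lam, 0.0)          # eigenvalue correction: clip negatives

    U = B @ (P @ V)                          # orthonormal eigenvectors of S
    L = U * np.sqrt(lam_clip)                # K ~= L @ L.T is PSD by construction
    return idx, L

if __name__ == "__main__":
    # L1 (Manhattan) distances are metric but non-Euclidean, so the centered
    # similarity matrix is typically indefinite -- a natural test case.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 5))
    D = np.abs(X[:, None, :] - X[None, :, :]).sum(-1)
    idx, L = dissimilarity_to_psd_factor(D, m=50)
    K = L @ L.T
    print(np.linalg.eigvalsh(K).min() >= -1e-8)   # PSD check: True

Under these assumptions the only O(n) work is forming the m landmark columns and the products with B; everything else is m x m or r x r, which is what makes the overall transformation linear in the number of samples. The clip step could be exchanged for other corrections (e.g. flipping negative eigenvalues) without changing the cost.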

