Article

Analysis of Kernel Matrices via the von Neumann Entropy and Its Relation to RVM Performances

Journal

ENTROPY
Volume 25, Issue 1, Pages -

Publisher

MDPI
DOI: 10.3390/e25010154

Keywords

von Neumann entropy; relevance vector machines; generalization error

Abstract

Kernel methods have played a major role over the last two decades in the modeling and visualization of complex problems in data science. The choice of kernel function remains an open research area, and the reasons why some kernels perform better than others are not yet understood. Moreover, the high computational costs of kernel-based methods make it extremely inefficient to use standard model selection methods, such as cross-validation, creating a need for careful kernel design and parameter choice. These reasons justify prior analysis of kernel matrices, i.e., the mathematical objects generated by the kernel functions. This paper explores these topics from an entropic standpoint for the case of kernelized relevance vector machines (RVMs), pinpointing desirable properties of kernel matrices that increase the likelihood of good model performance in terms of generalization power, and relating these properties to the model's fitting ability. We also derive a heuristic for achieving close-to-optimal modeling results while keeping the computational costs low, thus providing a recipe for efficient analysis when processing resources are limited.
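The entropic analysis described in the abstract centers on the von Neumann entropy of a kernel (Gram) matrix. As a minimal sketch of how such a quantity can be computed, the snippet below uses the standard definition: normalize the positive semidefinite kernel matrix to unit trace so that its eigenvalues form a probability distribution, then take the Shannon entropy of that spectrum. The RBF kernel, the parameter name gamma, and the helper names rbf_kernel_matrix and von_neumann_entropy are illustrative assumptions; the paper's exact kernels and normalization conventions are not given in this record.

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=1.0):
    # Hypothetical helper: pairwise squared Euclidean distances,
    # then the Gaussian (RBF) kernel exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def von_neumann_entropy(K):
    # Standard definition (assumed here): normalize the PSD kernel matrix
    # to unit trace, so its eigenvalues sum to 1, then apply -sum(p log p).
    p = np.linalg.eigvalsh(K) / np.trace(K)
    p = p[p > 1e-12]  # drop numerically zero eigenvalues
    return float(-np.sum(p * np.log(p)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    # Wider kernels concentrate the spectrum (low entropy); narrower kernels
    # spread it out (high entropy), which is the kind of property the paper
    # relates to generalization and fitting ability.
    for gamma in (0.01, 0.1, 1.0, 10.0):
        K = rbf_kernel_matrix(X, gamma=gamma)
        print(f"gamma={gamma:>5}: H(K) = {von_neumann_entropy(K):.3f}")
```

Because the entropy depends only on the eigenvalue spectrum of the Gram matrix, it can be evaluated before any RVM training, which is what makes it attractive as a cheap prior diagnostic when cross-validation is too expensive.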
