Article

Automatic Tuning of the RBF Kernel Parameter for Batch-Mode Active Learning Algorithms: A Scalable Framework

Journal

IEEE TRANSACTIONS ON CYBERNETICS
Volume 49, Issue 12, Pages 4460-4472

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2018.2869861

Keywords

Active learning; automatic tuning of radial-basis function (RBF) kernel parameters; hypothesis margins

Funding

  1. Ministry of Science and Technology of Taiwan [MOST 105-2221-E-019-070]

Abstract

Batch-mode active learning algorithms select a batch of valuable unlabeled samples for manual annotation, reducing the total cost of labeling every unlabeled sample. To facilitate this selection, many batch-mode active learning algorithms map samples into the reproducing kernel Hilbert space induced by a radial-basis function (RBF) kernel, so setting a proper value for the RBF kernel parameter is crucial for such algorithms. In this paper, a hypothesis-margin-based criterion function is proposed for automatic tuning of the kernel parameter. Three frameworks are also developed to incorporate this automatic tuning into existing batch-mode active learning algorithms. In the proposed frameworks, the kernel parameter can be tuned in a single stage or in multiple stages. Single-stage tuning seeks a kernel parameter suitable for selecting the specified number of unlabeled samples. When the kernel parameter is tuned in multiple stages, the incorporated active learning algorithm is driven to make coarse-to-fine evaluations of the importance of unlabeled samples. The proposed frameworks can also improve the scalability of existing batch-mode active learning algorithms that satisfy a decomposition property. Experimental results on data sets comprising hundreds to hundreds of thousands of samples demonstrate the feasibility of the proposed frameworks.
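The abstract's core idea can be illustrated with a minimal sketch: score each candidate RBF kernel parameter by the average hypothesis margin (distance to the nearest miss minus distance to the nearest hit, measured in the kernel-induced space) over the labeled samples, and keep the parameter that maximizes it. The function names, the candidate-grid search, and the exact margin formula below are illustrative assumptions, not the paper's actual criterion or frameworks.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # K(x, y) = exp(-gamma * ||x - y||^2)
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def hypothesis_margin_score(X, y, gamma):
    """Illustrative criterion (assumption, not the paper's exact formula):
    mean hypothesis margin in the RKHS induced by an RBF kernel.
    Assumes every class has at least two labeled samples."""
    K = rbf_kernel(X, X, gamma)
    # For an RBF kernel, ||phi(xi) - phi(xj)||^2 = 2 - 2 * K_ij
    D = 2.0 - 2.0 * K
    np.fill_diagonal(D, np.inf)  # exclude self-distances
    margins = []
    for i in range(len(X)):
        hits = y == y[i]
        hits[i] = False          # a sample is not its own nearest hit
        misses = y != y[i]
        # margin = distance to nearest miss - distance to nearest hit
        margins.append(np.sqrt(D[i][misses].min())
                       - np.sqrt(D[i][hits].min()))
    return float(np.mean(margins))

def tune_gamma(X, y, candidates):
    # Pick the kernel parameter maximizing the average hypothesis margin
    return max(candidates, key=lambda g: hypothesis_margin_score(X, y, g))
```

In a batch-mode setting, such a criterion would be evaluated on the currently labeled pool before each selection round (or, in the multi-stage variant described above, re-evaluated between stages), so the kernel parameter tracks the growing labeled set.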
