Journal
NEUROCOMPUTING
Volume 423, Issue -, Pages 24-33
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2020.09.061
Keywords
Extreme learning machine; Sparse Bayesian; Multi-class classification; Sparse learning; Multinomial distribution
Funding
- University of Macau [MYRG2018-00138-FST]
- Science and Technology Development Fund, Macau SAR [273/2017/A, 004/2019/AFJ]
Sparse Bayesian extreme learning machine (SBELM) faces challenges in multi-class classification. To address them, this work proposes the multinomial Bayesian extreme learning machine (MBELM), which models class labels with a multinomial distribution and integrates two sparsity mechanisms: automatic relevance determination (ARD) and an L1 penalty. Experimental results show that the new model significantly improves accuracy and reduces model size.
Sparse Bayesian extreme learning machine (SBELM) is a probabilistic three-layer neural network model that is superior to the extreme learning machine (ELM) in generalization, sparsity, and execution time. SBELM employs a Bernoulli distribution for binary classification and is extended to multi-class classification via pairwise coupling. However, pairwise coupling suffers from three significant drawbacks in the multi-class setting: 1) classification ambiguity and uncovered class regions; 2) large model size; 3) insufficient representation of label-prediction uncertainty in the output probabilities. To alleviate these drawbacks, the multinomial Bayesian extreme learning machine (MBELM) is proposed, which employs a multinomial distribution for multi-class classification. To accommodate different trade-offs between sparsity and accuracy, two sparse mechanisms, namely automatic relevance determination (ARD) and an L1 penalty, are respectively integrated with MBELM. Experimental results show that, compared to SBELM, the proposed MBELM improves test accuracy by up to 5% and yields models up to 94 times smaller for multi-class classification. (c) 2020 Elsevier B.V. All rights reserved.
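To illustrate the core idea in the abstract, the sketch below builds an ELM-style three-layer network whose output is a single multinomial (softmax) model over all classes, rather than a set of pairwise-coupled binary classifiers. This is a minimal illustrative sketch, not the paper's method: the toy data, the random-feature construction, and the ridge-regression training of the output weights (standing in for the paper's sparse Bayesian inference with ARD or an L1 penalty) are all assumptions introduced here.

```python
# Minimal sketch of an ELM-style network with a multinomial (softmax)
# output layer. Ridge-regularized least squares is used here as a simple
# stand-in for the paper's sparse Bayesian training of the output weights.
import numpy as np

rng = np.random.default_rng(0)

def elm_features(X, W, b):
    """Random hidden layer: sigmoid(X @ W + b); W and b are never trained."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # subtract row max for stability
    e = np.exp(Z)
    return e / e.sum(axis=1, keepdims=True)

# Toy 3-class data: three well-separated Gaussian blobs in 2-D.
n, classes = 90, 3
X = np.vstack([rng.normal(c * 3.0, 0.5, size=(n // classes, 2))
               for c in range(classes)])
y = np.repeat(np.arange(classes), n // classes)

# Random, fixed hidden weights (the "extreme" part of ELM).
hidden = 20
W = rng.normal(size=(2, hidden))
b = rng.normal(size=hidden)
H = elm_features(X, W, b)

# One-hot targets; solve (H'H + lambda*I) beta = H'T for the output weights.
T = np.eye(classes)[y]
beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(hidden), H.T @ T)

# A single multinomial model: probabilities over all classes sum to 1 by
# construction, avoiding the ambiguity of pairwise coupling.
P = softmax(H @ beta)
pred = P.argmax(axis=1)
print("train accuracy:", (pred == y).mean())
```

Because the softmax output is a proper distribution over all classes, every point receives a calibrated probability vector, which is the uncertainty-representation advantage the abstract attributes to the multinomial formulation over pairwise coupling.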