Journal
IEEE ACCESS
Volume 7, Pages 149890-149899
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2947359
Keywords
Training; Resistance; Support vector machines; Error analysis; Licenses; Prediction algorithms; Training data; AdaBoost; feature learning; overfitting; SVM
Funding
- National Natural Science Foundation of China [61603291, 61772427, 61751202]
- National Major Science and Technology Projects of China [2018ZX01008103]
- Natural Science Basic Research Plan in Shaanxi Province of China [2018JM6057]
- Fundamental Research Funds for the Central Universities
The AdaBoost algorithm is notably resistant to overfitting, and understanding this phenomenon is a fascinating fundamental theoretical problem. Many studies have sought to explain it from the statistical view and from margin theory. In this paper, the phenomenon is illustrated from a feature-learning viewpoint through the proposed AdaBoostSVM algorithm, which clearly explains AdaBoost's resistance to overfitting. First, we adopt the AdaBoost algorithm to learn the base classifiers. Then, instead of combining the base classifiers directly, we regard their outputs as features and feed them to an SVM classifier. From this, new coefficients and a new bias are obtained and used to construct the final classifier. We explain the rationality of this construction and prove a theorem stating that as the dimension of these features increases, the performance of the SVM does not degrade, which explains the resistance to overfitting of AdaBoost.
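The pipeline described in the abstract can be sketched as follows. This is a minimal illustration assuming scikit-learn, not the authors' implementation: AdaBoost is trained to obtain base classifiers, each base classifier's ±1 prediction is treated as one feature, and a linear SVM refit on those features supplies the new combination coefficients and bias. The helper name `base_features` and all hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Toy binary classification data (illustrative only).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# Step 1: learn base classifiers with AdaBoost
# (the default base learner is a depth-1 decision tree, i.e. a stump).
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Step 2: treat each base classifier's prediction, mapped to {-1, +1},
# as one feature of the sample.
def base_features(X):
    return np.column_stack([2 * h.predict(X) - 1 for h in ada.estimators_])

# Step 3: fit a linear SVM on these features; its learned weights and
# bias replace AdaBoost's original combination coefficients.
svm = LinearSVC(C=1.0).fit(base_features(X_tr), y_tr)
acc = svm.score(base_features(X_te), y_te)
```

Increasing `n_estimators` widens the feature vector handed to the SVM; the paper's theorem concerns exactly this setting, arguing the SVM's performance does not get worse as that dimension grows.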