Journal
INFORMATION SCIENCES
Volume 557, Issue -, Pages 130-152
Publisher
ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.12.067
Keywords
Accuracy; Ensemble classifier; Random forest; Speed; Subbagging
Funding
- Australian Government Research Training Program (RTP) scholarship
The FastForest algorithm, through its three optimising components, achieves faster model-training speed on hardware-constrained devices while maintaining high accuracy, and runs on both PC and smartphone platforms. Empirical testing shows favourable performance against other ensemble classifiers across a range of tests.
Random Forest remains one of Data Mining's most enduring ensemble algorithms, achieving well-documented levels of accuracy and processing speed, as well as regularly appearing in new research. However, with data mining now reaching the domain of hardware-constrained devices such as smartphones and Internet of Things (IoT) devices, there is continued need for further research into algorithm efficiency to deliver greater processing speed without sacrificing accuracy. Our proposed FastForest algorithm achieves this result through a combination of three optimising components - Subsample Aggregating ('Subbagging'), Logarithmic Split-Point Sampling and Dynamic Restricted Subspacing. Empirical testing shows FastForest delivers an average 24% increase in model-training speed compared with Random Forest whilst maintaining (and frequently exceeding) classification accuracy over tests involving 45 datasets on both PC and smartphone platforms. Further tests show FastForest achieves favourable results against a number of ensemble classifiers including implementations of Bagging and Random Subspace. With growing interest in machine-learning on mobile devices, FastForest provides an efficient ensemble classifier that can achieve faster results on hardware-constrained devices, such as smartphones. (C) 2021 Elsevier Inc. All rights reserved.
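The first of the three optimising components, Subsample Aggregating ('Subbagging'), trains each base learner on a random subsample drawn without replacement rather than a full-size bootstrap sample, which shrinks per-learner training cost. The sketch below illustrates the general subbagging idea only, not the paper's actual implementation: the base learner (a toy one-feature decision stump), the `subsample_ratio` value, and the ensemble size are all illustrative assumptions.

```python
import random
from collections import Counter

def train_stump(X, y):
    """Fit a one-feature threshold stump by exhaustive search (toy base learner)."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for left, right in ((0, 1), (1, 0)):
                preds = [left if row[f] <= t else right for row in X]
                acc = sum(p == yi for p, yi in zip(preds, y))
                if best is None or acc > best[0]:
                    best = (acc, f, t, left, right)
    _, f, t, left, right = best
    return lambda row: left if row[f] <= t else right

def subbagging(X, y, n_estimators=11, subsample_ratio=0.5, seed=0):
    """Train each base learner on a subsample drawn WITHOUT replacement,
    then aggregate predictions by majority vote."""
    rng = random.Random(seed)
    n_sub = max(1, int(len(X) * subsample_ratio))
    models = []
    for _ in range(n_estimators):
        # Sampling without replacement is the key difference from bagging's bootstrap.
        idx = rng.sample(range(len(X)), n_sub)
        models.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))

    def predict(row):
        votes = Counter(m(row) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# Toy separable data: class 1 when the single feature exceeds 0.5.
X = [[0.1], [0.2], [0.3], [0.4], [0.6], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
model = subbagging(X, y)
print(model([0.15]), model([0.85]))
```

Because each learner sees only a fraction of the data, training each tree is cheaper than in bootstrap bagging, while the majority vote still recovers the underlying decision boundary on this toy dataset.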