Article

Boosting Tree-Assisted Multitask Deep Learning for Small Scientific Datasets

Journal

Journal of Chemical Information and Modeling
Volume 60, Issue 3, Pages 1235-1244

Publisher

American Chemical Society
DOI: 10.1021/acs.jcim.9b01184

Keywords

-

Funding

  1. NSF [DMS-1721024, DMS-1761320, IIS-1900473]
  2. NIH [GM126189]
  3. Bristol-Myers Squibb
  4. Pfizer
  5. China Scholarship Council
  6. National Natural Science Foundation of China [61573011, 11972266]

Abstract

Machine learning approaches have had tremendous success in various disciplines. However, such success depends heavily on the size and quality of the datasets. Scientific datasets are often small and difficult to collect. Currently, improving machine learning performance for small scientific datasets remains a major challenge in many academic fields, such as bioinformatics and medical science. Gradient boosting decision trees (GBDTs) are typically optimal for small datasets, while deep learning often performs better for large datasets. This work reports a boosting tree-assisted multitask deep learning (BTAMDL) architecture that integrates GBDT and multitask deep learning (MDL) to achieve near-optimal predictions for small datasets when there exists a large dataset that is well correlated with the small ones. Two BTAMDL models are constructed: one uses purely the MDL output as the GBDT input, while the other admits additional features into the GBDT input. The proposed BTAMDL models are validated on four categories of datasets, covering toxicity, partition coefficient, solubility, and solvation. The proposed BTAMDL models are found to outperform current state-of-the-art methods in various applications involving small datasets.
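
As a rough illustration of the two-stage design described above, the sketch below trains a multitask network on a large dataset and feeds its outputs to a GBDT fitted on a small, correlated dataset. It is a minimal sketch using scikit-learn stand-ins (MLPRegressor for the multitask deep network, GradientBoostingRegressor for the GBDT) and synthetic toy data in place of molecular descriptors; all names, sizes, and hyperparameters are illustrative assumptions rather than the authors' actual setup.

```python
# BTAMDL sketch (illustrative only): an MLPRegressor stands in for the
# multitask deep network and a GradientBoostingRegressor for the GBDT.
# Data, sizes, and hyperparameters are toy assumptions, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_features = 20
w = np.linspace(1.0, 0.1, n_features)  # shared structure linking the tasks

def labels(X, shift):
    """Toy property sharing the same underlying structure across tasks."""
    return X @ w + shift + rng.normal(scale=0.5, size=len(X))

# "Large" dataset: many samples with labels for two correlated tasks.
X_large = rng.normal(size=(5000, n_features))
Y_large = np.column_stack([labels(X_large, 0.0), labels(X_large, 1.5)])

# "Small" dataset: few samples, labelled only for the second (target) task.
X_small = rng.normal(size=(300, n_features))
y_small = labels(X_small, 1.5)

# Stage 1: multitask deep learning (MDL) trained on the large dataset.
mdl = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
mdl.fit(X_large, Y_large)

# Stage 2: GBDT trained on the small dataset, using MDL outputs as input.
X_tr, X_te, y_tr, y_te = train_test_split(
    X_small, y_small, test_size=0.3, random_state=0)
Z_tr, Z_te = mdl.predict(X_tr), mdl.predict(X_te)

# BTAMDL-1: the GBDT input is purely the MDL output.
gbdt1 = GradientBoostingRegressor(random_state=0).fit(Z_tr, y_tr)
print("BTAMDL-1 R^2:", r2_score(y_te, gbdt1.predict(Z_te)))

# BTAMDL-2: the GBDT input is the MDL output plus additional features.
gbdt2 = GradientBoostingRegressor(random_state=0).fit(
    np.hstack([Z_tr, X_tr]), y_tr)
print("BTAMDL-2 R^2:", r2_score(y_te, gbdt2.predict(np.hstack([Z_te, X_te]))))
```

In the paper the multitask network learns correlated molecular properties jointly; here the large toy dataset simply carries labels for both correlated tasks so the example stays self-contained.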
