Article

Transformer-CNN: Swiss knife for QSAR modeling and interpretation

Journal

JOURNAL OF CHEMINFORMATICS
Volume 12, Issue 1

Publisher

BMC
DOI: 10.1186/s13321-020-00423-w

Keywords

Transformer model; Convolutional neural networks; Augmentation; QSAR; SMILES; Embeddings; Character-based models; Cheminformatics; Regression; Classification

Funding

  1. European Union [676434]
  2. ERA-CVD CardioOncology project [BMBF 01KL1710]


We present SMILES embeddings derived from the internal encoder state of a Transformer [1] model trained to canonize SMILES as a Seq2Seq problem. Applying a CharNN [2] architecture on top of these embeddings yields higher-quality, interpretable QSAR/QSPR models on diverse benchmark datasets covering both regression and classification tasks. The proposed Transformer-CNN method uses SMILES augmentation for training and inference, so each prognosis is based on an internal consensus. Because both the augmentation and transfer learning operate on embeddings, the method also provides good results for small datasets. We discuss the reasons for this effectiveness and outline future directions for the development of the method. The source code and the embeddings needed to train a QSAR model are available on . The repository also contains a standalone program for QSAR prognosis that calculates individual atom contributions, thus interpreting the model's results. The OCHEM [3] environment () hosts the on-line implementation of the proposed method.
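The augmentation-based consensus described in the abstract can be sketched as follows: the model scores several equivalent (non-canonical) SMILES of the same molecule, and the final prognosis is their average. This is a minimal illustration, not the authors' implementation: the toluene SMILES variants are hand-written rather than randomly generated (in practice one might use RDKit's `MolToSmiles(mol, doRandom=True)`), and `predict_one` is a hypothetical stand-in for the trained Transformer-CNN model.

```python
from statistics import mean

def predict_one(smiles: str) -> float:
    # Hypothetical per-SMILES prediction; a real QSAR model would return
    # a property value for this string. The length-based placeholder here
    # only serves to make the sketch runnable.
    return float(len(smiles))

def consensus_predict(smiles_variants: list[str]) -> float:
    # Average the individual predictions over augmented SMILES:
    # the "internal consensus" of the abstract.
    return mean(predict_one(s) for s in smiles_variants)

# Three equivalent SMILES spellings of toluene.
toluene_variants = ["Cc1ccccc1", "c1ccccc1C", "c1ccc(C)cc1"]
print(consensus_predict(toluene_variants))
```

With a real model, the spread of the individual predictions can also serve as a rough confidence estimate for the consensus value.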

