Article

BionoiNet: ligand-binding site classification with off-the-shelf deep neural network

Journal

BIOINFORMATICS
Volume 36, Issue 10, Pages 3077-3083

Publisher

OXFORD UNIV PRESS
DOI: 10.1093/bioinformatics/btaa094

Keywords

-

Funding

  1. National Institute of General Medical Sciences of the National Institutes of Health [R35GM119524]
  2. US National Science Foundation [CCF-1619303]
  3. Louisiana Board of Regents [LEQSF(2016-19)-RD-B-03]
  4. Center for Computation and Technology, Louisiana State University

Abstract

Motivation: Fast and accurate classification of ligand-binding sites in proteins with respect to the class of binding molecules is invaluable not only to the automatic functional annotation of large datasets of protein structures but also to projects in protein evolution, protein engineering and drug development. Deep learning techniques, which have already been successfully applied to challenging problems across various fields, are inherently suitable for classifying ligand-binding pockets. Our goal is to demonstrate that off-the-shelf deep learning models can be employed with minimal development effort to recognize nucleotide- and heme-binding sites with accuracy comparable to that of highly specialized, voxel-based methods.

Results: We developed BionoiNet, a new deep learning-based framework implementing a popular ResNet model for image classification. BionoiNet first transforms the molecular structures of ligand-binding sites into 2D Voronoi diagrams, which are then used as the input to a pretrained convolutional neural network classifier. The ResNet model generalizes well to unseen data, achieving an accuracy of 85.6% for nucleotide- and 91.3% for heme-binding pockets. BionoiNet also computes significance scores of pocket atoms, called BionoiScores, to provide meaningful insights into their interactions with ligand molecules. BionoiNet is a lightweight alternative to computationally expensive 3D architectures.
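The abstract describes a two-stage pipeline: render each binding pocket as a 2D Voronoi diagram, then classify the image with a pretrained CNN. The record does not include the authors' code, so the sketch below is a minimal, hypothetical illustration of such a pipeline in Python; the PCA projection, the per-atom cell coloring, and the choice of a torchvision ResNet-18 with a replaced classification head are assumptions made for this example, not details taken from the paper.

```python
"""Hypothetical BionoiNet-style pipeline (illustration only, not the authors' code)."""
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt
from scipy.spatial import Voronoi
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image


def project_to_2d(coords: np.ndarray) -> np.ndarray:
    """Project Nx3 atom coordinates onto their two principal axes (assumed projection)."""
    centered = coords - coords.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # PCA via SVD
    return centered @ vt[:2].T


def voronoi_image(points2d: np.ndarray, colors: np.ndarray, path: str = "pocket.png") -> str:
    """Rasterize a colored Voronoi diagram of the projected pocket atoms."""
    vor = Voronoi(points2d)
    fig, ax = plt.subplots(figsize=(2.56, 2.56), dpi=100)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if not region or -1 in region:  # skip unbounded cells at the hull
            continue
        polygon = vor.vertices[region]
        ax.fill(*zip(*polygon), color=colors[i])
    ax.set_axis_off()
    fig.savefig(path, bbox_inches="tight", pad_inches=0)
    plt.close(fig)
    return path


def build_classifier(num_classes: int = 2) -> nn.Module:
    """Pretrained ResNet-18 with its head replaced for pocket classes (assumed depth)."""
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net


if __name__ == "__main__":
    coords = np.random.rand(60, 3) * 10.0          # placeholder pocket atoms
    colors = plt.cm.viridis(np.random.rand(60))    # placeholder per-atom property
    img_path = voronoi_image(project_to_2d(coords), colors)

    preprocess = transforms.Compose([transforms.Resize((224, 224)),
                                     transforms.ToTensor()])
    x = preprocess(Image.open(img_path).convert("RGB")).unsqueeze(0)

    model = build_classifier()
    model.eval()
    with torch.no_grad():
        logits = model(x)  # meaningful only after fine-tuning on labeled pockets
    print(logits)
```

In a working system the ResNet would be fine-tuned on labeled nucleotide- and heme-binding pockets; the pretrained weights only supply a generic image-feature extractor.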

Authors

-
