Journal
JOURNAL OF COMPUTATIONAL BIOLOGY
Volume 30, Issue 1, Pages 95-111
Publisher
MARY ANN LIEBERT, INC
DOI: 10.1089/cmb.2022.0132
Keywords
neural networks; protein family classification; protein-protein interaction prediction
Abstract
The scientific community is rapidly generating protein sequence information, but only a fraction of these proteins can be experimentally characterized. While promising deep learning approaches for protein prediction tasks have emerged, they either have computational limitations or are designed to solve a single task. We present a Transformer neural network that pre-trains task-agnostic sequence representations. This model is then fine-tuned to solve two different protein prediction tasks: protein family classification and protein interaction prediction. Our method is comparable to existing state-of-the-art approaches for protein family classification while being much more general than other architectures. Further, our method outperforms other approaches for protein interaction prediction in two of the three evaluation scenarios that we generated. These results offer a promising framework for fine-tuning the pre-trained sequence representations for other protein prediction tasks.
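The pre-train/fine-tune pattern the abstract describes can be illustrated with a minimal sketch: a tiny self-attention encoder maps an amino-acid sequence to a fixed-size, task-agnostic embedding, and a small task-specific head (here, for pairwise interaction prediction) is trained on top. All names, sizes, and the random "pre-trained" weights below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Hypothetical sketch of the pre-train/fine-tune idea from the abstract.
# Real representations would come from masked-language-model pre-training
# on large protein corpora; random weights stand in for them here.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

rng = np.random.default_rng(0)
D = 16  # embedding dimension (illustrative)

embed = rng.normal(size=(len(AMINO_ACIDS), D))      # token embeddings
Wq, Wk, Wv = (rng.normal(size=(D, D)) for _ in range(3))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def encode(seq):
    """One self-attention layer + mean pooling -> fixed-size vector."""
    X = embed[[AA_INDEX[aa] for aa in seq]]          # (L, D)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(D))                # (L, L) attention
    return (A @ V).mean(axis=0)                      # (D,) pooled

# Fine-tuning stage: a task head on the encoder output, e.g. binary
# protein-protein interaction prediction on a pair of sequences.
w_head = rng.normal(size=2 * D)

def predict_interaction(seq_a, seq_b):
    h = np.concatenate([encode(seq_a), encode(seq_b)])
    return 1.0 / (1.0 + np.exp(-(w_head @ h)))       # probability in (0, 1)

print(predict_interaction("MKTAYIAK", "GAVLIPF"))
```

The same frozen `encode` output could feed a different head (e.g. a family classifier), which is the generality the abstract claims for the task-agnostic representations.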