Journal
INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES
Volume 23, Issue 6
Publisher
MDPI
DOI: 10.3390/ijms23062966
Keywords
deep neural network; feature vector; protein; protein fold classification
In this study, a new deep neural network architecture called BioS2Net is proposed for extracting sequential and structural information of biomolecules. The performance of BioS2Net is evaluated on two protein fold classification datasets, demonstrating its effectiveness and reliability in protein fold recognition.
Background: For decades, new biomolecular structures have been solved faster than they can be manually classified and characterised, so a comprehensive, holistic tool for their examination is needed. Methods: Here we propose the Biological Sequence and Structure Network (BioS2Net), a novel deep neural network architecture that extracts both sequential and structural information of biomolecules. Our architecture consists of four main parts: (i) a sequence convolutional extractor, (ii) a 3D structure extractor, (iii) a 3D structure-aware sequence temporal network, and (iv) a fusion and classification network. Results: We evaluated our approach on two protein fold classification datasets. BioS2Net achieved a 95.4% mean class accuracy on the eDD dataset and a 76% mean class accuracy on the F184 dataset. The accuracy obtained on the eDD dataset was comparable to that of previously published methods, confirming that the algorithm described in this article is a top-class solution for protein fold recognition. Conclusions: BioS2Net is a novel tool for the holistic examination of biomolecules of known structure and sequence. It is a reliable tool for protein analysis and for representing proteins as unified feature vectors.
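The abstract only names the four components of BioS2Net, not their internals. Purely for illustration, the sketch below shows one plausible way such a four-branch design could be wired up in PyTorch: a 1D-convolutional sequence branch, a PointNet-style shared MLP over residue coordinates as the structure branch, a recurrent layer over the concatenated per-residue features as the structure-aware temporal branch, and a pooled fusion-and-classification head. All layer choices, sizes, and the class name `BioS2NetSketch` are assumptions, not the published architecture.

```python
import torch
import torch.nn as nn

class BioS2NetSketch(nn.Module):
    """Hypothetical sketch of the four-part design named in the abstract."""

    def __init__(self, n_residue_features=21, n_classes=27, hidden=64):
        super().__init__()
        # (i) sequence convolutional extractor: 1D convolutions over residues
        self.seq_conv = nn.Sequential(
            nn.Conv1d(n_residue_features, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # (ii) 3D structure extractor: shared MLP over residue coordinates
        self.struct_mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # (iii) structure-aware sequence temporal network: recurrent pass over
        # the concatenated per-residue sequence and structure features
        self.temporal = nn.GRU(input_size=2 * hidden, hidden_size=hidden,
                               batch_first=True)
        # (iv) fusion and classification network on pooled feature vectors
        self.classifier = nn.Sequential(
            nn.Linear(3 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, seq, coords):
        # seq: (batch, length, n_residue_features); coords: (batch, length, 3)
        s = self.seq_conv(seq.transpose(1, 2)).transpose(1, 2)  # (B, L, H)
        g = self.struct_mlp(coords)                             # (B, L, H)
        t, _ = self.temporal(torch.cat([s, g], dim=-1))         # (B, L, H)
        # global max pooling turns each branch into a fixed-length vector,
        # giving the unified feature-vector representation mentioned above
        fused = torch.cat([s.max(dim=1).values,
                           g.max(dim=1).values,
                           t.max(dim=1).values], dim=-1)        # (B, 3H)
        return self.classifier(fused)

model = BioS2NetSketch()
logits = model(torch.randn(2, 50, 21), torch.randn(2, 50, 3))
print(tuple(logits.shape))  # (2, 27)
```

The max-pooling step is what makes the output length-independent, so proteins of different sizes map to comparable feature vectors before classification.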