Article

Heart sound classification based on bispectrum features and Vision Transformer model

Journal

ALEXANDRIA ENGINEERING JOURNAL
Volume 85, Pages 49-59

Publisher

ELSEVIER
DOI: 10.1016/j.aej.2023.11.035

Keywords

Artificial Intelligence; Maternal; Bispectrum; Vision Transformer; Cardiac auscultation

Abstract

Cardiovascular diseases (CVDs) remain a major burden in regions with limited resources and moderate incomes, where precise heart sound classification is pivotal for early diagnosis and intervention. The efficacy of manual auscultation depends on physician expertise, whereas deep learning algorithms can make classification more accurate and more widely available. This article presents a model that combines bispectrum-inspired feature extraction with the Vision Transformer (ViT) to classify heart sounds as either 'normal' or 'abnormal'. The model is trained and evaluated on the PhysioNet Challenge 2022 database, which contains 3163 recordings from 942 patients, and classifies with notable consistency, including when distinguishing the heart sounds of pregnant and non-pregnant patients. Benchmarked against experienced cardiologists, the model achieves higher classification performance. For settings where health resources are unevenly distributed, these results point toward cardiac care in which automated analysis matches or surpasses human expertise.
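To make the pipeline concrete, the following sketch shows one plausible realization rather than the authors' exact implementation: a direct FFT-averaging bispectrum estimate is computed from a phonocardiogram signal, rendered as a 224x224 image, and passed to a stock Vision Transformer with a two-class head. The segment length, number of frequency bins, the resizing step, and the torchvision vit_b_16 backbone are illustrative assumptions.

import numpy as np
import torch
from torchvision.models import vit_b_16

def bispectrum(signal, seg_len=256, n_bins=64):
    # Direct (FFT-averaging) bispectrum estimate over an n_bins x n_bins frequency grid.
    n_segs = len(signal) // seg_len
    acc = np.zeros((n_bins, n_bins), dtype=complex)
    window = np.hanning(seg_len)
    for m in range(n_segs):
        X = np.fft.fft(signal[m * seg_len:(m + 1) * seg_len] * window)
        for f1 in range(n_bins):
            for f2 in range(n_bins):
                acc[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(acc) / max(n_segs, 1)

def to_vit_input(bispec):
    # Scale the bispectrum magnitude to [0, 1], resize to 224x224, and
    # replicate it across three channels so a stock ViT accepts it.
    img = torch.tensor(bispec, dtype=torch.float32)
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)
    img = img[None, None]  # shape (1, 1, H, W)
    img = torch.nn.functional.interpolate(img, size=(224, 224), mode="bilinear", align_corners=False)
    return img.repeat(1, 3, 1, 1)

model = vit_b_16(weights=None, num_classes=2)   # two-class head: normal vs. abnormal
pcg = np.random.randn(4000)                     # stand-in for a heart sound recording
logits = model(to_vit_input(bispectrum(pcg)))   # shape (1, 2)

A real training setup would fit such a model on bispectrum images derived from the PhysioNet Challenge 2022 recordings; preprocessing, augmentation, and training details are beyond this sketch.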

