4.3 Article

Cognitive decline assessment using semantic linguistic content and transformer deep learning architecture

Publisher

WILEY
DOI: 10.1111/1460-6984.12973

Keywords

cosine similarity; dementia; semantic analysis; sentence; transcript; text analysis; word count

Background: Dementia is a cognitive decline that leads to the progressive deterioration of an individual's ability to perform daily activities independently. As a result, a considerable amount of time and resources is spent on caretaking. Early detection of dementia can significantly reduce the effort and resources needed for care.

Aims: This research proposes an approach for assessing cognitive decline by analysing speech data, focusing on speech relevance as a crucial indicator of memory recall.

Methods & Procedures: This is a cross-sectional, online, self-administered study. The proposed method uses a deep learning architecture based on transformers, with BERT (Bidirectional Encoder Representations from Transformers) and Sentence-Transformer used to derive encoded representations of speech transcripts. These representations provide contextually descriptive information that is used to analyse the relevance of sentences in their respective contexts. The encoded representations are then compared using cosine similarity to measure the relevance of uttered sequences of sentences. The study uses the Pitt Corpus dementia dataset, which consists of speech data from individuals with and without dementia. The accuracy of the proposed multi-QA-MPNet (Multi-Query Maximum Inner Product Search Pretraining) model is compared with that of other pretrained Sentence-Transformer models.

Outcomes & Results: The results show that the proposed approach outperforms the other models in capturing context-level information, particularly semantic memory. The study also explores the suitability of different similarity measures for evaluating the relevance of uttered sequences of sentences; the experiments show that cosine similarity is the most appropriate measure for this task.

Conclusions & Implications: This finding has significant implications for detecting early warning signs of dementia, as it suggests that cosine similarity metrics can effectively capture the semantic relevance of spoken language. Persistent cognitive decline over time is one of the indicators of dementia, and early dementia could also be recognised by analysing other modalities such as speech and brain images.
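The pipeline described under Methods & Procedures can be sketched in a few lines with the open-source sentence-transformers library. The snippet below is a minimal illustration, not the authors' code: it assumes the publicly available checkpoint multi-qa-mpnet-base-cos-v1 as a stand-in for the multi-QA-MPNet model named above, and a small hypothetical transcript in place of the Pitt Corpus data. Each utterance is encoded into an embedding, and the relevance of consecutive utterances is scored with cosine similarity, cos(u, v) = (u · v) / (‖u‖ ‖v‖).

```python
# Minimal sketch of the encode-then-compare idea described in the abstract.
# Assumptions: the public sentence-transformers checkpoint
# "multi-qa-mpnet-base-cos-v1" stands in for the paper's multi-QA-MPNet model,
# and the transcript below is a toy example, not Pitt Corpus data.
from sentence_transformers import SentenceTransformer, util

# Hypothetical sequence of utterances from one picture-description transcript.
transcript = [
    "The boy is standing on a stool reaching for the cookie jar.",
    "The stool is tipping over behind him.",
    "The girl is holding her hand up for a cookie.",
    "I went to the store yesterday to buy some bread.",  # off-topic utterance
]

model = SentenceTransformer("multi-qa-mpnet-base-cos-v1")

# Encode every sentence into a contextual embedding.
embeddings = model.encode(transcript, convert_to_tensor=True)

# Relevance of each utterance to the one that precedes it, measured as the
# cosine similarity between their embeddings (the metric the study favours).
for i in range(1, len(transcript)):
    cos = util.cos_sim(embeddings[i - 1], embeddings[i]).item()
    print(f"sentence {i - 1} -> {i}: cosine similarity = {cos:.3f}")
```

In a sketch like this, a sharp drop in consecutive-sentence similarity (as for the off-topic utterance above) flags a loss of contextual relevance; a per-transcript summary of these scores, for example their mean or minimum, could then serve as a feature for distinguishing dementia from control transcripts, which mirrors the kind of downstream use the abstract describes.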
