Article

Convolution Neural Network Having Multiple Channels with Own Attention Layer for Depression Detection from Social Data

Journal

NEW GENERATION COMPUTING
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s00354-023-00237-y

Keywords

Neural network; Convolution neural network; Attention mechanism; Word embeddings; Machine learning; Depression; Social data; Multiple channels


In this study, a model using a multi-channel convolutional neural network is proposed to accurately evaluate the mental state of users on social platforms. The model is able to capture both the local features and larger context of user posts, achieving competitive accuracy and recall results in depression classification.
People share textual posts about their interests, routines, and moods on social platforms, and these posts can be analyzed to evaluate their mental state using diverse techniques such as lexical approaches, machine learning (ML), and deep learning (DL). Larger n-grams (bigrams, trigrams, or quadgrams) carry more contextual information than unigrams, yet most depression-classification models use only unigrams. Moreover, recurrent neural networks (RNNs), the most widely used depression classifiers, retain only the sequential information of the text and ignore the local features of postings. We propose a multi-channel convolutional neural network (MCNN) to capture both local features and larger context from user posts. In addition, each channel has a dedicated dot-product attention layer that derives global features from the local features at its context level. The proposed model is evaluated on the CLEF-eRisk 2018 depression dataset, which contains posts from 214 depressed and 1493 non-depressed users. Experimental results show that our model achieves competitive accuracy, recall, and F-score of 91.00%, 76.50%, and 70.51%, respectively; accuracy is up to 5.00% higher and recall approximately 24% higher than a multi-channel CNN without an attention layer. The significant n-grams highlighted by the attention mechanism can provide a user-level explanation for the depression classification results. However, using the attention weights directly may be of limited value, as the attention highlights are dense and entangled.
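The architecture the abstract describes can be sketched as follows: one 1-D convolution channel per n-gram width (bi-, tri-, quadgram) over word embeddings, each channel followed by its own dot-product attention layer that pools local features into a global vector. This is a minimal NumPy sketch only; the shapes, kernel widths, ReLU activation, and the mean-vector attention query are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def conv1d(x, kernel):
    """Valid 1-D convolution over a token sequence.
    x: (seq_len, emb_dim); kernel: (k, emb_dim, filters) -> (seq_len-k+1, filters)."""
    k = kernel.shape[0]
    out = np.stack([np.einsum('ke,kef->f', x[i:i + k], kernel)
                    for i in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)  # ReLU over local n-gram features

def dot_product_attention(features):
    """Dot-product attention pooling over local features.
    Scores each position against the mean feature (an assumed choice of query),
    then returns the weighted sum and the attention weights."""
    scores = features @ features.mean(axis=0)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ features, weights

def multi_channel_forward(x, kernels):
    """One channel per n-gram width; each channel has its own attention layer.
    The concatenated vector would feed a dense classifier in the full model."""
    return np.concatenate(
        [dot_product_attention(conv1d(x, k))[0] for k in kernels])

rng = np.random.default_rng(0)
x = rng.normal(size=(20, 8))  # 20 tokens, 8-dim embeddings (illustrative sizes)
kernels = [rng.normal(size=(n, 8, 4)) for n in (2, 3, 4)]  # bi/tri/quad channels
rep = multi_channel_forward(x, kernels)
print(rep.shape)  # one pooled 4-dim vector per channel, concatenated
```

The per-channel attention weights returned by `dot_product_attention` are what the paper inspects to highlight significant n-grams, which is also why a jointly trained query vector (rather than the mean used here) would normally be learned.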

