Article

Multimodal Patient Satisfaction Recognition for Smart Healthcare

Journal

IEEE ACCESS
Volume 7, Pages 174219-174226

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/ACCESS.2019.2956083

Keywords

Healthcare; local texture pattern; patient monitoring

Funding

  1. Deanship of Scientific Research at King Saud University, Riyadh, Saudi Arabia, through the Vice Deanship of Scientific Research Chairs: Chair of Smart Technologies

Abstract

The inclusion of multimodal inputs improves the accuracy and dependability of smart healthcare systems. This paper proposes a user satisfaction monitoring system that uses multimodal inputs composed of users' facial images and speech. The smart healthcare system sends these multimodal inputs to the cloud, where they are processed and classified as fully satisfied, partly satisfied, or unsatisfied; the results are then sent to various stakeholders in the smart healthcare environment. During cloud processing, multiple image and speech features are extracted: directional derivatives are used for speech, and a Weber local descriptor is used for images. The features are then combined into a multimodal feature vector, which is supplied to a support vector machine (SVM) classifier. The proposed system achieves 93% accuracy for satisfaction detection.
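The early-fusion pipeline the abstract describes — texture features from the face image (Weber local descriptor), directional-derivative features from the speech signal, concatenation into one multimodal vector, then SVM classification — can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the Weber descriptor is reduced to its differential-excitation component, the speech derivative is a first-order difference, and all function names and parameters are hypothetical.

```python
import numpy as np

def wld_differential_excitation(image, alpha=3.0, eps=1e-8):
    """Simplified Weber Local Descriptor: for each interior pixel, the
    arctangent of the scaled sum of differences to its 8 neighbours,
    relative to the pixel's own intensity (Weber's law)."""
    img = image.astype(float)
    center = img[1:-1, 1:-1]
    neighbour_sum = sum(
        np.roll(np.roll(img, dy, axis=0), dx, axis=1)[1:-1, 1:-1]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    return np.arctan(alpha * (neighbour_sum - 8 * center) / (center + eps))

def directional_derivative(signal):
    """First-order directional derivative of a 1-D speech frame."""
    return np.diff(signal.astype(float))

def multimodal_feature_vector(image, speech, bins=16):
    """Histogram each modality's features and concatenate (early fusion).
    In the paper, this combined vector is fed to an SVM classifier that
    outputs fully satisfied / partly satisfied / unsatisfied."""
    h_img, _ = np.histogram(wld_differential_excitation(image),
                            bins=bins, range=(-np.pi / 2, np.pi / 2),
                            density=True)
    h_speech, _ = np.histogram(directional_derivative(speech),
                               bins=bins, density=True)
    return np.concatenate([h_img, h_speech])

# Toy inputs standing in for a facial image and a speech frame.
rng = np.random.default_rng(0)
face = rng.integers(0, 256, size=(32, 32))
speech = rng.standard_normal(400)
vec = multimodal_feature_vector(face, speech)
print(vec.shape)  # (32,)
```

The concatenated vector would then be passed to an SVM (e.g. scikit-learn's `sklearn.svm.SVC`) trained on labelled satisfaction examples; the three-class decision reported in the abstract maps naturally onto a multi-class SVM.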


