4.7 Article

Crossmodal benefits to vocal emotion perception in cochlear implant users

Related references

Note: Only some of the related references are listed.
Article Audiology & Speech-Language Pathology

Parameter-Specific Morphing Reveals Contributions of Timbre to the Perception of Vocal Emotions in Cochlear Implant Users

Celina I. von Eiff et al.

Summary: This study compared vocal emotion perception between cochlear implant (CI) users and normal-hearing (NH) individuals. As a group, CI users performed worse than NH individuals, with large individual variability among CI users. Compared with NH individuals, CI users made more efficient use of timbre information but less efficient use of fundamental frequency (F0) information for this task. Furthermore, better vocal emotion perception was correlated with higher quality-of-life ratings.

EAR AND HEARING (2022)

Article Behavioral Sciences

Neurocognitive effects of a training program for poor face recognizers using shape and texture caricatures: A pilot investigation

Katharina Limbach et al.

Summary: Recent research has found that individuals with poor face recognition skills tend to rely disproportionately on shape information, whereas texture information is more important for recognizing familiar faces. This study tested a training program using faces selectively caricatured in either their shape or their texture parameters. Shape training improved face matching, whereas texture training significantly enhanced face learning. Furthermore, the texture-training group showed enhanced brain responses to novel faces after training, suggesting changes in early markers of face processing. These findings suggest that parameter-specific caricature training may be a promising way to improve performance in individuals with poor face recognition skills.

NEUROPSYCHOLOGIA (2022)

Article Acoustics

An Overview of Voice Conversion and Its Challenges: From Statistical Modeling to Deep Learning

Berrak Sisman et al.

Summary: Voice conversion is a technology that changes speaker identity while keeping the linguistic content unchanged, and it draws on a variety of speech processing techniques. Recent advances make it possible to produce human-like voice quality with high speaker similarity. This article provides an overview of voice conversion techniques and performance evaluation methods, and discusses their promise and limitations.

IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING (2021)

Review Psychology

Multisensory Integration as a Window into Orderly and Disrupted Cognition and Communication

Mark T. Wallace et al.

ANNUAL REVIEW OF PSYCHOLOGY, VOL 71 (2020)

Review Behavioral Sciences

Face and Voice Perception: Understanding Commonalities and Differences

Andrew W. Young et al.

TRENDS IN COGNITIVE SCIENCES (2020)

Article Audiology & Speech-Language Pathology

Parameter-Specific Morphing Reveals Contributions of Timbre and Fundamental Frequency Cues to the Perception of Voice Gender and Age in Cochlear Implant Users

Verena G. Skuk et al.

JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH (2020)

Letter Audiology & Speech-Language Pathology

The Role of Stimulus Type and Social Signal for Voice Perception in Cochlear Implant Users: Response to the Letter by Meister et al

Stefan R. Schweinberger et al.

JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH (2020)

Article Psychology, Experimental

The perception of caricatured emotion in voice

Caroline M. Whiting et al.

COGNITION (2020)

Article Acoustics

Statistical Parametric Speech Synthesis Incorporating Generative Adversarial Networks

Yuki Saito et al.

IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING (2018)

Article Multidisciplinary Sciences

Improving face identity perception in age-related macular degeneration via caricaturing

Jo Lane et al.

SCIENTIFIC REPORTS (2018)

Article Acoustics

Vocal emotion recognition performance predicts the quality of life in adult cochlear implant users

Xin Luo et al.

JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA (2018)

Article Audiology & Speech-Language Pathology

Voice emotion perception and production in cochlear implant users

N. T. Jiam et al.

HEARING RESEARCH (2017)

Article Audiology & Speech-Language Pathology

Emotional recognition of dynamic facial expressions before and after cochlear implantation in adults with progressive deafness

Emmanuele Ambert-Dahan et al.

HEARING RESEARCH (2017)

Editorial Material Multidisciplinary Sciences

Cooperation between hearing and vision in people with cochlear implants

Mark T. Wallace

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2017)

Article Multidisciplinary Sciences

Adaptive benefit of cross-modal plasticity following cochlear implantation in deaf adults

Carly A. Anderson et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2017)

Article Multidisciplinary Sciences

Multisensory emotion perception in congenitally, early, and late deaf CI users

Ineke Fengler et al.

PLOS ONE (2017)

Article Audiology & Speech-Language Pathology

Multisensory Integration in Cochlear Implant Recipients

Ryan A. Stevenson et al.

EAR AND HEARING (2017)

Article Audiology & Speech-Language Pathology

Voice emotion recognition by cochlear-implanted children and their normally-hearing peers

Monita Chatterjee et al.

HEARING RESEARCH (2015)

Article Behavioral Sciences

Multisensory perception of the six basic emotions is modulated by attentional instruction and unattended modality

Sachiko Takagi et al.

FRONTIERS IN INTEGRATIVE NEUROSCIENCE (2015)

Article Neurosciences

Multisensory emotions: perception, combination and underlying neural processes

Martin Klasen et al.

REVIEWS IN THE NEUROSCIENCES (2012)

Article Audiology & Speech-Language Pathology

Auditory, Visual, and Auditory-Visual Perceptions of Emotions by Young Children With Hearing Loss Versus Children With Normal Hearing

Tova Most et al.

JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH (2012)

Article Psychology, Experimental

Preattentive processing of audio-visual emotional signals

Julia Foecker et al.

ACTA PSYCHOLOGICA (2011)

Article Psychology, Biological

The role of audiovisual asynchrony in person recognition

David M. C. Robertson et al.

QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY (2010)

Article Neurosciences

Superior temporal activation in response to dynamic audio-visual emotional cues

Diana L. Robins et al.

BRAIN AND COGNITION (2009)

Article Behavioral Sciences

Visual stimuli can impair auditory processing in cochlear implant users

Francois Champoux et al.

NEUROPSYCHOLOGIA (2009)

Article Multidisciplinary Sciences

MEG demonstrates a supra-additive response to facial and vocal emotion in the right superior temporal sulcus

Cindy C. Hagan et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2009)

Article Audiology & Speech-Language Pathology

Quality of Life for Children With Cochlear Implants: Perceived Benefits and Problems and the Perception of Single Words and Emotional Sounds

Efrat A. Schorr et al.

JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH (2009)

Article Neurosciences

McGurk effects in cochlear-implanted deaf subjects

Julien Rouger et al.

BRAIN RESEARCH (2008)

Review Neurosciences

Multisensory integration: current issues from the perspective of the single neuron

Barry E. Stein et al.

NATURE REVIEWS NEUROSCIENCE (2008)

Article Multidisciplinary Sciences

Evidence that cochlear-implanted deaf patients are better multisensory integrators

J. Rouger et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2007)

Article Behavioral Sciences

Temporal window of integration in auditory-visual speech perception

Virginie van Wassenhove et al.

NEUROPSYCHOLOGIA (2007)

Article Multidisciplinary Sciences

Auditory-visual fusion in speech perception in children with cochlear implants

EA Schorr et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2005)

Review Neurosciences

Human brain regions involved in recognizing environmental sounds

JW Lewis et al.

CEREBRAL CORTEX (2004)

Article Behavioral Sciences

Is cross-modal integration of emotional expressions independent of attentional resources?

Jean Vroomen et al.

COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE (2001)

Article Biochemistry & Molecular Biology

Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex

GA Calvert et al.

CURRENT BIOLOGY (2000)

Article Psychology, Experimental

The perception of emotions by ear and by eye

B de Gelder et al.

COGNITION & EMOTION (2000)