Audiovisual Perception of Emotion and Speech in Hearing Individuals and Cochlear Implant Users


Basic Information

Project Abstract

The ability to communicate via auditory spoken language is taken as a benchmark for the success of cochlear implants (CIs), but this disregards the important role that visual cues play in communication. The relevance of socio-emotional signals and their importance for quality of life with a CI (Luo, Kern, & Pulling, 2018; Schorr, Roth, & Fox, 2009) call for research on the visual benefits to communication. Drawing on models of communication via the face and voice (Young, Frühholz, & Schweinberger, 2020), we consider that deafness can elicit crossmodal cortical plasticity, such that visual stimuli can activate auditory cortex areas. Initial findings suggest that, even after adaptation to a CI, visual information makes a particularly strong contribution to the perception of speech and speaker gender. A better understanding of these phenomena at the functional and neural level is required to develop effective interventions that improve communication and, ultimately, quality of life. Here we focus on postlingually deaf adult CI users and propose four studies (S1-S4). In S1, we conduct a systematic review to determine the current state of knowledge regarding the role of visual information (faces or manual gestures) in emotion recognition and speech perception from voices, in hearing adults and CI users. In S2, we use a behavioral experiment with dynamic, time-synchronized audiovisual stimuli to test whether CI users benefit more than hearing adults from congruent facial expressions when recognizing vocal emotions, and whether this holds even when overall auditory-only performance levels are controlled for. Importantly, we use voice morphing technology, rather than noise, to equate performance levels. In S3, we study brain correlates of audiovisual integration (AVI) in event-related potentials (ERPs) to audiovisual (AV) emotional stimuli. We focus on the ability of congruent AV stimuli to speed up neural processing and investigate relationships between individual neural markers of AVI and behavioral performance in emotion recognition. In S4, we study the degree to which perceptual training with caricatured vocal emotions can improve auditory and audiovisual emotion recognition in adult CI users. In all studies, we assess relationships between emotion recognition abilities and reported quality of life. The project builds on successful previous DFG-funded research on voice perception and on audiovisual integration in speaker and speech identification, and on our long-standing collaboration with the Cochlear Implant Rehabilitation Centre in Thuringia. We hope this work will contribute to models of the cognitive and brain mechanisms underlying multimodal perception in human communication. We propose that a better understanding of the mechanisms by which visual facial signals support CI users will provide important information that can be used to optimize both linguistic and socio-emotional communication and, ultimately, quality of life.
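To illustrate how voice morphing, rather than added noise, could in principle be used to equate auditory-only performance levels, the sketch below shows a simple adaptive staircase that raises or lowers the morph level (a neutral-to-emotional blend) until a listener's recognition accuracy converges near a tracked level. This is a minimal, hypothetical illustration only, not the project's actual procedure; the function present_trial, the one-up/one-down rule, and the toy psychometric model are assumptions made for the example.

```python
import random

def present_trial(morph_level: float) -> bool:
    """Simulate whether a listener identifies the vocal emotion correctly.

    Toy assumption: accuracy rises linearly with morph level
    (0.0 = neutral voice, 1.0 = full emotional expression)
    above a 25% guessing floor (four response alternatives).
    """
    p_correct = 0.25 + 0.75 * morph_level
    return random.random() < p_correct

def run_staircase(start_level: float = 1.0, step: float = 0.05, n_trials: int = 60) -> float:
    """One-up/one-down staircase: harder after a correct response, easier after
    an error; converges near the morph level that yields ~50% correct."""
    level = start_level
    visited = []
    for _ in range(n_trials):
        correct = present_trial(level)
        visited.append(level)
        level = level - step if correct else level + step
        level = max(0.0, min(1.0, level))  # keep the morph level in [0, 1]
    # Mean of the late trials serves as the threshold estimate.
    return sum(visited[-20:]) / 20

if __name__ == "__main__":
    random.seed(1)
    print(f"Estimated morph level for threshold performance: {run_staircase():.2f}")
```

Running such a procedure separately for CI users and hearing controls would yield group-specific morph levels at matched auditory-only accuracy, after which any additional audiovisual benefit can be compared across groups.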

Project Outcomes

Journal articles (0)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)


Other Grants by Professor Dr. Christian Dobel

Computerbasiertes Rehabilitationsprogramm für Erwachsene und Kinder nach Cochlea Implantation und dessen neurokognitive Evaluation
(Computer-based rehabilitation program for adults and children after cochlear implantation and its neurocognitive evaluation)
  • Grant number: 102731728
  • Fiscal year: 2009
  • Funding amount: --
  • Project category: Research Grants

Kognitive und linquistische Repräsentation von Ereignissen bei Sprachproduktion und Sprachwahrnehmung
(Cognitive and linguistic representation of events in language production and language perception)
  • Grant number: 5406360
  • Fiscal year: 2003
  • Funding amount: --
  • Project category: Research Grants
Eye tracking studies in language production tasks.
  • Grant number: 5269628
  • Fiscal year: 2000
  • Funding amount: --
  • Project category: Research Fellowships

The tinnitus network: comorbidity, plasticity and response to treatment
  • Grant number: 317886062
  • Fiscal year:
  • Funding amount: --
  • Project category: Research Grants

Similar Overseas Grants

Developmental relations between emotion input and emotion perception
  • Grant number: 2333886
  • Fiscal year: 2024
  • Funding amount: --
  • Project category: Standard Grant

Mapping perception and emotion in the human brain with functional magnetic resonance imaging and diffusion tensor imaging using naturalistic stimuli
  • Grant number: RGPIN-2021-03568
  • Fiscal year: 2022
  • Funding amount: --
  • Project category: Discovery Grants Program - Individual

Automating Emotion Perception towards Social Machines
  • Grant number: 532647-2019
  • Fiscal year: 2022
  • Funding amount: --
  • Project category: Postgraduate Scholarships - Doctoral

Investigating the Neural Basis of Flexible Emotion Perception
  • Grant number: 10569934
  • Fiscal year: 2022
  • Funding amount: --
  • Project category:

Mapping perception and emotion in the human brain with functional magnetic resonance imaging and diffusion tensor imaging using naturalistic stimuli
  • Grant number: DGECR-2021-00297
  • Fiscal year: 2021
  • Funding amount: --
  • Project category: Discovery Launch Supplement

Impact of Androgens on Women's Emotion Perception
  • Grant number: 565794-2021
  • Fiscal year: 2021
  • Funding amount: --
  • Project category: Alexander Graham Bell Canada Graduate Scholarships - Master's

Mapping perception and emotion in the human brain with functional magnetic resonance imaging and diffusion tensor imaging using naturalistic stimuli
  • Grant number: RGPIN-2021-03568
  • Fiscal year: 2021
  • Funding amount: --
  • Project category: Discovery Grants Program - Individual

Automating Emotion Perception towards Social Machines
  • Grant number: 532647-2019
  • Fiscal year: 2021
  • Funding amount: --
  • Project category: Postgraduate Scholarships - Doctoral

Automating Emotion Perception towards Social Machines
  • Grant number: 532647-2019
  • Fiscal year: 2020
  • Funding amount: --
  • Project category: Postgraduate Scholarships - Doctoral

Understanding individual differences in facial emotion perception and their association with psychiatric risk indicators
  • Grant number: MR/S011307/1
  • Fiscal year: 2019
  • Funding amount: --
  • Project category: Research Grant