Dynamic brain representations underlying emotional experience


Basic Information

  • Grant Number:
    10116182
  • Principal Investigator:
  • Amount:
    $675,900
  • Host Institution:
  • Host Institution Country:
    United States
  • Project Category:
  • Fiscal Year:
    2018
  • Funding Country:
    United States
  • Project Period:
    2018-03-01 to 2022-12-31
  • Project Status:
    Completed

Project Summary

Emotions play a critical role in organizing human experience and behavior, and emotion dysregulation lies at the heart of psychopathology and functional impairment across disorders. To measure and understand emotion dysregulation, advances in understanding the fundamentals of how the brain generates and represents emotional states are vitally needed. This proposal develops and validates models of the brain representations that give rise to emotional states in naturalistic, narrative contexts. This will provide normative models of emotion to ground future translational studies, measurement models for specific emotional brain representations, and targets for interventions. We combine Functional Magnetic Resonance Imaging (fMRI), multi-dimensional measures of behavior, and pattern recognition techniques to develop models of brain activity that characterize and differentiate discrete categories of emotion experience (joy, anger, sadness, pride, and others) and blends of emotion. We place particular emphasis on the predictive validity (sensitivity and specificity) and generalizability of these models across sensory modalities, evaluative judgments, contextual narratives, and populations. We elicit emotional experiences in an ecologically valid paradigm using narratives (stories) experienced via listening, reading, or watching video. We measure multiple types of emotional experience in parallel with fMRI, using innovative collaborative filtering approaches to infer continuous moment-by-moment experience. The resulting brain models of specific emotion categories afford several potentially transformative advantages. Such models can (a) provide insight into which systems are necessary and sufficient for emotion generation (Aim 1); (b) be shared and tested across studies, allowing us to evaluate their generalizability across contexts (Aim 2); and (c) provide targets for psychological and neurological interventions (Aim 3). 
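The pattern-recognition approach described above can be illustrated with a minimal sketch: train a decoder on activity patterns labeled by emotion category, then score each category's sensitivity and specificity on held-out trials. Everything below is an illustrative assumption, not the project's actual pipeline: the data are simulated, the voxel counts are arbitrary, and a simple nearest-centroid decoder stands in for the models the proposal develops.

```python
# Hypothetical sketch: decoding discrete emotion categories from simulated
# fMRI-like activity patterns, then scoring per-category sensitivity and
# specificity. Data, shapes, and the nearest-centroid decoder are all
# illustrative stand-ins for the project's pattern-recognition models.
import numpy as np

rng = np.random.default_rng(0)
emotions = ["joy", "anger", "sadness", "pride"]
n_trials, n_voxels = 200, 50

# Simulate voxel patterns carrying an emotion-specific signal plus noise.
y = rng.integers(0, len(emotions), size=n_trials)
signatures = rng.normal(size=(len(emotions), n_voxels))
X = signatures[y] + rng.normal(scale=2.0, size=(n_trials, n_voxels))

# Split-half validation: fit class centroids on one half of the trials,
# decode the held-out half by nearest centroid.
train, test = np.arange(0, 100), np.arange(100, 200)
centroids = np.stack([X[train][y[train] == k].mean(axis=0)
                      for k in range(len(emotions))])
dists = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)

# Per-category sensitivity (recall) and specificity on held-out trials.
for k, name in enumerate(emotions):
    tp = np.sum((pred == k) & (y[test] == k))
    fn = np.sum((pred != k) & (y[test] == k))
    fp = np.sum((pred == k) & (y[test] != k))
    tn = np.sum((pred != k) & (y[test] != k))
    print(f"{name}: sensitivity={tp/(tp+fn):.2f}, specificity={tn/(tn+fp):.2f}")
```

The same held-out-trial logic extends to the generalization questions the proposal emphasizes: the decoder is fixed after training and only ever evaluated on data it has not seen.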
Six experiments focus on developing and validating emotional brain representations that are generalizable across individuals, research sites (Dartmouth and Colorado), and populations (college students and more diverse community samples). Expt. 1 develops models that predict the intensity of discrete emotional states. Expts. 2-4 establish the context sensitivity and generalizability of these models. Expt. 2 examines the role of evaluative judgments in shaping emotional experience. Expt. 3 assesses the impact of background contextual narratives. Expt. 4 evaluates the role of sensory processing in emotion representations. Expts. 5-6 establish whether the brain models mediate emotional experiences. Expt. 5 uses cognitive appraisal and Expt. 6 uses real-time fMRI neurofeedback to manipulate emotion category-specific brain representations, testing for causal effects of these psychological and brain manipulations on emotional experience. Together, these studies will yield generalizable models of the dynamic brain patterns underlying specific emotional experiences. Such models could transform clinical research by allowing investigators to test emotion-focused interventions and assess emotion-related risk factors, permitting early detection and intervention.
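The cross-site generalization test described above can be sketched as: fit a decoder on data from one site, then evaluate it, completely unchanged, on an independent site and compare accuracy to chance. Everything in this sketch is a hedged assumption: the sites, trial counts, and nearest-centroid decoder are illustrative stand-ins, not the project's actual design.

```python
# Hypothetical sketch of a cross-site generalization test: fit a simple
# decoder on simulated data from one site and evaluate it, unchanged, on
# an independently simulated second site. All names and parameters here
# are illustrative assumptions.
import numpy as np

n_classes, n_voxels = 4, 50
signatures = np.random.default_rng(1).normal(size=(n_classes, n_voxels))

def simulate_site(n_trials, seed, noise=1.5):
    """Toy stand-in for one site's trial-by-voxel data."""
    r = np.random.default_rng(seed)
    y = r.integers(0, n_classes, size=n_trials)
    X = signatures[y] + r.normal(scale=noise, size=(n_trials, n_voxels))
    return X, y

X_a, y_a = simulate_site(300, seed=2)   # training site
X_b, y_b = simulate_site(150, seed=3)   # held-out site

# Fit class centroids at site A only; decode site B by nearest centroid.
centroids = np.stack([X_a[y_a == k].mean(axis=0) for k in range(n_classes)])
pred_b = np.linalg.norm(X_b[:, None] - centroids[None], axis=2).argmin(axis=1)
acc = float(np.mean(pred_b == y_b))   # compare against the 1/4 chance level
print(f"cross-site accuracy: {acc:.2f}")
```

Freezing the model before it touches the second site's data is the design choice that makes the accuracy a genuine generalization estimate rather than a refit.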

Project Outcomes

Journal articles (0)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)

Other grants by Luke Joseph Chang

Characterizing the neural mechanisms of social connection
  • Grant Number:
    10611142
  • Fiscal Year:
    2022
  • Funding Amount:
    $675,900
  • Project Category:
Characterizing the neural mechanisms of social connection
  • Grant Number:
    10374435
  • Fiscal Year:
    2022
  • Funding Amount:
    $675,900
  • Project Category:
Dynamic brain representations underlying emotional experience
  • Grant Number:
    10380111
  • Fiscal Year:
    2018
  • Funding Amount:
    $675,900
  • Project Category:
Mechanisms Underlying Social Cooperative Behavior
  • Grant Number:
    7927111
  • Fiscal Year:
    2009
  • Funding Amount:
    $675,900
  • Project Category:
Prefrontal-Amygdala Interactions in Social Learning
  • Grant Number:
    9499980
  • Fiscal Year:
    2007
  • Funding Amount:
    $675,900
  • Project Category: