CI-SUSTAIN: Collaborative Research: Extending a Large Multimodal Corpus of Spontaneous Behavior for Automated Emotion Analysis


Basic Information

  • Award Number:
    1629898
  • Principal Investigator:
  • Amount:
    $483,600
  • Host Institution:
  • Host Institution Country:
    United States
  • Project Type:
    Standard Grant
  • Fiscal Year:
    2016
  • Funding Country:
    United States
  • Duration:
    2016-09-01 to 2022-08-31
  • Status:
    Completed

Project Summary

This project will extend and sustain a widely used data infrastructure for studying human emotion, hosted at the lead investigator's university and available to the research community. The first two versions of the dataset (BP4D and BP4D+) contain videos of people reacting to varied emotion-eliciting situations, their self-reported emotion, and expert annotations of their facial expression. Version 1, BP4D (n=41), has been used by over 100 research groups and supported a successful community competition around recognizing emotion. The second version (BP4D+) adds participants (n = 140), thermal imaging, and measures of peripheral physiology. The current project greatly broadens and extends this corpus to produce a new dataset (BP4D++) that enables deep-learning approaches, increases generalizability, and builds research infrastructure and community in computer and behavioral science. The collaborators will (1) increase participant diversity; (2) add videos of pairs of people interacting to the current mix of individual and interviewer-mediated video; (3) increase the number of participants to meet the demands of recent advances in "big data" approaches to machine learning; and (4) expand the size and scope of annotations in the videos. They will also involve the community through an oversight and coordinating consortium that includes researchers in computer vision, biometrics, robotics, and cognitive and behavioral science. The consortium will be composed of special interest groups that focus on various aspects of the corpus, including groups responsible for completing the needed annotations, generating meta-data, and expanding the database application scope. Having an infrastructure to support emotion recognition research matters because computer systems that interact with people (such as phone assistants or characters in virtual reality environments) will be more useful if they react appropriately to what people are doing, thinking, and feeling.
The team will triple the number of participants in the combined corpora to 540. They will develop a dyadic interaction task and capture data from 100 interacting dyads to support dynamic modeling of interpersonal influence across expressive behavior and physiology, as well as analysis of emotional synchrony. They will increase the density of facial annotations to about 15 million frames in total, allowing the database to become sufficiently large to support deep-learning approaches to multimodal emotion detection. These annotations will be accomplished through a hybrid approach that combines expert coding using the Facial Action Coding System, automated face analysis, and crowdsourcing with expert input from the research community. Finally, the recorded data will be augmented with a wide range of meta-data derived from 2D videos, 3D videos, thermal videos, and physiological signals. To ensure the community is involved in sustaining the infrastructure, in addition to the governance consortium described above, the investigators will involve the community in jointly building both APIs that allow adding meta-data and annotations and tools to support the submission and evaluation of new recognition algorithms, then organizing community-wide competitions using those tools. The research team will also reach out to new research communities around health computing, biometrics, and affective computing to widen the utility of the enhanced infrastructure, grow the community of expert annotators through training workshops, and build an educational community around the infrastructure that facilitates the development and sharing of course materials that use it. Long-term, the infrastructure will be funded through a combination of commercial licensing and support from the lead university's system administration group.
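As one illustration of the kind of analysis the planned dyadic recordings are meant to support, emotional synchrony between two partners is often estimated with a sliding-window correlation over paired physiological or expressive signals. The sketch below is a minimal, hypothetical example; the toy signals, window sizes, and function name are assumptions for illustration, not the project's actual pipeline:

```python
import numpy as np

def windowed_synchrony(sig_a, sig_b, win=50, step=25):
    """Sliding-window Pearson correlation between two equal-length signals."""
    scores = []
    for start in range(0, len(sig_a) - win + 1, step):
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        # np.corrcoef returns a 2x2 matrix; the off-diagonal entry is r(a, b)
        scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)

# Toy stand-in for two partners' physiological traces: a shared slow
# oscillation with a small phase lag, plus independent noise per partner.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
partner_a = np.sin(2 * np.pi * 0.3 * t) + 0.2 * rng.standard_normal(t.size)
partner_b = np.sin(2 * np.pi * 0.3 * t + 0.5) + 0.2 * rng.standard_normal(t.size)

sync = windowed_synchrony(partner_a, partner_b)
print(f"{sync.size} windows, mean synchrony {sync.mean():.2f}")
```

A high mean windowed correlation suggests the two partners' signals co-vary over time; dynamic models of interpersonal influence typically build on this kind of per-window statistic.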

Project Outcomes

Journal articles (7)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)
Adaptive Multimodal Fusion for Facial Action Units Recognition
Exploiting Semantic Embedding and Visual Feature for Facial Action Unit Detection
SAT-Net: Self-Attention and Temporal Fusion for Facial Action Unit Detection
How Do the Hearts of Deep Fakes Beat? Deep Fake Source Detection via Interpreting Residuals with Biological Signals
Region of Interest Based Graph Convolution: A Heatmap Regression Approach for Action Unit Detection

Other Publications by Lijun Yin

Diverse mechanical properties and microstructures of sorghum bran arabinoxylans/soy protein isolate mixed gels by duo-induction of peroxidase and calcium ions
  • DOI:
    10.1016/j.foodhyd.2020.105946
  • Publication date:
    2020-10
  • Journal:
  • Impact factor:
    10.7
  • Authors:
    Jinxin Yan;Boya Zhang;Feifei Wu;Wenjia Yan;Peng Lv;Madhav Yadav;Xin Jia;Lijun Yin
  • Corresponding author:
    Lijun Yin
Research on the Relationship between Land Finance and Housing Price in Urbanization Process: An Empirical Analysis of 182 Cities in China based on Threshold Panel Models,
The use of W/O/W controlled-release coagulants to improve the quality of bittern-solidified tofu
  • DOI:
    10.1016/j.foodhyd.2013.08.002
  • Publication date:
    2014-03
  • Journal:
  • Impact factor:
    10.7
  • Authors:
    Yongqiang Cheng;Eizo Tatsumi;Masayoshi Saito;Lijun Yin
  • Corresponding author:
    Lijun Yin
Synthesis, characterization and application of sugar beet pectin-ferulic acid conjugates in the study of lipid, DNA and protein oxidation
Effects of Fermentation Temperature on the Content and Composition of Isoflavones and β-Glucosidase Activity in Sufu


Other Grants by Lijun Yin

CI-ADDO-EN: Collaborative Research: 3D Dynamic Multimodal Spontaneous Emotion Corpus for Automated Facial Behavior and Emotion Analysis
  • Award Number:
    1205664
  • Fiscal Year:
    2012
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
EAGER: Spontaneous 4D-Facial Expression Corpus for Automated Facial Image Analysis
  • Award Number:
    1051103
  • Fiscal Year:
    2010
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
SGER: Analyzing Facial Expression in Three Dimensional Space
  • Award Number:
    0541044
  • Fiscal Year:
    2005
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
SGER: Developing a high-definition face modeling system for recognition and generation of face and face expressions
  • Award Number:
    0414029
  • Fiscal Year:
    2004
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant

Similar Overseas Grants

CRI: CI-SUSTAIN: Collaborative Research: Sustaining Lemur Project Resources for the Long-Term
  • Award Number:
    1822986
  • Fiscal Year:
    2018
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
  • Award Number:
    1823288
  • Fiscal Year:
    2018
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
  • Award Number:
    1853919
  • Fiscal Year:
    2018
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
  • Award Number:
    1823292
  • Fiscal Year:
    2018
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
CRI: CI-SUSTAIN: Collaborative Research: Sustaining Lemur Project Resources for the Long-Term
  • Award Number:
    1822975
  • Fiscal Year:
    2018
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
Collaborative Research: CI-SUSTAIN: StarExec: Cross-Community Infrastructure for Logic Solving
  • Award Number:
    1730419
  • Fiscal Year:
    2017
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
Collaborative Research: CI-SUSTAIN: National File System Trace Repository
  • Award Number:
    1730726
  • Fiscal Year:
    2017
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
Collaborative Research: CI-SUSTAIN: National File System Trace Repository
  • Award Number:
    1729939
  • Fiscal Year:
    2017
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
Collaborative Research: CI-SUSTAIN: StarExec: Cross-Community Infrastructure for Logic Solving
  • Award Number:
    1729603
  • Fiscal Year:
    2017
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant
CI-SUSTAIN: Collaborative Research: Extending a Large Multimodal Corpus of Spontaneous Behavior for Automated Emotion Analysis
  • Award Number:
    1629716
  • Fiscal Year:
    2016
  • Funding Amount:
    $483,600
  • Project Type:
    Standard Grant