CI-SUSTAIN: Collaborative Research: Extending a Large Multimodal Corpus of Spontaneous Behavior for Automated Emotion Analysis
Basic Information
- Award Number: 1629716
- Principal Investigator:
- Amount: $301K
- Host Institution:
- Host Institution Country: United States
- Project Category: Standard Grant
- Fiscal Year: 2016
- Funding Country: United States
- Project Period: 2016-09-01 to 2020-07-31
- Status: Completed
- Source:
- Keywords:
Project Abstract
This project will extend and sustain a widely-used data infrastructure for studying human emotion, hosted at the lead investigator's university and available to the research community. The first two versions of the dataset (BP4D and BP4D+) contain videos of people reacting to varied emotion-eliciting situations, their self-reported emotion, and expert annotations of their facial expression. Version 1, BP4D (n=41), has been used by over 100 research groups and supported a successful community competition around recognizing emotion. The second version (BP4D+) adds participants (n=140), thermal imaging, and measures of peripheral physiology. The current project greatly broadens and extends this corpus to produce a new dataset (BP4D++) that enables deep-learning approaches, increases generalizability, and builds research infrastructure and community in computer and behavioral science. The collaborators will (1) increase participant diversity; (2) add videos of pairs of people interacting to the current mix of individual and interviewer-mediated video; (3) increase the number of participants to meet the demands of recent advances in "big data" approaches to machine learning; and (4) expand the size and scope of annotations in the videos. They will also involve the community through an oversight and coordinating consortium that includes researchers in computer vision, biometrics, robotics, and cognitive and behavioral science. The consortium will be composed of special interest groups that focus on various aspects of the corpus, including groups responsible for completing the needed annotations, generating meta-data, and expanding the database application scope. Having an infrastructure to support emotion recognition research matters because computer systems that interact with people (such as phone assistants or characters in virtual reality environments) will be more useful if they react appropriately to what people are doing, thinking, and feeling.
The team will triple the number of participants in the combined corpora to 540. They will develop a dyadic interaction task and capture data from 100 interacting dyads to support dynamic modeling of interpersonal influence across expressive behavior and physiology, as well as analysis of emotional synchrony. They will increase the density of facial annotations to about 15 million frames in total, allowing the database to become sufficiently large to support deep-learning approaches to multimodal emotion detection. These annotations will be accomplished through a hybrid approach that combines expert coding using the Facial Action Coding System, automated face analysis, and crowdsourcing with expert input from the research community. Finally, the recorded data will be augmented with a wide range of meta-data derived from 2D videos, 3D videos, thermal videos, and physiological signals. To ensure the community is involved in sustaining the infrastructure, in addition to the governance consortium described above, the investigators will involve the community in jointly building both APIs that allow adding meta-data and annotations and tools to support the submission and evaluation of new recognition algorithms, then organizing community-wide competitions using those tools. The research team will also reach out to new research communities around health computing, biometrics, and affective computing to widen the utility of the enhanced infrastructure, grow the community of expert annotators through training workshops, and build an educational community around the infrastructure that facilitates the development and sharing of course materials that use it. Long-term, the infrastructure will be funded through a combination of commercial licensing and support from the lead university's system administration group.
Project Outcomes
Journal Articles (0)
Monographs (0)
Research Awards (0)
Conference Papers (0)
Patents (0)
Other Publications by Jeffrey Cohn
Identification of candidate neural biomarkers of obsessive-compulsive symptom intensity and response to deep brain stimulation
- DOI: 10.1016/j.brs.2023.01.180
- Publication Date: 2023-01-01
- Journal:
- Impact Factor:
- Authors: Nicole Provenza;Chandra Swamy;Luciano Branco;Evan Dastin-van Rijn;Saurabh Hinduja;Michaela Alarie;Ayan Waite;Michelle Avendano-Ortega;Sarah McKay;Greg Vogt;Huy Dang;Raissa Mathura;Bradford Roarr;Jeff Herron;Eric Storch;Jeffrey Cohn;David Borton;Nuri Ince;Wayne Goodman;Sameer Sheth
- Corresponding Author: Sameer Sheth

Subspace methods for electronic structure simulations on quantum computers
- DOI: 10.1088/2516-1075/ad3592
- Publication Date: 2023
- Journal:
- Impact Factor: 2.6
- Authors: Mario Motta;William Kirby;I. Liepuoniute;Kevin J. Sung;Jeffrey Cohn;Antonio Mezzacapo;Katherine Klymko;Nam Nguyen;Nobuyuki Yoshioka;Julia E. Rice
- Corresponding Author: Julia E. Rice

Efficacy of the Omega-3 Index in predicting NAFLD in overweight and obese adults: A pilot study
- DOI: 10.1016/j.orcp.2014.10.138
- Publication Date: 2014-12-01
- Journal:
- Impact Factor:
- Authors: Helen Parker;Helen O’Connor;Shelley Keating;Jeffrey Cohn;Manohar Garg;Ian Caterson;Jacob George;Nathan Johnson
- Corresponding Author: Nathan Johnson

Chronic Ecological Assessment of Intracranial Neural Activity Synchronized to Disease-Relevant Behaviors in Obsessive-Compulsive Disorder
- DOI: 10.1016/j.biopsych.2023.02.041
- Publication Date: 2023-05-01
- Journal:
- Impact Factor:
- Authors: Nicole Provenza;Evan Dastin-van Rijn;Chandra Prakash Swamy;Huy Dang;Sameer Rajesh;Nabeel Diab;Laszlo Jeni;Saurabh Hinduja;Michelle Avendano-Ortega;Sarah A. Mckay;Gregory S. Vogt;Bradford Roarr;Andrew Wiese;Ben Shofty;Jeffrey Herron;Kelly Bijanki;Eric Storch;Jeffrey Cohn;Nuri Ince;David Borton
- Corresponding Author: David Borton

Identification of Candidate Neural Biomarkers of Obsessive-Compulsive Symptom Intensity and Response to Deep Brain Stimulation
- DOI: 10.1016/j.biopsych.2023.02.174
- Publication Date: 2023-05-01
- Journal:
- Impact Factor:
- Authors: Nicole Provenza;Evan Dastin-van Rijn;Chandra Prakash Swamy;Luciano Branco;Saurabh Hinduja;Michelle Avendano-Ortega;Sarah A. Mckay;Gregory S. Vogt;Huy Dang;Bradford Roarr;Andrew Wiese;Ben Shofty;Jeffrey Herron;Matthew Harrison;Kelly Bijanki;Eric Storch;Jeffrey Cohn;Nuri Ince;David Borton;Wayne Goodman
- Corresponding Author: Wayne Goodman
Other Grants by Jeffrey Cohn
SCH: INT: Collaborative Research: Dyadic Behavior Informatics for Psychotherapy Process and Outcome
- Award Number: 1721667
- Fiscal Year: 2017
- Amount: $301K
- Category: Standard Grant

WORKSHOP: Doctoral Consortium at the ACM International Conference on Multimodal Interaction 2014
- Award Number: 1443097
- Fiscal Year: 2014
- Amount: $301K
- Category: Standard Grant

SCH: INT: Collaborative Research: Learning and Sensory-based Modeling for Adaptive Web-Empowerment Trauma Treatment
- Award Number: 1418026
- Fiscal Year: 2014
- Amount: $301K
- Category: Standard Grant

CI-ADDO-EN: Collaborative Research: 3D Dynamic Multimodal Spontaneous Emotion Corpus for Automated Facial Behavior and Emotion Analysis
- Award Number: 1205195
- Fiscal Year: 2012
- Amount: $301K
- Category: Standard Grant

Collaborative Research: Communication, Perturbation, and Early Development
- Award Number: 1052603
- Fiscal Year: 2011
- Amount: $301K
- Category: Standard Grant

EAGER: Spontaneous 4D-Facial Expression Corpus for Automated Facial Image Analysis
- Award Number: 1051169
- Fiscal Year: 2010
- Amount: $301K
- Category: Standard Grant

Collaborative Research DHB: Coordinated motion and facial expression in dyadic conversation
- Award Number: 0527397
- Fiscal Year: 2006
- Amount: $301K
- Category: Standard Grant

Collaborative Proposal: Automated Measurements of Infant Facial Expressions and Human Ratings of Their Emotional Intensity
- Award Number: 0418001
- Fiscal Year: 2004
- Amount: $301K
- Category: Standard Grant

Mother-Infant Coordination of Vocalization and Affect
- Award Number: 8919711
- Fiscal Year: 1990
- Amount: $301K
- Category: Continuing Grant
Similar International Grants
CRI: CI-SUSTAIN: Collaborative Research: Sustaining Lemur Project Resources for the Long-Term
- Award Number: 1822986
- Fiscal Year: 2018
- Amount: $301K
- Category: Standard Grant

CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
- Award Number: 1823288
- Fiscal Year: 2018
- Amount: $301K
- Category: Standard Grant

CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
- Award Number: 1853919
- Fiscal Year: 2018
- Amount: $301K
- Category: Standard Grant

CRI: CI-SUSTAIN: Collaborative Research: CiteSeerX: Toward Sustainable Support of Scholarly Big Data
- Award Number: 1823292
- Fiscal Year: 2018
- Amount: $301K
- Category: Standard Grant

CRI: CI-SUSTAIN: Collaborative Research: Sustaining Lemur Project Resources for the Long-Term
- Award Number: 1822975
- Fiscal Year: 2018
- Amount: $301K
- Category: Standard Grant

Collaborative Research: CI-SUSTAIN: StarExec: Cross-Community Infrastructure for Logic Solving
- Award Number: 1730419
- Fiscal Year: 2017
- Amount: $301K
- Category: Standard Grant

Collaborative Research: CI-SUSTAIN: National File System Trace Repository
- Award Number: 1730726
- Fiscal Year: 2017
- Amount: $301K
- Category: Standard Grant

Collaborative Research: CI-SUSTAIN: National File System Trace Repository
- Award Number: 1729939
- Fiscal Year: 2017
- Amount: $301K
- Category: Standard Grant

Collaborative Research: CI-SUSTAIN: StarExec: Cross-Community Infrastructure for Logic Solving
- Award Number: 1729603
- Fiscal Year: 2017
- Amount: $301K
- Category: Standard Grant

CI-SUSTAIN: Collaborative Research: Sustaining Successful Smartphone Testbeds to Enable Diverse Mobile Experiments
- Award Number: 1629894
- Fiscal Year: 2016
- Amount: $301K
- Category: Standard Grant