The effects of dynamic facial information on sensitivity to facial expressions.
Basic information
- Grant number: ES/T009098/1
- Principal investigator:
- Amount: $121,900
- Host institution:
- Host institution country: United Kingdom
- Project type: Fellowship
- Fiscal year: 2019
- Funding country: United Kingdom
- Project period: 2019 to (no data)
- Project status: Completed
- Source:
- Keywords:
Project abstract
Facial emotion recognition is viewed as one of the defining aspects of human visual perception. For some members of the population, however, extracting emotion-related information from faces is a challenging task. Deficits in facial emotion recognition are particularly prominent in autism spectrum disorder (ASD) and in certain mood disorders. Behavioural intervention programmes have been used to facilitate facial emotion learning through interactive games with computer-generated agents, and VR-based exposure therapy has shown positive effects in reducing social phobias and the effects of post-traumatic stress disorder.
The objective of the present study is to pilot novel stimuli that reflect naturalistic facial emotion perception within VR, with a long-term view to using this as a model for behavioural intervention programmes in clinical populations. This has been of particular interest to Sinewave, a software developer with whom the project's primary mentor is already in contact. A notable drawback of current behavioural paradigms is their use of synthetic, 3D-animated characters whose qualities do not reflect the fluid nature of human expression as it emerges in real time. This leaves a gap in the literature and in industry for a realistic set of real-life, dynamic human facial expressions. If piloted now, such a stimulus set would lend itself to a wide range of applications, from clinical research to cutting-edge user software development. The second mentor on the project will offer expertise in this particular area, and possesses the research background and hardware required to develop and pilot such a face set in VR. For facial stimuli to be effective tools for measuring and improving facial emotion recognition, they must be realistic enough to simulate natural social perception; these objectives bring together expertise from both Abigail Webb and Peter Scarfe.
The project will be the first to address how real-life, dynamic facial information in videos of human faces contributes to observers' emotion recognition accuracy. This has yet to be tested on an immersive VR platform, a cost-effective and novel technique for assessing and promoting the practice of facial emotion recognition in clinical populations. Such dynamic, real-time facial stimuli may be more effective for measuring natural facial emotion recognition, an advantage that is bolstered by the use of immersive VR. Our pilot study uses a non-clinical population and will serve as a preliminary investigation of the efficacy of our model for immersive facial emotion recognition; this is of particular interest to Sinewave. Using specialised 3D facial scanning techniques at the University of Reading (Peter Scarfe), it will be possible to create real human facial stimuli that are both 3D and dynamic, changing expression in real time. Recognition will be measured both outside of and within immersive VR, to assess the relative effect of the VR context on facial emotion recognition.
Software companies serving end users, including Facebook Reality Labs (another pre-existing relationship with the mentors), now emphasise the importance of such research input for developing the hardware and software needed to relay the nuances of socially complex interactions. The applications are vast, extending equally well to clinical research, empirical areas of cognitive psychology, sociology, industry, and business. This area of research is now taking off outside academia: we see this with Facebook Reality Labs, where the construction of cutting-edge immersive software is increasingly informed by expert researchers in face perception. Together, this project unites expertise in psychophysics and face perception (AW), VR (LvD), and the creation of real-life human facial stimuli (PS).
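To make the pilot's measurement concrete, the sketch below shows one way recognition accuracy could be tabulated across the comparisons described above (dynamic versus static stimuli, viewed on a conventional display versus in immersive VR). This is a minimal illustrative sketch, not the project's actual materials or analysis code: the condition names, emotion labels, and trial record format are assumptions made for illustration.

```python
# Minimal sketch (assumed design, not the project's code): tabulate
# proportion-correct emotion recognition per cell of a motion (static/dynamic)
# by viewing-context (flat screen / immersive VR) design.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Trial:
    motion: str    # "static" or "dynamic" (assumed condition names)
    context: str   # "screen" or "vr" (assumed condition names)
    shown: str     # emotion displayed by the face stimulus
    response: str  # emotion reported by the observer


def accuracy_by_condition(trials):
    """Return proportion correct for each (motion, context) cell."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t in trials:
        cell = (t.motion, t.context)
        total[cell] += 1
        correct[cell] += int(t.response == t.shown)
    return {cell: correct[cell] / total[cell] for cell in total}


if __name__ == "__main__":
    demo = [  # toy data for illustration only
        Trial("dynamic", "vr", "fear", "fear"),
        Trial("dynamic", "screen", "fear", "surprise"),
        Trial("static", "vr", "happy", "happy"),
        Trial("static", "screen", "sad", "fear"),
    ]
    for cell, acc in sorted(accuracy_by_condition(demo).items()):
        print(f"motion={cell[0]}, context={cell[1]}: accuracy={acc:.2f}")
```

A full analysis would add observer identifiers, counterbalancing, and inferential statistics; the point here is only the motion-by-context comparison the abstract describes.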
Project outcomes
Journal articles (9)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)
Reversing luminance polarity: control faces are harder to recognise but easier to see
- DOI: 10.31234/osf.io/zw3nb
- Publication year: 2020
- Journal:
- Impact factor: 0
- Authors: Webb A
- Corresponding author: Webb A
Saccadic eye movements are deployed faster for salient facial stimuli, but are relatively indifferent to their emotional content
- DOI: 10.31234/osf.io/j9c2e
- Publication year: 2021
- Journal:
- Impact factor: 0
- Authors: Webb A
- Corresponding author: Webb A
Contrast normalisation masks natural expression-related differences and artificially enhances the perceived salience of fear expressions.
- DOI: 10.31219/osf.io/hkrd9
- Publication year: 2020
- Journal:
- Impact factor: 0
- Authors: Webb A
- Corresponding author: Webb A
Conference Paper
- DOI: 10.1016/s0987-7983(98)80087-x
- Publication year: 2009
- Journal:
- Impact factor: 0
- Authors: Peter Chan
- Corresponding author: Peter Chan
Immersive-360° theatre: user experience in the virtual auditorium and platform efficacy for current and underserved audiences
- DOI: 10.31234/osf.io/2fyv7
- Publication year: 2023
- Journal: Psychology of Aesthetics, Creativity, and the Arts
- Impact factor: 0
- Authors: Webb A
- Corresponding author: Webb A
Other publications by Abigail Webb
Intraoperative margin assessment: Reliability and limitations of specimen X-rays in breast cancer surgery
- DOI: 10.1016/j.ejso.2025.109823
- Publication date: 2025-05-01
- Journal:
- Impact factor: 2.900
- Authors: Sadia Jaskani; Winston Yao; Neal Chhaya; Abigail Webb; Haresh Devalia; Mohsin Dani
- Corresponding author: Mohsin Dani
Similar NSFC grants
Dynamic Credit Rating with Feedback Effects
- Grant number:
- Approval year: 2024
- Funding amount:
- Project type: Research Fund for Foreign Scholars
In-situ dynamic study of the nucleation and growth mechanisms of TCP phases in advanced Re- and Ru-containing nickel-based single-crystal superalloys
- Grant number: 52301178
- Approval year: 2023
- Funding amount: CNY 300,000
- Project type: Young Scientists Fund
Base force element method for static and dynamic damage problems and its application to mesoscale damage analysis of recycled concrete materials
- Grant number: 11172015
- Approval year: 2011
- Funding amount: CNY 580,000
- Project type: General Program
Theoretical study of holistic optimal design of urban stormwater drainage networks based on a Bayesian-network reliability evolution model
- Grant number: 51008191
- Approval year: 2010
- Funding amount: CNY 200,000
- Project type: Young Scientists Fund
Correlation between dynamic changes in chemical constituents and bioactivity during the rearing and processing of Periplaneta americana as a medicinal material
- Grant number: 81060329
- Approval year: 2010
- Funding amount: CNY 260,000
- Project type: Regional Science Fund Program
Dynamical evolution of stars and gas in galaxies
- Grant number: 11073025
- Approval year: 2010
- Funding amount: CNY 300,000
- Project type: General Program
Optimal dynamic policies for non-standard stochastic scheduling models
- Grant number: 71071056
- Approval year: 2010
- Funding amount: CNY 280,000
- Project type: General Program
"锁住"的金属中心手性-手性笼络合物的动态CD光谱研究与应用开发
- 批准号:20973136
- 批准年份:2009
- 资助金额:34.0 万元
- 项目类别:面上项目
Multiscale real-time measurement methods for dynamic field distributions of complex thermophysical parameters in biofilm reactors
- Grant number: 50876120
- Approval year: 2008
- Funding amount: CNY 360,000
- Project type: General Program
Correctness proofs and verification of consistency-maintenance algorithms for collaborative group operations in large-scale dynamic network environments
- Grant number: 60803118
- Approval year: 2008
- Funding amount: CNY 200,000
- Project type: Young Scientists Fund
Similar overseas grants
Your eyes can deceive you: Predictive biases in perceptual representation of dynamic facial expressions.
- Grant number: 2752337
- Fiscal year: 2022
- Funding amount: $121,900
- Project type: Studentship
Multisensory integration at the cell, circuit, and behavioral levels: How audiovisual signals drive dynamic courtship behavior in Drosophila melanogaster
- Grant number: 10389197
- Fiscal year: 2022
- Funding amount: $121,900
- Project type:
Multisensory integration at the cell, circuit, and behavioral levels: How audiovisual signals drive dynamic courtship behavior in Drosophila melanogaster
- Grant number: 10828249
- Fiscal year: 2022
- Funding amount: $121,900
- Project type:
3D Dynamic and Patient-Centered Outcomes of Facial Reanimation Surgery in Patients with Facial Paralysis
- Grant number: 10507946
- Fiscal year: 2022
- Funding amount: $121,900
- Project type:
3D Dynamic and Patient-Centered Outcomes of Facial Reanimation Surgery in Patients with Facial Paralysis
- Grant number: 10353424
- Fiscal year: 2022
- Funding amount: $121,900
- Project type:
3D Dynamic and Patient-Centered Outcomes of Facial Reanimation Surgery in Patients with Facial Paralysis
- Grant number: 10211825
- Fiscal year: 2021
- Funding amount: $121,900
- Project type:
Analysis by Synthesis of Dynamic Facial Color and Expression under Emotional Changes in Remote Rehabilitation
- Grant number: 20K03147
- Fiscal year: 2020
- Funding amount: $121,900
- Project type: Grant-in-Aid for Scientific Research (C)
Mechanisms of Dynamic Neural Coupling during Face-to-Face Expressions of Emotion
- Grant number: 9883842
- Fiscal year: 2019
- Funding amount: $121,900
- Project type:
Mechanisms of Dynamic Neural Coupling during Face-to-Face Expressions of Emotion
- Grant number: 10542713
- Fiscal year: 2019
- Funding amount: $121,900
- Project type:
Mechanisms of Dynamic Neural Coupling during Face-to-Face Expressions of Emotion
- Grant number: 10319996
- Fiscal year: 2019
- Funding amount: $121,900
- Project type: