EAGER: Volition Based Anticipatory Control for Time-Critical Brain-Prosthetic Interaction
Basic Information
- Award Number: 1550397
- Principal Investigator: Gil Weinberg
- Amount: $178,800
- Host Institution:
- Host Institution Country: United States
- Award Type: Standard Grant
- Fiscal Year: 2015
- Funding Country: United States
- Project Period: 2015-08-15 to 2017-07-31
- Status: Completed
- Source:
- Keywords:
Project Abstract
This exploratory project focuses on developing algorithms that will allow the PI's previously implemented prototype drumming prosthesis, originally developed to help an injured teen, to anticipate human physical actions based on an analysis of EEG signals so that it can respond mechanically in a timely manner. The goal is to enable the enhanced prosthesis to detect volition, the cognitive process by which an individual decides on and commits to a particular course of action hundreds of milliseconds before the action actually takes place. Detecting volition would allow the system to foresee the drummer's actions and achieve sub-second synchronization between artificial and biological limbs, improving performance in a time-sensitive domain where asynchrony of more than a few milliseconds is noticeable to listeners. Project outcomes will include cognitive models and technical approaches of value for improving efficiency and fluency in a wide range of human-robot and human-prosthesis interaction scenarios, from construction tasks where humans and robots collaborate toward common goals, to time-critical tasks such as those in hospital operating rooms or space stations where humans operate artificial robotic limbs. The work will also lead to the creation of a volition trials database that will be documented and shared with the broad community of brain scholars and brain-machine interface researchers. The project will have additional broad impact by supporting students in the Robotic Musicianship group at Georgia Tech as it transitions from its previous focus on robotic musicianship into the fields of prosthetics and human augmentation.
Prior studies of volition have shown that across multiple repetitions of (real or imagined) motor activity one can derive the Event-Related Potential (ERP) associated with the intent to move the hand, up to a few seconds prior to the generation of the movement.
Additionally, studies of mirror neurons have shown that observing a motor activity can trigger sets of cells in the brain that replicate the depicted activity, as when the subject is engaged in the action itself. In this project the PI will build on such findings to develop new pattern recognition algorithms for EEG signal analysis in an effort to identify volition, and to design new anticipatory algorithms for brain-machine interfaces that reduce latency and allow for synchronization at the millisecond level. The work will be carried out in stages. The PI will first collect EEG data from a large number of experimental trials in which participants are engaged in a voluntary motor action. The data will be studied to detect patterns indicative of volition activity from electrodes monitoring both the motor and pre-motor cortices (SMA and pre-SMA), and to isolate the neural correlates of imagined vs. real movement. A variety of general-purpose machine learning classifiers, as well as music-focused feature extraction techniques, will be used to distinguish between anticipatory patterns of activity preceding an action (volition) and patterns generated when the action is actually manifested. As part of the analysis the PI will attempt to characterize the delta times between volition and action under different conditions, and he will develop a repeatability/reliability matrix to be utilized for synchronization in the next stage of the work, in which the PI will develop a "latency compensation engine" that generates robotic drum hits at the exact anticipated action time, compensating for mechanical latencies while taking into account the projected delta time between volition and action. Multi-modal integration with data from other sensors (EMG, microphones, proximity, etc.) will be exploited to correct errors in detection and classification.
Finally, the success of the new algorithms will be evaluated using both objective and subjective measures, by having the amputee drummer perform a series of musical tasks with the robotic arm.
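The timing logic behind such a latency compensation engine can be sketched in a few lines. This is a minimal illustration only: the function name, parameters, and numeric values below are assumptions for exposition, not the project's actual design. The idea is that the actuator command must be issued early enough that the mechanical strike coincides with the predicted moment of the human action.

```python
def schedule_hit(t_volition_detected, volition_to_action_delay, mechanical_latency):
    """Return the time (s) at which the actuator command must be issued.

    t_volition_detected      -- timestamp at which the EEG classifier fired
    volition_to_action_delay -- projected delta between volition and action
    mechanical_latency       -- measured command-to-strike delay of the arm
    """
    # Predicted moment of the human action.
    t_action_predicted = t_volition_detected + volition_to_action_delay
    # Issue the command early by the arm's mechanical latency so the strike
    # lands at the predicted action time. If mechanical_latency exceeds the
    # remaining lead time, synchronization is impossible and the caller
    # should fall back to a reactive (non-anticipatory) strike.
    t_command = t_action_predicted - mechanical_latency
    return t_command

# Example (illustrative numbers): volition detected at t = 10.000 s,
# ~300 ms before the action, with a 90 ms mechanical latency.
t_command = schedule_hit(10.000, 0.300, 0.090)
```

In practice the projected delta would come from the repeatability/reliability matrix described above, conditioned on the detected pattern and playing context, rather than being a fixed constant.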
Project Outcomes
Journal articles: 0
Monographs: 0
Research awards: 0
Conference papers: 0
Patents: 0
Other Publications by Gil Weinberg
- Synchronization in human-robot Musicianship
  - DOI: 10.1109/roman.2010.5598690
  - Year: 2010
  - Authors: Guy Hoffman; Gil Weinberg (corresponding author: Gil Weinberg)
- Robotic Musicianship - Musical Interactions Between Humans and Machines
  - DOI: 10.5772/5206
  - Year: 2007
  - Authors: Gil Weinberg (corresponding author: Gil Weinberg)
- Emotional musical prosody for the enhancement of trust: Audio design for robotic arm communication
  - DOI: 10.1515/pjbr-2021-0033
  - Year: 2021
  - Authors: Richard J. Savery; Lisa Zahray; Gil Weinberg (corresponding author: Gil Weinberg)
- The embroidered musical ball: a squeezable instrument for expressive performance
  - Year: 2000
  - Authors: Gil Weinberg; Maggie Orth; Peter Russo (corresponding author: Peter Russo)
- Visual cues-based anticipation for percussionist-robot interaction
  - DOI: 10.1145/2157689.2157713
  - Year: 2012
  - Authors: Marcelo Cicconet; Mason Bretan; Gil Weinberg (corresponding author: Gil Weinberg)
Other Grants of Gil Weinberg
- Data Driven Predictive Auditory Cues for Safety and Fluency in Human-Robot Interaction
  - Award Number: 2240525
  - Fiscal Year: 2023
  - Award Type: Standard Grant
- NRI: FND: Creating Trust Between Groups of Humans and Robots Using a Novel Music Driven Robotic Emotion Generator
  - Award Number: 1925178
  - Fiscal Year: 2019
  - Award Type: Standard Grant
- I-Corps: Dexterous Robotic Prosthetic Control Using Deep Learning Pattern Prediction from Ultrasound Signal
  - Award Number: 1744192
  - Fiscal Year: 2017
  - Award Type: Standard Grant
- EAGER: Sub-second human-robot synchronization
  - Award Number: 1345006
  - Fiscal Year: 2013
  - Award Type: Standard Grant
- HCC: Small: Multi Modal Music Intelligence for Robotic Musicianship
  - Award Number: 1017169
  - Fiscal Year: 2010
  - Award Type: Standard Grant
- HRI: The Robotic Musician - Facilitating Novel Musical Experiences and Outcomes through Human Robot Interaction
  - Award Number: 0713269
  - Fiscal Year: 2007
  - Award Type: Standard Grant
Similar Overseas Grants
- SBIR Phase I: Volition With An App
  - Award Number: 2026010
  - Fiscal Year: 2020
  - Award Type: Standard Grant
- Development of a restraints free program for patients regarding improvement of staff motivation, volition, and performance to care
  - Award Number: 20K11015
  - Fiscal Year: 2020
  - Award Type: Grant-in-Aid for Scientific Research (C)
- Proposal of gait model related to intellect, emotion, volition and physical and its application to suspicious person detection
  - Award Number: 18H04115
  - Fiscal Year: 2018
  - Award Type: Grant-in-Aid for Scientific Research (A)
- Development of assistive walking devices which change the motions with user's body conditions and volition.
  - Award Number: 17K01551
  - Fiscal Year: 2017
  - Award Type: Grant-in-Aid for Scientific Research (C)
- Volition - a machine learning decision support platform for Innovate UK
  - Award Number: 971578
  - Fiscal Year: 2017
  - Award Type: Small Business Research Initiative
- SFB 940: Volition and Cognitive Control: Mechanisms, Modulators and Dysfunctions
  - Award Number: 178833530
  - Fiscal Year: 2012
  - Award Type: Collaborative Research Centres
- Lifestyle Change: Values and Volition
  - Award Number: ES/G010072/2
  - Fiscal Year: 2012
  - Award Type: Research Grant
- Lifestyle Change: Values and Volition
  - Award Number: ES/G010072/1
  - Fiscal Year: 2009
  - Award Type: Research Grant
- Analysis of movement-related sympathetic responses reflecting volition of individuals with severe motor impairment
  - Award Number: 21500539
  - Fiscal Year: 2009
  - Award Type: Grant-in-Aid for Scientific Research (C)