NEUROINFORMATICS AND EYE-HAND COORDINATION

Basic Information

Project Summary

DESCRIPTION (provided by applicant): This project integrates neurophysiological research into the visual and somatosensory mechanisms that govern planning of skilled motor behaviors of the hand with the development and distribution of neuroinformatics tools for the data visualization, analysis and manipulation such studies require. Aim 1 establishes a Neuroinformatics Resource to distribute integrated tools enabling quantitative analyses of spike trains recorded together with digital video (DV), and their correlation to the kinematics of hand actions. Interactive tools will be provided over the Internet for 1) spike recognition and separation, 2) event-linked rasters and PSTHs, 3) continuous firing-rate graphs, and 4) metrics of spike synchrony. An international network of neuroscientists studying primate hand function will evaluate the tools and provide research synergies from diverse studies of hand function. The tools are used in Aims 2 and 3 to assess the contribution of visual and haptic information about object size, shape and location to motor planning of acquisition and manipulation by the hand. Spike trains recorded in posterior parietal cortex (PPC) from single neurons, and from neuronal assemblies studied with multielectrode arrays, are synchronized to simultaneously acquired DV images of hand kinematics. We postulate that skilled hand behaviors require the registration and coordination of an external map that locates an object's spatial coordinates and encodes its intrinsic geometry, and an internal map of the body's own image that represents hand posture and dynamics. Protocols comparing shape- and location-dependent cues distinguish intention-related activity planning grasping behaviors from sensory responses to views of objects and hand-object interactions during performance of a trained grasp-and-lift task.
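The event-linked PSTHs and continuous firing-rate graphs named in Aim 1 follow a standard recipe: align each spike train to a behavioral event, bin the aligned spikes, and average across events. The sketch below is a minimal illustration of that recipe, not the project's distributed tool; the function name `psth` and its parameters are hypothetical.

```python
import numpy as np

def psth(spike_times, event_times, window=(-0.5, 1.0), bin_width=0.01):
    """Event-linked peri-stimulus time histogram (illustrative sketch).

    spike_times, event_times: 1-D arrays of times in seconds.
    Returns bin edges and the mean firing rate (spikes/s) across events.
    """
    edges = np.arange(window[0], window[1] + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for t0 in event_times:
        rel = spike_times - t0                    # align spikes to the event
        counts += np.histogram(rel, bins=edges)[0]
    # Convert summed counts to an average rate per bin
    rate = counts / (len(event_times) * bin_width)
    return edges, rate
```

A continuous firing-rate graph is the same computation with overlapping or smoothed bins (e.g. convolving the counts with a Gaussian kernel) instead of a discrete histogram.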
Aim 2 analyzes firing patterns obtained when objects can be viewed, and when an opaque barrier blocks vision and haptic cues, requiring the use of memory for object acquisition and manipulation. Aim 3 measures the integration of vision and touch during tool use, when grasped objects are inserted into matching slots to illuminate a target. These experiments test the hypothesis that synchronization and/or coherence of firing between PPC regions dominated by vision and touch enables a match-to-sample mode of sensorimotor control and error correction supporting efficient, adaptive behaviors. This research has important clinical implications for understanding the dysmetrias, optic ataxias, grasp abnormalities and neglect syndromes that result from neurological damage to PPC, and provides basic insights into mechanisms of visuomotor control of the hand that may prove useful for rehabilitation after stroke or injury.
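The spike-synchrony metrics mentioned in Aim 1 and tested in Aims 2-3 can take many forms (cross-correlograms, coherence spectra). As one minimal, assumed example (not the project's actual metric), a coincidence index counts the fraction of spikes in one train that have a partner spike in a second train within a short window:

```python
import numpy as np

def coincidence_index(spikes_a, spikes_b, window=0.005):
    """Fraction of spikes in train A with a spike in train B within +/- window s.

    A simple synchrony measure: near zero for unrelated trains,
    approaching 1.0 for tightly locked firing. Illustrative only.
    """
    if len(spikes_a) == 0:
        return 0.0
    b = np.sort(spikes_b)
    idx = np.searchsorted(b, spikes_a)   # nearest-neighbor candidates in B
    hits = 0
    for t, i in zip(spikes_a, idx):
        near = []
        if i < len(b):
            near.append(abs(b[i] - t))   # first B spike at or after t
        if i > 0:
            near.append(abs(b[i - 1] - t))  # last B spike before t
        if near and min(near) <= window:
            hits += 1
    return hits / len(spikes_a)
```

Chance-level coincidence depends on firing rates, so in practice such an index is compared against a shuffled or jittered control before interpreting it as synchrony.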

Project Outcomes

Journal articles: 0
Monographs: 0
Research awards: 0
Conference papers: 0
Patents: 0


Other publications by ESTHER P. GARDNER


Other grants by ESTHER P. GARDNER

Quantitative Tactile Assessment of Human Manual Dexterity
  • Grant number: 10017141
  • Fiscal year: 2019
  • Funding amount: $399,800
  • Project category:
Quantitative Tactile Assessment of Human Manual Dexterity
  • Grant number: 9808888
  • Fiscal year: 2019
  • Funding amount: $399,800
  • Project category:
Neural Mechanisms of Cutaneous Spatial Integration
  • Grant number: 8664107
  • Fiscal year: 2014
  • Funding amount: $399,800
  • Project category:
NEUROINFORMATICS AND EYE-HAND COORDINATION
  • Grant number: 6743978
  • Fiscal year: 2002
  • Funding amount: $399,800
  • Project category:
NEUROINFORMATICS AND EYE-HAND COORDINATION
  • Grant number: 6890969
  • Fiscal year: 2002
  • Funding amount: $399,800
  • Project category:
NEUROINFORMATICS AND EYE-HAND COORDINATION
  • Grant number: 6625699
  • Fiscal year: 2002
  • Funding amount: $399,800
  • Project category:
NEURAL MECHANISMS OF CUTANEOUS SPATIAL INTEGRATION
  • Grant number: 3394631
  • Fiscal year: 1979
  • Funding amount: $399,800
  • Project category:
NEURAL MECHANISMS OF CUTANEOUS SPATIAL INTEGRATION
  • Grant number: 3394634
  • Fiscal year: 1979
  • Funding amount: $399,800
  • Project category:
Neural Mechanisms of Cutaneous Spatial Integration
  • Grant number: 7625327
  • Fiscal year: 1979
  • Funding amount: $399,800
  • Project category:
NEURAL MECHANISMS OF CUTANEOUS SPATIAL INTEGRATION
  • Grant number: 2655427
  • Fiscal year: 1979
  • Funding amount: $399,800
  • Project category:

Similar overseas grants

Doctoral Dissertation Research: Assessing the chewing function of the hyoid bone and the suprahyoid muscles in primates
  • Grant number: 2337428
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Standard Grant
Doctoral Dissertation Research: Obstetric constraints on neurocranial shape in nonhuman primates
  • Grant number: 2341137
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Standard Grant
Testing the genetic impact on the internal and external shape of teeth in non-human primates
  • Grant number: 2341544
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Standard Grant
Convergent evolution of placental villi in primates and ungulates: Are some placentas more efficient than others?
  • Grant number: BB/Y005953/1
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Research Grant
The perceptual mechanisms of optical-flow speed in human and nonhuman primates
  • Grant number: 24K16879
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Grant-in-Aid for Early-Career Scientists
Doctoral Dissertation Research: Assessing weight-gain tendencies in a non-human primates
  • Grant number: 2341173
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Standard Grant
Doctoral Dissertation Research: The three-dimensional biomechanics of the grasping big toe among higher primates
  • Grant number: 2341368
  • Fiscal year: 2024
  • Funding amount: $399,800
  • Project category: Standard Grant
Doctoral Dissertation Research: Behavioral flexibility and space use in nonhuman primates
  • Grant number: 2316432
  • Fiscal year: 2023
  • Funding amount: $399,800
  • Project category: Standard Grant
Doctoral Dissertation Research: Female mate choice in primates
  • Grant number: 2316896
  • Fiscal year: 2023
  • Funding amount: $399,800
  • Project category: Standard Grant
Symbolic representation of objects via visual symbols in the primates brain
  • Grant number: 23K12942
  • Fiscal year: 2023
  • Funding amount: $399,800
  • Project category: Grant-in-Aid for Early-Career Scientists