CAREER: Visual Manipulation Learning for Challenging Object Grasping


Basic Information

  • Award Number:
    2143730
  • Principal Investigator:
  • Amount:
    $538,900
  • Host Institution:
  • Host Institution Country:
    United States
  • Project Type:
    Continuing Grant
  • Fiscal Year:
    2022
  • Funding Country:
    United States
  • Project Period:
    2022-06-01 to 2027-05-31
  • Project Status:
    Ongoing

Project Summary

This Faculty Early Career Development (CAREER) project seeks to significantly enhance the capabilities of robotic systems to grasp objects. Object grasping is an important prerequisite for various manipulation tasks. Humans are capable of grasping diverse objects dexterously even when their workspaces are cluttered. When an object is in a constrained space, such as a cardboard box or a shelf, humans often take advantage of the constraint by pushing the object toward a wall or a corner to grasp it. Even when a target object is hidden in a pile of objects, humans actively search for it by removing the surrounding clutter. While object grasping in such challenging scenarios seems effortless and natural for humans, current robots are still limited to grasping objects in moderately cluttered tabletop environments. This project will develop computational algorithms that allow robots to perform such challenging object grasping. The project has the potential to broaden the application domains of robotic manipulation to areas that have not been feasible with current robotic labor, such as flexible manufacturing (e.g., supporting human workers by providing the right parts), agriculture (e.g., automated fruit and vegetable harvesting), warehouse fulfillment centers and grocery shopping (e.g., picking and placing ordered items on shelves or in bins), and eldercare (e.g., fetching a remote controller or pills).

The objective of this project is to develop novel computational algorithms that enable robots to visually understand scenes, learn to perform a proper manipulation action sequence, and adapt to different environmental settings. The project will address three fundamental challenges in robotic object grasping: (1) context-aware object grasping -- considering spatial contexts related to grasping, such as clutteredness, reachability, and collision; (2) object grasping via leveraging fixtures -- making use of environmental fixtures to grasp challenging objects by learning pixel-level or object-level affordances; and (3) object searching and grasping -- searching for and grasping a hidden target object queried by either a reference image or natural language. All algorithms and systems are designed to be self-supervised, meaning that they can learn and adapt to novel objects, environments, and tasks with minimal human intervention or guidance. This project will lay a solid foundation for affordable and reliable robotic labor beneficial to broad areas of society, such as homes, factories, farms, warehouses, and eldercare facilities.

This project is supported by the cross-directorate Foundational Research in Robotics program, jointly managed and funded by the Directorates for Engineering (ENG) and Computer and Information Science and Engineering (CISE). This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
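The summary above mentions learning pixel-level affordances through self-supervised trial and error (in the spirit of the fixture-aware DDQN work listed under Project Outcomes). Purely as an illustrative sketch, the code below shows one common way such a per-pixel affordance learner can be structured: a fully convolutional network maps an RGB-D heightmap to a Q-value map with one channel per manipulation primitive, and the action is the primitive/pixel pair with the highest value. The architecture, input encoding, and primitive set here are assumptions made for this example, not the project's actual implementation.

# Minimal illustrative sketch (assumed architecture, not the project's code):
# a fully convolutional network predicting per-pixel Q-values for a small set
# of manipulation primitives (e.g., grasp, push-toward-fixture) from an RGB-D
# heightmap, plus greedy action selection over primitives and pixels.
import torch
import torch.nn as nn


class PixelAffordanceQNet(nn.Module):
    def __init__(self, in_channels: int = 4, num_primitives: int = 2):
        super().__init__()
        # Encoder downsamples by 8; decoder upsamples back, so H and W are
        # assumed divisible by 8 for the output map to match the input size.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, num_primitives, 4, stride=2, padding=1),
        )

    def forward(self, heightmap: torch.Tensor) -> torch.Tensor:
        # heightmap: (1, in_channels, H, W) -> Q-values: (1, num_primitives, H, W)
        return self.decoder(self.encoder(heightmap))


def select_action(q_net: PixelAffordanceQNet, heightmap: torch.Tensor):
    """Greedily pick the (primitive, row, col) with the highest Q-value."""
    with torch.no_grad():
        q = q_net(heightmap)            # (1, P, H, W); batch size 1 assumed
    _, P, H, W = q.shape
    flat = int(q.view(-1).argmax())
    primitive, rest = divmod(flat, H * W)
    row, col = divmod(rest, W)
    return primitive, row, col


if __name__ == "__main__":
    net = PixelAffordanceQNet()
    obs = torch.randn(1, 4, 96, 96)     # dummy RGB-D heightmap
    print(select_action(net, obs))

In a DDQN-style self-supervised loop, the robot would execute the selected primitive at that pixel, use grasp success as the reward, and regress the chosen Q-value toward a target-network estimate; those training details are omitted here.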

Project Outcomes

Journal articles (6)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)
Fixture-Aware DDQN for Generalized Environment-Enabled Grasping
KINet: Unsupervised Forward Models for Robotic Pushing Manipulation
  • DOI:
    10.1109/lra.2023.3303829
  • Publication Date:
    2022-02
  • Journal:
  • Impact Factor:
    5.2
  • Authors:
    Alireza Rezazadeh;Changhyun Choi
  • Corresponding Author:
    Alireza Rezazadeh;Changhyun Choi
Adversarial Object Rearrangement in Constrained Environments with Heterogeneous Graph Neural Networks
IOSG: Image-Driven Object Searching and Grasping
Self-Supervised Interactive Object Segmentation Through a Singulation-and-Grasping Approach
  • DOI:
    10.48550/arxiv.2207.09314
  • Publication Date:
    2022-07
  • Journal:
  • Impact Factor:
    9.5
  • Authors:
    Houjian Yu;Changhyun Choi
  • Corresponding Author:
    Houjian Yu;Changhyun Choi

Other Publications by Changhyun Choi

Real-time 3D object pose estimation and tracking for natural landmark based visual servo
Indicator Development and Evaluation of Storm and Flood Resilience Using Big Data Analysis: (1) Development of Resilience Indicators
Applicability of Spatial Interpolation Methods for the Estimation of Rainfall Field
  • DOI:
  • Publication Date:
    2015
  • Journal:
  • Impact Factor:
    0
  • Authors:
    Ho;Narae Kang;Huiseong Noh;D. R. Lee;Changhyun Choi;H. Kim
  • Corresponding Author:
    H. Kim
Development of Heavy Rain Damage Prediction Function for Public Facility Using Machine Learning
  • DOI:
    10.9798/kosham.2017.17.6.443
  • Publication Date:
    2017
  • Journal:
  • Impact Factor:
    0
  • Authors:
    Changhyun Choi;Kihyuck Park;Heekyung Park;M. Lee;Jongsung Kim;H. Kim
  • Corresponding Author:
    H. Kim
Learning Object Relations with Graph Neural Networks for Target-Driven Grasping in Dense Clutter

Similar NSFC Grants

Research on Multi-Image-Based Visual Hull Reconstruction and Surface Property Modeling Algorithms
  • Award Number:
    60373031
  • Award Year:
    2003
  • Funding Amount:
    CNY 230,000
  • Project Type:
    General Program

Similar Overseas Grants

Visual methods for advanced automation of underwater manipulation
  • Award Number:
    LP220100527
  • Fiscal Year:
    2023
  • Funding Amount:
    $538,900
  • Project Type:
    Linkage Projects
Neural Mechanisms of the Representation, Prioritization, and Manipulation of Visual Working Memory
  • Award Number:
    RGPIN-2019-04865
  • Fiscal Year:
    2022
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Research and Development of Visual Perception and Manipulation in an Interactive Autonomous Robotic System
  • Award Number:
    RGPIN-2017-05762
  • Fiscal Year:
    2022
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Neural Mechanisms of the Representation, Prioritization, and Manipulation of Visual Working Memory
  • Award Number:
    RGPIN-2019-04865
  • Fiscal Year:
    2021
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Research and Development of Visual Perception and Manipulation in an Interactive Autonomous Robotic System
  • Award Number:
    RGPIN-2017-05762
  • Fiscal Year:
    2021
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Neural Mechanisms of the Representation, Prioritization, and Manipulation of Visual Working Memory
  • Award Number:
    RGPIN-2019-04865
  • Fiscal Year:
    2020
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Dissection and manipulation of inflammatory pathways underlying post-traumatic visual outcomes
  • Award Number:
    9910605
  • Fiscal Year:
    2020
  • Funding Amount:
    $538,900
  • Project Type:
Research and Development of Visual Perception and Manipulation in an Interactive Autonomous Robotic System
  • Award Number:
    RGPIN-2017-05762
  • Fiscal Year:
    2020
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual
Dissection and manipulation of inflammatory pathways underlying post-traumatic visual outcomes
  • Award Number:
    10362517
  • Fiscal Year:
    2020
  • Funding Amount:
    $538,900
  • Project Type:
Research and Development of Visual Perception and Manipulation in an Interactive Autonomous Robotic System
  • Award Number:
    RGPIN-2017-05762
  • Fiscal Year:
    2019
  • Funding Amount:
    $538,900
  • Project Type:
    Discovery Grants Program - Individual