RI: Collaborative Research: Foreign accent conversion through articulatory inversion of the vocal-tract frontal cavity

Basic Information

Project Summary

The ability to transform a "foreign" accented voice into its "native" counterpart could be an invaluable tool in pronunciation training for second-language learners. This requires separating those aspects of the speech signal that are determined by the anatomy of the vocal tract from those that result from the idiosyncratic way in which the speaker controls it. While these two sources interact in complex ways in the acoustic domain, a few studies indicate that they may be decoupled in the articulatory space, specifically in the vocal-tract frontal cavity. The objective of this research is to determine the extent to which foreign-accent conversion can be performed through articulatory inversion of the frontal cavity. For this purpose, two complementary problems are being investigated. First, existing articulatory datasets are being used to develop a foreign-accent conversion model that operates in the frontal cavity domain. Second, articulatory inversion models are being developed to estimate the frontal cavity configuration from speech acoustics. Results from these models are being systematically validated through perceptual tests of foreign-accentedness, speaker identity, and acoustic quality.

English is a second language for a significant percentage of the workforce in the United States. Reduction of foreign accent becomes increasingly difficult beyond the "critical period" of language learning, but substantial improvements in pronunciation do occur for adult second-language learners. This work will stimulate the development of new technology to facilitate such improvements. Its results may also find application in film dubbing/looping, as well as in speech technology at large (e.g., feature extraction, data compression).
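The second subproblem described above, estimating the frontal-cavity configuration from speech acoustics, is a regression task commonly known as acoustic-to-articulatory inversion. Below is a minimal sketch of that setup, assuming frame-level MFCC features as input and EMA-style coordinates for frontal-cavity articulators (tongue tip, tongue blade, lips) as targets; the synthetic data and the scikit-learn MLP baseline are placeholders for illustration only, not the corpora or models used in this project.

# Minimal sketch of acoustic-to-articulatory inversion (illustrative only):
# map frame-level acoustic features (e.g., MFCCs) to frontal-cavity articulator
# positions such as those recorded by EMA sensors. Synthetic data stands in
# for a real articulatory corpus.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins: 5000 frames of 13-dim MFCCs and 6-dim articulator
# coordinates (x/y for tongue tip, tongue blade, lower lip) -- hypothetical.
n_frames, n_mfcc, n_artic = 5000, 13, 6
acoustics = rng.normal(size=(n_frames, n_mfcc))
true_map = rng.normal(size=(n_mfcc, n_artic))
articulators = acoustics @ true_map + 0.1 * rng.normal(size=(n_frames, n_artic))

X_train, X_test, y_train, y_test = train_test_split(
    acoustics, articulators, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)

# A small MLP is a common inversion baseline; the project itself may rely on
# different estimators (e.g., recurrent or mixture-density models).
inverter = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=300, random_state=0)
inverter.fit(scaler.transform(X_train), y_train)

pred = inverter.predict(scaler.transform(X_test))
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"Frame-level inversion RMSE (synthetic units): {rmse:.3f}")

The predicted articulator trajectories would then feed the accent-conversion model operating in the frontal-cavity domain, with output quality judged through the perceptual tests mentioned in the summary.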

Project Outcomes

Journal articles (0)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)

Other Publications by Ricardo Gutierrez-Osuna

Context-sensitive intra-class clustering
  • DOI:
    10.1016/j.patrec.2013.04.031
  • Publication date:
    2014-02-01
  • Journal:
  • Impact factor:
  • Authors:
    Yingwei Yu;Ricardo Gutierrez-Osuna;Yoonsuck Choe
  • Corresponding author:
    Yoonsuck Choe
Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation
  • DOI:
    10.1186/1476-072x-10-45
  • Publication date:
    2011-07-26
  • Journal:
  • Impact factor:
    3.200
  • Authors:
    Maged N Kamel Boulos;Bryan J Blanchard;Cory Walker;Julio Montero;Aalap Tripathy;Ricardo Gutierrez-Osuna
  • Corresponding author:
    Ricardo Gutierrez-Osuna

Other Grants by Ricardo Gutierrez-Osuna

Convergence Accelerator Workshop - Chemical sensing with an olfaction analogue: high-dimensional, bio-inspired sensing and computation
  • Award number:
    2231512
  • Fiscal year:
    2022
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: Adaptive explicit and implicit feedback in second language pronunciation training
  • Award number:
    2016959
  • Fiscal year:
    2020
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
CHS: Medium: Collaborative Research: Managing Stress in the Workplace: Unobtrusive Monitoring and Adaptive Interventions
  • Award number:
    1704636
  • Fiscal year:
    2017
  • Funding amount:
    $229,900
  • Project category:
    Continuing Grant
RI: Small: Collaborative Research: Developing Golden Speakers for Second-Language Pronunciation Training
  • Award number:
    1619212
  • Fiscal year:
    2016
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
EXP: Collaborative Research: Perception and Production in Second Language: The Roles of Voice Variability and Familiarity
  • Award number:
    1623750
  • Fiscal year:
    2016
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Integrated Sensing and Acting with Tunable Chemical Sensors
  • Award number:
    1002028
  • Fiscal year:
    2010
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
CAREER: Computational Models for Sensor-Based Machine Olfaction
  • Award number:
    0229598
  • Fiscal year:
    2002
  • Funding amount:
    $229,900
  • Project category:
    Continuing Grant
CAREER: Computational Models for Sensor-Based Machine Olfaction
  • Award number:
    9984426
  • Fiscal year:
    2000
  • Funding amount:
    $229,900
  • Project category:
    Continuing Grant

Similar Overseas Grants

Collaborative Research: RI: Medium: Principles for Optimization, Generalization, and Transferability via Deep Neural Collapse
  • Award number:
    2312841
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Medium: Principles for Optimization, Generalization, and Transferability via Deep Neural Collapse
  • Award number:
    2312842
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Small: Foundations of Few-Round Active Learning
  • Award number:
    2313131
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Medium: Lie group representation learning for vision
  • Award number:
    2313151
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Continuing Grant
Collaborative Research: RI: Medium: Principles for Optimization, Generalization, and Transferability via Deep Neural Collapse
  • Award number:
    2312840
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Small: Deep Constrained Learning for Power Systems
  • Award number:
    2345528
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Small: Motion Fields Understanding for Enhanced Long-Range Imaging
  • Award number:
    2232298
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Small: End-to-end Learning of Fair and Explainable Schedules for Court Systems
  • Award number:
    2232055
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant
Collaborative Research: RI: Medium: Lie group representation learning for vision
  • Award number:
    2313149
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Continuing Grant
Collaborative Research: CompCog: RI: Medium: Understanding human planning through AI-assisted analysis of a massive chess dataset
  • Award number:
    2312374
  • Fiscal year:
    2023
  • Funding amount:
    $229,900
  • Project category:
    Standard Grant