SCH: INT: Collaborative Research: Exploiting Voice Assistant Systems for Early Detection of Cognitive Decline


Basic Information

  • Award number:
    10190783
  • Principal investigator:
  • Amount:
    $292,600
  • Host institution:
  • Host institution country:
    United States
  • Project type:
  • Fiscal year:
    2019
  • Funding country:
    United States
  • Project period:
    2019-09-30 to 2023-05-31
  • Project status:
    Completed

Project Abstract

Early detection of Alzheimer’s Disease and Related Dementias (ADRD) in older adults living alone is essential for developing, planning, and initiating interventions and support systems that improve patients’ everyday function and quality of life. Conventional, clinic-based methods for early diagnosis are expensive, impractical, and time-consuming. This project aims to develop a low-cost, passive, and practical home-based assessment method using Voice Assistant Systems (VAS) for early detection of ADRD, including a set of novel data-mining techniques for sparse time-series speech. The project has three specific aims: 1. Using a recurrent neural network (RNN) and a softmax regression model, we will develop a transfer-learning technique to investigate the link between speech from in-lab VAS tasks and cognitive decline, and to discover ADRD-related voice biomarkers. The Pitt Corpus speech database will be used to optimize the RNN parameters, thereby overcoming the limited-data problem of VAS. The softmax regression model will allow us to align the feature distributions of the prior speech data and the in-lab VAS speech. 2. We will develop a novel “many-to-difference” prediction model with a symmetric RNN structure to predict ADRD-related cognitive differences between the two ends of a time period from sparse time-series data. The proposed model differs from previous ones in that the learning focus shifts from short-term pattern differences across users to the pattern difference over time for an individual user. The model accommodates the highly dynamic nature of the inputs and maximally removes individual characteristics from the prediction result. To analyze the sparse time-series speech, a new data-sampling technique will address the imbalanced-data problem, and a data-quality metric will be developed for the proposed model. 3. The team will conduct an 18-month in-lab evaluation and a 28-month in-home evaluation, focusing on whether the VAS tasks and features from the in-lab evaluation and the repetition features of the in-home VAS data can measure and predict ADRD-related cognitive decline in the in-home participants over time. The proposed methods will be integrated into an interactive system to enable efficient communication about ADRD status among patients, caregivers, and clinicians. If successful, the outcomes of this project will give clinicians supportive evidence for detecting ADRD early outside a clinic-based setting.

Project Relevance: This project aims to develop a low-cost, passive, and practical cognitive-assessment method using Voice Assistant Systems (VAS) for early detection of ADRD-related cognitive decline. If successful, the proposed system could be widely disseminated for early diagnosis of ADRD, complementing existing diagnostic modalities and ultimately enabling long-term patient and caregiver planning to maintain individuals’ independence at home.
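Aim 1 pairs a pretrained feature extractor with a softmax regression head that is re-fit on the small in-lab VAS set. A minimal sketch of that fine-tuning step in NumPy, using synthetic encoder features (all names, dimensions, and data here are illustrative assumptions, not the project's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Pretend encoder outputs: a large "source" corpus vs. a small VAS set
# whose feature distribution is slightly shifted.
H, C = 16, 2                                # feature dim, classes
src_X = rng.normal(size=(200, H))           # abundant source features
src_y = rng.integers(0, C, 200)
vas_X = src_X[:20] + rng.normal(0, 0.3, (20, H))  # shifted VAS features
vas_y = src_y[:20]

def fit_softmax(X, y, W=None, lr=0.1, steps=200):
    """Softmax regression via gradient descent; passing a pretrained W
    fine-tunes the head on the new (VAS) domain instead of starting cold."""
    n = len(X)
    if W is None:
        W = np.zeros((H, C))
    Y = np.eye(C)[y]                         # one-hot targets
    for _ in range(steps):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / n          # cross-entropy gradient step
    return W

W_src = fit_softmax(src_X, src_y)                             # pretrain
W_vas = fit_softmax(vas_X, vas_y, W=W_src.copy(), steps=50)   # adapt to VAS
acc = (softmax(vas_X @ W_vas).argmax(1) == vas_y).mean()
```

The pretrained weights `W_src` carry what the large corpus taught the head, and the short fine-tuning pass adapts it to the shifted VAS feature distribution with only a handful of labeled examples.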
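The "many-to-difference" idea in Aim 2 can be sketched as a weight-shared (symmetric) encoder applied to recordings from the two ends of a time period, with a head that scores only the difference of the two encodings, so stable individual traits cancel out. A minimal NumPy illustration (dimensions, weights, and data are hypothetical, not the project's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 13 acoustic features per frame, 32 hidden units.
F, H = 13, 32
Wx = rng.normal(0, 0.1, (H, F))   # input weights, shared by both encoders
Wh = rng.normal(0, 0.1, (H, H))   # recurrent weights, shared as well
w_out = rng.normal(0, 0.1, H)     # head applied to the encoding difference

def encode(seq):
    """Simple tanh RNN over a variable-length feature sequence.
    The SAME weights encode both recordings: this is the symmetry."""
    h = np.zeros(H)
    for x in seq:
        h = np.tanh(Wx @ x + Wh @ h)
    return h

def predict_decline(seq_t0, seq_t1):
    """Score the change between two ends of a time period. The head sees
    only encode(t1) - encode(t0), so per-person baseline characteristics
    that appear identically in both encodings subtract away."""
    return float(w_out @ (encode(seq_t1) - encode(seq_t0)))

# Two sparse, variable-length recordings from one synthetic user.
a = rng.normal(size=(5, F))
b = rng.normal(size=(9, F))
d = predict_decline(a, b)
```

By construction the score is antisymmetric (swapping the two recordings flips its sign, and identical recordings score zero), which is exactly the property that makes the model predict change over time rather than a per-user level.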

Project Outcomes

Journal articles (0)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)


Other publications by Xiaohui Liang


Other grants of Xiaohui Liang

SCH: INT: Collaborative Research: Exploiting Voice Assistant Systems for Early Detection of Cognitive Decline
  • Award number:
    10019452
  • Fiscal year:
    2019
  • Funding amount:
    $292,600
  • Project type:
SCH: INT: Collaborative Research: Exploiting Voice Assistant Systems for Early Detection of Cognitive Decline
  • Award number:
    10404684
  • Fiscal year:
    2019
  • Funding amount:
    $292,600
  • Project type:

Similar Overseas Grants

Rational design of rapidly translatable, highly antigenic and novel recombinant immunogens to address deficiencies of current snakebite treatments
  • Award number:
    MR/S03398X/2
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Fellowship
CAREER: FEAST (Food Ecosystems And circularity for Sustainable Transformation) framework to address Hidden Hunger
  • Award number:
    2338423
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Continuing Grant
Re-thinking drug nanocrystals as highly loaded vectors to address key unmet therapeutic challenges
  • Award number:
    EP/Y001486/1
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Research Grant
Metrology to address ion suppression in multimodal mass spectrometry imaging with application in oncology
  • Award number:
    MR/X03657X/1
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Fellowship
CRII: SHF: A Novel Address Translation Architecture for Virtualized Clouds
  • Award number:
    2348066
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Standard Grant
The Abundance Project: Enhancing Cultural & Green Inclusion in Social Prescribing in Southwest London to Address Ethnic Inequalities in Mental Health
  • Award number:
    AH/Z505481/1
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Research Grant
ERAMET - Ecosystem for rapid adoption of modelling and simulation METhods to address regulatory needs in the development of orphan and paediatric medicines
  • Award number:
    10107647
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    EU-Funded
BIORETS: Convergence Research Experiences for Teachers in Synthetic and Systems Biology to Address Challenges in Food, Health, Energy, and Environment
  • Award number:
    2341402
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Standard Grant
Ecosystem for rapid adoption of modelling and simulation METhods to address regulatory needs in the development of orphan and paediatric medicines
  • Award number:
    10106221
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    EU-Funded
Recite: Building Research by Communities to Address Inequities through Expression
  • Award number:
    AH/Z505341/1
  • Fiscal year:
    2024
  • Funding amount:
    $292,600
  • Project type:
    Research Grant