Computational Methods for the Study of American Sign Language Nonmanuals Using Very Large Databases
Basic Information
- Award number: 9199411
- Principal investigator:
- Amount: $319.4K
- Host institution:
- Host institution country: United States
- Project category:
- Fiscal year: 2016
- Funding country: United States
- Project period: 2016-01-01 to 2020-12-01
- Project status: Completed
- Source:
- Keywords: Academic achievement; Access to Information; Address; Agreement; Algorithms; American Sign Language; Articulation; Behavioral; Child; Code; Communication; Computational algorithm; Computer Vision Systems; Computer software; Computing Methodologies; Controlled Study; Databases; Detection; Devices; Dimensions; Emotions; Excision; Face; Facial Expression; Facial Muscles; Goals; Hand; Head; Hearing; Hearing Impaired Persons; Human; Image; Individual; Intervention; Life; Linguistics; Logic; Machine Learning; Manuals; Methods; Movement; Parents; Pattern Recognition; Production; Research; Research Personnel; Science; Semantics; Series; Shapes; Sign Language; Signal Transduction; Specific qualifier value; Speech; Structure; System; Teacher Professional Development; Technology; Testing; Time; Visual; Visual system structure; base; body position; comparative; computerized tools; deafness; design; experience; experimental study; face perception; innovation; instructor; interest; prevent; public health relevance; reconstruction; showing emotion; syntax; tool
Project Abstract
DESCRIPTION (provided by applicant): American Sign Language (ASL) grammar is specified by the manual sign (the hands) and by the nonmanual components, which include the face. Our general hypothesis is that nonmanual facial articulations perform significant semantic and syntactic functions by means of a more extensive set of facial expressions than that seen in other communicative systems (e.g., speech and emotion). This proposal will systematically study this hypothesis. Specifically, we will study the following three hypotheses needed to properly answer the general hypothesis stated above: First, we hypothesize (H1) that the facial muscles involved in the production of clause-level grammatical facial expressions in ASL and/or their intensity of activation are more extensive than those seen in speech and emotion. Second, we hypothesize (H2) that the temporal structure of these facial configurations is more extensive than that seen in speech and emotion. Finally, we hypothesize (H3) that eliminating these ASL nonmanual markers from the original videos drastically reduces the chances of correctly identifying the clause type of the signed sentence. To test these three hypotheses, we define a highly innovative approach based on the design of computational tools for the analysis of nonmanuals in signing. In particular, we will pursue the following three specific aims. In Aim 1, we will build a series of computer algorithms that automatically (i.e., without any human intervention) detect the face and its facial features, as well as the movements of the facial muscles and their intensity of activation. These tools will be integrated into ELAN, a standard software package used for linguistic analysis. These tools will then be used to test six specific hypotheses that address H1. In Aim 2, we define computer vision and machine learning algorithms to identify the temporal structure of ASL facial configurations and examine how these compare to those seen in speech and emotion. We will study six specific hypotheses to address H2. Alternative hypotheses are defined in both aims. Finally, in Aim 3 we define algorithms to automatically modify the original videos of facial expression in ASL to eliminate the identified nonmanual markers. Native users of ASL will complete behavioral experiments to examine H3 and test potential alternative hypotheses. Comparative analysis with non-signer controls will also be completed. These studies will thus further validate H1 and H2. We provide evidence of our ability to successfully complete the tasks in each of these aims. These aims address a critical need; at present, the study of nonmanuals must be carried out by hand. To be able to draw conclusive results, it is necessary to study thousands of videos. The proposed computational approach represents at least a 50-fold reduction in time compared to analysis done by hand.
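Aim 1 calls for fully automatic, frame-by-frame detection of the face, its features, and the activation of the facial muscles. As a minimal sketch of that kind of pipeline (the abstract does not specify an implementation, so the library choice, file paths, and landmark indices below are illustrative assumptions, not the project's actual tools), the following Python snippet uses MediaPipe Face Mesh to track facial landmarks in a signing video and exports one crude nonmanual signal, mouth aperture, as a CSV time series:

```python
# Minimal sketch, not the project's actual tools: per-frame facial landmark
# tracking with MediaPipe Face Mesh, exporting a crude nonmanual signal
# (mouth aperture) as a CSV time series. Paths and landmark indices are
# illustrative assumptions.
import csv

import cv2
import mediapipe as mp

VIDEO_PATH = "signer.mp4"       # hypothetical input video
UPPER_LIP, LOWER_LIP = 13, 14   # approximate inner-lip landmark indices


def mouth_aperture(landmarks):
    """Vertical distance between inner-lip landmarks (normalized image coords)."""
    return abs(landmarks[UPPER_LIP].y - landmarks[LOWER_LIP].y)


def extract_signal(video_path, out_csv="mouth_aperture.csv"):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    face_mesh = mp.solutions.face_mesh.FaceMesh(
        static_image_mode=False, max_num_faces=1, refine_landmarks=True)
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_sec", "mouth_aperture"])
        frame_idx = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_face_landmarks:  # a face was detected in this frame
                lms = result.multi_face_landmarks[0].landmark
                writer.writerow([frame_idx / fps, mouth_aperture(lms)])
            frame_idx += 1
    cap.release()
    face_mesh.close()


if __name__ == "__main__":
    extract_signal(VIDEO_PATH)
```

Since ELAN can import CSV time series as tracks, a signal extracted this way could be inspected alongside manual annotations, in the spirit of the ELAN integration described in Aim 1.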
Project Outcomes
Journal articles: 0
Monographs: 0
Research awards: 0
Conference papers: 0
Patents: 0
Other Publications by Aleix M Martinez
Other Grants by Aleix M Martinez
Computational Methods for the Study of American Sign Language Nonmanuals Using Very Large Databases
- Award number: 9054574
- Fiscal year: 2016
- Amount: $319.4K
- Project category:
Computational Methods for the Study of American Sign Language Nonmanuals Using Very Large Databases
- Award number: 9841303
- Fiscal year: 2016
- Amount: $319.4K
- Project category:
A Study of the Computational Space of Facial Expressions of Emotion
- Award number: 8142075
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
A Study of the Computational Space of Facial Expressions of Emotion
- Award number: 8494053
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
A Study of the Computational Space of Facial Expressions of Emotion
- Award number: 7946918
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
Computational Methods for Analysis of Mouth Shapes in Sign Languages
- Award number: 8109271
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
A Study of the Computational Space of Facial Expressions of Emotion
- Award number: 8266468
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
A Study of the Computational Space of Facial Expressions of Emotion
- Award number: 8669977
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
Computational Methods for Analysis of Mouth Shapes in Sign Languages
- Award number: 8101448
- Fiscal year: 2010
- Amount: $319.4K
- Project category:
Similar Overseas Grants
Improving access to information in perinatal women: Creating and piloting a needs-based information tools
- Award number: 23K16469
- Fiscal year: 2023
- Amount: $319.4K
- Project category: Grant-in-Aid for Early-Career Scientists
Is Better Access to Information Effective in Improving Labor Market Outcomes? Experimental Evidence
- Award number: 1954016
- Fiscal year: 2019
- Amount: $319.4K
- Project category: Standard Grant
Collaborative Research: CESER: EAGER: "FabWave" - A Pilot Manufacturing Cyberinfrastructure for Shareable Access to Information Rich Product Manufacturing Data
- Award number: 1812687
- Fiscal year: 2018
- Amount: $319.4K
- Project category: Standard Grant
Collaborative Research: CESER: EAGER: "FabWave" - A Pilot Manufacturing Cyberinfrastructure for Shareable Access to Information Rich Product Manufacturing Data
- Award number: 1812675
- Fiscal year: 2018
- Amount: $319.4K
- Project category: Standard Grant
Is Better Access to Information Effective in Improving Labor Market Outcomes? Experimental Evidence
- Award number: 1824465
- Fiscal year: 2018
- Amount: $319.4K
- Project category: Standard Grant
Index Herbariorum Upgrade: A Project to Improve Access to Information about the World's Plant and Fungal Collections Assets
- Award number: 1600051
- Fiscal year: 2016
- Amount: $319.4K
- Project category: Standard Grant
TC: Large: Collaborative Research: Facilitating Free and Open Access to Information on the Internet
- Award number: 1540066
- Fiscal year: 2015
- Amount: $319.4K
- Project category: Continuing Grant
INEQUALITY IN HIGHER EDUCATION OUTCOMES IN THE UK: SUBJECTIVE EXPECTATIONS, PREFERENCES, AND ACCESS TO INFORMATION
- Award number: ES/M008622/1
- Fiscal year: 2015
- Amount: $319.4K
- Project category: Research Grant
Collaborative Access to Information about Physical Objects via See-Through Displays
- Award number: 413142-2011
- Fiscal year: 2013
- Amount: $319.4K
- Project category: Strategic Projects - Group
Study on the social system for guaranteeing equal access to information in Scandinavia as human rights protection system
- Award number: 24530777
- Fiscal year: 2012
- Amount: $319.4K
- Project category: Grant-in-Aid for Scientific Research (C)