Collaborative Research: AF: Small: A Unified Framework for Analyzing Adaptive Stochastic Optimization Methods Based on Probabilistic Oracles
Basic Information
- Award Number: 2140057
- Principal Investigator:
- Amount: $250,000
- Host Institution:
- Host Institution Country: United States
- Grant Type: Standard Grant
- Fiscal Year: 2022
- Funding Country: United States
- Project Period: 2022-01-15 to 2024-12-31
- Project Status: Completed
- Source:
- Keywords:
Project Abstract
Data science and machine learning have transformed modern science, engineering, and business. One of the pillars of modern-day machine-learning technology is mathematical optimization, which is the methodology that drives the process of learning from available and/or real-time generated data. Unfortunately, despite the successes of certain optimization techniques, large-scale learning remains extremely expensive in terms of time and energy, which puts the ability to train machines to perform certain fundamental tasks exclusively in the hands of those with access to extreme-scale supercomputing facilities. A significant deficiency of many contemporary techniques is that they "launch" an algorithm with a prescribed "trajectory," despite the fact that the actual trajectory the algorithm will follow depends on unknown factors. Contemporary optimization techniques for machine learning essentially account for this by "tuning" algorithmic parameters, which means that the target is typically only hit after numerous expensive misses. Another significant deficiency of contemporary techniques is the restrictive set of assumptions often made about the optimization being performed, which typically includes the assumption that the machine-learning model is being trained with uncorrupted data. Modern real-world applications are far more complex.

This project will explore the design and analysis of adaptive ("self-tuning") optimization techniques for machine learning and related topics. One goal is to produce adaptive algorithms with rigorous guarantees that can avoid the extreme amounts of wasteful computation required by contemporary algorithms for parameter tuning. Another goal is to extend the use of these algorithms to settings with imperfect data/information, which may be due to biased function information, corrupted data, or novel techniques for approximating the objective. Finally, many applications ultimately require the learning process or model to satisfy some explicit or implicit constraints. Optimization methods for such machine-learning applications are still in their infancy, largely due to their more complicated nature and further dependence on algorithmic parameters. This project aims to design a unified framework for analyzing adaptive stochastic optimization methods that will offer researchers and practitioners a set of easy-to-use tools for designing next-generation algorithms for cutting-edge applications.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
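To make the contrast between fixed, hand-tuned learning rates and the adaptive ("self-tuning") methods described above more concrete, the sketch below shows one common pattern from this line of work: a step size that expands after successful steps and contracts after unsuccessful ones, driven only by noisy function and gradient estimates from a probabilistic oracle. This is a minimal illustrative sketch, not code from the award; the oracle model, the sufficient-decrease test, and all names and parameters (noisy_oracle, adaptive_step_search, alpha0, gamma, theta) are assumptions made for the example.

```python
import numpy as np

def noisy_oracle(f_true, grad_true, x, noise=0.01, rng=None):
    """Probabilistic oracle: return function/gradient estimates that are
    accurate only up to random noise (an illustrative assumption)."""
    rng = np.random.default_rng() if rng is None else rng
    f_est = f_true(x) + noise * rng.standard_normal()
    g_est = grad_true(x) + noise * rng.standard_normal(x.shape)
    return f_est, g_est

def adaptive_step_search(f_true, grad_true, x0, alpha0=1.0, gamma=2.0,
                         theta=0.1, max_iter=200, seed=0):
    """Self-tuning step-size loop: the step size alpha expands after a
    successful (sufficient-decrease) step and contracts otherwise, so no
    learning-rate grid search is needed. All quantities used by the test
    come from the noisy oracle, mimicking the stochastic setting."""
    rng = np.random.default_rng(seed)
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iter):
        f_est, g_est = noisy_oracle(f_true, grad_true, x, rng=rng)
        x_trial = x - alpha * g_est
        f_trial, _ = noisy_oracle(f_true, grad_true, x_trial, rng=rng)
        # Sufficient-decrease test based on (noisy) estimates.
        if f_trial <= f_est - theta * alpha * np.dot(g_est, g_est):
            x, alpha = x_trial, gamma * alpha   # successful step: move and expand
        else:
            alpha /= gamma                      # unsuccessful step: contract
    return x

# Toy usage: minimize a simple quadratic despite the noisy oracle.
f = lambda x: 0.5 * float(np.dot(x, x))
g = lambda x: x
print(adaptive_step_search(f, g, x0=np.ones(5)))  # expected to land near the origin
```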
Project Outcomes
Journal Articles (2)
Monographs (0)
Research Awards (0)
Conference Papers (0)
Patents (0)
Nesterov Accelerated Shuffling Gradient Method for Convex Optimization
- DOI:
- Published: 2022-02
- Journal:
- Impact Factor: 0
- Authors: Trang H. Tran;Lam M. Nguyen;K. Scheinberg
- Corresponding Author: Trang H. Tran;Lam M. Nguyen;K. Scheinberg
First- and second-order high probability complexity bounds for trust-region methods with noisy oracles
- DOI: 10.1007/s10107-023-01999-5
- Published: 2022-05
- Journal:
- Impact Factor: 2.7
- Authors: Liyuan Cao;A. Berahas;K. Scheinberg
- Corresponding Author: Liyuan Cao;A. Berahas;K. Scheinberg
Other Publications by Katya Scheinberg
Novel and Efficient Approximations for Zero-One and Ranking Losses of Linear Classifiers
- DOI: 10.1007/s10013-025-00767-6
- Published: 2025-07-21
- Journal:
- Impact Factor: 0.700
- Authors: Hiva Ghanbari;Minhan Li;Katya Scheinberg
- Corresponding Author: Katya Scheinberg
Efficient block-coordinate descent algorithms for the Group Lasso
- DOI: 10.1007/s12532-013-0051-x
- Published: 2013-03-31
- Journal:
- Impact Factor: 3.600
- Authors: Zhiwei Qin;Katya Scheinberg;Donald Goldfarb
- Corresponding Author: Donald Goldfarb
OPTIMA Mathematical Programming Society Newsletter 79
- DOI:
- Published: 2009
- Journal:
- Impact Factor: 0
- Authors: Katya Scheinberg
- Corresponding Author: Katya Scheinberg
Other Grants by Katya Scheinberg
Collaborative Research: AF: Small: Adaptive Optimization of Stochastic and Noisy Function
- Award Number: 2008434
- Fiscal Year: 2020
- Amount: $250,000
- Grant Type: Standard Grant
Randomized Models for Nonlinear Optimization: Theoretical Foundations and Practical Numerical Methods
- Award Number: 1319356
- Fiscal Year: 2013
- Amount: $250,000
- Grant Type: Standard Grant
Similar NSFC Grants
Research on Quantum Field Theory without a Lagrangian Description
- Award Number: 24ZR1403900
- Year Approved: 2024
- Amount: ¥0
- Grant Type: Provincial/Municipal Project
Cell Research
- Award Number: 31224802
- Year Approved: 2012
- Amount: ¥240,000
- Grant Type: Special Fund Project
Cell Research
- Award Number: 31024804
- Year Approved: 2010
- Amount: ¥240,000
- Grant Type: Special Fund Project
Cell Research
- Award Number: 30824808
- Year Approved: 2008
- Amount: ¥240,000
- Grant Type: Special Fund Project
Research on the Rapid Growth Mechanism of KDP Crystal
- Award Number: 10774081
- Year Approved: 2007
- Amount: ¥450,000
- Grant Type: General Program
Similar Overseas Grants
Collaborative Research: AF: Medium: The Communication Cost of Distributed Computation
- Award Number: 2402836
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Continuing Grant
Collaborative Research: AF: Medium: Foundations of Oblivious Reconfigurable Networks
- Award Number: 2402851
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Continuing Grant
Collaborative Research: AF: Small: New Directions in Algorithmic Replicability
- Award Number: 2342244
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
Collaborative Research: AF: Small: Exploring the Frontiers of Adversarial Robustness
- Award Number: 2335411
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
NSF-BSF: Collaborative Research: AF: Small: Algorithmic Performance through History Independence
- Award Number: 2420942
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
Collaborative Research: AF: Medium: Algorithms Meet Machine Learning: Mitigating Uncertainty in Optimization
- Award Number: 2422926
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Continuing Grant
Collaborative Research: AF: Small: Structural Graph Algorithms via General Frameworks
- Award Number: 2347322
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
Collaborative Research: AF: Small: Real Solutions of Polynomial Systems
- Award Number: 2331401
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
Collaborative Research: AF: Small: Real Solutions of Polynomial Systems
- Award Number: 2331400
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Standard Grant
Collaborative Research: AF: Medium: Fast Combinatorial Algorithms for (Dynamic) Matchings and Shortest Paths
- Award Number: 2402283
- Fiscal Year: 2024
- Amount: $250,000
- Grant Type: Continuing Grant