CAREER: Parameter-free Optimization Algorithms for Machine Learning
Basic Information
- Award Number: 2046096
- Principal Investigator: Francesco Orabona
- Amount: $583,900
- Host Institution:
- Host Institution Country: United States
- Project Type: Continuing Grant
- Fiscal Year: 2021
- Funding Country: United States
- Project Period: 2021-02-15 to 2026-01-31
- Status: Ongoing
- Source:
- Keywords:
Project Abstract
Machine Learning (ML) has been described as the fuel of the next industrial revolution. Yet, current state-of-the-art ML algorithms still rely heavily on having a human in the loop in order to work properly. Indeed, training an ML algorithm requires significant human intervention to tweak and tune its many knobs, and most of the time this tuning is carried out without any theoretical guidelines. The most common options for ML practitioners are to follow their intuition or to exhaustively evaluate all the possibilities, which makes the overall time needed to train these algorithms difficult to predict. In this project, the investigator aims to design truly parameter-free training algorithms for ML, removing the burden of human parameter tuning and making ML algorithms more scalable. The investigator also proposes an Education Plan designed to close the gap between theoretical and applied ML in the minds of most students. It will involve a collaboration with the PROgram in Mathematics for Young Scientists (PROMYS), to introduce basic concepts of the theory of machine learning to both high school students and teachers, and with Smith College, a women's college.

The project stems from new methods the investigator recently introduced to design optimal parameter-free optimization algorithms, e.g., Stochastic Gradient Descent without learning rates, for the particular case of convex functions. In this project, the investigator proposes to go beyond convex functions. In particular, the investigator plans to pursue the following inter-related aims:

1. Computational complexity of learning without prior knowledge. The objective is to fully characterize the computational complexity of learning, without assuming knowledge of unknown quantities, in worst and "easy" cases. In other words, if a learning algorithm needs parameters to achieve optimal performance, the goal is to fully characterize the overall computational complexity, including the search for the optimal parameters.
2. Non-convex stochastic optimization without parameters. The project's goal is to generalize recent results on parameter-free optimization to handle the non-convex problems in modern ML algorithms. The investigator plans to use a novel optimization framework that mixes elements of Online Mirror Descent with Dual Averaging, the two frameworks used to design optimization algorithms.
3. Reducing parameters in optimization for Deep Learning. Deep Learning models are the most parameter-heavy in ML. The project considers three applications of the project's research to widely used deep-learning algorithms: adaptive scaling heuristics, saddle-point optimization, and Bayesian Optimization with Gaussian Processes.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
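To make the "Stochastic Gradient Descent without learning rates" idea concrete, below is a minimal sketch of the coin-betting construction that underlies the investigator's earlier parameter-free results: a Krichevsky-Trofimov (KT) bettor applied per coordinate. It assumes each gradient coordinate lies in [-1, 1] and the objective is convex; the function names and the toy objective are illustrative, not taken from the project itself.

```python
import numpy as np

def kt_parameter_free_sgd(grad_fn, dim, num_steps, eps=1.0):
    """Coordinate-wise Krichevsky-Trofimov coin betting (a sketch).

    There is no learning rate: each iterate is a "bet", a signed fraction
    of the wealth accumulated so far. Assumes each gradient coordinate
    lies in [-1, 1]; eps is the initial wealth and enters the regret bound
    only logarithmically, so it is not a tuning knob in the usual sense.
    """
    wealth = np.full(dim, eps)   # per-coordinate wealth, starts at eps
    grad_sum = np.zeros(dim)     # running sum of negated gradients
    x = np.zeros(dim)            # current iterate (the bet); x_1 = 0
    avg = np.zeros(dim)          # running average of iterates
    for t in range(1, num_steps + 1):
        g = np.clip(grad_fn(x), -1.0, 1.0)  # enforce the |g| <= 1 assumption
        avg += (x - avg) / t                # guarantees hold for the average iterate
        wealth -= g * x                     # settle the bet of round t
        grad_sum -= g
        x = grad_sum / (t + 1) * wealth     # KT betting fraction times current wealth
    return avg

# Toy usage: minimize the convex, 1-Lipschitz f(x) = ||x - 3||_1,
# with no step size supplied anywhere.
target = np.full(5, 3.0)
subgradient = lambda x: np.sign(x - target)
print(kt_parameter_free_sgd(subgradient, dim=5, num_steps=10000))
```

The bounded-gradient assumption is the one real restriction in this sketch; the investigator's later coin-betting algorithms for deep learning (e.g., COCOB) replace it with running per-coordinate estimates of the gradient range.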
Project Outcomes
Journal Articles (12)
Monographs (0)
Research Awards (0)
Conference Papers (0)
Patents (0)
On the Last Iterate Convergence of Momentum Methods
- DOI:
- Publication Date: 2021-02
- Journal:
- Impact Factor: 0
- Authors: Xiaoyun Li; Mingrui Liu; Francesco Orabona
- Corresponding Authors: Xiaoyun Li; Mingrui Liu; Francesco Orabona
Better Parameter-free Stochastic Optimization with ODE Updates for Coin-Betting
- DOI:
- Publication Date: 2022
- Journal:
- Impact Factor: 0
- Authors: Chen, Keyi; Langford, John; Orabona, Francesco
- Corresponding Author: Orabona, Francesco
Minimax Optimal Quantile and Semi-Adversarial Regret via Root-Logarithmic Regularizers
- DOI:
- Publication Date: 2021
- Journal:
- Impact Factor: 0
- Authors: Negrea, Jeffrey; Bilodeau, Blair; Campolongo, Nicolò; Orabona, Francesco; Roy, Dan
- Corresponding Author: Roy, Dan
Robustness to Unbounded Smoothness of Generalized SignSGD
- DOI: 10.48550/arxiv.2208.11195
- Publication Date: 2022-08
- Journal:
- Impact Factor: 0
- Authors: M. Crawshaw; Mingrui Liu; Francesco Orabona; W. Zhang; Zhenxun Zhuang
- Corresponding Authors: M. Crawshaw; Mingrui Liu; Francesco Orabona; W. Zhang; Zhenxun Zhuang
Understanding AdamW through Proximal Methods and Scale-Freeness
- DOI:
- Publication Date: 2022-01
- Journal:
- Impact Factor: 0
- Authors: Zhenxun Zhuang; Mingrui Liu; Ashok Cutkosky; Francesco Orabona
- Corresponding Authors: Zhenxun Zhuang; Mingrui Liu; Ashok Cutkosky; Francesco Orabona
Other Publications by Francesco Orabona
Implicit Interpretation of Importance Weight Aware Updates
- DOI: 10.48550/arxiv.2307.11955
- Publication Date: 2023
- Journal:
- Impact Factor: 0
- Authors: Keyi Chen; Francesco Orabona
- Corresponding Author: Francesco Orabona
3 Warm-Up: From Betting to One-Dimensional Online Linear Optimization
- DOI:
- Publication Date: 2016
- Journal:
- Impact Factor: 0
- Authors: Francesco Orabona
- Corresponding Author: Francesco Orabona
A theoretical framework for transfer of knowledge across modalities in artificial and biological systems
- DOI:
- Publication Date: 2009
- Journal:
- Impact Factor: 0
- Authors: Francesco Orabona; B. Caputo; A. Fillbrandt; F. Ohl
- Corresponding Author: F. Ohl
Regression-tree Tuning in a Streaming Setting
- DOI:
- Publication Date: 2013
- Journal:
- Impact Factor: 0
- Authors: Samory Kpotufe; Francesco Orabona
- Corresponding Author: Francesco Orabona
Discrete camera calibration from pixel streams
- DOI: 10.1016/j.cviu.2009.03.009
- Publication Date: 2010-02-01
- Journal:
- Impact Factor:
- Authors: Etienne Grossmann; José António Gaspar; Francesco Orabona
- Corresponding Author: Francesco Orabona
Other Grants by Francesco Orabona
Collaborative Research: TRIPODS Institute for Optimization and Learning
- Award Number: 1925930
- Fiscal Year: 2019
- Amount: $583,900
- Project Type: Continuing Grant
AF: Small: Collaborative Research: New Representations for Learning Algorithms and Secure Computation
- Award Number: 1908111
- Fiscal Year: 2019
- Amount: $583,900
- Project Type: Standard Grant
Collaborative Research: TRIPODS Institute for Optimization and Learning
- Award Number: 1740762
- Fiscal Year: 2018
- Amount: $583,900
- Project Type: Continuing Grant
Similar International Grants
Adaptive optimization: parameter-free self-tuning algorithms beyond smoothness and convexity
- Award Number: 24K20737
- Fiscal Year: 2024
- Amount: $583,900
- Project Type: Grant-in-Aid for Early-Career Scientists
NSF-BSF: AF: Small: Parameter-Free Stochastic Optimization via Trajectory Cues
- Award Number: 2239527
- Fiscal Year: 2023
- Amount: $583,900
- Project Type: Standard Grant
Parameter-free Algorithms for Reinforcement Learning
- Award Number: 558512-2021
- Fiscal Year: 2022
- Amount: $583,900
- Project Type: Alexander Graham Bell Canada Graduate Scholarships - Doctoral
Parameter-Free Stochastic Gradient Descent: Fast, Self-Tuning Algorithms for Training Deep Neural Networks
- Award Number: 547242-2020
- Fiscal Year: 2022
- Amount: $583,900
- Project Type: Postgraduate Scholarships - Doctoral
Parameter-free Algorithms for Reinforcement Learning
- Award Number: 558512-2021
- Fiscal Year: 2021
- Amount: $583,900
- Project Type: Alexander Graham Bell Canada Graduate Scholarships - Doctoral
Parameter-Free Stochastic Gradient Descent: Fast, Self-Tuning Algorithms for Training Deep Neural Networks
- Award Number: 547242-2020
- Fiscal Year: 2021
- Amount: $583,900
- Project Type: Postgraduate Scholarships - Doctoral
Parameter-Free Optimization Algorithms for Underdetermined Linear Inverse Problems
- Award Number: 20K23324
- Fiscal Year: 2020
- Amount: $583,900
- Project Type: Grant-in-Aid for Research Activity Start-up
Parameter-Free Stochastic Gradient Descent: Fast, Self-Tuning Algorithms for Training Deep Neural Networks
- Award Number: 547242-2020
- Fiscal Year: 2020
- Amount: $583,900
- Project Type: Postgraduate Scholarships - Doctoral
A Label-Free, Many-Parameter Benchtop Platform for Functionally-Preserved, Viable Cancer Stem Cell Isolation and Biomarker Discovery to Probe Urothelial Carcinoma of the Bladder
- Award Number: 10086817
- Fiscal Year: 2019
- Amount: $583,900
- Project Type:
Towards a Parameter-Free Theory for Electrochemical Phenomena at the Nanoscale (NanoEC)
- Award Number: EP/P033555/1
- Fiscal Year: 2018
- Amount: $583,900
- Project Type: Fellowship