Non-convex Optimization for Machine Learning: Theory and Methods
Basic Information
- Grant Number: RGPIN-2019-06167
- Principal Investigator: Erdogdu, Murat
- Amount: $28,400
- Host Institution:
- Host Institution Country: Canada
- Program: Discovery Grants Program - Individual
- Fiscal Year: 2020
- Funding Country: Canada
- Project Period: 2020-01-01 to 2021-12-31
- Status: Completed
- Source:
- Keywords:
Project Summary
Non-convex optimization has become an indispensable component of artificial intelligence
due to the structural properties of popular machine learning models.
Owing to their key role and empirical success in numerous learning tasks,
non-convex optimization methods have been a major focus of recent optimization research.
Many important characteristics of machine learning models,
such as generalization and fast trainability,
are inherited from these optimization methods;
thus, a good understanding of these algorithms is crucial.
To this end, we use appropriate tools from statistics, diffusion theory,
and differential geometry to explain the empirical success of popular non-convex methods.
We further propose new paradigms for designing more efficient algorithms in this regime,
where scalability is a structural issue yet one that can be resolved by appealing to non-convex methods.
The main purpose of my research agenda is to improve our understanding of
non-convex algorithms, which have become the dominant optimization tools in machine learning.
We further pursue several directions that build on our theoretical findings to design fast and efficient
algorithms for practical problems. The overall research plan can be broken into three parts,
to be pursued simultaneously:
1- Theoretical analysis of commonly used non-convex optimization algorithms,
2- Design of efficient optimization algorithms for machine learning,
3- Application of these methods to real-world problems.
For example, in recent work we established a non-asymptotic analysis of
discretized diffusions for non-convex optimization tasks.
Our results provide explicit, finite-time convergence rates to global minima (item 1 above).
Based on this, we show that different diffusions are suitable for optimizing different classes of convex
and non-convex functions. This allows us to design diffusions suitable for globally optimizing convex and
non-convex functions not covered by the existing literature (item 2 above).
We complement these results by showing that diffusions designed for
a specific objective function can attain better global convergence
guarantees, leading to problem-specific algorithm design (item 3 above).
In this proposal, we focus on two popular non-convex methods in machine learning:
1- diffusion-based and 2- matrix-factorization-based optimization.
Early work on diffusion-based non-convex optimization focused on a specific
diffusion known as Langevin dynamics. Our work considers general Ito diffusions,
which provide several benefits, including faster convergence,
wider applicability, and stronger convergence guarantees.
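To make the diffusion-based direction concrete, the following is a minimal sketch of the Euler-Maruyama discretization of Langevin dynamics, dX_t = -grad f(X_t) dt + sqrt(2/beta) dB_t, applied to a simple non-convex objective. The objective, step size, and inverse temperature are illustrative choices for exposition, not parameters from the proposal; a general Ito diffusion would replace the constant diffusion coefficient with state-dependent drift and diffusion terms.

import numpy as np

def langevin_step(x, grad_f, step_size, beta, rng):
    """One Euler-Maruyama step of the Langevin diffusion
    dX_t = -grad f(X_t) dt + sqrt(2 / beta) dB_t."""
    noise = rng.standard_normal(x.shape)
    return x - step_size * grad_f(x) + np.sqrt(2.0 * step_size / beta) * noise

# Illustrative non-convex objective: a double well in each coordinate.
def f(x):
    return np.sum((x ** 2 - 1.0) ** 2)

def grad_f(x):
    return 4.0 * x * (x ** 2 - 1.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(2)        # random initial point
step_size, beta = 1e-2, 10.0      # discretization step and inverse temperature
for _ in range(5000):
    x = langevin_step(x, grad_f, step_size, beta, rng)

print("approximate minimizer:", x, "objective value:", f(x))

With a small step size and a large inverse temperature beta, the iterates concentrate near a global minimizer of f; the finite-time, global convergence rates mentioned above quantify how quickly this happens for discretizations of this kind.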
We further study widely used matrix-factorization-based non-convex methods
and establish their theoretical guarantees. For both directions, we build on our theory
to design efficient and scalable algorithms for various machine learning problems.
Applications of these algorithms include recommender systems,
inference in graphical models, and neural networks.
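To illustrate the matrix-factorization direction, here is a minimal sketch of fitting a low-rank factorization to a partially observed ratings matrix by gradient descent on a regularized, non-convex least-squares objective, in the style of a simple recommender system. The ratings matrix, rank, learning rate, and regularization strength are hypothetical values chosen for exposition, not data or settings from the proposal.

import numpy as np

rng = np.random.default_rng(1)

# Illustrative partially observed ratings matrix (0 marks a missing entry).
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4]], dtype=float)
mask = R > 0

n_users, n_items, rank = R.shape[0], R.shape[1], 2
U = 0.1 * rng.standard_normal((n_users, rank))   # user factors
V = 0.1 * rng.standard_normal((n_items, rank))   # item factors

lr, reg = 0.01, 0.05
for _ in range(2000):
    E = mask * (R - U @ V.T)        # reconstruction error on observed entries only
    U += lr * (E @ V - reg * U)     # gradient step on the non-convex objective
    V += lr * (E.T @ U - reg * V)

print("reconstructed ratings:\n", np.round(U @ V.T, 2))

Although the objective is non-convex in (U, V) jointly, factored gradient methods of this form are the scalable workhorse behind many recommender systems, and the theoretical guarantees mentioned above concern methods of exactly this type.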
Project Outcomes
Journal articles: 0
Monographs: 0
Research awards: 0
Conference papers: 0
Patents: 0
Other Publications by Erdogdu, Murat
The effects of acute moderate intensity training on hematological parameters in elite para-badminton athletes
- DOI: 10.31083/jomh.2021.106
- Published: 2021-08-29
- Journal:
- Impact factor: 0.7
- Authors: Erdogdu, Murat; Yuksel, Mehmet Fatih; Sevindi, Tarik
- Corresponding author: Sevindi, Tarik
Other Grants Held by Erdogdu, Murat
Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2022
- Amount: $28,400
- Program: Discovery Grants Program - Individual

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2021
- Amount: $28,400
- Program: Discovery Grants Program - Individual

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2019
- Amount: $28,400
- Program: Discovery Grants Program - Individual

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: DGECR-2019-00127
- Fiscal Year: 2019
- Amount: $28,400
- Program: Discovery Launch Supplement
Similar Overseas Grants
Collaborative Research: Consensus and Distributed Optimization in Non-Convex Environments with Applications to Networked Machine Learning
- Grant Number: 2240789
- Fiscal Year: 2023
- Amount: $28,400
- Program: Standard Grant

Collaborative Research: Consensus and Distributed Optimization in Non-Convex Environments with Applications to Networked Machine Learning
- Grant Number: 2240788
- Fiscal Year: 2023
- Amount: $28,400
- Program: Standard Grant

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2022
- Amount: $28,400
- Program: Discovery Grants Program - Individual

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2021
- Amount: $28,400
- Program: Discovery Grants Program - Individual

Discrete convex approximation on non-linear discrete optimization
- Grant Number: 21K04533
- Fiscal Year: 2021
- Amount: $28,400
- Program: Grant-in-Aid for Scientific Research (C)

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: RGPIN-2019-06167
- Fiscal Year: 2019
- Amount: $28,400
- Program: Discovery Grants Program - Individual

CAREER: Optimization Landscape for Non-convex Functions - Towards Provable Algorithms for Neural Networks
- Grant Number: 1845171
- Fiscal Year: 2019
- Amount: $28,400
- Program: Continuing Grant

Nonlinear non-convex optimization algorithm design for robust transceiver optimization in multi-user multi-antenna cloud radio access networks
- Grant Number: 517334-2018
- Fiscal Year: 2019
- Amount: $28,400
- Program: Postdoctoral Fellowships

Simulation and Optimization of Rate-Independent Systems with Non-Convex Energies
- Grant Number: 423630709
- Fiscal Year: 2019
- Amount: $28,400
- Program: Priority Programmes

Non-convex Optimization for Machine Learning: Theory and Methods
- Grant Number: DGECR-2019-00127
- Fiscal Year: 2019
- Amount: $28,400
- Program: Discovery Launch Supplement