Non-convex Optimization for Machine Learning: Theory and Methods

Basic Information

  • Grant number:
    RGPIN-2019-06167
  • Principal investigator:
  • Amount:
    $28,400
  • Host institution:
  • Host institution country:
    Canada
  • Program type:
    Discovery Grants Program - Individual
  • Fiscal year:
    2019
  • Funding country:
    Canada
  • Project period:
    2019-01-01 to 2020-12-31
  • Project status:
    Completed

Project Summary

Non-convex optimization has become an indispensable component of artificial intelligence due to the structural properties of popular machine learning models. Owing to their key role and empirical success in numerous learning tasks, non-convex methods have been a major focus of recent optimization research. Many important characteristics of machine learning models, such as generalization and fast trainability, are inherited from these optimization methods; thus, a good understanding of these algorithms is crucial. To this end, we use appropriate tools from statistics, diffusion theory, and differential geometry to explain the empirical success of popular non-convex methods. We further propose new paradigms for designing more efficient algorithms in this regime, where scalability is a structural issue yet can be resolved by appealing to non-convex methods.

The main purpose of my research agenda is to improve our understanding of non-convex algorithms, which have become the dominant optimization tools in machine learning. We further pursue several directions that build on our theoretical findings to design fast and efficient algorithms for practical problems. The overall research plan can be broken into three sections, to be pursued simultaneously:

1- Theoretical analysis of commonly used non-convex optimization algorithms,
2- Design of efficient optimization algorithms for machine learning,
3- Application of these methods to real problems.

For example, in recent work we established a non-asymptotic analysis of discretized diffusions for non-convex optimization tasks. Our results provide explicit, finite-time convergence rates to global minima (item 1 above). Based on this, we show that different diffusions are suitable for optimizing different classes of convex and non-convex functions. This allows us to design diffusions for globally optimizing convex and non-convex functions not covered by the existing literature (item 2 above). We complement these results by showing that diffusions designed for a specific objective function can attain better global convergence guarantees, leading to problem-specific algorithm design (item 3 above).

In this proposal, we focus on two popular families of non-convex methods in machine learning: 1- diffusion-based and 2- matrix-factorization-based optimization.

Early work on diffusion-based non-convex optimization focused on a specific diffusion, Langevin dynamics. Our work considers general Ito diffusions, which provide various benefits including fast convergence, wide applicability, and better convergence properties. We further study widely used matrix-factorization-based non-convex methods and establish their theoretical guarantees. For both directions, we build on our theory to design efficient and scalable algorithms for various machine learning problems. Applications of these algorithms include recommender systems, inference in graphical models, and neural networks.
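To make the two families above concrete, here is a minimal illustrative sketch (not code from the project) of the diffusion-based approach: the unadjusted Langevin algorithm, i.e. an Euler-Maruyama discretization of the Langevin diffusion dX_t = -∇f(X_t) dt + sqrt(2/β) dB_t, run on a toy non-convex objective. The function names, step size, inverse temperature β, and test objective are illustrative assumptions, not material from the proposal.

```python
import numpy as np

def langevin_minimize(f, grad_f, x0, step=1e-3, beta=10.0, n_iters=50_000, seed=0):
    """Unadjusted Langevin algorithm (illustrative): Euler-Maruyama discretization
    of dX_t = -grad f(X_t) dt + sqrt(2/beta) dB_t, tracking the best iterate seen."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_val = x.copy(), f(x)
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        # gradient step plus Gaussian noise scaled by the inverse temperature beta
        x = x - step * grad_f(x) + np.sqrt(2.0 * step / beta) * noise
        val = f(x)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Toy non-convex objective with many local minima (illustrative only).
f = lambda x: float(np.sum(x ** 2) + 2.0 * np.sum(np.sin(3.0 * x)))
grad_f = lambda x: 2.0 * x + 6.0 * np.cos(3.0 * x)
x_best, f_best = langevin_minimize(f, grad_f, x0=np.full(2, 3.0))
```

The injected noise is what lets the iterates escape shallow local minima, which is the mechanism behind the finite-time global convergence rates discussed above. For the matrix-factorization direction, the sketch below runs plain gradient descent on the standard non-convex low-rank objective 0.5·||mask∘(R - UVᵀ)||_F² + 0.5·λ(||U||_F² + ||V||_F²) commonly used in recommender systems; again, the names and hyperparameters are assumptions for illustration, not the proposal's algorithm.

```python
import numpy as np

def factorize(R, mask, rank=5, step=1e-2, reg=1e-2, n_iters=2_000, seed=0):
    """Gradient descent on the non-convex objective
    0.5*||mask*(R - U V^T)||_F^2 + 0.5*reg*(||U||_F^2 + ||V||_F^2)."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(n_iters):
        E = mask * (U @ V.T - R)  # residual on observed entries only
        U, V = U - step * (E @ V + reg * U), V - step * (E.T @ U + reg * V)
    return U, V
```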

Project Outcomes

Journal articles (0)
Monographs (0)
Research awards (0)
Conference papers (0)
Patents (0)


Other publications by Erdogdu, Murat

The effects of acute moderate intensity training on hematological parameters in elite para-badminton athletes
  • DOI:
    10.31083/jomh.2021.106
  • Publication date:
    2021-08-29
  • Journal:
  • Impact factor:
    0.7
  • Authors:
    Erdogdu, Murat;Yuksel, Mehmet Fatih;Sevindi, Tarik
  • Corresponding author:
    Sevindi, Tarik

Other grants held by Erdogdu, Murat

Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2022
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2021
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2020
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    DGECR-2019-00127
  • Fiscal year:
    2019
  • Funding amount:
    $28,400
  • Program type:
    Discovery Launch Supplement

Similar international grants

Collaborative Research: Consensus and Distributed Optimization in Non-Convex Environments with Applications to Networked Machine Learning
  • Grant number:
    2240789
  • Fiscal year:
    2023
  • Funding amount:
    $28,400
  • Program type:
    Standard Grant
Collaborative Research: Consensus and Distributed Optimization in Non-Convex Environments with Applications to Networked Machine Learning
  • Grant number:
    2240788
  • Fiscal year:
    2023
  • Funding amount:
    $28,400
  • Program type:
    Standard Grant
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2022
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2021
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
Discrete convex approximation on non-linear discrete optimization
  • Grant number:
    21K04533
  • Fiscal year:
    2021
  • Funding amount:
    $28,400
  • Program type:
    Grant-in-Aid for Scientific Research (C)
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    RGPIN-2019-06167
  • Fiscal year:
    2020
  • Funding amount:
    $28,400
  • Program type:
    Discovery Grants Program - Individual
CAREER: Optimization Landscape for Non-convex Functions - Towards Provable Algorithms for Neural Networks
  • Grant number:
    1845171
  • Fiscal year:
    2019
  • Funding amount:
    $28,400
  • Program type:
    Continuing Grant
Nonlinear non-convex optimization algorithm design for robust transceiver optimization in multi-user multi-antenna cloud radio access networks
  • Grant number:
    517334-2018
  • Fiscal year:
    2019
  • Funding amount:
    $28,400
  • Program type:
    Postdoctoral Fellowships
Non-convex Optimization for Machine Learning: Theory and Methods
  • Grant number:
    DGECR-2019-00127
  • Fiscal year:
    2019
  • Funding amount:
    $28,400
  • Program type:
    Discovery Launch Supplement
Simulation and Optimization of Rate-Independent Systems with Non-Convex Energies
  • Grant number:
    423630709
  • Fiscal year:
    2019
  • Funding amount:
    $28,400
  • Program type:
    Priority Programmes