Research

Research Interests

My current interests include:

  • Optimization and Game Theory

  • Machine Learning

  • Optimal Transport

  • Economic and Social Networks

Preprints

  1. A nonasymptotic analysis of gradient descent ascent for nonconvex-concave minimax problems
    Tianyi Lin, Chi Jin and Michael I. Jordan.
    Under revision. [SSRN]

  2. Explicit second-order min-max optimization methods with optimal convergence guarantee
    Tianyi Lin, Panayotis Mertikopoulos and Michael I. Jordan.
    Under revision. [ArXiv]

  3. Curvature-independent last-iterate convergence for games on Riemannian manifolds
    Yang Cai, Michael I. Jordan, Tianyi Lin, Argyris Oikonomou and Emmanouil-Vasileios Vlatakis-Gkaragkounis.
    Submitted. [ArXiv]

Selected Refereed Journal Publications

  1. Perseus: A simple and optimal high-order method for variational inequalities
    Tianyi Lin and Michael I. Jordan.
    Mathematical Programming, 2024.
    [ArXiv] [DOI]

  2. Doubly optimal no-regret online learning in strongly monotone games with bandit feedback
    Wenjia Ba, Tianyi Lin, Jiawei Zhang and Zhengyuan Zhou.
    Operations Research, 2024.
    [ArXiv] [SSRN] [DOI]

  3. Monotone inclusions, acceleration and closed-loop control
    Tianyi Lin and Michael I. Jordan.
    Mathematics of Operations Research, 48(4): 2353-2382, 2023.
    [ArXiv] [DOI]

  4. A control-theoretic perspective on optimal high-order optimization
    Tianyi Lin and Michael I. Jordan.
    Mathematical Programming, 195(1): 929-975, 2022.
    [ArXiv] [DOI]

  5. On the efficiency of entropic regularized algorithms for optimal transport
    Tianyi Lin, Nhat Ho and Michael I. Jordan.
    Journal of Machine Learning Research, 23(137): 1-42, 2022.
    [ArXiv] [DOI]

  6. A unified adaptive tensor approximation scheme to accelerate composite convex optimization
    Bo Jiang, Tianyi Lin and Shuzhong Zhang.
    SIAM Journal on Optimization, 30(4): 2897-2926, 2020.
    [ArXiv] [DOI]

  7. On the global linear convergence of the ADMM with multi-block variables
    Tianyi Lin, Shiqian Ma and Shuzhong Zhang.
    SIAM Journal on Optimization, 25(3): 1478-1497, 2015.
    [ArXiv] [DOI]

Selected Refereed Conference Proceedings

  1. Deterministic nonsmooth nonconvex optimization
    Michael I. Jordan, Guy Kornowski, Tianyi Lin, Ohad Shamir and Manolis Zampetakis.
    Conference on Learning Theory (COLT), 2023.
    [ArXiv] [Earlier ArXiv] [DOI]

  2. Gradient-free methods for deterministic and stochastic nonsmooth nonconvex optimization
    Tianyi Lin, Zeyu Zheng and Michael I. Jordan.
    Neural Information Processing Systems (NeurIPS), 2022.
    [ArXiv] [DOI]

  3. First-order algorithms for min-max optimization in geodesic metric spaces
    Michael I. Jordan, Tianyi Lin and Emmanouil-Vasileios Vlatakis-Gkaragkounis.
    (Oral) Neural Information Processing Systems (NeurIPS), 2022.
    [ArXiv] [DOI]

  4. Projection robust Wasserstein distance and Riemannian optimization
    Tianyi Lin*, Chenyou Fan*, Nhat Ho, Marco Cuturi and Michael I. Jordan.
    (Spotlight) Neural Information Processing Systems (NeurIPS), 2020.
    [ArXiv] [DOI]

  5. Fixed-support Wasserstein barycenters: Computational hardness and fast algorithm
    Tianyi Lin, Nhat Ho, Xi Chen, Marco Cuturi and Michael I. Jordan.
    Neural Information Processing Systems (NeurIPS), 2020.
    [ArXiv] [DOI]

  6. Finite-time last-iterate convergence for multi-agent learning in games
    Tianyi Lin*, Zhengyuan Zhou*, Panayotis Mertikopoulos and Michael I. Jordan.
    International Conference on Machine Learning (ICML), 2020.
    [ArXiv] [DOI]

  7. On gradient descent ascent for nonconvex-concave minimax problems
    Tianyi Lin, Chi Jin and Michael I. Jordan.
    International Conference on Machine Learning (ICML), 2020.
    [ArXiv] [DOI]

  8. Near-optimal algorithms for minimax optimization
    Tianyi Lin, Chi Jin and Michael I. Jordan.
    Conference on Learning Theory (COLT), 2020.
    [ArXiv] [DOI]

  9. On efficient optimal transport: An analysis of greedy and accelerated mirror descent algorithms
    Tianyi Lin*, Nhat Ho* and Michael I. Jordan.
    International Conference on Machine Learning (ICML), 2019.
    [ArXiv] [DOI]

  10. The dual-sparse topic model: Mining focused topics and focused terms in short text
    Tianyi Lin*, Wentao Tian*, Qiaozhu Mei and Hong Cheng.
    ACM International Conference on World Wide Web (WWW), 2014.
    [DOI]