Publications

2024

  1. Perseus: A simple and optimal high-order method for variational inequalities
    Tianyi Lin and Michael I Jordan
    Mathematical Programming (Series A), 2024
  2. Doubly optimal no-regret online learning in strongly monotone games with bandit feedback
    Wenjia Ba, Tianyi Lin, Jiawei Zhang, and Zhengyuan Zhou
    Operations Research, 2024
  3. Adaptive, doubly optimal no-regret learning in strongly monotone and exp-concave games with gradient feedback
    Michael I Jordan, Tianyi Lin, and Zhengyuan Zhou
    Operations Research, 2024
  4. A continuous-time perspective on global acceleration for monotone equation problems
    Tianyi Lin and Michael I Jordan
    Communications in Optimization Theory (invited paper for the special issue dedicated to the memory of Professor Hedy Attouch), 2024
  5. A specialized semismooth Newton method for kernel-based optimal transport
    Tianyi Lin, Marco Cuturi, and Michael I Jordan
    In International Conference on Artificial Intelligence and Statistics (AISTATS), 2024

2023

  1. Curvature-independent last-iterate convergence for games on Riemannian manifolds
    Yang Cai, Michael I Jordan, Tianyi Lin, Argyris Oikonomou, and Emmanouil-Vasileios Vlatakis-Gkaragkounis
    arXiv preprint, 2023
  2. Structure-driven algorithm design in optimization and machine learning
    Tianyi Lin
    PhD dissertation, UC Berkeley, 2023
  3. Monotone inclusions, acceleration, and closed-loop control
    Tianyi Lin and Michael I Jordan
    Mathematics of Operations Research, 2023
  4. First-order algorithms for nonlinear generalized Nash equilibrium problems
    Michael I Jordan, Tianyi Lin, and Manolis Zampetakis
    Journal of Machine Learning Research, 2023
  5. Deterministic nonsmooth nonconvex optimization
    Michael I Jordan, Guy Kornowski, Tianyi Lin, Ohad Shamir, and Manolis Zampetakis
    In Conference on Learning Theory (COLT), 2023

2022

  1. Two-timescale gradient descent ascent algorithms for nonconvex minimax optimization
    Tianyi Lin, Chi Jin, and Michael I Jordan
    Preprint, 2022
  2. Explicit second-order min-max optimization methods with optimal convergence guarantee
    Tianyi Lin, Panayotis Mertikopoulos, and Michael I Jordan
    Preprint, 2022
  3. A control-theoretic perspective on optimal high-order optimization
    Tianyi Lin and Michael I Jordan
    Mathematical Programming (Series A), 2022
  4. On the efficiency of entropic regularized algorithms for optimal transport
    Tianyi Lin, Nhat Ho, and Michael I Jordan
    Journal of Machine Learning Research, 2022
  5. Accelerating adaptive cubic regularization of Newton’s method via random sampling
    Xi Chen, Bo Jiang, Tianyi Lin, and Shuzhong Zhang
    Journal of Machine Learning Research, 2022
  6. On the complexity of approximating multimarginal optimal transport
    Tianyi Lin, Nhat Ho, Marco Cuturi, and Michael I Jordan
    Journal of Machine Learning Research, 2022
  7. Gradient-free methods for deterministic and stochastic nonsmooth nonconvex optimization
    Tianyi Lin, Zeyu Zheng, and Michael I Jordan
    In International Conference on Neural Information Processing Systems (NeurIPS), 2022
  8. First-order algorithms for min-max optimization in geodesic metric spaces
    Michael I Jordan, Tianyi Lin, and Emmanouil-Vasileios Vlatakis-Gkaragkounis
    In International Conference on Neural Information Processing Systems (NeurIPS), 2022
  9. Online nonsubmodular minimization with delayed costs: From full information to bandit feedback
    Tianyi Lin, Aldo Pacchiano, Yaodong Yu, and Michael I Jordan
    In International Conference on Machine Learning (ICML), 2022
  10. Fast distributionally robust learning with variance-reduced min-max optimization
    Yaodong Yu, Tianyi Lin, Eric V Mazumdar, and Michael I Jordan
    In International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
  11. On structured filtering-clustering: Global error bound and optimal first-order algorithms
    Nhat Ho, Tianyi Lin, and Michael I Jordan
    In International Conference on Artificial Intelligence and Statistics (AISTATS), 2022

2021

  1. An ADMM-based interior-point method for large-scale linear programming
    Tianyi Lin, Shiqian Ma, Yinyu Ye, and Shuzhong Zhang
    Optimization Methods and Software (invited paper for the special issue dedicated to the memory of Professor Masao Iri), 2021
  2. A variational inequality approach to Bayesian regression games
    Wenshuo Guo, Michael I Jordan, and Tianyi Lin
    In IEEE Conference on Decision and Control (CDC), 2021
  3. Relaxed Wasserstein with applications to GANs
    Xin Guo, Johnny Hong, Tianyi Lin, and Nan Yang
    In International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021
  4. On projection robust optimal transport: Sample complexity and model misspecification
    Tianyi Lin, Zeyu Zheng, Elynn Chen, Marco Cuturi, and Michael I Jordan
    In International Conference on Artificial Intelligence and Statistics (AISTATS), 2021

2020

  1. A unified adaptive tensor approximation scheme to accelerate composite convex optimization
    Bo Jiang, Tianyi Lin, and Shuzhong Zhang
    SIAM Journal on Optimization, 2020
  2. Fixed-support Wasserstein barycenters: Computational hardness and fast algorithm
    Tianyi Lin, Nhat Ho, Xi Chen, Marco Cuturi, and Michael I Jordan
    In International Conference on Neural Information Processing Systems (NeurIPS), 2020
  3. Projection robust Wasserstein distance and Riemannian optimization
    Tianyi Lin, Chenyou Fan, Nhat Ho, Marco Cuturi, and Michael I Jordan
    In International Conference on Neural Information Processing Systems (NeurIPS), 2020
  4. New proximal Newton-type methods for convex optimization
    Ilan Adler, Zhiyue T Hu, and Tianyi Lin
    In IEEE Conference on Decision and Control (CDC), 2020
  5. On gradient descent ascent for nonconvex-concave minimax problems
    Tianyi Lin, Chi Jin, and Michael I Jordan
    In International Conference on Machine Learning (ICML), 2020
  6. Finite-time last-iterate convergence for multi-agent learning in games
    Tianyi Lin, Zhengyuan Zhou, Panayotis Mertikopoulos, and Michael I Jordan
    In International Conference on Machine Learning (ICML), 2020
  7. Near-optimal algorithms for minimax optimization
    Tianyi Lin, Chi Jin, and Michael I Jordan
    In Conference on Learning Theory (COLT), 2020
  8. Improved sample complexity for stochastic compositional variance reduced gradient
    Tianyi Lin, Chenyou Fan, Mengdi Wang, and Michael I Jordan
    In American Control Conference (ACC), 2020

2019

  1. Structured nonconvex and nonsmooth optimization: Algorithms and iteration complexity analysis
    Bo Jiang, Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    Computational Optimization and Applications, 2019
  2. On efficient optimal transport: An analysis of greedy and accelerated mirror descent algorithms
    Tianyi Lin, Nhat Ho, and Michael I Jordan
    In International Conference on Machine Learning (ICML), 2019
  3. Sparsemax and relaxed Wasserstein for topic sparsity
    Tianyi Lin, Zhiyue Hu, and Xin Guo
    In International Conference on Web Search and Data Mining (WSDM), 2019

2018

  1. Global convergence of unmodified 3-block ADMM for a class of convex minimization problems
    Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    Journal of Scientific Computing, 2018
  2. On the iteration complexity analysis of stochastic primal-dual hybrid gradient approach with high probability
    Linbo Qiao, Tianyi Lin, Qi Qin, and Xicheng Lu
    Neurocomputing, 2018
  3. Stochastic primal-dual proximal extragradient descent for compositely regularized optimization
    Tianyi Lin, Linbo Qiao, Teng Zhang, Jiashi Feng, and Bofeng Zhang
    Neurocomputing, 2018

2017

  1. Distributed linearized alternating direction method of multipliers for composite convex consensus optimization
    Necdet Serhat Aybat, Zi Wang, Tianyi Lin, and Shiqian Ma
    IEEE Transactions on Automatic Control, 2017
  2. An extragradient-based alternating direction method for convex minimization
    Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    Foundations of Computational Mathematics, 2017
  3. Exploiting interactions of review text, hidden user communities and item groups, and time for collaborative filtering
    Yinqing Xu, Qian Yu, Wai Lam, and Tianyi Lin
    Knowledge and Information Systems, 2017

2016

  1. Iteration complexity analysis of multi-block ADMM for a family of convex minimization without strong convexity
    Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    Journal of Scientific Computing, 2016
  2. Understanding sparse topical structure of short text via stochastic variational-Gibbs inference
    Tianyi Lin, Siyuan Zhang, and Hong Cheng
    In International Conference on Information and Knowledge Management (CIKM), 2016
  3. On stochastic primal-dual hybrid gradient approach for compositely regularized minimization
    Linbo Qiao, Tianyi Lin, Yu-Gang Jiang, Fan Yang, Wei Liu, and Xicheng Lu
    In European Conference on Artificial Intelligence (ECAI), 2016

2015

  1. On the global linear convergence of the ADMM with multiblock variables
    Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    SIAM Journal on Optimization, 2015
  2. On the sublinear convergence rate of multi-block ADMM
    Tianyi Lin, Shiqian Ma, and Shuzhong Zhang
    Journal of the Operations Research Society of China, 2015

2014

  1. Collaborative filtering incorporating review text and co-clusters of hidden user communities and item groups
    Yinqing Xu, Wai Lam, and Tianyi Lin
    In International Conference on Information and Knowledge Management (CIKM), 2014
  2. Latent aspect mining via exploring sparsity and intrinsic information
    Yinqing Xu, Tianyi Lin, Wai Lam, Zirui Zhou, Hong Cheng, and Anthony Man-Cho So
    In International Conference on Information and Knowledge Management (CIKM), 2014
  3. The dual-sparse topic model: Mining focused topics and focused terms in short text
    Tianyi Lin, Wentao Tian, Qiaozhu Mei, and Hong Cheng
    In International Conference on World Wide Web (WWW), 2014