Research
Research Interests
My current interests include
Preprints
A nonasymptotic analysis of gradient descent ascent for nonconvex-concave minimax problems
Tianyi Lin, Chi Jin and Michael I. Jordan.
Revision. [SSRN]
Explicit second-order min-max optimization methods with optimal convergence guarantee
Tianyi Lin, Panayotis Mertikopoulos and Michael I. Jordan.
Revision. [ArXiv]
Curvature-independent last-iterate convergence for games on Riemannian manifolds
Yang Cai, Michael I. Jordan, Tianyi Lin, Argyris Oikonomou and Emmanouil-Vasileios Vlatakis-Gkaragkounis.
Submitted. [ArXiv]
Selected Refereed Journal Publications
Perseus: A simple and optimal high-order method for variational inequalities
Tianyi Lin and Michael I. Jordan.
Mathematical Programming, 2024.
[ArXiv] [DOI]
Doubly optimal no-regret online learning in strongly monotone games with bandit feedback
Wenjia Ba, Tianyi Lin, Jiawei Zhang and Zhengyuan Zhou.
Operations Research, 2024.
[ArXiv] [SSRN] [DOI]
Monotone inclusions, acceleration and closed-loop control
Tianyi Lin and Michael I. Jordan.
Mathematics of Operations Research, 48(4): 2353-2382, 2023.
[ArXiv] [DOI]
A control-theoretic perspective on optimal high-order optimization
Tianyi Lin and Michael I. Jordan.
Mathematical Programming, 195(1): 929-975, 2022.
[ArXiv] [DOI]
On the efficiency of entropic regularized algorithms for optimal transport
Tianyi Lin, Nhat Ho and Michael I. Jordan.
Journal of Machine Learning Research, 23(137): 1-42, 2022.
[ArXiv] [DOI]
A unified adaptive tensor approximation scheme to accelerate composite convex optimization
Bo Jiang, Tianyi Lin and Shuzhong Zhang.
SIAM Journal on Optimization, 30(4): 2897-2926, 2020.
[ArXiv] [DOI]
On the global linear convergence of the ADMM with multi-block variables
Tianyi Lin, Shiqian Ma and Shuzhong Zhang.
SIAM Journal on Optimization, 25(3): 1478-1497, 2015.
[ArXiv] [DOI]
Selected Refereed Conference Proceedings
Deterministic nonsmooth nonconvex optimization
Michael I. Jordan, Guy Kornowski, Tianyi Lin, Ohad Shamir and Manolis Zampetakis.
Conference on Learning Theory (COLT), 2023.
[ArXiv] [Earlier ArXiv] [DOI]
Gradient-free methods for deterministic and stochastic nonsmooth nonconvex optimization
Tianyi Lin, Zeyu Zheng and Michael I. Jordan.
Neural Information Processing Systems (NeurIPS), 2022.
[ArXiv] [DOI]
First-order algorithms for min-max optimization in geodesic metric spaces
Michael I. Jordan, Tianyi Lin and Emmanouil-Vasileios Vlatakis-Gkaragkounis.
(Oral) Neural Information Processing Systems (NeurIPS), 2022.
[ArXiv] [DOI]
Projection robust Wasserstein distance and Riemannian optimization
Tianyi Lin*, Chenyou Fan*, Nhat Ho, Marco Cuturi and Michael I. Jordan.
(Spotlight) Neural Information Processing Systems (NeurIPS), 2020.
[ArXiv] [DOI]
Fixed-support Wasserstein barycenters: Computational hardness and fast algorithm
Tianyi Lin, Nhat Ho, Xi Chen, Marco Cuturi and Michael I. Jordan.
Neural Information Processing Systems (NeurIPS), 2020.
[ArXiv] [DOI]
Finite-time last-iterate convergence for multi-agent learning in games
Tianyi Lin*, Zhengyuan Zhou*, Panayotis Mertikopoulos and Michael I. Jordan.
International Conference on Machine Learning (ICML), 2020.
[ArXiv] [DOI]
On gradient descent ascent for nonconvex-concave minimax problems
Tianyi Lin, Chi Jin and Michael I. Jordan.
International Conference on Machine Learning (ICML), 2020.
[ArXiv] [DOI]
Near-optimal algorithms for minimax optimization
Tianyi Lin, Chi Jin and Michael I. Jordan.
Conference on Learning Theory (COLT), 2020.
[ArXiv] [DOI]
On efficient optimal transport: An analysis of greedy and accelerated mirror descent algorithms
Tianyi Lin*, Nhat Ho* and Michael I. Jordan.
International Conference on Machine Learning (ICML), 2019.
[ArXiv] [DOI]
The dual-sparse topic model: Mining focused topics and focused terms in short text
Tianyi Lin*, Wentao Tian*, Qiaozhu Mei and Hong Cheng.
ACM International Conference on World Wide Web (WWW), 2014.
[DOI]