Large-scale convex optimization

Advances in randomized numerical linear algebra (randNLA) improve the runtime of a variety of fundamental linear algebraic operations. The resulting advances in low-rank matrix approximation have important implications throughout optimization: many subproblems admit speedups by using low-rank approximations to data matrices, problem variables, or Hessians. Our lab has used fast low-rank approximations to accelerate cross-validation, dynamic optimization, optimal sensing, semidefinite programming, linear system solvers, composite optimization, conic optimization, and stochastic optimization. For example, we developed a low-rank preconditioner for solving linear systems (Nyström PCG), sketched below, and used it to speed up solution times for well-studied statistical learning problems such as the lasso, SVM, and logistic regression by an order of magnitude (NysADMM and GeNIOS).
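
The core primitive is simple enough to sketch. Below is a minimal NumPy/SciPy illustration (not our released code) of the randomized Nyström approximation and the preconditioner it yields for a regularized system (A + μI)x = b, following the construction in the Randomized Nyström Preconditioning paper listed below. The names nystrom_pcg, rank, and mu are illustrative choices, not part of any of our packages.

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular, svd
    from scipy.sparse.linalg import LinearOperator, cg

    def nystrom_approx(A, rank, rng):
        # Randomized rank-`rank` Nystrom approximation U diag(lam) U^T of a PSD matrix A.
        n = A.shape[0]
        Omega, _ = np.linalg.qr(rng.standard_normal((n, rank)))     # orthonormal test matrix
        Y = A @ Omega                                               # sketch: one pass over A
        nu = np.sqrt(n) * np.finfo(Y.dtype).eps * np.linalg.norm(Y) # shift for stability
        Y = Y + nu * Omega
        C = cholesky(Omega.T @ Y)                                   # upper-triangular factor
        B = solve_triangular(C, Y.T, trans='T').T                   # B = Y C^{-1}
        U, s, _ = svd(B, full_matrices=False)
        return U, np.maximum(s**2 - nu, 0.0)                        # remove the shift

    def nystrom_pcg(A, b, mu, rank, seed=0):
        # Solve (A + mu*I) x = b by CG, preconditioned with the Nystrom approximation.
        n = A.shape[0]
        U, lam = nystrom_approx(A, rank, np.random.default_rng(seed))
        def apply_Pinv(v):
            # P^{-1} v = (lam_r + mu) U (diag(lam) + mu I)^{-1} U^T v + (v - U U^T v)
            Utv = U.T @ v
            return (lam[-1] + mu) * (U @ (Utv / (lam + mu))) + v - U @ Utv
        M = LinearOperator((n, n), matvec=apply_Pinv, dtype=np.float64)
        A_mu = LinearOperator((n, n), matvec=lambda v: A @ v + mu * v, dtype=np.float64)
        x, info = cg(A_mu, b, M=M)
        return x

As a quick synthetic test, one can apply this to a Gram matrix with fast spectral decay, the regime where the preconditioner helps most; for example, A = X.T @ X with X = rng.standard_normal((2000, 500)) @ np.diag(0.9 ** np.arange(500)) and a modest rank such as 50.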

Software

  • GeNIOS for composite convex optimization

  • SketchySGD for stochastic optimization

  • PROMISE for faster stochastic optimization of convex objectives

Papers

GeNIOS: an (almost) second-order operator-splitting solver for large-scale convex optimization
T. Diamandis, Z. Frangella, S. Zhao, B. Stellato, and M. Udell
Submitted to Mathematical Programming Computation, 2023
[arxiv][url][bib]

PROMISE: Preconditioned Stochastic Optimization Methods by Incorporating Scalable Curvature Estimates
P. Rathore, Z. Frangella, and M. Udell
Submitted to JMLR, 2023
[arxiv][url][bib]

On the (linear) convergence of Generalized Newton Inexact ADMM
Z. Frangella, S. Zhao, T. Diamandis, B. Stellato, and M. Udell
2023
[arxiv][url][bib]

SketchySGD: Reliable Stochastic Optimization via Robust Curvature Estimates
Z. Frangella, P. Rathore, S. Zhao, and M. Udell
In revision at SIAM Journal on Mathematics of Data Science (SIMODS), 2022
[arxiv][url][bib]

NysADMM: faster composite convex optimization via low-rank approximation
S. Zhao, Z. Frangella, and M. Udell
International Conference on Machine Learning (ICML), 2022
[arxiv][url][bib]

Randomized Nyström Preconditioning
Z. Frangella, J. A. Tropp, and M. Udell
SIAM Journal on Matrix Analysis and Applications, 2022
[arxiv][url][bib]

A Strict Complementarity Approach to Error Bound and Sensitivity of Solution of Conic Programs
L. Ding and M. Udell
Optimization Letters, 2022
[arxiv][pdf][url][bib]

On the simplicity and conditioning of low rank semidefinite programs
L. Ding and M. Udell
SIAM Journal on Optimization (SIOPT), 2021
[arxiv][url][bib]

Scalable Semidefinite Programming
A. Yurtsever, J. Tropp, O. Fercoq, M. Udell, and V. Cevher
SIAM Journal on Mathematics of Data Science (SIMODS), 2021
[arxiv][pdf][bib]

Randomized Sketching Algorithms for Low-Memory Dynamic Optimization
R. Muthukumar, D. Kouri, and M. Udell
SIAM Journal on Optimization (SIOPT), 2021
[pdf][url][bib]

An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity
L. Ding, A. Yurtsever, V. Cevher, J. Tropp, and M. Udell
SIAM Journal on Optimization (SIOPT), 2021
Winner of 2017 INFORMS Optimization Society student paper prize
[arxiv][pdf][slides][bib]

kFW: A Frank-Wolfe style algorithm with stronger subproblem oracles
L. Ding, J. Fan, and M. Udell
In revision at Computational Optimization and Applications, 2020
[arxiv][url][bib]

Frank-Wolfe Style Algorithms for Large Scale Optimization
L. Ding and M. Udell
Large-Scale and Distributed Optimization, 2018
[arxiv][pdf][bib]

Sketchy Decisions: Convex Low-Rank Matrix Optimization with Optimal Storage
A. Yurtsever, M. Udell, J. Tropp, and V. Cevher
International Conference on Artificial Intelligence and Statistics (AISTATS), 2017
Oral Presentation
[arxiv][pdf][url][slides][bib][video]

The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM
D. Davis, B. Edmunds, and M. Udell
Advances in Neural Information Processing Systems (NeurIPS), 2016
[arxiv][pdf][bib]