Preprints

- Optimal Subgradient Methods for Lipschitz Convex Optimization with Error Bounds
  Alex L. Wang
  December 2025 [arXiv]

- Subgame Perfect Methods in Nonsmooth Convex Optimization
  (ɑ) Benjamin Grimmer and Alex L. Wang
  November 2025 [arXiv]

- Beyond Minimax Optimality: A Subgame Perfect Gradient Method
  (ɑ) Benjamin Grimmer, Kevin Shu, and Alex L. Wang
  December 2024 [arXiv] [code]

- Sharpness and well-conditioning of nonsmooth convex formulations in statistical signal recovery
  (ɑ) Lijun Ding and Alex L. Wang
  July 2023 [arXiv] [code]
Journal publications

- Composing optimized stepsize schedules for gradient descent
  (ɑ) Benjamin Grimmer, Kevin Shu, and Alex L. Wang
  Math. Oper. Res. 2025 [arXiv] [code]

- Accelerated Objective Gap and Gradient Norm Convergence for Gradient Descent via Long Steps
  (ɑ) Benjamin Grimmer, Kevin Shu, and Alex L. Wang
  INFORMS J. Optim. 2025 [arXiv] [article] [mathematica]
  Note: This paper significantly shortens and strengthens the analysis of the stepsize sequences first described in our earlier technical report, Accelerated Gradient Descent via Long Steps.

- New notions of simultaneous diagonalizability of quadratic forms with applications to QCQPs
  Alex L. Wang and Rujun Jiang
  Math. Program. 2024 [arXiv] [article]

- Hidden convexity, optimization, and algorithms on rotation matrices
  (ɑ) Akshay Ramachandran, Kevin Shu, and Alex L. Wang
  Math. Oper. Res. 2024 [arXiv] [article]

- Accelerated first-order methods for a class of semidefinite programs
  Alex L. Wang and Fatma Kılınç-Karzan
  Math. Program. 2024 [arXiv] [article] [code]

- On semidefinite descriptions for convex hulls of quadratic programs
  Alex L. Wang and Fatma Kılınç-Karzan
  Oper. Res. Lett. 2024 [arXiv] [article]
  Note: This paper was significantly condensed and rewritten during the review process. The original manuscript, A Geometric View of SDP Exactness in QCQPs and its Applications [arXiv], contains additional minor results on exactness in random QCQPs.

- Implicit regularity and linear convergence rates for the generalized trust-region subproblem
  Alex L. Wang, Yunlei Lu, and Fatma Kılınç-Karzan
  SIAM J. Optim. 2023 [arXiv] [article]

- Necessary and sufficient conditions for rank-one generated cones
  (ɑ) C.J. Argue, Fatma Kılınç-Karzan, and Alex L. Wang
  Math. Oper. Res. 2022 [arXiv] [article]

- Exactness in SDP relaxations of QCQPs: Theory and applications
  (ɑ) Fatma Kılınç-Karzan and Alex L. Wang
  Tut. in Oper. Res. 2021 [arXiv] [article]

- On the tightness of SDP relaxations of QCQPs
  Alex L. Wang and Fatma Kılınç-Karzan
  Math. Program. 2022 [arXiv] [article]
  INFORMS Optimization Society 2021 Student Paper Prize

- The generalized trust region subproblem: solution complexity and convex hull results
  Alex L. Wang and Fatma Kılınç-Karzan
  Math. Program. 2022 [arXiv] [article]
Articles in refereed conference proceedings

- Solving Stackelberg Prediction Game with Least Squares Loss via Spherically Constrained Least Squares Reformulation
  Jiali Wang, Wen Huang, Rujun Jiang, Xudong Li, and Alex L. Wang
  ICML 2022 [arXiv] [proceedings]
  ICML 2022 Outstanding Paper Award

- On convex hulls of epigraphs of QCQPs
  Alex L. Wang and Fatma Kılınç-Karzan
  IPCO 2020 [arXiv] [proceedings]

- Hardy-Muckenhoupt bounds for Laplacian eigenvalues
  (ɑ) Gary Miller, Noel Walkington, and Alex L. Wang
  APPROX 2019 [arXiv] [proceedings]

- Clustering stable instances of Euclidean k-means
  (ɑ) Abhratanu Dutta, Aravindan Vijayaraghavan, and Alex L. Wang
  NeurIPS 2017 [arXiv] [proceedings]
Other writing

- A Strengthened Conjecture on the Minimax Optimal Constant Stepsize for Gradient Descent
  (ɑ) Benjamin Grimmer, Kevin Shu, and Alex L. Wang
  July 2024 [arXiv] [certificates]
  Note: This paper makes partial progress toward a conjecture on the minimax optimal constant stepsize for gradient descent. The conjecture was resolved shortly after this paper was posted online.

- On QCQPs and their SDP Relaxations — Ph.D. Thesis
  June 2022 [CMU CSD]

- Weighted Cheeger and Buser inequalities, with applications to clustering and cutting probability densities
  (ɑ) Timothy Chu, Gary Miller, Noel Walkington, and Alex L. Wang
  May 2020 [arXiv]