The exact information-based complexity of smooth convex minimization
@article{Drori2016TheEI,
  title   = {The exact information-based complexity of smooth convex minimization},
  author  = {Yoel Drori},
  journal = {J. Complex.},
  year    = {2016},
  volume  = {39},
  pages   = {1-16},
  url     = {https://api.semanticscholar.org/CorpusID:205861966}
}
62 Citations
On the oracle complexity of smooth strongly convex minimization
- 2022
Mathematics, Computer Science
Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions
- 2020
Mathematics, Computer Science
It is illustrated that the proposed method has a computationally efficient form similar to that of the optimized gradient method, and that its worst-case gradient bound is optimal up to a constant for large-dimensional smooth convex minimization problems, under an initial bound on the cost function value.
An optimal gradient method for smooth strongly convex minimization
- 2022
Mathematics
This work presents an optimal gradient method for smooth strongly convex optimization and provides a constructive recipe for obtaining the algorithmic parameters of the method and illustrates that it can be used for deriving methods for other optimality criteria as well.
An optimal lower bound for smooth convex functions
- 2024
Mathematics, Computer Science
This work defines a global lower bound for smooth differentiable objectives that is optimal with respect to the collected oracle information that can be readily employed by the Gradient Method with Memory to improve its performance.
Better Worst-Case Complexity Analysis of the Block Coordinate Descent Method for Large Scale Machine Learning
- 2017
Computer Science, Mathematics
A new lower bound on the information-based complexity of the BCD method is obtained, which is 16p³ times smaller than the best previously known, by using an effective technique called the Performance Estimation Problem (PEP) approach for analyzing the performance of first-order black-box optimization methods.
The Complexity of Finding Stationary Points with Stochastic Gradient Descent
- 2020
Mathematics, Computer Science
It is shown that for nonconvex functions, the feasibility of minimizing gradients with SGD is surprisingly sensitive to the choice of optimality criterion, and that this holds even when restricted to convex quadratic functions.
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- 2017
Computer Science, Mathematics
A new analytical worst-case guarantee is presented for the proximal point algorithm that is a factor of two better than previously known, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.
On the Properties of Convex Functions over Open Sets
- 2018
Mathematics
We consider the class of smooth convex functions defined over an open convex set. We show that this class is essentially different from the class of smooth convex functions defined over the entire…
Efficient first-order methods for convex minimization: a constructive approach
- 2019
Mathematics, Computer Science
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…
Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization
- 2018
Mathematics, Computer Science
We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is…
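For orientation, the update this entry analyzes takes the following standard form; this is a generic sketch of the proximal gradient step for F(x) = f(x) + g(x), with f smooth convex and g convex with an inexpensive proximal operator, not a formula taken from the paper itself:

```latex
x_{k+1} = \operatorname{prox}_{\alpha g}\!\bigl(x_k - \alpha \nabla f(x_k)\bigr),
\qquad
\operatorname{prox}_{\alpha g}(y) = \arg\min_{x}\,\Bigl\{\, g(x) + \tfrac{1}{2\alpha}\,\lVert x - y \rVert^2 \,\Bigr\}.
```

When g is the indicator of a convex set, this step reduces to projected gradient descent; when g = 0, it is plain gradient descent.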
15 References
Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- 2012
Computer Science, Mathematics
A new notion of discrepancy between functions is introduced, and used to reduce problems of stochastic convex optimization to statistical parameter estimation, which can be lower bounded using information-theoretic methods.
On lower complexity bounds for large-scale smooth convex optimization
- 2015
Computer Science, Mathematics
Performance of first-order methods for smooth convex minimization: a novel approach
- 2013
Mathematics, Computer Science
A novel approach is presented for analyzing the worst-case performance of first-order black-box optimization methods; the approach focuses on smooth unconstrained convex minimization over the Euclidean space and derives a new and tight analytical bound on performance.
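The performance-estimation idea underlying this entry (and the PEP approach cited elsewhere on this page) can be sketched schematically as follows; this is the general shape of the formulation, not the paper's exact program:

```latex
\max_{f \in \mathcal{F}_{L},\; x_0,\dots,x_N}\; f(x_N) - f(x_\star)
\quad \text{s.t.} \quad
\lVert x_0 - x_\star \rVert \le R,
\qquad
x_{k+1} = x_k - \tfrac{h_k}{L}\,\nabla f(x_k),
```

where \(\mathcal{F}_{L}\) is the class of L-smooth convex functions and \(x_\star\) a minimizer of f. The infinite-dimensional maximization over f is then reduced to a tractable finite-dimensional problem via interpolation conditions on the iterates' function values and gradients.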
Information-based complexity
- 1987
Computer Science, Mathematics
Information-based complexity seeks to develop general results about the intrinsic difficulty of solving problems where available information is partial or approximate and to apply these results to…
Information-Based Complexity, Feedback and Dynamics in Convex Programming
- 2011
Mathematics, Computer Science
The present work connects the intuitive notions of “information” in optimization, experimental design, estimation, and active learning to the quantitative notion of Shannon information and shows that optimization algorithms often obey the law of diminishing returns.
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- 2016
Mathematics, Computer Science
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…
An optimal variant of Kelley’s cutting-plane method
- 2016
Mathematics
A new variant of Kelley’s cutting-plane method for minimizing a nonsmooth convex Lipschitz-continuous function over the Euclidean space is proposed, and it is proved to attain the optimal rate of convergence for this class of problems.
Introductory Lectures on Convex Optimization - A Basic Course
- 2004
Mathematics
It was in the middle of the 1980s that the seminal paper by Karmarkar opened a new epoch in nonlinear optimization; it became more and more common for new methods to be provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.
Information-based complexity of linear operator equations
- 1992
Computer Science, Mathematics
On Complexity of Stochastic Programming Problems
- 2005
Mathematics
It is argued that two-stage (linear) stochastic programming problems with recourse can be solved to reasonable accuracy by using Monte Carlo sampling techniques, while multistage stochastic programs are, in general, intractable.