CRII: CIF: Universal Analysis of Optimization Algorithms

2017 - present

Iterative optimization algorithms lie at the heart of modern data-intensive applications such as machine learning, computer vision, and data science. The choice of algorithm, and even the choice of tuning parameters for a particular algorithm, has a dramatic effect on performance and viability. Even simple variants of well-known algorithms can be difficult to analyze and must be treated case by case, with the help of either deep expert insight or extensive numerical simulation. Moreover, theoretical analyses of algorithms may fail to provide accurate or faithful performance guarantees because they do not account for sensitivity to parameter choices, robustness to noise (whether inherent to the algorithm or introduced by how the iterates are computed), or other sources of uncertainty.

The starting point for this research is to view iterative algorithms as dynamical systems with feedback. In gradient-based methods, for example, gradients are evaluated at each step and fed back to compute subsequent iterates. This research leverages tools from robust control (specifically, integral quadratic constraints and semidefinite programming) to develop a versatile, scalable, and modular framework capable of analyzing a variety of algorithms under different assumptions in an efficient and systematic manner. Of particular interest are large-scale algorithms such as stochastic gradient descent and its variants, as well as distributed and asynchronous implementations.
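To illustrate the dynamical-systems viewpoint in its simplest setting, the sketch below (an illustrative example, not the project's IQC/SDP framework) treats gradient descent on a quadratic f(x) = ½ xᵀHx as the linear feedback system x₊ = (I − αH)x. When the curvature (eigenvalues of H) lies in [m, L], the worst-case per-step contraction factor is max(|1 − αm|, |1 − αL|), and the step size α = 2/(m + L) minimizes it. The function names and the specific Hessian are illustrative choices; the full framework replaces this eigenvalue argument with semidefinite programs that also cover non-quadratic strongly convex functions, noise, and algorithm variants.

```python
import numpy as np

def worst_case_rate(alpha, m, L):
    """Worst-case per-step contraction of gradient descent over the class of
    quadratics whose Hessian eigenvalues lie in [m, L]. Each eigenmode of
    x+ = (I - alpha*H) x contracts by |1 - alpha*lam|, maximized at the
    endpoints lam = m and lam = L."""
    return max(abs(1 - alpha * m), abs(1 - alpha * L))

def simulate_rate(alpha, H, steps=200):
    """Empirical average contraction factor on one specific quadratic with
    Hessian H, estimated by iterating the linear map from a fixed start."""
    A = np.eye(H.shape[0]) - alpha * H
    x = np.ones(H.shape[0])
    x0_norm = np.linalg.norm(x)
    for _ in range(steps):
        x = A @ x
    return (np.linalg.norm(x) / x0_norm) ** (1.0 / steps)

m, L = 1.0, 10.0
alpha_opt = 2.0 / (m + L)            # classical optimal step size
H = np.diag([m, 5.0, L])             # one member of the function class (illustrative)

# Worst-case bound equals (L - m) / (L + m) at the optimal step size,
# and any specific instance contracts at least this fast.
print(worst_case_rate(alpha_opt, m, L))
print(simulate_rate(alpha_opt, H))
```

The point of the exercise is that a convergence-rate question about an algorithm becomes a stability question about a feedback system; the IQC approach generalizes this by certifying such contraction bounds with small semidefinite programs rather than explicit eigenvalue calculations.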