When: April 8, 2016, 3:00 PM
Location: Room 4130, Discovery Building, 330 N. Orchard Street
Probabilistic analysis of derivative-free methods
Numerical optimization has recently undergone a paradigm shift, driven by the spread of randomization within algorithmic frameworks. Motivated by the large-scale problems arising in data analysis, research in optimization has focused on cheaper variants of well-known methods that rely on random elements to improve practical performance while maintaining convergence guarantees. However, the stochastic character of these algorithms often requires their usual deterministic analysis to be reconsidered; probability theory thus plays a key role in obtaining suitable theoretical results.
In this talk, we will focus on recent advances in the randomization of derivative-free frameworks. After reviewing the most significant developments of such techniques, we will derive a typical analysis in the context of direct-search-type methods. We will describe a direct-search instance based on random polling directions that can be proven convergent with probability one. Using a generic technique applicable to a wide range of derivative-free methods, we will show that such a method exhibits, with overwhelming probability, a global convergence rate for the gradient norm matching that of its deterministic counterpart. We will also discuss how randomization can improve the worst-case estimates for the number of calls to the objective function, and put these estimates in perspective with practical performance. We will conclude the presentation by addressing second-order probabilistic aspects, with an emphasis on challenging open issues.
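To illustrate the kind of method discussed above, here is a minimal sketch of a direct-search iteration that polls a single random unit direction (and its opposite) and accepts steps satisfying a standard sufficient-decrease condition. This is an illustrative toy implementation, not the specific algorithm analyzed in the talk; the forcing function `c * alpha**2`, the expansion/contraction factors, and the stopping rule are common textbook choices assumed here for concreteness.

```python
import numpy as np

def random_direct_search(f, x0, alpha0=1.0, c=1e-4, max_iter=500, seed=0):
    """Direct search with random polling directions and sufficient decrease.

    At each iteration, draw one direction uniformly on the unit sphere,
    poll x + alpha*d and x - alpha*d, and accept a trial point only if it
    decreases f by at least c * alpha**2 (the forcing function). The step
    size alpha expands after a successful poll and contracts otherwise.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    alpha = alpha0
    fx = f(x)
    for _ in range(max_iter):
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)  # uniform random direction on the unit sphere
        for trial in (x + alpha * d, x - alpha * d):
            ft = f(trial)
            if ft < fx - c * alpha**2:  # sufficient decrease condition
                x, fx = trial, ft
                alpha *= 2.0            # successful poll: expand the step
                break
        else:
            alpha *= 0.5                # unsuccessful poll: contract the step
    return x, fx

# Usage: minimize a simple convex quadratic from a remote starting point.
x_star, f_star = random_direct_search(lambda x: np.sum(x**2),
                                      np.array([3.0, -4.0]))
```

Note that a single random direction and its negation form a much cheaper polling set than the positive spanning sets (of size at least n+1) required by classical deterministic direct search, which is precisely the per-iteration saving that the probabilistic analysis must justify.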