Pierre-Yves Bouchet, Polytechnique Montréal

Aula Beltrami, Department of Mathematics – Tuesday, December 10, 2019, 4:00 PM


Abstract. In adaptive precision optimization, the objective function can only be computed with stochastic errors whose standard deviation is controllable. A common strategy is to monotonically decrease this standard deviation during the optimization process, driving it asymptotically to zero so that the noise vanishes and convergence of the algorithm is ensured. The MpMads algorithm presented in this work follows this approach. A second algorithm, DpMads, is then introduced to explore an alternative strategy that does not force the standard deviation to decrease monotonically. Although the two strategies are proved to be theoretically equivalent, some tests show practical differences. The work is set in a derivative-free optimization framework (no assumption is made about the differentiability of the objective function), as MpMads and DpMads generalise the deterministic Mads algorithm designed for such a framework.
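To illustrate the monotone-precision strategy described above, here is a minimal Python sketch of a generic direct-search loop on a noisy objective. It is not the MpMads algorithm from the talk, only a toy pattern search in the same spirit: each time the poll fails, the mesh size is refined and the noise standard deviation is diminished, so the evaluation precision improves as the search converges. The objective, step sizes, and reduction factors are illustrative assumptions.

```python
import random

def noisy_f(x, sigma, rng):
    # Noisy evaluation of a smooth objective: f(x) = sum x_i^2 plus
    # Gaussian noise with controllable standard deviation sigma.
    return sum(v * v for v in x) + rng.gauss(0.0, sigma)

def pattern_search(x0, iters=200, seed=0):
    # Toy direct-search loop (illustrative only, not MpMads): poll along
    # coordinate directions; on a failed poll, shrink the mesh AND the
    # noise level sigma, so precision increases monotonically.
    # (A DpMads-like strategy would instead allow sigma to vary
    # non-monotonically.)
    rng = random.Random(seed)
    x = list(x0)
    step = 1.0
    sigma = 0.5
    fx = noisy_f(x, sigma, rng)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = noisy_f(y, sigma, rng)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5    # refine the mesh
            sigma *= 0.5   # monotonically diminish the noise std
            fx = noisy_f(x, sigma, rng)  # re-evaluate incumbent at higher precision
    return x

x = pattern_search([2.0, -3.0])
```

Under this monotone schedule the comparisons between poll points become increasingly reliable as the mesh shrinks, which is the mechanism by which the noise is asymptotically dismantled.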

This is joint work with Prof. Charles Audet and Prof. Sébastien Le Digabel of Polytechnique Montréal, and Stéphane Alarie of the Hydro-Québec research centre.