Constrained global optimization: adaptive gradient flows
Abstract. We consider finite-dimensional smooth optimization problems with a compact and connected feasible set. A basic problem in global optimization is how to get from one local minimum to all the others by using local ascent or descent directions. In general, these local directions are induced by a Riemannian (i.e., variable) metric. We consider the "bang-bang" strategy of starting at a local minimum, going upward via the ascent flow until a maximum is reached, and then moving downward along the descent flow to a (possibly different) local minimum. Iterating this procedure gives rise to a bipartite digraph whose nodes are the local minima and maxima.
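To make the bang-bang construction concrete, the following is a minimal toy sketch (not the paper's construction): it builds the min-max digraph for f(x) = sin(3x) on the compact interval [0, 2π], using the plain Euclidean gradient in place of a general Riemannian metric and an explicit Euler discretisation of the flows. All names and numerical parameters are illustrative assumptions.

```python
import math

# Feasible set: the compact interval [A, B]; iterates are clamped to it,
# so boundary points can act as local minima/maxima of the restriction.
A, B = 0.0, 2.0 * math.pi

def f(x):
    return math.sin(3.0 * x)

def fprime(x):
    return 3.0 * math.cos(3.0 * x)

def flow(x, direction, h=1e-3, tol=1e-6, max_steps=200_000):
    """Explicit-Euler gradient flow; direction=+1 ascends, -1 descends.
    Stops at a stationary point or when stuck at the boundary."""
    for _ in range(max_steps):
        x_new = min(max(x + direction * h * fprime(x), A), B)
        if abs(x_new - x) < tol * h:
            return x_new
        x = x_new
    return x

def bang_bang_edges(minima, eps=1e-2):
    """From each local minimum, perturb by +/-eps and ascend to a maximum;
    from that maximum, perturb and descend to a (possibly different)
    minimum.  Returns the edges of the resulting bipartite digraph,
    rounded so that coincident endpoints compare equal."""
    edges = set()
    for m in minima:
        for s in (-eps, eps):
            top = flow(min(max(m + s, A), B), +1)
            edges.add((round(m, 3), round(top, 3)))
            for s2 in (-eps, eps):
                bottom = flow(min(max(top + s2, A), B), -1)
                edges.add((round(top, 3), round(bottom, 3)))
    return edges

# Interior local minima of sin(3x) on [0, 2*pi].
minima = [math.pi / 2, 7 * math.pi / 6, 11 * math.pi / 6]
edges = bang_bang_edges(minima)
print(sorted(edges))
```

Note that the rightmost minimum's ascent flow terminates at the boundary point x = 2π rather than at an interior maximum; it is precisely such boundary effects that can destroy strong connectedness in the constrained case.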
It is well known that, in the absence of inequality constraints, this digraph is strongly connected for generic metrics, whereas this might not be the case when inequality constraints are present. In [1] a special global adaptation of the metric, based on global information, is constructed such that the digraph becomes connected. The present article presents an automatic adaptation of the metric, based on local information, which generically guarantees the desired connectedness.
The theorem is first proved for the special case in which the boundary of the feasible set is smooth. We then show how the general case can either be reduced to the special case by logarithmic smoothing or be treated exactly. The paper ends with a conjecture concerning the replacement of the linear independence constraint qualification and the nondegeneracy of certain critical points by the Mangasarian-Fromovitz constraint qualification and strong stability, respectively.
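For orientation, one standard form of logarithmic smoothing of a feasible set described by inequality constraints is the following sketch (an illustrative construction under generic assumptions, not necessarily the exact one used in the paper): the boundary of the smoothed set is a level set of a smooth function, so the special case applies.

```latex
% Feasible set described by smooth inequality constraints g_j:
%   M = { x : g_j(x) >= 0,  j = 1, ..., m }.
% For a parameter eps > 0, replace M by the inner approximation
\[
  M_\varepsilon
  = \Bigl\{\, x :\ \sum_{j=1}^{m} \ln g_j(x) \;\ge\; \ln\varepsilon \,\Bigr\}
  = \Bigl\{\, x :\ \prod_{j=1}^{m} g_j(x) \;\ge\; \varepsilon \,\Bigr\},
\]
% whose boundary is a level set of a smooth function and which, for
% generic eps, approximates M from the inside as eps -> 0.
```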
[1] H. Th. Jongen, A. Ruiz Jhones, Nonlinear Optimization: On the min-max digraph and global smoothing, in: Calculus of Variations and Differential Equations (Eds.: A. Ioffe, S. Reich, I. Shafrir), Chapman & Hall / CRC Research Notes in Mathematics Series, Vol. 410, CRC Press, Boca Raton, FL, 1999, pp. 119-135.