Friday, 9:00 - 9:50 - Room 161
In the recent book Optimization Stories, edited by Martin Grötschel, Robert Mifflin and Claudia Sagastizábal describe how nonsmooth optimization arose through a rare collaboration between Soviet and Western scientists in the 1960s and 1970s. In the seemingly simple case of a convex objective function there exists everywhere a nonempty set of subgradients, whose shortest element provides a direction of descent. While this set is also convex, it may be quite difficult to compute and represent. On the other hand, there are classical scenarios where it is quite easy to compute one of the subgradients or, in the more general Lipschitzian case, one generalized gradient. This led to the widely accepted black-box paradigm, namely that nonsmooth optimization algorithms should be based on an evaluation oracle that provides the function value and just one generalized gradient at any point in the function domain. The resulting bundle methods were described and analyzed in great detail, for example in the classical books of Jean-Baptiste Hiriart-Urruty and Claude Lemaréchal. There exist very simple examples where, due to the codependence of intermediate quantities, such a generalized gradient cannot be computed by the application of generalized differentiation rules. Moreover, even when generalized gradients are available everywhere, the resulting descent algorithms are typically quite slow and suffer from the lack of practical stopping criteria. Ideally, these should be based on computable and reasonably sharp optimality conditions that are satisfied approximately in a small but open neighborhood of a local minimizer. It will be shown here that this desirable situation can be achieved for objectives in the so-called abs-normal form, which also yields enough information to achieve linear, superlinear, or even quadratic convergence by successive piecewise linearization, provided certain nondegeneracy conditions are satisfied.
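The black-box oracle paradigm described above can be made concrete with a small sketch (my illustration, not part of the talk): for a convex piecewise linear function f(x) = max_i (a_i x + b_i), the oracle returns the function value and just one subgradient, namely the slope of some active piece, even though the full subdifferential at a kink is the convex hull of all active slopes.

```python
def oracle(x, pieces):
    """Return f(x) and ONE subgradient for f = max of affine pieces (a, b).

    At a kink several pieces are active; the oracle still reports only a
    single element of the subdifferential, as the black-box paradigm assumes.
    """
    vals = [a * x + b for a, b in pieces]
    f = max(vals)
    for (a, b), v in zip(pieces, vals):
        if v == f:          # first active piece; its slope is a valid subgradient
            return f, a

pieces = [(-1.0, 0.0), (1.0, 0.0)]   # f(x) = max(-x, x) = |x|
print(oracle(2.0, pieces))           # away from the kink the subgradient is unique
print(oracle(0.0, pieces))           # at x = 0 only one element of [-1, 1] is returned
```

The second call shows why descent methods built on such an oracle lack sharp stopping criteria: at the minimizer x = 0 the oracle reports a nonzero "gradient", so the shortest-subgradient optimality test cannot be evaluated from a single oracle response.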
Any piecewise smooth function that is specified by an evaluation procedure involving smooth elemental functions and piecewise linear functions like min and max can be represented in abs-normal form. This is in particular true for most popular nonsmooth test functions. By an extension of algorithmic, or automatic, differentiation, one can then compute certain first and second order derivative vectors and matrices that represent a local piecewise linearization and provide additional curvature information. On the basis of these quantities we characterize local optimality by first and second order necessary and sufficient conditions, which generalize the corresponding KKT and SSC theory for smooth problems. The key assumption is the Linear Independence Kink Qualification (LIKQ), a generalization of the LICQ familiar from NLP. It implies that the objective locally has a so-called VU decomposition and renders everything tractable in terms of matrix factorizations and other simple linear algebra operations. In the smooth case, local optimality conditions usually combine a stationarity condition with a convexity condition. In contrast, nonsmooth objectives can have strongly isolated minima without having even weakly supporting hyperplanes at arbitrarily close neighboring points. Since in some applications, such as multiphase equilibria, local convexity itself is necessary for single-phase stability, one may also ask for convexity conditions for piecewise smooth functions in abs-normal form. As it turns out, this property seems to be much more difficult to verify than optimality, as even the question of first order convexity, i.e. convexity of the local piecewise linear model, appears to be NP-hard.
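A minimal sketch (an assumption-laden illustration, not the paper's implementation) of the piecewise linearization idea: min and max are first rewritten through abs via min(a, b) = (a + b - |a - b|)/2, then the smooth elementals are replaced by their tangents while each abs is kept exact, so the local model reproduces the kinks of the objective.

```python
def f(x):
    # piecewise smooth test function: f(x) = |x^2 - 1| + min(x, 0),
    # with min(x, 0) written as 0.5 * (x + 0 - |x - 0|)
    return abs(x * x - 1.0) + 0.5 * (x - abs(x))

def pl_model(x0, d):
    """Local piecewise linear model of f at x0, evaluated at increment d.

    The smooth switching arguments x^2 - 1 and x are linearized; abs is
    kept as-is, so the kink structure near x0 survives in the model."""
    z0 = x0 * x0 - 1.0          # switching variable feeding the first abs
    dz = 2.0 * x0 * d           # its tangent, from ordinary AD of x^2 - 1
    u = x0 + d                  # argument of the second abs is already linear
    return abs(z0 + dz) + 0.5 * (u - abs(u))

x0 = 1.0
print(pl_model(x0, 0.1))   # model reproduces the kink of |x^2 - 1| at x = 1
print(f(1.1))              # true value differs from the model only by O(d^2)
```

The second-order gap between `f(1.1)` and `pl_model(1.0, 0.1)` is what the curvature matrices mentioned above capture, enabling the superlinear and quadratic convergence rates claimed for successive piecewise linearization.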
Keywords: Abs-normal form, Piecewise linearization, Karush-Kuhn-Tucker, Second order optimality, Tangential stationarity, Normal growth, VU decomposition, First order convexity.
References
A. Griewank. On stable piecewise linearization and generalized algorithmic differentiation. Opt. Meth. and Softw., 28(6):1139-1178, 2013.
A. Griewank and A. Walther. First and second order optimality conditions for piecewise smooth objective functions. Opt. Meth. and Softw., accepted, 2016.
J.-B. Hiriart-Urruty and C. Lemaréchal. Convex Analysis and Minimization Algorithms I, Springer, 1993.
R. Mifflin and C. Sagastizábal. A science fiction story in nonsmooth optimization originating at IIASA. Documenta Mathematica, Extra Vol.:291-300, 2012.
1976 Diplom in Math at University of Freiburg, Germany
1980 PhD in Computer Science at Australian National University
1987 Assistant/Associate Prof at Southern Methodist University, Dallas
1993 (Senior) Scientist at Argonne National Laboratory
2003 Prof at TU Dresden
2015 Prof at Humboldt Univ. Berlin
Since October 2015 Dean of the Faculty of Mathematical Sciences and Computer Engineering at Yachay Tech, Ibarra.