14th EUROPT Workshop on Advances in Continuous Optimization

Warsaw (Poland), July 1-2, 2016

SD-3: Saturday, 14:20 - 16:00 - Room 108

Stream: Convex and non-smooth optimization

Chair: Tatiana Tchemisova

  1. On a Class of Gradient-Like Dynamical Systems with Noisy First-Order Feedback

    Mathias Staudigl, Panayotis Mertikopoulos


    In view of solving convex optimization problems with noisy feedback, we examine the convergence properties of a class of mirror descent (MD) dynamical systems with stochastically perturbed gradient input. Formulating the problem as a stochastic differential equation (SDE), we focus on the convergence of sample paths and the SDE's ergodicity properties. If the problem admits an isolated interior solution, the process possesses an invariant measure that is sharply concentrated around that solution; at the other end of the spectrum, corner-point solutions are globally attracting (a.s.).
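
    A minimal numerical sketch of this setting, not the authors' analysis: an Euler-Maruyama discretization of mirror descent dynamics with Gaussian-perturbed gradient input, assuming (for illustration only) an entropic mirror map on the simplex, a simple quadratic objective, and arbitrary parameter values.

```python
import numpy as np

def softmax(y):
    """Entropic mirror (choice) map: dual scores -> point on the simplex."""
    z = np.exp(y - y.max())
    return z / z.sum()

rng = np.random.default_rng(0)
dt, sigma, n_steps = 1e-3, 0.5, 50_000

x_star = np.array([0.5, 0.3, 0.2])      # interior solution (illustrative assumption)
grad = lambda x: x - x_star             # gradient of f(x) = 0.5 * ||x - x_star||^2

y = np.zeros(3)                         # dual (score) variable of the MD dynamics
for _ in range(n_steps):
    x = softmax(y)
    # Euler-Maruyama step for dY = -grad f(X) dt + sigma dW,  X = softmax(Y)
    y += -grad(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(3)

print("final primal iterate:", np.round(softmax(y), 3))
print("interior solution:   ", x_star)
```

    Run long enough, the primal trajectory hovers near x_star, mimicking the concentration of the invariant measure around an isolated interior solution.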

  2. Some duality results in evenly convex programming

    José Vicente-Pérez


    In this talk we present a new exact conjugation scheme for the class of extended real-valued evenly convex functions defined on general topological vector spaces, obtained by exploiting the relationship between even convexity and even quasiconvexity. We also give a new characterization of the even convexity of a function at a given point, and establish links between even convexity, subdifferentiability, and the regularization of a given function. Finally, we derive a sufficient condition for strong duality to hold in convex optimization programs.
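
    As background (standard definitions only; the talk's new exact conjugation scheme is not reproduced here): a set is evenly convex when it is an intersection of open half-spaces, a function is evenly convex when its epigraph is, and the classical Fenchel conjugate is the scheme that such constructions refine.

```latex
% Background (standard; not the talk's new scheme).
% A set C in X is evenly convex if it is an intersection of open half-spaces:
\[
  C \;=\; \bigcap_{i \in I} \{\, x \in X : \langle x, x_i^{*} \rangle < \alpha_i \,\},
\]
% and f : X -> \overline{\mathbb{R}} is evenly convex when epi f is evenly convex.
% The classical Fenchel conjugate (exact for proper lsc convex f) reads
\[
  f^{*}(x^{*}) \;=\; \sup_{x \in X} \{\, \langle x, x^{*} \rangle - f(x) \,\}.
\]
```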

  3. From error bounds to the complexity of first-order descent methods for convex functions

    Phong Nguyen


    This talk aims to show that error bounds can be used as effective tools for deriving complexity results for first-order descent methods in convex minimization. In a first stage, this objective led us to revisit the interplay between error bounds and the Kurdyka-Łojasiewicz (KL) inequality. One can show the equivalence between the two concepts for convex functions having a moderately flat profile near the set of minimizers. In a second stage, we show how KL inequalities can in turn be employed to compute new complexity bounds for a wealth of descent methods for convex problems.
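
    For orientation, the two notions being related are the following (standard formulations; the exact hypotheses and constants are in the talk):

```latex
% For a proper lsc convex f with minimum value f^*, set of minimizers S,
% and a desingularizing function \varphi (concave, \varphi(0)=0, \varphi' > 0):
%
% Kurdyka-Lojasiewicz inequality:
\[
  \varphi'\!\big(f(x) - f^{*}\big)\,\operatorname{dist}\!\big(0, \partial f(x)\big) \;\ge\; 1,
\]
% Error bound:
\[
  \operatorname{dist}(x, S) \;\le\; \varphi\!\big(f(x) - f^{*}\big).
\]
% For convex f with a moderately flat profile near S the two are equivalent
% (up to constants), which is what drives the complexity estimates.
```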

  4. Convergence Analysis of the Douglas-Rachford Splitting Method for the Sum of a Strongly Convex Function and a Weakly Convex Function

    Xiaoming Yuan


    We consider the convergence of the Douglas-Rachford splitting method for minimizing the sum of a strongly convex function and a weakly convex function, a setting with various applications, especially in sparsity-driven scenarios where weakly convex penalties are used to avoid the biased estimates that typically arise with convex penalties. We prove convergence under relatively mild assumptions, establish a non-ergodic worst-case convergence rate in terms of iteration complexity, and show a locally linear convergence rate, in an asymptotic sense, under some regularity conditions.
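
    A minimal sketch of the Douglas-Rachford iteration in this "strongly + weakly convex" setting, assuming an illustrative least-squares term and an MCP-style (weakly convex) firm-thresholding penalty; the data, step size gamma, and penalty parameters are demo assumptions, not the talk's setting.

```python
import numpy as np

def prox_ls(v, A, b, gamma):
    """prox of gamma * 0.5*||Ax - b||^2 (strongly convex when A has full column rank)."""
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * A.T @ A, v + gamma * A.T @ b)

def prox_mcp(v, gamma, lam=0.5, theta=2.0):
    """prox of gamma * MCP penalty (weakly convex): firm thresholding, needs theta > gamma."""
    out = v.copy()
    small = np.abs(v) <= gamma * lam
    mid = (~small) & (np.abs(v) <= theta * lam)
    out[small] = 0.0
    out[mid] = np.sign(v[mid]) * (np.abs(v[mid]) - gamma * lam) / (1.0 - gamma / theta)
    return out  # entries with |v| > theta*lam are left unchanged

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10); x_true[:3] = [2.0, -1.5, 1.0]   # sparse ground truth (assumed)
b = A @ x_true + 0.05 * rng.standard_normal(40)

gamma, z = 0.5, np.zeros(10)
for _ in range(300):
    # Douglas-Rachford splitting: prox step, reflected prox step, governing update
    x = prox_ls(z, A, b, gamma)
    y = prox_mcp(2 * x - z, gamma)
    z = z + y - x

print("recovered:", np.round(x, 2))
```

    The weakly convex MCP prox leaves large entries untouched, which is precisely the unbiasedness motivation mentioned in the abstract; a convex l1 prox would shrink them.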

  5. On the study of properties of special nonlinear problems arising in parametric SIP

    Olga Kostyukova, Tatiana Tchemisova


    We study parametric Semi-infinite Programming (SIP) problems with finitely representable compact index sets and investigate the dependence of the solutions of these problems on the parameters. We show that differential properties of solutions of the parametric SIP problems can be formulated in terms of solutions of certain auxiliary Nonlinear Programming (NLP) problems that depend on a finite number of integer parameters. We establish properties of the solutions of these NLP problems with respect to the values of these integers, which allow us to draw important conclusions about the behavior of the solutions of the parametric problems.
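
    For orientation (the generic problem class only; the authors' auxiliary NLP construction is not reproduced here), a parametric SIP problem with a finitely representable compact index set has the form:

```latex
% Generic parametric SIP problem, for a parameter vector p:
\[
  \min_{x \in \mathbb{R}^n} \; f(x, p)
  \quad \text{s.t.} \quad g(x, t, p) \le 0 \;\; \forall\, t \in T,
\]
% where the compact index set T is finitely representable, i.e. described by
% finitely many inequalities:
\[
  T \;=\; \{\, t \in \mathbb{R}^m : h_j(t) \le 0, \;\; j = 1, \dots, s \,\}.
\]
```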