Friday, 14:00 - 15:00 - Room 161
Stream: Large scale optimization
In this paper we deal with preconditioning techniques for Nonlinear Conjugate Gradient (NCG) methods in large-scale unconstrained optimization. We combine information collected by the NCG scheme with appealing properties of quasi-Newton updates, in particular the recent damped quasi-Newton techniques. Our aim is to approximate, in some sense, the inverse of the Hessian matrix, while still preserving the information provided by the satisfaction of the secant equation or a modification of it. Preliminary numerical experience confirms the effectiveness of the proposed approach.
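The damped quasi-Newton updates mentioned above can be illustrated by Powell's damping of the BFGS formula, which is the standard device for keeping the update positive definite when the observed curvature is weak. The following sketch is illustrative only and is not the authors' preconditioner; the function name and the damping threshold 0.2 are conventional assumptions.

```python
import numpy as np

def damped_bfgs_update(B, s, y, damping=0.2):
    """Powell-damped BFGS update of a Hessian approximation B.

    When the curvature s'y is too small, y is replaced by the convex
    combination y_bar = theta*y + (1-theta)*B s chosen so that
    s'y_bar >= damping * s'Bs; the updated matrix then stays
    positive definite and satisfies the modified secant equation
    B_new s = y_bar.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy >= damping * sBs:
        theta = 1.0
    else:
        theta = (1.0 - damping) * sBs / (sBs - sy)
    y_bar = theta * y + (1.0 - theta) * Bs
    # Standard BFGS update applied with the damped vector y_bar
    return (B
            - np.outer(Bs, Bs) / sBs
            + np.outer(y_bar, y_bar) / (s @ y_bar))
```

Even with a step of negative curvature (s'y < 0), the damped update returns a symmetric positive definite matrix, which is what makes it attractive as a building block for a preconditioner.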
For a linear autonomous singularly perturbed system with a small parameter multiplying the highest derivative of part of the variables, and with delay in the slow state variables, we construct a dynamic composite controller that is independent of the small parameter, following the feedback principle. For sufficiently small values of the parameter, the closed-loop system is asymptotically stable in the sense of Lyapunov, and, starting from some point in time, the state of the system and the control law are close to zero, to within the order of smallness of the parameter.
A separable, additive, convex optimization problem is considered. Under quite moderate assumptions, Lagrangian decomposition, based on the relaxation of constraints, can be applied, which leads to a hierarchical, two-level scheme with the dual problem on the upper level and the decomposed primal problem on the lower level. A new gradient algorithm for the upper level, using more information about the algorithm applied on the lower level, will be proposed.
Ref.: Karbowski A., Comment on "Optimization Flow Control...", IEEE/ACM Trans. on Networking, 11(2): 338-339, 2003.
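The two-level scheme described above can be sketched on a toy separable problem. This is a minimal illustration under assumed data, not the proposed algorithm: the objective sum_i 0.5*(x_i - c_i)^2 with the single coupling constraint sum_i x_i = b is chosen so the lower-level subproblems have closed-form solutions, and the upper level is plain gradient ascent on the dual variable.

```python
import numpy as np

def dual_decomposition(c, b, step=0.2, iters=200):
    """Two-level Lagrangian decomposition for
        min_x  sum_i 0.5*(x_i - c_i)^2   s.t.  sum_i x_i = b.

    Upper level: gradient ascent on the multiplier lam of the
    relaxed constraint; the dual gradient is the constraint residual.
    Lower level: each decomposed subproblem
        min_x 0.5*(x - c_i)^2 + lam*x
    is solved independently, here in closed form: x_i = c_i - lam.
    """
    lam = 0.0
    for _ in range(iters):
        x = c - lam                    # lower level: decomposed primal solutions
        lam += step * (x.sum() - b)    # upper level: dual gradient step
    return lam, c - lam
```

For this quadratic example the dual iteration is a contraction for a small enough step, and the primal solutions recovered at the lower level satisfy the relaxed constraint in the limit.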