Invited Session Tue.2.H 1012

Tuesday, 13:15 - 14:45 h, Room: H 1012

Cluster 17: Nonsmooth optimization [...]

Nonsmooth optimization methods

 

Chair: Alain Pietrus

 

 

Tuesday, 13:15 - 13:40 h, Room: H 1012, Talk 1

Alain Pietrus
Some methods for solving perturbed variational inclusions

 

Abstract:
This paper deals with variational inclusions of the form 0 ∈ f(x)+g(x)+F(x)
where f is a Fréchet differentiable function, g is a Lipschitz continuous function, and F is a set-valued map acting in ℝⁿ.

In the first part of this talk, we recall some existing results related to metric regularity. In the second part, we focus on the case where the set-valued map F is a cone, and we introduce different algorithms to approximate a solution x* of the variational inclusion. Different situations are considered: the case where g is smooth, the case where g is semismooth (divided differences exist, … ), and the case where g is only Lipschitz continuous. We show the convergence of these algorithms without the metric regularity assumption.
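As a purely illustrative sketch of the kind of iteration the abstract alludes to (not the authors' actual algorithms), consider the special case F ≡ {0} in ℝ, where the inclusion 0 ∈ f(x)+g(x)+F(x) reduces to the equation f(x)+g(x) = 0. One can linearize the smooth part f with its derivative and the nonsmooth part g with a first-order divided difference. All functions and starting points below are made-up examples:

```python
import math

def f(x):        # smooth part (Fréchet differentiable)
    return x * x

def df(x):       # its derivative
    return 2.0 * x

def g(x):        # Lipschitz, nonsmooth at 0
    return abs(x) - 1.0

def divided_difference(g, u, v):
    # first-order divided difference [u, v; g]
    return (g(v) - g(u)) / (v - u)

def newton_secant(x_prev, x, tol=1e-10, max_iter=50):
    """Solve 0 = f(x) + g(x): Newton step on f, secant-type
    (divided-difference) step on g, combined into one linear model."""
    for _ in range(max_iter):
        slope = df(x) + divided_difference(g, x_prev, x)
        x_prev, x = x, x - (f(x) + g(x)) / slope
        if abs(x - x_prev) < tol:
            break
    return x

root = newton_secant(0.8, 1.0)   # positive root of x^2 + |x| - 1 = 0
```

Here the divided difference stands in for the unavailable derivative of g; when g happens to be smooth the scheme degenerates to a plain Newton iteration on f + g.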

 

 

Tuesday, 13:45 - 14:10 h, Room: H 1012, Talk 2

Christopher Hendrich
A double smoothing technique for solving nondifferentiable convex optimization problems

Coauthor: Radu I. Bot

 

Abstract:
The aim of this talk is to develop an efficient algorithm for solving a class of unconstrained nondifferentiable convex optimization problems. To this end, we first formulate the Fenchel dual problem and regularize it in two steps into a differentiable, strongly convex problem with Lipschitz continuous gradient. The doubly regularized dual problem is then solved via a fast gradient method, with the aim of accelerating the resulting convergence scheme.
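The two-step regularization can be illustrated on a toy problem (this is a minimal sketch of the double-regularization idea, not the authors' exact scheme; the objective, parameters, and iteration counts are all assumptions): a nonsmooth concave "dual" objective is first smoothed, then made strongly concave by a quadratic term, and the resulting smooth surrogate is maximized with Nesterov's fast gradient method.

```python
import math

# Toy nonsmooth concave "dual" objective: theta(p) = min(p, 2 - p),
# maximized at p = 1.  (Stand-in for a Fenchel dual; illustrative only.)

MU, RHO = 0.05, 1e-3          # smoothing and strong-concavity parameters

def grad_smoothed(p):
    """Gradient of the doubly regularized objective
    theta_mu(p) - (rho/2) p^2, where
    theta_mu(p) = -mu * log(exp(-p/mu) + exp(-(2-p)/mu))
    is a soft-min smoothing of theta; its derivative is tanh((1-p)/mu)."""
    return math.tanh((1.0 - p) / MU) - RHO * p

def fast_gradient(p0, iters=300):
    """Nesterov's accelerated gradient ascent on the smooth, strongly
    concave surrogate; step size 1/L with L = 1/mu + rho."""
    L = 1.0 / MU + RHO
    p, p_old, t = p0, p0, 1.0
    for _ in range(iters):
        t_next = 0.5 * (1.0 + math.sqrt(1.0 + 4.0 * t * t))
        y = p + ((t - 1.0) / t_next) * (p - p_old)
        p_old, p, t = p, y + grad_smoothed(y) / L, t_next
    return p

p_star = fast_gradient(0.0)   # close to 1, the maximizer of theta
```

The first regularization (smoothing parameter MU) makes the objective differentiable with Lipschitz gradient; the second (RHO) adds strong concavity, which is what lets an accelerated gradient method attain a fast convergence rate.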

 

 

Tuesday, 14:15 - 14:40 h, Room: H 1012, Talk 3

Emil Gustavsson
Primal convergence from dual subgradient methods for convex optimization

Coauthors: Michael Patriksson, Ann-Brith Strömberg

 

Abstract:
When solving a convex optimization problem through a Lagrangian dual reformulation, subgradient optimization methods are favourably utilized, since they often find near-optimal dual solutions quickly. However, an optimal primal solution is generally not obtained directly through such a subgradient approach. We construct a sequence of convex combinations of primal subproblem solutions, a so-called ergodic sequence, which is shown to converge to an optimal primal solution when the convexity weights are appropriately chosen. We generalize previous convergence results from linear to convex optimization and present a new set of rules for constructing the convexity weights defining the ergodic sequence of primal solutions. In contrast to previously proposed rules, they exploit more information from later subproblem solutions than from earlier ones. We evaluate the proposed rules on a set of nonlinear multicommodity flow problems and demonstrate that they clearly outperform the previously proposed ones.
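A minimal sketch of ergodic primal recovery follows (the problem instance and the specific weight rule w_k ∝ k + 1 are illustrative assumptions, not the rules from the talk): a dual subgradient method is run on a small quadratic program, and the primal subproblem solutions are averaged with convexity weights that emphasize later iterates.

```python
# Illustrative problem: minimize x1^2 + x2^2  subject to  x1 + x2 >= 1,
# with optimum x* = (0.5, 0.5).  The Lagrangian subproblem for multiplier
# lam has the closed-form solution x(lam) = (lam/2, lam/2).

def dual_subgradient_with_ergodic_average(iters=500):
    lam = 0.0
    weighted_sum = [0.0, 0.0]
    total_weight = 0.0
    for k in range(iters):
        x = [lam / 2.0, lam / 2.0]           # Lagrangian subproblem solution
        w = k + 1.0                           # convexity weight favouring later iterates
        weighted_sum = [s + w * xi for s, xi in zip(weighted_sum, x)]
        total_weight += w
        subgrad = 1.0 - x[0] - x[1]           # constraint residual = dual subgradient
        lam = max(0.0, lam + subgrad / (k + 1.0))   # projected subgradient step
    # ergodic (weighted-average) primal iterate
    return [s / total_weight for s in weighted_sum]

x_bar = dual_subgradient_with_ergodic_average()   # close to (0.5, 0.5)
```

The individual subproblem solutions x(lam_k) need not converge to a primal optimum, but their weighted average does; giving later iterates larger weights exploits the fact that later dual iterates are closer to a dual optimum.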

 
