Invited Session Wed.1.H 1012

Wednesday, 10:30 - 12:00 h, Room: H 1012

Cluster 17: Nonsmooth optimization [...]

Recent advances in optimization methods

 

Chair: Marc Teboulle

 

 

Wednesday, 10:30 - 10:55 h, Room: H 1012, Talk 1

Jérôme Bolte
Forward-backward splitting and other descent methods for semi-algebraic minimization problems

 

Abstract:
In view of the minimization of a nonsmooth nonconvex function of the form f+g (where f is smooth semi-algebraic and g is lsc semi-algebraic), we present an abstract convergence result for descent methods which covers in particular the case of forward-backward splitting methods.
Our result guarantees the convergence of bounded sequences, under the assumption that the function f satisfies the Kurdyka-Łojasiewicz inequality (KL inequality). This result applies to a wide range of problems, including nonsmooth semi-algebraic (i.e., polynomial-like) problems but also to analytic-like or more generally to the so-called tame problems.
In this talk we shall emphasize the following facts:

  • the verifiability of the assumptions and the ubiquity of the KL inequality;
  • the flexibility of the abstract result and its impact on the understanding of widely used methods.
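The forward-backward splitting scheme covered by the abstract result can be sketched as follows. This is a minimal illustration, not the speaker's implementation: it assumes the model instance f(x) = 0.5*||Ax - b||^2 (smooth, semi-algebraic) and g(x) = λ*||x||_1 (lsc, semi-algebraic), for which the backward (proximal) step is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward(grad_f, prox_g, x0, step, n_iter=500):
    # Iterates x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k)):
    # a forward (gradient) step on f, then a backward (proximal) step on g.
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Illustrative instance: min 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.5
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient of f
x = forward_backward(lambda z: A.T @ (A @ z - b),
                     lambda v, t: soft_threshold(v, lam * t),
                     np.zeros(10), 1.0 / L)
```

With step size 1/L the iteration is a descent method of the kind the abstract result covers; both f and g here are semi-algebraic, so the KL inequality holds and bounded sequences converge.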

     

     

Wednesday, 11:00 - 11:25 h, Room: H 1012, Talk 2

Amir Beck
The 2-coordinate descent method for solving double-sided simplex constrained minimization problems

     

Abstract:
This talk considers the problem of minimizing a smooth function subject to a single linear equality constraint and additional bound constraints on the decision variables. We introduce and analyze several variants of a 2-coordinate descent method - a block descent method that performs an optimization step with respect to only two variables at each iteration. Based on two new optimality measures, we establish convergence to stationary points for general nonconvex objective functions.
In the convex case, when all the variables are lower bounded but not upper bounded, we show that the sequence of function values converges at a sublinear rate.
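A minimal sketch of one greedy 2-coordinate variant, assuming a coordinate-wise Lipschitz gradient; the pair-selection rule and step size here are illustrative, not necessarily the variants analyzed in the talk. The example is Euclidean projection onto the unit simplex, a double-sided simplex constrained problem.

```python
import numpy as np

def two_coord_descent(grad, x, lo, hi, lip, n_iter=1000):
    # Greedy 2-coordinate descent for min f(x) s.t. sum(x) = const, lo <= x <= hi.
    # Each step moves mass t from coordinate j to coordinate i along e_i - e_j,
    # which preserves the single linear equality constraint.
    for _ in range(n_iter):
        g = grad(x)
        # i: smallest gradient among coordinates that can still increase;
        # j: largest gradient among coordinates that can still decrease.
        i = np.argmin(np.where(x < hi - 1e-12, g, np.inf))
        j = np.argmax(np.where(x > lo + 1e-12, g, -np.inf))
        if g[j] - g[i] < 1e-12:                   # no feasible descent pair left
            break
        t = (g[j] - g[i]) / (2.0 * lip)           # gradient step along e_i - e_j
        t = min(t, hi[i] - x[i], x[j] - lo[j])    # stay inside the box
        x[i] += t
        x[j] -= t
    return x

# Example: project y onto {x : sum(x) = 1, 0 <= x <= 1}, i.e.
# min 0.5*||x - y||^2, whose gradient x - y is 1-Lipschitz per coordinate.
y = np.array([0.1, 0.5, -0.2, 0.8])
x = two_coord_descent(lambda z: z - y, np.full(4, 0.25),
                      np.zeros(4), np.ones(4), lip=1.0)
```

Each update touches only two variables, so an iteration is cheap even in high dimension; the quantity g[j] - g[i] over feasible pairs plays the role of an optimality measure of the kind the abstract mentions.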

     

     

Wednesday, 11:30 - 11:55 h, Room: H 1012, Talk 3

Marc Teboulle
Nonsmooth convex minimization: To smooth or not to smooth?

     

Abstract:
Well-known smoothing approaches tackle nonsmooth optimization problems by applying algorithms to their smoothed counterparts, but only provide an ε-optimal solution to the approximate smoothed problem. In this talk, we prove that, independently of the structure of the function to be minimized, and for a given fast first-order iterative scheme, solving an adequately smoothed approximation solves the original nonsmooth problem with an O(1/ε) efficiency estimate. Our approach clarifies and unifies several issues in the design, analysis, and potential applications of smoothing methods combined with fast first-order algorithms, and eventually answers the question posed in the title!
This talk is based on joint work with Amir Beck, Technion, Israel.
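As a concrete instance of the smoothing idea (an illustrative sketch, not the scheme analyzed in the talk): replace each |r_i| in h(x) = ||Ax - b||_1 by its Huber smoothing with parameter μ and minimize the smooth surrogate, here with plain gradient descent.

```python
import numpy as np

def huber_grad(r, mu):
    # Derivative of the Huber smoothing of |.|: r/mu on [-mu, mu], sign(r) outside.
    return np.clip(r / mu, -1.0, 1.0)

def minimize_smoothed(A, b, mu, n_iter=5000):
    # Gradient descent on the Huber-smoothed surrogate of h(x) = ||Ax - b||_1.
    # The surrogate's gradient, A^T huber_grad(Ax - b, mu), is
    # (||A||^2 / mu)-Lipschitz, hence the step size mu / ||A||^2.
    x = np.zeros(A.shape[1])
    step = mu / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        x -= step * (A.T @ huber_grad(A @ x - b, mu))
    return x

# Consistent system: the nonsmooth minimum value is 0, attained at x_true,
# and the smoothed surrogate shares that minimizer since Huber(0) = 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 5))
x_true = rng.standard_normal(5)
x = minimize_smoothed(A, A @ x_true, mu=0.1)
```

The surrogate is within μ/2 per residual of |·|, so smaller μ means a tighter approximation but a worse Lipschitz constant (larger 1/μ) and hence slower steps; coupling this trade-off with a fast first-order scheme is what yields the O(1/ε) estimate the abstract announces.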

     
