Invited Session Mon.1.H 1012

Monday, 10:30 - 12:00 h, Room: H 1012

Cluster 17: Nonsmooth optimization [...]

Iterative methods for variational analysis

 

Chair: Alain Pietrus

 

 

Monday, 10:30 - 10:55 h, Room: H 1012, Talk 1

Celia Jean-Alexis
The second order generalized derivative and generalized equations

Coauthors: Michel Geoffroy, Alain Pietrus

 

Abstract:
We consider a generalized equation of the form
0 ∈ f(x) + G(x), where f: R^n → R^n is a C^{1,1} function whose Fréchet derivative f' is subanalytic and G: R^n → 2^{R^n} is a metrically regular set-valued map. We first present some iterative methods introduced for solving this equation and then state our main result: we propose a method based on the second-order generalized derivative and show the existence and convergence of a sequence defined by this method.
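For context, the classical Josephy-Newton scheme for 0 ∈ f(x) + G(x) computes x_{k+1} by solving the partially linearized inclusion 0 ∈ f(x_k) + f'(x_k)(x - x_k) + G(x). The Python sketch below is illustrative only (it uses this first-order scheme, not the second-order generalized derivative of the talk) and specializes to n = 1 with G the normal cone to R_+, where each subproblem has a closed-form solution.

def josephy_newton_1d(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Josephy-Newton sketch for 0 in f(x) + N_{R_+}(x), scalar case.

    Each step solves 0 in f(x_k) + f'(x_k)(x - x_k) + N_{R_+}(x); for f'(x_k) > 0
    this linearized inclusion has the closed form x = max(0, x_k - f(x_k)/f'(x_k)).
    """
    x = x0
    for _ in range(max_iter):
        x_new = max(0.0, x - f(x) / df(x))   # solution of the linearized inclusion
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Constraint inactive: 0 in x^2 - 2 + N_{R_+}(x) has the solution sqrt(2).
print(josephy_newton_1d(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=2.0))
# Constraint active: 0 in x + 1 + N_{R_+}(x) has the solution x = 0.
print(josephy_newton_1d(lambda x: x + 1.0, lambda x: 1.0, x0=3.0))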

 

 

Monday, 11:00 - 11:25 h, Room: H 1012, Talk 2

Robert Baier
Set-valued Newton's method for computing convex invariant sets

 

Abstract:
A new realization of Newton's method for "smooth" set-valued fixed-point problems is presented. For a dynamical system x_{k+1} = g(x_k), a convex invariant set X ⊂ R^n has to be determined with g(X) = X.
This fixed-point problem is transformed into a zero-finding problem in the Banach space of directed sets, for which Newton's method can be formulated. The cone of convex, compact subsets of R^n can be embedded into this Banach space so that the usual set arithmetic is extended and differences of embedded convex compact sets can be visualized as (usually nonconvex) subsets of R^n.

Important assumptions are the existence of a set of convex subsets whose images under g remain convex and the existence of a differentiable extension of g to directed sets. The visualization of an embedded fixed set for the transformed problem is a convex invariant set for the original problem. First examples illustrate that the convergence assumptions can be verified and that local quadratic convergence is observed even to unstable convex invariant sets, in contrast to fixed-set iterations. Further extensions of this approach are indicated.
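As a toy illustration of the reformulation idea (a deliberately simplified 1-D sketch, not the directed-set construction of the talk): for a monotone increasing scalar map g, an interval X = [a, b] satisfies g(X) = X exactly when g(a) = a and g(b) = b, so the fixed-set problem reduces to a zero-finding problem for the endpoints to which ordinary Newton's method applies.

def newton_scalar(h, dh, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration for a scalar equation h(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = h(x) / dh(x)
        x -= step
        if abs(step) < tol:
            break
    return x

g  = lambda x: x ** 3        # toy dynamics x_{k+1} = g(x_k), monotone increasing
dg = lambda x: 3.0 * x * x

# Newton applied to the endpoint residuals g(a) - a = 0 and g(b) - b = 0:
a = newton_scalar(lambda x: g(x) - x, lambda x: dg(x) - 1.0, x0=-1.3)
b = newton_scalar(lambda x: g(x) - x, lambda x: dg(x) - 1.0, x0=1.3)
print("invariant interval approx.", [a, b])   # approximately [-1, 1]

Both endpoints -1 and 1 are repelling fixed points of g, so the direct fixed-set iteration X_{k+1} = g(X_k) started from a slightly smaller interval collapses to {0}; the Newton reformulation still recovers [-1, 1], mirroring the convergence to unstable invariant sets mentioned above.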

 

 

Monday, 11:30 - 11:55 h, Room: H 1012, Talk 3

Elza Farkhi
The directed subdifferential and applications

Coauthors: Robert Baier, Vera Roshchina

 

Abstract:
The directed subdifferential of quasidifferentiable functions is introduced as the difference of two convex subdifferentials embedded in the Banach space of directed sets. While preserving the most important properties of the quasidifferential, such as exact calculus rules, the directed subdifferential avoids major drawbacks of the quasidifferential: non-uniqueness and the growth of the two convex sets representing the quasidifferential after calculus rules are applied. Its visualization, the Rubinov subdifferential, is a non-empty, generally non-convex set in R^n. Calculus rules for the directed subdifferential are derived. Important properties as well as necessary and sufficient optimality conditions for the directed subdifferential are obtained. The Rubinov subdifferential is compared with other well-known subdifferentials.
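A minimal sketch of the embedding in dimension one (an assumed simplification with illustrative names, not the authors' implementation): in R^1, directed sets can be identified with generalized intervals whose lower bound may exceed the upper bound, and embedded intervals are subtracted componentwise.

from dataclasses import dataclass

@dataclass
class DirectedInterval:
    """Toy 1-D directed set: an ordered pair (lo, hi) in which lo > hi is allowed."""
    lo: float
    hi: float
    def __sub__(self, other):
        # componentwise difference of embedded intervals
        return DirectedInterval(self.lo - other.lo, self.hi - other.hi)
    def is_inverted(self):
        return self.lo > self.hi

# f(x) = |x| - |x| (identically 0), a difference of convex functions, at x = 0:
print(DirectedInterval(-1.0, 1.0) - DirectedInterval(-1.0, 1.0))   # (0, 0) = {0}, exact
# f(x) = -|x| = 0 - |x| at x = 0:
d = DirectedInterval(0.0, 0.0) - DirectedInterval(-1.0, 1.0)
print(d, d.is_inverted())   # inverted pair (1, -1)

The first difference remains an ordinary interval and incurs no growth of the representing sets, reflecting the exact calculus rules; the second leaves the cone of embedded intervals, which is precisely the situation in which the visualization, the Rubinov subdifferential, becomes nonconvex.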

 
