## Invited Session Wed.2.H 0107

#### Wednesday, 13:15 - 14:45 h, Room: H 0107

**Cluster 16: Nonlinear programming** [...]

### Regularization techniques in optimization II

**Chair: Jacek Gondzio**

**Wednesday, 13:15 - 13:40 h, Room: H 0107, Talk 1**

**Stefania Bellavia**

Regularized Euclidean residual algorithm for nonlinear least-squares with strong local convergence properties

**Coauthor: Benedetta Morini**

**Abstract:**

This talk deals with Regularized Euclidean Residual methods for solving nonlinear least-squares problems of the form

min_{x} ||F(x)||^{2},

where *F*: ℜ^{n} → ℜ^{m}. Any relationship between *n* and *m* is allowed.

These approaches use a model of the objective function consisting of the unsquared Euclidean residual regularized by a quadratic term. The role of the regularization term is to provide global convergence of these procedures without the need to wrap them in a globalization strategy.

We will show that the introduction of the regularization term also yields fast local convergence to roots of the underlying system of nonlinear equations, even if the Jacobian is rank deficient at the solution. In fact, the methods are locally fast convergent under the weaker condition that ||*F*|| provides a local error bound around the solution. In particular, in the case *m ≥ n*, this condition allows the solution set to be locally nonunique.

Some numerical results are also presented.
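To make the flavor of such methods concrete, here is a minimal sketch, assuming a Levenberg-Marquardt-style regularized step with regularization weight proportional to ||F(x)|| — a simplification in the spirit of the talk, not the authors' algorithm. The test problem, with a rank-deficient Jacobian at the root, is a made-up illustration:

```python
import numpy as np

def regularized_step(F, J, x):
    """One regularized step for min ||F(x)||^2.

    Illustrative Levenberg-Marquardt-style step: sigma*I regularizes the
    Gauss-Newton system, with sigma tied to the residual norm so that the
    step stays well defined even when the Jacobian is rank deficient.
    (This is a simplification of the unsquared-residual model in the talk.)
    """
    Fx, Jx = F(x), J(x)
    sigma = np.linalg.norm(Fx)                # regularization ~ residual size
    s = np.linalg.solve(Jx.T @ Jx + sigma * np.eye(x.size), -Jx.T @ Fx)
    return x + s

# Hypothetical test problem: F(x) = 0 has root (1, 1), but the Jacobian
# is singular there (m = n = 2), so plain Gauss-Newton would break down.
F = lambda x: np.array([x[0] - 1.0, (x[1] - 1.0) ** 2])
J = lambda x: np.array([[1.0, 0.0], [0.0, 2.0 * (x[1] - 1.0)]])

x = np.array([2.0, 2.0])
for _ in range(50):
    x = regularized_step(F, J, x)
print(x, np.linalg.norm(F(x)))                # residual driven toward zero
```

Despite the singular Jacobian at the solution, the regularized iteration converges to the root, consistent with the local-error-bound convergence theory the abstract refers to.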

**Wednesday, 13:45 - 14:10 h, Room: H 0107, Talk 2**

**Benedetta Morini**

Preconditioning of sequences of linear systems in regularization techniques for optimization

**Coauthors: Stefania Bellavia, Valentina de Simone, Daniela di Serafino**

**Abstract:**

We build preconditioners for sequences of linear systems

(*A* + *Δ*_{k})*x*_{k} = *b*_{k},  *k* = 1, 2, …,

where *A* ∈ ℜ^{n × n} is symmetric positive semidefinite and sparse, *Δ*_{k} ∈ ℜ^{n × n} is diagonal positive semidefinite, and the systems are compatible. Such sequences arise in many optimization methods based on regularization techniques: trust-region and overestimation methods for nonlinear least-squares, and regularized affine-scaling methods for convex bound-constrained quadratic programming and bound-constrained linear least-squares.

We propose a framework for updating any symmetric positive definite preconditioner for *A*, factorized as *LDL*^{T}. The resulting preconditioners are effective on slowly varying sequences and cluster eigenvalues of the preconditioned matrix when *Δ*_{k} has sufficiently large entries. We discuss two preconditioners in this framework and show their efficiency on sequences of linear systems arising in the solution of nonlinear least-squares problems and bound-constrained convex quadratic programming.
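The updating idea can be sketched as follows. This is a minimal dense illustration of the framework — factor the seed matrix *A* once, then refresh only the diagonal factor for each *Δ*_{k} — not the authors' implementation: the seed matrix, the diagonal modifications, and the right-hand sides are made up, and a practical code would use sparse factorizations and triangular solves.

```python
import numpy as np

def ldlt(A):
    """Dense LDL^T factorization of an SPD matrix (no pivoting)."""
    n = A.shape[0]
    L, D = np.eye(n), np.zeros(n)
    for j in range(n):
        D[j] = A[j, j] - (L[j, :j] ** 2) @ D[:j]
        L[j + 1:, j] = (A[j + 1:, j] - L[j + 1:, :j] @ (D[:j] * L[j, :j])) / D[j]
    return L, D

def pcg(A, b, apply_Pinv, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradients; apply_Pinv applies P^{-1}."""
    x = np.zeros_like(b)
    r = b.copy()
    z = apply_Pinv(r)
    p, rz, it = z.copy(), r @ z, 0
    while np.linalg.norm(r) > tol * np.linalg.norm(b) and it < maxit:
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        z = apply_Pinv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz, it = rz_new, it + 1
    return x, it

n = 200                                    # illustrative problem size
A = (np.diag(np.full(n, 3.0))              # SPD, diagonally dominant seed
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
L, D = ldlt(A)                             # factorized once, reused below

rng = np.random.default_rng(0)
for k in range(3):                         # a short sequence of systems
    Delta = rng.uniform(0.0, 10.0, n)      # hypothetical diagonal modification
    Ak = A + np.diag(Delta)
    b = rng.standard_normal(n)

    # Updated preconditioner P_k = L (D + Delta_k) L^T: only the diagonal
    # factor is refreshed; L is reused from the seed factorization.
    d_k = D + Delta
    def apply_Pinv(r):
        return np.linalg.solve(L.T, np.linalg.solve(L, r) / d_k)

    x_k, it_k = pcg(Ak, b, apply_Pinv)
    print(f"system {k}: converged in {it_k} PCG iterations")
```

The point of the framework is that each *P*_{k} costs only a diagonal update rather than a fresh factorization; the clustering and efficiency claims are those of the talk, not of this toy example.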

**Wednesday, 14:15 - 14:40 h, Room: H 0107, Talk 3**

**Serge Gratton**

Preconditioning inverse problems using duality

**Coauthors: Selime Gurol, Philippe L. Toint, Jean Tshimanga**

**Abstract:**

The problem considered in this talk is the data assimilation problem arising in weather forecasting and oceanography, which consists in estimating the initial condition of a dynamical system whose future behaviour is to be predicted. More specifically, new optimization techniques will be discussed for the iterative solution of the particular nonlinear least-squares formulation of this inverse problem known under the name of 4DVAR, for four-dimensional variational data assimilation.

These new methods are designed to decrease the computational cost in applications where the number of variables involved is expected to exceed 10^{9}. They exploit the problem's underlying geometrical structure to reformulate standard trust-region techniques into significantly cheaper variants. Preconditioning issues adapted to the considered systems of equations will also be discussed; these likewise depend on the problem's geometrical structure and exploit limited-memory techniques in a novel way.
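Limited-memory preconditioners build a low-rank correction to the identity from a few directions gathered during the iteration. As a hedged illustration of the general idea, the sketch below implements the classical spectral limited-memory preconditioner built from a few eigenpairs; the matrix and the eigenpair source (an exact eigendecomposition) are illustrative stand-ins — in data assimilation one would use approximate Ritz pairs from a previous CG/Lanczos run, and the novel scheme of the talk is not reproduced here.

```python
import numpy as np

def spectral_lmp(theta, V):
    """Spectral limited-memory preconditioner from k (approximate)
    eigenpairs (theta_i, v_i) of an SPD matrix A:

        P = I + V (Theta^{-1} - I) V^T.

    With exact eigenpairs, P maps the k targeted eigenvalues of A to 1
    and leaves the rest of the spectrum unchanged, so only k vectors
    need to be stored (the "limited memory")."""
    n = V.shape[0]
    return np.eye(n) + V @ np.diag(1.0 / theta - 1.0) @ V.T

rng = np.random.default_rng(1)
n, k = 100, 5
M = rng.standard_normal((n, n))
A = M @ M.T / n + np.eye(n)              # illustrative SPD stand-in

w, V = np.linalg.eigh(A)                 # exact eigenpairs, for illustration
P = spectral_lmp(w[-k:], V[:, -k:])      # deflate the k largest eigenvalues

evals = np.sort(np.real(np.linalg.eigvals(P @ A)))
# The k targeted eigenvalues of P A are mapped to 1; the largest remaining
# one is w[-(k+1)], so the spread of the spectrum shrinks.
print(w[-1], "->", evals[-1])
```

Storing and applying *P* requires only the *k* pairs, which is what makes such preconditioners attractive when the number of variables is of the order of 10^{9}.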