## Invited Session Tue.1.H 3503

#### Tuesday, 10:30 - 12:00 h, Room: H 3503

**Cluster 6: Derivative-free & simulation-based optimization** [...]

### Derivative-free optimization and constraints

**Chairs: Stefan Wild and Luís Nunes Vicente**

**Tuesday, 10:30 - 10:55 h, Room: H 3503, Talk 1**

**Giovanni Fasano**

An exact penalty method for constrained Lipschitz optimization

**Coauthors: Giampaolo Liuzzi, Stefano Lucidi, Francesco Rinaldi**

**Abstract:**

In this work we consider the minimization of a real function subject to inequality constraints along with bound constraints on the variables, where both the objective function and the constraints are assumed to be Lipschitz continuous. We first study the solution of a bound-constrained minimization problem and propose a line-search-type derivative-free method for its solution. Then, to take the nonlinear constraints into account, we consider the minimization of a new Lipschitz continuous exact penalty function subject to bound constraints. We prove the equivalence of the original inequality-constrained problem with the penalized problem subject to bound constraints. In particular, we show that, using our derivative-free line-search approach, global convergence to Clarke-stationary points is guaranteed for the penalized problem, and hence also for the original constrained problem. We conclude with numerical experiments on significant test problems, showing the reliability of our proposal.
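The two ingredients of the abstract — an exact penalty that folds the inequality constraints into the objective, and a derivative-free line search on the resulting bound-constrained problem — can be sketched as follows. This is a minimal illustration only: the coordinate search directions and the doubling/halving stepsize rules are simplifying assumptions, not the authors' actual algorithm.

```python
import numpy as np

def exact_penalty(f, gs, eps):
    """Lipschitz exact penalty: f(x) + (1/eps) * total violation of g_i(x) <= 0.
    For eps small enough, minimizers of P over the box coincide with
    solutions of the original inequality-constrained problem."""
    def P(x):
        return f(x) + sum(max(0.0, g(x)) for g in gs) / eps
    return P

def df_linesearch_min(P, x0, lo, hi, alpha0=0.5, tol=1e-6, max_iter=200):
    """Derivative-free line search on a bound-constrained problem.

    Tries +/- steps along each coordinate, expanding the stepsize while it
    keeps improving and halving it otherwise (a simplified line-search
    scheme, assumed here for illustration)."""
    x = np.clip(np.asarray(x0, float), lo, hi)
    n = x.size
    alpha = np.full(n, alpha0)
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for s in (+1.0, -1.0):
                y = x.copy()
                y[i] = np.clip(x[i] + s * alpha[i], lo[i], hi[i])
                if P(y) < P(x) - 1e-12:
                    x, improved = y, True
                    alpha[i] *= 2.0   # successful step: expand
                    break
            else:
                alpha[i] *= 0.5       # no improvement: shrink
        if not improved and alpha.max() < tol:
            break
    return x
```

For example, minimizing *f(x) = x₁² + x₂²* subject to *1 − x₁ ≤ 0* on the box *[−2, 2]²* with a small penalty parameter drives the iterates to the constrained minimizer *(1, 0)*, with no derivatives evaluated.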

**Tuesday, 11:00 - 11:25 h, Room: H 3503, Talk 2**

**Kevin Kofler**

Derivative-free optimization with equality constraints using data analysis

**Coauthors: Arnold Neumaier, Hermann Schichl**

**Abstract:**

This talk will present an algorithm (BBOWDA - Black Box Optimization With Data Analysis) we developed to solve constrained black-box optimization problems globally. Our techniques require neither gradients nor direct derivative approximations. Instead, we approximate the functions by a quadratic version of covariance models from data analysis. A particular focus is on constraints: in addition to bound constraints, we also handle black-box inequality and equality constraints. In particular, we support equality constraints given in implicit form *f(x)=0*, where *f* is a black-box function and *x* a vector of one or more variables. This is achieved by bounding those implicit equality constraints with quadratic approximations obtained via linear programming. We thus obtain surrogate models that we can solve with derivative-based optimization software. Finally, we attempt a heuristic global search by another method from data analysis: we use Gaussian mixture models to locate holes in the search space to fill with sample points. Our approach is particularly tuned for problems where function evaluations are expensive: it requires significantly fewer function evaluations than evolutionary algorithms.
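The surrogate idea underlying the abstract — replace expensive black-box evaluations by a quadratic model fitted to sampled data, then optimize the cheap model — can be sketched as below. This is an illustrative stand-in only: BBOWDA builds its models from covariance models of data analysis and handles constraints via linear programming, whereas here we assume a plain least-squares quadratic fit and a crude grid search on the bound box.

```python
import numpy as np

def fit_quadratic_surrogate(X, y):
    """Least-squares quadratic surrogate m(x) of sampled black-box data,
    using the monomial basis {1, x1, x2, x1^2, x2^2, x1*x2} in 2-D.
    (Assumption: a simple stand-in for BBOWDA's covariance-based models.)"""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    def m(p):
        u, v = p
        return coef @ np.array([1.0, u, v, u * u, v * v, u * v])
    return m

def minimize_on_box(m, lo, hi, n=101):
    """Minimize the surrogate on the bound box by grid search.
    (A real implementation would hand the smooth model to a
    derivative-based solver instead.)"""
    best, best_p = np.inf, None
    for u in np.linspace(lo[0], hi[0], n):
        for v in np.linspace(lo[1], hi[1], n):
            val = m((u, v))
            if val < best:
                best, best_p = val, (u, v)
    return np.array(best_p), best
```

Fitting such a model to 30 samples of a smooth function and minimizing the surrogate recovers a good candidate minimizer while spending no further evaluations of the expensive black box.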

**Tuesday, 11:30 - 11:55 h, Room: H 3503, Talk 3**

**M. J. D. Powell**

On derivative-free optimization with linear constraints

**Abstract:**

The current research of the speaker is on optimization without derivatives when there are linear constraints on the variables. Many features of his NEWUOA software for unconstrained optimization are retained, but it is necessary to include the linear constraints in the subproblem that minimizes the current quadratic model approximately within a trust region. Truncated conjugate gradients is still chosen for solving this subproblem, a restart being made if the usual steplength of an iteration has to be reduced in order to prevent a constraint violation. Each restart gives a smaller subproblem that is regarded as unconstrained after using active constraints to eliminate some of the variables. The active set of the first of these subproblems is chosen carefully, so that the steplength of the first conjugate gradient iteration cannot be made arbitrarily small by the need for feasibility. The progress of this work will be reported, with some preliminary numerical results.
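The core subproblem of the abstract — truncated conjugate gradients on the quadratic model inside a trust region, with the steplength cut when a linear constraint would be violated — can be sketched as follows. This is a simplified illustration: where the method described in the talk restarts on a smaller subproblem after eliminating variables via the active constraints, this sketch only truncates the step and reports which constraint became active.

```python
import numpy as np

def _to_tr_boundary(d, p, delta):
    """Largest t >= 0 with ||d + t*p|| = delta (positive quadratic root)."""
    a, b, c = p @ p, 2 * d @ p, d @ d - delta**2
    return (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)

def truncated_cg(H, g, delta, A=None, b=None, tol=1e-8, max_iter=50):
    """Truncated CG for min g.d + 0.5 d'Hd s.t. ||d|| <= delta, A d <= b.

    Returns (d, blocking): blocking is the index of the linear constraint
    that cut the steplength, or None. (A full method, as in the talk,
    would restart CG on the subspace of the active constraints.)"""
    d = np.zeros_like(g)
    r = g.copy()              # model gradient at d
    p = -r
    for _ in range(max_iter):
        Hp = H @ p
        pHp = p @ Hp
        if pHp <= 0:          # nonpositive curvature: go to the TR boundary
            return d + _to_tr_boundary(d, p, delta) * p, None
        alpha = (r @ r) / pHp
        # cut the step at the trust-region boundary ...
        alpha = min(alpha, _to_tr_boundary(d, p, delta))
        # ... and at the nearest linear constraint
        blocking = None
        if A is not None:
            Ad, Ap = A @ d, A @ p
            for i in range(len(b)):
                if Ap[i] > 1e-14:
                    step = (b[i] - Ad[i]) / Ap[i]
                    if step < alpha:
                        alpha, blocking = step, i
        d_new = d + alpha * p
        if blocking is not None or np.linalg.norm(d_new) >= delta - 1e-12:
            return d_new, blocking
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) < tol:
            return d_new, None
        beta = (r_new @ r_new) / (r @ r)
        d, r, p = d_new, r_new, -r_new + beta * p
    return d, None
```

With *H = I* and *g = (−1, −1)*, the unconstrained model minimizer *(1, 1)* is reached in one CG iteration; adding the constraint *d₁ ≤ 0.5* cuts the same step at the constraint boundary, which is exactly the situation that triggers a restart in the full method.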