Invited Session Thu.3.H 3003A

Thursday, 15:15 - 16:45 h, Room: H 3003A

Cluster 6: Derivative-free & simulation-based optimization

Recent progress in direct search methods

 

Chairs: Luís Nunes Vicente and Stefan Wild

 

 

Thursday, 15:15 - 15:40 h, Room: H 3003A, Talk 1

Sébastien Le Digabel
The mesh adaptive direct search algorithm with reduced number of directions

Coauthors: Charles Audet, Andrea Ianni, Christophe Tribes

 

Abstract:
The Mesh Adaptive Direct Search (MADS) class of algorithms is designed for blackbox optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. The core of each iteration consists of launching the simulation at a finite number of trial points constructed from MADS directions. The current, efficient implementation of MADS uses 2n directions at each iteration, where n is the number of variables. The present work reduces that number to a minimal positive spanning set of n+1 directions. The transformation is generic and can be applied to any method that generates more than n+1 MADS directions.
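
The reduction from 2n to n+1 directions hinges on minimal positive spanning sets. As a hedged illustration (a sketch under our own assumptions, not the authors' NOMAD implementation), the Python fragment below builds such a set from an arbitrary basis of R^n by appending the negated sum of the basis vectors; the helper name is ours:

    # Illustrative construction of a minimal positive spanning set of
    # n+1 directions; not taken from the MADS codebase.
    import numpy as np

    def minimal_positive_spanning_set(basis):
        """Rows of `basis` are n linearly independent directions in R^n;
        appending their negated sum yields n+1 directions that positively
        span R^n (every vector is a nonnegative combination of them)."""
        extra = -basis.sum(axis=0)
        return np.vstack([basis, extra])

    # Example: from the coordinate basis of R^3, the n+1 = 4 directions
    # are e1, e2, e3 and -(e1 + e2 + e3).
    print(minimal_positive_spanning_set(np.eye(3)))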

 

 

Thursday, 15:45 - 16:10 h, Room: H 3003A, Talk 2

José Mario Martínez
Inexact restoration method for derivative-free optimization with smooth constraints

Coauthors: Luís Felipe Bueno, Ana Friedlander, Francisco N. C. Sobral

 

Abstract:
A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, in which each iteration is divided into two phases. The first phase considers only the constraints, in order to improve feasibility. The second phase minimizes a suitable objective function subject to a linear approximation of the constraints, and must be solved using derivative-free methods; an algorithm recently introduced by Kolda, Lewis, and Torczon for linearly constrained derivative-free optimization is employed for this purpose. Under usual assumptions, convergence to stationary points is proved. A computer implementation is described and numerical experiments are presented.
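
The two-phase structure can be made concrete with a small sketch. The code below is our own toy rendering of one Inexact Restoration iteration on an equality-constrained problem, with a plain tangent-space coordinate search standing in for the Kolda-Lewis-Torczon method in the second phase; all names and parameter values are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def ir_step(f, h, jac_h, x, step=0.5, tol=1e-8):
        # Phase 1 (restoration): a Gauss-Newton step on the constraints
        # h(x) = 0, using their available derivatives to improve feasibility.
        y = x - np.linalg.lstsq(jac_h(x), h(x), rcond=None)[0]
        # Phase 2 (minimization): derivative-free descent on f restricted to
        # the linearized constraints, i.e. the null space of jac_h(y).
        J = jac_h(y)
        Z = np.linalg.svd(J)[2][J.shape[0]:].T  # columns span the tangent space
        best = y
        while step > tol:
            trials = [best + step * s * Z[:, j]
                      for j in range(Z.shape[1]) for s in (1.0, -1.0)]
            better = [t for t in trials if f(t) < f(best)]  # f-values only
            if better:
                best = min(better, key=f)
                break
            step /= 2.0
        return best

    # Toy usage: minimize a black-box f over the circle x1^2 + x2^2 = 1.
    f = lambda x: (x[0] - 2.0)**2 + x[1]**2
    h = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
    jac_h = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])
    x = np.array([0.5, 0.5])
    for _ in range(30):
        x = ir_step(f, h, jac_h, x)   # iterates approach (1, 0)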

 

 

Thursday, 16:15 - 16:40 h, Room: H 3003A, Talk 3

Rohollah Garmanjani
Smoothing and worst case complexity for direct-search methods in non-smooth optimization

Coauthor: Luís Nunes Vicente

 

Abstract:
For smooth objective functions, it has been shown that the worst case cost of direct-search methods is of the same order as that of steepest descent. Motivated by the lack of such a result in the non-smooth case, we propose, analyze, and test a class of smoothing direct-search methods for the optimization of non-smooth functions. Given a parameterized family of smoothing functions for the non-smooth objective function, this class of methods consists of applying a direct search for a fixed value of the smoothing parameter until the step size is relatively small, after which the smoothing parameter is reduced and the process is repeated. One can show that the worst case complexity (or cost) of this procedure is roughly one order of magnitude worse than that of direct search or steepest descent on smooth functions. The class of smoothing direct-search methods is also shown to enjoy asymptotic global convergence properties. Numerical experience indicates that this approach leads to better values of the objective function, apparently without additional cost in the number of function evaluations.
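
A hedged sketch of this loop, under assumptions of our own choosing (the smoothing family, here a standard one for the 1-norm, and all constants are illustrative and not taken from the paper):

    import numpy as np

    def f_mu(x, mu):
        # Smoothing of the non-smooth objective f(x) = ||x||_1;
        # f_mu converges to f as the smoothing parameter mu -> 0.
        return np.sum(np.sqrt(x**2 + mu**2))

    def smoothing_direct_search(x, mu=0.1, mu_min=1e-6):
        n = len(x)
        dirs = np.vstack([np.eye(n), -np.eye(n)])   # 2n poll directions
        while mu > mu_min:
            alpha = 1.0
            # Direct search on f_mu for this fixed mu, stopped once the
            # step size is small relative to the smoothing parameter.
            while alpha > mu:
                polled = [x + alpha * d for d in dirs]
                best = min(polled, key=lambda p: f_mu(p, mu))
                if f_mu(best, mu) < f_mu(x, mu):
                    x = best
                else:
                    alpha /= 2.0
            mu /= 10.0   # reduce the smoothing parameter and repeat
        return x

    # Iterates approach 0, the minimizer of ||x||_1.
    print(smoothing_direct_search(np.array([3.0, -2.0])))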

 
