Invited Session Thu.2.H 3003A

Thursday, 13:15 - 14:45 h, Room: H 3003A

Cluster 6: Derivative-free & simulation-based optimization [...]

Addressing noise in derivative-free optimization

 

Chairs: Luís Nunes Vicente and Stefan Wild

 

 

Thursday, 13:15 - 13:40 h, Room: H 3003A, Talk 1

Stefan Wild
Computational noise in simulation-based optimization

 

Abstract:
Efficient simulation of complex phenomena often results in computational noise. Noise destroys the underlying smoothness that could otherwise benefit optimization algorithms. We present a non-intrusive method for estimating computational noise and show how these noise estimates can be used to derive finite-difference estimates with provable approximation guarantees. Building upon these results, we show how step sizes for model minimization and improvement can be selected. These techniques can also be used to determine when to transition from interpolation-based to regression-based surrogate models in derivative-free optimization.
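
As a rough illustration of the kind of procedure described above (a sketch in the spirit of the Moré-Wild difference-table approach to computational noise; the function names, the median aggregation, and the error-bound step rule are illustrative assumptions, not necessarily the talk's actual method), one can estimate the noise level from higher-order differences of function values along a line, then choose a forward-difference step of order sqrt(noise/|f''|):

    import math
    import numpy as np

    def estimate_noise(f, x, direction, h=1e-6, m=8):
        # Sample f at m+1 equally spaced points x + i*h*direction.
        t = np.array([f(x + i * h * direction) for i in range(m + 1)])
        estimates = []
        for k in range(1, m + 1):
            t = np.diff(t)  # after this line, t holds k-th order differences
            gamma = math.factorial(k) ** 2 / math.factorial(2 * k)
            estimates.append(math.sqrt(gamma * np.mean(t ** 2)))
        # Crude aggregation for this sketch; a careful implementation
        # instead picks the order k at which the estimates stabilize.
        return float(np.median(estimates))

    def fd_step(noise_level, curvature_bound):
        # Minimize the forward-difference error bound h*mu2/2 + 2*eps_f/h;
        # the minimizer is h = 2*sqrt(eps_f/mu2).
        return 2.0 * math.sqrt(noise_level / curvature_bound)

    # Example: a quadratic with simulated stochastic noise of size ~1e-8.
    rng = np.random.default_rng(0)
    f = lambda z: float(z @ z) + 1e-8 * rng.standard_normal()
    eps_f = estimate_noise(f, np.zeros(3), np.array([1.0, 0.0, 0.0]))
    print("noise estimate:", eps_f, " FD step:", fd_step(eps_f, 2.0))

The trade-off behind fd_step is classical: the truncation error of a forward difference grows like h|f''|/2 while the noise contribution decays like 2*eps_f/h, and balancing the two gives a step proportional to sqrt(eps_f/|f''|).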

 

 

Thursday, 13:45 - 14:10 h, Room: H 3003A, Talk 2

Stephen C. Billups
Managing the trust region and sample set for regression-model-based methods for optimizing noisy functions without derivatives

 

Abstract:
The presence of noise or uncertainty in function evaluations can negatively impact the performance of model-based trust-region algorithms for derivative-free optimization. One remedy is to use regression models, which are less sensitive to noise, and this approach can be further enhanced by weighted regression. But this raises the questions of how to efficiently select sample points for model construction and how to manage the trust-region radius while taking noise into account. This talk proposes strategies for addressing these questions and presents an algorithm based on them.
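
For concreteness, here is a minimal sketch of a weighted-regression quadratic model of the kind the abstract alludes to, assuming a simple distance-based weighting relative to the trust-region center (the weighting function and helper names are hypothetical; the talk's specific scheme may differ):

    import numpy as np

    def quadratic_basis(y):
        # Monomial basis [1, y_i, y_i*y_j (i <= j)] of a full quadratic in R^n.
        n = len(y)
        cross = [y[i] * y[j] for i in range(n) for j in range(i, n)]
        return np.concatenate(([1.0], y, cross))

    def weighted_regression_model(points, fvals, center, radius):
        # Fit m(y) ~ f(center + y) by weighted least squares: rows of the
        # design matrix and the right-hand side are scaled by sqrt(w_i).
        Y = np.array([quadratic_basis(p - center) for p in points])
        # Hypothetical weights: full weight inside the trust region,
        # downweighted outside; the talk's actual scheme may differ.
        d = np.array([np.linalg.norm(p - center) for p in points])
        w = np.minimum(1.0, radius / np.maximum(d, 1e-12))
        sw = np.sqrt(w)
        coeffs, *_ = np.linalg.lstsq(sw[:, None] * Y, sw * np.asarray(fvals),
                                     rcond=None)
        return coeffs  # model coefficients in the monomial basis

With all weights equal to one this reduces to ordinary least-squares regression; interpolation corresponds to using exactly as many well-poised points as basis functions.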

 

 

Thursday, 14:15 - 14:40 h, Room: H 3003A, Talk 3

Anke Tröltzsch
A model-based trust-region algorithm for derivative-free optimization and its adaptation to handle noisy functions and gradients

Coauthors: Serge Gratton, Philippe L. Toint

 

Abstract:
Optimization algorithms are crucial for solving industrial optimization problems, which are characterized by widely varying requirements. Depending on whether gradient information is available, different classes of algorithms apply, such as derivative-free optimization (DFO) or gradient-based methods. The software BC-DFO (Bound-Constrained Derivative-Free Optimization), which uses a self-correcting property of the sample-set geometry and an active-set strategy to handle bound constraints, has been shown to be efficient on a set of test problems from the CUTEr collection. Here, we propose to extend this code by adding the ability to handle noisy gradient information. It is well known that the L-BFGS method is very efficient for solving bound-constrained optimization problems when accurate gradient information is provided; in practice, however, this is often not the case. We therefore propose a family of algorithms that contains both the derivative-free approach and the L-BFGS method, and which is thus able to optimally take into account the error occurring in the cost function and/or the gradient of the problem. We will present numerical experiments on academic and real-life test cases.
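
To make the premise concrete (L-BFGS is efficient when gradients are accurate but can degrade when they are noisy), here is a small self-contained experiment using SciPy's L-BFGS-B on the Rosenbrock function with artificially perturbed gradients. It only demonstrates the phenomenon motivating the talk, not the proposed BC-DFO/L-BFGS family:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def rosenbrock(x):
        return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

    def rosenbrock_grad(x, sigma):
        g = np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                      200.0 * (x[1] - x[0] ** 2)])
        return g + sigma * rng.standard_normal(2)  # additive gradient noise

    x0 = np.array([-1.2, 1.0])
    for sigma in (0.0, 1e-2, 1.0):
        res = minimize(rosenbrock, x0, jac=lambda x: rosenbrock_grad(x, sigma),
                       method="L-BFGS-B", bounds=[(-2.0, 2.0), (-2.0, 2.0)])
        print(f"gradient noise {sigma:g}: f = {res.fun:.3e}, iterations = {res.nit}")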

 
