Invited Session Wed.3.H 0107

Wednesday, 15:15 - 16:45 h, Room: H 0107

Cluster 16: Nonlinear programming [...]

Line-search strategies

 

Chair: José Mario Martínez

 

 

Wednesday, 15:15 - 15:40 h, Room: H 0107, Talk 1

Ernesto G. Birgin
Spectral projected gradients: Reviewing ten years of applications

 

Abstract:
The Spectral Projected Gradient (SPG) method minimizes a smooth function over a convex set onto which projections can be computed inexpensively. SPG combines projected-gradient directions with the spectral steplength and nonmonotone line searches. Since its introduction in 2000, many successful applications in a variety of fields have been reported, including Machine Learning, Medical Imaging, Meteorology, and Image Reconstruction, notably Compressive Sensing, to name a few. In this talk, some of those applications will be reviewed and analyzed.
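
To fix ideas, here is a minimal Python sketch of the ingredients named in the abstract: a Barzilai-Borwein (spectral) steplength combined with a nonmonotone Armijo search over the last few function values. It is an illustration based on the abstract, not the speaker's implementation; f, grad, and project are assumed user-supplied callables.

import numpy as np

def spg(f, grad, project, x0, max_iter=1000, tol=1e-6,
        memory=10, alpha_min=1e-10, alpha_max=1e10):
    """Illustrative SPG sketch: spectral step + nonmonotone line search."""
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0
    f_hist = [f(x)]
    for _ in range(max_iter):
        # Projected-gradient direction scaled by the spectral steplength.
        d = project(x - alpha * g) - x
        if np.linalg.norm(d, np.inf) <= tol:
            break
        # Nonmonotone Armijo: compare against the max of recent values.
        f_ref = max(f_hist[-memory:])
        gtd = g @ d  # < 0: d is a descent direction
        lam = 1.0
        for _ in range(50):  # safeguarded backtracking
            if f(x + lam * d) <= f_ref + 1e-4 * lam * gtd:
                break
            lam *= 0.5
        x_new = x + lam * d
        g_new = grad(x_new)
        # Barzilai-Borwein (spectral) steplength update.
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = np.clip(s @ s / sy, alpha_min, alpha_max) if sy > 0 else alpha_max
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x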

 

 

Wednesday, 15:45 - 16:10 h, Room: H 0107, Talk 2

Sandra Santos
An adaptive spectral approximation-based algorithm for nonlinear least-squares problems

Coauthors: Marcia A. Gomes-Ruggiero, Douglas S. Goncalves

 

Abstract:
In this work we propose an adaptive algorithm for solving nonlinear least-squares problems, based on scalar spectral matrices employed in the approximation of the residual Hessians. Besides regularizing the Gauss-Newton step and providing an automatic update of the so-called Levenberg-Marquardt parameter, the spectral approximation has a quasi-Newton flavour, incorporating second-order information along the generated directions obtained from the already computed first-order derivatives. A nonmonotone line search strategy is employed to ensure global convergence, and a local convergence analysis is provided as well. Comparative numerical experiments with the routines LMDER and NL2SOL on two collections of problems from the literature put the approach into perspective and indicate its effectiveness.
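
The following Python sketch shows one plausible reading of the core step: a Gauss-Newton system regularized by a scalar spectral parameter that mimics the omitted second-order term. The secant choice for the spectral scalar and all names are illustrative assumptions, not the authors' algorithm; the nonmonotone line search mentioned in the abstract is omitted for brevity.

import numpy as np

def spectral_lm(residual, jacobian, x0, max_iter=100, tol=1e-8):
    """Sketch: Levenberg-Marquardt parameter from a scalar spectral
    approximation of the second-order term (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    r, J = residual(x), jacobian(x)
    g = J.T @ r
    mu = 1.0  # initial regularization parameter (assumed choice)
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Regularized Gauss-Newton step: (J^T J + mu I) d = -J^T r.
        d = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -g)
        x_new = x + d
        r_new, J_new = residual(x_new), jacobian(x_new)
        # Scalar spectral estimate of the omitted second-order term
        # S(x) = sum_i r_i(x) Hess r_i(x): pick mu so that mu*s best
        # matches the secant residue z along the step s.
        s = x_new - x
        z = (J_new - J).T @ r_new
        mu = max(abs(s @ z) / (s @ s), 1e-12)
        x, r, J = x_new, r_new, J_new
        g = J.T @ r
    return x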

 

 

Wednesday, 16:15 - 16:40 h, Room: H 0107, Talk 3

Natasa Krejic
Nonmonotone line search methods with variable sample sizes

Coauthor: Natasa Krklec

 

Abstract:
Nonmonotone line search methods for the minimization of unconstrained objective functions given in the form of a mathematical expectation are considered. Nonmonotone schemes can improve the likelihood of finding a global minimizer as well as the convergence speed. The Sample Average Approximation (SAA) method transforms the expectation objective function into a real-valued deterministic function using a large sample in each iteration; the main drawback of this approach is its cost. We analyze a couple of nonmonotone line search strategies with variable sample sizes. Two measures of progress, lack of precision and functional decrease, are calculated at each iteration, and a new sample size is determined based on these two measures. An additional safeguard rule is imposed to ensure the consistency of the linear models obtained with different samples. The rule we present allows the sample size to increase or decrease in each iteration until some neighborhood of the solution is reached. After that, the maximal sample size is used, so the variable sample size strategy generates a solution of the same quality as the SAA method but with a significantly smaller number of function evaluations.
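
As a rough illustration of the variable-sample-size idea (the specific rule, thresholds, and names below are assumptions, not the authors' method), the Python sketch performs one iteration: it measures the decrease of the sample average against a precision estimate of that average and grows or shrinks the sample accordingly.

import numpy as np

def vss_step(F, grad_F, xi_pool, x, N, N_max, eta=0.9):
    """One illustrative iteration of a variable-sample-size SAA scheme.
    F(x, xi) is one scenario's cost; xi_pool is the full sample."""
    xi = xi_pool[:N]
    fvals = np.array([F(x, s) for s in xi])
    f_N = fvals.mean()  # sample average objective
    # Lack of precision: confidence half-width of the sample mean.
    eps_N = 1.96 * fvals.std(ddof=1) / np.sqrt(N)
    g_N = np.mean([grad_F(x, s) for s in xi], axis=0)
    # Backtracking Armijo search on the current sample average
    # (a monotone stand-in for the nonmonotone schemes of the talk).
    d, t = -g_N, 1.0
    for _ in range(30):
        if np.mean([F(x + t * d, s) for s in xi]) <= f_N + 1e-4 * t * (g_N @ d):
            break
        t *= 0.5
    x_new = x + t * d
    dec = f_N - np.mean([F(x_new, s) for s in xi])  # functional decrease
    # Sample-size rule (assumed): if the decrease dominates the
    # imprecision, a smaller sample suffices; otherwise enlarge it,
    # capped at the maximal sample size N_max.
    if dec > eta * eps_N:
        N = max(N // 2, 10)
    else:
        N = min(2 * N, N_max)
    return x_new, N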

 
