Contributed Session Mon.3.H 0112

Monday, 15:15 - 16:45 h, Room: H 0112

Cluster 16: Nonlinear programming

Unconstrained optimization I

 

Chair: Roummel Marcia

 

 

Monday, 15:15 - 15:40 h, Room: H 0112, Talk 1

Saman Babaie-Kafaki
A modification on the Hager-Zhang conjugate gradient method

 

Abstract:
Conjugate gradient (CG) methods comprise a class of unconstrained optimization algorithms
characterized by low memory requirements and strong global convergence properties, which have made
them popular among engineers and mathematicians engaged in solving large-scale unconstrained
optimization problems. One efficient CG method was proposed by Hager and Zhang. Here, a singular
value analysis is carried out to find lower and upper bounds on the condition number of the matrix
that generates the search directions of the Hager-Zhang method. Then, based on the insight gained
from this analysis, a modified version of the Hager-Zhang method is proposed that switches
adaptively from the Hager-Zhang method to the Hestenes-Stiefel method when this condition number is
large. It can be shown that if the line search fulfills the strong Wolfe conditions, then the
proposed method is globally convergent for uniformly convex objective functions. Numerical
experiments on a set of unconstrained optimization test problems from the CUTEr collection
demonstrate the efficiency of the suggested adaptive CG method in the sense of the performance
profile introduced by Dolan and Moré.
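
For reference, the Hager-Zhang (HZ) and Hestenes-Stiefel (HS) update parameters referenced above
take the following standard forms in the CG literature (notation assumed here, not taken from the
talk itself): with gradient g_k, search direction d_k, and y_k = g_{k+1} - g_k,

  d_{k+1} = -g_{k+1} + \beta_k d_k,
  \beta_k^{HZ} = \frac{1}{d_k^\top y_k} \left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^\top y_k} \right)^{\top} g_{k+1},
  \beta_k^{HS} = \frac{g_{k+1}^\top y_k}{d_k^\top y_k}.

Dropping the second term inside the parentheses of \beta_k^{HZ} recovers \beta_k^{HS}, which is the
pair of formulas the adaptive switch moves between.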

 

 

Monday, 15:45 - 16:10 h, Room: H 0112, Talk 2

Tove Odland
On the relationship between quasi-Newton methods and the conjugate gradient method

Coauthor: Anders Forsgren

 

Abstract:
It is well known that a quasi-Newton method using any well-defined
update from the Broyden class of updates and the conjugate gradient
method produce the same iterates on a quadratic objective function with
a positive-definite Hessian. In this case both methods produce directions
that are conjugate with respect to the Hessian. This equivalence does not
hold for an arbitrary quasi-Newton method. We discuss more precisely what
the updates in a quasi-Newton method need to satisfy in order to give rise
to this behavior.
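
For context, the Broyden class mentioned above is the standard one-parameter family of
Hessian-approximation updates (standard notation, not taken from the talk): with
s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k,

  B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k} + \phi_k \, (s_k^\top B_k s_k) \, v_k v_k^\top,
  \quad v_k = \frac{y_k}{y_k^\top s_k} - \frac{B_k s_k}{s_k^\top B_k s_k},

where \phi_k = 0 recovers the BFGS update and \phi_k = 1 the DFP update.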

 

 

Monday, 16:15 - 16:40 h, Room: H 0112, Talk 3

Roummel Marcia
Limited-memory BFGS with diagonal updates

Coauthor: Jennifer Erway

 

Abstract:
We investigate a formula for solving limited-memory BFGS quasi-Newton Hessian systems with full-rank diagonal updates. Under certain conditions, the system can be solved via a recursion that uses only vector inner products. This approach has broad applications in trust-region and barrier methods.
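
A recursion built from vector inner products is familiar from the classical L-BFGS two-loop
recursion, sketched below in Python for orientation. This is the standard recursion for applying an
inverse Hessian approximation with a diagonal initial matrix (Nocedal and Wright, Alg. 7.4), not
the authors' specific solver for diagonally updated systems; the function name and interface are
illustrative assumptions.

  import numpy as np

  def lbfgs_two_loop(g, s_list, y_list, d0):
      # Apply the L-BFGS inverse Hessian approximation H_k to the vector g.
      # s_list, y_list hold the stored pairs s_i = x_{i+1} - x_i and
      # y_i = g_{i+1} - g_i (oldest first); d0 is the diagonal of H_0.
      rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
      q = g.copy()
      alphas = []
      # First loop: newest pair to oldest, using only inner products.
      for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
          alpha = rho * np.dot(s, q)
          q -= alpha * y
          alphas.append(alpha)
      r = d0 * q  # multiply by the diagonal initial matrix H_0
      # Second loop: oldest pair to newest.
      for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
          beta = rho * np.dot(y, r)
          r += (alpha - beta) * s
      return r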

 
