**Wednesday, 15:45 - 16:10 h, Room: H 2036**

**Clovis Gonzaga**

On the complexity of steepest descent algorithms for minimizing quadratic functions

**Abstract:**

We discuss the question of how fast a steepest descent algorithm can be for minimizing a convex quadratic function. We do not tackle the general case of convex differentiable functions, which is more difficult. Steepest descent methods differ only in the choice of step length at each iteration. We examine patterns in the distribution of these step lengths when minimizing a convex quadratic function. We show that a large number of short steps is needed, and how these relate to the much smaller number of large steps. We note that the order in which the step lengths are used is irrelevant, and show a worst-case example with a small number of variables. We also construct a brute-force algorithm that is optimal in a certain sense, and compare it with known algorithms.
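The family of methods the abstract refers to can be illustrated with a short sketch. This is not the author's algorithm, only a generic steepest descent iteration for f(x) = ½ xᵀAx − bᵀx in which the step-length rule is the sole degree of freedom: passing no sequence uses the classical exact (Cauchy) step, while passing a prescribed sequence of step lengths reproduces the setting of the talk. The function name and the toy problem below are illustrative assumptions.

```python
import numpy as np

def steepest_descent(A, b, x0, steps=None, iters=100):
    """Steepest descent for f(x) = 0.5 x'Ax - b'x, A symmetric positive definite.

    If `steps` is None, the exact (Cauchy) step length is used at each
    iteration; otherwise the prescribed sequence `steps` is cycled through.
    """
    x = np.asarray(x0, dtype=float).copy()
    for k in range(iters):
        g = A @ x - b                       # gradient of the quadratic
        if np.linalg.norm(g) < 1e-12:       # already at the minimizer
            break
        if steps is None:
            t = (g @ g) / (g @ (A @ g))     # exact line search step
        else:
            t = steps[k % len(steps)]       # prescribed step length
        x = x - t * g
    return x

# Small ill-conditioned quadratic: the exact minimizer is A^{-1} b.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x = steepest_descent(A, b, np.zeros(2), iters=100)
```

On a quadratic with a prescribed, finite set of step lengths, each step with length t scales the error component along an eigenvector with eigenvalue λ by (1 − tλ); since these scalar factors commute, the final error does not depend on the order of the steps, which is the observation made in the abstract.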

Talk 2 of the invited session Wed.3.H 2036

**"First-derivative methods in convex optimization"** [...]

Cluster 4

**"Conic programming"** [...]