Invited Session Wed.3.H 2036

Wednesday, 15:15 - 16:45 h, Room: H 2036

Cluster 4: Conic programming

First-derivative methods in convex optimization

 

Chair: Stephen A. Vavasis

 

 

Wednesday, 15:15 - 15:40 h, Room: H 2036, Talk 1

Yoel Drori
Performance of first-order methods for smooth convex minimization: A novel approach

Coauthor: Marc Teboulle

 

Abstract:
We introduce a novel approach for analyzing the performance of first-order black-box optimization methods. Following the seminal work of Nemirovski and Yudin (1983) on the complexity analysis of convex optimization methods, we measure the computational cost based on the oracle model of optimization. Building on this model, our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the Performance Estimation Problem (PEP). We analyze the properties of the resulting PEP for various black-box first-order schemes. This allows us to prove a new tight analytical bound for the classical gradient method, as well as to derive numerical bounds that can be efficiently computed for a broad class of first-order schemes. Moreover, we derive an efficient procedure for finding step sizes that produces a first-order black-box method achieving the best performance.
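
To illustrate the idea in the simplest setting (a sketch in our own notation, not necessarily the authors' exact formulation): for the gradient method with fixed step h/L on an L-smooth convex function, the tight worst-case accuracy after N steps is the optimal value of

\begin{align*}
\max_{f,\, x_0,\dots,x_N,\, x_*}\ \ & f(x_N) - f(x_*) \\
\text{s.t.}\ \ & f \text{ convex and } L\text{-smooth},\quad x_* \in \operatorname*{arg\,min}_x f(x), \\
& \|x_0 - x_*\| \le R,\quad x_{i+1} = x_i - \tfrac{h}{L}\,\nabla f(x_i),\quad i = 0,\dots,N-1,
\end{align*}

where the maximization ranges over admissible functions f as well as the iterates; such a PEP can then be relaxed to a tractable problem over the finitely many function and gradient values returned by the oracle.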

 

 

Wednesday, 15:45 - 16:10 h, Room: H 2036, Talk 2

Clovis C. Gonzaga
On the complexity of steepest descent algorithms for minimizing quadratic functions

 

Abstract:
We discuss the question of how fast a steepest descent algorithm can be when minimizing a convex quadratic function. We do not tackle the general case of convex differentiable functions, which is more difficult. Steepest descent methods differ only in the choice of step length at each iteration. We examine patterns in the distribution of these step lengths when minimizing a convex quadratic function. We show that a large number of short steps is needed, and how these relate to the much smaller number of large steps. We note that the order in which the step lengths are used is irrelevant, and we show a worst-case example with a small number of variables. We also devise a brute-force algorithm which is in a certain sense optimal, and compare it with known algorithms.
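
For concreteness, a minimal sketch of the setting: steepest descent with exact line search on a convex quadratic f(x) = (1/2) x^T A x - b^T x (the matrix, starting point, and tolerance below are illustrative assumptions, not data from the talk). As the abstract notes, steepest descent variants differ only in how the step length is chosen; this sketch uses the exact (Cauchy) step.

import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-8, max_iter=10000):
    # Exact-line-search steepest descent for f(x) = 0.5*x^T A x - b^T x, A SPD.
    x = np.asarray(x0, dtype=float).copy()
    steps = []                            # step lengths, the quantity the abstract studies
    for _ in range(max_iter):
        g = A @ x - b                     # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))   # exact minimizer of f along -g (Cauchy step)
        steps.append(alpha)
        x -= alpha * g
    return x, steps

# Ill-conditioned 2-D example: the recorded step lengths settle into an
# alternating short/long pattern, of the kind the talk analyzes in general.
A = np.diag([1.0, 100.0])
x, steps = steepest_descent_quadratic(A, np.zeros(2), np.array([1.0, 1.0]))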

 

 

Wednesday, 16:15 - 16:40 h, Room: H 2036, Talk 3

Sahar Karimi
CGSO for convex problems with polyhedral constraints

Coauthor: Stephen Vavasis

 

Abstract:
We have proposed CGSO (Conjugate Gradient with Subspace Optimization) as an extension of the Nemirovski-Yudin algorithm. CGSO is a conjugate gradient type algorithm that retains the optimal complexity bound achieved by the Nemirovski-Yudin algorithm for the class of unconstrained convex problems. In this talk, we discuss CGSO for convex problems with polyhedral constraints. We study the theoretical properties as well as the practical performance of CGSO for this class of problems.
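
As background, a generic sketch of the subspace-optimization idea behind CG-type methods, for an unconstrained smooth convex f (this is not the authors' CGSO, which handles polyhedral constraints and uses a carefully chosen subspace to retain the optimal complexity bound):

import numpy as np
from scipy.optimize import minimize

def cg_subspace_method(f, grad, x0, iters=50):
    # CG-type iteration: at every step, minimize f over the 2-D affine
    # subspace x + span{-gradient, previous step}.
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)                  # previous step (zero at the start)
    for _ in range(iters):
        g = grad(x)
        basis = np.column_stack([-g, d_prev])  # spanning directions of the subspace
        obj = lambda c: f(x + basis @ c)       # f restricted to the subspace
        c = minimize(obj, np.zeros(2), method="Nelder-Mead").x
        step = basis @ c
        x = x + step
        d_prev = step
    return x

On a strongly convex quadratic, exact minimization over span{gradient, previous step} reproduces the linear conjugate gradient recursion; the talk's concern is how much of this picture carries over to general convex objectives and polyhedral feasible sets.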

 
