Invited Session Mon.1.H 0110

Monday, 10:30 - 12:00 h, Room: H 0110

Cluster 16: Nonlinear programming [...]

Nonlinear optimization I

 

Chair: Frank E. Curtis and Daniel Robinson

 

 

Monday, 10:30 - 10:55 h, Room: H 0110, Talk 1

Jonathan Eckstein
Alternating direction methods and relative error criteria for augmented Lagrangians

Coauthor: Yao Wang

 

Abstract:
We examine the computational behavior of several variations on the alternating direction method of multipliers (ADMM) for convex optimization, focusing largely on lasso problems, whose structure is well suited to the method. In particular, we computationally compare the classical ADMM to a variant that minimizes the augmented Lagrangian essentially exactly by alternating minimization before each multiplier update, and to approximate versions of this strategy using the recent augmented Lagrangian relative error criterion of Eckstein and Silva.
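For reference, the classical ADMM iteration for the lasso mentioned above has a simple closed form. The sketch below is a minimal illustration, not code from the talk; the problem data, penalty parameter rho, and iteration count are assumptions chosen for the example.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: the prox operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    # Classical ADMM for: min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u: scaled dual
    # Cache the Cholesky factor used by every x-update
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # x-minimization
        z = soft_threshold(x + u, lam / rho)                # z-minimization
        u = u + x - z                                       # multiplier update
    return z

# Small synthetic lasso instance (illustrative data, not from the talk)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -3.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = admm_lasso(A, b, lam=0.5)
```

The z-iterate is exactly sparse because of the soft-thresholding step, which is one reason the lasso structure suits ADMM so well.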

 

 

Monday, 11:00 - 11:25 h, Room: H 0110, Talk 2

Gillian Chin
A family of second order methods for L1 convex optimization

Coauthors: Richard H. Byrd, Jorge Nocedal, Figen Oztoprak

 

Abstract:
We describe and analyze a family of second-order methods for minimizing an objective composed of a smooth convex function and an L1 regularization term. The algorithms in this family are categorized as two-phase methods, differing in the active-manifold identification phase and the second-order subspace step. We show how to endow these algorithms with convergence guarantees, propose a new algorithm, and contrast it with established approaches. We report numerical results on large-scale machine learning applications.
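To make the two-phase structure concrete, here is an illustrative sketch for the least-squares loss: a proximal-gradient step identifies a candidate active manifold, then a Newton step on the free variables (with signs fixed) refines it. This is a generic hypothetical scheme, not the speakers' algorithm; the sign-flip safeguard and all parameters are assumptions.

```python
import numpy as np

def soft_threshold(v, k):
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def two_phase_l1(A, b, lam, outer=100):
    # Illustrative two-phase method for: min 0.5*||Ax - b||^2 + lam*||x||_1
    m, n = A.shape
    x = np.zeros(n)
    H = A.T @ A
    t = 1.0 / np.linalg.norm(H, 2)  # step size 1/L for the smooth part
    for _ in range(outer):
        # Phase 1: proximal-gradient step identifies candidate active set
        g = A.T @ (A @ x - b)
        x = soft_threshold(x - t * g, t * lam)
        free = np.abs(x) > 0
        if free.any():
            # Phase 2: Newton step on the free subspace with signs fixed;
            # the L1 term is then linear, giving the normal equations below
            s = np.sign(x[free])
            Hf = H[np.ix_(free, free)]
            x_f = np.linalg.solve(Hf, A[:, free].T @ b - lam * s)
            # Safeguard: accept the subspace step only if no sign flips
            if np.all(np.sign(x_f) == s):
                x[free] = x_f
    return x

# Small synthetic instance (illustrative data only)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -3.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = two_phase_l1(A, b, lam=0.5)
```

With the signs held fixed on the identified manifold, the L1 term becomes linear, so the subspace problem is a plain linear system; this is the sense in which the second phase is a "second-order subspace step."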

 

 

Monday, 11:30 - 11:55 h, Room: H 0110, Talk 3

Stefan Solntsev
Dynamic batch methods for L1 regularized problems and constrained optimization

Coauthors: Richard H. Byrd, Jorge Nocedal

 

Abstract:
A methodology for using dynamic sample sizes in batch-type optimization methods is proposed. Motivated by machine learning applications, dynamic batching can be applied successfully to smooth convex constrained problems as well as nonsmooth L1-regularized problems. By adjusting the batch size dynamically, the algorithm keeps overall computational costs low, and the batch structure allows it to exploit parallelism.
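One common way to drive a dynamic batch size is a sample-variance test: grow the batch whenever the sampled gradient is too noisy relative to its norm. The sketch below is a generic illustration of that idea, not the speakers' method; the doubling rule, the threshold theta, and the test problem are all assumptions.

```python
import numpy as np

def dynamic_batch_gd(grad_i, n_data, x0, theta=0.5, alpha=0.1, b0=8,
                     iters=300, seed=0):
    # grad_i(x, i) returns the gradient of the i-th sample's loss at x.
    # The batch size b grows when the per-sample gradient variance is
    # large relative to ||g||^2 (a variance/"norm" test), so early
    # iterations are cheap and later iterations are accurate.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    b = b0
    for _ in range(iters):
        idx = rng.choice(n_data, size=min(b, n_data), replace=False)
        grads = np.array([grad_i(x, i) for i in idx])
        g = grads.mean(axis=0)
        # Estimated variance of the batch-mean gradient
        var = grads.var(axis=0).sum() / len(idx)
        if var > (theta * np.linalg.norm(g)) ** 2:
            b = min(2 * b, n_data)  # batch too noisy: double the sample size
        x -= alpha * g
    return x

# Illustrative least-squares problem with per-sample gradients
rng = np.random.default_rng(1)
n = 200
Adata = rng.standard_normal((n, 5))
x_star = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = Adata @ x_star

def grad_i(x, i):
    return Adata[i] * (Adata[i] @ x - y[i])

x_hat = dynamic_batch_gd(grad_i, n, np.zeros(5))
```

Because each batch gradient is an average of independent per-sample gradients, the averaging inside a batch is embarrassingly parallel, which is the parallelism noted in the abstract.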

 
