Invited Session Tue.3.H 2036

Tuesday, 15:15 - 16:45 h, Room: H 2036

Cluster 4: Conic programming [...]

First-derivative and interior methods in convex optimization

 

Chair: Stephen A. Vavasis

 

 

Tuesday, 15:15 - 15:40 h, Room: H 2036, Talk 1

Miguel F. Anjos
Convergence and polynomiality of a primal-dual interior-point algorithm for linear programming with selective addition of inequalities

Coauthor: Alexander Engau

 

Abstract:
We present the convergence proof and complexity analysis for an interior-point framework that solves linear programming problems by dynamically selecting and adding inequalities. First, we formulate a new primal-dual interior-point algorithm for solving linear programs in nonstandard form with equality and inequality constraints. The algorithm uses a primal-dual path-following predictor-corrector short-step interior-point method that starts with a reduced problem without any inequalities and selectively adds a given inequality only if it becomes active on the way to optimality. Second, we prove convergence of this algorithm to an optimal solution at which all inequalities are satisfied regardless of whether they have been added by the algorithm or not. We thus provide a theoretical foundation for similar schemes already used in practice. We also establish conditions under which the complexity of the algorithm is polynomial in the problem dimension.
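
The selective-addition scheme described above can be illustrated, in highly simplified form, by an outer constraint-generation loop around a black-box interior-point LP solver. The sketch below is not the authors' predictor-corrector method; the solver choice, tolerance, and "most violated inequality" rule are illustrative assumptions.

```python
# Minimal sketch of the selective-addition idea for an LP
#   min c^T x  s.t.  A_eq x = b_eq,  G x <= h,
# where inequalities from (G, h) are added only on demand.
# This is NOT the authors' predictor-corrector method: it is an outer
# constraint-generation loop around scipy's interior-point LP solver,
# and the tolerance and "most violated" rule are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

def solve_with_selective_inequalities(c, A_eq, b_eq, G, h,
                                      tol=1e-8, max_rounds=100):
    active = []                        # indices of inequalities added so far
    bounds = [(None, None)] * len(c)   # free variables; inequalities live in G
    for _ in range(max_rounds):
        res = linprog(c,
                      A_ub=G[active] if active else None,
                      b_ub=h[active] if active else None,
                      A_eq=A_eq, b_eq=b_eq, bounds=bounds,
                      method="highs-ipm")
        if not res.success:
            raise RuntimeError(res.message)
        x = res.x
        violation = G @ x - h          # positive entries are violated
        worst = int(np.argmax(violation))
        if violation[worst] <= tol:    # every inequality holds: done
            return x, active
        active.append(worst)           # add the offending inequality, re-solve
    raise RuntimeError("too many rounds")
```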

 

 

Tuesday, 15:45 - 16:10 h, Room: H 2036, Talk 2

Olivier Devolder
Intermediate gradient methods for smooth convex optimization problems with inexact oracle

Coauthors: François Glineur, Yurii Nesterov

 

Abstract:
We develop new intermediate gradient methods for smooth convex optimization problems, positioned between the slow but robust gradient method and the fast but error-sensitive fast gradient method.
We show, both theoretically and in numerical experiments, that these new intermediate first-order methods can accelerate the minimization of a smooth convex function when only inexact first-order information is available.
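
As a rough illustration of the trade-off the talk addresses, the sketch below runs the two endpoint methods (the gradient method and a fast gradient method) on a smooth convex quadratic with a noisy first-order oracle; the intermediate methods of the talk interpolate between these two behaviours. The function, noise level, and step sizes are illustrative assumptions, not the paper's setup.

```python
# Sketch of the two endpoints between which intermediate gradient methods
# interpolate, run with an inexact (noisy) first-order oracle.
# f(x) = 0.5 x^T Q x is an illustrative smooth convex quadratic; the noise
# level delta and the step size 1/L are assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
n, delta, L = 50, 1e-3, 10.0
Q = np.diag(np.linspace(0.1, L, n))    # Hessian spectrum in (0, L]

def inexact_grad(x):                   # oracle: true gradient + bounded noise
    return Q @ x + delta * rng.standard_normal(n)

def gradient_method(x0, iters):        # slow but robust: error stays O(delta)
    x = x0.copy()
    for _ in range(iters):
        x -= inexact_grad(x) / L
    return x

def fast_gradient_method(x0, iters):   # fast but accumulates oracle error
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - inexact_grad(y) / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

x0 = np.ones(n)
for k in (100, 1000):                  # compare f at the final iterates
    xg, xf = gradient_method(x0, k), fast_gradient_method(x0, k)
    print(k, 0.5 * xg @ Q @ xg, 0.5 * xf @ Q @ xf)
```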

 

 

Tuesday, 16:15 - 16:40 h, Room: H 2036, Talk 3

Jose Herskovits
A feasible direction interior point algorithm for nonlinear convex semidefinite programming

Coauthors: Miguel Aroztegui, Jean R. Roche

 

Abstract:
The present method employs basic ideas of FDIPA [1], the Feasible Direction Interior Point Algorithm for nonlinear optimization. It generates a descent sequence of points in the interior of the feasible set defined by the semidefinite constraints. The algorithm performs Newton-like iterations to solve the first-order Karush-Kuhn-Tucker optimality conditions. At each iteration, two linear systems with the same coefficient matrix must be solved. The first one generates a descent direction. In the second linear system, a precisely defined perturbation of the left hand side is introduced and, as a consequence, a feasible descent direction is obtained. An inexact line search is then performed to ensure that the new iterate is interior and the objective value is lower. A proof of global convergence is presented, and some numerical results are described. We also present results for structural topology optimization problems employing a mathematical model based on semidefinite programming. The results suggest that the proposed method is efficient and highly robust.



  1. Herskovits, J. A feasible directions interior point technique for nonlinear optimization. JOTA, 99(1), 121-146, 1998.
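
For readers unfamiliar with FDIPA, the sketch below shows the two-system structure of one iteration in the nonlinear-programming setting of [1] (not the semidefinite variant of this talk): both linear systems share the same coefficient matrix, and the second solve deflects the Newton direction into a feasible descent direction. Taking B as the identity and the constants xi and phi below are illustrative assumptions.

```python
# Sketch of one FDIPA-style direction for min f(x) s.t. g(x) <= 0 (the
# nonlinear-programming setting of [1], not the semidefinite variant of
# this talk). Both linear systems share the same coefficient matrix; the
# second solve deflects the Newton direction into a feasible descent
# direction. B = I and the constants xi, phi are illustrative assumptions.
import numpy as np

def fdipa_direction(x, lam, f_grad, g, g_jac, xi=0.7, phi=1.0):
    gx, J = g(x), g_jac(x)             # requires g(x) < 0 (strict interior)
    m, n = J.shape
    B = np.eye(n)                      # quasi-Newton Hessian approximation
    # Newton-like KKT matrix, shared by both linear systems:
    M = np.block([[B,                 J.T],
                  [np.diag(lam) @ J,  np.diag(gx)]])
    grad = f_grad(x)
    sol_a = np.linalg.solve(M, np.concatenate([-grad, np.zeros(m)]))
    sol_b = np.linalg.solve(M, np.concatenate([np.zeros(n), -lam]))
    da, db = sol_a[:n], sol_b[:n]      # descent direction and deflection
    rho = phi * (da @ da)              # deflection size, capped so that
    if grad @ db > 0:                  # d stays a descent direction:
        rho = min(rho, (xi - 1.0) * (grad @ da) / (grad @ db))
    return da + rho * db               # feasible descent direction
```

An inexact (Armijo-type) line search along this direction, keeping g(x + t d) < 0 while decreasing f, would complete the iteration; the multiplier update of [1] is omitted here for brevity.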


 
