Invited Session Mon.1.H 2036

Monday, 10:30 - 12:00 h, Room: H 2036

Cluster 4: Conic programming [...]

Geometry and duality in convex programming

 

Chair: Gabor Pataki

 

 

Monday, 10:30 - 10:55 h, Room: H 2036, Talk 1

Hayato Waki
Computation of facial reduction algorithm

Coauthor: Masakazu Muramatsu

 

Abstract:
The facial reduction algorithm (FRA) produces a smaller conic optimization problem by exploiting degeneracy in a given conic optimization problem. However, its computational cost is comparable to that of solving the original problem, and the resulting problem may lose sparsity. In this talk, we show that FRA can be applied effectively by exploiting structure in the original problem. We present examples, such as SDP relaxations for polynomial optimization and graph partitioning. In particular, we prove that for SDP relaxations of polynomial optimization problems, some existing methods for reducing the size of the relaxation are partial applications of FRA. This is joint work with Professor Masakazu Muramatsu (UEC).
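One facial reduction step can be illustrated on a toy semidefinite system. The instance and all names below are our own illustrative choices, not taken from the talk: a PSD constraint matrix A1 with <A1, X> = 0 certifies that every feasible X lies on a proper face, so the variable can be restricted to the null space of A1 and the problem shrinks.

```python
import numpy as np

# Toy facial reduction step (illustrative example; not from the talk).
# Feasibility system: find X PSD with <A1, X> = 0, where A1 = e1 e1^T.
# Since A1 is PSD and <A1, X> = X_11 = 0, any feasible PSD X must have
# its first row and column zero, i.e. X lies on the face {X PSD : X e1 = 0}.

n = 3
A1 = np.zeros((n, n))
A1[0, 0] = 1.0  # A1 = e1 e1^T, a PSD "reducing certificate"

# Eigenvectors of A1 with zero eigenvalue span the reduced subspace.
eigvals, eigvecs = np.linalg.eigh(A1)
V = eigvecs[:, eigvals < 1e-9]   # basis of the null space of A1 (n x (n-1))

# Every constraint matrix A of the original SDP maps to V^T A V in the
# smaller problem, so the matrix variable shrinks from 3x3 to 2x2.
A_example = np.arange(9.0).reshape(3, 3)
A_reduced = V.T @ A_example @ V
print(A_reduced.shape)  # (2, 2)
```

As the abstract notes, finding such a certificate in general is itself a conic problem of comparable size; the talk's point is that problem structure can make this step cheap.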

 

 

Monday, 11:00 - 11:25 h, Room: H 2036, Talk 2

Vera Roshchina
Partition and complementarity in multifold conic systems

Coauthor: Javier Peña

 

Abstract:
We consider a homogeneous multifold convex conic system and its dual, and show that there is a canonical partition of the multifold structure determined by certain complementarity sets associated with the most interior solutions to the two systems. Our results are inspired by, and extend, the Goldman-Tucker theorem for linear programming.
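The linear programming special case can be checked by hand. The tiny LP below is our own worked example, not from the talk: the Goldman-Tucker theorem guarantees optimal primal and dual solutions that are strictly complementary, inducing a partition of the index set.

```python
import numpy as np

# Goldman-Tucker for LP: for min c^T x s.t. Ax = b, x >= 0, there exist
# optimal primal x* and dual slack s* = c - A^T y* with x*_i s*_i = 0 and
# x*_i + s*_i > 0 for every i, so B = {i : x*_i > 0} and N = {i : s*_i > 0}
# partition the indices. Hand-worked instance (ours, not from the talk):
#   min x1 + 2 x2   s.t.   x1 + x2 = 1,   x >= 0.
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
x_star = np.array([1.0, 0.0])    # optimal primal solution
y_star = np.array([1.0])         # optimal dual multiplier
s_star = c - A.T @ y_star        # dual slack = (0, 1)

B = {i for i in range(2) if x_star[i] > 0}   # indices active in the primal
N = {i for i in range(2) if s_star[i] > 0}   # indices active in the dual
print(B, N)  # {0} {1}: disjoint, and together they cover all indices
```

The talk generalizes this strict-complementarity partition from the nonnegative orthant to multifold conic systems.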

 

 

Monday, 11:30 - 11:55 h, Room: H 2036, Talk 3

Osman Guler
Efficient first-order methods for convex programming

 

Abstract:
First-order methods for convex programming use only gradient (or subgradient) information. They are especially useful for large-scale problems, since each iteration is cheap, memory requirements are low, and the convergence rates do not depend on the dimension of the problem. After the pioneering work of Polyak, and later of Yudin and Nemirovskii, and especially after Nesterov's optimal first-order method, which emulates the conjugate gradient method, there has been much recent interest in such methods. These algorithms extend to constrained optimization and minimax problems, and have connections with proximal-point methods. However, certain aspects of the algorithms remain somewhat mysterious and not well understood. In this talk, we explore the theoretical underpinnings of these methods and find new applications for them.
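The flavor of these methods can be sketched on a toy quadratic. The sketch below uses the standard constant-momentum variant of Nesterov's accelerated gradient method for strongly convex functions; the instance and parameter choices are ours, not taken from the talk.

```python
import numpy as np

# Hedged sketch of Nesterov's accelerated gradient method (constant-momentum
# variant for strongly convex objectives) on the toy quadratic
# f(x) = 0.5 x^T Q x - b^T x. Each iteration needs only one gradient.
Q = np.diag([10.0, 1.0])
b = np.array([1.0, 1.0])
L, mu = 10.0, 1.0                    # largest / smallest eigenvalues of Q
kappa = L / mu                       # condition number
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient

def grad(x):
    return Q @ x - b

x = y = np.zeros(2)
for _ in range(200):
    x_new = y - grad(y) / L          # gradient step at the extrapolated point
    y = x_new + beta * (x_new - x)   # momentum extrapolation
    x = x_new

x_opt = np.linalg.solve(Q, b)        # exact minimizer, for comparison only
err = np.linalg.norm(x - x_opt)
```

Note that the per-iteration cost is one matrix-vector product and a few vector operations, and the guaranteed rate depends on the condition number, not on the dimension, which is what makes these methods attractive at large scale.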

 
