Invited Session Mon.3.H 2038

Monday, 15:15 - 16:45 h, Room: H 2038

Cluster 4: Conic programming [...]

Matrix optimization


Chair: Defeng Sun



Monday, 15:15 - 15:40 h, Room: H 2038, Talk 1

Houduo Qi
Computing the nearest Euclidean distance matrix


The nearest Euclidean distance matrix problem (NEDM) is a fundamental computational problem in applications such as multidimensional scaling and molecular conformation from nuclear magnetic resonance data in computational chemistry. Especially in the latter application, the problem is often of large scale, with the number of atoms ranging from a few hundred to a few thousand. In this paper, we introduce a semismooth Newton method that solves the dual problem of NEDM. We prove that the method is quadratically convergent. We then present an application of the Newton method to NEDM with H-weights via majorization and an accelerated proximal gradient scheme. We demonstrate the superior performance of the Newton method over existing methods, including the latest quadratic semidefinite programming solver. This research also opens a new avenue towards efficient solution methods for the molecular embedding problem.
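To fix ideas, the feasibility structure behind NEDM can be sketched from the classical characterization: D is a Euclidean distance matrix iff D is symmetric, hollow (zero diagonal), and B = -0.5 J D J is positive semidefinite, where J = I - ee^T/n is the centering matrix. The following is a minimal one-pass repair heuristic based on that characterization, not the semismooth Newton method of the talk; the function name is illustrative only.

```python
import numpy as np

def nearest_edm_heuristic(D):
    """Cheap one-pass repair towards a Euclidean distance matrix.

    Uses the characterization: D is an EDM iff D is symmetric, hollow,
    and B = -0.5 * J D J is positive semidefinite (J the centering
    matrix).  This is NOT the semismooth Newton method of the talk,
    only an illustrative feasibility repair.
    """
    n = D.shape[0]
    D = (D + D.T) / 2.0                      # symmetrize
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ D @ J                     # would-be Gram matrix
    w, V = np.linalg.eigh(B)
    B_psd = (V * np.maximum(w, 0.0)) @ V.T   # project onto the PSD cone
    g = np.diag(B_psd)
    # rebuild squared distances from the repaired Gram matrix
    D_edm = g[:, None] + g[None, :] - 2.0 * B_psd
    np.fill_diagonal(D_edm, 0.0)
    return D_edm
```

The output is a genuine EDM (it is the distance matrix of the point configuration recovered from the projected Gram matrix), though generally not the nearest one in Frobenius norm; the talk's Newton method solves that optimization exactly via the dual.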



Monday, 15:45 - 16:10 h, Room: H 2038, Talk 2

Bin Wu
The Moreau-Yosida regularization of the Ky Fan k-norm related functions

Coauthors: Chao Ding, Defeng Sun, Kim-Chuan Toh


Matrix optimization problems (MOPs) involving the Ky Fan k-norm arise frequently in diverse fields such as matrix norm approximation and graph theory. In order to apply proximal point algorithms to solve large scale MOPs involving the Ky Fan k-norm, we need to understand the first and second order properties of the Moreau-Yosida regularization of the Ky Fan k-norm function and the indicator function of its epigraph. As an initial step, we first study the counterparts of the vector k-norm related functions, including the metric projectors over the dual vector k-norm ball and the vector k-norm epigraph, and their directional derivatives and Fréchet differentiability. We then use these results to study the corresponding properties of the Moreau-Yosida regularization of the Ky Fan k-norm epigraph indicator function.
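The central object here, the Moreau-Yosida regularization e_f(x) = min_y { f(y) + ||x - y||^2 / (2λ) }, is attained at the proximal mapping of f. A minimal numerical illustration of this definition, using the absolute value function (whose prox is soft-thresholding and whose envelope is the Huber function) rather than the Ky Fan k-norm itself:

```python
import numpy as np

def prox_abs(x, lam):
    # proximal mapping of f(y) = |y|: soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # e_f(x) = min_y |y| + (x - y)^2 / (2 lam), attained at prox_abs(x, lam)
    p = prox_abs(x, lam)
    return np.abs(p) + (x - p) ** 2 / (2.0 * lam)

def huber(x, lam):
    # known closed form of the envelope: quadratic near 0, linear in the tails
    return np.where(np.abs(x) <= lam,
                    x ** 2 / (2.0 * lam),
                    np.abs(x) - lam / 2.0)
```

The envelope is smooth even though |·| is not, which is exactly the property the talk exploits; for the Ky Fan k-norm the prox has no such elementary closed form, hence the study of projectors onto the dual-norm ball and epigraph.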



Monday, 16:15 - 16:40 h, Room: H 2038, Talk 3

Renato D. Monteiro
An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods

Coauthor: Benar F. Svaiter


We present an accelerated variant of the hybrid proximal
extragradient (HPE) method for convex optimization,
referred to as the A-HPE method.
Iteration-complexity results are established for
the A-HPE method, as well as a special version of it,
where a large stepsize condition is imposed.
Two specific implementations of the A-HPE method are described in the context
of a structured convex optimization problem whose objective function consists
of the sum of a smooth convex function and an extended real-valued non-smooth
convex function. In the first implementation, a generalization of a variant of
Nesterov's method is obtained for the case where the smooth component
of the objective function has Lipschitz continuous gradient.
In the second one, an accelerated Newton proximal extragradient
(A-NPE) method is obtained for the case where the smooth component of
the objective function has Lipschitz continuous Hessian. It is shown that
the A-NPE method has an O(1/k^{7/2}) convergence rate,
which improves upon the O(1/k^3)
convergence rate bound for another accelerated Newton-type
method presented by Nesterov.
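The first implementation mentioned above generalizes a variant of Nesterov's method for composite objectives f(x) = g(x) + h(x) with g smooth and h non-smooth convex. A minimal FISTA-style sketch of that accelerated proximal gradient scheme, using an ℓ1 term as the non-smooth component (this is a standard stand-in, not the A-HPE/A-NPE method itself):

```python
import numpy as np

def accel_prox_grad(A, b, lam, iters=500):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by accelerated
    proximal gradient (Nesterov/FISTA-style momentum)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        g = A.T @ (A @ y - b)                # gradient of the smooth part at y
        z = y - g / L                        # forward (gradient) step
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox of lam/L * ||.||_1
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)              # momentum extrapolation
        x, t = x_new, t_new
    return x
```

This scheme attains the O(1/k^2) rate typical of first-order acceleration; the A-NPE method of the talk replaces the gradient step with a Newton proximal extragradient step to reach the faster O(1/k^{7/2}) rate when the Hessian is Lipschitz.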

