## Contributed Session Mon.2.H 2036

#### Monday, 13:15 - 14:45 h, Room: H 2036

**Cluster 4: Conic programming** [...]

### Algorithms for matrix optimization problems

**Chair: Yu Xia**

**Monday, 13:15 - 13:40 h, Room: H 2036, Talk 1**

**Qingna Li**

Sequential semismooth Newton method for nearest low-rank correlation matrix problem

**Coauthor: Houduo Qi**

**Abstract:**

Rank-constrained matrix optimization problems have received great interest in recent years due to their applications in various fields. One such problem is the nearest low-rank correlation matrix problem, which arises in finance. In this talk, we propose a sequential semismooth Newton method to solve it. We analyze the connections between the proposed method and some other methods, and discuss potential improvements.
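To illustrate the problem the talk addresses (not the semismooth Newton method itself), the following sketch approximates the nearest low-rank correlation matrix by naive alternating projections between the set of PSD matrices of rank at most `r` and the set of matrices with unit diagonal. This simple heuristic is an assumption for illustration only; it need not converge to the true nearest matrix.

```python
import numpy as np

def project_rank_psd(A, r):
    """Project a symmetric matrix onto PSD matrices of rank at most r
    by keeping only the r largest nonnegative eigenvalues."""
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)          # drop negative eigenvalues
    idx = np.argsort(w)[::-1][:r]      # keep the r largest
    return (V[:, idx] * w[idx]) @ V[:, idx].T

def nearest_lowrank_corr(G, r, iters=200):
    """Heuristic alternating projections for the nearest low-rank
    correlation matrix problem (illustration only, not the talk's method)."""
    X = (G + G.T) / 2                  # symmetrize the input
    for _ in range(iters):
        X = project_rank_psd(X, r)     # enforce rank <= r and PSD
        np.fill_diagonal(X, 1.0)       # enforce unit diagonal
    return X
```

The result has unit diagonal by construction; the rank and PSD constraints are only satisfied approximately, which is precisely the difficulty that motivates dedicated methods such as the one proposed in the talk.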

**Monday, 13:45 - 14:10 h, Room: H 2036, Talk 2**

**Chengjing Wang**

On how to solve large scale matrix log-determinant optimization problems

**Coauthors: Defeng Sun, Kim-Chuan Toh**

**Abstract:**

We propose a Newton-CG primal proximal point algorithm (PPA) and a Newton-CG primal augmented Lagrangian method (ALM) for solving large scale nonlinear semidefinite programming problems whose objective functions are a sum of a log-determinant term with a linear function and a sum of a log-determinant term with a convex quadratic function, respectively. Our algorithms employ the essential ideas of the PPA, the ALM, the Newton method, and the preconditioned conjugate gradient (CG) solver. We demonstrate that our algorithms perform favorably compared to existing state-of-the-art algorithms and are much preferred when a high quality solution is required for problems with many equality constraints.
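As a small worked example of the log-determinant structure in the objective (omitting the equality constraints and the quadratic variant treated in the talk), consider the model problem min over positive definite X of tr(CX) - log det X. Its optimality condition C - X^{-1} = 0 gives the closed-form solution X* = C^{-1}, which we can verify numerically:

```python
import numpy as np

# Model problem: min_X  tr(C X) - log det X  over positive definite X.
# Optimality condition: C - X^{-1} = 0, so X* = C^{-1}.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = A @ A.T + 5 * np.eye(5)        # a positive definite data matrix

X_star = np.linalg.inv(C)          # closed-form minimizer
grad = C - np.linalg.inv(X_star)   # gradient of the objective at X*
print(np.linalg.norm(grad))        # essentially zero: X* is stationary
```

With linear equality constraints added (as in the talk), no such closed form exists, which is what motivates the proposed Newton-CG proximal point and augmented Lagrangian algorithms.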

**Monday, 14:15 - 14:40 h, Room: H 2036, Talk 3**

**Yu Xia**

Gradient methods for a general least squares problem

**Abstract:**

We consider a constrained least squares problem over cones. We show how to adapt Nesterov's fast gradient methods to the problem efficiently. Numerical examples will be provided.
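A minimal sketch of the idea, assuming the simplest cone (the nonnegative orthant): Nesterov-style accelerated projected gradient for least squares with x >= 0, where the projection onto the cone is just componentwise clipping. The step size uses the Lipschitz constant L of the gradient; this is an illustration of the general scheme, not the talk's specific adaptation.

```python
import numpy as np

def fast_gradient_nnls(A, b, iters=500):
    """Accelerated projected gradient (Nesterov/FISTA-style) for
        min 0.5 * ||A x - b||^2   s.t.  x >= 0,
    i.e. least squares over the nonnegative orthant cone."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)           # gradient at the extrapolated point
        x_new = np.clip(y - grad / L, 0.0, None)   # projected gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2   # momentum parameter update
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For a general cone, the clipping step is replaced by the projection onto that cone, which is where problem-specific efficiency considerations enter.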