Tuesday, 15:15 - 15:40 h, Room: H 2038


Bamdev Mishra
Fixed-rank matrix factorizations and the design of invariant optimization algorithms

Coauthor: Rodolphe Sepulchre


Optimizing over low-rank matrices is a fundamental problem arising in many modern machine learning applications. One way of handling the rank constraint is to fix the rank a priori, resulting in a fixed-rank factorization model. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian geometry of the search space to design gradient-descent and trust-region algorithms.
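A minimal illustration (my own, not from the talk) of why the geometry of fixed-rank factorizations is a quotient geometry: the factorization X = G Hᵀ with G of size m×r and H of size n×r is not unique, since any invertible r×r matrix M yields the same matrix via (G M, H M⁻ᵀ).

```python
import numpy as np

# Hypothetical sketch: non-uniqueness of the fixed-rank factorization X = G H^T.
# Any invertible M in GL(r) maps (G, H) to (G M, H M^{-T}) without changing
# the product -- the equivalence classes under this action form the quotient
# search space underlying the Riemannian geometries mentioned in the abstract.
rng = np.random.default_rng(0)
m, n, r = 6, 4, 2
G = rng.standard_normal((m, r))
H = rng.standard_normal((n, r))

M = rng.standard_normal((r, r)) + 3 * np.eye(r)  # generically invertible
G2, H2 = G @ M, H @ np.linalg.inv(M).T

same = np.allclose(G2 @ H2.T, G @ H.T)  # (G M)(H M^{-T})^T = G H^T
```

Any algorithm defined directly on (G, H) should produce iterates consistent with this equivalence, which is what the quotient Riemannian framework formalizes.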
We focus on the invariance properties of certain metrics. Specifically, we seek to develop algorithms that can be made invariant to linear transformations of the data space. We show that different Riemannian geometries lead to different invariance properties, and we provide numerical evidence of the effect of these invariance properties on algorithm performance.
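As a hedged sketch (assumed details, not the authors' exact algorithms): one well-known way to obtain an invariant metric on the factorization (G, H) is to precondition the gradients of the fit error by the Gram matrices (HᵀH)⁻¹ and (GᵀG)⁻¹, which makes the updates equivariant under the ambiguity (G, H) → (G M, H M⁻ᵀ).

```python
import numpy as np

# Hypothetical sketch of scaled-metric gradient descent for
# min_{G, H} ||X - G H^T||_F^2 with fixed rank r. The (H^T H)^{-1} and
# (G^T G)^{-1} scalings are one example of the metric choices whose
# invariance properties the abstract discusses; step size is illustrative.
rng = np.random.default_rng(1)
m, n, r = 30, 20, 3
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # exactly rank r

G = rng.standard_normal((m, r))
H = rng.standard_normal((n, r))
step = 0.5
for _ in range(300):
    R = G @ H.T - X  # residual
    # Gradient steps preconditioned by the Gram matrices (scaled metric)
    G_new = G - step * R @ H @ np.linalg.inv(H.T @ H)
    H_new = H - step * R.T @ G @ np.linalg.inv(G.T @ G)
    G, H = G_new, H_new

rel_err = np.linalg.norm(G @ H.T - X) / np.linalg.norm(X)
```

With step size 1 this update reduces to a simultaneous alternating-least-squares step; the damping above is a simple stabilization, not a statement about the talk's specific algorithms.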
We make connections with existing algorithms and discuss the relative usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with the state of the art and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.


Talk 1 of the invited session Tue.3.H 2038
"Conic and convex programming in statistics and signal processing I" [...]
Cluster 4
"Conic programming" [...]

