Invited Session Tue.3.H 2038

Tuesday, 15:15 - 16:45 h, Room: H 2038

Cluster 4: Conic programming [...]

Conic and convex programming in statistics and signal processing I

 

Chair: Parikshit Shah

 

 

Tuesday, 15:15 - 15:40 h, Room: H 2038, Talk 1

Bamdev Mishra
Fixed-rank matrix factorizations and the design of invariant optimization algorithms

Coauthor: Rodolphe Sepulchre

 

Abstract:
Optimizing over low-rank matrices is a fundamental problem arising in many modern machine learning applications. One way of handling the rank constraint is to fix the rank a priori, resulting in a fixed-rank factorization model. We study the underlying geometries of several well-known fixed-rank matrix factorizations and exploit the Riemannian geometry of the search space in the design of gradient-descent and trust-region algorithms.
We focus on the invariance properties of certain metrics. Specifically, we seek algorithms that are invariant to linear transformations of the data space. We show that different Riemannian geometries lead to different invariance properties, and we provide numerical evidence of the effect of these properties on algorithm performance.
We make connections with existing algorithms and discuss the relative usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms are competitive with the state of the art and that manifold optimization offers an effective and versatile framework for designing machine learning algorithms that learn a fixed-rank matrix.
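For readers who want a concrete picture of the fixed-rank setting, the following minimal sketch (ours, not the speakers' algorithm) runs plain Euclidean gradient descent on a matrix-completion objective and retracts to the rank-r manifold after each step via a truncated SVD. The talk's contribution concerns quotient geometries of factorization models and invariant metrics, which this toy deliberately omits; all function names and parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def retract(X, r):
    """Retract onto the set of rank-r matrices via truncated SVD
    (one common retraction choice)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def fixed_rank_completion(A_obs, mask, r, lr=1.0, steps=300):
    """Gradient step plus retraction for
    min_X 0.5 * ||mask * (X - A_obs)||_F^2  s.t.  rank(X) = r."""
    X = retract(A_obs, r)                 # rank-r starting point
    for _ in range(steps):
        grad = mask * (X - A_obs)         # Euclidean gradient
        X = retract(X - lr * grad, r)     # step, then retract
    return X

# Toy data: a rank-2 matrix with roughly 60% of entries observed.
m, n, r = 50, 40, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6
X = fixed_rank_completion(A * mask, mask, r)
print("relative error:", np.linalg.norm(X - A) / np.linalg.norm(A))
```

An invariant metric, in the sense of the talk, would make the iterates insensitive to linear transformations of the data space; the Euclidean metric used in this sketch does not have that property.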

 

 

Tuesday, 15:45 - 16:10 h, Room: H 2038, Talk 2

Martin Skovgaard Andersen
Multifrontal barrier computations for sparse matrix cones

Coauthor: Lieven Vandenberghe

 

Abstract:
We discuss conic optimization problems involving two types of convex matrix cones: the cone of positive semidefinite matrices with a given chordal sparsity pattern, and its dual cone, the cone of matrices with the same sparsity pattern that have a positive semidefinite completion.
We describe efficient algorithms for evaluating the values, gradients, and Hessians of the logarithmic barrier functions for the two types of cones. The algorithms are based on techniques used in multifrontal and supernodal sparse Cholesky factorization methods.
The results will be illustrated with applications in covariance selection and semidefinite programming.
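For intuition on the barrier computations: on the full positive definite cone, the value of the logarithmic barrier phi(X) = -log det X and its gradient -X^{-1} can be read off a Cholesky factorization, since det X equals the product of the squared diagonal entries of L when X = LL^T. The dense sketch below (our illustration, not the speakers' code) computes both; the talk's multifrontal algorithms carry out the analogous computations on chordal sparsity patterns, where the Cholesky factor has zero fill under a perfect elimination ordering and no dense inverse is ever formed.

```python
import numpy as np

def logdet_barrier(X):
    """Value and gradient of phi(X) = -log det X on the PD cone,
    read off a Cholesky factorization X = L L^T."""
    L = np.linalg.cholesky(X)                  # raises if X is not PD
    value = -2.0 * np.sum(np.log(np.diag(L)))  # -log det X
    grad = -np.linalg.inv(X)                   # gradient of -log det
    return value, grad

# A tridiagonal (hence chordal) sparsity pattern.
n = 5
X = (2.0 * np.eye(n)
     + 0.5 * np.eye(n, k=1)
     + 0.5 * np.eye(n, k=-1))
value, grad = logdet_barrier(X)
print("barrier value:", value)
```

For the sparse cones in the talk, only the entries of X^{-1} on the sparsity pattern are needed for the gradient, and these can be computed by multifrontal techniques without forming the full inverse.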

 

 

Tuesday, 16:15 - 16:40 h, Room: H 2038, Talk 3

Venkat Chandrasekaran
Computational and sample tradeoffs via convex relaxation

Coauthor: Michael Jordan

 

Abstract:
In modern data analysis, one is frequently faced with statistical inference problems involving massive datasets. In this talk we discuss a computational framework, based on convex relaxation, for reducing the computational complexity of an inference procedure as one gains access to increasingly large datasets. Essentially, the statistical gains from larger datasets can be exploited to reduce the runtime of inference algorithms.
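As a toy illustration of the sample-computation tradeoff (ours, with assumed names and data, not the speakers' framework): estimate a sparse signal from n noisy observations either by the plain sample mean, which costs almost nothing, or by additionally projecting onto an l1 ball, a tighter convex relaxation that costs a sort per estimate. The cheaper estimator needs more samples to reach the same error, so extra data can be spent to save computation.

```python
import numpy as np

rng = np.random.default_rng(1)

def project_l1(v, tau):
    """Euclidean projection onto the l1 ball of radius tau
    (standard sort-based algorithm)."""
    if np.abs(v).sum() <= tau:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    j = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - tau) / j > 0)[0][-1]
    theta = (css[rho] - tau) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

p, k = 200, 5
x_star = np.zeros(p)
x_star[:k] = 1.0                       # sparse signal, ||x*||_1 = k

for n in (50, 200, 800):
    samples = x_star + rng.standard_normal((n, p))
    y = samples.mean(axis=0)           # cheap estimator: sample mean
    err_mean = np.linalg.norm(y - x_star)
    err_l1 = np.linalg.norm(project_l1(y, k) - x_star)
    print(f"n={n:4d}  mean-only error={err_mean:.3f}  "
          f"l1-projected error={err_l1:.3f}")
```

In this toy both estimators are inexpensive; in the talk's framework the relaxations range over hierarchies in which tighter constraint sets genuinely cost more computation.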

 
