Invited Session Fri.2.H 1028

Friday, 13:15 - 14:45 h, Room: H 1028

Cluster 21: Sparse optimization & compressed sensing

Structured matrix optimization

 

Chair: Inderjit Dhillon

 

 

Friday, 13:15 - 13:40 h, Room: H 1028, Talk 1

Ewout van den Berg
Phase retrieval using explicit low-rank matrix factorization

Coauthor: Emmanuel Candès

 

Abstract:
Recently, Candès et al. proposed a novel methodology for phase retrieval from magnitude information by formulating it as a matrix-completion problem. In this work we develop an algorithm aimed at solving large-scale instances of this problem. We take advantage of the fact that the desired solution is of rank one and use low-rank matrix factorization techniques to attain considerable speed-ups over existing approaches. We consider phase recovery in both the noisy and noiseless settings and study how various design choices affect the performance and reliability of the algorithm.
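
A minimal sketch (Python/NumPy) of the rank-one factorization idea behind this talk, assuming Gaussian measurements b_i = |<a_i, x>|^2: rather than solving the lifted matrix-completion problem over X = u u*, it runs gradient descent directly on the factored least-squares objective. The spectral initialization and the halving step-size rule are illustrative choices, not the authors' algorithm.

import numpy as np

def phase_retrieval_rank1(A, b, iters=500):
    """Recover x (up to a global phase) from b = |A x|^2 by descending
    f(u) = (1/m) * sum_i (|a_i^* u|^2 - b_i)^2, i.e., the rank-one
    parametrization X = u u^* of the lifted problem."""
    m, n = A.shape
    # Spectral initialization: leading eigenvector of (1/m) A^H diag(b) A
    Y = (A.conj().T * b) @ A / m
    _, V = np.linalg.eigh(Y)
    u = V[:, -1] * np.sqrt(b.mean())

    def f(u):
        return np.sum((np.abs(A @ u) ** 2 - b) ** 2) / m

    step = 1.0
    for _ in range(iters):
        Au = A @ u
        r = np.abs(Au) ** 2 - b                     # measurement residual
        grad = (2.0 / m) * (A.conj().T @ (r * Au))  # Wirtinger gradient
        fu = f(u)
        while f(u - step * grad) > fu and step > 1e-12:
            step *= 0.5                             # backtrack for a monotone decrease
        u = u - step * grad
        step *= 2.0                                 # let the step grow back
    return u

# Toy demo: the solution is only defined up to a global phase
rng = np.random.default_rng(0)
n, m = 32, 256
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
u = phase_retrieval_rank1(A, np.abs(A @ x) ** 2)
phase = np.vdot(u, x) / abs(np.vdot(u, x))
print(np.linalg.norm(u * phase - x) / np.linalg.norm(x))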

 

 

Friday, 13:45 - 14:10 h, Room: H 1028, Talk 2

Zaid Harchaoui
Lifted coordinate descent for learning with gauge regularization

Coauthors: Miroslav Dudik, Jérôme Malick

 

Abstract:
We study learning problems with general sparsity-inducing matrix regularization penalties. We formulate the matrix regularizers as gauge functions and, using their structure, lift the optimization problem to a higher-dimensional space, where we propose to apply a coordinate descent algorithm. Our framework allows us to efficiently tackle difficult matrix-regularized objectives, e.g., with a trace-norm or group trace-norm regularization penalty. We present experimental results on synthetic datasets and on real-world large-scale computer vision datasets. Our algorithm is competitive and often outperforms existing approaches on large-scale problems.
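
A sketch (Python/NumPy) of the lifted coordinate-descent idea for the trace-norm case, under simplifying assumptions: the trace norm is the gauge of the rank-one atoms u v^T with unit-norm factors, so writing W = sum_j c_j * u_j v_j^T with c_j >= 0 turns the penalty lam * ||W||_* into the linear function lam * sum_j c_j, and each coordinate step adds the atom given by the top singular pair of the negative gradient. The loss/gradient callables and the halving line search are placeholders, and a fully corrective variant would also re-optimize the existing coefficients; this is illustrative, not the authors' method.

import numpy as np

def lifted_cd_tracenorm(loss_f, grad_f, shape, lam, iters=200):
    """Greedy coordinate descent on the lifted representation
    W = sum_j c_j * u_j v_j^T, c_j >= 0, where the trace-norm penalty
    becomes linear in the coefficients c_j."""
    W = np.zeros(shape)
    coefs = []
    for _ in range(iters):
        G = grad_f(W)
        # Best new atom: argmin over unit-norm u, v of <G, u v^T>,
        # i.e., the top singular pair of -G
        U, s, Vt = np.linalg.svd(-G)
        if s[0] <= lam:
            break  # gauge-dual optimality: no atom can decrease the objective
        D = np.outer(U[:, 0], Vt[0])
        # Halving line search on the new coordinate c >= 0
        c = 1.0
        obj = loss_f(W) + lam * sum(coefs)
        while loss_f(W + c * D) + lam * (sum(coefs) + c) > obj and c > 1e-10:
            c *= 0.5
        W = W + c * D
        coefs.append(c)
    return W

# Example: trace-norm regularized matrix completion on a toy problem
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))  # rank 3
mask = rng.random(M.shape) < 0.5                                 # observed entries
loss = lambda W: 0.5 * np.sum(((W - M) * mask) ** 2)
grad = lambda W: (W - M) * mask
W_hat = lifted_cd_tracenorm(loss, grad, M.shape, lam=1.0)
print(loss(W_hat), np.linalg.svd(W_hat, compute_uv=False)[:5])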

 

 

Friday, 14:15 - 14:40 h, Room: H 1028, Talk 3

Inderjit Dhillon
Sparse inverse covariance matrix estimation using quadratic approximation

Coauthors: Cho-Jui Hsieh, Pradeep Ravikumar, Matyas Sustik

 

Abstract:
The L1-regularized Gaussian maximum likelihood estimator has been shown to have strong statistical guarantees in recovering a sparse inverse covariance matrix, or alternatively the underlying graph structure of a Gaussian Markov Random Field, from very limited samples. We propose a new algorithm for solving the resulting optimization problem, which is a regularized log-determinant program. In contrast to other state-of-the-art methods, which largely use first-order gradient information, our algorithm is based on Newton's method and employs a quadratic approximation, with modifications that leverage the structure of the sparse Gaussian MLE problem. We present experimental results on synthetic and real application data that demonstrate considerable performance improvements over other state-of-the-art methods.
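
A compact sketch (Python/NumPy) in the spirit of the quadratic-approximation idea, without the free-set selection and other refinements that make the real method scale: each outer step builds the Newton model of the smooth part -log det X + tr(S X) around the current iterate (gradient S - W, Hessian the Kronecker product of W with itself, where W = X^{-1}), approximately solves the resulting lasso subproblem by coordinate descent over the entries of the update D, and backtracks so that X + t D stays positive definite. The starting point and sweep counts are illustrative assumptions.

import numpy as np

def soft(z, t):
    """Scalar soft-thresholding operator."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def objective(X, S, lam):
    sign, logdet = np.linalg.slogdet(X)
    if sign <= 0:
        return np.inf  # outside the positive-definite cone
    return -logdet + np.sum(S * X) + lam * np.abs(X).sum()

def quic_like(S, lam, newton_iters=10, cd_sweeps=3):
    """Newton-type method for min_X -log det X + tr(S X) + lam * ||X||_1."""
    p = S.shape[0]
    X = np.diag(1.0 / (np.diag(S) + lam))  # simple positive-definite start
    for _ in range(newton_iters):
        W = np.linalg.inv(X)               # gradient of the smooth part: S - W
        D = np.zeros_like(X)
        for _ in range(cd_sweeps):         # coordinate descent on the model
            for i in range(p):
                for j in range(i, p):
                    # Closed-form coordinate update of the quadratic model,
                    # keeping D symmetric
                    a = W[i, j] ** 2 + (W[i, i] * W[j, j] if i != j else 0.0)
                    b = S[i, j] - W[i, j] + W[i] @ D @ W[:, j]
                    c = X[i, j] + D[i, j]
                    mu = -c + soft(c - b / a, lam / a)
                    D[i, j] += mu
                    if i != j:
                        D[j, i] += mu
        # Backtracking line search keeping X + t D positive definite
        f0, t = objective(X, S, lam), 1.0
        while objective(X + t * D, S, lam) >= f0 and t > 1e-10:
            t *= 0.5
        X = X + t * D
    return X

# Toy demo with a sparse true precision matrix
rng = np.random.default_rng(0)
Theta = np.eye(8)
Theta[0, 1] = Theta[1, 0] = 0.4
samples = rng.multivariate_normal(np.zeros(8), np.linalg.inv(Theta), 200)
X_hat = quic_like(np.cov(samples, rowvar=False), lam=0.1)
print(np.round(X_hat, 2))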

 
