Invited Session Fri.1.H 1028

Friday, 10:30 - 12:00 h, Room: H 1028

Cluster 21: Sparse optimization & compressed sensing [...]

Greedy algorithms for sparse optimization

 

Chair: Prateek Jain

 

 

Friday, 10:30 - 10:55 h, Room: H 1028, Talk 1

Pradeep Ravikumar
Nearest neighbor based greedy coordinate descent

Coauthors: Inderjit Dhillon, Ambuj Tewari

 

Abstract:
Increasingly, optimization problems in machine learning, especially those arising from high-dimensional statistical estimation, have a large number of variables. Modern statistical estimators developed over the past decade have statistical or sample complexity that depends only weakly on the number of parameters when there is some structure to the problem, such as sparsity. A central question is whether similar advances can be made in their computational complexity as well. In this talk, we propose strategies indicating that such advances can indeed be made. In particular, we investigate the greedy coordinate descent algorithm, and note that performing the greedy step efficiently weakens the costly dependence on the problem size provided the solution is sparse. We then propose a suite of methods that perform these greedy steps efficiently by a reduction to nearest neighbor search. We also develop a practical implementation of our algorithm that combines greedy coordinate descent with locality-sensitive hashing, with which we not only significantly speed up the vanilla greedy method but also outperform cyclic coordinate descent as the problem size grows.
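The greedy step the abstract refers to — updating the coordinate whose gradient entry (equivalently, whose column's correlation with the residual) is largest in magnitude — can be sketched for a least-squares objective as follows. This is a minimal illustration only: the nearest-neighbor/LSH acceleration of the argmax, which is the talk's actual contribution, is omitted, and all names and parameters here are illustrative.

```python
import numpy as np

def greedy_coordinate_descent(A, b, n_iters=500):
    """Greedy (Gauss-Southwell) coordinate descent for min_x 0.5*||Ax - b||^2.

    Each step updates the coordinate with the largest gradient magnitude;
    this argmax over column/residual correlations is the step the talk
    proposes to accelerate via nearest neighbor search.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_norms = (A ** 2).sum(axis=0)   # a_j^T a_j for each column j
    residual = -b                      # A x - b with x = 0
    for _ in range(n_iters):
        grad = A.T @ residual          # full gradient A^T (A x - b)
        j = np.argmax(np.abs(grad))    # greedy step: most correlated column
        step = grad[j] / col_norms[j]
        x[j] -= step                   # exact minimization along coordinate j
        residual -= step * A[:, j]     # keep the residual in sync
    return x

# tiny synthetic instance with a sparse true solution
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
x_true = np.zeros(10)
x_true[[3, 7]] = [1.0, -2.0]
b = A @ x_true
x_hat = greedy_coordinate_descent(A, b)
```

Note the bookkeeping: maintaining the residual incrementally keeps each iteration at O(mn) for the gradient; replacing that argmax with approximate nearest neighbor search is what removes the dependence on n.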

 

 

Friday, 11:00 - 11:25 h, Room: H 1028, Talk 2

Prateek Jain
Orthogonal matching pursuit with replacement

Coauthors: Inderjit S. Dhillon, Ambuj Tewari

 

Abstract:
In this paper, we consider the problem of compressed sensing, where the goal is to recover almost all the sparse vectors using a small number of fixed linear measurements. For this problem, we propose a novel partial hard-thresholding operator that leads to a general family of iterative algorithms. While one extreme of the family yields well-known hard-thresholding algorithms like ITI (Iterative Thresholding with Inversion) and HTP (Hard Thresholding Pursuit), the other end of the spectrum leads to a novel algorithm that we call Orthogonal Matching Pursuit with Replacement (OMPR). OMPR, like the classic greedy algorithm OMP, adds exactly one coordinate to the support at each iteration, based on the correlation with the current residual. However, unlike OMP, OMPR also removes one coordinate from the support. This simple change allows us to prove that OMPR has the best known guarantees for sparse recovery in terms of the Restricted Isometry Property (a condition on the measurement matrix). Our proof techniques are novel and flexible enough to also permit the tightest known analysis of popular iterative algorithms such as CoSaMP and Subspace Pursuit.
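Based only on the abstract's description (the paper's exact pseudocode and step sizes may differ), one OMPR iteration can be sketched as: add the out-of-support coordinate most correlated with the residual, re-solve least squares on the enlarged support, then drop the smallest-magnitude coordinate so the support size stays k. All problem sizes and names below are illustrative.

```python
import numpy as np

def ompr(A, b, k, n_iters=100):
    """Simplified sketch of Orthogonal Matching Pursuit with Replacement:
    each iteration adds one coordinate (by correlation with the residual)
    and removes one, keeping the support at size k throughout."""
    n = A.shape[1]

    def ls_on(S):
        # least squares restricted to support S, zeros elsewhere
        x = np.zeros(n)
        x[S] = np.linalg.lstsq(A[:, S], b, rcond=None)[0]
        return x

    # initialize with the k columns most correlated with b
    S = list(np.argsort(-np.abs(A.T @ b))[:k])
    x = ls_on(S)
    for _ in range(n_iters):
        r = b - A @ x
        if np.linalg.norm(r) < 1e-10:              # support found, residual ~ 0
            break
        corr = np.abs(A.T @ r)
        corr[S] = -np.inf                          # consider coordinates outside S
        j = int(np.argmax(corr))                   # add: most correlated with residual
        S_plus = S + [j]
        x_plus = ls_on(S_plus)
        drop = min(S_plus, key=lambda i: abs(x_plus[i]))  # remove: smallest entry
        S = [i for i in S_plus if i != drop]
        x = ls_on(S)
    return x

# exact recovery of a 3-sparse vector from 100 random measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 120))
x_true = np.zeros(120)
x_true[[5, 40, 77]] = [4.0, -4.0, 4.0]
b = A @ x_true
x_hat = ompr(A, b, k=3)
```

The add/remove pair is the "replacement" that distinguishes OMPR from plain OMP, whose support only ever grows; it is what lets a wrongly selected coordinate be corrected in a later iteration.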

 
