Monday, 15:45 - 16:10 h, Room: H 1028


Lin Xiao
A proximal-gradient homotopy method for the sparse least-squares problem

Coauthor: Tong Zhang


We consider the l1-regularized least-squares problem in the context of sparse recovery or compressed sensing. The standard proximal gradient method (iterative soft-thresholding) has low computational cost per iteration but a rather slow global convergence rate. Nevertheless, when the solution is sparse, it often exhibits fast local linear convergence in its final stage. We exploit this local linear convergence using a homotopy continuation strategy: we minimize the objective for a sequence of decreasing values of the regularization parameter and use the approximate solution at the end of each stage to warm-start the next stage. Similar strategies have been studied in the literature, but there has been no theoretical analysis of their global iteration complexity. We show that, under suitable assumptions for sparse recovery, the proposed homotopy strategy ensures that all iterates along the homotopy solution path are sparse. The objective function is therefore effectively strongly convex along the path, and geometric convergence can be established at each stage. As a result, the overall iteration complexity of our method is O(log(1/ε)) for finding an ε-optimal solution.
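The continuation scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the decrease factor eta, the stage tolerances, and the starting value of the regularization parameter are all illustrative choices, and the paper's actual parameter schedule and stopping criteria may differ.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad(A, b, lam, x0, L, tol=1e-6, max_iter=1000):
    # Standard proximal gradient (ISTA) for 0.5*||Ax - b||^2 + lam*||x||_1,
    # with step size 1/L and a relative-change stopping rule.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - grad / L, lam / L)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def homotopy_ista(A, b, lam_target, eta=0.7, tol=1e-6):
    # Homotopy continuation: decrease the regularization parameter
    # geometrically and warm-start each stage from the previous one.
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    lam_max = np.max(np.abs(A.T @ b))   # x = 0 is optimal for lam >= lam_max
    lam = eta * lam_max
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        # Solve intermediate stages only loosely; the sparse iterates
        # keep each stage in the regime of fast linear convergence.
        x = prox_grad(A, b, lam, x, L, tol=1e-4)
        lam = max(eta * lam, lam_target)
    return prox_grad(A, b, lam_target, x, L, tol=tol)
```

On a noiseless compressed-sensing instance (Gaussian measurements of a sparse vector), this sketch typically recovers the underlying sparse signal accurately once the target regularization parameter is small.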


Talk 2 of the invited session Mon.3.H 1028
"Global rate guarantees in sparse optimization" [...]
Cluster 21
"Sparse optimization & compressed sensing" [...]

