Monday, 15:15 - 15:40 h, Room: H 1028


Wotao Yin
Augmented L1 and nuclear-norm minimization with a globally linearly convergent algorithm

Coauthor: Ming-Jun Lai


L1 minimization tends to give sparse solutions, while least squares (LS) gives dense solutions. We show that minimizing a weighted sum of the L1 and LS terms, with an appropriately small weight on the LS term, can efficiently recover sparse vectors with provable recovery guarantees. For compressive sensing, exact and stable recovery guarantees can be given in terms of the null-space property, the restricted isometry property, the spherical section property, and the "RIPless" property of the sensing matrix. Moreover, the Lagrange dual of the L1+LS problem is convex, unconstrained, and differentiable; hence a rich set of classical techniques such as gradient descent, line search, Barzilai-Borwein steps, quasi-Newton methods, and Nesterov's acceleration can be applied directly. We show that the gradient descent iteration is globally linearly convergent, and we give an explicit rate. This is the first global linear convergence result among gradient-based algorithms for sparse optimization. We also present an algorithm based on limited-memory BFGS and demonstrate its superior performance over several existing L1 solvers.
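The dual-gradient idea in the abstract can be sketched as follows for the model min ||x||_1 + 1/(2*alpha)*||x||_2^2 subject to Ax = b, whose dual is smooth and unconstrained with gradient b - A*x(y), where x(y) = alpha * shrink(A^T y, 1). This is only an illustrative sketch: the function names, the choice alpha = 10, the fixed step size, and the toy problem sizes are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def shrink(z, t=1.0):
    """Soft-thresholding: shrink(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def augmented_l1_dual_gd(A, b, alpha=10.0, iters=5000):
    """Gradient ascent on the smooth dual of
       min ||x||_1 + 1/(2*alpha)*||x||_2^2  s.t.  Ax = b.
    The primal iterate is recovered as x = alpha * shrink(A^T y, 1)."""
    m, n = A.shape
    y = np.zeros(m)
    # The dual gradient is Lipschitz with constant alpha * ||A||_2^2,
    # so a constant step of its reciprocal is a safe (illustrative) choice.
    tau = 1.0 / (alpha * np.linalg.norm(A, 2) ** 2)
    for _ in range(iters):
        x = alpha * shrink(A.T @ y, 1.0)  # primal point induced by dual y
        y = y + tau * (b - A @ x)         # ascent step; gradient is the residual b - Ax
    return alpha * shrink(A.T @ y, 1.0)

# Toy compressive-sensing instance: a 5-sparse vector, 40 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
x_hat = augmented_l1_dual_gd(A, b)
print(np.linalg.norm(A @ x_hat - b))  # constraint residual shrinks toward 0
```

Because the dual objective is differentiable, the same loop can be swapped for line search, Barzilai-Borwein, L-BFGS, or Nesterov steps, which is exactly the flexibility the abstract highlights.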


Talk 1 of the invited session Mon.3.H 1028
"Global rate guarantees in sparse optimization" [...]
Cluster 21
"Sparse optimization & compressed sensing" [...]

