Invited Session Mon.2.H 1028

Monday, 13:15 - 14:45 h, Room: H 1028

Cluster 21: Sparse optimization & compressed sensing [...]

Sparse optimization and generalized sparsity models

 

Chair: Gitta Kutyniok

 

 

Monday, 13:15 - 13:40 h, Room: H 1028, Talk 1

Rayan Saab
Recovering compressively sampled signals using partial support information

Coauthors: Michael P. Friedlander, Hassan Mansour, Ozgur Yilmaz

 

Abstract:
In this talk, we address the recovery conditions of weighted l1 minimization for signal reconstruction from compressed sensing measurements when partial support information is available. We show that if at least half of the (partial) support information is accurate, then weighted l1 minimization is stable and robust under weaker conditions than the analogous sufficient conditions for standard l1 minimization. Moreover, weighted l1 minimization provides better bounds on the reconstruction error in terms of the measurement noise and the compressibility of the signal to be recovered. We illustrate our results with numerical experiments.
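
A concrete formulation may help fix ideas. The following weighted l1 program is a standard way to write the recovery problem the abstract describes; the notation (measurement matrix A, noise level ε, support estimate T̃, weight ω) is illustrative and not taken from the talk itself:

```latex
\min_{z \in \mathbb{R}^n} \; \sum_{i=1}^{n} w_i \, |z_i|
\quad \text{subject to} \quad \|Az - y\|_2 \le \epsilon,
\qquad
w_i =
\begin{cases}
\omega \in [0,1], & i \in \widetilde{T},\\[2pt]
1, & i \notin \widetilde{T},
\end{cases}
```

where T̃ is the (possibly inaccurate) partial support estimate. Setting ω = 1 recovers standard l1 minimization; ω < 1 penalizes the estimated support less, which is what yields the weaker recovery conditions when T̃ is at least half accurate.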

 

 

Monday, 13:45 - 14:10 h, Room: H 1028, Talk 2

Emmanuel Candes
PhaseLift: Exact phase retrieval via convex programming

Coauthors: Yonina Eldar, Thomas Strohmer, Vladislav Voroninski

 

Abstract:
This talk introduces a novel framework for phase retrieval, a problem which arises in X-ray crystallography, diffraction imaging, astronomical imaging and many other applications. Our approach combines multiple structured illuminations with ideas from convex programming to recover the phase from intensity measurements, typically from the modulus of the diffracted wave. We demonstrate empirically that any complex-valued object can be recovered from knowledge of the magnitudes of just a few diffraction patterns by solving a simple convex optimization problem inspired by the recent literature on matrix completion. More importantly, we also demonstrate that our noise-aware algorithms are stable in the sense that the reconstruction degrades gracefully as the signal-to-noise ratio decreases. Finally, we present some novel theory showing that our entire approach is provably effective.
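
The lifting idea behind PhaseLift can be sketched as follows; the notation (sensing vectors a_k, intensity measurements b_k) is illustrative rather than taken from the talk. Each quadratic intensity measurement of the unknown x becomes linear in the rank-one matrix X = xx*:

```latex
b_k \;=\; |\langle a_k, x \rangle|^2
\;=\; a_k^* \, x x^* \, a_k
\;=\; \operatorname{Tr}\!\left(a_k a_k^* X\right),
\qquad X = x x^*, \; X \succeq 0, \; \operatorname{rank}(X) = 1.
```

Dropping the nonconvex rank constraint and minimizing the trace (a convex surrogate for rank, as in the matrix-completion literature the abstract cites) gives a semidefinite program:

```latex
\min_{X \succeq 0} \; \operatorname{Tr}(X)
\quad \text{subject to} \quad
\operatorname{Tr}\!\left(a_k a_k^* X\right) = b_k, \quad k = 1, \dots, m.
```

When the solution is rank one, x is recovered (up to a global phase) from its leading eigenvector.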

 

 

Monday, 14:15 - 14:40 h, Room: H 1028, Talk 3

Gitta Kutyniok
Clustered sparsity

 

Abstract:
The novel research area of compressed sensing surprisingly predicts that high-dimensional signals, which allow a sparse representation by a suitable basis or, more generally, a frame, can be recovered from what was previously considered highly incomplete linear measurements by using efficient algorithms. Lately, more attention has been paid to the fact that in most applications the nonzero entries of the sparse vector do not arise in arbitrary patterns, but are rather highly structured. It also became evident that often the interactions between columns of the sensing matrix in ill-posed problems are not arbitrary, but rather geometrically driven.

In this talk, we will introduce what we coin clustered sparsity to provide a meaningful mathematical framework for these considerations. We will discuss sparse recovery results and applications to data separation and inpainting. It is also intriguing to realize that this framework often naturally requires solving a different l1 minimization problem, namely the minimization on the analysis rather than the synthesis side. This is related to the recently introduced co-sparsity model.
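
The synthesis/analysis distinction mentioned above can be made precise as follows; the notation (frame Φ, measurement matrix A) is illustrative and not from the abstract. Synthesis-side minimization seeks a sparse coefficient vector that synthesizes the signal, while analysis-side minimization seeks a signal whose frame coefficients are sparse:

```latex
\text{synthesis:} \quad
\min_{c} \; \|c\|_1
\quad \text{subject to} \quad A \Phi c = y,
\qquad\qquad
\text{analysis:} \quad
\min_{x} \; \|\Phi^* x\|_1
\quad \text{subject to} \quad A x = y.
```

For an orthonormal basis Φ the two coincide, but for a redundant frame they generally differ; the co-sparsity model studies signals for which Φ*x has many zero entries.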

 
