Invited Session Wed.1.H 2038

Wednesday, 10:30 - 12:00 h, Room: H 2038

Cluster 4: Conic programming [...]

Conic and convex programming in statistics and signal processing II


Chair: Venkat Chandrasekaran



Wednesday, 10:30 - 10:55 h, Room: H 2038, Talk 1

Rachel Ward
Robust image recovery via total-variation minimization

Coauthor: Deanna Needell


Discrete images, composed of patches of slowly varying pixel values, have sparse or
compressible wavelet representations, which allow compressed-sensing techniques
such as L1-minimization to be applied. Such images also have sparse or compressible
discrete derivatives, which motivates the use of total-variation minimization for
image reconstruction. Although image compression is a primary motivation for
compressed sensing, stability results for total-variation minimization do not follow
directly from the standard theory. In this talk, we present numerical studies showing
the benefits of total-variation approaches, along with provable near-optimal
reconstruction guarantees for total-variation minimization derived using properties
of the bivariate Haar transform.
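The premise behind total-variation minimization can be illustrated numerically. The sketch below (my illustration, not material from the talk) builds a piecewise-constant toy image and shows that its discrete derivatives are sparse, which is exactly the structure the TV seminorm penalizes; the function names are hypothetical.

```python
import numpy as np

def discrete_gradient(img):
    """Forward differences in both directions (anisotropic discretization)."""
    dx = np.diff(img, axis=1)
    dy = np.diff(img, axis=0)
    return dx, dy

def total_variation(img):
    """Anisotropic TV seminorm: sum of absolute discrete derivatives."""
    dx, dy = discrete_gradient(img)
    return np.abs(dx).sum() + np.abs(dy).sum()

# An 8x8 "image" made of two constant patches.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0

dx, dy = discrete_gradient(img)
nonzeros = np.count_nonzero(dx) + np.count_nonzero(dy)
total = dx.size + dy.size
print(nonzeros, total)         # → 16 112  (derivatives are very sparse)
print(total_variation(img))    # → 16.0
```

Only the pixels on the patch boundary contribute nonzero derivatives, so minimizing total variation subject to the measurements favors exactly this kind of piecewise-constant reconstruction.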



Wednesday, 11:00 - 11:25 h, Room: H 2038, Talk 2

Joel A. Tropp
Sharp recovery bounds for convex deconvolution, with applications

Coauthor: Michael B. McCoy


Suppose we observe the sum of two structured signals, and we are asked to identify the two components in the mixture. This setup includes the problem of separating two signals that are sparse in different bases and the problem of separating a sparse matrix from a low-rank matrix. This talk describes a convex optimization framework for solving these deconvolution problems and others.
We present a randomized signal model that captures the idea of "incoherence" between two structures. The calculus of spherical integral geometry provides exact formulas that describe when the optimization problem will succeed (or fail) in deconvolving the component signals with high probability. This approach yields summary statistics that measure the complexity of a particular structured signal. The total complexity of the two signals is the only factor that affects whether deconvolution is possible.
We consider three stylized problems. (1) Separating two signals that are sparse in mutually incoherent bases. (2) Decoding spread-spectrum transmissions in the presence of impulsive noise. (3) Removing sparse corruptions from a low-rank matrix. In each case, the theory accurately predicts performance.
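The third stylized problem can be played with on a toy instance. The sketch below is my own crude alternating hard-thresholding heuristic, not the convex program analyzed in the talk: it strips a single large sparse corruption from a rank-one matrix by alternating a threshold step with a rank-one fit. The function names and the threshold value are illustrative assumptions.

```python
import numpy as np

def rank1_approx(A):
    """Best rank-one approximation of A via the top singular pair."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return s[0] * np.outer(U[:, 0], Vt[0])

def demix(M, tau=2.0, iters=30):
    """Heuristic stand-in for convex demixing: alternate a hard-threshold
    step (sparse estimate) with a rank-one fit (low-rank estimate)."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        R = M - L
        S = np.where(np.abs(R) > tau, R, 0.0)   # keep large residuals as sparse part
        L = rank1_approx(M - S)                 # refit the low-rank part
    return L, S

# Rank-one matrix plus one large sparse corruption.
L_true = np.ones((5, 5))
S_true = np.zeros((5, 5)); S_true[0, 0] = 10.0
L_hat, S_hat = demix(L_true + S_true)
print(np.count_nonzero(S_hat))  # → 1  (the single corruption is isolated)
```

On this well-separated toy instance the heuristic recovers both components; the point of the talk's geometric theory is to predict precisely when such separation is possible for the actual convex formulation.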



Wednesday, 11:30 - 11:55 h, Room: H 2038, Talk 3

Parikshit Shah
Group symmetry and covariance regularization

Coauthor: Venkat Chandrasekaran


Statistical models that possess symmetry arise in diverse settings such as random fields associated with geophysical phenomena, exchangeable processes in Bayesian statistics, and cyclostationary processes in engineering. We formalize the notion of a symmetric model via group invariance. We propose projection onto a group fixed-point subspace as a fundamental way of regularizing covariance matrices in the high-dimensional regime. In terms of parameters associated with the group, we derive precise rates of convergence for the regularized covariance matrix and demonstrate that significant statistical gains may be expected in terms of sample complexity. We further explore the consequences of symmetry for related model-selection problems such as the learning of sparse covariance and inverse covariance matrices.
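For a finite group acting by permutation matrices, the fixed-point-subspace projection is the average of the estimate over conjugation by every group element. The sketch below (my illustration, assuming a cyclic group C_4 acting by coordinate shifts; `cyclic_project` and the toy covariance are hypothetical names) shows that the projection fixes any invariant (circulant) covariance and maps a non-invariant estimate to a circulant one.

```python
import numpy as np

def cyclic_project(S):
    """Average S over conjugation by all cyclic shifts: the orthogonal
    projection onto the fixed-point subspace (circulant matrices)."""
    n = S.shape[0]
    P = np.roll(np.eye(n), 1, axis=0)   # one-step cyclic shift matrix
    Pk = np.eye(n)
    acc = np.zeros_like(S, dtype=float)
    for _ in range(n):
        acc += Pk @ S @ Pk.T
        Pk = P @ Pk
    return acc / n

# A circulant (C_4-invariant) covariance is fixed by the projection.
Sigma = np.array([np.roll([2.0, 0.5, 0.1, 0.5], k) for k in range(4)])
print(np.allclose(cyclic_project(Sigma), Sigma))  # → True

# A non-invariant perturbation gets spread evenly over the group orbit.
noisy = Sigma.copy()
noisy[0, 0] += 0.5                    # breaks the symmetry
proj = cyclic_project(noisy)
print(np.allclose(proj[1], np.roll(proj[0], 1)))  # → True: result is circulant
```

Because the projection averages over the group orbit, the symmetrized estimator has far fewer free parameters than a generic covariance matrix, which is the source of the sample-complexity gains discussed in the abstract.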

