Invited Session Wed.3.H 1028

Wednesday, 15:15 - 16:45 h, Room: H 1028

Cluster 21: Sparse optimization & compressed sensing [...]

Structured models in sparse optimization

 

Chair: John C. Duchi

 

 

Wednesday, 15:15 - 15:40 h, Room: H 1028, Talk 1

Rodolphe Jenatton
Proximal methods for hierarchical sparse coding and structured sparsity

Coauthors: Francis Bach, Julien Mairal, Guillaume Obozinski

 

Abstract:
Sparse coding consists in representing signals as sparse linear combinations of atoms selected from a dictionary. We consider an extension of this framework where the atoms are further assumed to be embedded in a tree. This is achieved using a recently introduced tree-structured sparse regularization norm, which has proven useful in several applications. This norm leads to regularized problems that are difficult to optimize, and we propose efficient algorithms for solving them. More precisely, we show that the proximal operator associated with this norm is computable exactly via a dual approach that can be viewed as the composition of elementary proximal operators. Our procedure has complexity that is linear, or close to linear, in the number of atoms, and allows the use of accelerated gradient techniques to solve the tree-structured sparse approximation problem at the same computational cost as traditional approaches based on the L1-norm. We also discuss extensions of this dual approach to more general settings of structured sparsity. Finally, examples taken from image/video processing and topic modeling illustrate the benefits of our method.
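
A minimal sketch of the composition result described above, assuming the tree-structured norm is a sum of weighted L2-norms over nested groups: applying the elementary group soft-thresholding operators from the leaves toward the root then yields the exact proximal operator. The groups and weights below are toy values chosen for illustration, not data from the talk.

    import numpy as np

    def group_soft_threshold(w, idx, lam):
        # Prox of lam * ||w[idx]||_2: shrink the block toward zero.
        norm = np.linalg.norm(w[idx])
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        w[idx] = scale * w[idx]
        return w

    def tree_prox(v, groups):
        # groups: list of (indices, lam_g) pairs, ordered so that each
        # child group precedes its ancestors (leaves first, root last);
        # under that ordering the composition of elementary proximal
        # operators equals the prox of the full tree-structured norm.
        w = v.copy()
        for idx, lam in groups:
            w = group_soft_threshold(w, np.asarray(idx), lam)
        return w

    # Toy tree over four variables: two leaf groups, then the root.
    groups = [([2, 3], 0.5), ([1], 0.3), ([0, 1, 2, 3], 0.4)]
    print(tree_prox(np.array([1.0, -0.2, 0.8, 0.1]), groups))

Each coordinate is touched once per group containing it, which is where the linear, or close to linear, complexity in the number of atoms comes from.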

 

 

Wednesday, 15:45 - 16:10 h, Room: H 1028, Talk 2

Minh Pham
Alternating linearization for structured regularization problems

Coauthors: Xiaodong Lin, Andrzej Ruszczynski

 

Abstract:
We adapt the alternating linearization method for proximal decomposition to structured regularization problems, in particular to generalized lasso problems. The method is related to two well-known operator splitting methods, the Douglas-Rachford and Peaceman-Rachford methods, but it has descent properties with respect to the objective function. Its convergence mechanism is related to that of bundle methods in nonsmooth optimization. We also discuss implementations for very large problems that use specialized algorithms and sparse data structures. Finally, we present numerical results for several synthetic and real-world examples, including a three-dimensional fused lasso problem, which illustrate the scalability, efficacy, and accuracy of the method.
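
A minimal sketch of one alternating linearization pass, restricted to the plain lasso (the generalized lasso with the penalty matrix D equal to the identity) so that both subproblems have closed forms; the simple descent test below stands in for the method's actual acceptance rule.

    import numpy as np

    def alin_lasso(X, y, lam, rho=1.0, iters=200):
        # min_b 0.5*||y - X b||^2 + lam*||b||_1, solved by alternating
        # between two subproblems: each keeps one term exact and
        # replaces the other by a linearization plus a proximal term.
        n = X.shape[1]
        XtX, Xty = X.T @ X, X.T @ y
        F = lambda b: 0.5 * np.sum((y - X @ b) ** 2) + lam * np.abs(b).sum()
        x_hat = np.zeros(n)          # current prox center (best point)
        g_h = np.zeros(n)            # subgradient of the l1 term
        for _ in range(iters):
            # f-subproblem: exact quadratic f, linearized l1 term.
            x_f = np.linalg.solve(XtX + rho * np.eye(n),
                                  Xty - g_h + rho * x_hat)
            if F(x_f) < F(x_hat):
                x_hat = x_f
            # h-subproblem: linearize f at x_f, keep the l1 term exact
            # (a soft-thresholding step).
            g_f = XtX @ x_f - Xty
            z = x_hat - g_f / rho
            x_h = np.sign(z) * np.maximum(np.abs(z) - lam / rho, 0.0)
            # Optimality of the h-subproblem supplies the next l1
            # subgradient used to linearize h in the f-subproblem.
            g_h = -g_f - rho * (x_h - x_hat)
            if F(x_h) < F(x_hat):
                x_hat = x_h
        return x_hat

With a general matrix D the soft-thresholding step no longer applies directly; handling that subproblem at scale is where specialized algorithms and sparse data structures become necessary.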

 

 

Wednesday, 16:15 - 16:40 h, Room: H 1028, Talk 3

John C. Duchi
Adaptive subgradient methods for stochastic optimization and online learning

Coauthors: Elad Hazan, Yoram Singer

 

Abstract:
We present a new family of subgradient methods that dynamically incorporate knowledge of the geometry of the data observed in earlier iterations to perform more informative gradient-based learning. Metaphorically, the adaptation allows us to find needles in haystacks in the form of very predictive but rarely seen features. Our paradigm stems from recent advances in stochastic optimization and online learning which employ proximal functions to control the gradient steps of the algorithm. We describe and analyze an apparatus for adaptively modifying the proximal function, which significantly simplifies the setting of a learning rate and yields regret guarantees that are provably as good as those of the best proximal function that could have been chosen in hindsight. We give several efficient algorithms for empirical risk minimization problems with common and important regularization functions and domain constraints. Experiments corroborate our theoretical analysis and show that adaptive subgradient methods significantly outperform state-of-the-art, yet non-adaptive, subgradient algorithms.
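
A minimal sketch of the diagonal variant of such an adaptive scheme: squared gradients are accumulated per coordinate, so rarely updated coordinates keep large effective step sizes. The function names and the toy problem are illustrative, not code from the talk.

    import numpy as np

    def adagrad(grad, x0, eta=0.1, eps=1e-8, steps=1000):
        # Per-coordinate steps eta / sqrt(sum of squared gradients):
        # coordinates seen rarely retain large steps, keeping the
        # method sensitive to rare but predictive features.
        x = x0.astype(float).copy()
        G = np.zeros_like(x)          # running sum of squared gradients
        for _ in range(steps):
            g = grad(x)
            G += g * g
            x -= eta * g / (np.sqrt(G) + eps)
        return x

    # Toy usage: least squares on synthetic data.
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
    x_star = adagrad(lambda x: A.T @ (A @ x - b), np.zeros(5))

Note that only eta and eps need tuning, in contrast to the full step-size schedule a plain subgradient method requires.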

 
