Wednesday, 16:15 - 16:40 h, Room: H 1028

 

John Duchi
Adaptive subgradient methods for stochastic optimization and online learning

Coauthors: Elad Hazan, Yoram Singer

 

Abstract:
We present a new family of subgradient methods that dynamically
incorporate knowledge of the geometry of the data observed in earlier
iterations to perform more informative gradient-based
learning. Metaphorically, the adaptation allows us to find needles in
haystacks in the form of very predictive but rarely seen features. Our
paradigm stems from recent advances in stochastic optimization and
online learning which employ proximal functions to control the
gradient steps of the algorithm. We describe and analyze an apparatus
for adaptively modifying the proximal function, which significantly
simplifies setting a learning rate and results in regret guarantees
that are provably as good as the best proximal function that can be
chosen in hindsight. We give several efficient algorithms for
empirical risk minimization problems with common and important
regularization functions and domain constraints. We corroborate our
theoretical analysis with experiments showing that adaptive
subgradient methods significantly outperform state-of-the-art, yet
non-adaptive, subgradient algorithms.
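
The abstract does not spell out the update rule, so the following is a minimal sketch, assuming the commonly used diagonal form of adaptive subgradient updates, in which each coordinate's effective step size is scaled by its accumulated squared subgradients; the function names and parameters are illustrative, not the authors' exact formulation.

import numpy as np

def adaptive_subgradient_descent(subgrad, x0, eta=0.1, eps=1e-8, n_steps=100):
    """Per-coordinate adaptive subgradient method (illustrative sketch).

    subgrad(x) should return a subgradient of the objective at x.
    eta is a single global step size; the per-coordinate scaling below
    adapts it to the observed geometry of the gradients.
    """
    x = np.asarray(x0, dtype=float)
    g_sq_sum = np.zeros_like(x)          # accumulated squared subgradients per coordinate
    for _ in range(n_steps):
        g = subgrad(x)
        g_sq_sum += g ** 2               # update the diagonal "geometry" estimate
        # coordinates with small accumulated gradients (rarely seen features)
        # receive larger effective steps; frequently active ones are damped
        x -= eta * g / (np.sqrt(g_sq_sum) + eps)
    return x

# Example: an ill-conditioned diagonal quadratic, where per-coordinate
# scaling removes the need to hand-tune a separate rate per coordinate.
if __name__ == "__main__":
    A = np.array([100.0, 1.0])
    x_opt = adaptive_subgradient_descent(lambda x: A * x, x0=[1.0, 1.0],
                                         eta=0.5, n_steps=500)
    print(x_opt)

The single hyperparameter eta plays the role of the learning rate the abstract says is simplified; the accumulated squared-gradient term stands in for the adaptively chosen proximal function.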

 

Talk 3 of the invited session Wed.3.H 1028
"Structured models in sparse optimization" [...]
Cluster 21
"Sparse optimization & compressed sensing" [...]

 
