Conference Program


Thursday, August 23, 17:00 – 17:50 h, H0104:

Michael P. Friedlander: Data fitting and optimization with randomized sampling

Chair: Luís Nunes Vicente



For many structured data-fitting applications, incremental gradient methods (both deterministic and randomized) offer inexpensive iterations by sampling only subsets of the data. They make great progress initially, but eventually stall. Full gradient methods, in contrast, often achieve steady convergence, but may be prohibitively expensive for large problems. Applications in machine learning and robust seismic inversion motivate us to develop an inexact gradient method and sampling scheme that exhibit the benefits of both incremental and full gradient methods.
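The trade-off described above can be illustrated on a toy problem. The following is a minimal sketch, not the speaker's actual method: it runs cheap sampled-gradient steps on a small least-squares data-fitting problem, then switches to full-gradient steps. All names, sizes, and step lengths are illustrative assumptions.

```python
import numpy as np

# Illustrative toy problem (assumed, not from the talk): minimize the
# averaged least-squares objective f(x) = ||A x - b||^2 / (2m).
rng = np.random.default_rng(0)
m, n = 200, 5
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n)           # consistent system: a minimizer exists

def objective(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 / m

def full_gradient(x):
    # Exact gradient: every step touches all m data rows.
    return A.T @ (A @ x - b) / m

def sampled_gradient(x, batch):
    # Unbiased estimate from a small subset of rows: much cheaper per step.
    Ab, bb = A[batch], b[batch]
    return Ab.T @ (Ab @ x - bb) / len(batch)

x = np.zeros(n)
obj0 = objective(x)
for _ in range(300):                     # sampled steps: fast early progress
    batch = rng.choice(m, size=10, replace=False)
    x -= 0.1 * sampled_gradient(x, batch)
obj_sampled = objective(x)

for _ in range(300):                     # full-gradient steps: steady convergence
    x -= 0.5 * full_gradient(x)
obj_full = objective(x)
print(obj0, obj_sampled, obj_full)
```

Each sampled step costs only a batch of 10 rows instead of all 200, which is why such methods make rapid initial progress; the full-gradient phase then drives the objective steadily toward zero. Hybrid schemes of the kind the abstract mentions grow the sample size adaptively rather than switching abruptly as this sketch does.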


Biographical sketch:

Michael P. Friedlander is Associate Professor of Computer Science at the University of British Columbia. He received his PhD in Operations Research from Stanford University in 2002, and his BA in Physics from Cornell University in 1993. From 2002 to 2004 he was the Wilkinson Fellow in Scientific Computing at Argonne National Laboratory. He was a senior fellow at UCLA's Institute for Pure and Applied Mathematics in 2010. He serves on the editorial boards of SIAM J. on Optimization, SIAM J. on Matrix Analysis and Applications, Mathematical Programming Computation, Optimization Methods and Software, and the Electronic Transactions on Numerical Analysis. His research is primarily in developing numerical methods for constrained optimization, their software implementation, and applications in signal processing and image reconstruction.
