Tuesday, 15:15 - 15:40 h, Room: H 3503


Yurii Nesterov
Random gradient-free minimization of convex functions


In this talk, we prove the complexity bounds
for methods of Convex Optimization based only on
computation of function values. The search directions
of our schemes are randomly generated Gaussian
vectors. It appears that such methods usually need at most
n times more iterations than the standard gradient
methods, where n is the dimension of the space of
variables. This conclusion is true both for nonsmooth and
smooth problems. For the latter class, we also present an
accelerated scheme with the expected rate of convergence
O(n²/k²), where k is the iteration counter.
For Stochastic Optimization, we propose a zero-order
scheme and justify its expected rate of convergence O(n/k^(1/2)). We also give some bounds for the rate of
convergence of the random gradient-free methods to
stationary points of nonconvex functions, both for smooth
and nonsmooth cases. Our theoretical results are supported
by preliminary computational experiments.
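To illustrate the idea, the following is a minimal sketch of one such random gradient-free iteration: the search direction is a standard Gaussian vector, and the gradient along that direction is estimated by a finite difference with a small smoothing parameter. The function names, the smoothing parameter mu, and the step size h are illustrative choices, not the exact constants analyzed in the talk.

```python
import numpy as np

def gradient_free_step(f, x, mu=1e-4, h=0.01, rng=None):
    """One iteration of a random gradient-free scheme:
    draw a Gaussian direction u, estimate the directional
    derivative by a forward finite difference, and take a
    gradient-type step along u."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)        # Gaussian search direction
    g = (f(x + mu * u) - f(x)) / mu * u     # zero-order gradient estimate
    return x - h * g                        # gradient-type step

# Toy example: minimize the convex quadratic f(x) = ||x||^2 / 2
f = lambda x: 0.5 * np.dot(x, x)
x = np.ones(5)
for _ in range(2000):
    x = gradient_free_step(f, x)
```

In expectation the estimate g behaves like the gradient of a Gaussian-smoothed version of f, which is why, per the complexity bounds above, such schemes pay roughly a factor of n (the dimension) in iterations compared with standard gradient methods.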


Talk 1 of the invited session Tue.3.H 3503
"Novel approaches in derivative-free optimization" [...]
Cluster 6
"Derivative-free & simulation-based optimization" [...]

