### Optimization I: Deterministic, Unconstrained Optimization (Introduction to Statistical Computing)

Lecture 17: Deterministic, Unconstrained Optimization. The trade-off of approximation versus time. Basics from calculus about minima. The gradient-descent method, its pros and cons. Taylor expansion as the motivation for Newton's method; Newton's method as gradient descent with adaptive step-size; its pros and cons. Coordinate descent instead of multivariate optimization. The Nelder-Mead/simplex method for derivative-free optimization. Peculiarities of optimizing statistical functionals: don't bother optimizing much within the margin of error; asymptotic calculations of said margin, using Taylor expansion and the rules for adding and multiplying variances. Illustrations with `optim`.
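The lecture's illustrations use R's `optim`; as a language-neutral sketch of the first method listed (here in Python, on a made-up quadratic objective, not the lecture's code), gradient descent with a fixed step-size looks like:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.05, tol=1e-8, max_iter=10000):
    """Minimize a smooth function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient ~ 0: at a (local) minimum
            break
        x = x - step * g             # fixed step-size: the method's main weakness
    return x

# Hypothetical example: minimize f(x, y) = (x - 1)^2 + 10*(y + 2)^2
grad_f = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(gradient_descent(grad_f, [0.0, 0.0]))  # converges to approximately [1, -2]
```

The fixed `step` illustrates the "cons" part: too large a step oscillates or diverges (here, anything above 0.1 blows up along the steep coordinate), too small a step crawls.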
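Newton's method, read as gradient descent whose step-size is set adaptively by the Hessian, might be sketched like this (again in Python on a hypothetical quadratic, where a single Newton step is exact):

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-8, max_iter=100):
    """Newton's method: gradient descent with the Hessian choosing the step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H @ dx = g rather than inverting H explicitly
        x = x - np.linalg.solve(hess(x), g)
    return x

# Same flavor of quadratic: f(x, y) = (x - 1)^2 + 10*(y + 2)^2
grad_f = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
hess_f = lambda x: np.diag([2.0, 20.0])
print(newton(grad_f, hess_f, [0.0, 0.0]))  # lands on [1, -2] in one step
```

The pros and cons trade places: far fewer iterations, but each one needs second derivatives and a linear solve, which is expensive (or impossible) in high dimensions.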
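Coordinate descent replaces one multivariate problem with a cycle of one-dimensional ones. A minimal derivative-free sketch (a simple step-halving search along each axis, assumed for illustration and not the lecture's code):

```python
import numpy as np

def coordinate_descent(f, x0, step=1.0, tol=1e-6):
    """Derivative-free coordinate descent: improve one coordinate at a time,
    halving the trial step whenever no move along any axis helps."""
    x = np.asarray(x0, dtype=float)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if f(trial) < f(x):  # keep any axis-aligned improvement
                    x = trial
                    improved = True
                    break
        if not improved:
            step /= 2.0  # no axis helped: refine the search scale
    return x

# Hypothetical objective, no derivatives needed
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
print(coordinate_descent(f, [0.0, 0.0]))  # converges to approximately [1, -2]
```

Only function values are used, which is the same appeal Nelder-Mead has; the cost is that axis-aligned moves can be very slow when the coordinates are strongly coupled.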

Optional reading: Francis Spufford, *Red Plenty* (cf.); Herbert Simon, *The Sciences of the Artificial*, especially chapters 5 and 8.


Posted at October 29, 2012 10:30