### I Taught Him Everything He Knows

"Concentration of measure" is a phenomenon in probability theory where,
roughly speaking, any set which contains a substantial fraction of the
probability can be expanded just a little to yield a set
containing *most* of the probability. Another way to say this is that,
given any reasonably continuous function, the probability that it deviates from
its mean by more than a given amount is exponentially small in that amount, and
the exponential rate *does not*
depend on the precise function. This makes concentration of measure results
extremely useful for questions involving the estimation of complicated and ugly
functions. The classical work in this area proves concentration of measure for
various kinds of sequences of independent variables, but for real applications
in statistics, machine learning, or physics you'd want to be able to handle
dependence. The natural way to do this would be to look at mixing processes,
which are at least asymptotically independent.
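To see the flavor of these results in the simplest independent case, here is a small simulation (my own illustration, not from Leo's papers): for the sample mean of $n$ i.i.d. Uniform$[0,1]$ variables, Hoeffding's inequality bounds the deviation probability by $2\exp(-2nt^2)$, where the rate in the exponent depends only on the range of the variables, not on any finer details of the distribution.

```python
import math
import random

def deviation_probability(n, t, trials=20000, seed=0):
    """Monte Carlo estimate of P(|sample mean - 1/2| > t) for the
    mean of n i.i.d. Uniform[0,1] variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) > t:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    t = 0.1
    for n in (10, 50, 100):
        empirical = deviation_probability(n, t)
        # Hoeffding's bound: P(|mean - E mean| > t) <= 2 exp(-2 n t^2)
        hoeffding = 2 * math.exp(-2 * n * t**2)
        print(f"n={n:4d}  empirical={empirical:.4f}  bound={hoeffding:.4f}")
```

The point of the printout is that the empirical deviation probability falls off exponentially in $n$, and sits below the universal bound, which was computed without knowing anything about the uniform distribution beyond its range. Extending this kind of guarantee from independent to mixing sequences is exactly what the papers below are about.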

Leo Kontorovich, who was one
of the students in my stochastic processes
class this past spring, now has
a paper summarizing his work on, precisely, concentration of measure for mixing
sequences:

- Leonid Kontorovich, "Metric and Mixing Sufficient Conditions for Concentration of Measure", math.PR/0610427
*Abstract*: We derive sufficient conditions for a family
$(X^n,\rho_n,P_n)$ of metric probability spaces to have the measure
concentration property. Specifically, if the sequence $\{P_n\}$ of probability
measures satisfies a strong mixing condition (which we call $\eta$-mixing) and
the sequence of metrics $\{\rho_n\}$ is what we call $\Psi$-dominated, we show
that $(X^n,\rho_n,P_n)$ is a normal Levy family. We establish these properties
for some metric probability spaces, including the possibly novel $X=[0,1]$,
$\rho_n=\ell_1$ case.

This paper serves as a good entry-point to Leo's earlier papers on measure
concentration for hidden Markov processes (math.PR/0608064) and Markov
tree processes (math.PR/0608511), as well as his paper with Kavita
Ramanan on the general martingale-difference method of which these other
results are special cases (math.PR/0609835).

Since I'll be teaching stochastic processes again in the spring, I would
very much *like* to claim that Leo wrote these papers as a direct result
of having taken my class. But the truth is that Leo knew so much about this
already that, so far from teaching him everything he knows, I learned almost all
I know about concentration *from him*. But this is one of the
real pleasures of teaching...

Enigmas of Chance;
Corrupting the Young;
Incestuous Amplification

Posted at October 16, 2006 14:00 | permanent link