Chaitin's argument turns on the notion of Kolmogorov complexity (see my review of Badii and Politi's Complexity). In essence, it rests on the fact that a compression algorithm can't compress anything more complicated than itself. Combine this result with the idea that the point of science is to produce concise descriptions of data-sets, and the fact that human beings certainly have a finite algorithmic information-content, and we do in fact get what looks like a strong limit on science.
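
A rough illustration of the incompressibility point (not Chaitin's own construction): using an off-the-shelf compressor like `zlib` as a crude, practical stand-in for the ideal Kolmogorov compressor, highly regular data shrinks dramatically, while random data does not shrink at all.

```python
import os
import zlib

n = 100_000

# Highly regular data: a short program ("print 'ab' n/2 times") generates it,
# so its algorithmic complexity is tiny, and even a practical compressor
# squeezes it down to a sliver of its original size.
structured = b"ab" * (n // 2)

# Random data: with overwhelming probability it has no description much
# shorter than itself, and the compressor can do essentially nothing.
incompressible = os.urandom(n)

print(len(zlib.compress(structured, 9)))      # a tiny fraction of n
print(len(zlib.compress(incompressible, 9)))  # slightly *more* than n
```

Of course `zlib` is far weaker than an optimal compressor, but the asymmetry it exhibits here is exactly the one the theory predicts.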

Unfortunately for Chaitin, but fortunately for science, his notion of the
acceptable kinds of descriptions is too restricted. Kolmogorov was, after all,
aiming at describing *randomness,* and his complexity measure is
maximized by sequences of independent random variables; but these can be
extremely concisely described, as soon as we decide we don't care about the
exact sequence and would prefer statistics. Since in Real Science we're stuck
with statistics anyway, this really isn't much of a loss. (Flake acknowledges
the importance of statistical descriptions --- pp. 132--4 --- but doesn't
connect this to the methodological claims.)

Some of my research actually turns on trying to exploit these advantages of
statistical over exact descriptions. See Cosma Rohilla Shalizi and James
P. Crutchfield, "Computational
Mechanics: Pattern and Prediction, Structure and Simplicity", Journal
of Statistical Physics **104** (2001): 817--879
= arxiv:cond-mat/9907176.