
Complexity Measures

24 Oct 2024 11:07

C'est magnifique, mais ce n'est pas de la science. (Lots of 'em ain't that splendid, either.) This is, in the words of the estimable Dave Feldman (who taught me most of what I know about it, but has rather less jaundiced views), a "micro-field" within the soi-disant study of complexity. Every few months seems to produce another paper proposing yet another measure of complexity, generally a quantity which can't be computed for anything you'd actually care to know about, if it can be computed at all. These quantities are almost never related to any other variable, so they form no part of any theory telling us when or how things get complex, and are usually just quantification for quantification's own sweet sake.

The first and still classic measure of complexity is that introduced by Kolmogorov, which is (roughly) the length of the shortest computer program capable of generating a given string. This quantity is in general uncomputable, in the sense that there is simply no algorithm which will compute it. This comes from a result in computer science known as the halting problem, which in turn is a disguised form of Gödel's theorem, so this limit is not likely to be broken any time soon. Moreover, the Kolmogorov complexity is maximized by random strings, so it's really telling us what's random, not what's complicated, and it's gradually come to be called the "algorithmic information." It plays a very important role in every discussion of measuring complexity: in a pious act of homage to our intellectual ancestors, it is solemnly taken out, exhibited, and solemnly put away as useless for any practical application.
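
Since the genuine article is out of reach, a common dodge is to use the length of a compressed file as a crude, computable upper bound. Here is a toy sketch in Python, with zlib standing in, very loosely, for "the shortest program"; nothing here pretends to be the actual Kolmogorov complexity, but it does make the point that randomness, not structure, is what maximizes this sort of quantity.

    # Kolmogorov complexity itself is uncomputable, but the length of a
    # compressed file gives a crude, computable upper bound.  zlib here is
    # only an illustrative stand-in for "the shortest program".
    import random
    import zlib

    n = 10_000
    random.seed(1)

    # An (approximately) random string of bytes, and a highly structured one.
    random_bytes = bytes(random.randrange(256) for _ in range(n))
    structured_bytes = b"ab" * (n // 2)

    for name, s in [("random", random_bytes), ("structured", structured_bytes)]:
        compressed = zlib.compress(s, level=9)
        print(f"{name:10s}: {len(s):6d} bytes raw, "
              f"{len(compressed):6d} bytes compressed")

    # The random string barely compresses; the structured one collapses.
    # Like Kolmogorov complexity proper, this proxy is maximized by randomness.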

Generally speaking, complexity measures either take after Kolmogorov complexity, and involve finding some computer or abstract automaton which will produce the pattern of interest, or they take after information theory and produce something like the entropy, which, while in principle computable, can be very hard to calculate reliably for experimental systems. Bennett's "logical depth" is an instance of the former tendency (it's the running time of the shortest program), Lloyd and Pagels's "thermodynamic depth" of the latter (it's the entropy of the ensemble of possible trajectories leading to the current state; uncomputable in the weaker sense that you'd have to go all the way back to the beginning of time...). The statistical complexity of computational mechanics partakes of both natures, being the entropy of an abstract automaton; it can actually be calculated. (I presently have a couple of papers incubating where we do calculate the statistical complexity of various real-world processes.)
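
To make the last of these concrete: the statistical complexity is the Shannon entropy of the distribution over the causal states of the process's ε-machine. Here is a toy sketch in Python for the standard "golden mean" process (no two 1s in a row); the two-state machine and its transition probabilities below are textbook values, not anything drawn from the papers just mentioned.

    # Toy example: statistical complexity C_mu of the "golden mean" process,
    # from a hand-specified epsilon-machine (standard textbook values).
    import numpy as np

    # Causal-state transition matrix T[i, j] = Pr(next state j | current state i),
    # with the emitted symbols marginalized out.  State A emits 0 or 1 with
    # equal probability (staying in A on a 0, moving to B on a 1); state B
    # must emit a 0 and return to A.
    T = np.array([[0.5, 0.5],
                  [1.0, 0.0]])

    # Stationary distribution over causal states, found by power iteration.
    pi = np.array([0.5, 0.5])
    for _ in range(200):
        pi = pi @ T
    pi = pi / pi.sum()

    # Statistical complexity = Shannon entropy of the causal-state distribution.
    C_mu = -np.sum(pi * np.log2(pi))
    print("stationary distribution over causal states:", pi)   # ~ [2/3, 1/3]
    print("statistical complexity: %.3f bits" % C_mu)          # ~ 0.918 bits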

Even if the complexity measure is uncomputable, it may be possible to say something about how fast it grows, in some well-defined average sense. For instance, the average Kolmogorov complexity per symbol of a random string converges on the entropy per symbol of the string. Similarly, my first published paper was a proof that the rate of increase in thermodynamic depth ("dive") is also an entropy rate, though not the same one. It'd be nice if there were a similar result about logical depth; watch this space for more developments in this exciting nano-field. --- Such results tend to make the complexity measures concerned seem less interesting, but this is just in line with Bohr's dictum, that the task of theoretical science is to turn deep truths into trivialities.
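
A toy numerical illustration of the "complexity per symbol converges to an entropy rate" idea, again leaning on compressed length as a computable stand-in for Kolmogorov complexity: for an i.i.d. biased coin, the compressed bits per symbol give an upper bound which creeps down toward the binary entropy h(p) as the string grows. (zlib will not actually reach the entropy rate; the downward trend is the point.)

    # For an i.i.d. coin with Pr(1) = p, the entropy rate is the binary
    # entropy h(p).  Compressed length per symbol (again only a stand-in for
    # Kolmogorov complexity per symbol) should drift down toward it, plus
    # zlib's unavoidable overhead, as the string gets longer.
    import math
    import random
    import zlib

    p = 0.1
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # bits per symbol
    random.seed(2)

    for n in [1_000, 10_000, 100_000, 1_000_000]:
        bits = "".join("1" if random.random() < p else "0" for _ in range(n))
        compressed_bits = 8 * len(zlib.compress(bits.encode(), level=9))
        print(f"n = {n:7d}: {compressed_bits / n:.3f} compressed bits/symbol "
              f"(entropy rate h({p}) = {h:.3f})")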

Some queries that might be answered by a good complexity measure

... or, at least: questions which will need a well-defined complexity measure in order to answer. (Based on text which had been in the general notebook on complexity and complex systems since the 1990s, but moved over here in 2024 as the more logical place for it.)

Is it clear that humans are more complex than whales? Than chimpanzees? Than termites? Than termite mounds?

Is any biological organism more complex than a hurricane? Than the Great Red Spot of Jupiter?

Is there any trend towards greater complexity over time among living things? On the Earth as a whole? The universe as a whole? Is there any deep explanation (élan vital anyone?) or can it be accounted for by the usual suspects --- natural selection, "plenty of room at the top," chance? --- Even if we do come up with a measure of complexity, it will be very difficult to apply to the fossil record, since the soft parts are pretty well gone, and so we can't know (much) about the innards of the brain, or the immune system. (In fact, could you infer the existence of immune systems from the fossil record? This is important, since the immune system is complex if anything is, and if we can't infer it, how do we know there weren't other things, also confined to the soft tissues, which weren't comparably complex?)

Thanks to Michel Decré for correcting my French.

