Complexity Measures
24 Oct 2024 11:07
C'est magnifique, mais ce n'est pas de la science. (Lots of 'em ain't that splendid, either.) This is, in the words of the estimable Dave Feldman (who taught me most of what I know about it, but has rather less jaundiced views), a "micro-field" within the soi-disant study of complexity. Every few months seems to produce another paper proposing yet another measure of complexity, generally a quantity which can't be computed for anything you'd actually care to know about, if at all. These quantities are almost never related to any other variable, so they form no part of any theory telling us when or how things get complex, and are usually just quantification for quantification's own sweet sake.
The first and still classic measure of complexity is that introduced by Kolmogorov, which is (roughly) the shortest computer program capable of generating a given string. This quantity is in general uncomputable, in the sense that there is simply no algorithm which will compute it. This comes from a result in computer science known as the halting problem, which in turn is a disguised form of Gödel's theorem, so this limit is not likely to be broken any time soon. Moreover, the Kolmogorov complexity is maximized by random strings, so it's really telling us what's random, not what's complicated, and it's gradually come to be called the "algorithmic information." It plays a very important role in every discussion of measuring complexity: in a pious act of homage to our intellectual ancestors, it is solemnly taken out, exhibited, and solemnly put away as useless for any practical application.
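A cheap way to see the "maximized by random strings" point on an actual computer: any lossless compressor gives a computable upper bound on the algorithmic information, never the thing itself (cf. the notebook on Complexity, Entropy and the Physics of gzip, linked below). A minimal Python sketch, with purely illustrative names:

```python
# Compressed length as a crude, computable stand-in for algorithmic information.
# The zlib output length is only an *upper bound* on the Kolmogorov complexity
# (up to the constant overhead of the decompressor), but it illustrates the point
# above: random strings resist compression, regular ones do not.
import os
import zlib

def compressed_bits_per_byte(data: bytes) -> float:
    """Bits of zlib output per byte of input (lower = more compressible)."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

random_string = os.urandom(10_000)   # incompressible, with overwhelming probability
periodic_string = b"01" * 5_000      # highly regular string of the same length

print(compressed_bits_per_byte(random_string))    # close to 8 bits per byte
print(compressed_bits_per_byte(periodic_string))  # far below 8 bits per byte
```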
Generally speaking, complexity measures either take after Kolmogorov complexity, and involve finding some computer or abstract automaton which will produce the pattern of interest, or they take after information theory and produce something like the entropy, which, while in principle computable, can be very hard to calculate reliably for experimental systems. Bennett's "logical depth" is an instance of the former tendency (it's the running time of the shortest program), Lloyd and Pagels's "thermodynamic depth" of the latter (it's the entropy of the ensemble of possible trajectories leading to the current state; uncomputable in the weaker sense that you'd have to go all the way back to the beginning of time...). The statistical complexity of computational mechanics partakes of both natures, being the entropy of an abstract automaton; it can actually be calculated. (I presently have a couple of papers incubating where we do calculate the statistical complexity of various real-world processes.)
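Since I have just claimed that the statistical complexity can actually be calculated, here is a minimal sketch of the easiest possible case: a binary source assumed (and that is the big assumption) to be Markov of order one, so the candidate causal states are just "last symbol seen", states with identical predictive distributions get merged, and the statistical complexity is the entropy of the stationary distribution over the merged states. The names here are purely illustrative, and this is a toy, not a general reconstruction algorithm (for that, see Crutchfield and Young below).

```python
# Toy statistical complexity for a binary source *assumed* Markov of order one.
# Candidate causal states = last symbol seen; states whose predictive (next-symbol)
# distributions coincide are merged; C_mu = entropy of the merged-state distribution.
import numpy as np

def stationary(trans):
    """Stationary distribution of a row-stochastic transition matrix."""
    vals, vecs = np.linalg.eig(trans.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

def statistical_complexity(trans, tol=1e-9):
    """Entropy (in bits) of the causal-state distribution of an order-1 chain."""
    pi = stationary(trans)
    merged = []  # list of (representative predictive distribution, total probability)
    for row, p in zip(trans, pi):
        for i, (rep, total) in enumerate(merged):
            if np.allclose(row, rep, atol=tol):  # same predictions: same causal state
                merged[i] = (rep, total + p)
                break
        else:
            merged.append((row, p))
    probs = np.array([total for _, total in merged])
    return -np.sum(probs * np.log2(probs))

# Example: a two-state chain whose rows give different predictions,
# so both candidate states survive as distinct causal states.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])
print(statistical_complexity(T))  # entropy of the (2/3, 1/3) stationary distribution, about 0.92 bits
```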
Even if the complexity measure is uncomputable, it may be possible to say something about how fast it grows, in some well-defined average sense. For instance, the average Kolmogorov complexity per symbol of a random string converges on the entropy per symbol of the string. Similarly, my first published paper was a proof that the rate of increase in thermodynamic depth ("dive") is also an entropy rate, though not the same one. It'd be nice if there were a similar result about logical depth; watch this space for more developments in this exciting nano-field. --- Such results tend to make the complexity measures concerned seem less interesting, but this is just in line with Bohr's dictum, that the task of theoretical science is to turn deep truths into trivialities.
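To be a little more exact about the first of these facts (stated here from memory, so check it against the algorithmic information theory notebook): for a stationary ergodic source with entropy rate h, everything in bits,

$$
\lim_{n\to\infty} \frac{\mathbb{E}\!\left[K(X_1^n)\right]}{n}
\;=\; \lim_{n\to\infty} \frac{H(X_1^n)}{n}
\;=\; h ,
$$

and, if memory serves, the per-realization, almost-sure version of the same statement is Brudno's theorem.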
Some queries that might be answered by a good complexity measure
... or, at least: questions which will need a well-defined complexity measure before they can be answered. (Based on text which had been in the general notebook on complexity and complex systems since the 1990s, but moved over here in 2024 as more logical.)
Is it clear that humans are more complex than whales? Than chimpanzees? Than termites? Than termite mounds?
Is any biological organism more complex than a hurricane? Than the Great Red Spot of Jupiter?
Is there any trend towards greater complexity over time among living things? On the Earth as a whole? The universe as a whole? Is there any deep explanation (élan vital, anyone?) or can it be accounted for by the usual suspects --- natural selection, "plenty of room at the top," chance? --- Even if we do come up with a measure of complexity, it will be very difficult to apply to the fossil record, since the soft parts are pretty well gone, and so we can't know (much) about the innards of the brain, or the immune system. (In fact, could you infer the existence of immune systems from the fossil record? This is important, since it's complex if anything is, and if we can't infer it, how do we know there weren't other things, also confined to the soft tissues, which were comparably complex?)
- See also:
- Algorithmic Information Theory
- Complexity
- Complexity, Entropy and the Physics of gzip
- Computation, Automata, Languages
- Computational Complexity
- Evolution of Complexity
- Information Theory
- Self-Organization
- Recommended, big picture:
- Remo Badii and Antonio Politi, Complexity: Hierarchical Structure and Scaling in Physics [Accurate and insightful discussion of at least a dozen different complexity measures in chs. 8 and 9]
- Bruce Edmonds, Bibliography of Measures of Complexity [Frozen in 1997]
- David Feldman and James P. Crutchfield, "Measures of Statistical Complexity: Why?" Physics Letters A 238 (1998) 244--252
- Recommended, close-ups:
- John Bates and Harvey Shepard, "Measuring Complexity Using Information Fluctuation," Physics Letters A 172 (1993) 416--425
- Charles Bennett
- "Dissipation, Information, Computational Complexity and the Definition of Organization," in David Pines (ed.), Emerging Syntheses in Science
- "On the Nature and Origin of Complexity in Discrete, Homogeneous, Locally-Interacting Systems," Foundations of Physics 16 (1986) 585--592
- "How to Define Complexity in Physics, and Why" in W. H. Zurek (ed.), Complexity, Entropy, and the Physics of Information (1991)
- G. Boffetta, M. Cencini, M. Falcioni and A. Vulpiani, "Predictability: a way to characterize Complexity," nlin.CD/0101029
- P.-M. Binder and Jorge A. Plazas, "Multiscale Analysis of Complex Systems," Physical Review E 63 (2001): 065203(R)
- James P. Crutchfield and Karl Young, "Inferring Statistical Complexity," Physical Review Letters 63 (1989) 105--109
- Daniel Dennett, "Real Patterns" in Brainchildren [Considering the effects of allowing noise on the Kolmogorov measure, and what kinds of patterns one could actually use; see my review of the book]
- Peter Grassberger
- "Toward a Quantitative Theory of Self-Generated Complexity," International Journal of Theoretical Physics 25 (1986) 907--938
- "Randomness, Information, and Complexity," pp. 59--99 of Francisco Ramos-Gómez (ed.), Proceedings of the Fifth Mexican School on Statistical Physics (Singapore: World Scientific, 1989), arxiv:1208.3459 [Nice survey and review of the main complexity measures as of the end of the 1980s.]
- B. A. Huberman and T. Hogg, "Complexity and Adaptation," Physica D 22 (1986) 376--384
- Heike Jänicke, Alexander Wiebel, Gerik Scheuermann and Wolfgang Kollmann, "Multifield Visualization Using Local Statistical Complexity", IEEE Transactions on Visualization and Computer Graphics 13 (2007): 1384--1391 [PDF]
- Rolf Landauer, "A Simple Measure of Complexity," Nature 336 (1988): 306--307 [Commenting on Lloyd and Pagels, in Landauer's usual take-no-prisoners style: "It is one of the remarkably few thrusts in this area which is not conspicuously vacuous, and deserves serious consideration."]
- Wentian Li, "On the Relationship between Complexity and Entropy for Markov Chains and Regular Languages," Complex Systems 5 (1991) 381--389
- Seth Lloyd and Heinz Pagels, "Complexity as Thermodynamic Depth," Annals of Physics 188 (1988): 186--213
- Kristian Lindgren and Mats Nordahl, "Complexity Measures and Cellular Automata," Complex Systems 2 (1988): 409--440
- A. Witt, A. Neiman and J. Kurths, "Characterizing the Dynamics of Stochastic Bistable Systems by Measures of Complexity", Physical Review E 55 (1997): 5050--5059
- Modesty forbids me to recommend:
- James P. Crutchfield and Cosma Rohilla Shalizi, "Thermodynamic Depth of Causal States: When Peddling around in Occam's Pool, Shallowness Is a Virtue," Physical Review E 59 (1999) 275--283 = cond-mat/9808147 [Showing why the thermodynamic depth of Lloyd and Pagels doesn't work as advertised --- unless supplemented with causal states, Jim's own patent remedy for complexity. The journal made us change the subtitle before they'd print it.]
- Robert Haslinger, Kristina Lisa Klinkner and CRS, "The Computational Structure of Spike Trains", Neural Computation 22 (2010): 121--157 = arxiv:1001.0036 [Causal states and statistical complexity in actual rat neurons]
- CRS, "Methods and Techniques of Complex Systems Science: An Overview", chapter 1 (pp. 33--114) in Thomas S. Deisboeck and J. Yasha Kresh (eds.), Complex Systems Science in Biomedicine = nlin.AO/0307015 [Specifically, section 8]
- CRS and James P. Crutchfield, "Computational Mechanics: Pattern and Prediction, Structure and Simplicity," Journal of Statistical Physics 104 (2001): 817--879 = cond-mat/9907176 [Why causal states and statistical complexity are the Way and the Light]
- CRS, Kristina Lisa Klinkner and Robert Haslinger, "Quantifying Self-Organization with Optimal Predictors", Physical Review Letters 93 (2004): 118701 [Application of causal states to self-organizing cellular automata; possibly the largest systems for which non-silly complexities have actually been estimated from data]
- Disrecommended (random samples from a long list):
- Moshe Koppel, "Complexity, Depth, and Sophistication," Complex Systems 1 (1987) 1087
- R. Mansilla and E. Bush, "Increase of complexity from classical Greek to Latin poetry," cond-mat/0203135
- Stephen Wolfram, A New Kind of Science [Review: A Rare Blend of Monster Raving Egomania and Utter Batshit Insanity]
- To read:
- Samer A. Abdallah, Mark D. Plumbley, "A measure of statistical complexity based on predictive information", arxiv:1012.1890
- V. Afraimovich and G. M. Zaslavsky, "Space-time complexity in Hamiltonian dynamics", Chaos 13 (2003): 519--532
- S. E. Ahnert, I. G. Johnston, T. M. A. Fink, J. P. K. Doye, and A. A. Louis, "Self-assembly, modularity, and physical complexity", Physical Review E 82 (2010): 026117
- P. Allegrini, V. Benci, P. Grigolini, P. Hamilton, M. Ignaccolo, G. Menconi, L. Palatella, G. Raffaelli, N. Scafetta, M. Virgilio and J. Jang, "Compression and diffusion: a joint approach to detect complexity," cond-mat/0202123
- Luis Antunes, Bruno Bauwens, Andre Souto, Andreia Teixeira, "Sophistication vs Logical Depth", arxiv:1304.8046
- Fatihcan Atay, Sarika Jalan and Jürgen Jost, "Randomness, chaos, and structure", arxiv:0711.4293
- Nihat Ay, Markus Mueller, Arleta Szkola
- "Effective Complexity and its Relation to Logical Depth", IEEE Transactions on Information Theory 56 (2010): 4593--4607, arxiv:0810.5663
- "Effective complexity of stationary process realizations", arxiv:1001.2686
- Nihat Ay, Eckehard Olbrich, Nils Bertschinger, and Jürgen Jost, "A geometric approach to complexity", Chaos 21 (2011): 037103
- R. C. Ball, M. Diakonova, R. S. MacKay, "Quantifying Emergence in terms of Persistent Mutual Information", arxiv:1003.3028
- L. Barnett, C. L. Buckley, S. Bullock, "A Graph Theoretic Interpretation of Neural Complexity", Physical Review E 83 (2011): 041906, arxiv:1011.5334
- Claudio Bonanno and Pierre Collet, "Complexity for Extended Dynamical Systems", Communications in Mathematical Physics 275 (2007): 721--748, math/0609681
- Jens Christian Claussen
- "Offdiagonal Complexity: A computationally quick complexity measure for graphs and networks", q-bio.MN/0410024
- "Offdiagonal complexity: A computationally quick network complexity measure. Application to protein networks and cell division", arxiv:0712.4216
- Bernat Corominas-Murtra, Carlos Rodríguez-Caso, Joaquín Goñi, Ricard Solé, "Topological reversibility and causality in feed-forward networks", arxiv:1007.1829
- James P. Crutchfield and Jon Machta (eds.), Randomness, Structure, and Causality: Measures of Complexity from Theory to Applications, special issue of Chaos 21 (2011): 037101
- M. De Lucia, M. Bottaccio, M. Montuori and L. Pietronero, "A topological approach to neural complexity", nlin.AO/0411011 [Related the Sporns-Tononi-Edelman complexity measure for networks to features of network structure, assuming stationary Gaussian processes. Such processes have nothing whatsoever to do with actual nervous systems.]
- S. Drozdz, Jaroslaw Kwapien, J. Speth and M. Wojcik, "Identifying Complexity by Means of Matrices," cond-mat/0112271
- Francisco Escolano, Edwin R. Hancock and Miguel A. Lozano, "Heat diffusion: Thermodynamic depth complexity of networks", Physical Review E 85 (2012): 036206
- Jacob Feldman, "How surprising is a simple pattern? Quantifying 'Eureka!'," Cognition 93 (2004): 199--224 [Claims to (a) have a psychologically valid measure of subjective complexity, and (b) derive a null distribution for it!]
- Surya Ganguli, Dongsung Huh, and Haim Sompolinsky, "Memory traces in dynamical systems", Proceedings of the National Academy of Sciences (USA) 105 (2008): 18970--18975
- Xinwei Gong, Joshua E. S. Socolar, "Quantifying the complexity of random Boolean networks", arxiv:1202.1540
- H. Jänicke and G. Scheuermann, "Steady visualization of the dynamics in fluids using ε-machines", Computers and Graphics 33 (2009): 597--606
- Svante Janson, Stefano Lonardi and Wojciech Szpankowski, "On average sequence complexity", Theoretical Computer Science 326 (2004): 213--227 [Complexity of an individual string as the number of distinct substrings it contains. PDF via Prof. Lonardi]
- Nick S. Jones, "Using the Memories of Multiscale Machines to Characterize Complex Systems", Physical Review Letters 100 (2008): 208702, arxiv:0812.5079
- T. Kahle, E. Olbrich, J. Jost and N. Ay, "Complexity measures from interaction structures", Physical Review E 79 (2009): 026201, arxiv:0806.2552
- James Ladyman and Karoline Wiesner, What Is a Complex System?
- Wolfgang Löhr, "Properties of the Statistical Complexity Functional and Partially Deterministic HMMs", Entropy 11 (2009): 385--401
- Jon Machta
- "Complexity, parallel computation and statistical physics", cond-mat/0510809 [Defines complexity in terms of "depth", in turn in terms of "the number of parallel computational steps needed to simulate" a system. Thermodynamic depth (and a certain paper on it, cough cough) are cited but don't, on a quick skim, seem to be really engaged with. I need to read this carefully.]
- "Natural Complexity, Computational Complexity and Depth", Chaos 21 (2011): 037111, arxiv:1111.2845
- James W. McAllister, "Effective Complexity as a Measure of Information Content", Philosophy of Science 70 (2003): 302--307
- Faheem Mosam, Diego Vidaurre, Eric De Giuli, "Breakdown of random matrix universality in Markov models", Physical Review E 104 (2021): 024305, arxiv:2105.04393 [Relevant here for the application of Bialek/Nemenman/Tishby "predictive information"]
- Frederick J. Newmeyer and Laurel B. Preston (eds.), Measuring Grammatical Complexity
- Milan Palus, "Coarse-grained entropy rate for characterization of complex time series", Physica D 93 (1996): 64--77 [Thanks to Prof. Palus for a reprint]
- Osvaldo A. Rosso and Cristina Masoller, "Detecting and quantifying stochastic and coherence resonances via information-theory complexity measurements", Physical Review E 79 (2009): 040106
- Peter I. Saparin, Wolfgang Gowin, Jürgen Kurths, and Dieter Felsenberg, "Quantification of cancellous bone structure using symbolic dynamics and measures of complexity", Physical Review E 58 (1998): 6449--6459
- Abhishek Sharma, Dániel Czégel, Michael Lachmann, Christopher P. Kempes, Sara I. Walker, and Leroy Cronin, "Assembly Theory Explains and Quantifies the Emergence of Selection and Evolution", arxiv:2206.02279
- Alexander F. Siegenfeld, Yaneer Bar-Yam, "A Formal Definition of Scale-dependent Complexity and the Multi-scale Law of Requisite Variety", arxiv:2206.04896
- Vaclav Smil, Power Density: A Key to Understanding Energy Sources and Uses
- Ruedi Stoop, Norbert Stoop and Leonid Bunimovich, "Complexity of dynamics as variability of predictability", Journal of Statistical Physics 114 (2004): 1127--1137
- J. Teo and H. A. Abbass, "Multiobjectivity and Complexity in Embodied Cognition", IEEE Transactions on Evolutionary Computation 9 (2005): 337--360
Thanks to Michel Decré for correcting my French.