Notebooks
http://bactra.org/notebooks
Cosma's Notebooks: Complexity Measures
http://bactra.org/notebooks/2016/01/10#complexity-measures
<P><em>C'est magnifique, mais ce n'est pas de la science.</em> (Lots of 'em
ain't that splendid, either.) This is, in the word of the estimable Dave
Feldman (who taught me most of what I know about it, but has rather less
jaundiced views), a "micro-field" within the soi-disant study
of <a href="complexity.html">complexity</a>. Every few months seems to produce
another paper proposing yet another measure of complexity, generally a quantity
which can't be computed for anything you'd actually care to know about, if at
all. These quantities are almost never related to any other variable, so they
form no part of any theory telling us when or how things get complex, and are
usually just quantification for quantification's own sweet sake.
<P>The first and still classic measure of complexity is that introduced by
Kolmogorov, which is (roughly) the shortest computer program capable of
generating a given string. This quantity is in general uncomputable, in the
sense that there is simply no algorithm which will compute it. This comes from
a result in computer science known as the halting problem, which in turn is a
disguised form of <a href="godels-theorem.html">Gödel's theorem</a>, so
this limit is not likely to be broken any time soon. Moreover, the Kolmogorov
complexity is maximized by random strings, so it's really telling us what's
random, not what's complicated, and it's gradually come to be called the
"algorithmic information." It plays a very important role in every
discussion of measuring complexity: in a pious act of homage to our
intellectual ancestors, it is solemnly taken out, exhibited, and solemnly put
away as useless for any practical application.
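<P>While the exact Kolmogorov complexity is uncomputable, the length of a string under any ordinary lossless compressor gives a (crude, loose) upper bound on it. A minimal sketch in Python, using the standard <tt>zlib</tt> module, which illustrates the point above that randomness, not structure, maximizes the measure:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    # The length of zlib's DEFLATE output is an upper bound (up to an
    # additive constant) on the Kolmogorov complexity of `data`.
    return len(zlib.compress(data, 9))

structured = b"ab" * 5000  # 10,000 bytes with an obvious short description

random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))  # pseudo-random

# The patterned string compresses to almost nothing; the (pseudo-)random
# one barely compresses at all -- mirroring the fact that Kolmogorov
# complexity is maximized by random strings.
print(compressed_size(structured), compressed_size(noisy))
```

(The variable names and the choice of compressor are, of course, illustrative only; any lossless compressor would do.)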
<P>Generally speaking, complexity measures either take after Kolmogorov
complexity, and involve finding some computer or abstract automaton which will
produce the pattern of interest, or they take after <a
href="information-theory.html">information theory</a> and produce something
like the entropy, which, while in principle computable, can be very hard to
calculate reliably for experimental systems. Bennett's "logical depth" is an
instance of the former tendency (it's the running time of the shortest
program), Lloyd and Pagels's "thermodynamic depth" of the latter (it's the
entropy of the ensemble of possible trajectories leading to the current state;
uncomputable in the weaker sense that you'd have to go all the way back to the
beginning of time...). The statistical complexity of <a
href="computational-mechanics.html">computational mechanics</a> partakes of
both natures, being the entropy of an abstract automaton; it can actually be
calculated. (I presently have a couple of papers incubating where we
<em>do</em> calculate the statistical complexity of various real-world
processes.)
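<P>To see why "in principle computable" is doing real work in the previous paragraph: the obvious plug-in estimate of the entropy rate, sketched below for a binary sequence, works fine for an IID source with lots of data, but degrades rapidly as the block length grows or the process acquires long-range structure. (This is a textbook estimator, not any particular paper's method.)

```python
import math
import random
from collections import Counter

def block_entropy(seq, k):
    """Plug-in (empirical) Shannon entropy, in bits, of length-k blocks."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(blocks).values())

random.seed(1)
# Biased coin, P(heads) = 0.3; true entropy rate is about 0.881 bits/symbol.
seq = [random.random() < 0.3 for _ in range(100000)]

# Estimate the entropy rate as H(k) - H(k-1): the conditional entropy of
# the next symbol given the preceding block.
h = block_entropy(seq, 2) - block_entropy(seq, 1)
```

With 100,000 samples from an IID source this lands very close to the true rate; with experimental data, short records, and real dependencies, it is exactly the sort of thing that is "very hard to calculate reliably".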
<P>Even if the complexity measure is uncomputable, it may be possible to say
something about how fast it grows, in some well-defined average sense. For
instance, the average Kolmogorov complexity per symbol of a random string
converges on the entropy per symbol of the string. Similarly, my first
published paper was a proof that the rate of increase in thermodynamic depth
("dive") is also an entropy rate, though not the same one. It'd be nice if
there were a similar result about logical depth; watch this space for more
developments in this exciting nano-field. --- Such results tend to make the
complexity measures concerned seem less interesting, but this is just in line
with Bohr's dictum, that the task of theoretical science is to turn deep truths
into trivialities.
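<P>The first of those convergence results can be checked numerically with an off-the-shelf compressor standing in, crudely, for the optimal code (cf. the <tt>gzip</tt> notebook linked below). A sketch, again using Python's <tt>zlib</tt>; the bound is loose, so the compressed rate sits between the entropy rate and one bit per symbol rather than reaching the entropy rate itself:

```python
import math
import random
import zlib

# Biased coin with P(1) = 0.1; entropy rate is about 0.469 bits per symbol.
p = 0.1
h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

random.seed(2)
n_bits = 8 * 200000
bits = [random.random() < p for _ in range(n_bits)]

# Pack 8 bits per byte so the byte-oriented compressor can see the redundancy.
data = bytes(
    sum(bit << j for j, bit in enumerate(bits[i:i + 8]))
    for i in range(0, n_bits, 8)
)

# Compressed bits per source symbol: below 1 (it finds the bias), but
# above the entropy rate h (no lossless code can beat entropy on average).
rate = 8 * len(zlib.compress(data, 9)) / n_bits
print(h, rate)
```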
<P>See also:
<a href="algorithmic-information-theory.html">Algorithmic Information Theory</a>;
<a href="cep-gzip.html">Complexity, Entropy and the Physics of
<tt>gzip</tt></a>;
<a href="information-theory.html">Information Theory</a>;
<a href="self-organization.html">Self-Organization</a>
<ul>Recommended (mostly pointed out to me by Dave, who, modestly, did
not recommend his own papers):
<li>Remo Badii and Antonio Politi, <cite><a href="../reviews/badii-and-politi/">Complexity: Hierarchical
Structure and Scaling in Physics</a></cite>
[Accurate and insightful discussion of at least a dozen different complexity
measures in chs. 8 and 9]
<li>John Bates and Harvey Shepard, "Measuring Complexity Using
Information Fluctuation," <cite>Physics Letters A</cite> <strong>172</strong>
416--425 (1993)
<li>Charles Bennett
<ul>
<li>"Dissipation, Information, Computational Complexity and
the Definition of Organization," in David Pines (ed.), <cite>Emerging
Syntheses in Science</cite>
<li>"On the Nature and Origin of Complexity in Discrete,
Homogeneous, Locally-Interacting Systems," <cite>Foundations of Physics</cite>
<strong>16</strong> (1986) 585--592
<li>"How to Define Complexity in Physics, and Why" in
W. H. Zurek (ed.), <cite>Complexity, Entropy, and the Physics of
Information</cite> (1991)
</ul>
<li>G. Boffetta, M. Cencini, M. Falcioni and A. Vulpiani,
"Predictability: a way to characterize Complexity," <a
href="http://arxiv.org/abs/nlin.CD/0101029">nlin.CD/0101029</a>
<li>P.-M. Binder and Jorge A. Plazas, "Multiscale Analysis of Complex
Systems," <cite>Physical Review E</cite> <strong>63</strong> (2001): 065203(R)
<li>James P. Crutchfield and Karl Young, "Inferring Statistical
Complexity," <cite>Physical Review Letters</cite> <strong>63</strong> (1989)
105--109
<li><a href="dennett.html">Daniel Dennett</a>, "Real Patterns" in
<cite>Brainchildren</cite> [Considering the effects of allowing noise on the
Kolmogorov measure, and what kinds of patterns one could actually <em>use</em>;
see my <a href="../reviews/brainchildren/">review</a> of the book]
<li>Bruce Edmonds, <a
href="http://www.cpm.mmu.ac.uk/~bruce/combib/">Bibliography of Measures of
Complexity</a> [Frozen in 1997]
<li>David Feldman and James P. Crutchfield, "Measures of Statistical
Complexity: Why?" <cite>Physics Letters A</cite> <strong>238</strong> (1998)
244--252
<li>Peter Grassberger
<ul>
<li>"Toward a Quantitative Theory of Self-Generated
Complexity," <cite>International Journal of Theoretical Physics</cite>
<strong>25</strong> (1986) 907--938
<li>"Randomness, Information, and Complexity," pp. 59--99 of
Francisco Ramos-Gómez (ed.), <cite>Proceedings of the Fifth Mexican
School on Statistical Physics</cite> (Singapore: World Scientific, 1989), <a href="http://arxiv.org/abs/1208.3459">arxiv:1208.3459</a> [Nice
survey and review of the main complexity measures as of the end of the 1980s.]
</ul>
<li>B. A. Huberman and T. Hogg, "Complexity and Adaptation,"
<cite>Physica D</cite> <strong>22</strong> (1986) 376--384
<li>Heike Jänicke, Alexander Wiebel, Gerik Scheuermann and Wolfgang Kollmann, "Multifield Visualization Using Local Statistical Complexity",
<a href="http://dx.doi.org/10.1109/TVCG.2007.70615"><cite>IEEE Transactions on Visualization and Computer Graphics</cite> <strong>13</strong> (2007): 1384--1391</a>
[<a href="http://www.informatik.uni-leipzig.de/bsv/Jaenicke/Papers/vis07.pdf">PDF</a>]
<li>Rolf Landauer, "A Simple Measure of Complexity,"
<cite>Nature</cite> <strong>336</strong> (1988): 306--307 [Commenting on Lloyd and
Pagels, in Landauer's usual take-no-prisoners style: "It is one of the
remarkably few thrusts in this area which is not conspicuously vacuous, and
deserves serious consideration."]
<li>Wentian Li, "On the Relationship between Complexity and Entropy
for Markov Chains and Regular Languages," <cite>Complex Systems</cite>
<strong>5</strong> (1991) 381--389
<li>Seth Lloyd and Heinz Pagels, "Complexity as Thermodynamic Depth,"
<cite>Annals of Physics</cite> <strong>188</strong> (1988) 186--213
<li>Kristian Lindgren and Mats Nordahl, "Complexity Measures and
Cellular Automata," <cite>Complex Systems</cite> <strong>2</strong> (1988)
409--440
</ul>
<ul>Modesty forbids me to recommend:
<li>James P. Crutchfield and Cosma Rohilla Shalizi, "Thermodynamic
Depth of Causal States: When Peddling around in Occam's Pool, Shallowness Is a
Virtue," <cite>Physical Review E</cite> <strong>59</strong> (1999) 275--283
= <a href="http://arxiv.org/abs/cond-mat/9808147">cond-mat/9808147</a> [Showing
why the thermodynamic depth of Lloyd and Pagels doesn't work as advertised ---
unless supplemented with causal states, Jim's own patent remedy for complexity.
The journal made us change the subtitle before they'd print it.]
<li>Robert Haslinger, Kristina Lisa Klinkner and CRS, "The
Computational Structure of Spike
Trains", <a href="http://dx.doi.org/10.1162/neco.2009.12-07-678"><cite>Neural
Computation</cite> <strong>22</strong> (2010): 121--157</a>
= <a href="http://arxiv.org/abs/1001.0036">arxiv:1001.0036</a> [Causal
states and statistical complexity in actual rat neurons]
<li>CRS, "Methods and Techniques of Complex Systems Science: An
Overview", chapter 1 (pp. 33--114) in Thomas S. Deisboeck and J. Yasha Kresh
(eds.), <cite>Complex Systems Science in Biomedicine</cite>
= <a href="http://arxiv.org/abs/nlin.AO/0307015">nlin.AO/0307015</a> [Specifically, section 8]
<li>CRS and James P. Crutchfield, "Computational Mechanics: Pattern and
Prediction, Structure and Simplicity," <cite>Journal of Statistical
Physics</cite> <strong>104</strong> (2001): 817--879 = <a
href="http://arxiv.org/abs/cond-mat/9907176">cond-mat/9907176</a> [Why causal
states and statistical complexity are the Way and the Light]
<li>CRS, Kristina Lisa Klinkner and Robert Haslinger, "Quantifying
Self-Organization with Optimal Predictors", <cite>Physical Review
Letters</cite> <strong>93</strong> (2004): 118701 =
<a href="http://arxiv.org/abs/nlin.AO/0409024">nlin.AO/0409024</a> [Application of causal
states to self-organizing cellular automata; possibly the largest systems for
which non-silly complexities have actually been estimated from data]
</ul>
<ul>Disrecommended (random samples from a long list):
<li>Moshe Koppel, "Complexity, Depth, and Sophistication,"
<cite>Complex Systems</cite> <strong>1</strong> (1987) 1087
<li>R. Mansilla and E. Bush, "Increase of complexity from classical
Greek to Latin poetry," <a
href="http://arxiv.org/abs/cond-mat/0203135">cond-mat/0203135</a>
<li>Stephen Wolfram, <cite>A New Kind of Science</cite>
[<a href="../reviews/wolfram/">Review: A Rare Blend of Monster Raving Egomania
and Utter Batshit Insanity</a>]
</ul>
<ul>To read:
<li>Samer A. Abdallah, Mark D. Plumbley, "A measure of statistical complexity based on predictive information", <a href="http://arxiv.org/abs/1012.1890">arxiv:1012.1890</a>
<li>V. Afraimovich and G. M. Zaslavsky, "Space-time complexity in
Hamiltonian dynamics", <a
href="http://dx.doi.org/10.1063/1.1566171"><cite>Chaos</cite> <strong>13</strong>
(2003): 519--532</a>
<li>S. E. Ahnert, I. G. Johnston, T. M. A. Fink, J. P. K. Doye, and A. A. Louis, "Self-assembly, modularity, and physical complexity",
<a href="http://dx.doi.org/10.1103/PhysRevE.82.026117"><cite>Physical Review E</cite> <strong>82</strong>
(2010): 026117</a>
<li>P. Allegrini, V. Benci, P. Grigolini, P. Hamilton, M. Ignaccolo,
G. Menconi, L. Palatella, G. Raffaelli, N. Scafetta, M. Virgilio and J. Jang,
"Compression and diffusion: a joint approach to detect complexity," <a
href="http://arxiv.org/abs/cond-mat/0202123">cond-mat/0202123</a>
<li>Luis Antunes, Bruno Bauwens, Andre Souto, Andreia Teixeira, "Sophistication vs Logical Depth", <a href="http://arxiv.org/abs/1304.8046">arxiv:1304.8046</a>
<li>Fatihcan Atay, Sarika Jalan and Jürgen Jost, "Randomness,
chaos, and
structure", <a href="http://arxiv.org/abs/0711.4293">arxiv:0711.4293</a>
<li>Nihat Ay, Markus Mueller, Arleta Szkola
<ul>
<li>"Effective Complexity and its Relation to Logical Depth", <a href="http://dx.doi.org/10.1109/TIT.2010.2053892"><cite>IEEE Transactions on Information Theory</cite> <strong>56</strong> (2010): 4593--4607</a>, <a href="http://arxiv.org/abs/0810.5663">arxiv:0810.5663</a>
<li>"Effective complexity of stationary process realizations", <a href="http://arxiv.org/abs/1001.2686">arxiv:1001.2686</a>
</ul>
<li>Nihat Ay, Eckehard Olbrich, Nils Bertschinger, and Jürgen Jost,
"A geometric approach to complexity", <a href="http://dx.doi.org/10.1063/1.3638446"><cite>Chaos</cite> <strong>21</strong> (2011): 037103</a>
<li>R. C. Ball, M. Diakonova, R. S. MacKay, "Quantifying Emergence in terms of Persistent Mutual Information", <a href="http://arxiv.org/abs/1003.3028">arxiv:1003.3028</a>
<li>L. Barnett, C. L. Buckley, S. Bullock, "A Graph Theoretic
Interpretation of Neural
Complexity", <a href="http://dx.doi.org/10.1103/PhysRevE.83.041906"><cite>Physical
Review E</cite> <strong>83</strong> (2011):
041906</a>, <a href="http://arxiv.org/abs/1011.5334">arxiv:1011.5334</a>
<li>Claudio Bonanno and Pierre Collet, "Complexity for Extended Dynamical Systems", <a href="http://dx.doi.org/10.1007/s00220-007-0313-4"><cite>Communications in Mathematical Physics</cite> <strong>275</strong> (2007): 721--748</a>, <a href="http://arxiv.org/abs/math/0609681">math/0609681</a>
<li>Jens Christian Claussen
<ul>
<li> "Offdiagonal Complexity: A computationally
quick complexity measure for graphs and networks", <a
href="http://arxiv.org/abs/q-bio.MN/0410024">q-bio.MN/0410024</a>
<li>"Offdiagonal complexity: A computationally quick network complexity measure. Application to protein networks and cell division", <a href="http://arxiv.org/abs/0712.4216">arxiv:0712.4216</a>
</ul>
<li>Bernat Corominas-Murtra, Carlos Rodríguez-Caso, Joaquín Goñi, Ricard Solé, "Topological reversibility and causality in feed-forward networks", <a href="http://arxiv.org/abs/1007.1829">arxiv:1007.1829</a>
<li>James P. Crutchfield and Jon Machta (eds.), <cite>Randomness, Structure, and Causality: Measures of Complexity from Theory to Applications</cite>,
special issue of <a href="http://dx.doi.org/10.1063/1.3643065"><cite>Chaos</cite> <strong>21</strong> (2011): 037101</a>
<li>M. De Lucia, M. Bottaccio, M. Montuori and L. Pietronero, "A
topological approach to neural
complexity", <a href="http://arxiv.org/abs/nlin.AO/0411011">nlin.AO/0411011</a>
[Relates the Sporns-Tononi-Edelman complexity measure for networks to features
of network structure, assuming stationary Gaussian processes. Such processes
have <em>nothing whatsoever</em> to do with
actual <a href="neural-coding.html">nervous systems</a>.]
<li>S. Drozdz, Jaroslaw Kwapien, J. Speth and M. Wojcik, "Identifying
Complexity by Means of Matrices," <a
href="http://arxiv.org/abs/cond-mat/0112271">cond-mat/0112271</a>
<li>Francisco Escolano, Edwin R. Hancock and Miguel A. Lozano, "Heat diffusion: Thermodynamic depth complexity of networks", <a href="http://dx.doi.org/10.1103/PhysRevE.85.036206"><cite>Physical Review E</cite> <strong>85</strong> (2012): 036206</a>
<li>Jacob Feldman, "How surprising is a simple pattern? Quantifying
'Eureka!'," <a
href="http://dx.doi.org/10.1016/j.cognition.2003.09.013"><cite>Cognition</cite> <strong>93</strong>
(2004): 199--224</a> [Claims to (a) have a psychologically valid measure
of <em>subjective</em> complexity, and (b) derive a null distribution for it!]
<li>Surya Ganguli, Dongsung Huh, and Haim Sompolinsky, "Memory traces in dynamical systems", <a href="http://dx.doi.org/10.1073/pnas.0804451105"><cite>Proceedings of the National Academy of Sciences</cite> (USA) <strong>105</strong> (2008): 18970--18975</a>
<li>Xinwei Gong, Joshua E. S. Socolar, "Quantifying the complexity of random Boolean networks", <a href="http://arxiv.org/abs/1202.1540">arxiv:1202.1540</a>
<li>H. Jänicke and G. Scheuermann, "Steady visualization of the dynamics in fluids using ε-machines", <a href="http://dx.doi.org/10.1016/j.cag.2009.06.003"><cite>Computers and Graphics</cite> <strong>33</strong> (2009): 597--606</a>
<li>Svante Janson, Stefano Lonardi and Wojciech Szpankowski, "On
average sequence complexity", <a
href="http://dx.doi.org/10.1016/j.tcs.2004.06.023"><cite>Theoretical Computer
Science</cite> <strong>326</strong> (2004): 213--227</a> [Complexity of an
individual string as the number of distinct substrings it contains. <a
href="http://www.cs.ucr.edu/~stelo/papers/tcs04.pdf">PDF</a> via Prof. Lonardi]
<li>Nick S. Jones, "Using the Memories of Multiscale Machines to Characterize Complex
Systems", <cite>Physical Review Letters</cite> <strong>100</strong> (2008): 208702, <a href="http://arxiv.org/abs/0812.5079">arxiv:0812.5079</a>
<li>T. Kahle, E. Olbrich, J. Jost and N. Ay, "Complexity measures from interaction structures", <a href="http://dx.doi.org/10.1103/PhysRevE.79.026201"><cite>Physical Review E</cite> <strong>79</strong> (2009): 026201</a>,
<a href="http://arxiv.org/abs/0806.2552">arxiv:0806.2552</a>
<li>Wolfgang Löhr, "Properties of the Statistical Complexity Functional and Partially Deterministic HMMs", <a href="http://dx.doi.org/10.3390/e110300385"><cite>Entropy</cite> <strong>11</strong> (2009): 385--401</a>
<li>Jon Machta
<ul>
<li>"Complexity, parallel computation and statistical
physics", <a href="http://arxiv.org/abs/cond-mat/0510809">cond-mat/0510809</a>
[Defines complexity in terms of "depth", in turn in terms of "the number of
parallel computational steps needed to simulate" a system. Thermodynamic depth
(and a certain paper on it, cough cough) are cited but don't, on a quick skim,
seem to be really engaged with. I need to read this carefully.]
<li>"Natural Complexity, Computational Complexity and Depth",
<a href="http://dx.doi.org/10.1063/1.3634009"><cite>Chaos</cite> <strong>21</strong>
(2011):
037111</a>, <a href="http://arxiv.org/abs/1111.2845">arxiv:1111.2845</a>
</ul>
<li>James W. McAllister, "Effective Complexity as a Measure of
Information Content", <cite>Philosophy of Science</cite> <strong>70</strong>
(2003): 302--307
<li>Frederick J. Newmeyer and Laurel B. Preston (eds.), <cite>Measuring Grammatical Complexity</cite>
<li><a href="http://www.cs.cas.cz/~mp">Milan Palus</a>, "Coarse-grained
entropy rate for characterization of complex time series", <cite>Physica
D</cite> <strong>93</strong> (1996): 64--77 [Thanks to Prof. Palus for a
reprint]
<li>Osvaldo A. Rosso and Cristina Masoller, "Detecting and quantifying stochastic and coherence resonances via information-theory complexity measurements", <a href="http://dx.doi.org/10.1103/PhysRevE.79.040106"><cite>Physical Review E</cite> <strong>79</strong> (2009): 040106</a>
<li>Peter I. Saparin, Wolfgang Gowin, Jürgen Kurths, and Dieter
Felsenberg, "Quantification of cancellous bone structure using symbolic dynamics
and measures of complexity", <a
href="http://dx.doi.org/10.1103/PhysRevE.58.6449"><cite>Physical Review
E</cite> <strong>58</strong> (1998): 6449--6459</a>
<li>Vaclav Smil, <cite><a href="http://mitpress.mit.edu/9780262029148">Power Density:
A Key to Understanding Energy Sources and Uses</a></cite>
<li>Ruedi Stoop, Norbert Stoop and Leonid Bunimovich, "Complexity of
dynamics as variability of predictability", <a
href="http://dx.doi.org/10.1023/B:JOSS.0000012519.93677.15"><cite>Journal of
Statistical Physics</cite> <strong>114</strong> (2004): 1127--1137</a>
<li>J. Teo and H. A. Abbass, "Multiobjectivity and Complexity in
Embodied Cognition", <a
href="http://dx.doi.org/10.1109/TEVC.2005.846902"><cite>IEEE Transactions on
Evolutionary Computation</cite> <strong>9</strong> (2005): 337--360</a>
</ul>
<P>Thanks to Michel Decré for correcting my French.