Notebooks
http://bactra.org/notebooks
Cosma's Notebooks: Physics of Computation and Information
http://bactra.org/notebooks/2022/01/30#physics-computation-information
<P>First: what does physics say about computation and communication? That is,
what constraints do physical laws put on <em>realizable</em> computers? (See
below.)
<P>Second: What, if anything, do the theories of computation and information
say about physics? I am particularly thinking of attempts to derive physical
laws from information theory, none of which look the least bit convincing to
me. The hope, I guess, is that what looks like physics, like a more-or-less
contingent fact about the world, will turn out to be really math, something
which would have to be true in any world which we deal with more-or-less
statistically. As I said, I'm not familiar with any attempt to do this --- to
get "it from bit," as Wheeler says --- which looks at all convincing. The only
thing which comes close to being an exception is the use of the <a
href="max-ent.html">method of maximum entropy</a> in statistical mechanics.
But I'd argue this is deceptive: maximum-entropy distributions are ones with
minimal interaction between their variables. The fact that they work for many
but not all physical situations tells us that in many cases we can find
independent or nearly-independent variables to work with --- i.e., maxent works, when it
does, because of contingent facts about the physical world, not out of some
mathematical necessity. But that would take us into an argument about the
foundations of <a href="stat-mech.html">statistical mechanics</a>, which God
forbid.
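<P>A toy numerical check of that claim about maxent and minimal interaction (my sketch, with made-up marginals): fix the marginals of two binary variables \( X \) and \( Y \), and scan over all joints consistent with them. The entropy maximizer turns out to be the independent (product) joint.

```python
import math

def entropy(ps):
    """Shannon entropy (in nats) of a probability vector."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Assumed (illustrative) marginals P(X=1) and P(Y=1).
px, py = 0.3, 0.6

# A joint with these marginals is determined by c = P(X=1, Y=1);
# scan c over its feasible range and find the entropy maximizer.
lo, hi = max(0.0, px + py - 1.0), min(px, py)
best_c, best_h = lo, -1.0
steps = 200000
for i in range(1, steps):
    c = lo + (hi - lo) * i / steps
    joint = [c, px - c, py - c, 1.0 - px - py + c]
    h = entropy(joint)
    if h > best_h:
        best_h, best_c = h, c

# The maximizer coincides with independence, c = px * py = 0.18.
print(best_c)
```

This is just the identity \( H(X,Y) = H(X) + H(Y) - I(X;Y) \): with the marginals pinned down, maximizing joint entropy means driving the mutual information to zero.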
<P>("Computational physics" in the sense of the journal classifications ---
using computers to do calculations on physical problems --- is a third subject
altogether. I find it about as interesting as the work which goes into
compiling a handbook of integrals and formulas --- which is to say, I'm glad
somebody else does it.)
<P>Third: using ideas from physics (especially statistical mechanics) to
analyze problems of computation, e.g., the appearance of phase transitions
in optimization problems.
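<P>A minimal brute-force sketch of that phenomenon (tiny instances, illustrative parameters only): for random 3-SAT, the fraction of satisfiable instances drops sharply as the clause-to-variable ratio \( \alpha \) passes a threshold, around 4.27 in the large-size limit.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT instance: each clause has 3 distinct variables,
    each negated with probability 1/2 (signed literals, DIMACS-style)."""
    return [tuple(v if rng.random() < 0.5 else -v
                  for v in rng.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute-force check over all 2**n_vars truth assignments."""
    for bits in itertools.product((False, True), repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def p_sat(n_vars, alpha, trials, rng):
    """Estimated probability a random instance at ratio alpha is SAT."""
    hits = sum(satisfiable(n_vars, random_3sat(n_vars, int(alpha * n_vars), rng))
               for _ in range(trials))
    return hits / trials

rng = random.Random(0)
for alpha in (2.0, 4.3, 6.0):
    print(f"alpha = {alpha}: P(sat) ~ {p_sat(12, alpha, 40, rng):.2f}")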
<P>Fourth, fundamental physical limits on computation. The
outstanding example would be Landauer's principle (which now
<a href="landauers-principle.html">gets its own notebook</a>): erasing one bit
dissipates at least \( kT \ln 2 \) of heat, where \( T \) is the absolute
temperature and \( k \) is Boltzmann's constant, and erasure is necessary so
that the computation goes forward from inputs to outputs, and not the reverse.
(Or is it? Couldn't you just <em>ignore</em> the bits you keep around so as to
have a reversible computation?) Others? Limits on bit storage per unit
phase-space? Per unit mass? Limits on time needed to perform one logical
operation? (See Lloyd's article in <cite>Nature,</cite> below, for discussion
and references of these points. I'm not quite sure that he's right about the
speed limitation.)
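<P>For scale, a back-of-the-envelope calculation (mine, not from any of the references): at room temperature the Landauer bound comes to a few zeptojoules per erased bit, many orders of magnitude below what present-day logic gates actually dissipate per operation.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

landauer_bound = k_B * T * math.log(2)   # minimum heat per erased bit
print(f"{landauer_bound:.3e} J per bit")   # ~ 2.87e-21 J
```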
<P>All physically-implementable computers would seem to have only finite
memory. Therefore they cannot <em>really</em> be anything more than finite
state machines, though their memories may be so large and so structured that
devices of higher computational power are good approximations to them. Is
there any way out of this conclusion? What does it imply for physics (if
anything)? Of course, this in no way impugns the <em>mathematical</em>
soundness of notions of infinity. (I have an amusing proof that 1 is the
largest integer for those who feel otherwise.)
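<P>To make the finite-memory point concrete, a toy sketch (my own, hypothetical): a recognizer for the non-regular language \( a^m b^m \) built on a fixed-width counter. Since the counter takes only \( 2^k \) values, the machine really is a finite-state machine, and it must misclassify some input once the count exceeds its capacity.

```python
def accepts(s, bits=3):
    """Try to recognize a^m b^m with a counter of fixed width `bits`.
    The counter lives in {0, ..., 2**bits - 1}, so the whole machine
    has finitely many states -- and wraps around on deep inputs."""
    mod = 2 ** bits
    count, seen_b = 0, False
    for ch in s:
        if ch == 'a':
            if seen_b:            # an 'a' after a 'b': reject
                return False
            count = (count + 1) % mod
        elif ch == 'b':
            seen_b = True
            count = (count - 1) % mod
        else:
            return False
    return count == 0

print(accepts("aaabbb"))   # True  (correct: within capacity)
print(accepts("aabbb"))    # False (correct: unbalanced)
print(accepts("a" * 8))    # True  (WRONG: the 3-bit counter wrapped to 0)
```

The last line is the moral: the device works as a pushdown-automaton approximation only so long as inputs stay within its memory.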
<P><em>See also:</em>
<a href="computation.html">Computation</a>;
<a href="information-theory.html">Information Theory</a>;
<a href="landauers-principle.html">Landauer's Principle</a>;
<a href="physics.html">Physics</a>;
<a href="quantum-mechanics.html">Quantum Mechanics</a>
<ul>Recommended, big picture:
<li>Greg Egan [I'd say that Egan's novels are as good as the scientific
literature, but when it comes to knowledge, sophistication and imagination,
they're actually significantly better than much of it.]
<ul>
<li><cite>Distress</cite>
<li><cite>Permutation City</cite>
</ul>
<li>Neil Gershenfeld, <cite>The Physics of Information
Technology</cite> [Superb. He should not be able to teach as much as he does,
assuming as little on the reader's part as he does, in as little space as he
does; but somehow the trick is pulled off.]
<li>Owen Maroney, <a href="http://plato.stanford.edu/entries/information-entropy/">"Information Processing and Thermodynamic Entropy"</a>,
<a href="http://plato.stanford.edu/index.html">Stanford Encyclopedia of Philosophy</a>
<li>Cristopher Moore, "Computational Complexity in Physics," <a
href="http://arxiv.org/abs/cond-mat/0109010">cond-mat/0109010</a>
<li>Cristopher Moore and Stephan Mertens, <cite><a href="http://www.nature-of-computation.org/">The Nature of Computation</a></cite> [Cris and Stephan were kind enough to let me read this in manuscript; it's magnificent. Review: <a href="../reviews/nature-of-computation.html">Intellects Vast and Warm and Sympathetic</a>]
<li>W. H. Zurek (ed.), <cite>Complexity, Entropy, and the Physics of
Information</cite>
</ul>
<ul>Recommended, close-ups:
<li>Scott Aaronson, "NP-complete Problems and Physical Reality", <a
href="http://arxiv.org/abs/quant-ph/0502072">quant-ph/0502072</a>
<li>David Albert, <cite>Time and Chance</cite> [For the discussions of
Maxwellian and pseudo-Maxwellian demons]
<li>John Earman and John Norton, "Exorcist XIV: The wrath of
Maxwell's Demon"
<ol>
<li>"From Maxwell to Szilard", <cite>Studies in the
History and Philosophy of Modern Physics</cite> <strong>29</strong> (1998): 435--471
<li>"From Szilard to Landauer and beyond", <cite>Studies in
the History and Philosophy of Modern Physics</cite> <strong>30</strong>
(1999): 1--40
</ol>
<li><a href="http://info.phys.unm.edu/">Information Physics</a> at the
University of New Mexico
<li>Seth Lloyd, "Ultimate Physical Limits to Computation,"
<cite>Nature</cite> <strong>406</strong>(2000): 1047--1054
<li>Norm Margolus and L. B. Levitin, "The Maximum Speed of Dynamical
Evolution," <cite>Physica D</cite> <strong>120</strong>(1998): 188--195, <a
href="http://arxiv.org/abs/quant-ph/9710043">quant-ph/9710043</a>
<li>O. C. Martin, R. Monasson and R. Zecchina, "Statistical mechanics
methods and phase transitions in optimization problems," <a
href="http://arxiv.org/abs/cond-mat/0104428">cond-mat/0104428</a>
<li>John D. Norton
<ul>
<li>"Eaters of the Lotus: Landauer's Principle and the
Return of Maxwell's Demon", <a
href="http://philsci-archive.pitt.edu/archive/00001729/">phil-sci 1729</a>
<li>"Waiting for Landauer", <a href="http://philsci-archive.pitt.edu/8635/">phil-sci/8635</a>
<li>"All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation", <a href="http://dx.doi.org/10.3390/e15104432"><cite>Entropy</cite> <strong>15</strong> (2013): 4432--4483</a> [<a href="http://www.pitt.edu/~jdnorton/papers/Max_Demon_Entropy.pdf">PDF reprint via Prof. Norton</a>]
</ul>
<li>Orly Shenker, "Logic and Entropy", <a
href="http://philsci-archive.pitt.edu/archive/00000115/">phil-sci 115</a>
[Claims Landauer's principle is wrong]
<li>Orly Shenker and Meir Hemmo, "Maxwell's
Demon", <a
href="http://philsci-archive.pitt.edu/archive/00003795/">phil-sci/3795</a>
[preprint in the evil Word]
</ul>
<ul>Not recommended:
<li>Kurt Jacobs, "Quantum measurement and the first law of thermodynamics: The energy cost of measurement is the work value of the acquired information", <a href="http://dx.doi.org/"><cite>Physical Review E</cite>
<strong>86</strong> (2012): 040106(R)</a> [This makes a big deal about how
it's <em>measurement</em> that is thermodynamically costly, not erasure. But
the actual conclusion is that if you want a <em>cycle</em>, you need to erase
the record of the measurement, which is (by Landauer, if we believe that) costly.
So the <em>measurement</em> itself has no thermodynamic cost, just the erasure. In other
words, everyone has been right. Perhaps I'm missing something, but this seems
like a big ball of nothing.]
</ul>
<ul>Actively dis-recommended:
<li>B. Roy Frieden, <cite>Physics from Fisher Information: A
Unification</cite> [Attempt to derive physics from information theory.
I think this is a bad book, but (immodestly) I do recommend
my review of it: <a href="../reviews/physics-from-fisher-info/">Laboring to
Bring Forth a Mouse</a>]
</ul>
<ul>To read [thanks to Erik Tellgren for references on Maxwell's demon]:
<li>Samson Abramsky, "A structural approach to reversible computation",
<a href="http://dx.doi.org/10.1016/j.tcs.2005.07.002"><cite>Theoretical
Computer Science</cite> <strong>347</strong> (2005): 441--464</a>
<li>R. Balian, "Information in statistical physics", <a
href="http://arxiv.org/abs/cond-mat/0501322">cond-mat/0501322</a>
<li>Dina Barak-Pelleg, Daniel Berend, J.C. Saunders, "A Model of Random Industrial SAT", <a href="http://arxiv.org/abs/1908.00089">arxiv:1908.00089</a>
<li>A. C. Barato, D Hartich, U. Seifert, "Information-theoretic vs. thermodynamic entropy production in autonomous sensory networks",
<a href="http://dx.doi.org/10.1103/PhysRevE.87.042104"><cite>Physical Review E</cite> <strong>87</strong> (2013): 042104</a>,
<a href="http://arxiv.org/abs/1212.3186">arxiv:1212.3186</a>
<li>Charles H. Bennett, "Notes on Landauer's principle, reversible
computation, and Maxwell's Demon", <cite>Studies In History and Philosophy of
Science Part B</cite> <strong>34</strong> (2003): 501--510
<li>Brillouin, <cite>Science and Information Theory</cite>
<li>J. Bub, "Maxwell's Demon and the Thermodynamics of Computation",
<cite>Studies In History and Philosophy of Science B</cite> <strong>32</strong>
(2001): 569--579
<li>John C. Collins, "On the Compatibility Between Physics and
Intelligent Organisms," <a
href="http://arxiv.org/abs/physics/0102024">physics/0102024</a> [Claims to have
a truly elegant refutation of Penrose]
<li>S. N. Coppersmith, "Using the Renormalization Group to
Classify Boolean Functions", <a href="http://dx.doi.org/10.1007/s10955-008-9486-2"><cite>Journal of Statistical Physics</cite> <strong>130</strong> (2008):
1063--1085</a>
<li>Surya Ganguli and Haim Sompolinsky, "Statistical Mechanics of
Compressed
Sensing", <a href="http://dx.doi.org/10.1103/PhysRevLett.104.188701"><cite>Physical
Review Letters</cite> <strong>104</strong> (2010): 188701</a>
<li>Gramss, Bornholdt, Gross, Mitchell and Pellizzari (eds.),
<cite>Non-Standard Computation: Molecular Computation --- Cellular Automata ---
Evolutionary Algorithms --- Quantum Computers</cite>
<li>Anthony J. G. Hey (ed.), <cite>Feynman and Computation</cite>
<li>Shiro Ikeda, Toshiyuki Tanaka and Shun-ichi Amari, "Stochastic
Reasoning, Free Energy, and Information
Geometry", <a
href="http://neco.mitpress.org/cgi/content/abstract/16/9/1779"><cite>Neural
Computation</cite> <strong>16</strong> (2004): 1779--1810</a>
<li>Antonio Iovanella, Benedetto Scoppola and Elisabetta Scoppola,
"Some Spin Glass Ideas Applied to the Clique Problem",
<a href="http://dx.doi.org/10.1007/s10955-006-9255-z"><cite>Journal of
Statistical Physics</cite> <strong>126</strong> (2007): 895--915</a>
<li>Dominik Janzing, "On the Computational Power of Molecular Heat
Engines", <a href="http://dx.doi.org/10.1007/s10955-005-8015-9"><cite>Journal
of Statistical Physics</cite> <strong>122</strong> (2006): 531--566</a>
<li>Adel Javanmard, Andrea Montanari, and Federico Ricci-Tersenghi,
"Phase transitions in semidefinite relaxations", <a href="http://dx.doi.org/10.1073/pnas.1523097113"><cite>Proceedings of the National Academy of Sciences</cite> <strong>113</strong> (2016): E2218--E2223</a>
<li>Javier Anta
<ul>
<li><cite>Historical and Conceptual Foundations of Information Physics</cite> [Ph.D. dissertation, Universitat de Barcelona, 2021; available from <a href="https://philpapers.org/rec/JAVHAC">philpapers.org</a>]
<li>"A Philosopher against the Bandwagon: Carnap and the Informationalization of Thermal Physics", <a href="https://doi.org/10.1086/718416"><cite>HOPOS: The Journal of the International Society for the History of Philosophy of Science</cite> <strong>forthcoming</strong></a>
</ul>
<li>Harvey S. Leff and Andrew F. Rex (eds.), <cite><a href="https://www.jstor.org/stable/j.ctt7zts1p">Maxwell's Demon: Entropy, Information, Computing</a></cite> [A collection of classic
papers with commentary]
<li>Lev B. Levitin, "Energy Cost of Information Transmission (Along the
Path to Understanding)," <cite>Physica D</cite> <strong>120</strong>(1998):
162--167
<li>Lev B. Levitin and Tommaso Toffoli, "Thermodynamic Cost of
Reversible Computing", <a
href="http://dx.doi.org/10.1103/PhysRevLett.99.110502"><cite>Physical Review
Letters</cite> <strong>99</strong> (2007): 110502</a>
<li>Seth Lloyd
<ul>
<li>"Use of Mutual Information to Decrease Entropy ---
Implications for the Second Law of Thermodynamics," <cite>Physical Review
A</cite> <strong>39</strong> (1989): 5378--5386
<li>"Computational capacity of the universe," <a
href="http://arxiv.org/abs/quant-ph/0110141">quant-ph/0110141</a> [Already at
the abstract I have doubts. I'm not quibbling with the idea that there's a certain
minimal amount of time needed to perform (the equivalent of) logic operations,
or phase-space needed to store information. But given that the most plausible
hypothesis for the composition of the universe is presently "90% of all mass
is something we can't see", well, I don't think this is a profitable
calculation to make]
</ul>
<li>Dibyendu Mandal, H. T. Quan, and Christopher Jarzynski, "Maxwell’s Refrigerator: An Exactly Solvable Model", <a href="http://dx.doi.org/10.1103/PhysRevLett.111.030602"><cite>Physical Review Letters</cite> <strong>111</strong> (2013): 030602</a>
<li>O. J. E. Maroney
<ul>
<li>"Does a Computer have an Arrow of Time?",
<a href="http://arxiv.org/abs/0709.3131">0709.3131</a>
<li>"The (absence of a) relationship between thermodynamic and logical reversibility", <a href="http://arxiv.org/abs/0406137">arxiv:0406137</a>
<li>"Generalising Landauer's Principle", <cite>Physical
Review E</cite> <strong>79</strong> (2009): 031105, <a href="http://arxiv.org/abs/quant-ph/0702094">arxiv:quant-ph/0702094</a>
</ul>
<li>Pankaj Mehta and David J. Schwab, "Energetic costs of cellular computation", <a href="http://dx.doi.org/10.1073/pnas.1207814109"><cite>Proceedings of the National Academy of Sciences</cite>(USA) <strong>109</strong> (2012):
17978--17982</a>
<li>Marc Mezard and Andrea Montanari, <cite><a href="http://www.oup.com/us/catalog/general/subject/Mathematics/ComputationalMathematics/?view=usa&ci=9780198570837">Information,
Physics, and Computation</a></cite>
<li>Caterina E. Mora and Hans J. Briegel, "Algorithmic Complexity and
Entanglement of Quantum States", <a
href="http://dx.doi.org/10.1103/PhysRevLett.95.200503"><cite>Physical Review
Letters</cite> <strong>95</strong> (2005): 200503</a>
<li>Martin Niss, "Brownian Motion as a Limit to Physical Measuring Processes: A Chapter in the History of Noise from the Physicists' Point of View",
<a href="https://doi.org/10.1162/POSC_a_00190"><cite>Perspectives on Science</cite> <strong>24</strong> (2016): 29--44</a>
<li>Allon Percus, Gabriel Istrate
and <a href="http://www.santafe.edu/~moore/">Cristopher Moore</a>
(eds.), <cite><a href="http://www.oup.com/isbn/0-19-517738-X">Computational Complexity and Statistical Physics</a></cite>
<li>A. R. Plastino and A. Daffertshofer, "Liouville Dynamics and the
Conservation of Classical Information", <a
href="http://dx.doi.org/10.1103/PhysRevLett.93.138701"><cite>Physical Review
Letters</cite> <strong>93</strong> (2004): 138701</a>
<li>Takahiro Sagawa and Masahito Ueda
<ul>
<li>"Jarzynski Equality with
Maxwell's
Demon", <a href="http://arxiv.org/abs/cond-mat/0609085">cond-mat/0609085</a>
<li>"Minimal Energy Cost for Thermodynamic Information Processing: Measurement and Information Erasure",
<a href="http://dx.doi.org/10.1103/PhysRevLett.102.250602"><cite>Physical
Review Letters</cite> <strong>102</strong> (2009):
250602</a>, <a href="http://arxiv.org/abs/0809.4098">arxiv:0809.4098</a>
</ul>
<li>Matthias Scheutz, "When Physical Systems Realize Functions...",
<cite>Minds and Machines</cite> <strong>9</strong> (1999): 161--196 ["standard
notions of computation together with a 'state-to-state correspondence view of
implementation' cannot overcome difficulties posed by Putnam's Realization
Theorem and that, therefore, a different approach to implementation is
required. The notion 'realization of a function', developed out of physical
theories, is then introduced as a replacement for the notional pair,
'computation-implementation'. After gradual refinement, taking practical
constraints into account, this notion gives rise to the notion 'digital system'
which singles out physical systems that could be actually used, and possibly
even built."]
<li>Tony Short, James Ladyman, Berry Groisman and Stuart Presnell,
"The Connection between Logical and Thermodynamical Irreversibility",
<a href="http://philsci-archive.pitt.edu/archive/00002374/">phil-sci 2374</a>
<li>Tommaso Toffoli, Silvio Capobianco, Patrizia Mentrasti, "When--and
how--can a cellular automaton be rewritten as a lattice gas?",
<a href="http://arxiv.org/abs/0709.1173">0709.1173</a>
<li>Steven Weinstein, "Objectivity, Information, and Maxwell's Demon",
<cite>Philosophy of Science</cite> <strong>70</strong> (2003): 1245--1255
<li>Michael M. Wolf, Frank Verstraete, Matthew B. Hastings, and J. Ignacio Cirac, "Area Laws in Quantum Systems: Mutual Information and Correlations",
<a href="http://dx.doi.org/10.1103/PhysRevLett.100.070502"><cite>Physical Review Letters</cite>
<strong>100</strong> (2008): 070502</a>
<li>David Wolpert, "On the Computational Capabilities of Physical
Systems," <a href="http://arxiv.org/abs/physics/0005058">physics/0005058</a>
(pt. I, "The Impossibility of Infallible Computation") and <a
href="http://arxiv.org/abs/physics/0005059">physics/0005059</a> (pt. II,
"Relationship with Conventional Computer Science")
</ul>