Physics of Computation and Information
21 Aug 2024 15:22
First: what does physics say about computation and communication? That is, what constraints do physical laws put on realizable computers? (See below.)
Second: What, if anything, do the theories of computation and information say about physics? I am particularly thinking of attempts to derive physical laws from information theory, none of which look the least bit convincing to me. The hope, I guess, is that what looks like physics, like a more-or-less contingent fact about the world, will turn out to be really math, something which would have to be true in any world which we deal with more-or-less statistically. As I said, I'm not familiar with any attempt to do this --- to get "it from bit," as Wheeler says --- which looks at all convincing. The only thing which comes close to being an exception is the use of the method of maximum entropy in statistical mechanics. But I'd argue this is deceptive: maximum-entropy distributions are ones with minimal interaction between their variables. The fact that they work for many but not all physical situations tells us that in many cases we can find independent or nearly-independent variables to work with --- i.e., maxent works, when it does, because of contingent facts about the physical world, not out of some mathematical necessity. But that would take us into an argument about the foundations of statistical mechanics, which God forbid.
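(To make the "minimal interaction" point concrete, here is the textbook calculation, not tied to any particular physical system. Maximizing the entropy \( -\sum_x p(x) \log p(x) \) subject to constraints \( \sum_x p(x) f_i(x) = \mu_i \) gives, by Lagrange multipliers, the exponential-family form
\[
p^{*}(x) = \frac{1}{Z(\lambda)} \exp\left( -\sum_i \lambda_i f_i(x) \right) .
\]
If every constraint function depends on just one coordinate, say \( f_i(x) = g_i(x_i) \), the exponential factors, and \( p^{*} \) is a product of independent marginals. Dependence between variables enters only through constraint functions that couple coordinates --- so when maxent succeeds, it is because the physically relevant constraints happen to be of this nearly-decoupled kind.)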
("Computational physics" in the sense of the journal classifications --- using computers to do calculations on physical problems --- is a third subject altogether. I find it about as interesting as the work which goes into compiling a handbook of integrals and formulas --- which is to say, I'm glad somebody else does it.)
Third: using ideas from physics (especially statistical mechanics) to analyze problems of computation, e.g., the appearance of phase transitions in optimization problems.
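As a toy illustration of such a transition --- a minimal sketch of my own, not code from any of the papers cited below --- generate random 3-SAT formulas at clause density \( \alpha = m/n \) and brute-force them. The satisfiable fraction drops sharply as \( \alpha \) crosses the empirical large-\( n \) threshold of about 4.27, though at sizes this small the crossover is smeared out:

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT formula: each clause picks 3 distinct variables,
    each negated with probability 1/2. Literal +v: variable v true;
    -v: variable v false (variables numbered from 1)."""
    return [[v if rng.random() < 0.5 else -v
             for v in rng.sample(range(1, n_vars + 1), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute force over all 2^n assignments; only sensible for small n."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

rng = random.Random(0)
n, trials = 12, 40
for alpha in (2.0, 3.0, 4.0, 4.27, 5.0, 6.0):
    sat = sum(satisfiable(n, random_3sat(n, round(alpha * n), rng))
              for _ in range(trials))
    print(f"alpha = {alpha:4.2f}: {sat}/{trials} satisfiable")
```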
Fourth: fundamental physical limits on computation. The outstanding example would be Landauer's principle (which now gets its own notebook): erasing one bit produces at least \( kT \ln 2 \) joules of heat, where \( T \) is the absolute temperature and \( k \) is Boltzmann's constant, and erasure is necessary so that the computation goes forward from inputs to outputs, and not the reverse. (Or is it? Couldn't you just ignore the bits you keep around, so as to have a reversible computation?) Others? Limits on bit storage per unit phase-space? Per unit mass? Limits on the time needed to perform one logical operation? (See Lloyd's article in Nature, below, for discussion and references on these points. I'm not quite sure that he's right about the speed limitation.)
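To put numbers on these bounds --- a back-of-the-envelope sketch, whose only inputs are the CODATA values of the constants, Landauer's bound, and the Margolus--Levitin bound \( t \geq \pi\hbar/2E \) from the paper cited below, taken here at face value:

```python
import math

k_B = 1.380649e-23      # Boltzmann's constant, J/K (exact, by SI definition)
hbar = 1.054571817e-34  # reduced Planck constant, J*s

# Landauer: erasing one bit at temperature T dissipates at least k T ln 2.
T = 300.0  # room temperature, kelvin
E_landauer = k_B * T * math.log(2)
print(f"Landauer cost per bit at {T:.0f} K: {E_landauer:.3e} J")  # ~2.87e-21 J

# Margolus-Levitin: a system with mean energy E above its ground state
# needs time at least pi*hbar/(2E) to reach an orthogonal state, i.e.,
# it can do at most 2E/(pi*hbar) "operations" per second.
E = E_landauer  # suppose one bit's worth of energy drives the dynamics
t_min = math.pi * hbar / (2 * E)
print(f"Minimum time per operation at that energy: {t_min:.3e} s")
print(f"Maximum rate: {1 / t_min:.3e} ops/s")
```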
All physically-implementable computers would seem to have only finite memory. Therefore they cannot really be anything more than finite state machines, though their memories may be so large and so structured that devices of higher computational power are good approximations to them. Is there any way out of this conclusion? What does it imply for physics (if anything)? Of course, this in no way impugns the mathematical soundness of notions of infinity. (I have an amusing proof that 1 is the largest integer for those who feel otherwise.)
- See also:
- Computation
- Information Theory
- Landauer's Principle
- Physics
- Quantum Mechanics
- When Do Physical Systems Compute?
- Recommended, big picture:
- Greg Egan [I'd say that Egan's novels are as good as the scientific literature, but when it comes to knowledge, sophistication and imagination, they're actually significantly better than much of it.]
- Distress
- Permutation City
- Neil Gershenfeld, The Physics of Information Technology [Superb. He should not be able to teach as much as he does, assuming as little on the reader's part as he does, in as little space as he does; but somehow the trick is pulled off.]
- Owen Maroney, "Information Processing and Thermodynamic Entropy", in the Stanford Encyclopedia of Philosophy
- Cristopher Moore, "Computational Complexity in Physics," cond-mat/0109010
- Cristopher Moore and Stephan Mertens, The Nature of Computation [Cris and Stephan were kind enough to let me read this in manuscript; it's magnificent. Review: Intellects Vast and Warm and Sympathetic]
- W. H. Zurek (ed.), Complexity, Entropy, and the Physics of Information
- Recommended, close-ups (see also under Landauer's Principle):
- Scott Aaronson, "NP-complete Problems and Physical Reality", quant-ph/0502072
- David Albert, Time and Chance [For the discussions of Maxwellian and pseudo-Maxwellian demons]
- Information Physics at the University of New Mexico
- Seth Lloyd, "Ultimate Physical Limits to Computation," Nature 406(2000): 1047--1054
- Norm Margolus and L. B. Levitin, "The Maximum Speed of Dynamical Evolution," Physica D 120(1998): 188--195, quant-ph/9710043
- O. C. Martin, R. Monasson and R. Zecchina, "Statistical mechanics methods and phase transitions in optimization problems," cond-mat/0104428
- Warren D. Smith, "Fundamental Physical Limits on Computation", technical report, NEC research, 1995 [Can't find this online now (2023) but I learned a lot from it in graduate school...]
- Not recommended:
- Kurt Jacobs, "Quantum measurement and the first law of thermodynamics: The energy cost of measurement is the work value of the acquired information", Physical Review E 86 (2012): 040106(R) [This makes a big deal about how it's measurement that is thermodynamically costly, not erasure. But the actual conclusion is that if you want a cycle, you need to erase the measurement record, which is (by Landauer's principle, if we believe that) costly. But then the measurement itself has no thermodynamic cost, just the erasure. In other words, everyone has been right. Perhaps I'm missing something, but this seems like a big ball of nothing.]
- Actively dis-recommended:
- B. Roy Frieden, Physics from Fisher Information: A Unification [Attempt to derive physics from information theory. I think this is a bad book, but (immodestly) I do recommend my review of it: Laboring to Bring Forth a Mouse]
- To read [thanks to Erik Tellgren for references on Maxwell's demon]:
- Samson Abramsky, "A structural approach to reversible computation", Theoretical Computer Science 347 (2005): 441--464
- R. Balian, "Information in statistical physics", cond-mat/0501322
- Dina Barak-Pelleg, Daniel Berend, J.C. Saunders, "A Model of Random Industrial SAT", arxiv:1908.00089
- A. C. Barato, D Hartich, U. Seifert, "Information-theoretic vs. thermodynamic entropy production in autonomous sensory networks", Physical Review E 87 (2013): 042104, arxiv:1212.3186
- Charles H. Bennett, "Notes on Landauer's principle, reversible computation, and Maxwell's Demon", Studies In History and Philosophy of Science Part B 34 (2003): 501--510
- Léon Brillouin, Science and Information Theory
- J. Bub, "Maxwell's Demon and the Thermodynamics of Computation", Studies In History and Philosophy of Science B 32 (2001): 569--579
- John C. Collins, "On the Compatibility Between Physics and Intelligent Organisms," physics/0102024 [Claims to have a truly elegant refutation of Penrose]
- S. N. Coppersmith, "Using the Renormalization Group to Classify Boolean Functions", Journal of Statistical Physics 130 (2008): 1063--1085
- Surya Ganguli and Haim Sompolinsky, "Statistical Mechanics of Compressed Sensing", Physical Review Letters 104 (2010): 188701
- Gramss, Bornholdt, Gross, Mitchell and Pellizzari (eds.), Non-Standard Computation: Molecular Computation --- Cellular Automata --- Evolutionary Algorithms --- Quantum Computers
- Anthony J. G. Hey (ed.), Feynman and Computation: Exploring the Limits of Computers
- Shiro Ikeda, Toshiyuki Tanaka and Shun-ichi Amari, "Stochastic Reasoning, Free Energy, and Information Geometry", Neural Computation 16 (2004): 1779--1810
- Antonio Iovanella, Benedetto Scoppola and Elisabetta Scoppola, "Some Spin Glass Ideas Applied to the Clique Problem", Journal of Statistical Physics 126 (2007): 895--915
- Dominik Janzing, "On the Computational Power of Molecular Heat Engines", Journal of Statistical Physics 122 (2006): 531--566
- Adel Javanmard, Andrea Montanari, and Federico Ricci-Tersenghi, "Phase transitions in semidefinite relaxations", Proceedings of the National Academy of Sciences 113 (2016): E2218--E2223
- Javier Anta
- Historical and Conceptual Foundations of Information Physics [Ph.D. dissertation, Universitat de Barcelona, 2021; available from philpapers.org]
- "A Philosopher against the Bandwagon: Carnap and the Informationalization of Thermal Physics", HOPOS: The Journal of the International Society for the History of Philosophy of Science forthcoming
- Harvey S. Leff and Andrew F. Rex (eds.), Maxwell's Demon: Entropy, Information, Computing [A collection of classic papers with commentary]
- Lev B. Levitin, "Energy Cost of Information Transmission (Along the Path to Understanding)," Physica D 120(1998): 162--167
- Lev B. Levitin and Tommaso Toffoli, "Thermodynamic Cost of Reversible Computing", Physical Review Letters 99 (2007): 110502
- Seth Lloyd
- "Use of Mutual Information to Decrease Entropy --- Implications for the Second Law of Thermodynamics," Physical Review A 39 (1989): 5378--5386
- "Computational capacity of the universe," quant-ph/0110141 [Already at the abstract I have doubts. I'm not quibbling with idea that there's a certain minimal amount of time needed to perform (the equivalent of) logic operations, or phase-space needed to store information. But given that the most plausible hypothesis for the composition of the universe is presently "90% of all mass is something we can't see", well, I don't think this is a profitable calculation to make]
- Dibyendu Mandal, H. T. Quan, and Christopher Jarzynski, "Maxwell’s Refrigerator: An Exactly Solvable Model", Physical Review Letters 111 (2013): 030602
- O. J. E. Maroney
- "Does a Computer have an Arrow of Time?", 0709.3131
- "The (absence of a) relationship between thermodynamic and logical reversibility", arxiv:0406137
- "Generalising Landauer's Principle", Physical Review E 79 (2009): 031105, arxiv:quant-ph/0702094
- Pankaj Mehta and David J. Schwab, "Energetic costs of cellular computation", Proceedings of the National Academy of Sciences (USA) 109 (2012): 17978--17982
- Marc Mezard and Andrea Montanari, Information, Physics, and Computation
- Caterina E. Mora and Hans J. Briegel, "Algorithmic Complexity and Entanglement of Quantum States", Physical Review Letters 95 (2005): 200503
- Martin Niss, "Brownian Motion as a Limit to Physical Measuring Processes: A Chapter in the History of Noise from the Physicists' Point of View", Perspectives on Science 24 (2016): 29--44
- Allon Percus, Gabriel Istrate and Cristopher Moore (eds.), Computational Complexity and Statistical Physics
- A. R. Plastino and A. Daffertshofer, "Liouville Dynamics and the Conservation of Classical Information", Physical Review Letters 93 (2004): 138701
- Tommaso Toffoli
- "Action, or the fungibility of computation", pp. 349--392 in Hey (ed.) Feynman and Computation (above) [PDF preprint]
- "What Is the Lagrangian Counting?", International Journal of Theoretical Physics 42 (2003): 363--381
- Tommaso Toffoli, Silvio Capobianco, Patrizia Mentrasti, "When--and how--can a cellular automaton be rewritten as a lattice gas?", arxiv:0709.1173
- Steven Weinstein, "Objectivity, Information, and Maxwell's Demon", Philosophy of Science 70 (2003): 1245--1255
- Michael M. Wolf, Frank Verstraete, Matthew B. Hastings, and J. Ignacio Cirac, "Area Laws in Quantum Systems: Mutual Information and Correlations", Physical Review Letters 100 (2008): 070502
- David Wolpert, "On the Computational Capabilities of Physical Systems," physics/0005058 (pt. I, "The Impossibility of Infallible Computation") and physics/0005059 (pt. II, "Relationship with Conventional Computer Science")