Physics of Computation and Information

25 Jul 2023 10:38

First: what does physics say about computation and communication? That is, what constraints do physical laws put on realizable computers? (See below.)

Second: What, if anything, do the theories of computation and information say about physics? I am particularly thinking of attempts to derive physical laws from information theory, none of which look the least bit convincing to me. The hope, I guess, is that what looks like physics, like a more-or-less contingent fact about the world, will turn out to be really math, something which would have to be true in any world which we deal with more-or-less statistically. As I said, I'm not familiar with any attempt to do this --- to get "it from bit," as Wheeler put it --- which looks at all convincing. The only thing which comes close to being an exception is the use of the method of maximum entropy in statistical mechanics. But I'd argue this is deceptive: maximum-entropy distributions are ones with minimal interaction between their variables. The fact that they work for many but not all physical situations tells us that in many cases we can find independent or nearly-independent variables to work with --- i.e., maxent works, when it does, because of contingent facts about the physical world, not out of some mathematical necessity. But that would take us into an argument about the foundations of statistical mechanics, which God forbid.
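(To spell out the "minimal interaction" claim --- this is just the standard Lagrange-multiplier calculation, with \( f_i \), \( F_i \), \( \lambda_i \) being my notation, nothing from any particular source: suppose the only constraints are on expectations of functions of single coordinates,
\[
\max_p \; -\sum_x p(x) \log p(x)
\quad \text{subject to} \quad
\sum_x p(x) f_i(x_i) = F_i, \qquad \sum_x p(x) = 1 .
\]
Stationarity then gives
\[
p(x) = \frac{1}{Z} \exp\!\left( -\sum_i \lambda_i f_i(x_i) \right) = \prod_i p_i(x_i) ,
\]
i.e., the coordinates come out statistically independent; dependence enters only through constraints which couple several coordinates at once. So maxent succeeds exactly when nature is kind enough to hand us observables of that decoupled form.)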

("Computational physics" in the sense of the journal classifications --- using computers to do calculations on physical problems --- is a third subject altogether. I find it about as interesting as the work which goes into compiling a handbook of integrals and formulas --- which is to say, I'm glad somebody else does it.)

Third: using ideas from physics (especially statistical mechanics) to analyze problems of computation, e.g., the appearance of phase transitions in optimization problems.
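The classic instance is random 3-SAT, where the fraction of satisfiable formulas drops sharply as the clause-to-variable ratio \( \alpha = m/n \) passes through roughly 4.27. One can see this with no statistical mechanics at all; here is a minimal brute-force sketch (Python, with names of my own devising --- kept to small \( n \) since it enumerates all assignments):

    import itertools, random

    def random_3sat(n, m, rng):
        """m random 3-clauses over n variables; a literal is +v or -v."""
        return [[v if rng.random() < 0.5 else -v
                 for v in rng.sample(range(1, n + 1), 3)]
                for _ in range(m)]

    def satisfiable(clauses, n):
        """Exhaustive check over all 2^n assignments (fine for small n)."""
        for bits in itertools.product([False, True], repeat=n):
            if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
                return True
        return False

    rng = random.Random(0)
    n, trials = 12, 40
    for alpha in (3.0, 3.5, 4.0, 4.27, 4.5, 5.0):
        m = round(alpha * n)
        sat = sum(satisfiable(random_3sat(n, m, rng), n) for _ in range(trials))
        print(f"alpha = {alpha:4.2f}: fraction satisfiable = {sat / trials:.2f}")

At \( n = 12 \) the transition is visibly smeared out; it sharpens as \( n \) grows, which is just the finite-size scaling the statistical-mechanics analogy leads one to expect.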

Fourth, fundamental physical limits on computation. The outstanding example would be Landauer's principle (which now gets its own notebook): erasing one bit dissipates at least \( k T \ln 2 \) of heat, where \( T \) is the absolute temperature and \( k \) is Boltzmann's constant, and erasure is necessary so that the computation goes forward from inputs to outputs, and not the reverse. (Or is it? Couldn't you just ignore the bits you keep around, so as to have a reversible computation?) Others? Limits on bit storage per unit phase-space? Per unit mass? Limits on the time needed to perform one logical operation? (See Lloyd's article in Nature, below, for discussion and references on these points. I'm not quite sure that he's right about the speed limitation.)
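For a sense of scale, a back-of-the-envelope calculation (Python; the constant is standard, the code is just arithmetic):

    import math

    k = 1.380649e-23   # Boltzmann's constant, J/K (exact since the 2019 SI)

    def landauer_bound(T):
        """Minimum heat (joules) dissipated by erasing one bit at temperature T (kelvin)."""
        return k * T * math.log(2)

    print(landauer_bound(300))   # ~2.87e-21 J per bit at room temperature

Erasing a billion bits at 300 K thus costs about \( 3 \times 10^{-12} \) J, many orders of magnitude below what actual hardware dissipates per operation, which is why the bound has so far been of conceptual rather than engineering interest.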

All physically implementable computers would seem to have only finite memory. Therefore they cannot really be anything more than finite-state machines, though their memories may be so large, and so structured, that devices of higher computational power are good approximations to them. Is there any way out of this conclusion? What does it imply for physics (if anything)? Of course, this in no way impugns the mathematical soundness of notions of infinity. (I have an amusing proof that 1 is the largest integer for those who feel otherwise.)
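(To make the finite-memory point precise: a machine whose entire state --- memory, registers, program counter --- fits in \( B \) bits has at most \( 2^B \) distinct configurations, so by the pigeonhole principle a deterministic machine left to run without input must revisit some configuration,
\[
\exists\, s < t \le 2^B : x_s = x_t ,
\]
and from then on its trajectory cycles with period at most \( 2^B \). The infinite tape, and with it full Turing-machine power, is exactly where the idealization comes in.)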

See also: Computation; Information Theory; Landauer's Principle; Physics; Quantum Mechanics
