May 31, 2006

Books to Read While the Algae Grow in Your Fur, May 2006

Attention conservation notice: I have no taste.

Naomi Novik, Throne of Jade
Mind candy. Sequel to His Majesty's Dragon, and much the same remarks apply.
Manuel De Landa, A Thousand Years of Nonlinear History
Take Daniel Dennett's philosophy (mechanical materialism brought up to date), add a course of reading in the better world historians (Braudel, McNeill, Crosby), economists (North, Simon; also the less defensible parts of Jane Jacobs) and sociolinguists (Labov), and then translate into the Deleuze-and-Guattari dialect of post-structuralism, which adds absolutely nothing to the argument. (Fortunately, in a nice display of code-switching, De Landa for the most part writes clearly, if quite abstractly and academically, restricting the Deleuzisms to clearly-delineated sections. This only makes it easier to see that they are completely superfluous.) Deserves a full review (after all, I got this review copy in 1998), not that I'm apt to have the time in the near future...
Tim Powers, Declare
Mind candy. The occult Lovecraftian inner truth of the Cold War, with Kim Philby as emissary to the Old Ones. Superb. (Only: it really doesn't fit well that Roman Catholicism is also supposed to be true.)
Rebecca Solnit, River of Shadows: Eadweard Muybridge and the Technological Wild West
This is about the future we now live in, when it wasn't widely distributed yet — and how it began in California, specifically the San Francisco area, circa 1870--1890. (One of the few connections Solnit doesn't make is to William Everdell's theory of modernism as discontinuity and collage, though it would seem to fit her argument perfectly.) One place her argument fails, however, is in persuading me that Muybridge's character had any influence on the subsequent development of cinematography — that if he had been a different person, things would have really turned out differently in any important respect, that much the same process wouldn't've been invented by other people, finding much the same uses, just as, say, television was invented by several very different people more or less simultaneously. (As one of Lem's characters argues in His Master's Voice, science and technology are ergodic processes, in which individuals' influences are transient fluctuations.) But Muybridge's life story (and the intersecting stories Solnit weaves through it) is well worth knowing for its own sake.
M. S. Bartlett, An Introduction to Stochastic Processes, with Special Reference to Methods and Applications
Old-fashioned British (Fisherian) statistics — it's from 1955! — with all the weaknesses (in mathematical sophistication, and attention to rigor) and strengths (attention to empirical applicability, preference for straightforward techniques over abstraction for its own sake) of that tradition. His treatment of the mean-square ergodic theorem (which someone seems to have ripped off), for example, comes very close to telling you how to calculate the ergodic limit, rather than being an exercise in the spectral analysis of unitary operators. And, while he just takes it as obvious that you should do statistical inference for stochastic processes by maximizing the likelihood, he does consider inference for stochastic processes, because ultimately he's a statistician with data to analyze, and not a probabilist with theorems to prove.
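The calculational point is easy to see numerically. A minimal sketch (my example, not Bartlett's): for a stationary process with summable autocovariances, such as a mean-zero AR(1), the mean-square ergodic theorem guarantees that the time average converges to the expectation, so you can just simulate and average.

```python
import random

def ar1_time_average(phi=0.6, n=200_000, seed=42):
    """Simulate a mean-zero AR(1) process X_t = phi*X_{t-1} + eps_t,
    eps_t ~ N(0,1), and return the time average (1/n) * sum_t X_t.
    Since the autocovariances phi^k are summable, the mean-square
    ergodic theorem says this converges to E[X] = 0 as n grows."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        total += x
    return total / n

print(ar1_time_average())  # close to the true mean, 0
```

With these parameters the standard deviation of the time average is roughly sqrt(6.25 / 200000), about 0.006, so the printed value sits within a few hundredths of zero.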
Amir Dembo and Ofer Zeitouni, Large Deviations Techniques and Applications
Very nice textbook, probably ideal for a year-long course on large deviations theory. Especially strong on projective limits, and on applications to signal processing and information theory.
Frank den Hollander, Large Deviations
Maybe the best first introduction to large deviations theory I've seen. I'd have preferred a bit more functional analysis and a bit less combinatorics in the first two chapters (really!), but it's excellent, and I've stolen from it shamelessly. I especially like the treatment of the Gärtner-Ellis theorem, and devoting the whole second half to interesting applications.
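For those who haven't met it, the Gärtner-Ellis theorem, stated informally (this is my compressed paraphrase, not den Hollander's formulation, and it suppresses the technical conditions on steepness and domains):

```latex
% Gärtner--Ellis, informally: if the scaled cumulant generating function
% of a sequence (Z_n) exists and is finite and differentiable, then the
% Z_n satisfy a large deviation principle whose rate function is its
% Legendre--Fenchel transform.
\Lambda(t) \equiv \lim_{n\to\infty} \frac{1}{n}\log \mathbb{E}\left[e^{n t Z_n}\right],
\qquad
I(z) = \Lambda^{*}(z) = \sup_{t \in \mathbb{R}}\left[t z - \Lambda(t)\right],
\qquad
\Pr\left(Z_n \approx z\right) \approx e^{-n I(z)} .
```

The appeal is that it needs no independence assumptions, only the existence of the limiting cumulant generating function, which is why it subsumes Cramér's theorem as the IID special case.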
Richard S. Ellis, Entropy, Large Deviations, and Statistical Mechanics
In addition to being an excellent exposition of the rigorous theory of large deviations (especially for physicists, naturally!), this is also one of the most conceptually satisfying approaches to the foundations of statistical mechanics. In particular, it makes good probabilistic sense of the method of maximum entropy, without invoking weird sub-Bayesian ideas about statistical inference. (Namely, maximum Gibbs-Shannon entropy drops out as an approximate consequence of large deviations theory, when considering a small part of a large system, becoming exact only in the thermodynamic limit. As Ellis says, the core of this idea goes back to Boltzmann.)
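The maxent-from-large-deviations point fits in one display (my paraphrase of the standard Sanov-theorem argument, not Ellis's notation): the empirical distribution of n IID draws from a reference measure Q obeys

```latex
\Pr\left(\hat{P}_n \approx P\right) \approx e^{-n D(P \| Q)},
\qquad
D(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)} ,
```

so conditioning on a macroscopic constraint like $\mathbb{E}_P[f] = \bar{f}$ makes the constrained minimizer of $D(P\|Q)$ overwhelmingly likely as $n \to \infty$. When Q is uniform, minimizing $D(P\|Q)$ is the same as maximizing the Gibbs-Shannon entropy $H(P)$, and the Lagrange-multiplier solution is the familiar exponential form $P(x) \propto Q(x)\, e^{-\beta f(x)}$. No inferential principle needed, just probability.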

Books to Read While the Algae Grow in Your Fur; Enigmas of Chance; Writing for Antiquity; The Great Transformation; Scientifiction and Fantastica; The Beloved Republic; Philosophy; Complexity; Pleasures of Detection, Portraits of Crime; Cthulhiana

Posted at May 31, 2006 23:59 | permanent link

Three-Toed Sloth