July 24, 2006

Critical Sensation

Attention conservation notice: 1500 words on psychophysics and the statistical mechanics of disordered excitable media. Also, it was cross-posted to Crooked Timber, where I am guest-blogging this week, so you've seen it already.

First off, I should thank Henry and the rest of the Timberites for the kind invitation to guest-post, and that very warm introduction. In exchange, I'm going to blog more or less as I usually would, only here. This means some big bricks of posts about "complex systems", so called, which is or was my scientific field, more or less; and also any miscellaneous outrages which catch my eye this week. Mounting my usual hobby-horses on this stage is a poor exchange for their generosity, but mounting hobby-horses is why I started blogging in the first place, and anyway I'm big on conscienceless exploitation of cooperators.

Today I want to talk (below the line) about some recent work in the statistical mechanics of disordered systems, which might help explain how our sense organs work, and actually involves some good uses of self-organized criticality and power laws; tomorrow or the day after I'll get to the smoldering question of "Why Oh Why Can't We Have Better Econophysics?"


Folklore says that the dark-adapted human eye can detect a single photon; this isn't quite true, but we can consciously detect a few tens of photons, and some species are that sensitive. Of course, we can see not only in the dark but also during broad daylight, but then the number of photons falling on every part of the retina is huge; the eye isn't overwhelmed and saturated, though now one or ten photons more or less makes no discernible difference. In the jargon, the eye, and the other sensory organs, have both a large "dynamic range" (we can see in the dark and in the daylight), and "nonlinear response" (changes which are noticeable in the dark aren't noticeable against a high-intensity background). Some version of these facts, including the basic (power-law) form of the relationship between physical stimulus intensity and perceived sensory magnitude, has been known since the nineteenth century. This makes it all the more puzzling that sensory neurons show a linear response over a narrow dynamic range, beyond which they saturate.

You could evade this difficulty by having lots of neurons with different operating ranges, so that raising stimulus intensity saturated some but activated others. The problem is that there doesn't seem to be that wide a spectrum of operating ranges for individual neurons. In a recent paper, Osame Kinouchi and Mauro Copelli (who blog together at Semciência) offer another way, which has to do with the way sensory neurons interact with each other in a network.

Osame Kinouchi and Mauro Copelli, "Optimal dynamical range of excitable networks at criticality", Nature Physics 2 (2006): 348--351; free preprint version at q-bio.NC/0601037 *
Abstract: A recurrent idea in the study of complex systems is that optimal information processing is to be found near phase transitions. However, this heuristic hypothesis has few (if any) concrete realizations where a standard and biologically relevant quantity is optimized at criticality. Here we give a clear example of such a phenomenon: a network of excitable elements has its sensitivity and dynamic range maximized at the critical point of a non-equilibrium phase transition. Our results are compatible with the essential role of gap junctions in olfactory glomeruli and retinal ganglionar cell output. Synchronization and global oscillations also emerge from the network dynamics. We propose that the main functional role of electrical coupling is to provide an enhancement of dynamic range, therefore allowing the coding of information spanning several orders of magnitude. The mechanism could provide a microscopic neural basis for psychophysical laws.

Neurons, like muscle cells, are "excitable", in that the right stimulus will get them to suddenly expend a lot of energy in a characteristic way — muscle cells twitch, and neurons produce an electrical current called an action potential or spike. Kinouchi and Copelli use a standard sort of model of an excitable medium of such cells, which distinguishes between the excited state, a sequence of "refractory" states where the neuron can't spike again after it's been excited, and a resting or quiescent state when the right input could get it to fire. (These models have a long history in neurodynamics, the study of heart failure, cellular slime molds, etc.) Normally, in these models the cells are arrayed in some regular grid, and the probability that a resting cell becomes excited goes up as it has more excited neighbors. This is still true in Kinouchi and Copelli's model, only the arrangement of cells is now a simple random graph. Resting cells also get excited at a steady random rate, representing the physical stimulus.
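To make that concrete, here is a minimal sketch of this sort of model — my own toy version, not Kinouchi and Copelli's actual code, and with details (directed random edges, default parameter values) that are my assumptions rather than theirs. Each cell is quiescent, excited, or refractory; excited neighbors excite quiescent cells with probability set by the branching ratio; the external stimulus is a per-step firing probability h.

```python
import random

def simulate(N=500, k=10, sigma=1.0, h=0.01, n_states=3, T=300, seed=0):
    """Toy Kinouchi-Copelli-style excitable network.  Each of N cells is
    quiescent (state 0), excited (state 1), or refractory (states 2 to
    n_states-1).  Cells sit on a sparse random graph with about k inputs
    each; an excited neighbor excites a quiescent cell with probability
    p = sigma / k, so sigma plays the role of the branching ratio.  The
    physical stimulus makes each quiescent cell fire spontaneously with
    probability h per time step.  Returns the network's mean firing rate
    F (fraction of excited cells, averaged after a transient)."""
    rng = random.Random(seed)
    # random neighbor lists: neighbours[i] = cells whose excitation can reach i
    neighbours = [[] for _ in range(N)]
    for i in range(N):
        for _ in range(k):
            j = rng.randrange(N)
            if j != i:
                neighbours[i].append(j)
    p = sigma / k
    state = [0] * N
    active_frac = []
    for _ in range(T):
        new = [0] * N
        for i in range(N):
            if state[i] == 0:  # quiescent: external drive or neighbor input
                if rng.random() < h:
                    new[i] = 1
                else:
                    for j in neighbours[i]:
                        if state[j] == 1 and rng.random() < p:
                            new[i] = 1
                            break
            elif state[i] < n_states - 1:  # excited -> refractory chain
                new[i] = state[i] + 1
            # else: final refractory state relaxes back to quiescent (0)
        state = new
        active_frac.append(sum(1 for s in state if s == 1) / N)
    return sum(active_frac[T // 2:]) / (T - T // 2)

# e.g. compare weak and strong stimulation at the critical branching ratio:
F_weak = simulate(sigma=1.0, h=0.001)
F_strong = simulate(sigma=1.0, h=0.1)
```

Everything interesting below — sub-, super-, and critical behavior — comes from turning the single knob sigma.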

Kinouchi and Copelli argue that the key quantity in their model is how many cells are stimulated into firing, on average, by a single excited cell. If this "branching ratio" is less than one, an external stimulus will tend to produce a small, short-lived burst of excitation, and there will be no spontaneous activity; the system is sub-critical. If the branching ratio is greater than one, outside stimuli produce very large, saturating waves of excitation, and there's a lot of self-sustained activity, making it hard to use a super-critical network as a detector. At the critical point, however, where each excited cell produces, on average, exactly one more excited cell, waves of excitation eventually die out, but they tend to be very long-lived, and in fact the distribution of their sizes and durations follows a power law.

(People who teach courses on random processes are very fond of branching processes, because the basic model can be solved exactly with hundred-year-old math, but there are endless ramifications, and some of the applications are very technically sweet. Like most mathematical scientists, Kinouchi has certain tools he tends to return to, and critical branching processes are one of them.)
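The hundred-year-old math in question is the Galton-Watson theory of branching processes. One classical result is enough to see why the branching ratio of one is special: the extinction probability of a cascade is the smallest fixed point of the offspring distribution's generating function, which is 1 (certain extinction) at or below criticality and strictly less than 1 above it. A short illustration, assuming Poisson-distributed offspring (my choice of distribution, for concreteness):

```python
import math

def extinction_prob(sigma, iters=200):
    """Extinction probability of a Galton-Watson branching process with
    Poisson(sigma) offspring per individual.  For sigma <= 1 extinction
    is certain (the classical theorem); above 1 the probability is the
    smallest root of q = exp(sigma * (q - 1)), found by iterating the
    generating function from q = 0, which converges to that root."""
    if sigma <= 1.0:
        return 1.0
    q = 0.0
    for _ in range(iters):
        q = math.exp(sigma * (q - 1.0))
    return q

print(extinction_prob(0.8))  # 1.0: sub-critical cascades always die
print(extinction_prob(1.0))  # 1.0: critical cascades also die, but slowly
print(extinction_prob(1.5))  # ~ 0.417: super-critical cascades may run forever
```

Right at sigma = 1 the cascades still die out with probability one, but their lifetimes have no characteristic scale — that heavy tail is what the power laws in the excitation waves come from.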

As Kinouchi and Copelli say in their abstract, the idea that the critical point, where things are just about to go unstable, is a useful place for processing or transmitting information is a persistent theme of complex systems. (You could, arguably, even trace a version of the idea back to William James's Principles of Psychology.) It has also, before this, been one of the weakest of our ideas. The original work from the 1980s on "evolving to the edge of chaos" has proved impossible to replicate, I would even say experimentally refuted. (Why the phrase and idea continue to propagate is another question for another time.) Stu Kauffman's studies of models of gene regulatory networks certainly suggest that information moves through these networks most easily near their critical point, but I don't think anyone has done a careful information-theoretic analysis of that. In any case, E. coli doesn't care about the bandwidth of its regulatory network: it cares about reliably making lactase when it only has lactose to eat, i.e., specific adaptive functions. Prior to this, I can only think of one situation where the idea has been made precise and has strong evidence to back it up (namely, this paper), but that's a purely mathematical exercise of no biological relevance.

What Kinouchi and Copelli have done is very different: they've actually identified something biologically important which is maximized at the critical branching ratio, namely the dynamic range. The network as a whole responds to the stimulus, and its dynamic range can be many orders of magnitude wider than that of its component cells. It is this enhancement which is maximized at the critical branching ratio, and falls off sharply for networks which are even a little sub- or super-critical. As a bonus, the shape of the response function is of the correct power-law form, though, in their model, the exact exponent isn't right. Modifying the network structure, or some model details, changes the exponent, but the dynamic range is still sharply peaked at the critical branching ratio.
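The dynamic range here is the conventional one, measured in decibels: Delta = 10 log10(h_0.9 / h_0.1), where h_x is the stimulus intensity producing x of the maximum response. A toy calculation (mine, not theirs) shows why a flatter, power-law response curve widens it: compare a linearly saturating element with a square-root response of the kind a critical network produces. The specific Hill-type response functions below are my illustrative assumptions, not the paper's.

```python
import math

def dynamic_range(response, lo=1e-6, hi=1e6, n=2000):
    """Dynamic range Delta = 10*log10(h_0.9 / h_0.1), where h_x is the
    stimulus giving fraction x of the maximum response.  Found by scanning
    stimulus h on a logarithmic grid and interpolating in log10(h)."""
    hs = [lo * (hi / lo) ** (i / (n - 1)) for i in range(n)]
    Fs = [response(h) for h in hs]
    Fmax = max(Fs)
    def h_at(level):
        target = level * Fmax
        for i in range(1, n):
            if Fs[i] >= target:
                f0, f1 = Fs[i - 1], Fs[i]
                x0, x1 = math.log10(hs[i - 1]), math.log10(hs[i])
                return 10 ** (x0 + (x1 - x0) * (target - f0) / (f1 - f0))
        return hs[-1]
    return 10 * math.log10(h_at(0.9) / h_at(0.1))

def linear(h):
    return h / (h + 1)            # saturating linear element

def sqrt_law(h):
    return (h / (h + 1)) ** 0.5   # critical-like h^(1/2) response

print(dynamic_range(linear))      # ~ 19.1 dB (the ratio 81 in stimulus)
print(dynamic_range(sqrt_law))    # ~ 26.3 dB: flatter law, wider range
```

Halving the response exponent roughly doubles the dynamic range in decibels, which is the arithmetic behind "coding information spanning several orders of magnitude" in the abstract.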

There are a lot of other nice things about this paper, which I won't get into, lest I repeat it all, but I will point out one thing: while their central qualitative results are pretty robust to small tweaks, there are some details of their model which make it a fair caricature of some excitable media, but not all. These are quite deliberately matched to properties of the olfactory system and the retina, but wouldn't work in, say, the cortex, where the dynamics of excitation are different. So this isn't an "over-universal" model, but one of particular phenomena produced by particular mechanisms. In fact, looking at olfaction, they are able to make a prediction about the effects of knocking out specific genes which are involved in the fast, symmetrical electric couplings they assume. Nobody seems to have done those experiments yet, but, at least to this non-biologist, it seems feasible, and, now, very interesting.

*: Here's an anecdote illustrating how broken academic publishing is. Kinouchi and Copelli work at the University of São Paulo, which doesn't, for reasons of economy, subscribe to Nature Physics. To get an electronic copy of their own published paper, they were forced to write correspondents at other universities. I couldn't help them, because my school doesn't feel like it can afford to subscribe to Nature Physics either.

Complexity; Minds, Brains, and Neurons

Posted at July 24, 2006 16:20 | permanent link

Three-Toed Sloth