Attention conservation notice: A post of positively Holbonic length (over 2700 words), occasioned by physicists and sociologists squabbling over the turf of studying social networks. Includes a lengthy self-quotation and defenses of the author's friends. Plus, the title is really bad. You must have something better to do than read this.
Eszter Hargittai over at Crooked Timber is a bit miffed about physicists working on social networks, the specific occasion for the indignation being a preprint on arxiv.org on social networks in the Eurovision song contest; as she points out, Kieran Healy tossed that idea out as a joke more than a year ago. She goes on from there to complain about physicists' habit of invading the field, ignoring all previous work and re-inventing sundry wheels. She posts an informative graph of the citation pattern of the small worlds literature, where physicists show up as one cohesive community (colored black) and sociologists as another (colored white), with only a handful of links between them, and closes by offering some choice recent quotes from John Scott, whom I know of as the author of a useful textbook on social network analysis, which includes a nice historical survey.
I was going to let this one pass by, but Henry Farrell, in the comments, asked me to weigh in — which isn't such a compelling reason, when there's real work to be done on deadline (no offense to Henry), but I've been meaning to vent about some of this stuff for a while, and it gives me an excuse to do so. With any luck, I'll manage to offend absolutely everyone!
First off, I agree with a lot of what Hargittai says, and with the quotes she gives from Scott. My fellow physicists, for a number of reasons, have a very bad habit of trying to take up new subject-matter and not learning what's already known about it. Some years ago, in fact, Bill Tozier and I wrote a paper about physicists' tendency to do this in the area of biological evolution, and our guess as to the mechanism. I still think that's right, so I'll indulge in the dreadful vice of public self-quotation.
Mutatis mutandis, I think the same mechanism is at work in our incursions into economics ("econophysics"), network analysis, social psychology, etc.
- A physicist runs across or concocts from whole cloth a mathematical model which is simple, neat, and contains a great many variables of the same sort.
- The physicist has heard of Darwin (1859), and may even have read Dawkins (1985) or some essays by Gould, but wouldn't know Fisher (1958), Haldane (1932) and Wright (1986) from the Three Magi, and doesn't dream that such a subject as mathematical evolutionary biology exists.
- The physicist is aware that lots of other physicists are interested in annexing biology as a province of statistical physics.
- The physicist interprets his multitude of variables as species or (if slightly more sophisticated) as genotypes, and proclaims that he has found "Darwin's Equations" (cf. Bak et al. (1994)), or, more modestly, has made an important step towards eventually finding those equations.
- His paper is submitted for review to other physicists, who are just as ignorant of biology as he, but see that it's about equivalent to the other papers on evolution by physicists. They publish it.
- The paper is read by other physicists, because at least it's not another derivation of specific heats on some convoluted lattice under a Hamiltonian named for some Central European worthy now otherwise totally forgotten. Said physicists think this is cutting-edge evolutionary theory.
- Some of those physicists will know or discover simple, neat models with lots of variables of the same type.
The thing is, the quality of that work is highly, highly variable. Most of what physicists do in all these areas is at best uninteresting and derivative, but most of what all academics do in all areas of research is at best uninteresting and derivative. My impressions, as a reader of the literature, and as a referee for a lot of physics journals, are as follows. We do some good and interesting stuff in theoretical biology, especially in some specialized corners of evolutionary theory (viral quasi-species, hypercycles) and ecology, and on allometric scaling; a lot of what we do on networks is also good, and not just a re-invention of the wheel — I'll get back to that. (Though a part of me wants to ask whether there wouldn't be a certain comfort in a genuine re-invention, in seeing that even from a completely different starting point, you wind up with the same concepts?) In the (very distinct) areas of artificial neural networks and neural coding in real animals, we're actually quite good. On the other hand, our tendency to hallucinate power laws is a disgrace (as I've written here before), and there are times when I think that the best thing which could happen to econophysics would be for someone to come along and rescue its fallen practitioners by making honest quants of them. What explains this variation, I don't know — it's obviously correlated with how well physicists know the non-physics literature, but that might not be the cause; nonetheless I'm pretty sure it's real. In many cases, its effects are annoying-to-dreadful. I don't bother to read Physica A any more, because the overwhelming majority of its papers seem to be either sound-but-boring vanilla statistical mechanics, or wrong-headed at best. There is a reason why I find myself writing posts titled "I don't know you people", and why my first faculty job is going to be in statistics, not physics (and it's not that I can't get stuff into physics journals, thanks).
(I will not repeat my speculations on the causes which are leading us to do more of this in recent years. I do now have an outline for On the Genealogy of Complexity, and if you're really interested and willing to keep its contents confidential, I'd be happy to send it to you for comments. Nor will I renew my grumbling about theoretical physicists not learning statistics, since that's tiresome and anyway I have another post about that in prep. I will point out an aspect of the division of labor, however: in physics proper, the task of comparing theoretical predictions to real-world data traditionally falls to experimentalists, or interface specialists known as "phenomenologists".)
As for networks, I should declare that while I've never published anything in the area myself (popularization doesn't count), two of my good friends and collaborators are reasonably prominent in that area, and I know quite a few others personally. (I don't speak for my friends, of course, so the blame for what follows is just mine.) So there's probably an element of "don't say my friends are dumb!" in my remarks — as in many academic discussions, of course. But I've already agreed that a huge chunk of what shows up at arxiv.org — and even what gets through peer review — is not especially good, so this isn't intended as a blanket defense of work done by physicists, or near-physicists, on networks. (I say "or near-physicists" because I imagine someone like Duncan Watts is probably counted as a physicist in the graph Hargittai posted, since his doctorate is from Cornell's department of theoretical and applied mechanics, i.e., applied math, though he's now a professor of sociology at Columbia. Actually, come to think of it, what color are Duncan's papers in that graph?)
Having now thoroughly cleared my throat, let me say what those genuine contributions are, and what they are not.
One thing I don't think the physicists can really claim as a contribution is computational data analysis on really large networks; while things like studying the collaboration networks of physics or biomedicine are impressive through sheer scale (as well as through results), something like Woody Powell et al.'s study of the American biotechnology industry is certainly in the same league. Another place where novelty can't be claimed is the idea of networks with power-law degree distributions forming through preferential attachment; Barabasi and Albert re-invented this in one of their 1999 papers on network growth, but the fundamental mechanism — multiplicative growth producing highly skewed distributions — was apparently first discovered by Herbert Simon in the 1950s ("On a Class of Skew Distribution Functions", Biometrika 42 (1955): 425--440), and applied to citation networks by Derek de Solla Price in the 1970s ("A General Theory of Bibliometric and Other Cumulative Advantage Processes", Journal of the American Society for Information Science 27 (1976): 292--306). The first people to point this out were themselves physicists. (To compare a small thing to a great one, if all western thought is a series of footnotes to Plato, then complex systems is a series of footnotes to Models of Man and The Sciences of the Artificial.)
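(For readers who haven't met the mechanism: it is simple enough to simulate in a few lines. What follows is a minimal sketch, not Simon's or Price's actual formulation — in particular, the rule "each new paper cites exactly one earlier paper, chosen with probability proportional to its citations plus one" is a simplification of Price's cumulative-advantage model, and the function name is my own invention.)

```python
import random
from collections import Counter

def cumulative_advantage(n_papers, seed=0):
    """Grow a toy citation network by cumulative advantage:
    each new paper cites one earlier paper, chosen with probability
    proportional to (citations already received + 1)."""
    rng = random.Random(seed)
    citations = [0]   # paper 0 starts with no citations
    tickets = [0]     # paper i appears (citations[i] + 1) times in this lottery
    for new in range(1, n_papers):
        target = rng.choice(tickets)  # rich-get-richer choice
        citations[target] += 1
        tickets.append(target)        # one extra ticket per citation received
        citations.append(0)
        tickets.append(new)           # the new paper's "+1" free ticket
    return citations

cites = cumulative_advantage(10_000)
dist = Counter(cites)
# Most papers end up with few or no citations, while a handful accumulate
# very many -- the highly skewed distribution Simon derived analytically.
print("max citations:", max(cites))
print("papers with 0, 1, 2 citations:", dist[0], dist[1], dist[2])
```

The point of the exercise is that nothing here is specific to the Web, or to physics collaborations: any growth process where success feeds back into the probability of further success will produce this kind of skew, which is why Simon could publish the mathematics decades before anyone talked about scale-free networks.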
(As another parenthetical remark: Price, by all accounts one of the great sociologists of science, was, as I've said before, originally a physicist. Like many of us, he seems to have always remained one at heart; he described his classic book Little Science, Big Science as an exercise in statistical mechanics: "[M]y first lecture is concerned with the volume of science, the second with the velocity distribution of its molecules, the third with the way in which the molecules interact with one another, and the fourth in deriving the political and social properties of this gas" (p. vii). Perhaps part of the difference between sociologists' reception of Price and that of later physicists, is that there were next to no sociologists studying these issues before Price!)
So where does all this leave us? Obviously, I wish physicists would bother to master the existing literature in new areas, before we start building models there. It's highly unlikely that all of the previous scholars who worked on the subject were idiotic or totally misguided — and even if they were, it's important to be able to say so with a clean conscience. As a physicist working in non-traditional areas myself, I find it both acutely embarrassing and professionally harmful when others of my tribe make dreadful howlers, or re-enact elementary discoveries. (There is a reputational externality here.) At the same time, I think it would be a shame if the offenses many physicists commit against properly scholarly procedure and etiquette were to lead others — in this case, sociologists — to dismiss our efforts completely, partly because that would be unfair to individuals, and partly because very interesting results, which seem to me to be relevant to sociology, have come from individuals whom I've heard express the most withering (and completely unjustified) contempt for that field and its practitioners. (I can't, obviously, name names.)
After some thought, I am unable to come up with a flaw in the following simple plan (which means there are probably many): if you are a physicist and find you have written a paper on topic X, send it to a journal of X-ology. If X is, by tradition, a part of physics, by all means send it to Physical Review E. If, on the other hand, X is a topic in social science, then send it to a social science journal. Only if X isn't physics, but also isn't really, or isn't just (say) an analysis of social structure, because it's also an analysis of metabolic pathways, and says something new about nonequilibrium phase transitions, and says how to get a free pony, only then does it make sense again to send it to PRE — or Nature, especially if you have a good picture of the pony. (Even then, if we had successful complex systems journals, I'd say send it there.) As precedent, I would point to the way we helped invent molecular biology, publishing not in our own journals but in things like the Journal of Biological Chemistry. If you are worried about finding a social science journal which will not reject your contributions just because of your background and approach, let me take this opportunity to plug the new Structure and Dynamics: e-Journal of Anthropological and Related Sciences. As a recently co-opted member of the editorial board, I can promise that your manuscript will receive extensive criticism from referees from both mathematical-physical and social-scientific backgrounds — though whether the net effect is to make the review process unusually well-informed or completely blockheaded is obviously not for me to say.
Update: See The Structure and Strangeness of Interdisciplinary Research for a follow-up; there will probably be more.
Manual trackback: Dubbings and Diversions; MoneyScience; Structure + Strangeness; Idiolect; hakank.blogg; Easily Distracted; Crooked Timber; Knowledge Problem; Preposterous Universe; Something Similar; Michael Nielsen; Nanopolitan
Posted at May 20, 2005 00:03 | permanent link