When you want a smart take on how the more robust findings of cognitive psychology relate to the social organization of interdisciplinary knowledge-production, where do you turn but to the Central Intelligence Agency?
I'm feeling lazy, so I'll quote extensively. (All ellipses are mine.)
Intelligence analysis, like other complex tasks, demands considerable expertise. It requires individuals who can recognize patterns in large data sets, solve complex problems, and make predictions about future behavior or events. To perform these tasks successfully, analysts must dedicate a considerable number of years to researching specific topics, processes, and geographic regions. ...The very method by which one becomes an expert explains why experts are much better at describing, explaining, performing tasks, and problem-solving within their domains than are novices, but, with a few exceptions, are worse at forecasting than actuarial tables based on historical, statistical models.
A given domain has specific heuristics for performing tasks and solving problems. These rules are a large part of what makes up expertise. In addition, experts need to acquire and store tens of thousands of cases within their domains in order to recognize patterns, generate and test hypotheses, and contribute to the collective knowledge within their fields. In other words, becoming an expert requires a significant number of years of viewing the world through the lens of one specific domain. It is the specificity that gives the expert the power to recognize patterns, perform tasks, and solve problems.
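[An interjection of mine, not Johnston's: a crude way to see both what "storing tens of thousands of cases" buys and what it costs is to model the expert as nothing more than a case library plus nearest-neighbor lookup. The features, cases, and labels below are invented purely for illustration.]

```python
# Toy model of case-based pattern recognition: an "expert" is a store of
# labeled cases plus a nearest-neighbor matcher. Not Johnston's model;
# a sketch of the general idea with made-up data.
import math

def nearest_case(case_library, observation):
    """Return the stored case closest to the observation (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(case_library, key=lambda case: dist(case["features"], observation))

# Hypothetical chemist's case library: features might be normalized levels of
# (chemical-supply imports, R&D spending, lab construction).
chemist_cases = [
    {"features": (0.9, 0.8, 0.7), "label": "agent-production pattern"},
    {"features": (0.2, 0.3, 0.1), "label": "benign industrial pattern"},
]

observation = (0.8, 0.9, 0.6)
print(nearest_case(chemist_cases, observation)["label"])
# -> "agent-production pattern". The match is only as good as the library:
# hand this matcher out-of-domain data and the "nearest" case is meaningless.
```

[That last point is the restrictive side of the specificity Johnston turns to next.]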
Paradoxically, it is this same specificity that is restrictive, narrowly focusing the expert's attention on one domain to the exclusion of others. It should come as little surprise, then, that an expert would have difficulty identifying and weighing variables in an interdisciplinary task such as forecasting an adversary's intentions. ...One obvious solution to the paradox of expertise is to assemble an interdisciplinary team. Why not simply make all problem areas or country-specific data available to a team of experts from a variety of domains? ...
Ignoring potential security issues, there are practical problems with this approach. First, each expert would have to sift through large data sets to find data specific to her expertise....
Second, during the act of scanning large data sets, the expert inevitably would be looking for data that fit within her area of expertise. Imagine a chemist who comes across data that show that a country is investing in technological infrastructure, chemical supplies, and research and development.... The chemist recognizes that these are the ingredients necessary for a nation to produce a specific chemical agent, which could have a military application or could be benign. The chemist then meshes the data with an existing pattern, stores the data as a new pattern, or ignores the data as an anomaly.
The chemist, however, has no frame of reference regarding spending trends in the country of interest. The chemist does not know if this is an increase, a decrease, or a static spending pattern—answers that the economist could supply immediately. There is no reason for the chemist to know if a country's ability to produce this chemical agent is a new phenomenon. Perhaps the country in question has been producing the chemical agent for years and these data are part of some normal pattern of behavior.
One hope is that neither expert treats the data set as an anomaly, that both report it as significant. Another hope is that each expert's analysis of the data... will come together at some point. The problem is at what point? Presumably, someone will get both of these reports somewhere along the intelligence chain. Of course, the individual who gets these reports may not be able to synthesize the information. That person is subject to the same three confounding variables described earlier: processing time, pattern bias, and heuristic bias. Rather than solving the paradox of expertise, the problem has merely been shifted to someone else in the organization.
In order to avoid shifting the problem from one expert to another, an actual collaborative team could be built. Why not explicitly put the economist and the chemist together to work on analyzing data? The utilitarian problems with this strategy are obvious. Not all economic problems are chemical and not all chemical problems are economic. Each expert would waste an inordinate amount of time. Perhaps one case in one hundred would be applicable to both experts; during the rest of the day, the experts would drift back to their individual domains, in part because that is what they are best at and in part just to stay busy.
Closer to the real world, the same example may also have social, political, historical, and cultural aspects.... In order for collaboration to work, each team would have to have experts from many domains working together on the same data set.
Successful teams have very specific organizational and structural requirements.... Effective teams require cohesion, formal and informal communication, cooperation, and shared mental models, or similar knowledge structures. While cohesion, communication, and cooperation might be facilitated by specific work practices, creating shared mental models, or similar knowledge structures, is not a trivial task. Creating shared mental models may be possible with an air crew or a tank crew, where an individual's role is clearly identifiable as part of a larger team effort—like landing a plane or acquiring and firing on a target. Creating shared mental models in an intelligence team is less likely, given the vague nature of the goals, the enormity of the task, and the diversity of individual expertise. Moreover, the larger the number of team members, the more difficult it is to generate cohesion, communication, and cooperation. Heterogeneity can also be a challenge: It has a positive effect on generating diverse viewpoints within a team, but requires more organizational structure than does a homogeneous team.
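[Another aside of mine: the size point is easy to quantify. A fully connected team of n members has n(n-1)/2 pairwise channels to maintain, so a four-person tank crew has 6, while a twelve-person interdisciplinary team has 66: eleven times the coordination overhead for three times the headcount.]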
Without specific processes, organizing principles, and operational structures, interdisciplinary teams will quickly revert to being just a room full of experts who ultimately drift back to their previous work patterns. That is, the experts will not be a team at all; they will be a group of experts individually working in some general problem space. ...Intelligence analysis uses a wide variety of expertise to address a multivariate and complex world. Each expert uses his or her own heuristics to address a small portion of that world. Intelligence professionals have the perception that somehow all of that disparate analysis will come together at some point, either at the analytic team level, through the reporting hierarchy, or through some computational aggregation.
The intelligence analyst is affected by the same confounding variables that affect every other expert: processing time, pattern bias, and heuristic bias. This is the crux of the paradox of expertise. Domain experts are needed for describing, explaining, and problem solving; yet, they are not especially good at forecasting because the patterns they recognize are limited to their specific fields of study. They inevitably look at the world through the lens of their own domain's heuristics.
What is needed to overcome the paradox of expertise is a combined approach that includes formal thematic teams with structured organizational principles; technological systems designed with significant input from domain experts; and a cadre of analytic methodologists. Intelligence agencies continue to experiment with the right composition, structure, and organization of analytic teams; they budget significant resources for technological solutions; but comparatively little is being done to advance methodological science.
Advances in methodology are primarily left to the individual domains. But relying on the separate domains risks falling into the same paradoxical trap that currently exists. What is needed is an intelligence-centric approach to methodology, an approach that will include the methods and procedures of many domains and the development of heuristics and techniques unique to intelligence. In short, intelligence analysis needs its own analytic heuristics designed, developed, and tested by professional analytic methodologists. This will require using methodologists from a variety of other domains and professional associations at first, but, in time, the discipline of analytic methodology will mature into its own sub-discipline with its own measures of validity and reliability.
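Since Johnston ends on "measures of validity and reliability", it may help to make that concrete. Here is a minimal sketch of one standard reliability measure, Cohen's kappa, which scores agreement between two raters after discounting the agreement expected by chance; the analysts and ratings below are invented for illustration, not taken from the paper.

```python
# Cohen's kappa: observed agreement between two raters, corrected for chance.
# Invented data; a sketch of one reliability measure, not Johnston's method.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    # Chance agreement: probability both raters pick the same category at random.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two analysts independently code ten reports as Threat ("T") or No-threat ("N"):
a = ["T", "T", "N", "T", "N", "N", "T", "T", "N", "T"]
b = ["T", "N", "N", "T", "N", "T", "T", "T", "N", "T"]
print(round(cohens_kappa(a, b), 2))  # 0.58: moderate agreement beyond chance
```

Persistently low kappa among analysts coding the same reports would be exactly the sort of signal a professionalized analytic methodology could detect and act on.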
I have to say it's a bit obscure to me how Johnston thinks the development of intelligence-specific methods will rectify the central problem he diagnoses. (He might just mean that it can't possibly be fixed without such methodology.) That said, the whole thing's well worth reading, especially if you're interested in the earlier discussions of heuristic diversity or interdisciplinary science. According to this, Johnston, a post-doc at the CIA's Center for the Study of Intelligence, is by training an anthropologist, and has a forthcoming book (based on his dissertation?) titled The Culture of Analytic Tradecraft: An Ethnography of the Intelligence Community, which I'd now like to read...
The archive of declassified Studies in Intelligence articles, 1955–1976, has a lot of interesting stuff in it too, though the transcription into HTML is occasionally shaky, and it's not convenient to link directly to articles.
Update, 25 August: Henry Farrell writes to point to a forthcoming paper in Studies in Intelligence, D. Calvin Andrus's "The Wiki and the Blog: Toward a Complex Adaptive Intelligence Community". I haven't had a chance to read it yet, but it might be worthwhile. And, yes, this post was missing for a few days. I could tell you what happened, but then I'd have to...
(Profuse thanks to K. for pointing out Johnston's paper and discussing it with me.)
Posted at August 19, 2005 20:15 | permanent link