Think different, CIA
One of the biggest challenges for American intelligence? The way the brain works.
By Robert Jervis | January 17, 2010

What’s wrong with American intelligence? That question became tragically urgent at the end of last year, first with the failed attempt to blow up Northwest Flight 253, and then the deadly suicide bombing that killed seven CIA officers in eastern Afghanistan. These events put intelligence at the top of the national agenda and have been followed, predictably, by an outcry that our intelligence system needs to be overhauled. Leaders and critics, from the president on down, are calling for a host of solutions: more people on no-fly lists, tighter control of visas, more thorough airport screening, better tracking of suspects. In sum, the thinking goes, we need to gather more information, then work harder to connect the dots.

Those impulses are understandable, but they miss the most important problem. From studying many individual cases, and conducting detailed post-mortems of US intelligence failures in the cases of Iraqi weapons of mass destruction and the 1979 Iranian Revolution, I have found that many common assumptions about why our intelligence fails are misguided. The problems with our intelligence system aren’t primarily problems with information. They are problems with how we think.

To try to build a perfect intelligence system - one that will never miss a terrorist, that intercepts every dangerous e-mail - is an impossibility. We face skilled adversaries who are trying to deceive us. Hiding is often easier than finding. If people are often surprised to discover that their own spouses have been cheating on them, how could we expect governments to have a perfect understanding of what others are doing? The problem isn’t usually - or at least isn’t only - too little information, but too much, most of it ambiguous, contradictory, or misleading. The blackboard is filled with dots, many of them false, and they can be connected in innumerable ways. Only with hindsight does the correct pattern leap out at us, and to fix what “broke” the last time around only guarantees you have solved yesterday’s problem.

Far more important, and useful, is to address the flaws in how we interpret and use the intelligence that we already gather. Intelligence analysts are human beings, and many of their failures follow from intuitive ways of thinking that, while allowing the human mind to cut through reams of confusing information, often end up misleading us. This isn’t a problem that occurs only with spying. It is central to how we make sense of our everyday lives, and how we reach decisions based on the imperfect information we have in our hands. And the best way to fix it is to craft policies, institutions, and analytical habits that can compensate for our very understandable flaws.

Humans have survived and evolved by being able to quickly make sense of the massive amounts of information that our eyes and ears receive. We excel at perceiving patterns and making up stories that bring the coherence that we need in order to act. Organizations do the same things on a larger scale. This isn’t a bad thing - without powerful filtering mechanisms and shortcuts, we would be lost. But these intuitive ways of thinking also lead us into traps.

The first and most important tendency is that our minds are prone to see patterns and meaning in our world quite quickly, and then tend to ignore information that might disprove them. Premature cognitive closure, to use the phrase employed by psychologists, lies behind many intelligence failures.
For example, in the summer of 2001, the United States gained an intelligence windfall when it intercepted a shipment of aluminum tubes bound for Iraq. It is now clear that they were to be used in rockets, but in part because the first CIA official to analyze them was an expert in centrifuges, he and the organization immediately concluded that they were designed for uranium enrichment - and thus Saddam must have been actively pursuing nuclear weapons. This “finding” remained a centerpiece of intelligence arguments until the post-war survey discovered the truth. If the pattern hadn’t imposed itself so quickly, and the CIA had been able to keep an open mind about the meaning of those tubes, analysts might have made better use of other information that pointed to different conclusions.
Second, people pay more attention to visible information than to information generated by an absence. In a famous Arthur Conan Doyle story, it took the extraordinary skill of Sherlock Holmes to see that an important clue in the case was a dog not barking. The equivalent, in the intelligence world, is information that should be there but is not. In another example from Iraq, in 2002 the CIA made vigorous efforts to uncover evidence of WMD programs, interviewing informed Iraqis outside the country and even sending people into Iraq to talk to their relatives who were scientists and technicians. No credible reports came in to confirm that Saddam was working on weapons of mass destruction, and so the analysts felt they had nothing to report. But they were missing another kind of evidence they were generating: If Iraq really did have active WMD programs, then such extensive interviews should have provided at least some confirming evidence. Its absence, they should have realized, was important information in itself.

Third, conclusions often rest on assumptions that are not readily testable, and may even be immune to disproof. Intelligence noted that the evidence for Iraq’s WMD programs was only scattered and sketchy, but assumed that this was the result of an extensive deception program by Saddam. This was not an unreasonable assumption - Iraq did engage in a lot of deception - but this line of reasoning also made the programs’ existence impossible to disprove. Neither analysts nor policy makers understood the extent to which their views rested on a belief that no new information could dislodge.

Another common error is to believe that the adversary sees the world as you do - a failing that is especially strong when the adversary’s beliefs are strange or extreme, but can also happen when the adversary has tactical goals that you hadn’t expected. Saddam’s view that it was more important to bluff Iran by pretending to have active WMD programs than to avoid an American invasion by coming clean was bizarre, and no observers seem to have grasped it.

In other cases, intelligence believes that an adversary must be stymied because it has no obvious path forward - but this very situation gives the adversary incentives to seek alternative strategies, and intelligence is often slow to appreciate this. The United States was taken by surprise at Pearl Harbor partly because it knew that Japan understood that it could not win an all-out war - and overlooked the possibility that Japan would therefore try a different approach: striking a blow so sharp that the United States would be willing to suffer a limited defeat and accept a negotiated settlement. Israel was similarly surprised when Egypt and Syria attacked in October 1973 - Israeli intelligence knew that Egypt could not throw Israel out of the Sinai Peninsula. What it missed was that instead of being deterred by the strength of the Israeli army, Egypt’s president, Anwar Sadat, decided to seek a more limited military victory that would shake up Israel, bring in the United States, and lead to fruitful negotiations.

Our minds are, then, very good at forming a coherent picture, but less good at challenging it, questioning its assumptions, and coming up with alternative explanations. We are quick and often assured, but we are not self-correcting. If there are flaws in the way that we think, then gathering more and more information isn’t a solution.
What our intelligence system really needs is ways to avoid becoming trapped by the natural tendency to leap to conclusions and stick with them. This is true in other fields as well, which is why so much of professional and scientific training is designed to reduce the errors made by fallible people using weak information.

If individuals cannot avoid jumping to conclusions, there are ways for organizations to make up for this. They can systematically solicit the views of people with different perspectives, for example, or use devil’s advocates who will challenge established views. To compensate for the tendency to rely on implicit understandings, intelligence analysts can be pushed to fully explain their reasoning, allowing others, if not themselves, to probe the assumptions that often play a large and unacknowledged role in their conclusions.

To better recognize the significance of absences, analysts can learn to think explicitly about what evidence should be appearing if their beliefs are correct. Gaps do not automatically mean that the established ideas are wrong, but they may signal a flaw in the prevailing thesis. Analysts can also be trained to consider, explicitly, what evidence could lead them to change their minds - not only alerting themselves to the possibility that the necessary information might be missing, but also providing an avenue for others to find evidence that might overturn established views. Analysts should think more broadly and imaginatively about how adversaries are likely to respond, especially when it appears as though they have few alternatives and may be pushed into tactically surprising acts.

As the inevitable changes to our intelligence system are debated and put into place, we need - all of us, from voters to CIA analysts to the president - to avoid the easy temptation to assume the future will be just like the past.
One of the failures of intelligence before the 1991 Gulf War was that we underestimated the Iraqi WMD programs. Twelve years later, overestimating those programs didn’t fix the problem; it only created new ones. The reality is that we have to learn to live with errors - just not as many as we have been committing.

Robert Jervis is a professor of international politics at Columbia University and a consultant to the intelligence community. His book, “Why Intelligence Fails: Lessons from the Iranian Revolution and the Iraq War,” will be published by Cornell University Press in March.