Book excerpt: Thinking, Fast and Slow

The new book by Nobel Laureate Daniel Kahneman, shown here at Princeton in 2002, is called Thinking, Fast and Slow. (Brian Branch-Price/The Associated Press)

Everything makes sense in hindsight, a fact that financial pundits exploit every evening as they offer convincing accounts of the day's events. And we cannot suppress the powerful intuition that what makes sense in hindsight today was predictable yesterday. The illusion that we understand the past fosters overconfidence in our ability to predict the future.

We think that we should be able to explain the past by focusing on either large social movements and cultural and technological developments or the intentions and abilities of a few great men. The idea that large historical events are determined by luck is profoundly shocking, although it is demonstrably true.

It is hard to think of the history of the 20th century, including its large social movements, without bringing in the role of Hitler, Stalin and Mao Zedong. But there was a moment in time, just before an egg was fertilized, when there was a 50-50 chance that the embryo that became Hitler could have been a female. Compounding the three events, there was a probability of one-eighth of a 20th century without any of the three great villains, and it is impossible to argue that history would have been roughly the same in their absence.
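
The one-in-eight figure is simple compounding: treating the three fertilizations as independent fifty-fifty events, as the passage does, the chance that all three villains fail to appear is

\[
P(\text{no Hitler, no Stalin, no Mao}) = \tfrac{1}{2} \times \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{8}.
\]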

The fertilization of these three eggs had momentous consequences, and it makes a joke of the idea that long-term developments are predictable.

Yet the illusion of valid prediction remains intact, a fact that is exploited by people whose business is prediction – not only financial experts but pundits in business and politics too. Television and radio stations and newspapers have their panels of experts whose job it is to comment on the recent past and foretell the future. Viewers and readers have the impression that they are receiving information that is somehow privileged, or at least extremely insightful. And there is no doubt that the pundits and their promoters genuinely believe they are offering such information.

Philip Tetlock, a psychologist at the University of Pennsylvania, explained these so-called expert predictions in a landmark 20-year study, which he published in his 2005 book Expert Political Judgment: How Good Is It? How Can We Know?

Tetlock interviewed 284 people who made their living "commenting or offering advice on political and economic trends." He asked them to assess the probabilities that certain events would occur, both in areas of the world in which they specialized and in regions about which they had less knowledge. Would Mikhail Gorbachev be ousted in a coup? Would the United States go to war in the Persian Gulf? Which country would become the next big emerging market? In all, Tetlock gathered more than 80,000 predictions. He also asked the experts how they reached their conclusions, how they reacted when proved wrong and how they evaluated evidence that did not support their positions.

Respondents were asked to rate the probabilities of three alternative outcomes in every case: the persistence of the status quo, more of something such as political freedom or economic growth, or less of that thing.

The results were devastating. The experts performed worse than they would have if they had simply assigned equal probabilities to each of the three potential outcomes. In other words, people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options. Even in the region they knew best, experts were not significantly better than non-specialists.
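
To make the chance baseline concrete: Tetlock scored probability forecasts with a quadratic (Brier-style) rule; the excerpt does not name the rule, so take the specifics here as an illustrative assumption. Under such a rule, the uniform forecast over three outcomes earns the same score no matter which outcome occurs:

\[
\text{Brier}(p) = \sum_{i=1}^{3} (p_i - o_i)^2, \qquad
\text{Brier}\!\left(\tfrac{1}{3},\tfrac{1}{3},\tfrac{1}{3}\right) = \left(1 - \tfrac{1}{3}\right)^2 + 2\left(0 - \tfrac{1}{3}\right)^2 = \tfrac{2}{3},
\]

where \(o\) is the indicator vector of the realized outcome and lower scores are better. Performing worse than chance means the experts' average score exceeded this fixed baseline.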

Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes overconfident.

"In this age of academic hyperspecialization, there is no reason for supposing that contributors to top journals, distinguished political scientists, area study specialists, economists, and so on – are any better than journalists or attentive readers of The New York Times in 'reading' emerging situations," Tetlock writes.

Tetlock also found that experts resisted admitting that they had been wrong, and when they were compelled to admit error, they had a large collection of excuses.

Experts are led astray not by what they believe, but by how they think, says Tetlock. He uses the terminology from Isaiah Berlin's essay on Tolstoy, The Hedgehog and the Fox. Hedgehogs "know one big thing" and have a theory about the world; they account for particular events within a coherent framework, bristle with impatience toward those who don't see things their way, and are confident in their forecasts. They are also especially reluctant to admit error. For hedgehogs, a failed prediction is almost always "off only on timing" or "very nearly right." They are opinionated and clear, which is exactly what television producers love to see on programs. Two opposing hedgehogs make for a good show.

Foxes, by contrast, are complex thinkers. They don't believe that one big thing drives the march of history. Instead, the foxes recognize that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes.

It was the foxes who scored best in Tetlock's study, although their performance was still very poor. And they are less likely than hedgehogs to be invited to participate in television debates.

Excerpted from Thinking, Fast and Slow. Copyright © 2011 Daniel Kahneman. Published by Doubleday Canada, an imprint of the Doubleday Canada Publishing Group, which is a division of Random House of Canada Limited. Reproduced by arrangement with the Publisher. All rights reserved.
