My fellow oncologists may find it strange to see a review of a book written by a psychologist who won a Nobel Prize in Economics for the work on which it is based. They should not. Thinking, Fast and Slow is an illuminating and profound text that provides fundamental insights into how we think.
In the 1970s, Daniel Kahneman and his colleague Amos Tversky began to explore the supposed rationality of economic decision-making. Kahneman and Tversky, who would certainly have shared the prize were it not for Tversky's untimely death, began to identify what they termed "cognitive biases": consistent errors in reasoning that result in warped conclusions. Out of this work grew their comprehensive analysis of how we reason. Two fundamentally different systems are in play, which they called simply System 1 and System 2.
System 1 is fast, effortless, works on intuition, jumps to conclusions, constructs plausible stories from whatever data is available, and cannot be turned off. It operates automatically and without any self-awareness or control. Presumably it has developed evolutionarily as a way for us to instantaneously assess our environment and recognize both risks and opportunities.
Although we would all like to believe otherwise, System 1 is the dominant system that controls most of our decisions. For the most part it works well, but it is responsible for many of our irrational conclusions. As Kahneman says, it is the "secret author of many of the choices and judgments you make."
In contrast is System 2. It is slow, deliberate, and cautious. It requires focus, concentration, and effort. It takes over when things become difficult. But it is hard work, and it tends to tire easily. As a result, System 2 frequently defers to System 1, sometimes resulting in unsound decisions.
Kahneman presents many compelling examples of these two systems in the text. One of the better examples of System 1 relates to the well-known Müller-Lyer optical illusion: two horizontal lines of equal length, one flanked by inward-pointing fins and the other by outward-pointing fins.
Using System 1 you look at the figure, make no measurements, and conclude instantly and confidently that the lower line is longer. However, if you measure the lines (using System 2), they are the same length. But amazingly, no matter how many times you return to the figure, System 1 always tells you the lower line is longer. System 1 cannot be turned off.
In another example in the book, System 2 is well illustrated by calculating, say, 13 x 27. Here a quick intuitive answer is not forthcoming. This problem requires your attention and singular focus. You stop everything else, use your basic mathematics, your pupils dilate, and you work on this particular problem until the task is completed. It takes much more effort than using System 1.
But although we believe that this is the system we most often use to make decisions, it is not. As Kahneman says: "System 2 is like a supporting character who believes herself to be the lead actor." A simple thought experiment will illustrate: "A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?" Our quick System 1 answer is 10 cents. However, if we pause, reason, and use System 2, we see that the correct answer is 5 cents. Unfortunately we prefer to use the easy System 1, and as Kahneman notes, over 50 percent of students at Harvard, MIT, and Princeton gave the intuitive and incorrect answer.
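The trap is easy to dissect with a line of algebra. Here is a minimal check, my own sketch rather than anything from the book: if the ball costs x, the two conditions give x + (x + 1.00) = 1.10, so x = 0.05.

```python
# Bat-and-ball check: together they cost $1.10,
# and the bat costs $1.00 more than the ball.
TOTAL, DIFFERENCE = 1.10, 1.00

# The intuitive System 1 answer fails the total (it sums to $1.20):
intuitive_ball = 0.10
assert intuitive_ball + (intuitive_ball + DIFFERENCE) != TOTAL

# Solving ball + (ball + DIFFERENCE) == TOTAL gives the System 2 answer:
ball = (TOTAL - DIFFERENCE) / 2
bat = ball + DIFFERENCE
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The point of writing it out is that the correct answer takes one deliberate step of algebra, whereas the wrong answer arrives with no effort at all.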
Kahneman is quick to point out that the two systems do not reside in any specific portion of the brain. They cannot be identified or localized by a PET scan. They are what he calls "useful fictions" that describe how we think.
Well, if they are useful fictions, are they credible? Through an extensive series of examples, thought experiments, and personal stories, Kahneman builds a powerful case for the two models. His experiments deviate from what we would generally regard as classical and definitive clinical trials. There are no randomized groups with substantial numbers of comparable and well-balanced representative subjects. Most of his studies are small thought experiments using small numbers of students, hardly the design we would find acceptable in medicine.
Why, then, do they seem so persuasive? They are convincing because the reader can engage in the same experiments, and at least this reader was frequently caught in the same "cognitive biases" as the participants in the studies. The work becomes compelling because it can be self-validated by the reader.
Example Using Kidney Cancer
Although we medical professionals would like to believe, like the economists, that we are rational and driven by System 2, we succumb to all of the errors generated by System 1. Kahneman illustrates this with an example using kidney cancer.
Fact: Counties with the lowest incidence of kidney cancer are mostly rural, sparsely populated, and located in the Midwest, South, and rural parts of the West. What is the explanation?
System 2 searches our memory bank briefly for useful data and leaves the task to System 1, which immediately postulates a plausible cause and concludes that the low incidence results from a healthy lifestyle and a safer environment. In actuality, though, the small populations of these counties neither cause nor prevent cancer; they merely allow wide variations in incidence as a result of small numbers. There is no cause for the low incidence to be found.
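The underlying statistics can be sketched with a toy simulation. This is my own construction, not from the book, and the county populations and the uniform incidence rate are arbitrary assumptions: give every county exactly the same true rate and vary only the population size.

```python
import math
import random

random.seed(0)
TRUE_RATE = 0.0001  # hypothetical uniform incidence, identical for every county

def poisson(lam):
    """Sample a Poisson-distributed case count (Knuth's algorithm); a good
    approximation for rare events in a finite population."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def observed_rate(population):
    # Expected cases = population * TRUE_RATE; observed rate = cases / population.
    return poisson(population * TRUE_RATE) / population

small = [observed_rate(1_000) for _ in range(500)]    # sparsely populated counties
large = [observed_rate(100_000) for _ in range(500)]  # populous counties

# Same true rate everywhere, yet the small counties show the most extreme
# observed rates (high and low alike) from sampling noise alone.
print(max(small), max(large))
print(min(small), min(large))
```

Running a sketch like this, the small counties span observed rates from zero to several times the true rate, while the large counties cluster tightly around it; the "healthy rural lifestyle" story is System 1 narrating pure noise.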
The flaws in reasoning caused by System 1 and its evolutionary link to self-protection can be seen in other areas as well. As we saw in The Signal and the Noise, reviewed earlier (OT 9/25/15 issue), the public perception of risk is unduly influenced by dramatic events and grotesque imagery. Kahneman agrees, noting that death from disease is 18 times more common than death by accident, yet the public judges the two to be equally likely.
One of the nice touches of the book for science wonks is the reprinting of the two famous papers written with Tversky: "Judgment under Uncertainty: Heuristics and Biases," published in Science, and "Choices, Values, and Frames," published in American Psychologist. The first illustrates how decisions made under uncertainty fail to conform to normal economic models, and the second presents an alternative model called "prospect theory." These two papers provide the nucleus of the work for which Daniel Kahneman was awarded the Nobel Prize in Economic Sciences in 2002.
Thinking, Fast and Slow is a beautifully written and lucid text. It is deftly presented, richly embellished, convincingly analyzed, and full of provocative and entertaining experiments.
You will finish the book filled with novel insights but chastened by the "systematic errors in the thinking of normal people" and more skeptical of the overconfidence generated by your slick and dominating System 1.
FARRAR, STRAUS AND GIROUX, ISBN 0374533555, AVAILABLE IN HARDCOVER, PAPERBACK, KINDLE, AND AUDIO EDITIONS
Find the full collection of Bob Young's OT book reviews online: bit.ly/OTCollections-Books