Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.
Which alternative is more probable?
a) Linda is a bank teller.
b) Linda is a bank teller and is active in the feminist movement.
If you answered b), then you have fallen victim to the conjunction fallacy. Answer b) may have an air of plausibility about it, but it is a clear violation of the laws of probability – every feminist bank teller is a bank teller; if you add a detail, you can only lower the probability. However, if you did get the wrong answer, you are in good company: 85% of students at Stanford’s Graduate School of Business got it wrong. This is the academic crème de la crème of the United States. What’s more, they must have had extensive training in probability. I tried it on one of my classes and two out of three got it wrong.
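The rule at stake can be checked mechanically: for any two events, P(A and B) can never exceed P(A). Here is a minimal sketch in Python – the 5% and 30% proportions are invented purely for illustration, but whatever numbers you pick, the conjunction never wins.

```python
import random

random.seed(0)

# Hypothetical population: each person is (is_teller, is_feminist).
# The 5% and 30% proportions are invented for illustration.
population = [(random.random() < 0.05, random.random() < 0.30)
              for _ in range(100_000)]

tellers = sum(1 for teller, _ in population if teller)
feminist_tellers = sum(1 for teller, feminist in population
                       if teller and feminist)

# Every feminist bank teller is also a bank teller, so the
# conjunction can never be the larger (more probable) group.
assert feminist_tellers <= tellers
print(tellers, feminist_tellers)
```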
Welcome to the world of Daniel Kahneman, an Israeli-American psychologist who won the 2002 Nobel Memorial Prize in Economic Sciences. The 79-year-old is famous for his work with Amos Tversky on the cognitive basis for common human errors which arise from heuristics and biases. He is currently a psychology professor at Princeton.
Kahneman was born in Tel Aviv in 1934, where his mother was visiting relatives. However, he spent his childhood years in Paris, to which his parents had emigrated from Lithuania in the early 1920s. Indeed, they were in Paris when it was occupied by Nazi Germany in 1940. They all managed to survive the war except his father, who died of diabetes when Kahneman was ten. The family moved to British Mandatory Palestine in 1948, just prior to Israel’s independence.
Today I want to look at his most recent book, Thinking, Fast and Slow, which came out a couple of years ago. I have recently finished reading it and wanted to share some of its insights with you.
How we think
In the book’s first section, Kahneman describes the two different ways the brain forms thoughts. System One operates automatically and quickly on a more emotional level, whereas System Two is slower, more calculating and conscious, working on a more rational level. These systems do not reside in specific areas of the brain; it is more accurate to think of the pair as a metaphor for the way we think. For example, 2 x 2 = 4 is System One, while 23 x 35 = 805 would be System Two. Both are necessary – problems occur when you use the wrong system. In particular, we have a tendency to use the error-prone System One instead of the more reliable System Two. Why is System One in charge? According to Kahneman it is because System Two is lazy; activating System Two is costly in time and also in calories. Thinking is hard work, and so we try to economise on thinking.
Now I am going to look at what can go wrong when System One is not up to the job:
Kahneman coined the acronym WYSIATI, which stands for what you see is all there is. It describes how, when we are forming a hypothesis, we take it for granted that we have enough information. Margaret Thatcher used to say that she could judge a person within just ten seconds of meeting them. I have a colleague to thank for this wonderful anecdote about the rugby player Victor Ubogu. The former England prop had the following exchange with the Bath coach Jack Rowell:
VU: “Why do people take an instant dislike to me?”
JR: “Because it saves time.”
You do indeed save time. Taking these short cuts enables us to think fast and make sense of partial information in a complex world. Much of the time what we perceive is close enough to reality to allow us to make reasonable decisions. But it can lead to terrible mistakes.
The Priming Effect
This is when exposure to a stimulus influences the response to a later stimulus. The most curious example is an experiment by John Bargh, a psychologist at New York University. One group of college students was asked to arrange brief sentences from sets of words including Florida, forgetful, bald, grey or wrinkle. The other group was given sets containing none of these words. On completing the task, the students were told to walk down the corridor to another room, and the experimenters recorded the time they took to cover this short distance. Surprisingly, the students in the first group walked more slowly than those in the second. This has been dubbed “the Florida effect”: the unconscious association with terms commonly linked to old age actually affected the students’ walking pace.
The Halo Effect
This is the belief that because people are good at doing A they will be good at doing B, C and D. This can apply to both positive and negative attributions. Imagine that we meet a woman called Joan at a party and find her friendly and easy to talk to. What we tend to do then is extrapolate this information and conclude that Joan would be generous if a charity asked her to contribute, even though we know virtually nothing about her generosity or lack thereof. And now, believing Joan to be generous, we probably like her even more than we did before, because we have added generosity to her appealing traits.
The Anchoring Effect
This refers to the human tendency to rely too heavily on the first piece of information offered – the “anchor” – when making decisions. This information can have absolutely nothing to do with the question in hand. My favourite example from the book involves some German judges with more than fifteen years of experience on the bench. They were first provided with a description of a woman who had been caught shoplifting. They then rolled a pair of dice that were loaded so that every roll would result in either a three or a nine. The judges were asked whether they would sentence the woman to a prison term, in months, greater or lesser than the number showing on the dice. Finally, they were asked to give their specific sentence; the judges who had rolled a nine proved stricter – their average sentence was eight months, whereas the average of those who had rolled a three was just five months.
The book does make you a bit sceptical about the judicial system. If you ever end up in jail, you would definitely want to have your parole hearing straight after lunch. Kahneman cites a study showing that prisoners’ chances of parole reached about 65 per cent just after the judges had eaten a meal, but by the time the next meal was due they were close to zero. Kahneman’s explanation is rather sobering: judges who were tired and hungry were less likely to grant parole because they didn’t have the energy to make decisions.
The Framing Effect
This is the way in which people react differently to a particular choice depending on whether it is presented as a loss or as a gain. In one experiment subjects were asked whether they would opt for surgery if the “survival” rate was 90%, while others were told that the mortality rate was 10%. The first framing increased acceptance, even though the probability was no different.
Kahneman puts our difficulties with statistics under the microscope. A group of researchers were investigating what made a successful school. They found that smaller schools tended to have more impressive results. For instance, of 1,662 schools in Pennsylvania, six of the top fifty were small – four times as many as there should have been. This led the researchers to conclude that smaller schools are better than larger ones. However, it was the smaller number of pupils at these schools which skewed the numbers. If the statisticians had asked about the characteristics of the worst schools, they would have found that bad schools also tend to be smaller than average. The truth is that small schools are not better on average; they are simply more variable. My favourite example of how a small sample size can skew data is the average starting salary of students of Cultural Geography at the University of North Carolina – it was well over $100,000. This might sound like the perfect degree until you realise that one of the students was Michael Jordan.
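The school result is worth demystifying with a quick simulation. In the sketch below every pupil, at every school, is drawn from exactly the same ability distribution and schools differ only in size – the sizes and score scale are made up – yet the small schools still produce both the best and the worst averages.

```python
import random

random.seed(1)

def school_mean(n_pupils):
    # Every pupil, at every school, is drawn from the SAME
    # distribution (mean 50, sd 10); schools differ only in size.
    return sum(random.gauss(50, 10) for _ in range(n_pupils)) / n_pupils

small_schools = [school_mean(20) for _ in range(500)]    # invented sizes
large_schools = [school_mean(500) for _ in range(500)]

# Small-school averages are far more spread out, so small schools
# dominate both the top and the bottom of any league table.
spread_small = max(small_schools) - min(small_schools)
spread_large = max(large_schools) - min(large_schools)
print(round(spread_small, 1), round(spread_large, 1))
```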
Kahneman criticises the tendency of the business media to lionise successful companies and their CEOs. A study of Fortune’s Most Admired Companies found that over a twenty-year period, the firms with the worst ratings went on to earn much higher stock returns than the most admired firms. In statistics this is known as regression to the mean, the tendency of outstanding but lucky results to return to statistical norms over time. CEOs’ actions can make a difference, but the statistics reveal that their effect is much less than the business media would have us believe. Being generous, the correlation between the success of the firm and the quality of its CEO might be as high as .30. Thus, if you were to compare two firms, a correlation of .30 implies that you would find the stronger CEO leading the stronger firm in about 60% of the pairs, which is just 10% better than tossing a coin. Does this really justify the hero worship of CEOs we so often see?
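That 60 per cent figure is easy to verify. Under a standard modelling assumption – CEO quality and firm success jointly normal with correlation 0.30, which is my sketch rather than anything from the book – the share of pairs in which the stronger CEO leads the stronger firm comes out just under 0.6.

```python
import math
import random

random.seed(2)

R = 0.30  # assumed correlation between CEO quality and firm success

def ceo_and_firm():
    # One (CEO quality, firm success) draw from a bivariate normal
    # with correlation R.
    ceo = random.gauss(0, 1)
    firm = R * ceo + math.sqrt(1 - R * R) * random.gauss(0, 1)
    return ceo, firm

trials = 200_000
concordant = 0
for _ in range(trials):
    ceo_a, firm_a = ceo_and_firm()
    ceo_b, firm_b = ceo_and_firm()
    # Does the stronger CEO lead the stronger firm in this pair?
    if (ceo_a > ceo_b) == (firm_a > firm_b):
        concordant += 1

share = concordant / trials
# The closed form for this model is 1/2 + arcsin(R)/pi, roughly 0.597.
print(round(share, 3))
```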
In economics and decision theory, loss aversion refers to people’s tendency to strongly prefer avoiding losses to acquiring gains. Some studies suggest that losses are twice as powerful, psychologically, as gains. Kahneman and Tversky were the first to demonstrate this bias, proposing it as an explanation for the endowment effect – the fact that people place a higher value on a good that they own than on an identical good that they do not own. Loss aversion is also at play when it comes to the difficulty we sometimes experience in cutting our losses. This is because cutting our losses – though it will be the best option for avoiding bigger losses in the long run – entails taking a hit in the moment, which is tough for System One. The thought of accepting the large sure loss is too painful, and the hope of turning it around too appealing.
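The “twice as powerful” figure has a standard functional form. Tversky and Kahneman’s 1992 follow-up paper estimated a loss-aversion coefficient of about 2.25; the sketch below uses those published median parameters, though estimates vary across studies.

```python
def value(x, alpha=0.88, lam=2.25):
    # Prospect-theory value function with Tversky & Kahneman's (1992)
    # median parameter estimates: diminishing sensitivity (alpha)
    # and loss aversion (lam).
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A loss of 100 units weighs about 2.25 times a gain of 100 units.
print(round(value(100), 1), round(value(-100), 1))
```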
I think that it’s a fascinating read. Kahneman has said that his work is a challenge to libertarianism. That may be the case, but government bureaucracies are run by humans too, and they will have their own biases. I may look at that question next week. Kahneman himself is sceptical about his ability to make better decisions:
“I have made much more progress in recognizing the errors of others than my own.”
I am also reminded of one of Nassim Nicholas Taleb’s aphorisms:
“The characteristic feature of the loser is to bemoan, in general terms, mankind’s flaws, biases, contradictions, and irrationality without exploiting them for fun and profit.”