
Thinking Fast and Slow Part III: Overconfidence


Part III covers overconfidence, drawing on Nassim Taleb's The Black Swan. We tend to think we understand things after the fact, but our understanding frequently consists of made-up explanations for random fluctuations (narrative fallacies). This contributes to the feeling that our explanations and our expertise are valid.

Simple mathematical formulas are often able to consistently outperform experts, because even experts tend to overweight the more salient factors (the availability heuristic) instead of giving more weight to the more important ones. Expert intuition can be trusted when there is ample room for practice and clear, continual feedback (as in chess). Confidence is a poor indicator of accuracy, but underconfidence may indicate inaccuracy, since people tend to be overconfident.

To help avoid overconfidence, the “outside view” is a useful tool: instead of looking solely at one’s own case, factor in the outcomes of similar cases. This can help remedy the planning fallacy. Finally, start-up businesses suffer from overconfidence for many reasons: neglecting similarly motivated competitors, neglecting base rates, and ignoring unforeseen challenges.

Kindle Notes:

Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative. Taleb suggests that we humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true (3340).

After Nixon’s return from his travels, Fischhoff and Beyth asked the same people to recall the probability that they had originally assigned to each of the fifteen possible outcomes. The results were clear. If an event had actually occurred, people exaggerated the probability that they had assigned to it earlier (3410).

There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward (3423).

Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages (3489).

Note: Cream always rises to the top?

On average, the gap in corporate profitability and stock returns between the outstanding firms and the less successful firms studied in Built to Last shrank to almost nothing in the period following the study. The average profitability of the companies identified in the famous In Search of Excellence dropped sharply as well within a short time. A study of Fortune’s “Most Admired Companies” finds that over a twenty-year period, the firms with the worst ratings went on to earn much higher stock returns than the most admired firms (3494).

Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true (3573).

Philip Tetlock, a psychologist at the University of Pennsylvania, explored these so-called expert predictions in a landmark twenty-year study, which he published in his 2005 book Expert Political Judgment: How Good Is It? How Can We Know? (3692).

The results were devastating. The experts performed worse than they would have if they had simply assigned equal probabilities to each of the three potential outcomes. In other words, people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options. Even in the region they knew best, experts were not significantly better than nonspecialists (3701).

The first lesson is that errors of prediction are inevitable because the world is unpredictable. The second is that high subjective confidence is not to be trusted as an indicator of accuracy (low confidence could be more informative) (3726).

If subjective confidence is not to be trusted, how can we evaluate the probable validity of an intuitive judgment? When do judgments reflect true expertise? When do they display an illusion of validity? The answer comes from the two basic conditions for acquiring a skill: an environment that is sufficiently regular to be predictable, and an opportunity to learn these regularities through prolonged practice. When both these conditions are satisfied, intuitions are likely to be skilled (4066).

Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment (4091).

Amos and I coined the term planning fallacy to describe plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases (4247).

When people are asked about a task they find difficult (for many of us this could be “Are you better than average in starting conversations with strangers?”), they readily rate themselves as below average. The upshot is that people tend to be overly optimistic about their relative standing on any activity in which they do moderately well (4420).

Physicians also reported their confidence. The result: “clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.” Here again, expert overconfidence is encouraged by their clients: “Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.” Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients (4473).

Note: More hopelessness for democracy.

The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster” (4504).
