Thinking, fast and slow: two systems, two species


Two systems

System 1 – automatic, fast; System 2 – effortful, slow.

System 1 is “the stranger in you”, which controls much of what we do, although we are seldom consciously aware of its activities. It provides the impressions that often turn into our beliefs, and is the source of the impulses that often become our choices and our actions. It is the origin of many of the systematic errors in our intuition.

Two species

Econs – live in the land of theory; Humans – act in the real world.

The most common mistake that well-meaning policymakers make: assuming Humans are Econs.

In economics and decision theory, “rationality” is defined as logical coherence, whether reasonable or not. Econs are rational by this definition, but there is overwhelming evidence that Humans cannot be. An Econ would not be susceptible to priming,¹ narrow framing,² the inside view,³ preference reversals,⁴ or WYSIATI, all of which Humans cannot consistently avoid.

What you see is all there is (WYSIATI)

“Will Mindik be a good leader? She is intelligent and strong…”

If you are normal, an answer came quickly to mind, and it was yes. You picked the best answer based on the very limited information available. But what if the next two adjectives were “corrupt” and “cruel”?

If you are normal, you did not start by asking “What would I need to know before I formed an opinion about the quality of someone’s leadership?” System 1 got to work on its own from the first adjective: “intelligent” is good, “intelligent and strong” is very good. This is the best story that can be constructed from two adjectives, and System 1 delivered it with great cognitive ease.

Jumping to conclusions on the basis of limited evidence is a favorite hobby of System 1. Our lazy System 2 (our rational mind that thinks it’s in charge, but isn’t) normally just goes along – hence our bias favoring first impressions, for example.

It is the consistency of the information that matters for a good story, not its completeness. Knowing little makes it easier to fit everything we know into a coherent pattern. Never mind that a little learning is a dangerous thing (and never mind the irony that this Alexander Pope line is usually misquoted as “a little knowledge is a dangerous thing”).

WYSIATI explains why we can think fast, and how we manage to make sense of partial information in a complex world. Much of the time, the story System 1 puts together is close enough to reality to support reasonable action. But WYSIATI also explains many errors of judgment and choice:

  • Overconfidence: the confidence we have in our beliefs depends mostly on the quality of the story we can tell from what we see, even when we see little. We often fail to notice that critical information is missing, and we tend to suppress doubt and ambiguity.
  • Framing effects: different ways of presenting the same information often evoke different emotions. We are more likely to buy food that is “90% fat free” than food that is “10% fat”.
  • Base-rate neglect: if we are told that Steve is meek and tidy and must guess whether he is a librarian or a farmer, we are more likely to guess “librarian”, even though the ratio of male farmers to male librarians is 20:1 (see the sketch after this list).
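
As a rough illustration of how the base rate should enter the judgment, here is a minimal Bayes'-rule sketch. Only the 20:1 ratio comes from the text; the likelihoods of the “meek and tidy” description are assumptions chosen for illustration.

```python
# Base-rate neglect via Bayes' rule. Only the 20:1 farmer-to-librarian
# ratio is from the text; the description likelihoods are assumed.

p_librarian = 1 / 21   # prior: 1 male librarian per 20 male farmers
p_farmer = 20 / 21

p_desc_given_librarian = 0.40  # assumed: "meek and tidy" fits 40% of librarians
p_desc_given_farmer = 0.10     # assumed: it fits only 10% of farmers

# Total probability of the description, then the posterior via Bayes' rule
p_desc = (p_desc_given_librarian * p_librarian
          + p_desc_given_farmer * p_farmer)
posterior = p_desc_given_librarian * p_librarian / p_desc

print(f"P(librarian | description) = {posterior:.2f}")  # ~0.17
# Even a description four times more typical of librarians cannot
# overcome the 20:1 base rate: Steve is still probably a farmer.
```

The intuitive “librarian” answer is System 1 matching a stereotype; the base rate never makes it into the story.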

from Thinking, Fast and Slow, by Daniel Kahneman

Notes:

  1. Priming: a memory effect in which exposure to one stimulus influences the response to another. After hearing the word “eat”, we are more likely to complete the fragment so_p as “soup” rather than “soap”; after being asked to think of something we are ashamed of, we are more likely to complete it as “soap”. We are less likely to cheat in a room displaying a poster of a pair of eyes than one of a bed of roses. Our System 2 will most likely deny that our System 1 can be so easily manipulated; it believes that it is in charge and that it knows the reasons for its choices. Our System 2 is probably wrong.
  2. Narrow framing: considering each decision in isolation rather than as one of a series of similar choices. More broadly, a framing effect is drawing different conclusions from the same information, depending on how or by whom that information is presented.
  3. The inside view: the view held by the people closely involved in a case (e.g., a project team) of its risks of cost increases, schedule delays, and benefit shortfalls, based on the specifics of that case rather than on the outcomes of similar past cases.
  4. Preference reversal: a phenomenon widely observed in experiments designed to test the assumptions underlying economic theories of behavior and welfare. In choices between pairs of simple monetary gambles, individuals tend to choose bets offering a high probability of a small gain (“P-bets”) over bets offering a smaller chance of a larger prize (“$-bets”), even though they attach a higher monetary value to the $-bets. This evidence has fueled a controversy over whether the preferences assumed to underlie people’s choices are context-free (the usual economic view, in which the means of eliciting a preference should be irrelevant) or context-sensitive (the usual psychological view, in which the experimental method can affect the outcome). A numeric sketch follows below.
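
As a minimal numeric sketch (the specific gambles are illustrative assumptions, not stimuli from any particular experiment), the two bet types can be constructed with equal expected value, which makes the reversal a pure effect of how the preference is elicited:

```python
# Preference reversal: two illustrative gambles with equal expected value.
# The numbers below are assumptions for illustration only.

p_bet = {"p_win": 0.90, "prize": 10.0}       # P-bet: likely win, small prize
dollar_bet = {"p_win": 0.10, "prize": 90.0}  # $-bet: unlikely win, large prize

def expected_value(bet: dict) -> float:
    """Probability-weighted payoff of a simple win-or-nothing gamble."""
    return bet["p_win"] * bet["prize"]

print(expected_value(p_bet))       # 9.0
print(expected_value(dollar_bet))  # 9.0

# Typical finding: when choosing between the two, most subjects take the
# P-bet; when asked to state a minimum selling price for each, the same
# subjects often price the $-bet higher. No single context-free
# preference ordering can produce both answers at once.
```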
