Kahneman, D. (2013). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Introduction
- Accurate intuition of experts is usually the result of prolonged practice, rather than a reliance on heuristics
- Herbert Simon quote: “The situation has provided a cue. This cue has given the expert access to information stored in the memory and the information provided the answer. Intuition is nothing more and nothing less than recognition”
- Valid intuition happens when experts have learned to recognize familiar cues in situations and act in the appropriate manner
- Without experience, intuition is likely to be inaccurate
- Humans, when faced with a difficult question, use heuristics to answer a simpler question
PART I
Chapter 1
- Mental work is usually deliberate, orderly and effortful (System 2)
- System 1 is fast and automatic while System 2 is orderly
- System 1 is instinctive (similar behavior in lower animals) while System 2 requires attention and is disrupted when attention is denied
- Intense focus (effortful System 2) affects attention to other stimuli, e.g., the Gorilla Experiment
- System 1 generates impression while System 2 turns these impressions, intuitions and intentions into beliefs
- System 2 arises when one’s mental model of the world is violated/unexpected. It also helps self-motivating behavior
- System 1 is susceptible to biases. It is impulsive, while System 2 is responsible for self-control
- Müller-Lyer illusion – Interesting how the visual illusion is still perceived (System 1) even though System 2 knows the reality (the lines are equal)
Chapter 2
- System 2 is lazy and there is a reluctance to apply effort to get it to function
- Eckhard Hess quote: “The pupil of the eye is the window to the soul”
- Pupils dilate during mental effort and constrict when the individual solves the task or gives up
- Mundane conversation (a System 2 task) is conducted at a comfortable pace which is preferred by System 2
- During intense mental activity, System 2 focuses the bulk of attention on the task at hand while allotting only low levels of attention to less important tasks
- With increase in skill and proficiency, demand for attention reduces and System 2 is able to return to its preferred pedestrian rate
- Law of Least Effort: If there are several ways to attain a goal, most people prefer the easiest route
- More effort is placed on System 2 whenever the individual is under time pressure
Chapter 3
- System 2 likes to work at a pedestrian rate and when a demand is placed on it, less attention is devoted to other tasks which are taken over by the more impulsive System 1
- Anything that places more demand on System 2 will allow System 1 to take over
- Ego depletion: After using System 2 on Task A, one is less willing to use System 2 again on Task B. Note that ego depletion is not cognitive busyness: performance under ego depletion can be improved with the right incentives, whereas a loaded short-term memory (cognitive busyness) cannot be incentivized away. [Note: Ego-depletion studies have also faced replication problems in recent years]
- People with active minds do not allow System 1 to take over the “bat and ball” problem, while those with a lazy System 2 accept System 1’s intuition
- If people believe a conclusion, they also accept arguments that appear to support it, even when those arguments are unsound. This is the mark of a lazy System 2
- Intelligence is not just the ability to reason, but also the ability to retrieve relevant material from memory and deploy attention as needed (System 2’s task)
Chapter 4
- Associative activation: When an idea triggers other ideas in the brain. For instance, seeing “banana” and “vomit” together brings up memories of similar events in the past and also prepares one for future events that have now become subjectively more likely
- Embodied cognition: You think with the body and brain
- Despite multiple ideas being activated in the mind, only a few register in one’s consciousness
- Priming is an example of determinism (behaviorism perspective)
- System 2 thinks it’s in charge and it knows the reason for action. Priming phenomenon triggers associative activation that influences System 1 (intuitive, impulsive “dark matter”) and System 2 has no access to them. [Note: Studies on priming have replicated poorly in recent years]
- System 1 provides impressions that turn to beliefs, impulses that turn into choices and actions. It triggers associative activation that links the past, present and future expectations
Chapter 5
- System 2 enjoys cognitive ease. In the event of cognitive strain, System 1 mobilizes System 2. Cognitive strain occurs when effort is needed and there are unmet demands
- Cognitive ease usually facilitates creativity as System 2 is not under stress. Cognitive strain, on the other hand, promotes the activity of System 2 – however this compromises intuition and creativity
- Greater cognitive ease is experienced when stimuli that have been seen before are seen again. The cognitive ease is triggered by the feeling of familiarity. The way a stimulus is presented influences the degree of cognitive ease that is experienced
- The impression of familiarity is produced by System 1 and System 2 makes judgement based on that impression
- A repeated falsehood comes to feel familiar, which increases cognitive ease, and System 2 assimilates that ease as belief (an illusion of truth)
- Persuasive messages should be:
- Legible
- In simple language
- Memorable (rhymes help)
- From sources with names that are easy to pronounce
- Most people are guided by System 1 and do not know where their impressions come from
- Cognitive ease is associated with good feelings (System 1)
- Mere Exposure Effect: Repetition of stimuli promotes cognitive ease which in turn is likely to produce positive emotions (System 1)
- A happy mood loosens System 2’s control over performance: intuition and creativity increase, but so do logical errors
Chapter 6
- System 1 maintains and updates your model of the world
- Your model is made up of associated ideas that determine your interpretation of the present, as well as your expectations
- Norm Theory: Very little repetition is needed for a new experience to be deemed normal. We have norms for different categories and these norms provide background for the detection of anomalies
- System 1 tries to identify cause and effect automatically while System 2 accepts the explanation
Chapter 7
- When uncertain, System 1 bets on an answer and the bets are guided by experience
- System 1 does not keep track of alternatives, while doubt requires mental effort (a function of System 2)
- To understand a statement, you have to know what the idea means if it is true.
- When System 2 is busy, we tend to believe almost anything
- Confirmation bias: People tend to look for data that is compatible with beliefs they currently hold
- Halo Effect: Tendency to like (or dislike) everything about a person (including things that have not been observed)
- Information that is not retrieved might as well be non-existent. System 1 constructs the best possible story that incorporates ideas activated and not those inactivated
- System 1’s mantra is “What You See Is What You Get (WYSIWYG)”
- Knowing little makes it easy to fit everything that one knows into a coherent pattern
- Overconfidence: Not allowing for the possibility that evidence critical to judgement is missing
- Framing Effect: Different ways of presenting the same information evokes different emotions (and responses)
- Base-rate neglect: Forgetting the denominator of the population you are interested in
Chapter 8
- System 2 receives questions or generates them; in either case it directs attention and searches memory to find the answers
- System 1 continually generates assessment of a situation without focusing on a specific intention or using up effort
- System 1 substitutes one judgement (a hard one) with another (usually a simpler one)
- For System 1, good moods and cognitive ease are indicative of safety and familiarity
- System 1 also functions in providing rapid judgement in determining whether a person is a friend or foe. People sometimes use this in determining who to vote for
- Basic assessments automatically done by System 1 during language use include:
- Computation of similarity and representativeness
- Attribution of causality
- Evaluation of the availability of association and exemplars
- System 1 represents categories by a prototype/exemplar and does well with averages, but not sums (e.g., people choose to donate about the same amount to save 2,000, 20,000 or 200,000 birds)
- System 1 has two categories of assessment:
- Continuous routine assessments (e.g., seeing the shape of objects in 3d)
- Voluntary assessments (e.g., assessing how happy you are)
- System 1 also does intensity matching
Chapter 9
- People’s normal disposition is to have an intuitive feeling/opinion about everything that comes one’s way
- People tend to have answers to questions they do not understand completely, and they do this by relying on evidence they can neither explain nor defend
- System 1 replaces a hard question with an easier one. This is called substitution
- Heuristic: This is a simple procedure that helps find adequate, though imperfect answers to difficult questions
- Mental shotgun: the tendency to compute more than is necessary or intended; this excess computation is what allows a hard question to be answered with an easier one
- Intensity matching: Automatically creating scales of intensity of emotions to judge something
- A judgement that is based on a substitution will invariably be biased
- Affect heuristic: People allowing their likes and dislikes to determine their beliefs about the world
- Self-criticism is one of the functions of System 2, however, when it comes to affect (emotions), its search for emotions is constrained to the information that is consistent with existing beliefs
PART II
Chapter 10
- System 1 likes to make causal connections, even when those connections are not there
- A random event doesn’t have an explanation. But a collection of random events may behave in a regular fashion
- System 1 will produce representations of reality that make sense (often basing this on the Halo Effect, as well as the assumption that the Law of Large Numbers works even for small samples)
- System 1 likes looking for causal relationships when in the real sense, nothing in particular causes an event to happen – chance selects it from a series of alternatives
- Random processes produce many sequences that convince people that the process is not random after all
- We pay more attention to the content of a message than to its reliability. As a result, we end up with a view of the world that is simpler than the data can justify
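The "law of small numbers" point above can be illustrated with a quick simulation (a sketch with made-up parameters): small samples of a fair coin produce lopsided, "surprising" results far more often than large ones, yet System 1 reads causality into them.

```python
import random

random.seed(0)

def extreme_share(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples of a fair coin in which one outcome reaches
    the given share (e.g., >= 70% heads or >= 70% tails)."""
    count = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            count += 1
    return count / trials

# Small samples yield lopsided results far more often than large ones
print(extreme_share(10))    # roughly 0.34
print(extreme_share(100))   # close to 0
```

Nothing causes the lopsided small samples; chance alone produces them, which is exactly the pattern System 1 mistakes for a real effect.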
Chapter 11
- Two forms of anchoring:
- One that occurs as deliberate adjustment from the anchor (a System 2 operation)
- One that occurs as a priming effect (a System 1 operation)
- In System 2 anchoring, adjustment is a deliberate search for reasons to move away from the anchor
- People adjust less when their memory resources are depleted
- In System 1, anchoring could occur with no corresponding subjective experience
- System 1 tries to construct a world in which the anchor is the true number. It does this via associative coherence, where any prime evokes information compatible with it
- Searching for arguments against the anchor is useful during negotiations and is controlled by System 2
- Anchoring by priming operates without awareness, and even when you are aware of the anchor, you cannot imagine how you would have thought in its absence
Chapter 12
- Availability heuristic: Judging frequency by the ease with which instances come to mind
- Availability heuristic replaces “size or frequency of event” with “how easily can I recall this instance”
- Make efforts to reconsider your impressions by asking “Am I believing this because of recent events?”
- Self-assessment is dominated by the ease with which certain examples come to mind
- The proof that you truly understand a pattern of behavior is that you know how to reverse it
- When people find that fluency (the ease with which instances come to mind) is worse than expected, they doubt that the frequency of the event was as high as they initially thought
- Judgements are no longer influenced by ease of retrieval if the difficulty of recall is attributed to other random/false explanations
- System 1 sets expectations and is surprised when those expectations are violated
- System 2 can reset the expectations of System 1 so that events that used to be surprising to System 1 are no longer surprising
Chapter 13
- Our expectation about the frequency of an event is distorted by the prevalence and the emotional intensity of the messages to which we are exposed
- Affect heuristic: Simple question of “How do I feel about it?” replaces the more difficult question “What do I think about it?”
- The availability cascade: When bias flows into policy. The importance of an idea is judged by the fluency and the emotional charge of the idea that comes to mind
- The mind has limited abilities to deal with small risks: We either ignore them totally or give them too much weight. Nothing in-between
- Probability neglect: Exaggerating the ease of recalling disasters while ignoring their frequency in the grand scheme of things
Chapter 14
- Base rates: The denominator
- Questions about probabilities trigger a mental shotgun that causes us to answer a simpler question
- Representativeness bias: using stereotype characteristics of a group to make decisions about an individual without considerations about base rates
- Enhanced activation of System 2 reduces the representativeness heuristic
- Two factors cause System 2 to fail: ignorance and laziness
- When you have doubts about the quality of the evidence, let your judgement of probability stay close to the base rate
- Bayesian statistics helps with base rate calculations.
- Keys to Bayesian reasoning:
- Anchor judgements on a plausible base rate
- Question the diagnosticity of your evidence (is it valid? Is it measuring what it should?)
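The two keys above can be sketched as a Bayes' rule update in odds form (the 3% base rate and the likelihoods are hypothetical numbers for illustration, not from the book):

```python
def posterior(base_rate, p_e_given_h, p_e_given_not_h):
    """Bayes' rule in odds form: anchor on the base rate, then adjust
    by the diagnosticity (likelihood ratio) of the evidence."""
    prior_odds = base_rate / (1 - base_rate)
    likelihood_ratio = p_e_given_h / p_e_given_not_h
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical numbers: a 3% base rate plus evidence that is 4x more
# likely under the hypothesis still yields only ~11% probability
print(posterior(0.03, 0.8, 0.2))  # ~0.11
```

The point of the sketch: even fairly diagnostic evidence moves a low base rate only modestly, which is what base-rate neglect ignores.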
Chapter 15
- Conjunction fallacy: People judge a conjunction of two events (A and B) to be more probable than one of the events (A) alone in a direct comparison
- Representative outcomes combine personality description to produce coherent stories which are not necessarily the most probable, but most plausible
- Large sets are valued more than smaller sets in joint evaluation but less in single evaluation
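The conjunction fallacy has a simple arithmetic core: "A and B" picks out a subset of "A", so its probability can never be higher. A toy enumeration (the counts are invented, loosely echoing the chapter's "Linda" problem):

```python
# Toy enumeration of a hypothetical population: since "A and B" is a
# subset of "A", its probability can never exceed P(A).
population = 1000
bank_tellers = 50           # A: bank tellers
feminist_bank_tellers = 10  # A and B: necessarily a subset of A

p_a = bank_tellers / population
p_a_and_b = feminist_bank_tellers / population
assert p_a_and_b <= p_a   # the conjunction is never more probable
print(p_a, p_a_and_b)     # 0.05 0.01
```

A coherent story ("feminist bank teller") feels more plausible than the bare category, but plausibility is not probability.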
Chapter 16
- Two types of base rates:
- Statistical base rates: Facts about a population to which a case belongs; not relevant to individual case
- Causal base rates: Changes views about how the individual case came to be
- System 1 represents categories as norms
- Neglecting valid stereotypes may result in suboptimal judgements
- Inferences drawn from causal base rates include:
- Stereotypical trait
- Feature of a situation that affects an individual’s outcome
- People do not draw from base-rate information an inference that conflicts with their other beliefs
- Nisbett & Borgida (1975): People feel relieved of responsibility when they know others have heard the same request for help
- In the absence of any useful information, the Bayesian solution is to stay with the base rates
- People exempt themselves from the experimental conclusions that surprise them
- People are naturally unwilling to deduce the particular from the general, but they are willing to infer the general from the particular (Nisbett & Borgida, 1975)
- The test of learning psychology is not whether you have learned a new fact, but whether your understanding of situations has changed
Chapter 17
- The more extreme the original score, the more regression we expect, because an extremely good score suggests a very lucky day
- Whenever correlation between 2 scores are imperfect, there will be regression to the mean
- System 2 finds the relationship between correlation and regression difficult to understand and learn partly because of System 1’s desire to find causal interpretations
Chapter 18
- Intuitive predictions are insensitive to actual predictive quality of evidence
- When people are asked for prediction, they replace the question with one about evaluation of evidence
- Intuitive predictions need to be corrected because they are not regressive and are therefore biased
Chapter 19
- Narrative fallacy: Flawed stories about the past shape views of the world and of the future
- The human mind does not deal with nonevents. We tend to exaggerate the role of skill and underestimate the part that luck played in the outcome
- People build the best possible story from the information available to them. And if it is a good story, they believe it
- People have an unlimited ability to ignore their ignorance
- We understand the past less than we believe we do
- Once you have a new view of the world, you forget what you used to believe before your mind changed
- Hindsight bias: The tendency to revise the history of one’s beliefs in the light of what actually happened
- We blame decision makers for good decisions that worked out badly and give them too little credit for successful moves that appear obvious only after the fact
- The worse the consequence, the greater the hindsight bias
- The illusion that one understood the past feeds the further illusion that one can predict and control the future
Chapter 20
- Confidence is a feeling which reflects the coherence of the information and the cognitive ease of processing it
- People ignore base rate information when it clashes with their personal impressions from experience
- The illusion that we understand the past fosters overconfidence in our ability to predict the future
- We think we are able to explain the past by focusing on social movements, technological developments and the abilities of certain great men. Not true.
- Errors of prediction are inevitable because the world is unpredictable
- High subjective confidence (System 1) should not be trusted as an indicator of accuracy
- Short-term trends can be forecast and behaviors can be predicted with fair accuracy from previous behaviors. But behavior in tests and in the real world are determined by context-specific factors
Chapter 21
- Domains with significant degree of uncertainty and unpredictability are called “low-validity environments”. In these environments, accuracy of experts was matched or even exceeded by a simple algorithm
- Sometimes, complexity reduces validity and many times, experts introduce unnecessary complications
- Additionally, humans find it difficult to make consistent decisions. When provided with the same information, humans will give conflicting answers
- Decisions are context-dependent (System 1)
- Slight changes in context can impact decisions greatly
- When predictability is poor, inconsistency is destructive
- To maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments [Note: Building from first-principles]
- Intuition adds value after a disciplined collection of objective information via algorithms
Chapter 22
- In his book, Sources of Power, Gary Klein posited that experts do not limit their options to a pair. Rather, they draw from a repertoire of patterns (System 1’s associative memory) and mentally simulate different options (System 2)
- The situation has provided a cue (discriminative stimuli); the cue has given the expert access to information stored in the memory (repertoire) and the information provides an answer. Intuition is nothing more and nothing less than recognition [Kahneman’s twist to Simon’s quote earlier in the book]
- Acquiring expertise in a difficult skill is harder and slower than learning to read because hard skill usually consists of more “letters” in the “alphabet” and the “words” contain more “letters”
- How skill is developed:
- An environment that is sufficiently regular to be predictable
- An opportunity to learn these regularities through prolonged practice
- Intuitions cannot be trusted in the absence of stable regularities in the environment
- Whether professionals have a chance to develop intuitive expertise depends on the quality and speed of feedback, as well as on sufficient opportunity to practice
- If the environment is regular and the decisionmaker has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions
- Klein summarized all these in the Recognition-Primed Decision (RPD) model
Chapter 23
- Inside view vs outside view: People who have information about an individual case (inside view) are reluctant to consult the statistics of the class to which the case belongs (outside view)
- Planning fallacy: A focus on unrealistic best-case scenarios, which could be improved by consulting statistics on similar cases
- Taking the outside view is the cure for the planning fallacy
Chapter 24
- People who have the greatest influence on the lives of others are likely to be optimistic and overconfident and to take more risks than they realize
- An optimistic temperament encourages persistence in the face of obstacles
- People may achieve higher average returns by selling their skills to employers than by setting out on their own
- Hubris hypothesis: Executives of an acquiring firm are less competent than they think they are
- Overconfidence: In estimating a quantity, choosing to rely on information that easily comes to mind and constructing a coherent story around it that makes sense
- Inadequate appreciation of the uncertainty of the environment leads economic agents to take risks that they should avoid
- An appreciation of uncertainty is a cornerstone of rationality
- The premortem is a partial remedy for overconfident optimism
Chapter 25
- People are neither fully rational nor completely selfish, and their tastes are anything but stable
- Gambles represent the fact that consequences of choices are never certain
- Bernoulli’s theory did not consider reference points
Chapter 26 – Prospect Theory
- In utility theory, the utilities of gains and losses are allowed to differ only in their sign (+ or -). There is no way to represent the fact that the disutility of losing $500 could be greater than the utility of winning the same amount – though of course it is
- One’s attitude to gains and losses does not arise from an evaluation of one’s wealth. Rather, people simply prefer winning to losing and dislike losing more than winning
- The missing variable in Bernoulli’s model was the reference point – the earlier state relative to which gains or losses are evaluated
- The three cognitive features of prospect theory are:
- Neutral reference point, also called the adaptation level
- Principle of diminishing sensitivity
- Loss aversion – treating threats as more urgent than opportunities facilitates survival (an evolutionary rationale)
- In mixed gambles, where both a gain and a loss are possible, loss aversion causes extremely risk-averse choices. In bad scenarios where a sure loss is compared to a larger loss that is probable, diminishing sensitivity causes risk seeking
- Criticism of prospect theory – It cannot account for disappointment and regret.
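The three features listed above are often summarized in a value function; a minimal sketch using the median parameter estimates from Tversky and Kahneman's later (1992) work (α ≈ 0.88, λ ≈ 2.25; these numbers are not given in this chapter):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function relative to a reference point of 0:
    diminishing sensitivity via the exponent alpha, loss aversion via
    the multiplier lam (Tversky & Kahneman 1992 median estimates)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

print(value(100))    # ~57.5
print(value(-100))   # ~-129.5: the loss looms larger than the gain
```

The curve is concave for gains, convex for losses, and steeper on the loss side, which yields exactly the risk-averse/risk-seeking asymmetry the bullet above describes.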
Chapter 27
- The indifference curves assume that utility is determined by the present situation
- The standard utility theory also assumes that preferences are stable over time (i.e., points on the indifference curve will provide the same utility over time). Prospect theory, on the other hand, asserts that people come to treat their current point as a reference and prefer the status quo
- Indifference curves fail to capture two things:
- Tastes are not fixed
- The disadvantages of a change loom larger than its advantages, consequently inducing a bias that favors the status quo
- Utility theory proposes that your utility for a state of affairs depends on that state alone and not on your history
- Endowment effect: Willingness to buy or sell is dependent on reference point. If the item is owned, one considers the pain of giving it up. If the item is not owned, one considers the pleasure of owning it
- Experienced traders ignore the endowment effect. They have learned to ask the right question: “How much do I want X, compared to other things I could have instead?”
- In prospect theory, being poor is living below one’s reference point; the poor are always “in the losses”
Chapter 28
- The amygdala is activated in response to threats
- In many situations, the boundary between bad and good is a reference point that changes over time and depends on immediate circumstances
- One of the ways negativity dominance is expressed is loss aversion
Chapter 29
- When you form a global evaluation of a complex subject, you assign weights to its characteristics. Some characteristics influence your assessments more than others do.
- The more probable an outcome, the more weight it should have. The expected value of a gamble is the average of its outcomes, each weighted by its probability. This is the expectancy principle
- Possibility effect: Highly unlikely outcomes are weighted disproportionately more than they deserve
- Certainty effect: Outcomes that are almost certain are given less weight than their probabilities justify
- Contrary to the expectation principle, the decision weights that people assign to outcomes are not identical to the probabilities of those outcomes.
- People attach values to gains and losses, rather than to wealth and decision weights they assign to outcomes are different from probabilities
- Diminishing sensitivity makes the sure loss more aversive and the certainty effect makes the gamble less aversive. In the same vein, when outcomes are positive, the sure thing is more attractive, while the gamble is less attractive
- Paying a premium to avoid a small risk of a large loss is costly
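The expectancy principle above is just a probability-weighted average; a minimal sketch, with a gamble invented for illustration, shows why the certainty effect amounts to paying a premium:

```python
def expected_value(outcomes):
    """Expectancy principle: average of the outcomes, each weighted by
    its probability."""
    return sum(p * v for p, v in outcomes)

# A gamble invented for illustration: 80% chance of $100, 20% of $10
gamble = [(0.80, 100), (0.20, 10)]
print(expected_value(gamble))  # 82.0

# Certainty effect: many people prefer a sure $80 to this gamble even
# though its expected value is $82 - i.e., they pay a $2 premium (in
# expected value) for certainty.
```

Decision weights that deviate from true probabilities are precisely what make such premiums feel reasonable.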
Chapter 30
- People overestimate the probabilities of unlikely events, and they overweight the unlikely events in their decisions. This is because of cognitive ease, focused attention and confirmation bias
- A rich and vivid representation of outcome reduces the role of probability in the evaluation of an uncertain prospect
- Denominator neglect: A focus on the numerator but not the denominator
- Low-probability events are weighted more heavily when described as relative frequencies (e.g., “1 in 1,000”) rather than in abstract terms (e.g., “0.1%”)
Chapter 31
- People tend to be risk averse in the domain of gains and risk seeking in the domains of loss
- Every simple choice formulated in terms of gains and losses can be deconstructed in innumerable ways into a combination of choices, yielding preferences that are likely to be inconsistent
- It is costly to be risk averse for gains and risk seeking for losses. This makes you liable to pay a premium to obtain a sure gain rather than face a gamble, and also to pay a premium (in expected value) to gamble rather than accept a sure loss
- People consider decisions in two main kinds of frames:
- Narrow framing, which is a sequence of two simple decisions considered separately
- Broad framing, which is a single comprehensive decision with 4 options
- Deliberate avoidance of exposure to short-term outcomes improves the quality of both decisions and outcomes
Chapter 32
- Disposition effect: People sell winners rather than losers.
- The rational behavior would be to sell the stock that is less likely to do well in the future
- An argument against selling winners is that stocks that have recently gained in value are more likely to go on gaining for a short while
- Sunk cost error: The decision to continue investing in a losing account when better investments are available
- Intense regret is what you experience when you can most easily imagine yourself doing something other than what you did
- People expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction
- The taboo tradeoff against accepting any increase in risk is not an efficient way to use the safety budget
Chapter 33
- Preference reversal occurs because joint evaluation focuses attention on an aspect of the situation which is less salient in a single evaluation
- Judgement and preference are coherent within categories but potentially incoherent when the objects evaluated belong to different categories
Chapter 34
- For Econs, the objects of their choice are states of the world which are not affected by the words used to describe them
- Amygdala – Emotional arousal functions. Active when the choice conforms to the frame
- Anterior cingulate – Conflict and self-control. Active when one doesn’t do what comes naturally
- Frontal area – Combines emotions and reasoning to guide decision making
- Decision makers tend to prefer the sure thing over the gamble when outcomes are good and reject the sure thing and accept the gamble when outcomes are negative
- Broader frames and inclusive accounts generally lead to more rational decisions
Chapter 35
- Peak-end rule: The global retrospective rating is well predicted by the average of the pain at the worst moment of the experience and at its end
- Duration neglect: The duration of a procedure has no effect on the retrospective rating of total pain
- Experiencing self is different from the Remembering self. Memories are all we keep from our experience of living and the only perspective we can adopt as we think about our lives is that of the remembering self
- The memory that the remembering self keeps is a representative moment influenced by the peak and the end
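The peak-end rule and duration neglect can be sketched as a toy scoring function (the pain values are invented; the underlying finding comes from the colonoscopy and cold-pressor studies discussed in the chapter):

```python
def remembered_pain(pain_series):
    """Peak-end rule as a toy scoring function: the retrospective
    rating is the average of the worst moment and the final moment;
    total duration plays no role (duration neglect)."""
    return (max(pain_series) + pain_series[-1]) / 2

short_procedure = [7, 8]            # ends at its peak
long_procedure = [7, 8, 6, 4, 2]    # more total pain, but ends mildly

print(remembered_pain(short_procedure))  # 8.0
print(remembered_pain(long_procedure))   # 5.0 - remembered as less bad
```

The longer procedure contains strictly more pain, yet the remembering self rates it better, which is the experiencing/remembering split in miniature.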
Chapter 36
- The Remembering self composes stories and keeps them for future reference
Chapter 37
- The easiest way to increase happiness is to control your use of time
Chapter 38
- Affective forecasting: Predicting one’s future feelings. People often acknowledge that the rate of X is high while believing the statistic does not apply to them
- Mood heuristic is one way people answer the life satisfaction question
- The score one assigns to one’s life is determined by a small sample of highly available ideas and not a careful weighting of all domains of one’s life
- A hybrid view of both the Experiencing and Remembering selves should be considered in defining happiness
- Focusing illusion: Nothing in life is as important as you think it is when you are thinking about it
- The Remembering self is subject to massive focusing illusions about life that the experiencing self endures quite comfortably
- Miswanting: Bad choices arise from errors of affective forecasting.
Conclusion
- “And of course you also remember that the two systems do not really exist in the brain or anywhere else. ‘System 1 does X’ is a shortcut for ‘X occurs automatically.’ And ‘System 2 is mobilized to do Y’ is a shortcut for ‘arousal increases, pupils dilate, attention is focused, and activity Y is performed.’”