Appendix II

Heuristics and Cognitive Biases

Introduction

In his book Thinking, Fast and Slow,1 Daniel Kahneman aims to make psychology, perception, irrationality, decision making, errors of judgment, cognitive science, intuition, statistics, uncertainty, illogical thinking, and behavioral economics all easy for the masses to grasp. Kahneman’s book, which all security analysts should read, is about the biases of people’s intuition. That is, people assume certain things without having thought through them carefully. Kahneman calls those assumptions heuristics,2 which lead to biases in thinking. He spends over 400 pages providing examples of how certain heuristics can lead to muddled thinking and gives each heuristic a name such as “confirmation bias,” “cognitive ease,” “halo effect,” “availability bias,” and so forth. This appendix provides a summary of Kahneman’s heuristics, their associated biases, and the potential errors in thinking they can cause.3

Kahneman highlights how a person’s brain works within two abstract systems: one that thinks fast, System 1; and one that thinks slowly, System 2.4 System 1 (fast) thinking operates automatically, intuitively, involuntarily, and effortlessly—as when people drive, recall their age, or go through their morning routine to get ready for school or work. System 1 thinking tends to jump to quick solutions. System 2 (slow) thinking requires slowing down to solve problems through deliberate, reasoned, focused thinking such as when calculating a math problem, choosing where to invest money, or filling out a complicated form. System 2 thinking tends not to jump to quick solutions. These two systems often conflict with one another. System 1 thinking operates with heuristics that may lead to inaccurate conclusions, while System 2 thinking requires mental effort because it evaluates the situation and potential heuristics, but may still be error prone. Kahneman’s book reveals how to recognize situations in which mistakes are likely and provides guidance to avoid significant mistakes when stakes are high.5 This is the essence of critical thinking.

System 2 thinking affects people’s bodies (dilates pupils), attention (limits observation), and energy (depletes resources). Because System 2 thinking takes mental effort, people are prone to use System 1 thinking—the path of least resistance. Kahneman highlights how laziness is built deep into human nature, causing thinking to often default to the easiest path to reach a solution.6 People use System 1 thinking to accomplish routine tasks. They use System 2 thinking to manage complicated tasks. Thinking fast says, “I need groceries.” Thinking slow says, “I will not try to remember what to buy but write myself a shopping list.”7

People on a leisurely stroll will stop walking when asked to complete a difficult mental task. Calculating while walking is an energy drain. This is why being interrupted while concentrating is frustrating, why people forget to eat when focused on an interesting project, why multi-tasking while driving is dangerous, and why resisting temptation is extra hard when a person is stressed. Self-control shrinks when people are tired, hungry, or mentally exhausted. Because of this reality, humans are prone to let System 1 thinking take over intuitively and impulsively. Kahneman argues people often do not take the time and effort to think slowly through problems. He also cites how intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and apply attention when needed.8 Accessing memory takes effort; but, by not doing so, people are prone to make mistakes in judgment.

Kahneman also notes that one or more of the heuristics summarized in the remainder of this appendix will be at work in any decision process. More of them are likely to be at play in System 1 thinking situations; fewer, though still some, will be at work in System 2 thinking. Critical thinking seeks to generate more System 2 thinking, which will hopefully overcome many of the thinking biases. Kahneman describes the workings of the mind as an uneasy interaction between the two abstract systems, which are summarized in Figure II.1.9 The way to block errors originating in System 1 thinking is simple in principle: recognize the signs that a person is in a “cognitive minefield,” slow down, and ask for reinforcement from System 2 thinking.10

Figure II.1

Summary of System 1 (Fast) Thinking and System 2 (Slow) Thinking

System 1: Uses subconscious values, drives, and beliefs that influence “gut reactions.”
System 2: Articulates judgments, makes choices, endorses or rationalizes ideas and feelings.

System 1: Jumps to conclusions regarding causality.
System 2: Makes up stories to either confirm or deny those conclusions.

System 1: Operates effortlessly.
System 2: Requires conscious effort to engage.

System 1: Can be wrong but is more often right.
System 2: Can be right or wrong depending on the level of effort.

System 1: Heavily influenced by heuristics.
System 2: Examines those heuristics when so inclined.

Frequently Encountered Heuristics

Heuristic #1: Priming. Conscious and subconscious exposure to an idea “primes” people to think about an associated idea. Things outside human conscious awareness can influence how they think. For example, if a person has been talking about food, they will fill in the blank SO_P with a U; but, if they have been talking about cleanliness, they will fill in the blank SO_P with an A. These subtle influences also affect behavior in ways people do not realize.11 People reading about the elderly will unconsciously walk slower, and people who are asked to walk slower will more easily recognize words related to old age. People asked to smile find jokes funnier, while people asked to frown find disturbing pictures more disturbing. It is true that if humans behave in certain ways, their thoughts and emotions will eventually catch up. People can not only feel their way into behavior but also behave their way into feelings. Potential for error—people often are not objective, rational thinkers. Multiple factors that people are not even aware of can influence judgment, attitude, and behavior.

Heuristic #2: Cognitive ease. Things that are easier to compute, more familiar, and easier to read seem truer than things that require hard thought, are novel, or are hard to see. Kahneman offers that predictable illusions inevitably occur if a person’s judgment is based on a condition of cognitive ease or strain.12 How does a person know a statement is true? If it is strongly linked by logic or association to other beliefs or preferences a person holds, supported by evidence, or comes from a source they trust and like, they will feel a sense of cognitive ease and assess the statement as true.13 Because things that are familiar seem truer, it is common for teachers, advertisers, marketers, authoritarian tyrants, and even cult leaders to repeat their messages endlessly. This is related to the Repetition informal logic fallacy in Appendix I. Cognitive ease is also related to “cognitive dissonance,” a condition where a person faces two conflicting ideas in their brains and, in order to relieve the dissonance, they select the idea that is already familiar or seems truer—and subsequently discard the other idea. Potential for error—if people are comfortable reading or hearing a lie often enough, they tend to believe it.

Heuristic #3: Coherent stories (Associative coherence). To make sense of the world, people often tell themselves stories about what is going on, such as making associations between events, circumstances, and regular occurrences. The more these events fit into their stories, the more normal they seem. Things that do not occur as expected take people by surprise, so people create stories to make them fit. Examples include phrases such as “everything happens for a purpose,” “that person acted out of character,” or “that was so weird it cannot be random chance.” Abnormalities, anomalies, and incongruities in daily living beg for coherent explanations. Often those explanations involve either (1) assuming intention, “it was meant to happen;” (2) assuming causality, “they are homeless because they lack ambition;” or (3) interpreting providence, “there is a divine purpose in everything.” Humans are programmed from birth to have impressions of causality, which do not depend on reasoning about conditions of causation.14 The mind is ready and even eager to identify agents (decision makers), assign them personality traits and specific intentions, and view their actions as expressing individual propensities.15 This is related to the Fundamental attribution bias described in Figure 2.3. Potential for error—people tend to posit intention and agency where none exists, confuse causality with correlation, and make more out of coincidences than is statistically warranted.

Heuristic #4: Confirmation (affirmation) bias. This is the tendency to find confirming evidence for an existing belief while overlooking conflicting evidence. Jumping to conclusions is efficient if the conclusions are likely to be correct, the costs of an occasional mistake are acceptable, and the jump saves time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.16 System 1 thinking fills in ambiguity with automatic guesses and interpretations that fit into Coherent stories; this type of thinking rarely considers other interpretations. When System 1 thinking makes a mistake, System 2 thinking jumps in to slow the thinking down and consider alternative explanations. System 1 thinkers can be gullible and biased to believe the familiar, while System 2 thinking is in charge of doubting and unbelieving. System 2 thinking, however, is sometimes busy, often lazy, and will defer to System 1 thinking when it can.17 Potential for error—people are prone to over-estimate the probability of unlikely events (irrational fears) and accept uncritically every suggestion matching their pre-formed views of a situation.

Note: Confirmation (affirmation) bias can be troublesome, especially when employed in conjunction with Cognitive ease (cognitive dissonance). Combining lies or misinformation with confirmation bias and cognitive ease can result in gross thinking errors and lead to conditions supporting addictions, cults, and conspiracy theories. Refer to Critical Belief Analysis in Chapter 6.

Heuristic #5: Halo effect. This is the strong tendency to like or dislike everything about a person, including things people have not observed.18 The warm emotion or “halo” attributed to a person, place, or thing predisposes people to like everything about that person, place, or thing. Good first impressions tend to positively color later negative impressions and, conversely, negative first impressions can negatively color later positive impressions. The first to speak their opinion in a meeting can influence others’ opinions. A list of positive adjectives to describe a person influences how people interpret negative adjectives that come later in the list. Likewise, use of negative adjectives early can color later positive adjectives. The problem with all these examples is that a person’s intuitive judgments can be impulsive, not clearly thought through, or critically examined. To remind System 1 thinking to stay objective, to resist jumping to conclusions, and to enlist the evaluative skills of System 2 thinking, Kahneman coined the abbreviation, “WYSIATI” (what you see is all there is).19 In other words, do not lean on information based on first impressions or intuitions; stay focused on the hard data observed. Potential for error—people should try to combat overconfidence by basing their beliefs on critical thinking, not on subjective feelings. People can increase clear thinking by expressing doubt and ambiguity.

Heuristic #6: Judgment. System 1 thinking relies on intuition, a basic assessment of what is going on inside and outside the mind. It is prone to ignore “sum-like variables”—such as when dealing with mutually exclusive events or conditional probability events (Chapter 7).20 People often fail to accurately calculate sums but rely instead on often-unreliable intuitive averages by automatically and subconsciously rating the relative merits of a thing by matching dissimilar traits. Moreover, people are prone to evaluate a decision without distinguishing which variables are most important; this is called the “mental shotgun” approach.21 Basic intuitive assessments can easily replace the hard work System 2 thinking must do to make judgments. Potential for error—without a person assessing the process of their thinking, they may make bad decisions.

Heuristic #7: Substitution. When confronted with a perplexing problem, question, or decision, people tend to make life easier for themselves by answering a substitute or simpler question. In other words, instead of estimating the probability of a certain complex outcome, people rely on an estimate of another less-complex outcome. For example, instead of grappling with the mind-bending philosophical question, “What is happiness?” a person may resort to answering the easier question, “What is my mood right now?”22 Even though anxious people may activate System 2 thinking, they often still obsess over and second-guess every decision, fear, or risk. It is surprising how often System 1 thinking works just fine for highly anxious people. Even chronic worriers function effortlessly in many areas of life while System 1 thinking is running in the background. They walk, eat, sleep, breathe, make choices, make judgments, trust, and engage in enterprises without fear, worry, or anxiety. Why? They replace vexing problems with easier problems. Potential for error—people will be unlikely to get around to answering the harder questions.

Heuristic #8: Affect. Emotions influence judgment, which influences behavior. People frequently let their likes and dislikes influence their beliefs about the world.23 Potential for error—people can let their emotional preferences cloud their judgment and either underestimate or overestimate risks and benefits.

Heuristics Leading to Key Biases

Heuristic #9: Law of small numbers. Small statistical samples produce extreme outcomes more often than large samples do, yet people tend to lend the outcomes of small samples more credence than the statistics warrant. System 1 thinking is impressed with the outcome of small samples but should not be. Small samples are not necessarily representative of large samples, which usually are more precise. People err when they intuit rather than compute, which may add significant bias to their thinking.24 This is related to the Part-to-the-whole informal logic fallacy in Appendix I. Also see Chapter 3 for more information on sampling theory. Potential for error—people make decisions on insufficient data.
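To see why small samples mislead, consider a minimal simulation sketch in Python (not from the text; the sample sizes, the 50 percent true rate, and the 15-point threshold are arbitrary illustration values). Small polls routinely land far from the true rate, while large polls almost never do.

import random

# Illustration only: simulate polling a population in which exactly 50 percent
# hold some view, and compare how often small versus large samples land far
# from that true rate.
random.seed(1)

def observed_share(sample_size, true_share=0.5):
    """Observed proportion of 'yes' answers in one random sample."""
    hits = sum(1 for _ in range(sample_size) if random.random() < true_share)
    return hits / sample_size

for n in (10, 1000):
    shares = [observed_share(n) for _ in range(10_000)]
    extreme = sum(1 for s in shares if abs(s - 0.5) >= 0.15) / len(shares)
    print(f"sample size {n:4d}: samples at least 15 points off the true 50%: {extreme:.1%}")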

Heuristic #10: Confidence over doubt. System 1 thinking suppresses ambiguity and doubt by constructing coherent stories from pieces of data. System 2 thinking is a person’s inner skeptic, weighing those stories, doubting them, and suspending judgment. But, because disbelief requires lots of mental effort, System 2 thinking sometimes allows people to slide into a state of false certainty. Because the human brain is a pattern-recognition device, people tend to attribute causality where none exists. Regularities can occur at random. A run of 50 heads in a row seems unnatural, but if one were to flip a coin billions and billions of times, the odds are that 50 heads in a row would eventually happen. When people detect what appears to be a rule, they may reject the idea that the process is truly random.25 Attributing oddities to chance takes work; it is easier to attribute them to some intelligent force in the universe. Kahneman highlights how some outcomes may be due to blind luck.26 There are many facts in this world that occur by chance and do not lend themselves to explanations. Potential for error—making connections where none exist.
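A brief simulation sketch (not from the text; the sequence length of 200 flips and the run threshold of 8 are arbitrary choices) shows how often a seemingly meaningful streak appears in purely random coin flips.

import random

# Illustration only: how often does a "surprising" streak of identical outcomes
# appear in a purely random sequence of fair coin flips?
random.seed(2)

def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for previous, value in zip(flips, flips[1:]):
        current = current + 1 if value == previous else 1
        best = max(best, current)
    return best

trials = 1_000
streaky = sum(
    1 for _ in range(trials)
    if longest_run([random.random() < 0.5 for _ in range(200)]) >= 8
)
print(f"sequences of 200 fair flips containing a run of 8 or more: {streaky / trials:.0%}")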

Heuristic #11: Anchoring effect. This is the subconscious phenomenon of making incorrect estimates due to previously heard or seen quantities or information.27 For example, people feel as though 35 mph is fast if they’ve been driving 10 mph but slow if they just got off the freeway doing 65 mph. Or, buying a house for $200K seems high if the asking price was raised from $180K but low if the asking price was lowered from $220K. Potential for error—people are more susceptible to suggestions than they realize.

Note: There was a classic case of the Anchoring effect in the October 2002 D.C. Sniper case. One or more snipers were terrorizing the metropolitan Washington, D.C. area, randomly killing ten people and injuring three others. Reports from the crime scenes included several sightings of a “white van” fleeing most shootings, along with a few reports of a “blue sedan” fleeing the scenes. Because criminal profilers in this case anchored on the white van reports, local law enforcement focused its efforts primarily on finding a white van. In the end, two snipers were captured in a blue sedan.

Heuristic #12: Availability heuristic. When asked to estimate numbers like the frequency of divorces in Hollywood, the number of dangerous plants, or the number of deaths by plane crashes, the ease with which people retrieve an answer influences the size of their answer.28 People are prone to give bigger answers to questions that are easier to retrieve, especially when a person has had a related emotional personal experience. For example, a person who got mugged overestimates the frequency of muggings, one exposed to news about school shootings overestimates the number of gun crimes, and one who does chores at home overestimates the percentage of the housework he/she does. Potential for error—under or overestimating the frequency of an event based on ease of retrieval rather than statistical calculation adds bias to thinking.

Heuristic #13: Availability cascades. When media outlets report information that overwhelms a person’s statistical senses, his/her ability to objectively assess a situation can be distorted. For example, a recent plane crash can cause people to think air travel is more dangerous than car travel. This can then start a self-reinforcing feedback loop, which can create a cascade of fear. In other words, this can become a situation where the emotional tail wags the rational dog.29 Potential for error—overreacting to a situation or problem simply because people hear a disproportionate number of negative stories can bias thinking.

Heuristic #14: Representativeness. Similar to profiling or stereotyping, Representativeness is the intuitive leap to make judgments based on how similar something is to something a person likes; this is usually done without taking into consideration other factors such as probability (likelihood), statistics (base rate), or sample sizes. For example, just because a person likes the design of a book cover does not mean they will like the contents. To overcome biases generated by the Representativeness heuristic, people must discipline their intuition and make judgments based on probability and base rates. People should learn to question the facts or analysis used to come up with their assumptions. In other words, think like a statistician.30 Potential for error—evaluating a person, place, or thing on how much it resembles something else without taking into account other salient factors can add bias to thinking.

Heuristic #15: Conjunction fallacy. This heuristic is about violating logic and the laws of probability. When given a set of priming details—some true and some assumed—about a person, place, or thing, people often create a plausible story based on both facts and assumptions rather than a more probable story based on facts alone. The notions of coherence, plausibility, and probability are often confused by the unwary when faced with a combination of facts and assumptions, some of which may be false.31 The more assumptions added to a description, forecast, or judgment, the more likely the conclusions are plausible but improbable. Why? System 1 thinking overlooks logic in favor of a plausible story, whether based on facts or not. Potential for error—thinking can be biased when intuition favors what is plausible but improbable over what is less plausible but more probable.

Heuristic #16: Overlooking statistics. When given purely statistical data and familiar with how to use statistics, people generally make accurate inferences. But when given statistical data and an individual story that explains things, people tend to go with the story rather than the statistics; that is, people favor stories with explanatory power over mere data.32 This is related to the Part-to-the-whole informal logic fallacy in Appendix I. Potential for error—stereotyping, profiling, and making general inferences from a limited number of cases, rather than from a larger number of cases, can generate bias in findings and conclusions.
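A small worked example (the numbers are invented for illustration, not drawn from the text) shows how Bayes’ rule keeps the base rate in view when a vivid individual story tempts people to ignore it.

# Illustration only, with invented numbers: a screening test for a rare
# condition. The vivid "positive test" story tempts people to ignore the base
# rate; Bayes' rule keeps the statistics in view.
base_rate = 0.01            # 1% of the population actually has the condition
sensitivity = 0.90          # P(positive test | has condition)
false_positive_rate = 0.09  # P(positive test | does not have condition)

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(has condition | positive test) = {p_condition_given_positive:.1%}")
# Despite the compelling "90% accurate test" story, the answer is only about
# 9%, because the condition is rare to begin with.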

Heuristic #17: Overlooking luck. Most people love to attach causal interpretations to the fluctuations of random processes. Luck inevitably plays a role in many outcomes, but that is not a satisfying explanation for most people. People prefer a causal explanation—but often WYSIATI, what you see is all there is, comes into play.33 When a person removes causal stories and considers only the statistics, they often observe regularities, such as regression to the mean. Those statistical regularities—regression to the mean—are explanations but not causes. People tend to be strongly biased toward causal explanations and generally do not deal well with pure statistics.34 Potential for error—seeing causes that do not exist can bias thinking.
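A minimal simulation sketch (assumptions: performance is modeled as stable skill plus random luck, both drawn from standard normal distributions; nothing here comes from the text) shows regression to the mean arising with no causal story at all.

import random

# Illustration only: performance is modeled as stable skill plus luck, both
# standard normal. The top scorers in round 1 drift back toward the average
# in round 2 with no causal story required -- their extreme scores were
# partly luck.
random.seed(3)

skill = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [s + random.gauss(0, 1) for s in skill]
round2 = [s + random.gauss(0, 1) for s in skill]

cutoff = sorted(round1, reverse=True)[len(round1) // 10]   # rough top-10% cutoff
top = [(r1, r2) for r1, r2 in zip(round1, round2) if r1 >= cutoff]

avg_round1 = sum(r1 for r1, _ in top) / len(top)
avg_round2 = sum(r2 for _, r2 in top) / len(top)
print(f"top decile's average score: round 1 = {avg_round1:.2f}, round 2 = {avg_round2:.2f}")
# Round 2 falls back toward the overall mean of 0 purely by statistics.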

Heuristic #18: Intuitive predictions. Conclusions drawn with strong System 1 thinking often feed overconfidence. Just because something “feels right” (intuitive) does not make it right. System 2 thinking is needed to carefully examine intuition, estimate baselines, consider regression to the mean, evaluate the quality of evidence, and so forth. Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1 thinking.35 Potential for error—unwarranted confidence when the information, logic, or reasoning is in error can bias thinking.

Heuristic #19: Narrative fallacy. In their continuous attempt to make sense of the world, people often create flawed explanatory stories of the past that shape their points of view, assumptions, and beliefs for the future; see Chapter 6 for more on evaluating points of view, assumptions, and beliefs. People often assign larger roles to talent, stupidity, and intentions than to luck. A comforting conviction that the world makes sense rests on a secure foundation—people’s almost unlimited ability to ignore their own ignorance.36 This is particularly evident when a person hears, “I knew that was going to happen!” Potential for error—unwarranted comfort in conclusions can bias thinking.

Heuristic #20: Hindsight illusion. People often think they understand the past, which implies the future should be knowable; but they understand less than they think. This is likely because a person’s intuitions and premonitions feel truer after the fact. Once an event takes place, people forget what they believed prior to that event, before they changed their minds. For example, prior to 2008, some financial pundits predicted a stock market crash, but they did not know it would happen. Knowing requires showing something to be true, and no one could show the crash was coming because it had not happened yet. But after it happened, their hunches were retooled and became proofs. The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive Hindsight illusion.37 Potential for error—people are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. When outcomes are bad, customers often blame their agents (analysts) for not seeing the potential problem. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.38

Heuristics Leading to Overconfidence

Heuristic #21: Validity illusion. People sometimes confidently believe their opinions, predictions, and points of view are valid when that confidence is unwarranted. Some even cling with confidence to ideas in the face of counter-evidence. Confidence in a judgment is not always a reasoned evaluation of the probability that the judgment is correct; it is a feeling that reflects the apparent coherence of the information and the cognitive ease of processing it.39 Factors that contribute to overconfidence include being impressed by one’s own brilliance, affiliating with like-minded peers, and overvaluing one’s track record of wins while ignoring losses. Potential for error—basing the validity of a judgment on the subjective experience of confidence rather than objective facts creates bias. Unwarranted confidence is not a substitute for accuracy.

Heuristic #22: Ignoring facts and algorithms. People often overlook statistical information or other evidence and favor their gut feelings. Forecasting or predicting the future of stocks, diseases, car accidents, and weather should not be influenced by intuition, but it often is. People do well to consult facts, algorithms, checklists, statistics, and numerical records rather than rely on subjective feelings, hunches, or intuition. Potential for error—relying on intuitive judgments for important decisions, particularly when an algorithm or other tool is available that will make fewer mistakes, has the potential for bias.40

Heuristic #23: Trusting expert intuition. Some experts are confident when the story they tell comes easily to mind, with no contradiction and no competing story. However, ease and coherence do not guarantee that a belief held with confidence is true. System 1 thinking often suppresses doubt and evokes ideas and information that are compatible with the currently dominant story.41 Kahneman is skeptical of experts because they often overlook what they do not know. He trusts experts only when two conditions are met: (1) the expert is in an environment that is sufficiently regular to be predictable, and (2) the expert has learned these regularities through prolonged practice. Potential for error—being misled by “experts” is a frequent source of bias.

Heuristic #24: Planning fallacy. This fallacy refers to taking on a risky project confident of the best-case scenario without seriously considering the worst-case scenario. If people consult others who have engaged in similar projects, they will get an alternate perspective. Failure to consult others increases the risk of failure. Cost overruns, missed deadlines, loss of interest, and waning urgency all can result from poor planning. Potential for error—making decisions based on delusional optimism rather than on a System 2 thinking analysis of gains, losses, and probabilities.42 In other words, poorly planned projects have a high probability of failure.

Heuristic #25: Optimism bias. People are prone to neglect facts, others’ failures, and what they do not know in favor of what they know and their perception of how skilled they are. People often do not appreciate the uncertainty of their environment, believing that the outcome of their achievements lies entirely in their own hands while neglecting the luck factor. They suffer from the illusion of control and often neglect to look at the competition. Experts who fail to acknowledge the full extent of their ignorance can expect to be replaced by more confident competitors who are better able to gain the trust of customers.43 Being unsure can be perceived as a sign of weakness, so people may turn to confident experts who may be wrong. Potential for error—unwarranted optimism, which does not calculate the odds and therefore can be risky, can bias thinking and lead to failed actions.

Heuristic #26: Omitting subjectivity. People often think an object has only intrinsic objective value. A million dollars is worth a million dollars, right? Wrong! Magically making a poor person’s investment portfolio worth a million dollars would be fabulous! Magically making a billionaire’s investment portfolio worth a million dollars would be agony! One gained, the other lost. Economists have erred by failing to consider a person’s psychological state regarding value, risk, anxiety, or happiness. The 18th-century economist Bernoulli thought money had utility (fixed worth), but he failed to consider a person’s reference point.44 Potential for error—making decisions on pure logic without considering psychological states can bias thinking.

Heuristic #27: Theory-induced blindness. Once a person accepts a theory or model and uses it as a tool in their thinking, it is extraordinarily difficult to notice its flaws. If the person comes upon an observation that does not seem to fit a current theory or model, it is assumed that there must be a perfectly good explanation that was somehow missed.45 When the blinders fall off, the previously believed error seems absurd, and the real breakthrough occurs when the person cannot remember why he/she did not see the obvious. Potential for error—clinging to old paradigms (theories or models) that have outlived their validity may bias thinking.

Heuristics Affecting Decisions

Heuristic #28: Loss aversion 1. Kahneman’s claim to fame is Prospect Theory, which he created with his colleague Amos Tversky. After Tversky’s death, Kahneman won the 2002 Nobel Prize in Economics for their joint efforts and for Kahneman’s later work on cognitive influences on decision making (i.e., presented in his book Thinking, Fast and Slow). Economists previously believed the value of money was the sole determinant in explaining why people buy, spend, and gamble the way they do. Prospect Theory changed those beliefs by explaining three things. First, the value of money is less important than the subjective experience of changes in one’s wealth. In other words, the loss or gain of $500 is psychologically positive or negative depending on a reference point of how much money one already possesses. Second, people experience diminished sensitivity to changes in wealth. Losing $100 hurts more if they start with $200 than if they start with $1000. Third, people are generally loath to lose money. People like winning and dislike losing, and people almost certainly dislike losing more than they like winning.46 System 1 thinking compares the psychological benefit of gain (win) with the psychological cost of loss, with the fear of loss usually influencing behavior. Potential for error—passing up a decision for a sure win to avoid what a person thinks might be a possible loss, even when the odds are in favor of winning.
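A rough sketch of a prospect-theory-style value function illustrates the three points above. The functional form and the parameter values of roughly 0.88 and 2.25 are the ones commonly quoted from Tversky and Kahneman’s later (1992) estimates, not figures given in this appendix, so treat them as assumptions.

# Illustration only: a common textbook form of the prospect-theory value
# function. The exponent of 0.88 and the loss-aversion factor of 2.25 are the
# values often quoted from Tversky and Kahneman's 1992 estimates -- treat them
# as assumptions, not figures from this appendix.
def subjective_value(change, alpha=0.88, loss_aversion=2.25):
    """Subjective value of a gain or loss relative to the reference point."""
    if change >= 0:
        return change ** alpha
    return -loss_aversion * (-change) ** alpha

for amount in (100, -100, 500, -500):
    print(f"change of {amount:+5d} dollars -> subjective value {subjective_value(amount):+8.1f}")
# A loss of $100 feels roughly 2.25 times as bad as a gain of $100 feels good,
# and moving from +$100 to +$500 adds less than five times the value of the
# first $100 (diminished sensitivity).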

Heuristic #29: Loss aversion 2. Generally, people will work harder to avoid short-term losses than to achieve short-term gains.47 For example, a golfer may play it safe and putt for par to avoid bogeys (losing points by going over par) rather than being more aggressive and putting for birdies (gaining points by going under par). Contract negotiations stall when one party feels they are making more concessions than their opponent. Potential for error—biasing decisions by underestimating one’s own and others’ attitudes toward loss and gain, which are asymmetrical.

Heuristic #30: Endowment effect. An object a person owns and uses is more valuable to them than an object they do not own and do not use.48 People endow an object they own and use with significance and are unwilling to part with it for two reasons: they hate loss, and the object has a history with them. Thus, people will not sell a beloved, useful object unless a buyer offers a significant payment. Objects a person does not like or does not use will sell for less or may be given away. Potential for error—decisions to cling to objects for sentimental reasons can lead to considerable loss of income or other benefits.

Heuristic #31: Possibility effect. When highly unlikely outcomes are weighted disproportionately, people are under the influence of the Possibility effect.49 For example, while there may be a one in 10 million chance of winning the lottery, people rationalize that someone must win and buy a lottery ticket anyway because their loss would be only $2. Potential for error—decisions that discount the possibility or probability of an event often are risky and, in certain situations, have adverse consequences.

Heuristic #32: Certainty effect. The opposite of the Possibility effect, this heuristic concerns outcomes that are almost certain but are given less weight than their probability warrants.50 For example, lawyers often try to convince clients to take a plea bargain or settlement that is “less than perfect” rather than go to trial, even though the trial almost certainly would result in a victory for the client. In these cases, the lawyers are under the influence of a Loss aversion heuristic because there is always a chance a trial will be lost. Potential for error—decisions that discount a high probability of success often do not serve the client or decision maker well.

Heuristic #33: Expectation principle. What the Possibility effect and Certainty effect heuristics have in common is that the decision weights people assign to outcomes are not always identical to the probabilities of those outcomes occurring. This is contrary to the Expectation principle of wins and losses summarized below.51

HIGH PROBABILITY (Certainty effect)
  Gains: 95% chance to win $10,000. Fear of disappointment; risk averse; accept an unfavorable settlement.
  Losses: 95% chance to lose $10,000. Hope to avoid the loss; risk seeking; reject a favorable settlement.

LOW PROBABILITY (Possibility effect)
  Gains: 5% chance to win $10,000. Hope of a large gain; risk seeking; reject a favorable settlement.
  Losses: 5% chance to lose $10,000. Fear of a large loss; risk averse; accept a favorable settlement.

The above reveals that people attach values to gains and losses, and that the decision weights they assign to outcomes often differ from the actual probabilities. The decision maker’s aversion to risk influences the actual decision. A fourfold pattern of preferences, illustrated in the sketch following this list, usually accounts for the potential for error:

  • People are often risk averse when they look at the prospects of a large gain. They will lock in a sure gain and accept a less-than-expected value of the gamble.
  • When the potential gain is extremely large, such as a multi-million-dollar lottery ticket, the person is indifferent to the fact that their chance of winning is extremely small. Without the ticket they cannot win; but, with the ticket, they can at least dream.
  • This explains why people buy insurance. People will purchase insurance because they are buying protection and peace of mind.
  • This also explains why people take desperate gambles. They accept a high probability of making things worse for a chance of a slight ray of hope of avoiding the loss they are facing. This type of risk taking can turn a bad situation into a disaster.52
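The numerical sketch below compares the expected values of the four prospects in the table with the choices the fourfold pattern predicts; the sure settlement of $9,000 is an assumed figure added for illustration, not a number from the text.

# Illustration only: expected values of the four prospects in the table above,
# compared with an assumed sure settlement of $9,000 (that figure is added for
# the example and does not come from the text).
prospects = {
    "95% chance to win $10,000":  (0.95, +10_000),
    "95% chance to lose $10,000": (0.95, -10_000),
    "5% chance to win $10,000":   (0.05, +10_000),
    "5% chance to lose $10,000":  (0.05, -10_000),
}

for label, (probability, amount) in prospects.items():
    print(f"{label:28s} expected value = {probability * amount:+10,.0f}")

# The fourfold pattern says many people accept a sure +$9,000 rather than the
# first gamble (expected value +$9,500), yet gamble rather than accept a sure
# -$9,000 loss (the gamble's expected value is -$9,500). Decision weights, not
# expected values, are driving the choices.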

Heuristic #34: Overestimating the likelihood of rare events. It makes more sense to pay attention to things that are likely to happen (rain tomorrow) than to things that are less likely to happen (terrorist attacks, asteroid strikes, terminal illness, floods, fires, landslides, etc.).53 Under the influence of others, people tend to overestimate the probabilities of unlikely events, and thus tend to give too much weight to unlikely events in their decisions. This heuristic joins forces with the Availability cascade and Cognitive ease heuristics discussed above. Rather than choose an alternative with the highest likelihood, people are more likely to choose the alternative in a decision that is most recent and described by others with explicit vividness, repetition, and relative frequencies. Potential for error—decisions can be swayed by fear mongers who manipulate data in favor of their cause.

Note: U.S. intelligence analysts attempt to counter this heuristic by conducting low-probability/high-impact event analyses (Chapter 10).
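One way to picture the overweighting of unlikely events described in Heuristic #34 is the probability-weighting function often quoted from Tversky and Kahneman’s 1992 work; both the functional form and the gamma value of roughly 0.61 are assumptions for this sketch, not figures from the text.

# Illustration only: a probability-weighting function of the form often quoted
# from Tversky and Kahneman's 1992 work, with gamma of roughly 0.61 (both the
# form and the parameter are assumptions for this sketch).
def decision_weight(p, gamma=0.61):
    """Weight a stated probability the way an intuitive decision maker might."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(f"stated probability {p:.2f} -> decision weight {decision_weight(p):.2f}")
# Small probabilities are overweighted (a stated 1% is treated more like 5%),
# while near-certainties are underweighted (99% is treated more like 91%) --
# the pattern behind both the Possibility effect and the Certainty effect.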

Heuristic #35: Thinking narrowly. One way to decrease risk aversion is to think broadly, looking at the aggregate wins over many small gambles. Thinking narrowly, looking only at short-term losses, paralyzes decision making. But thinking broadly is non-intuitive; it is a System 2 thinking task that takes mental effort. People are wired by System 1 thinking to make irrational decisions (e.g., saying no to easy money or successes). The limit of individual human rationality is so stark, Kahneman calls it a “hopeless mirage.”54 Potential for error—decision making that passes by risks that could be in a person’s favor.

Heuristic #36: Disposition effect. Some people seem to have a System 1 thinking calculator in their head that keeps score not only of the potential gains and losses of a transaction but also of the emotional risks, rewards, and possible regrets of their decisions. The emotions that people attach to the state of their mental accounts often are not acknowledged in standard decision theory.55 People may be willing to sell money-earning stocks because it makes them feel like wise investors, but less willing to sell losing stocks because it is an admission of defeat. This is irrational, however, since a person would typically earn more by selling the losers and keeping the winners. Potential for error—inserting emotions into a decision-making process often results in losses.

Heuristic #37: Sunk cost fallacy. To avoid feeling bad about cutting their losses and being called a failure, people tend to throw good money after bad, stay too long in abusive marriages, and stay in unfulfilling careers.56 This is optimism gone haywire. Potential for error—once again, emotions can derail good decision making.

Heuristic #38: Fear of regret. This is an emotion people are familiar with as they strive to avoid making decisions that could lead to regret. However, people tend to be terrible at predicting how intense those feelings of remorse will be; it often hurts less than they anticipate. Similar conditions exist for the emotion of blame—where people avoid decisions when feeling they will be blamed for poor results.57 Potential for error— the bogeyman of emotion can also upend decisions leading to regret or blame.

Heuristic #39: Ignoring joint evaluations. People make decisions differently when asked to make them in isolation than when asked to make them in comparison with other scenarios. For example, a victim of a robbery will be awarded higher compensation by a jury when there are poignant factors involved (e.g., the victim was visiting a store for the first time), but will be awarded lower compensation if harmed in a shopping location they frequently visit. When the locations are compared (joint evaluation), people realize the victim’s location should be insignificant. A joint evaluation highlights a feature that was not noticeable in a single evaluation but is recognized as a decisive factor when detected.58 Potential for error—avoid making decisions in isolation. Instead, attempt comparison shopping, such as comparing sentences for crimes or comparing salaries for different jobs. Failure to conduct joint evaluations limits exposure to helpful norms.

Heuristic #40: Ignoring frames. How a problem is framed determines people’s choices more than purely rational considerations would imply. Far more drivers end up as organ donors when donation is the default and they must check a box to opt out than when they must check a box to opt in. People are more willing to pay extra for gas when using a credit card (versus cash) if the fee is framed as “loss of a cash discount” rather than an “added credit card surcharge.” Doctors prefer interventions described as having a “one-month survival rate of 90%” over interventions described as having a “10% mortality rate.” Both outcomes mean the same thing statistically, but the frame of “survival” has greater emotional value than “mortality rate.” Generally, the meaning of a sentence is formed by how people understand it. Reframing an idea or sentence requires mental effort, and System 2 thinking can be careless.59 Potential for error—thinking decisions are made in an objective bubble, when in fact there are subjective factors at work about which people are unaware.

Heuristics from the Experiencing Self and Remembering Self

Heuristic #41: Ignoring the two selves. People have an “experiencing self” and a “remembering self.” The latter usually takes precedence over the former as people remember their most recent experiences rather than consider experiences in the longer past. That is, a person could experience 13 days of vacation bliss; but, if on the 14th day things go bad, they tend to remember the vacation as negative, likely because more recent memories often override past experiences. For example, if a 40-minute blissful vinyl recording ends with a scratch, people tend to remember the scratch sound, not the 39 previous minutes of musical enjoyment. Confusing experience with the memory of it is a compelling cognitive illusion, and it is the substitution that makes people believe a past experience can be ruined. The experiencing self does not have a strong voice.60 Potential for error—basing decisions on the last experience or information revealed may bias decisions.

Heuristic #42: Peak end rule. How an experience ends seems to hold greater weight in people’s memory than how the experience was lived. Similar to the Ignoring the two selves heuristic, the Peak end rule is shorthand for remembering an experience mostly by how it felt at its most intense moment (the peak) and at its end, rather than by the experience as a whole.61 Potential for error—remembering mainly the peak and the end of a situation can bias the evaluation of the entire situation.
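A toy example (the minute-by-minute discomfort ratings are invented) shows how a peak-and-end summary can rank experiences differently than total discomfort does.

# Illustration only: two invented episodes rated minute by minute on a 0-10
# discomfort scale. The longer episode contains strictly more total discomfort,
# yet its peak-end summary (average of worst moment and final moment) is lower.
def peak_end(ratings):
    return (max(ratings) + ratings[-1]) / 2

short_episode = [6, 7, 8, 8]             # ends at its most uncomfortable point
long_episode = short_episode + [5, 4]    # same start, plus two milder minutes

for name, episode in (("short", short_episode), ("long", long_episode)):
    print(f"{name} episode: total discomfort = {sum(episode)}, "
          f"peak-end memory = {peak_end(episode):.1f}")
# The memory-based summary prefers the longer, objectively worse episode, which
# is why adding a milder ending can improve how an experience is remembered.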

Heuristic #43: Duration neglect. Another corollary of the Ignoring the two selves heuristic offers that the duration of an unpleasant or pleasant experience does not seem to be as important as the memory of how painful or pleasurable the experience was.62 Potential for error—remembering the overall pain or pleasure in a situation will bias the overall experience of the situation.

Heuristic #44: Narrative wholeness. When people evaluate how well their own and others’ lives have been lived, they do well to consider the whole narrative and not just the end. Because of the previous three heuristics, however, people are prone to devalue a person’s long, sacrificial, generous life if, at the end (or even after death), others discover episodes of the person’s selfishness, depravity, or other negative conduct. One’s life story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character.63 Biases can be inserted into an analysis by paying more attention to longevity than quality, making decisions based on how memorable an experience will be rather than how exciting and enriching it will be by itself, or experiencing a moment of pleasure and forfeiting a reputation of integrity. Potential for error—only remembering the end of an experience can bias the overall experience.

Heuristic #45: Valuing a remembering self over an experiencing self. Since most people rely on unreliable memories, they do well to keep in mind what their experiences were like during them, not just at the conclusion. A person’s emotional state is largely determined by what they attend to, and they are normally focused on their current activity and immediate environment.64 A person stuck in traffic can still be happy because they are in love, or a person who is grieving may still remain depressed while watching a comedy. Potential for error—negative consequences may occur when not paying attention to what a person is doing, letting experiences happen without reflection, and going with the flow with no attempt to alter their schedules, activities, or experiences.

Heuristic #46: Affective forecasting. Which factor leads to a happier life: duration or experiences? Would a 20-year life with many happy experiences be better than a 60-year life with many terrible experiences? Which would a person rather be—happy or old? People usually are terrible at predicting what will make them happy. They tend to substitute an easier question when asked the very difficult question, “Overall, how happy is your life?” Instead, they may answer the easier question, “How happy am I right now?” Thus, responses to broad questions about well-being should be given little weight.65 People tend to make decisions based on what they believe will make them happy in the future; but, when it is achieved, the happiness normally does not last. Potential for error—it is difficult for people to know their future selves.

Heuristic #47: Focusing illusion. Kahneman offers that nothing in life is as important as it is when a person is thinking about it.66 This means when people are asked to evaluate a decision, life satisfaction, or preference, they err if they focus on only one thing. How a person answers “What would make you happy?” depends on many factors, and rarely is one factor determinant. Yet people regularly focus on one issue—income, weather, health, relationships, pollution, etc.—and ignore other important factors. If a person is asked, “How much pleasure do you get from your car?” the answer depends on how much they value the stereo, mileage, looks, age, cost, comfortable seats, tilt steering wheel, and more. Generally, people’s evaluations rest on the heuristic that while they are thinking of a thing, they think better of it, forgetting how infrequently they actually think about those things (income, weather, health, stereo, mileage, looks, etc.). What initially strikes a person’s fancy is absorbed into daily living. People tend to adapt, acclimate, and experience the initial pleasure less intensely as time progresses. Potential for error—the remembering self is subject to a massive focusing illusion about the life that the experiencing self endures quite comfortably.67

Heuristic #48: Miswanting. Similar to the Focusing illusion, people often exaggerate the effect of a significant purchase or changed circumstances on their future well-being. Things that initially are exciting eventually lose their appeal.68 Potential for error—again, people do not know their future selves.

Notes

1 Daniel Kahneman, Thinking, Fast and Slow (New York, NY: Farrar, Straus and Giroux, 2011).
2 Ibid., 7.
3 This appendix is a revision of a work by Erik Johnson, “Book Summary: Thinking Fast and Slow,” Erik Reads and Writes, April 2014, https://erikreads.files.wordpress.com/2014/04/thinking-fast-and-slow-book-summary.pdf (accessed May 24, 2021).
4 This is not to be confused with the left-brain, right-brain abstract model. That model offers that the left brain is where facts are stored and organized and logic is applied to a person’s thinking, and that the right brain is where a person’s creativity and innovation reside. Today, medical imaging reveals there is no single “left brain” or “right brain,” but there are distinct sections of the brain that energize depending on whether a person is recalling facts and applying logic or being creative.
5 Kahneman, 28.
6 Ibid., 35.
7 Johnson, 2.
8 Kahneman, 46.
9 Ibid., 415.
10 Ibid., 417.
11 Ibid., 53.
12 Ibid., 62.
13 Ibid., 64.
14 Ibid., 76.
15 Ibid.
16 Ibid., 79.
17 Ibid., 81.
18 Ibid., 82.
19 Ibid., 85.
20 Ibid., 93.
21 Ibid., 95.
22 Ibid., 98.
23 Ibid., 103.
24 Ibid., 113.
25 Ibid., 115.
26 Ibid., 116.
27 Ibid., 127-128.
28 Ibid., 131-132.
29 Ibid., 140.
30 Ibid., 152.
31 Ibid., 159.
32 Ibid., 167.
33 Ibid., 179.
34 Ibid., 182.
35 Ibid., 191-194.
36 Ibid., 201.
37 Ibid., 203.
38 Ibid.
39 Ibid., 212.
40 Ibid., 229.
41 Ibid., 239.
42 Ibid., 252.
43 Ibid., 263.
44 Ibid., 269-277.
45 Ibid., 277.
46 Ibid., 281.
47 Ibid., 303-309.
48 Ibid., 292-299.
49 Ibid., 311.
50 Ibid.
51 Ibid., 312.
52 Ibid., 317-318.
53 Ibid., 323-325.
54 Ibid., 335.
55 Ibid., 343.
56 Ibid., 346.
57 Ibid., 346-349.
58 Ibid., 359.
59 Ibid., 367.
60 Ibid., 381.
61 Ibid., 382.
62 Ibid., 387.
63 Ibid., 386.
64 Ibid., 394.
65 Ibid., 399.
66 Ibid., 402.
67 Ibid., 406.
68 Ibid.
