Chapter 4
Cognitive Biases, Logic Fallacies, Bad Arguments
When dealing with people,
remember you are not dealing with creatures of logic, but
with creatures bristling with prejudice and motivated by pride and vanity.
American writer and lecturer Dale Carnegie highlights above how, when dealing with people, logic is often subverted by prejudice and self-interest. Most of us can see this situation around us every day. The last chapter explained how information (hopefully truthful), when combined with logic and reasoning, results in knowledge. Covering formal logic and the totality of scientific reasoning is beyond the scope of this book. Instead, this chapter investigates how frequently encountered cognitive biases (heuristics), informal logic fallacies, and bad arguments may derail the quality of a person’s thinking no matter how good the available information may be. When problems occur with logic and reasoning, the overall results of the thinking project become highly suspect. Below, bad thinking related to cognitive biases is covered first. With this material the reader is also introduced to the psychological theory of fast and slow thinking. Next the discussion turns to informal logic fallacies highly prevalent in bad human thinking. Finally, logical argumentation, a standard frequently violated in everyday discourse, is explored. The chapter ends with the analysis of a political policy-making case where misinformation, cognitive biases, informal logic fallacies, and bad argumentation all come into play.
Cognitive Biases
Cognitive biases tend to nullify valid arguments in problem-solving and decision-making. One dictionary definition of bias holds it is “a highly personal and unreasoned distortion of judgment.” Bias usually results from a person taking intellectual shortcuts or failing to think through an issue thoroughly. Bias is a deviation from the truth. Bias and the validity of human thinking are inversely related: increasing bias leads to decreasing validity. Bias has two major sources. First, bias may be cognitive in how a person’s brain works, i.e., how the brain has been “programmed” by all the influences in the person’s life (see Figure 3.1). Such biases may be present even if the person is aware of them. Second, bias may be personal, meaning it emerges from a person’s individual belief system (see Chapter 7). Cognitive and personal biases differ for each person. Biases influence mindsets, which subsequently affect the results of thinking.
The work of Nobel laureate Daniel Kahneman provides insight on the sources and effects of cognitive biases. Cognitive psychologist Kahneman and his colleague Amos Tversky explained how people do not apply the full force of reason to their thinking but instead utilize shortcuts called heuristics that bias decision processes.[1] Kahneman won the 2002 Nobel Prize in Economics by demonstrating how economic decisions are influenced by a combination of heuristics and are not simply based on rational monetary considerations (gains or losses) as previously thought. This refuted decades of economic-man theory, in which decision-making in the face of uncertainty was assumed to be based primarily on rational choices. Kahneman presents over 40 different heuristics (cognitive biases) in his book Thinking, Fast and Slow[2] and argues that one or more of these heuristics affect all human decision-making situations depending on the person and issue under consideration. These heuristics lead to cognitive biases that degrade the results of thinking and decision-making.
Kahneman makes use of an abstract psychological model to explain the influence of heuristics on human thinking. This abstract model offers two competing brain systems for human thought—System 1 (fast) Thinking and System 2 (slow) Thinking.[3] Figure 4.1 provides a summary of the two systems of thinking.
Figure 4.1 | Summary of Fast and Slow Thinking[4] |
| System 1 (fast) | System 2 (slow) |
|---|---|
| Uses subconscious values, drives, beliefs, and biologic influences, i.e., “gut feelings.” | Articulates judgments, makes choices, endorses or rationalizes ideas and feelings. |
| Jumps to conclusions regarding causality and creates stories to justify conclusions. | Confirms causality and creates stories to either confirm or deny conclusions. |
| Operates effortlessly. | Requires mental effort to engage. |
| Can be wrong but it is more often right. | Can be right or wrong depending on the level of thinking effort. |
| Heavily influenced by heuristics (biases). | Examines heuristics’ influences when so inclined. |
System 1 (fast) thinking governs most of a person’s everyday decision making and behavior. This type of thinking is not only fast, but also largely effortless, and takes place primarily in the subconscious. A System 1 thinker is good at data recall and looks for patterns and associations in data already known or recently found. That person also looks for situations governed by causality (but not always successfully). System 1 thinkers tend to create stories to explain events and strive to avoid cognitive dissonance by making their thinking more consistent. System 1 thinking is very intuitive (sometimes called a “gut feeling”). For example, System 1 is in action when people do not spend much mental effort deciding what clothes to wear each day. System 1 takes over in immediate crisis situations that lack time for deeper thinking, such as during a person’s immediate reactions (to brake, steer away, etc.) when a car unexpectedly pulls in front of them in traffic.
System 1 thinking is correct most of the time. It may not be correct, however, when the decision-making situation is not routine, is not an immediate crisis, or calls for complex thinking. System 1 thinkers often jump to conclusions based on poor assumptions. In other words, a person’s lack of mental effort drives most of their poor-thinking tendencies (see Figure 2.5). System 1 thinkers may rely too much on unhelpful emotional responses, which can result in bias. Kahneman argues that when System 1 thinking is not correct, one or more heuristics leading to cognitive bias (see Figure 4.2) can usually be identified at work.
For situations requiring more robust and complex thought, System 2 (slow) thinking is required. System 2 thinking requires conscious effort and is very deliberate. System 2 thinkers can handle abstract concepts and are grounded in evidence (data), logic, and reasoning. This system is good for employing advanced mathematics and statistics. System 2 is good for reductive thinking, meaning taking complex conceptual or empirical situations and reducing them to more understandable basic or simple models. The downside of System 2 thinking is that it requires significant mental effort, time, and energy—which can lead to analytic or decision-making fatigue. System 2 thinking can also become careless. There are times when System 1 thinking will defer a decision situation to System 2. But, because System 2 can be lackadaisical (lazy), it may simply accept the System 1 solution without taking the time and effort to complete a more robust analysis.
Most people successfully employ a combination of System 1 (fast) and System 2 (slow) thinking in their personal and professional decision-making. A person’s System 1 thinking techniques and tendencies have been developed and used since birth. A person’s ability to conduct System 2 thinking, by contrast, will depend on their level of education, intellectual curiosity, and flexibility in their thought processes.
Critical thinking is one method designed to improve System 2 thinking, as it provides systematic techniques for reducing the cognitive and personal biases of System 1 thinking. System 1 thinking still has its place in everyday human behavior and decision-making. However, to become a good problem-solver or decision-maker, the systematic techniques of critical thinking (see Chapters 2, 8) must be applied to improve System 2 thinking. Additionally, the person must be ready to expend the mental effort required to make System 2 thinking successful.
Figure 4.2 summarizes the most common types of cognitive biases in politics influencing System 1 thinking. Figure 4.2 is taken partly from Daniel Kahneman’s book, Thinking, Fast and Slow, which contains additional heuristics (cognitive biases) that may be found in human discourse. A summary of Kahneman’s heuristics may be found in Appendix II of Security Analysis, A Critical-Thinking Approach.[5]
Figure 4.2 | Selected Cognitive Biases Commonly Found in Politics[6] |
| Cognitive Bias | Description |
|---|---|
| Confirmation (Affirmation) | Accepting only evidence that supports a pre-formed point of view and rejecting evidence that contradicts it. This is likely the most prevalent and destructive cognitive bias in all societies. |
| Anchoring | Focusing on one piece of information to the exclusion of alternatives, especially new or conflicting information. |
| Cognitive Ease | Accepting material that is easier to compute, more familiar, and easier to read, making it seem truer than material that requires hard thought, is novel, or is hard to see. (Closely related to both confirmation and anchoring biases.) |
| Coherent Stories (Associative Coherence) | Making sense of the world by telling logically consistent stories about what is going on, such as making associations between people, events, circumstances, and regular occurrences. The more these events fit into their stories (even if not true), the more normal they seem. Stories often violate both logic and statistical probabilities. |
| Law of Small Numbers | Offering small samples, often with no source data (i.e., “many people say”), leading to giving the outcomes of small samples more credence than statistics warrant. |
| Representativeness (Stereotyping) | Explaining the opponent’s decisions or behaviors based on their ideology or other traits (e.g., political views, religion, ethnic group, language, country of origin, etc.). |
| Trusting Expert Intuition | Becoming confident when an expert’s story comes easily to mind, with no contradiction and no competing story. However, ease and coherence do not guarantee that a story is true. |
| Fundamental Attribution | Over-emphasizing the personality-based agency explanations (opponent’s internal traits such as motivation, beliefs, decision-making tendencies, etc.) over structural explanations (political culture, laws and regulations, organizational or bureaucratic influences, other outside structural influences, etc.). |
| Blind-Spot Bias | Being unaware of and failing to consider your own personal biases, even as you recognize biases in others. |
Confirmation (affirmation) and anchoring biases are the most pernicious when combined with a condition known as cognitive dissonance, the mental distress a person experiences when holding two competing ideas at once. Educators purposely create cognitive dissonance in their students, because working through the competing ideas results in learning. In politics, however, rather than do the hard critical thinking needed to work out the mental conflict, a person will more commonly just reject one of the ideas, usually the newest one, relieving their mental distress by accepting the older and more familiar idea or the one that most closely coincides with their existing worldview or situational perspectives. Cognitive dissonance is closely related to the cognitive ease bias. When one or more of the cognitive biases in Figure 4.2 are present, the resulting thinking is degraded (biased).
Informal Logical Fallacies
When cognitive biases are combined with logic fallacies the result is a witch’s brew of poor information use and flawed logic and reasoning. Logic fallacies result in defective arguments.[7] Formal logic fallacies are usually easy to identify due to the recognizable false premises or logic presented.[8] Formal logic fallacies result from invalid evidence, bad assumptions, or poor reasoning that do not guarantee the truth of a corresponding statement, finding, or conclusion.[9] Informal logic fallacies, our focus in this section, display these same degraded thinking characteristics but are harder to identify because they are so common in society, where bad reasoning often leads to equally bad arguments and decisions. Many people have become desensitized to informal logic fallacies as they are rampant in human discourse and are constantly reinforced through their frequent use in the news media, editorials, entertainment media, marketing and advertising, political discourse, and personal conversations. In these situations, individuals are trying to convince an audience of the correctness of their points of view, perspectives, findings, or recommendations. Avoiding both formal and informal logic fallacies is critical to ensuring valid findings in written and verbal arguments.
Informal logic fallacies leading to defects in reasoning may be separated into four categories: (1) fallacies of ambiguity, (2) fallacies of relevance, (3) fallacies of presumption, and (4) fallacies of weak induction.[10] These categories are discussed in greater detail below. Just watch the political news for a night or two and it is more likely than not it will be filled with informal logic fallacies that add bias to arguments.[11]
Fallacies of Ambiguity
Fallacies of Ambiguity: Fallacies resulting in defective arguments due to problems with the wording or sentence structure of a statement.
Equivocation: Fallacies that result from different meanings of a word in an argument or using different definitions of words to support an argument.
Example—Politician A: “I seek a structure of Democratic-Socialism, where citizens share in the benefits of our strong economy.”
Politician B: “See, Politician A is a socialist and will turn our country into an authoritarian state just like the former Soviet Union.”
(Notice the two politicians are using different definitions of socialism—Politician A the Democratic-Socialist structure employed in many modern democratic states, and Politician B the Authoritarian-Socialism structure of the former Soviet Union and other authoritarian socialist states. See Chapter 6 for a more in-depth discussion of different definitions of “socialism.”)
Accent: Fallacies committed by either emphasizing certain words such that their meaning shifts or shifting a statement’s meaning by using it partly out of context.
Example—Politician A: “I agree with the Second Amendment to the U.S. Constitution that gives the right to citizens to bear arms, but I also support common-sense gun safety laws, including restricting certain citizens (mentally ill, convicted criminals, minors, etc.) so they do not have access to guns.”
Politician B: “See, Politician A does not support the Second Amendment.”
(The two politicians have different perspectives on the Second Amendment. Full context is not provided of either Politician A’s or Politician B’s views on the Second Amendment.)
Whole-to-part (division): Fallacies that assert what is true of something as a whole must also be true of each of its parts.
Example—NATO is the most powerful defense force alliance on Earth. Estonia is a member of NATO. So, Estonia must have one of the most powerful defense forces on Earth.
(Estonia is a small European country on the Baltic bordering Russia and does not have powerful defense forces, which is why it joined the NATO alliance.)
Part-to-whole (composition): Fallacies that assert what is true of part of something also must be true of the whole thing. Similar to the law of small numbers bias in Figure 4.2.
Example—Saudi Arabia is one of the richest nations on Earth. Saudi Arabia belongs to the Arab League; so, the Arab League is made up of the richest nations on Earth.
(Not all members of the Arab League are rich, nor are all rich nations on Earth members of the Arab League.)
Fallacies of Relevance
Fallacies of Relevance: Fallacies in which the reasons given are not pertinent to the truth or falsehood of a statement.
Appeal to force: Fallacies resulting from an improper or inappropriate threat.
Example—Politician: “Country A must contribute its fair share to the security alliance. After all, they are currently allowed to be part of the alliance.” (Implied threat that Country A may be expelled from the alliance if it does not pay its fair share. This draws on the emotion of being threatened if the action stated or implied is not carried out—also known as blackmail!)
Appeal to fear: Fallacies that move a person, country, etc., to fear the consequences of not doing what the other person, country, etc., wants.
Example—Politician: “If you vote for the opposing party our economy will be destroyed, unrestricted immigration will raise crime rates, and we will no longer be a democracy.” (Draws on the emotion of fear—with no accurate supporting evidence—to convince voters not to vote for the opposing party.)
Personal attack (Ad hominem attack): Fallacies created by attacking an opponent’s character or their motives for believing something instead of disproving their argument. There are three types of fallacies of personal attacks: abusive, circumstantial, and tu quoque (you too).
Abusive: Fallacies where there is a direct attack on an opponent’s character.
Example—Politician: “The President says better relations with Country Z benefit our country. But, the President is a known liar and con-man so there is no reason to believe him.” (The politician attacks the President’s character and not the argument of whether there will be benefits from improving relations with Country Z.)
Circumstantial: Fallacies where aspects of the opponent’s circumstances are given as a reason not to support the argument.
Example—Politician A: “We must make use of better technology to secure our borders.”
Politician B: “We cannot believe Politician A because her family owns companies that may provide and install the technology, and thus financially benefit from Politician A’s statement. So, we cannot take her argument seriously.”
(The fact that politician A’s family owns technology companies does not refute her argument of how better technology would secure the borders.)
Tu quoque (You too): Fallacies caused by dismissing an opponent’s viewpoint or behavior on an issue because they were inconsistent on the same thing in the past.
Example—Country Z invades Country A and seizes portions of Country A territory. Country Z justifies its actions as protecting Country A citizens who speak Country Z’s native language. The invasion was after Country Z previously made statements supporting the territorial integrity of Country A on numerous past occasions. Therefore, Country Z’s statements and behaviors are inconsistent and must be rejected.
(Just because a past statement or action was inconsistent with recent actions does not invalidate Country Z’s current justification—but its recent actions must still be evaluated as to whether they are illegal, unethical, or immoral.)
Mob appeal (Appeal to the people): Fallacies that play on people’s emotions by claiming a viewpoint is correct because many other people agree with it.
Example—Politician: “My opponent says we must better protect the borders because criminal gang members are entering the country illegally and committing murder, rape, and a host of other crimes. Many candidates in my opponent’s political party cite these same conditions. FBI statistics, though, say there are only a handful of crimes conducted by criminal gang members entering the country illegally, while per-capita crime rates are much higher for native-born citizens.”
(Avoid accepting conditions or viewpoints that “many (or some) people say or know” without seeking reliable statistics or other evidence concerning the stated conditions.)
Accident: Fallacies wherein a general principle (legal, ethical, or moral) is applied in a situation where the principle does not apply.
Example—Interrogator: “This terrorist admits he has planted a large “dirty bomb” set to explode in two days in a major city. Normal interrogation techniques have not convinced the terrorist to reveal the city and location of the bomb. Therefore, we need to immediately employ enhanced interrogation techniques, including torturing the terrorist, to hopefully save thousands of citizens.”
(It is illegal to torture the terrorist even if the lives of thousands of citizens are threatened. But in this case the ethical or moral implications of not ordering the torture must be considered.)
Stereotyping: Fallacies arguing that an opponent’s decisions or behaviors are based on their ideology or other traits (e.g., political views, religion, ethnic group, language, country of origin, etc.). Similar to the representativeness bias in Figure 4.2.
Example—Politician: “All communists want to overthrow democracies and destroy capitalism. Since President Z is a communist, he wants to overthrow all democracies and destroy capitalism.”
(The statement “all communists want to overthrow democracies and destroy capitalism” is invalid.)
Genetic fallacy: Fallacies condemning an argument because of where it began, how it began, or with whom it began (a type of stereotyping).
Example—Speaker: “All persons born in Country Z want that country to rule the World. My next door neighbor here in Country A was born in Country Z and has lived here 20 years. Therefore, my next door neighbor wants Country Z to rule the World.” (The statement “all persons from Country Z want that country to rule the World” is invalid.)
Straw person (Strawman): Fallacies that distort the opponent’s point of view or stance on an issue to make it easier to attack and disprove the opponent’s arguments; thus, the attack is really about a point of view or stance that does not exist.
Example—Politician A: “Increasing terrorist attacks are causing us to change our views of civil liberties such as free speech and privacy. Our country’s policies and actions should weigh slightly more on the side of improved security to keep our citizens safe, which will likely impinge on their civil liberties. Our citizens will support the slight loss of some civil liberties to gain more personal security.”
Politician B: “Politician A is advocating a Machiavellian approach saying the desirable ends (more security) justify the means (impinging on civil liberties). This could lead to situations where in the interest of more security we will all be the subject of intrusive government surveillance, and we could all be locked up in detention camps if the government decides it needs to protect us.”
(Politician B distorted the original argument and then provided a “worst case” scenario not related to Politician A’s original statements about slight changes in civil liberties.)
Red herring: Fallacies that introduce an irrelevant point into an argument. Someone may think (or want people to think) it proves their point, but it really does not. Introducing material not related to the core question or argument is included in this fallacy. (This fallacy takes its name from the British practice of dragging a bag of red herring across the fox’s trail in a fox hunt to distract the foxhounds off the actual trail of the fox. Red herring is similar to the straw person fallacy.)
Example—Reporter to President: “Will you start arms control talks with Country Z?” President’s response: “It would be terribly presumptuous for us to discuss arms control talks with Country Z when we have not yet talked to them about the issue. Most countries see arms control treaties as one way to help reduce violence and conflict in search of peace. I am sure Country Z has aspirations for peace like other countries.”
(The President did not answer the original question but offered a red herring response on a related issue (world peace) that she probably thought did answer the question.)
Bandwagon: Fallacies that pressure someone to do something just because many other people are doing it. This is similar to the mob appeal (appeal to the people) fallacy.
Example—Politician: “All politicians are reading the biography of President A; so, you should read it too if you want to be able to talk with other politicians.”
(It may not hurt to read the biography, but you could still talk with other politicians if you did not.)
Irrelevant conclusion: Fallacies in which conclusions are reached bearing little resemblance to the supporting argument.
Example—Politician: “We have put in a lot of hard work in preparing for the upcoming elections. Therefore, we are going to win the election in a landslide.”
(A lot of hard work does not always result in the desired outcome, but it increases the probabilities of success.)
Repetition: Fallacies based on repeating a message loudly and often in the hope that it will eventually be believed.
Example—President: “I am not a crook! I am not a crook! I am not a crook! I have never been involved in anything illegal. I am not a crook! I will repeat this statement every day until the charges against me are dropped. I am not a crook!”
(Just because a message or idea is repeated frequently does not make it true.)
Appeal to tradition: Fallacies that result from encouraging someone to buy a product or do something because it is associated with something old.
Example—Politician: “We must buy the new rapid-fire Gatling guns because the Gatling gun has been a major part of our success in every war over the last 125 years.”
(Just because something worked previously does not mean it is the best choice for the future.)
Appeal to hi-tech (Latest thing): Fallacies based on urging someone to buy or support something because it is the “latest thing,” but not necessarily because it is the best thing.
Example—Politician: “We need to procure the latest laser missile defense systems for our Navy’s ships as they are the most advanced technology available.”
(Just because they are the latest technology does not mean they will work better than existing missile defense systems.)
Fallacies of Presumption
Fallacies of Presumption: Fallacies focusing on how assumptions used in reasoning do not support the argument being made.
Circular reasoning: Fallacies supporting a conclusion by simply restating it in the same or similar wording. Someone says P is true because Q is true, and Q is true because P is true. Not to be confused with circular reporting discussed in Chapter 3.
Example—Major: “We know our counterinsurgency doctrine is true, because it was written by our most inspired General. And, we know she is our most inspired General, because she wrote the counterinsurgency doctrine.”
(Notice the argument both assumes and concludes the counterinsurgency doctrine is true, thus using circular reasoning.)
Complex question (Loaded question): Fallacies resulting from loaded questions, where the respondent is put in a bad situation no matter the answer, or the answer to the presumed question is false.
Example—Politician A to Politician B: “When did you stop stealing your campaign funds?”
(This assumes Politician B was in fact stealing campaign funds, which might not be true.)
Suppressed evidence: Fallacies that result from withholding relevant evidence. Similar to omission of information discussed in Chapter 3.
Example—Politician: “If we want to get rid of chemical weapons, just take them to a barren desert and bury them. There will be no chemical weapons after that.”
(These statements suppress the relevant evidence on the effects of burying chemical weapons on the desert’s ecosystem and the ability to produce more chemical weapons.)
Either-or (False dichotomy): Fallacies asserting that we must choose between only two things, when in fact there are a number of different alternatives we could choose.
Example—General to President: “We either need to conduct a full-on military assault or do nothing in reaction to Country Z’s aggressive actions.”
(Statement does not consider alternative diplomatic or lesser military options.)
Fallacies of Weak Induction
Fallacies of Weak Induction: Fallacies resulting from the use of poor evidence and weak reasoning that do not support the findings or conclusions.
Appeal to authority (Illegitimate authority): Fallacies due to an appeal to someone who has no special knowledge or expertise in the area they are discussing or due only to tradition or rumors.
Example 1—Politician: “We must invade Country Z because Singer A, the most famous singer in the world, says we should.”
(Singer A is outside their area of expertise.)
Example 2—Voter: “I must vote for all candidates from Political Party A because my family has voted for all candidates from this party over the past five decades.”
(This is a play to tradition and does not allow the voter to become an informed citizen or critical thinker.)
Example 3—Voter: “Politician A reported many studies (no specifics) showing his foreign policy agenda is the best.”
(No information on exactly what the studies say or if they even exist.)
Proof by lack of evidence (Appeal to ignorance): Fallacies claiming something is true simply because nobody has yet given any evidence to the contrary.
Example—Politician: “Country Z must have laser weapons as we have seen no evidence such weapons are located in Country Z.”
(If there is no evidence whether a statement is true or false, it is best to suspend judgment regarding its truth.)
Hasty generalization: Fallacies using a very limited sample to generalize to a larger group or set of actions. Similar to the part-to-whole fallacy and law of small numbers bias in Figure 4.2.
Example—Admiral: “The shipboard surface-to-air missile exercise failed to destroy a simulated inbound enemy aircraft, so we must remove all of these missiles from the fleet.”
(Just because one missile failed does not mean they will all fail; more testing is needed.)
False cause: Fallacies resulting from a false causal claim. Some false causal claims are due to myths and superstitions. See false causality discussed in Chapter 3.
Example—Politician: “The crime rate in my home town, located only 100 miles from the U.S. border, has doubled in the last year, showing our immigration policies are not working.”
(There is no verifiable data or studies that show immigration policies allowing the entry of more immigrants cause rising crime rates.)
Slippery slope: Fallacies asserting that if one thing happens, one or more other things will follow, when there is no evidence to support the follow-on actions. This is a specific type of the false cause fallacy.
Example—Containment of the spread of Communist Soviet Union influence was a major part of U.S. strategy in the Cold War. The Domino Theory was part of this strategy as it offered: “if one country fell to Communism, then another would fall, then another, and another, just like a line of dominoes would fall in sequence if lined up on-end in a row and one is pushed over.”
(Domino Theory was seldom questioned. It was a classic example of an invalid slippery slope argument as there were no causal mechanisms supporting this theory. In fact, whether a country fell to communism was related to its internal political, economic, and social conditions, and had little to do with domino-like forces pushing one country over after another.)
Weak analogy: Fallacies claiming that some items or events that have only a few similarities are practically the same in almost everything else. This is especially true when trying to analyze current or future behaviors or situations based on past behaviors or situations, because there may be significantly different contexts between the past and current or future behaviors or situations. (This is why incorporating analogies into analyses is not recommended.)
Example 1—Politician: “The U.S. Military Academy (West Point) and U.S. Naval Academy (Annapolis) should merge into one institution as they both graduate future military leaders.”
(Other than both graduating future military leaders, the two academies are very different in terms of their curriculums, service customs, procedures used to socialize cadets, organizational cultures, and the focus of preparing graduates for future careers: one produces U.S. Army soldiers and aviators, the other U.S. Navy sailors and aviators, plus U.S. Marine Corps infantry and aviators.)
Example 2—U.S. Presidential Advisor during 1962 Cuban Missile Crisis, when short and intermediate range nuclear missiles were found in Cuba: “We can expect the Soviets to behave in the exact same ways they did in the Greek Revolution and during their interventions in Czechoslovakia and Hungary as they will in Cuba. The Soviets only understand brute force, so we must attack Cuba!”
(The U.S. proximity to Cuba and the situation with nuclear weapons make the context of these past situations and the current (1962) Cuban situation completely different. In fact, the president did not take the advisor’s advice, and the crisis was solved peacefully.)
Post hoc ergo propter hoc: Fallacies stating that since A happened before B, A must have caused B. This is similar to the false cause fallacy.
Example—Politician: “Our political party was born over 100 years ago. Last year we won a war while our political party controlled the executive and legislative branches of government. Therefore, the formation of our political party caused us to win the war.”
(There are a number of factors other than political party influencing the outcome of a war. Remember: Correlation does not mean causation.)
Exigency: Fallacies offering nothing more than a time limit as a reason for a person to do what someone wants.
Example—Politician: “We need to pass the defense budget with the new shipbuilding authorizations by tomorrow or we may not get to it for months, since we have summer recess and national elections coming up next.”
(Shipbuilding plans take years to create and even longer for construction—a few months delay is not much in the larger scheme.)
Leap of faith: Fallacies asserting a causal linkage or condition exists with no good supporting reasons or evidence.
Example—Politician A: “We estimate that terrorist group A will attack our country soon.”
Politician B: “What are your reasons and evidence for that finding?”
Politician A: “Despite the terrorist group never stating it would attack our country, it does have the capabilities and logistic support to do so. Although it has never attacked us before, it has attacked our allies’ interests. We therefore estimate it will attack us soon, even though we are 4,000 miles away from its normal operating area. It must be planning attacks on our country!”
(More evidence and reasoning is needed to declare a threat than just that it exists in a distant region or may have done so in the distant past.)
Logical Argumentation
An additional indicator as to whether information, logic, and reasoning are valid and if a source is using at least the basics of critical thinking is to assess the structure of written or oral arguments. A critical thinker should use a logical-argumentation framework (described below) for evaluating the results of their own or others’ thinking efforts. If logical argumentation is not present, or is incorrectly applied, it is a major indicator the information, logic, and reasoning are being influenced more by falsehoods, emotions, feelings, unsubstantiated opinions, cognitive biases, and informal logic fallacies, rather than being based in a more comprehensive critical-thinking effort. Whether using natural science techniques, historicist (formal logic) methods, or behaviorist (social science) techniques, most academic disciplines and professions champion the use of logical argumentation combining information (data, facts, evidence) with basic logic and reasoning to reach contentions (theses, key judgments, findings, conclusions, and recommendations). This is also called evidentiary reasoning. Logical argumentation is a technique for organizing and presenting the results of a thinking effort—not necessarily for conducting the actual thinking process itself (see Chapters 2, 8). Logical argumentation is meant to persuade others of the validity of the arguments and contentions.
The logical argumentation technique should be familiar to most people as it mirrors methods taught in secondary and post-secondary schools to write research papers and present written and verbal arguments. A primary difference between academic and professional use of logical argumentation concerns how in the professions the logical argument contentions (findings, conclusions, etc.) are placed at the start of a written or verbal presentation (called the bottom-line-up-front) and not left to an ending summary or conclusion as used in most academic work. Lawyers are familiar with this technique as it is how they build their case arguments. Journalists use logical argumentation to organize and present their reporting. Historians use this general technique, organized using chronologies, to allow readers to follow the historical flow of their arguments. Academic debaters use a similar form of logical argumentation. Natural and social scientists also use logical argumentation to organize and present their deductive research results.
To assist with a logical argumentation presentation, the person should create an argument map as shown in Figure 4.3.[12] The argument map format allows the diagramming of an argument. The contention (bottom-line-up-front) is placed at the top of an argument map. The contention should be created only after completion of the critical-thinking effort (i.e., consisting of combining information (data, facts, evidence) with logic and reasoning) and then placed into the argument map framework. The argument map could cover the entire thinking project or just support a single, complex finding, all depending on the nature of the thinking project. Using the hierarchical structure shown in Figure 4.3, the findings (analyses) supporting the contention, already identified through the critical-thinking process, are presented. Figure 4.3 depicts only one finding; but, in situations where there are more supporting findings (usually the case), the most important should be listed starting on the left and moving right on the argument map. The most important findings are then on the far left of the map. Each finding is supported with reasons (information, logic, reasoning), in addition to any individual objections related specifically to the finding.
In Figure 4.3, major objections address objections to the specific contention and are placed to the right on an argument map; again, the importance of major objections also runs left to right. Major objections should address alternative contentions not selected and must be included in the argument map and final report. These objections help moderate the effects of confirmation bias, where only evidence and reasons supporting pre-formed points of view may be presented, while overlooking evidence and reasons contrary to these pre-formed points of view. Each major objection will have its own supporting evidence and reasons. Additionally, rebuttals to either the major objection itself or to the evidence and reasons supporting the major objection help explain why the major objection was not considered further or included in the argument’s findings or contention.
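For readers comfortable with a more structured representation, the hierarchy in Figure 4.3 can also be sketched as a simple data structure. The sketch below is purely illustrative and assumes a Python-style representation; the element names (contention, findings, reasons, objections, major objections, rebuttals) come from Figure 4.3, while the class and field names themselves are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Finding:
    """A finding (analysis) supporting the contention."""
    statement: str
    reasons: List[str] = field(default_factory=list)      # information, logic, and reasoning
    objections: List[str] = field(default_factory=list)   # objections specific to this finding


@dataclass
class MajorObjection:
    """An objection to the overall contention, with its own support and rebuttals."""
    statement: str
    reasons: List[str] = field(default_factory=list)      # evidence and reasons for the objection
    rebuttals: List[str] = field(default_factory=list)    # why the objection was set aside


@dataclass
class ArgumentMap:
    """Contention (bottom-line-up-front) plus findings and major objections,
    each ordered by importance (left to right in Figure 4.3)."""
    contention: str
    findings: List[Finding] = field(default_factory=list)
    major_objections: List[MajorObjection] = field(default_factory=list)
```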
Figure 4.4 is an example of a logical argumentation analysis with a corresponding argument map.[13] In this example, the question asks: “Should voting be compulsory in political elections?” Figure 4.4 is an Australian example, where compulsory voting exists, but could apply to democracies anywhere. The Figure 4.4 contention (recommendation) is that “Voting should be compulsory.”
As presented in Figure 4.4, the contention has two findings and one major objection. The first (and highest priority) finding is “Compulsory voting ensures that the Government is representative.” The second supporting finding is “Political parties do not have to waste money on persuading people to turn up to the voting booth.” The major objection offers “Compulsory voting is an infringement on democratic principles.” The evidence and reasons, objections, and rebuttals to the two findings and major objection are seen in Figure 4.4. While somewhat simplistic, this example reveals the power of a logical argumentation framework to help organize and present the results of a thinking effort.
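Expressed with the hypothetical structure sketched above, the Figure 4.4 example would be recorded roughly as follows; the detailed evidence, objections, and rebuttals shown in the figure are omitted here for brevity.

```python
# The Figure 4.4 compulsory-voting example in the sketch's terms. Only the
# contention, findings, and major objection quoted in the text are included.
voting_argument = ArgumentMap(
    contention="Voting should be compulsory.",
    findings=[
        Finding("Compulsory voting ensures that the Government is representative."),
        Finding("Political parties do not have to waste money on persuading people "
                "to turn up to the voting booth."),
    ],
    major_objections=[
        MajorObjection("Compulsory voting is an infringement on democratic principles."),
    ],
)
```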
Logical argumentation is of little use if a good prior critical-thinking effort is missing. This can lead to three main weaknesses. First, the argument can be weak when a person fails to collect all the truthful information (data, facts, evidence) on a situation. This means the person must collect information, assess its truthfulness (see Figure 3.7), and then use the information in their thinking. People may discount information that does not support pre-formed views of the resultant contention (confirmation bias). They may leave out information pertinent to the major objections that does not support their findings. Poor logical argumentation may also result when the thinker knows a customer for the analysis has a pre-formed view of the situation and he/she builds an argument to support this pre-formed view (this is bad analysis). The context of the situation may be misunderstood when all truthful information is not collected or included in the argument map.
Second, a logical argument may be weak if a person fails to develop all the alternatives available in the situation under study. Most situations have alternative courses of action. Some alternatives may be found in the information search, some may be identified when conceptualizing (theorizing) the study, and some may be the result of creative-thinking efforts (covered in Chapter 2). Not including all alternatives is also a common problem seen when the thinker is preparing an analysis for a customer with a pre-formed view of the results.
Third, a logical argument may be weak if the person employs several reasoning flaws in reaching their conclusions. They may apply formal logic mistakes or informal logic fallacies (such as ad hominem (personal) attacks). The reasoning may also include an array of cognitive biases, with the most prominent being the confirmation bias mentioned above. Thus, if a person is not working from a critical-thinking framework, these weaknesses will be noticeable in the presentation of their arguments. Following is an example of how an argument may be assessed for validity using the above tenets.
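Before turning to that example, note that a few of the structural symptoms of these weaknesses (findings with no supporting reasons, no major objections, major objections with no rebuttals) can be screened mechanically. The sketch below, which builds on the earlier hypothetical ArgumentMap classes, is only an illustration of such a screening step; it cannot judge whether the information itself is truthful or the reasoning sound.

```python
def quick_validity_flags(arg: ArgumentMap) -> List[str]:
    """Flag structural gaps in an argument map (illustrative checks only)."""
    flags: List[str] = []
    if not arg.findings:
        flags.append("Contention has no supporting findings.")
    for f in arg.findings:
        if not f.reasons:
            flags.append(f"Finding lacks supporting reasons: {f.statement!r}")
    if not arg.major_objections:
        flags.append("No major objections considered (possible confirmation bias).")
    for mo in arg.major_objections:
        if not mo.rebuttals:
            flags.append(f"Major objection has no rebuttal: {mo.statement!r}")
    return flags
```

Applied to the opposition argument in Figure 4.5 below, a check like this would immediately flag the missing finding objections and the absence of major objections with rebuttals noted in the analysis.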
Logical Argumentation: U.S. Politics Example
In August 2022, the president signed into law the Inflation Reduction Act (IRA). The Act was meant to reduce inflation, address climate change, decrease health care costs, and reduce the federal budget deficit. The Act passed both houses of Congress along strict party lines. Senate budget reconciliation rules allowed a simple majority vote to pass the bill, bypassing the normal Senate 60 percent majority requirement to pass most bills.
One section of the IRA allocated $80 billion over 10 years to the Internal Revenue Service (IRS) to improve the agency’s performance, including collecting more of the estimated $600 billion in annual unpaid taxes. Over the previous decade, federal budgets had decreased IRS employees from 92,000 to 76,000, resulting in the agency’s degraded abilities to provide good customer service to all taxpayers and to conduct audits, especially on wealthy citizens whose tax returns were highly complex. The current administration estimated wealthy citizens (those making more than $400,000 a year) were the largest source of unpaid taxes. The IRA $80 billion additional funding allocated to the IRS was estimated by the administration to increase federal revenue and decrease the federal budget deficit by collecting around $204 billion in unpaid taxes over 10 years. The additional IRA funding would allow IRS to hire additional employees, including those assigned to auditing (accountants and tax law specialists), customer service representatives, and information technology (IT) specialists. The additional funding would also allow the IRS to upgrade their IT systems, some dating to the 1960s, which would better identify possible tax evaders. Opposition politicians immediately began a campaign to undermine the IRS budget increase.[14]
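Taken at face value, the administration’s own figures imply a simple back-of-the-envelope net effect on the deficit. The arithmetic sketch below uses only the amounts cited above; the ten-year net figure is merely the difference between them, not a number from the Act itself.

```python
# Back-of-the-envelope arithmetic using only the figures cited above (billions of USD).
ira_irs_funding_10yr = 80          # additional IRS funding over 10 years
projected_collections_10yr = 204   # administration estimate of added tax collections
net_deficit_reduction_10yr = projected_collections_10yr - ira_irs_funding_10yr
print(net_deficit_reduction_10yr)  # 124, i.e., roughly $124 billion over 10 years
```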
Figure 4.5 provides the argument map for assessing the opposition’s arguments opposing the IRS budget increase. This was not the argument of a single Senator or House member. Instead, senior opposition leaders first developed the main tenets (talking points) of this argument; it was then revised by other Senate and House members and eventually used by opposition party elected officials and supporters down through the federal, state, and local levels, including spreading the argument on partisan Internet, social media, cable news, and other media outlets. As the analysis below reveals, the Figure 4.5 argument is invalid. However, due to the thousands of times the Figure 4.5 argument was repeated, it underwent a process driven by the repetition logic fallacy, a condition where false information is repeated so often that the audience (opposition supporters) begins to believe it is true, even when it is a blatantly flawed argument.
The Figure 4.5 opposition contention was “[t]he IRA budget increase for the IRS must be opposed and reversed.” This contention rests on three main findings, all of which can be assessed as misinformation (see below). Figure 4.5 is also missing objections to individual findings and major objections with rebuttals to the overall contention. At a minimum, the opposition should have offered a major objection concerning the president’s point of view of how the IRA would increase federal revenue and decrease the federal budget deficit. Any argument with no underlying reasons (or questionable reasons), no individual finding objections, or no major objections with rebuttals, should be suspect from the start. An assessment of each individual finding in Figure 4.5 includes:
Finding 1 offers, “[t]he IRA funds an ‘army’ of 87,000 new IRS agents.” In fact, the IRA did not mention the number of new IRS employees. On the IRA’s passage, the Secretary of the Treasury directed the IRS Commissioner to determine how best the IRS could use the $80 billion to hire additional employees and upgrade IRS IT systems. In addition to countering a 10-year decline in IRS employee numbers, the IRS reportedly faced the estimated retirement of 50,000 of its current 76,000 employees over the next 5 years. The reason the opposition gave for this finding was traced to a 2021 presidential budget proposal, also calling for an $80 billion IRS budget increase, that said the IRS would hire 86,852 additional employees with the increased funding. It appears this 2021 budget proposal, which was never passed, did not account for the full range of needs: not only additional IRS auditors, but also the hires needed to improve the IRS’s degraded customer service and the dire need to upgrade its aging IT systems. This opposition reasoning is a weak analogy logic fallacy, as it claims that the mid-2022 IRS budget specifics are the same as in early 2021. The weak analogy fallacy is often used when trying to analyze current or future situations based on past situations, even though there may be significantly different contexts between the past and the current or future situations. This was the case here, as in mid-2022 the new presidential administration had a better understanding of the context of problems within the IRS and thus did not use the 86,852 number from the early-2021 budget proposal. This is why finding 1 is not supported by good evidence and is thus no more than misinformation.
Finding 2 states, “[a]ll of the 87,000 new IRS agents will be armed.” Several opposition officials claimed (wrongly) the current administration was increasingly stockpiling weapons and ammunition for the IRS, while creating a new armed law enforcement agency it could use to threaten the freedom and civil liberties of U.S. citizens. No evidence or reason was offered by the opposition supporting this finding. In fact, the IRS had a current force of around 2,100 armed criminal investigators (special agents), making it one of the many U.S. agencies with law enforcement powers. The IRS special agents are involved in criminal investigations, often with other U.S. law enforcement agencies, into financial crimes such as tax evasion, money-laundering, corruption, fraud, organized crime, and other criminal acts. There was no specific mention in the IRA of the hiring of more special agents—although it is logical some may be hired as the IRS expands. It appeared the main personnel needs of the IRS were for additional auditors, more customer service representatives, and additional IT personnel. Finding 2 is pure misinformation (lies) with no associated evidence or logic—a case with characteristics of gaslighting. This also points out how confirmation bias can be exploited in political arguments. Many opposition supporters were distrustful of U.S. government law enforcement agencies and frequently emphasized how their freedoms and civil liberties were being violated. Because such supporters are pre-programmed to distrust the U.S. government and ready to accept information that supports their existing perspectives, statements such as finding 2 are readily accepted and internalized by those who want to believe them, even if they are not true.
Finding 3 declares, “[t]he new IRS employees will squeeze more taxes out of those earning $400,000 or less and from small businesses.” In fact, the Secretary of the Treasury directed the IRS Commissioner not to plan for increasing audits on those earning $400,000 or less. Additional tax enforcement on small businesses is also not mentioned in any of the IRA or IRS literature. The only evidence for this finding is an opposition argument that the wealthy are responsible for only 4 to 9 percent of unpaid taxes. This opposition-cited statistic was credited to the Congressional Joint Committee on Taxation (JCT). However, estimates of the sources responsible for the greatest amounts of U.S. tax evasion differ substantially across government agencies and think tanks. In 2021, the Department of the Treasury (DOT) published estimates that the wealthiest top 10 percent of U.S. taxpayers are responsible for 64.2 percent of the tax gap—the difference between taxes paid versus those that should be paid.[15] This equates to $385 billion of unpaid taxes annually. The top 1 percent of the wealthiest taxpayers account for 28 percent of the tax gap, estimated at $160 billion of unpaid taxes annually.[16] Citizens must be especially wary of information that cannot be confirmed across multiple sources. The opposition logic supporting this finding appears to be: (1) since the intent of the IRA is to increase IRS tax collection, then (2) the IRS will target those with the highest rates of unpaid taxes (according to the JCT estimates), i.e., those earning $400,000 or less (which is contrary to the DOT published estimates and DOT directions to the IRS). In addition to playing on the confirmation bias of opposition supporters as described in finding 2 above, this finding draws on the straw-person (strawman) logic fallacy, where misinformation is used to distort the opponent’s point of view or stance on an issue to make it easier to attack and disprove the opponent’s arguments. The $400,000 earning level was politically sensitive to the president, as he had vowed during his presidential campaign and after inauguration that his administration would not increase taxes on any person (or couple) earning $400,000 or less. By promoting the questionable JCT statistics and using the faulty logic cited above, the opposition party hoped to discredit the president’s compliance with the promises he made to U.S. taxpayers, while also protecting their wealthy campaign donors from tax audits (their hidden agenda).
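A similar back-of-the-envelope check can be applied to the competing statistics in finding 3. Assuming the roughly $600 billion annual tax gap cited earlier in this chapter, the Treasury (DOT) percentages and dollar amounts are at least internally consistent, as the sketch below illustrates; the point is the habit of checking claimed figures, not the precision of any single estimate.

```python
# Rough consistency check of the DOT figures cited above (billions of USD).
annual_tax_gap = 600                       # approximate annual unpaid taxes cited earlier
top10_share, top10_dollars = 0.642, 385    # top 10 percent of taxpayers
top1_share, top1_dollars = 0.28, 160       # top 1 percent of taxpayers

print(round(top10_share * annual_tax_gap))  # 385, matching the cited $385 billion
print(round(top1_share * annual_tax_gap))   # 168, close to the cited $160 billion
```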
Thus, the argument presented in Figure 4.5 is invalid due to a combination of misinformation, application of cognitive biases, and employment of informal logic fallacies, not to mention the violations of good logical argumentation. Another problem with the Figure 4.5 argument is that its overall intent consisted of an appeal to fear informal logic fallacy. This fallacy is intended to move a person to fear the consequences of a situation or behavior. The Figure 4.5 findings of creating a new U.S. government armed force and of the IRS targeting for audits those making $400,000 or less and small businesses are classic appeal to fear tactics. One opposition leader went so far as to tell prospective IRS employees not to take the new IRS jobs, as the opposition would defund them when it regained control of Congress in the future. In fact, when the opposition took control of the House of Representatives in January 2023, the first bill passed by the new majority was to reduce the IRS budget by $70 billion over 10 years. Even though this bill passed in the House, it had no chance of passing in the Senate or overcoming a presidential veto. One media report characterized the Figure 4.5 argument as an attempt to cast the IRS as a “boogeyman”—a classic appeal to fear tactic. The fact that Figure 4.5 is an actual U.S. political argument, one that lacked critical thinking and was completely invalid but is still believed by millions of U.S. citizens, should make you wonder about the overall quality of U.S. political discourse. Thus, it is important that thinkers assess the information, cognitive biases, informal logic fallacies, and structure of the logical argument when conducting System 2 (slow) analyses.
Notes
Jonathan Haber, Critical Thinking (Cambridge, MA: The MIT Press, 2020), 29-31.↑
Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).↑
This differs from the more popular abstract theory of “Left Brain and Right Brain,” which science has refuted: new brain-scan technology shows that, depending on a person’s mental activity and the type of thinking, both left and right sections of the brain may be engaged. ↑
Kahneman.↑
Michael W. Collier, Security Analysis, A Critical-Thinking Approach (Richmond, KY: Eastern Kentucky University Libraries Encompass, 2023), https://encompass.eku.edu/ekuopen/6/ or https://manifold.open.umn.edu/projects/security-analysis.↑
Modified from Kristan J. Wheaton (seminar presentation at the annual conference of the International Association for Intelligence Education, Erie, PA, July 14-16, 2014), supplemented by Kahneman, Thinking Fast and Slow.↑
Material for this section was synthesized primarily from Noel Hendrickson et al., The Rowman and Littlefield Handbook for Critical Thinking (Lanham, MD: Rowman and Littlefield Publishers, Inc., 2008), 111-126; and Nathaniel Bluedorn and Hans Bluedorn, The Fallacy Detective, Thirty-Six Lessons on How to Recognize Bad Reasoning, 2nd ed. (Muscatine, IA: Christian Logic, 2003), 205-208. ↑
See Patrick J. Hurley, A Concise Introduction to Logic, 10th ed. (Belmont, CA: Thomson Wadsworth, 2008). ↑
Informal logic fallacies include flaws in the evidence or reasoning of an argument and may be found in both inductive and deductive arguments. ↑
Hendrickson et al., 111.↑
All examples prepared by the author, often modifying examples in Hendrickson et al. and Bluedorn and Bluedorn. ↑
Modified from Reasoninglab, “Argument Mapping,” https://www.reasoninglab.com/argument-mapping/ (accessed July 25, 2018).↑
Ibid.↑
David Lawder, “Republicans call it an ‘army’ but IRS hires will replace retirees, do IT, says Treasury,” Reuters, August 19, 2022, https://www.yahoo.com/news/republicans-call-army-irs-hires-100822880.html (accessed August 20, 2022). ↑
Natasha Sarin, Deputy Assistant Secretary for Economic Policy, “The Case for a Robust Attack on the Tax Gap.” U.S. Department of the Treasury, Featured Stories, September 21, 2021, (home.treasury.gov) (accessed January 1, 2023). Statistics cited are from Jason DeBacker et al, “Tax Noncompliance and Measures of Income Inequality,” Tax Notes Federal, February 17, 2021. ↑
Ibid.↑