“Appendix I Informal Logic Fallacies” in “Security Analysis Text”
Appendix I
Informal Logic Fallacies
Introduction
Logic fallacies result in defective arguments.1 Formal logic fallacies are usually easy to identify due to the recognizable false premises presented.2 Formal logic fallacies result from invalid evidence, bad assumptions, or poor reasoning that do not guarantee the truth of a corresponding statement, finding, or conclusion.3 Informal logic fallacies may display these same corrupt characteristics, but are harder to identify because they are so common in society, where bad reasoning often leads to equally bad arguments. Many people have become desensitized to informal logic fallacies as they are rampant in human discourse and are constantly reinforced through their frequent use in the news media, editorials, entertainment media, marketing and advertising, political discourse, and personal conversations. In these situations, individuals are trying to convince an audience of the correctness of their points of view, perspectives, findings, or recommendations. Avoiding both formal and informal logic fallacies is critical to ensuring the validity of written security analysis reports and verbal briefings.
Informal logic fallacies leading to defects in reasoning may be separated into four categories: fallacies of ambiguity, fallacies of relevance, fallacies of presumption, and fallacies of weak induction.4 Fallacies of ambiguity result in defective arguments due to problems with the wording or sentence structure of a statement. Fallacies of relevance concern cases in which the reasons given are not pertinent to the truth or falsehood of a statement. Fallacies of presumption focus on how assumptions used in reasoning do not support the argument being made. Fallacies of weak induction result from the use of poor evidence and weak reasoning that do not support the findings or conclusions. Definitions and examples of specific informal logic fallacies under these four categories are presented in this appendix.5
Fallacies of Ambiguity
Equivocation. Fallacies that result from different meanings of a word in an argument (see Example 1), or using different definitions of words to support an argument (see Example 2).
Example 1—Sergeant: “I am going to turn you into a responsible soldier.” Soldier: “I am already responsible. Whenever something goes wrong around here, I am always held responsible.” (Notice the different meanings of the word responsible.)
Example 2—Politician A: “I seek a structure of democratic socialism, where citizens share in the benefits of our economic output.” Politician B: “See, Politician A is a socialist and will turn our country into an authoritarian state.”
(Notice the two Politicians are using different definitions of socialism—one the democratic socialist structure employed in many modern democratic states, the other the authoritarian socialism of the Soviet Union and other authoritarian states.)
Amphiboly. Fallacies based on loosely constructed sentences.
Example—“Today we will celebrate the one-hundred fiftieth anniversary of the Battle of Gettysburg in Washington D.C.” (Was the Battle of Gettysburg fought in Washington D.C., or will the celebration be held in Washington D.C.?)
Accent. Fallacies committed due to either emphasizing certain words such that their meaning shifts (see Example 1) or a statement’s meaning is shifted by using it partly out of context (see Example 2).
Example 1—Soldier: “It is a general principle a soldier should seek to get all the military training and education they can. So, I do not have to go to college for an education.” (Military education and college education are not necessarily the same, but most soldiers are encouraged to seek college degrees.)
Example 2—Politician A: “I agree with the Second Amendment to the U.S. Constitution that gives the right to citizens to bear arms, but I also support common-sense gun safety laws, including restricting certain citizens (mentally ill, convicted criminals, minors, etc.) so they do not have access to guns.” Politician B: “See, Politician A does not support the Second Amendment.” (Full context not provided of Politician A’s or B’s views on the Second Amendment.)
Whole-to-part (division). Fallacies that assert what is true of something as a whole must also be true of each of its parts.
Example—NATO is the most powerful defense force alliance on Earth. Estonia is a member of NATO. So, Estonia must have one of the most powerful defense forces on Earth. (Estonia is a small European country on the Baltic bordering Russia and does not have powerful defense forces, which is why it is in the alliance.)
Part-to-whole (composition). Fallacies that assert what is true of part of something also must be true of the whole thing.
Example—Saudi Arabia is one of the richest nations on Earth. Saudi Arabia belongs to the Arab League; so, the Arab League is made up of the richest nations on Earth. (Not all members of the Arab League are rich.)
Fallacies of Relevance
Appeal to force. Fallacies resulting from an improper or inappropriate threat.
Example—Politician: “Country A must contribute its fair share to the security alliance. After all, they are currently allowed to be part of the alliance.” (Implied threat that Country A may be thrown out of the alliance if it does not pay its fair share. This draws on the emotion of being threatened if action stated or implied is not carried out—also known as blackmail!)
Appeal to fear. Fallacies that move a person, country, etc., to fear the consequences of not doing what the other person, country, etc., wants.
Example 1—Country A: “Our security alliance must conduct a pre-emptive attack on Country Z, or other members of the alliance may be attacked soon by Country Z.” (Draws on the emotion of fear to incorrectly support an argument.)
Example 2—Politician: “If you vote for the opposing party our economy will be destroyed, unrestricted immigration will raise crime rates, and we will no longer be a democracy.” (Draws on the emotion of fear—with no supporting evidence—to convince voters not to vote for the opposing party.)
Personal attack (Ad hominem attack). Fallacies created by attacking an opponent’s character or their motives for believing something instead of disproving their argument. There are three types of fallacies of personal attacks: abusive, circumstantial, and tu quoque (you too).
Abusive. Fallacies where there is a direct attack on an opponent’s character.
Example—Politician: “The President says better relations with Country Z benefit our country. But, the President is a known liar and cheat so there is no reason to believe him.” (The politician attacks the President’s character and not the argument of whether there will be benefits by improving relations with Country Z.)
Circumstantial. Fallacies where aspects of the opponent’s circumstances are given as a reason not to support the argument.
Example—Politician A: “We must make use of better technology to secure our borders.”
Politician B: “We cannot believe Politician A because her family owns companies that may provide and install the technology, and thus financially benefit from Politician A’s statement. So, we cannot take her argument seriously.” (The fact that Politician A’s family owns technology companies does not refute her argument of how better technology would secure the borders.)
Tu quoque (You too). Fallacies caused by dismissing an opponent’s viewpoint or behavior on an issue because they were inconsistent on the same thing in the past.
Example—Country Z invades Country A and seizes portions of Country A territory. Country Z justifies its actions as protecting Country A citizens who speak Country Z’s native language. The invasion was after Country Z previously made statements supporting the territorial integrity of Country A on numerous past occasions. Therefore, Country Z’s statements and behaviors are inconsistent and must be rejected. (Just because a past statement or action was inconsistent with recent actions does not invalidate the current justification by Country Z—even if its recent actions are illegal, unethical, or immoral.)
Mob appeal (Appeal to the people). Fallacies that play on people’s emotions by claiming a viewpoint is correct because many other people agree with it.
Example—Politician: “My opponent says we must better protect the borders because criminal gang members are entering the country illegally and committing murder, rape, and a host of other crimes. Many candidates in my opponent’s political party cite these same conditions. FBI statistics, though, say there are only a handful of crimes conducted by criminal gang members entering the country illegally, while per-capita crime rates are much higher for native-born citizens.” (Avoid accepting conditions or viewpoints that “many (or some) people say or know” without seeking reliable statistics or other evidence concerning the stated conditions.)
Pity. Fallacies that result from urging someone to do something based solely on an emotional appeal to pity—whether pity for the arguer, pity for something associated with the arguer, or pity sought by the arguer.
Example—Recruit: “Sergeant, I tried, but I have never passed the obstacle course. If you do not pass me, I will not graduate from recruit training. My great-grandfather, grandfather, and father all served in the military with honor. If I do not pass recruit training I will disappoint my family and it will ruin my entire life.” (The Sergeant must weigh what is legally, ethically, and morally the best decision for this recruit and the service.)
Accident. Fallacies wherein a general principle (legal, ethical, or moral) is applied in a situation where the principle does not apply.
Example—Interrogator: “This terrorist admits he has planted a large ‘dirty bomb’ set to explode in two days in a major city. Normal interrogation techniques have not convinced the terrorist to reveal the city and location of the bomb. Therefore, we need to immediately employ enhanced interrogation techniques, including torturing the terrorist, to hopefully save thousands of citizens.” (What legal, ethical, or moral principles do or do not apply in this case?)
Stereotyping. Fallacies arguing that an opponent’s decisions or behaviors are based on their ideology or other traits (e.g., political views, religion, ethnic group, language, country of origin, etc.).
Example—Politician: “All communists want to overthrow democracies and destroy capitalism. Since President Z is a communist, he wants to overthrow all democracies and destroy capitalism.” (The statement “all communists want to overthrow democracies and destroy capitalism” is invalid.)
Genetic fallacy. Fallacies condemning an argument because of where it began, how it began, or with whom it began (type of stereotyping).
Example—Speaker: “All persons born in Country Z want that country to rule the World. My next door neighbor here in Country A was born in Country Z and has lived here 20 years. Therefore, my next door neighbor wants Country Z to rule the World.” (The statement “all persons from Country Z want that country to rule the World” is invalid.)
Straw person (Strawman). Fallacies that distort the opponent’s point of view or stance on an issue to make it easier to attack and disprove the opponent’s arguments; thus, the attack is really about a point of view or stance that does not exist.
Example—Person A: “Increasing terrorist attacks are causing us to change our views of civil liberties such as free speech and privacy. Our country’s policies and actions should weigh slightly more on the side of improved security to keep our citizens safe, which will likely impinge on our civil liberties. Our citizens will support the slight loss of some civil liberties to gain more personal security.”
Person B: “Person A is advocating a Machiavellian approach saying the desirable ends (more security) justify the means (impinging on civil liberties). This means in the interest of more security we will all be the subject of intrusive government surveillance, and we could all be locked up in detention camps if the government decides it needs to protect us.” (Person B distorted the original argument and then provides a “worst case” scenario not related to Person A’s original statements about slight changes in civil liberties.)
Red herring. Fallacies that introduce an irrelevant point into an argument.
Someone may think (or want people to think) it proves their point, but it really does not. Introducing material not related to the core argument is included in this fallacy. This fallacy takes its name from the British practice of dragging a bag of red herring across the trail in a fox hunt to draw the foxhounds away from the actual trail of the fox. Red herring is similar to the Straw person fallacy.
Example—Reporter to President: “Will you start arms control talks with Country Z?” President’s response: “It would be terribly presumptive for us to discuss arms control talks with Country Z when we have not yet talked to them about the issue. Most countries see arms control treaties as one way to help reduce violence and conflict in search of peace. I am sure Country Z has aspirations for peace like other countries.” (The President did not answer the original question but offered a Red herring response on a related issue (world peace) that she probably thought did answer the question.)
Bandwagon. Fallacies that pressure someone to do something just because many other people are doing it. This is similar to the Mob appeal (Appeal to the people) fallacy.
Example—Soldier: “All soldiers are reading the biography of General A; so, you should read it too if you want to be able to talk with other soldiers.” (It may not hurt to read the biography, but you could still talk with other soldiers if you did not.)
Irrelevant conclusion. Fallacies in which conclusions are reached bearing little resemblance to the supporting argument.
Example—Sailor: “We have put in a lot of hard work in preparing for the upcoming shipboard inspections and operational readiness evaluations. Therefore, we are going to pass with flying colors.” (A lot of hard work does not always result in the desired outcome, but it increases the probabilities of success.)
Repetition. Fallacies based on repeating a message loudly and often in the hope that it will eventually be believed.
Example—President: “I am not a crook! I am not a crook! I am not a crook! I have never been involved in anything illegal. I am not a crook! I will repeat this statement every day until the charges against me are dropped. I am not a crook!” (Just because a message or idea is repeated frequently does not make it true.)
Appeal to tradition. Fallacies that result from encouraging someone to buy a product or do something because it is associated with something old.
Example—General: “We must buy the new rapid fire gatling guns because the gatling gun has been a major part of our success in every war over the last 100 years.” (Just because something worked previously does not mean it is the best choice for the future.)
Appeal to hi-tech (Latest thing). Fallacies based on urging someone to buy something because it is the “latest thing,” but not necessarily because it is the best thing.
Example—Admiral: “We need to procure the latest laser missile defense systems for our ships as they are the most advanced technology available.” (Just because they are the latest technology does not mean they will work better than existing missile defense systems.)
Fallacies of Presumption
Circular reasoning. Fallacies attempting to support a conclusion by simply restating it in the same or similar wording. Someone says P is true because Q is true, and Q is true because P is true.
Example—Major: “We know our counterinsurgency doctrine is true, because it was written by our most inspired General. And, we know she is our most inspired General, because she wrote the truthful counterinsurgency doctrine.” (Notice the argument both assumes and concludes the counterinsurgency doctrine is true.)
Complex question (Loaded question). Fallacies resulting from loaded questions, where the respondent is put in a bad situation no matter the answer, or the answer to the presumed question is false.
Example 1—Politician A to Politician B: “When did you stop stealing your campaign funds?” (This assumes Politician B was in fact stealing campaign funds, which might not be true.)
Example 2—Sergeant: “Where did you get that ridiculous idea?” Soldier: “I saw it in my dreams last night.” Sergeant: “So you admit it is a ridiculous idea! We need a lot less sleeping around here.” (The Sergeant assumes it is a ridiculous idea, which might not be true.)
Suppressed evidence. Fallacies that result from withholding relevant evidence.
Example—Politician: “If we want to get rid of chemical weapons, just take them to a barren desert and bury them. There will be no chemical weapons after that.” (These statements suppress the relevant evidence on the effects of burying chemical weapons on the desert’s ecosystem and the ability to produce more chemical weapons.)
Either-or (False dichotomy). Fallacies asserting that we must choose between only two things, when in fact there are a number of different alternatives we could choose.
Example—General to President: “We either need to conduct a full-on military assault or do nothing in reaction to Country Z’s aggressive actions.” (Statement does not consider diplomatic or alternative, lesser military options.)
Fallacies of Weak Induction
Appeal to authority (Illegitimate authority). Fallacies due to an appeal to someone in authority who has no special knowledge in the area they are discussing, or to an appeal to tradition or rumor.
Example 1—Politician: “We must invade Country Z because Singer A, the most famous singer in the world, says we should.” (Singer A is outside their area of expertise.)
Example 2—Voter: “I must vote for all candidates from Political Party A because my family has voted for all candidates from this party over the past five decades.” (Play to tradition.)
Example 3—Voter: “Politician A reported many studies (no specifics) showing his foreign policy agenda is the best.” (No information on exactly what studies say or if they even exist.)
Proof by lack of evidence (Appeal to ignorance). Fallacies claiming something is true simply because nobody has yet given any evidence to the contrary.
Example—General: “Country Z must have laser weapons, as we have seen no evidence proving such weapons are not located in Country Z.” (If there is no evidence whether a statement is true or false, it is best to suspend judgment regarding its truth.)
Hasty generalization. Fallacies using a very limited sample to generalize to a larger group or set of actions. Similar to the Part-to-whole fallacy.
Example—Admiral: “The shipboard surface-to-air missile test failed to destroy an inbound enemy aircraft, so we must remove all of these missiles from the fleet.” (Just because one missile failed does not mean they will all fail; more testing is needed.)
False cause. Fallacies resulting from a false causal claim. Some false causal claims are due to myths and superstitions.
Example—Petty Officer: “Nautical superstitions say to never whistle on a ship, as whistling aboard ship will bring strong winds, placing the ship and crew in danger. One of our new sailors whistles onboard all the time. Sure enough, on our next deployment we ran smack into a hurricane.” (Science cannot prove whistling will generate a hurricane.)
Slippery slope. Fallacies asserting that if one thing happens, one or more other things will follow, when there is no evidence to support the follow-on actions.
Example—Containment of the spread of Communist Soviet Union influence was a major part of U.S. strategy in the Cold War. The Domino Theory was part of this strategy as it offered: “if one country fell to Communism, then another would fall, then another, and another, just like a line of dominoes would fall in sequence if lined up in a row.” Domino Theory was seldom questioned. It was a classic example of an invalid Slippery slope argument as there was no causal basis supporting this theory. In fact, whether a country fell to communism was related to its internal political, economic, and social conditions, and had little to do with domino-like forces pushing one country over after another. (This is another case related to the False cause fallacy.)
Weak analogy. Fallacies claiming that some items or events that have only a few similarities are practically the same in almost everything else. This is especially true when trying to analyze current or future behaviors or situations based on past behaviors or situations, because there may be significantly different contexts between the past and current or future behaviors or situations. (This is why incorporating analogies into security analysis is not recommended.)
Example 1—Politician: “The U.S. Military Academy (West Point) and U.S. Naval Academy (Annapolis) should merge into one institution as they both graduate future military leaders.” (Other than both graduating future military leaders, the two academies are very dissimilar in terms of their curriculums, customs used to socialize cadets, organizational cultures, and the focus of preparing graduates for future careers: one produces U.S. Army soldiers, the other U.S. Navy sailors and aviators, plus U.S. Marine Corps infantry and aviators.)
Example 2—U.S. Presidential Advisor during 1962 Cuban Missile Crisis, when short and intermediate range nuclear missiles were found in Cuba: “We can expect the Soviets to behave in the exact same ways they did in the Greek Revolution and during their interventions in Czechoslovakia and Hungary as they will in Cuba. The Soviets only understand brute force, so we must attack Cuba!” (The U.S. proximity to Cuba and the situation with nuclear weapons make the context of these past situations and the current (1962) Cuban situation completely different; in fact, President Kennedy did not take the advisor’s advice, and the crisis was solved peacefully.)
Post hoc ergo propter hoc. Fallacies stating since A happened before B, A must have caused B. This is similar to the False cause fallacy.
Example—Politician: “Our political party was born over 100 years ago. Last year we won a war while our political party controlled the executive and legislative branches of government. Therefore, the formation of our political party caused us to win the war.” (There are a number of other factors other than political party influencing the outcome of a war. Remember: Correlation does not mean causation.)
Exigency. Fallacies offering little more than a time limit as a reason for a person to do what someone wants.
Example—Politician: “We need to pass the defense budget with the new shipbuilding authorizations by tomorrow or we may not get to it for months, since we have summer recess and national elections coming up next.” (Shipbuilding plans take years to create and even longer for construction— a few months delay is not much in the larger scheme.)
Leap of faith. Fallacies asserting a causal linkage or condition exists with no good supporting reasons or evidence.
Example—Analyst: “We estimate that terrorist group A will attack our country soon.” Customer: “What are your reasons and evidence for that finding?”
Analyst: “Despite the terrorist group never stating it would attack our country, it does have the capabilities and logistic support to do so. Although it has never attacked us before, they have attacked our allies’ interests. We therefore estimate they will attack us soon, even though we are 4,000 miles away from their normal operating area. They must be planning attacks on our country.” (More evidence and reasoning are needed to declare a threat than the mere facts that a group exists in a distant region and has attacked others in the past.)
Notes
1 Material for this appendix was synthesized primarily from Noel Hendrickson et al., The Rowman and Littlefield Handbook for Critical Thinking (Lanham, MD: Rowman and Littlefield Publishers, Inc., 2008), 111-126; and Nathaniel Bluedorn and Hans Bluedorn, The Fallacy Detective, Thirty-Six Lessons on How to Recognize Bad Reasoning, 2nd ed. (Muscatine, IA: Christian Logic, 2003), 205-208.
2 See Patrick J. Hurley, A Concise Introduction to Logic, 10th ed. (Belmont, CA: Thomson Wadsworth, 2008).
3 Informal logic fallacies include flaws in the evidence or reasoning of an argument and may be found in both inductive and deductive arguments. Formal logic fallacies are normally found in deductive arguments and refer to errors in the scientific method or actual structure of the argument, which make the argument invalid.
5 All examples prepared by the author, often modifying examples in Hendrickson et al. and Bluedorn and Bluedorn.