Critical Thinking for a Better Civic Life


Chapter 3

Information and Misinformation

Information is the currency of Democracy.

- Thomas Jefferson

In the quote above, the third U.S. President, Thomas Jefferson, highlights how democracy requires information. At the heart of democracy and our civic life is the requirement that citizens become “critical consumers of information.” This chapter discusses issues regarding the search for and assessment of information. Chapter 4 will then address combining good information with good logic and reasoning to construct arguments based in knowledge (information plus logic).


The ability to assess truthful information is a major factor in critical thinking. Unfortunately, the last few decades have seen an avalanche of misinformation at work in U.S. politics. Misinformation is defined as “incorrect or misleading information.”[1] There has always been some misinformation in U.S. politics—usually due to an occasional inadvertent misstating of facts, statements made when the official lacks good information, or attempts to exaggerate or bolster the facts with untruthful or out-of-context information to serve an agenda. However, the last few years have seen an explosion of political misinformation as some politicians have made lying to U.S. citizens a regular part of the discourse. Political candidates, political parties, and elected officials increasingly present a different reality than the one actually occurring. This chapter will assist citizens in locating and assessing information and in determining whether it is trustworthy or just a string of political lies.


Where Our Information Comes From


How people obtain and process information varies by the individual. Thinking starts with influences that provide information and often tell people how to process the information. Figure 3.1 identifies several of the main sources and influences of how people gain information.



Figure 3.1: Sources and Influences of Information (a man sitting and thinking, with numerous boxes and arrows pointing at him).



Education is usually the primary influence on a person’s information base. The amount of education people receive, combined with its quality, leads them to develop different perspectives on the world. Family and friends are a second major influence on a person’s information base. People may attend education programs for 6-8 hours a day, 5 days a week, for 9-10 months a year, over 12 or more years, but for the remainder of their waking time they are mainly influenced by family and friends. Personal observations, including a person’s own experiences and reading engagement, are also influences on a person’s information base, as are religion and political ideology (liberal, conservative, etc.). What a person learns in their employment (workplace) and from the media and entertainment industries additionally influences a person’s information base and how they process this information (think). Every person experiences a different combination of the Figure 3.1 influences and thus develops varying perspectives on the world—both valid and faulty.


Since the advent of the Internet, how people nationwide receive and process information has changed greatly. Over the past few decades the Internet, in particular the social media sites accessible on it, has become a major source of information for U.S. citizens. The Internet, and especially social media, have been described as “boiling cauldrons of hate and lies.” Thus, those who rely primarily on the Internet for their information see as much or more misinformation than truthful information. See more on Internet information below.


People gather and use information in different ways. The type of people this creates can be categorized along an information-seeking spectrum as follows:


Low Information (Politically Disengaged) People: Those who minimize their engagement with political information: they typically do not read; pay only limited attention to the news media; often wait until just before an election to become “informed;” and neither care much about key issues nor strive to be critical thinkers. Low information people are those most afflicted by confirmation bias—the refusal to consider perspectives other than their own. (Mentally Lazy U.S. Citizens—estimated ± 30% of U.S. voters.)[2]


Moderate Information People: Those who engage with some political information; read and listen to some professional and popular literature and news media (though often from one ideological view); think they are fully informed about key issues (but usually are not); and think they are critical thinkers (but likely are not). Confirmation bias is very prevalent in these people. (Many U.S. Citizens—estimated ± 50%.)


High Information People: Those who engage with diverse political information; read widely in professional and scholarly literature and popular news media; strive to understand multiple perspectives on issues; constantly seek to learn more about key issues; and are the most likely to develop the tools to be critical thinkers. (Not Enough U.S. Citizens—estimated ± 20%.)


High information people often become “information addicts,” not only in politics but in all spheres of human knowledge. These people frantically seek a continuing vast volume of written and verbal information. Information addicts include, among others, knowledge professionals such as scholars, scientists, journalists, lawyers, intelligence analysts, technocrats, some government and military officials, some politicians, and some corporate leaders. Information addicts are intellectually curious and always learning: seeking additional education and training opportunities, spending hours watching network and cable television news channels, and voraciously reading and researching a myriad of information sources. One major problem with information addicts is that some limit their information sources to those of a particular ideological view, generating only a very narrow perspective of the world they seek to understand. To be a balanced information addict, and eventually a good critical thinker, a person must consume information from a wide array of sources, including differing ideologies and different perspectives on the world. Information addicts are usually the most open to critical thinking.


Most people fall along the information-seeking spectrum below high information people. Most of these people can be considered “information moderates,” who spend some time seeking additional information, both written and verbal, mostly from a variety of information sources. Information moderates usually seek to become critical consumers of information (even if they do not know it) and are normally motivated to learn more about the world—hopefully from a number of perspectives. Like the information addicts, information moderates often limit their information sources to those of a particular ideological view, getting a very narrow perspective of the world. Information moderates usually seek to become better informed citizens and may at least partly be open to learning critical thinking.


Other people are intellectually lazy and can be classified as “information cynics,” who ignore, or outright avoid, learning from written and verbal information sources. Information cynics figure they already know what they need to know and if not will get the needed information from a single news source, family and friends, or co-workers. They fail to read or understand historical and current texts. The information cynics are usually anchored on one perspective of the world and resist information from other perspectives (confirmation bias). Information cynics are the least likely to become good critical thinkers. This book’s audience is high and moderate information people. It is unlikely those wishing to remain low information people would have an interest in learning critical thinking or increasing their civic education information base—the main topics of this book.


All people require the skills known as information literacy, meaning proficiency in locating, assessing, using, and documenting information. A recent movement teaches students “media literacy,” a small slice of the larger information literacy—but a good place to start. Some information literacy skills are taught in secondary and post-secondary education programs, but except for trained academic and professional researchers, people are often ill-prepared for the level of information literacy needed to be informed citizens. This chapter addresses information literacy skills related to locating and assessing information from a variety of sources. It is for those desiring to be critical consumers of information in their civic, personal, and professional lives. For example, buying a new automobile requires only a moderate information search combined with a simple critical-thinking effort (see Chapters 2 and 8), which is likely all that is needed to buy an acceptable automobile. However, larger thinking projects—such as those generated in the workplace or in a person’s civic and personal life, like deciding which political candidates would best serve their interests (see Chapter 8)—require a much deeper information-gathering effort and a more complex critical-thinking process. Being information literate is a key to being both a good critical thinker and an informed citizen.


Epistemology: The Study of Knowledge


People use information, combined with logic and reasoning, to create knowledge and decide what to think or how to act (behave). The academic field of epistemology, the study of knowledge, helps us understand basic information gathering and later decision-making. Epistemology investigates where knowledge (information combined with logic and reasoning) originates.[3] Academics normally offer the seven categories of knowledge depicted in Figure 3.2. If a person could go back and classify every piece of knowledge they possess, each piece could likely be placed in one or more of these seven categories. Any individual’s knowledge base is usually a combination of material from all of them.[4] An initial step in analyzing how a person’s information base influences their behaviors is to uncover the most significant sources of their knowledge on the problem or decision under study.


Figure 3.2: Categories of Knowledge

Authority: Knowledge provided by an expert or someone in authority (e.g., president, dictator, minister, teacher, journalist, parent, cult leader) that a person tends to accept without challenge.

Faith: Knowledge accepted with little or no supporting evidence and rarely challenged. Religious and political ideologies are major sources of faith-based knowledge.

Common Sense: Knowledge presented as something “everybody just knows,” often accepted without question.

Intuition: Knowledge that feels instinctive, with no conscious reasoning process behind it, and usually accepted without challenge.

Empiricism: Knowledge gained through observation, experience, and data collection using the five senses or instruments (e.g., thermometers, gauges). Such knowledge can be tested or challenged with further observation.

Rationalism: Knowledge developed through logic and reasoning, which can be evaluated or challenged through further reasoning and debate.

Science: Knowledge derived from combining empiricism and rationalism, subject to verification through the scientific method and critical thinking frameworks.

The Figure 3.2 knowledge categories are not presented in hierarchical order; however, the science category is generally considered the highest form of knowledge. When a person’s decision-making is driven mainly by the first four knowledge categories (authority, faith, common sense, and intuition), their conclusions are likely no more than opinions driven by emotions and feelings, which, if used in decision-making, may result in unsuccessful outcomes. This is not always the case, but these four categories generally offer no way to challenge the knowledge presented. The last three knowledge categories (empiricism, rationalism, and science) can confirm or challenge existing knowledge and are more likely to lead to rational thinking and successful decision-making outcomes.


The Figure 3.2 knowledge categories can be full of misinformation. This chapter and the following Chapter 4 discuss techniques for identifying misinformation. Additional discussion of each category of knowledge is presented below:


Authority knowledge occurs when an expert or someone in authority provides the knowledge that a person tends to accept without challenge. This category of knowledge may not be capable of being replicated, which means it often cannot be studied. Much of what is read and heard in the open public media falls into this category. For example, high-ranking people or other pundits often give their opinions on an issue in a government report, speech, newspaper editorial, television interview, non-academic journal, or other communication; however, there may exist little empirical data, reasoning, or systematic analysis to support the opinions. In some cases, the authority figure will distort the data to support their intended message or agenda, or to deceive the intended audience. Even though some of these opinion pieces may be from highly informed sources, it does not ensure that the message will pass the scrutiny of the scientific method or the critical-thinking framework. Therefore, a person must be extremely careful about using opinion-based knowledge gained from so-called authorities.

Faith knowledge is at play when a person accepts the knowledge presented with little or no supporting evidence. Most people think of faith knowledge as mainly religious teaching; however, this category also encompasses a wider range of knowledge sources driven by ideologies in politics, economics, and social matters. Faith knowledge is often based on “sacred texts” that lack empirical (factual) support. Faith knowledge sources often include stories, myths, folktales, rumors, conspiracy theories, political ideologies, cult messages, and other sources of knowledge that lack a solid empirical basis. Faith knowledge is often a means groups use to pass on the stories, myths, and folktales that define the values, ethics, and expected behaviors of their community. For example, the folktale of lumberjack Paul Bunyan and Babe the Blue Ox in the woodlands of the northern United States and southern Canada advances community values of personal strength and superhuman labor. Rumors and conspiracy theories usually lack supporting evidence and pander to the confirmation bias tendencies in those who believe them (see Chapter 4).


Political ideologies try to establish community values and define government policy and human behavior. Those following a specific political ideology are highly subject to confirmation bias. Cults usually offer a more or less consistent and tempting belief system not based in evidence and encompass social groups holding a specific ideology (religious, political, etc.) or having common interests in a particular personality, object, or goal. The word “cult” is considered pejorative, and cult membership often leads to negative consequences. For example, in 1978, Jim Jones’s Peoples’ Temple religious cult ended in the mass suicide of over 900 cult members in Guyana—men, women, and children. In 1993, David Koresh’s Branch Davidians religious cult was involved in a violent clash with U.S. federal law enforcement in Waco, Texas, resulting in the deaths of 4 federal agents and 85 cult members, including several women and children. Faith knowledge differs from authority knowledge in that it usually offers a consistent, enticing belief system for its members. Because there is often little empirical basis for faith knowledge, it cannot be replicated and tested for validity, and thus cannot be challenged. Religious and political ideologies’ roles in thinking about democracy and civic life are addressed in more detail in Chapter 6.


Common-sense knowledge usually comes from seemingly plausible sources and is presented as if “everybody just knows it.” Common-sense knowledge is usually tied to a person’s overall beliefs, which differ for people with competing belief systems (see Chapters 6 and 7). Common-sense knowledge will likely differ based on beliefs emerging from a person’s culture and the context of a situation. For example, in the Sikh religion it is common sense that men do not cut their hair or beard, something that is not common sense in Christian religions. Common-sense knowledge is often contradictory and usually too general to be challenged or studied. As one saying holds, “Common sense is not all that common!” Chapter 7 provides an expanded discussion of how beliefs influence thinking.


Intuition knowledge comes with no conscious reasoning behind it. It usually originates in internal belief systems, sometimes where implicit connections are made among material gained from other knowledge categories—primarily the authority, faith, and common-sense categories. Intuition often employs biased facts and poor logic and reasoning. Moreover, intuition is often without a strong empirical basis; that is, people just seem to know (think) it is true. For example, Soviet Premier Joseph Stalin used his intuition to conclude there was no chance the Germans would attack the Soviet Union in June 1941—he turned out to be very wrong, a failure that contributed to an eventual 25,000,000 Soviet casualties in World War II. Intuition knowledge usually cannot be replicated and thus cannot be challenged or assessed for validity.


Empiricism knowledge is gained from a person’s observations, experiences, and gathering of data, facts, evidence, etc., with the five senses or with technical measurement instruments (thermometers, gauges, etc.). It can be challenged, as additional observations are usually possible. To brute empiricists, only the empirical data matters. As Sgt. Joe Friday said, “Just the facts, ma'am, just the facts.” (Sgt. Joe Friday was the lead character in the 1950s and 1960s TV detective program Dragnet.)


Rationalism knowledge is generated through humans’ innate ability to use logic and reasoning, which allows a person to reason apart from their actual experiences with the real world. This is the realm of theory-building.


Science (includes social science) knowledge is gained through combining empiricism and rationalism. This category argues that empiricism and rationalism alone may not always be correct; however, if the two are used in tandem, it is more likely to generate knowledge with fewer biases. The scientific method allows the assessment of research validity. Social science positivists assume the scientific method may be used to study human behavior, human decisions, and human conditions.[5] In Chapter 2, this book takes a positivist approach as it combines the scientific method with a robust critical-thinking framework.


Locating Information and Knowledge


The above discussion reveals that a person’s knowledge base is generated by a number of influences (Figure 3.1) and can be placed in a number of epistemological categories (Figure 3.2). A major contributor to our information and knowledge bases is the myriad of open-source (publicly available) information channels. Open source means any material normally available to a public audience, including books, magazines, movies, television, videos, newscasts, websites, podcasts, and more. A critical consumer of information must know how to access and search the vast open-source material available. Open-source material is organized and found in hundreds of private, commercial, government, and academic databases, both online and on paper. Everything from small local libraries to large university research libraries offers access to a significant amount of open-source material. A significant amount of material is also available through the Internet. People should keep in mind that not all world knowledge is online. On many occasions a person must search for paper sources in physical libraries, archives, public records, church records, corporate records, etc., to locate relevant open-source material. Not all open-source materials are of the same quality. Some open-source material is valid (factual, meets scientific standards) and other material is faulty (misinformation that is not factual, primarily supported by opinions). Most libraries, archives, and record centers have reference specialists to assist in searching databases and locating specific information for the thinker (citizen, researcher) to assess.


Searching Open Sources


When searching open-source databases and assessing items found, three general categories of information and knowledge should be sought:


Facts (data, evidence, information) that directly concern the topic. A person should try to confirm and corroborate the facts uncovered through separate, independent sources. Facts may be found in raw reporting by those personally observing the event, such as media stories where the journalist personally observed the events being reported. When the facts are reported by an actual observer, or are readily accessible from the source, the material is usually classified as a primary source. Articles published by the observer, plus official documents or reports, are usually considered primary sources. Secondary sources are generated by others using primary sources (or other secondary sources), and include most statistical studies. (Empiricism)


Logic and reasoning lacking facts, such as statements or propositions that cannot be factually verified but employ theory (logic and reasoning) such that they may be considered knowledge. Logic and reasoning lacking facts are often classified as assumptions, which are included in critical thinking. See Chapter 7 for more detail on assumptions. (Rationalism)


Facts, combined with logic and reasoning, which normally are found in statements of causality, arguments, and contentions (theses, judgements, findings, conclusions, recommendations), or theoretical propositions (assumptions, axioms, theorems, postulates, laws). Before the material may be classified as scientific knowledge, the statements, arguments, contentions, and propositions must be checked to ensure they do not include cognitive biases and informal logic fallacies (see Chapter 4). (Science)


Prioritizing Open-Source Searches


Open sources vary in the validity (truthfulness, accuracy) of the information they provide. It is therefore recommended that multiple pertinent sources be searched in order of their reputation for validity. Figure 3.3 provides a recommended sequence for open-source searching based on the sources’ reputations for validity.


Figure 3.3: Prioritizing Open-Source Searches

1. Government Reporting and Official Records: U.S. government agencies and international governmental organizations (IGOs) produce recurring and special reports on a wide range of subjects. Governments also maintain volumes of official public records containing data on individuals and programs.

2. Scholarly and Professional Articles: Academic research and professional articles published in peer-reviewed or professional journals.

3. Scholarly and Professional Books: Books written by academic or professional experts and published through academic or professional presses.

4. Legal Databases: Databases that allow researchers to search legal materials—such as case transcripts, court documents, and law journals—which may also include public records and major news coverage.

5. Think Tank and Non-Governmental Organization (NGO) Reporting: Research reports produced by think tank scholars and NGO researchers, often published in journals, books, or on organizational websites.

6. Popular Media (Newspapers, Magazines, Television, Radio, and More): Open-source materials created for general audiences across multiple formats and platforms.

7. Internet: The last place to search for open-source material; includes websites, blogs, and social media, where information quality and accuracy vary widely and may include misinformation or disinformation.

Additional discussion of each source of open material is presented below.


Government reporting and official records. U.S. government agencies and international governmental organizations (IGOs) produce a number of recurring and special reports on a variety of subjects. IGOs are organizations or agencies whose members are countries (United Nations, North Atlantic Treaty Organization, etc.), and they issue generally valid reports. Most of these agencies and organizations employ professional researchers who can normally be trusted to provide objective and valid reporting. For example, U.S. Congressional Research Service (CRS) and U.S. Government Accountability Office (GAO) reports can normally be considered bipartisan and objective. People should still be wary of governmental reporting, as at times it may slant toward supporting a particular government’s or IGO’s programs or ideology. Government reporting is often a good source of statistical data to support a thinking project.


Governments also maintain a plethora of official (public) records. These include data both on individuals and on government programs. These records can usually be considered valid primary sources. But beware of any reporting using government crime data. There is no single U.S. database on crime and no mandatory national crime reporting requirement, so any reports based on crime data usually lack complete information. Additionally, government crime report analysis is often a partisan attempt to make the crime statistics fit a political view. There is no “honest broker” on crime reporting, and thus any studies using crime data are likely rife with bias.


Scholarly and professional articles. After governmental reporting, scholarly and professional articles are usually excellent sources of information. Scholarly articles are produced by academic researchers and published in academic journals. Most of these articles are refereed by panels of other academics (known as “peer review”). If not refereed, then the editors of the particular journal provide quality control on articles. Scholarly articles should be searched before scholarly books, as the scholarly publishing process usually publishes the latest research results as journal articles before later expanding and/or including the material in books. Professional articles normally are published in journals or magazines. These articles usually are written by professional researchers or leaders in their professional field. For example, Foreign Affairs, Foreign Policy, National Geographic, and Popular Mechanics, among others, are good sources of professional articles. Thinkers must still assess the validity of scholarly articles, even those that pass a refereed review, and of professional articles, as both may lack validity if they support particular programs or ideologies. It is sometimes difficult to distinguish scholarly or professional literature from popular media reporting. Figure 3.4 provides a template for assessing these differences.


Figure 3.4: Classifying Scholarly/Professional and Media Reporting

Audience
Scholarly/Professional Literature: Scholars, researchers, and professional practitioners.
Popular Media: General public.

Authors
Scholarly/Professional Literature: Experts in the field (scholars, researchers, professionals). Articles are signed and often include author credentials and affiliations.
Popular Media: Journalists or freelance writers. Articles may or may not be signed.

References
Scholarly/Professional Literature: Includes a bibliography, references, footnotes, endnotes, and/or a works cited section.
Popular Media: Rarely includes references or sources.

Editors
Scholarly/Professional Literature: Reviewed by an editorial board of outside scholars (“peer review”) or by a professional editorial staff with subject expertise.
Popular Media: Editors and staff may or may not possess subject expertise.

Publishers
Scholarly/Professional Literature: Often a scholarly or professional organization or academic press.
Popular Media: Commercial, for-profit publisher.

Writing Style
Scholarly/Professional Literature: Assumes prior knowledge in the field; contains specialized language (jargon). Articles are often lengthy.
Popular Media: Easy to read—aimed at the general reader (approximately 7th-grade level). Articles are shorter and often entertain as they inform.

General Characteristics
Scholarly/Professional Literature: Primarily text-based with few pictures. Includes tables, graphs, and diagrams. Minimal advertising. Often titled “journal,” “review,” or “quarterly.” Continuous pagination and narrow subject focus.
Popular Media: Contains advertising and photographs. Often printed on glossy paper. Sold at newsstands or bookstores. Restarts pagination with each issue. Broad subject focus.

Source: Created from numerous materials.


Scholarly and professional books. There is a deluge of scholarly and professional books. Scholarly books may be historical or scientific (including social science). Historical books provide past details on a topic and are good sources for background and contextual information. Scientific books focus on presenting one or more theories and the facts, logic, and reasoning supporting the theories and their findings or conclusions. Some books are combinations of historical and scientific studies. Some scholarly books provide edited compilations of scholarly journal articles or chapters in the same subject area. Others expand on ideas in a previously published scholarly journal article. For example, Harvard University political scientist Samuel Huntington took his 1993 Foreign Affairs article, “The Clash of Civilizations,” and later expanded it into an entire book of the same title—one of the most discussed articles/books on world conflict of the last few decades.[6] Professional books usually provide detailed descriptive material, lessons learned, and/or “how to” or other recommendations on a topic. Just as with scholarly and professional articles, books may contain biases in support of specific programs or ideologies.


Legal databases. Most research topics are embedded in a legal structure. Even if not lawyers, thinkers should understand and consider the legal structure applying to their thinking project. This is especially important in projects where revised or new policies are recommended, which must comply with existing laws and regulations. Legal databases have been developed to allow people to search legal material, public records, and news reporting. Legal material also is available for legislation, statutes, treaties, regulations, court case transcripts, and associated case documents (briefs, pleadings, motions, settlements, and verdicts). Lastly, the databases include law reviews and law journal articles. Westlaw and Lexis/Nexis are two competitors that organize and manage legal databases and allow access for a fee. Most law firms and large libraries, plus some large businesses and agencies, have subscriptions to either or both.

Think tank and non-governmental organization (NGO) reporting. Dozens of academic and professional think tanks and thousands of NGOs produce research reports useful for open-source searching. Academic think tank reports normally do not undergo a refereed or peer review and, at most, are reviewed by local editors. Professional think tank and NGO reports likewise normally receive no review beyond local editors. All think tank and NGO sources require extra scrutiny to ensure the validity of their reporting, as they are often rife with ideological slants that degrade validity. Think tanks often focus on a single issue or narrow set of issues. Some are supported by universities, others by contracts, grants, and donations from those interested in the think tank’s issue areas and supportive of its political orientation. Figure 3.5 summarizes the political and ideological orientations of selected professional think tanks.


Figure 3.5 Selected Think Tank Political Orientations[7]

Left / Liberal: Brookings Institution, Center for American Progress, Inter-American Dialogue, Human Rights Watch, American Civil Liberties Union, Migration Policy Institute, Brennan Center for Justice

Centrist: Atlantic Council, Aspen Institute, Carnegie Endowment for International Peace, Cato Institute (libertarian), Center for Strategic and International Studies, Council on Foreign Relations, Freedom House, RAND Corporation, Woodrow Wilson International Center for Scholars, Congressional Research Service

Right / Conservative: American Enterprise Institute, Claremont Institute, Heritage Foundation, Hoover Institution, Center for Immigration Studies

NGOs include individuals or groups that organize around an issue area but do not represent federal, state, or local governments. There are hundreds of thousands of NGOs. A local book club, scout troop, or homeowner’s association is considered an NGO. Many NGOs provide a combination of advocacy, research, and service in a related issue area. An NGO’s political orientation is often highlighted by its title or by a quick review of its reporting or activities. Some organizations, such as the American Civil Liberties Union, may be classified as both a think tank and an NGO because they provide in-depth research, advocacy, and service (e.g., filing lawsuits) in the area of civil liberties. Those searching open sources should assess the political orientation and validity of information accessed, whether from a think tank or NGO.


Popular media: newspapers, magazines, television, radio, and more. Popular media has a major advantage in collecting some information because reporters for larger media companies may be on location and observe national and world events firsthand. Its major weaknesses are the sheer volume of information and its frequent, often severe, misinformation and validity problems. Figure 3.4 provides the general characteristics of popular media. Validity problems appear in material found in newspapers (including online news), magazines, television, radio, and other sources. They begin with a popular media outlet’s political orientation and extend to its approach to using factual data and the accuracy and depth of its analyses.


Figure 3.6 provides a summary of the quality and political orientation of selected popular media. In general, centrist media sources of high- and medium-quality are the most useful to people seeking truthful information. The low-quality media sources listed should be avoided or accessed with extreme caution due to their proliferation of misinformation. If accessed, high- and medium-quality sources with left/liberal or right/conservative political orientations should undergo strict validity checks for bias. Highly biased information should not be considered as valid open-source material.


Figure 3.6 Selected Popular Media Bias Analysis[8]

High Quality: Good Use of Facts, Advanced/Complex Analyses

Left / Liberal: The Atlantic, Slate, Vox, The Guardian, Axios
Centrist: The Wall Street Journal (news), AP, NPR (news), PBS, BBC, Politico, Bloomberg, Reuters, USA Today, Time, Foreign Policy
Right / Conservative: The Wall Street Journal (opinion), The Economist, The Hill

Medium Quality: Good Use of Facts, Generally Fair Analyses

Left / Liberal: Local Newspapers (liberal cities/states), The New Yorker, The New York Times (opinion), MSNBC (opinion), CNN (opinion), Mother Jones, Huffington Post, Vanity Fair, BuzzFeed (news), Slate
Centrist: Local Television News, The New York Times (news), MSNBC (news), CNN (news), The Washington Post, NPR (opinion), News Nation, Network News: ABC, CBS, NBC
Right / Conservative: Local Newspapers (conservative cities/states), Fox News/Business (non-political news), The Washington Times, National Review, The Federalist

Low Quality: Poor or Made-Up Facts, Inaccurate/Unfair Analyses

Left / Liberal: Occupy Democrats, U.S. Uncut, Forward Progressives, David Wolfe, Palmer Report, Splinter
Centrist: National Enquirer
Right / Conservative: Fox News/Business (political news and opinion), Breitbart, Newsmax, OANN, Gateway Pundit, The Blaze, The Daily Wire

Internet sites. Because of the often-severe misinformation and validity problems with material posted on Internet sites, especially blogs and social media, such sites are usually the last place a thinker should search for information. Government reporting, scholarly and professional articles/books, legal databases, think tank and NGO reporting, and popular media sources described above may often be accessed on the Internet and may carry the biases previously discussed. This warning applies to material beyond those sources, because anyone can set up an Internet site and post any written, voice, or video material they choose. A particular problem with Internet-based blogs and social media, of which there are thousands, is the presentation of misinformation (lies, fake news) and disinformation (propaganda, deception).


In the United States, civil liberties conventions allow virtually unregulated publishing on the Internet. Some Internet sites are considered a “gray area,” such as Wikipedia, because anyone can access the site and change or update its articles to support a personal perspective or ideological slant. Thus, there is no real method of validity control on Wikipedia. The best uses of the site are to determine whether a Wikipedia article triangulates (see discussion below) with other information found on a topic and to mine the reference list included with most Wikipedia articles for sources to expand the information search. The bottom line when dealing with Internet-based information is to “question everything,” in other words, be suspicious of all open-source Internet material until its validity is determined.

Assessing Information


Assessing information in search of the truth is often difficult and frustrating. Social science suggests we may never actually find the truth, but it is essential to strive for it and get as close as possible. Policies and actions based on the truth (or as close as we can get) are likely to be the most successful, while those based on untruths (misinformation) are more likely to end in failure. There are a multitude of techniques for generating misinformation to affect political policies and actions. Below are some general techniques thinkers can use to assess the validity of information and, hopefully, uncover misinformation.

Corroborate, Corroborate, Corroborate


The best strategy for identifying and avoiding misinformation is to corroborate all information with multiple sources. In academic research this is called triangulation, a term taken from maritime navigation, whereby bearings and ranges are taken on different geographic points to fix the mariner’s exact position. Triangulation of facts, data, and evidence across multiple independent sources is the best way to ensure the truth (validity) of the information collected.


It is not always possible to corroborate or triangulate information, as at times there may be only one legitimate source. In that case the reputation of the source must be evaluated. The logic is that if the source has been truthful in the past, it will likely be truthful in the present and future. For example, during the 4-year (2017-2020) U.S. presidential administration, the president was found to have made false or untruthful public statements over 30,000 times—a poor reputation for truthfulness. In addition to source reputation, the thinker needs to consider the context of the information (its supporting structure of facts and logic—see Chapter 4) and evaluate whether the information makes sense given the issue at hand. A thinker who is a specialist in the issue at hand will be more likely to identify misinformation. In any case, if the thinker’s “gut” says the information is too good to be true or otherwise suspect, it should be investigated as possible misinformation.


For example, it is common for some politicians and U.S. citizens to claim that “undocumented immigrants are substantially raising the crime levels in the United States.” In fact, numerous academic and professional studies have confirmed that undocumented immigrants commit crimes at far lower rates than native-born U.S. citizens (see Chapter 7).[9] This is mainly because undocumented immigrants do not want to be arrested and deported. This is just one piece of misinformation used to create fear (fearmongering) of undocumented immigrants in the eyes of U.S. citizens. More on this topic below and in Chapter 7.

Gaslighting


When a flow of misinformation is used purposely to convince an audience of a reality that does not exist, it is called gaslighting. This is also considered disinformation and/or propaganda. The term is taken from the 1944 U.S. film Gaslight, in which a man tries to convince his wealthy wife that she is losing her mind and needs to be institutionalized. With his wife out of the way, the man could access her fortune. He created confusion in his wife’s thinking in several ways, including random sudden adjustments (up or down) of their home’s interior gas lights.[10] With the cascade of U.S. political misinformation in the last decade, gaslighting has become a widely used term in U.S. public discourse for the spread of false realities.


For example, the above political statements about undocumented immigrants and crime rates are only part of a political gaslighting campaign to convince U.S. citizens to fear undocumented immigrants. Other misinformation in this campaign paints undocumented immigrants as presenting a major national security (criminal) threat, taking U.S. citizens’ jobs, not paying taxes, abusing public assistance programs, and bringing diseases into the United States. Multiple academic and think tank studies have shown none of these claims are true relative to similar figures for native-born U.S. citizens (see details in Chapter 7). However, through the repetition informal logic fallacy (see Chapter 4), these falsehoods have been repeated so many times that partisan anti-immigration politicians, news media, and citizens are convinced they are true and will continually repeat this false reality—in other words, they have been gaslit. The gaslighting campaign painting undocumented immigrants in a bad light and creating fear of immigrants among U.S. citizens was used to justify building a “wall” along the Mexico-U.S. border, deploying the U.S. military to secure the border, and increasing, with military support, detention and deportation of undocumented immigrants. Policies and actions based on gaslit misinformation are often costly to U.S. taxpayers and likely to fail in the end.


Below are some questions to ask if the thinker believes they are being gaslit:

1. Does the source try to convince others that their points of view are wrong or insignificant and the source’s point of view is the reality?

2. Does the source try to downplay or sweep under the rug harmful behavior resulting from their point of view?

3. Does the source never assume blame or take responsibility?

4. Does the source make others feel out of touch with reality?

5. Does the source make others feel anxious?[11]

Circular Reporting


While reviewing information from open sources, a thinker must be aware of potential circular reporting. This is a special type of misinformation that can occur when there is only one unverified or unreliable source for a piece of information, which then gets repeated through several open-source or official reporting channels without other confirming evidence. Circular reporting is most noticeable in popular media and Internet reporting but, at times, has been known to creep into government and other normally reputable reporting. “Echo chamber” is a term often used when misinformation is amplified by circular reporting among popular media and official sources with partisan ideological orientations.


For example, in 2025, the U.S. president claimed Harvard University was offering remedial mathematics courses, a claim later exaggerated into Harvard teaching middle-school mathematics to its incoming students. Through circular reporting, the U.S. Department of Education highlighted the same claim, as did other U.S. departments and partisan public media. The president’s apparent intent was to question the quality of Harvard’s curriculum and to disparage a university that was resisting his administration’s ongoing efforts to place restrictions on academic freedom and cancel Harvard’s government grant funding. In fact, the president’s original statement was based on false premises and became a case of circular reporting as it was repeated by other government departments and the media.


In reality, Harvard did advertise mathematics support courses it had offered for years, but the U.S. president took this advertisement out of context to serve his own political purposes. Many incoming Harvard students, as is also the case at other well-respected universities,[12] come from high schools that do not teach advanced mathematics, making it difficult for these students to transition into a rigorous first-year calculus course. Even though incoming Harvard students average a score of 750 (out of 800) on the Scholastic Aptitude Test (SAT) math section, some still need assistance to successfully complete a tough university calculus course. Harvard provides this assistance in two main ways. First, it offers a summer pre-calculus course to incoming freshmen to help prepare them for first-year calculus. Second, it offers a first-semester calculus course that meets five days a week, instead of the normal three, to give students additional instruction; the five-day and three-day courses have the exact same homework and exams.[13] Thus, thinkers should assess whether their sources are taking information out of context and whether the multiple sources found are a result of circular reporting.

Omission of Information


At times the omission of information can itself be misinformation. When a source provides a broad general statement without additional supporting information, it should be suspected as misinformation. When asked for specific supporting information, the source will often resort to a red herring informal logic fallacy (see Chapter 4), changing the subject, going off on a tangent, or even attacking those asking for specific details. In effect, the source refuses to provide specific support for the original statement. This type of misinformation is found frequently in political discourse. It is common to find a politician stating they plan to cancel an existing policy or program but unable to outline the details of the replacement, and subsequently refusing to provide specifics on the new policy or program. This is a case of misinformation. For example, a source may tout the benefits of a Mexico-U.S. border wall to prevent illegal migration. However, they will not mention, and will refuse to address, how the wall can be breached with tall ladders or portable electric saws, or avoided by going around it by boat, over it by airplane, or under it by tunneling. Thus, the potential ineffectiveness of a border wall is never mentioned by those supporting a policy that would cost U.S. taxpayers millions of dollars.


It is also common for partisan (left/right) media and Internet sites rated as low quality (and sometimes even medium quality)—see Figure 3.6—to publish stories that support only their partisan political views. This is a form of misinformation when the consumer (thinker) is not provided with complete arguments on important current events, particularly arguments not supportive of the source’s partisan perspectives. Chapter 4 discusses making good arguments, including how to recognize when arguments are structurally flawed, especially by partisan politicians and media reporting that omits information.

Lying with Numbers


U.S. novelist Mark Twain is credited with popularizing the saying that there are “Lies, Damned Lies, and Statistics.”[14] The statement implies the pervasive power of numbers (statistics) to support or bolster weak arguments. Not all statistics are lies, but if statistics are made up, taken out of context, or misinterpreted, then misinformation is spread. To determine whether numbers are valid, the thinker must know how they were collected and analyzed. Three types of lying with numbers are most common in U.S. political discourse. First are broad statements by politicians similar to “Many people have told me….” The thinker should immediately ask “how many people?” and “who were they (demographics, political partisanship, etc.)?” With such broad statements the politician implies that what the “many people” told him or her would hold as a trend for a larger audience, even up to and including the entire country. This is called the part-to-the-whole informal logic fallacy (see Chapter 4).


Second is when data from government or other sources are misreported. For example, in spring 2025, 238 immigrants, mainly Venezuelans, were deported from the United States to a maximum-security prison in El Salvador. The U.S. government reported these were all undocumented immigrants who were the “worst of the worst,” including criminals, terrorists, gang members, murderers, rapists, savages, and monsters. Even when journalists and Congress asked for more data on the deportees, the government never provided it. It turns out the original government statement was far from true. A study conducted by U.S. media outlets ProPublica and The Texas Tribune, plus several Venezuelan researchers,[15] investigated the background of each deportee and found that over half, 130 to be exact, of the 238 immigrants had never been convicted of a criminal act in the United States. These 130 were guilty only of being undocumented in the United States—an offense deemed a misdemeanor civil matter (not criminal) by the U.S. justice system. The study showed that U.S. government officials knew only 32 of the 238 deportees had been convicted of U.S. crimes and that most of these were nonviolent offenses, such as retail theft (shoplifting) or traffic violations. The study also found the government knew that only six of the immigrants had been convicted of violent crimes: four for assault, one for kidnapping, and one for a weapons offense—none for terrorism, murder, or rape. It turns out a few of those deported were in the United States legally, at least temporarily, as they were involved in immigration court proceedings. The lesson here is that even U.S. government statements can convey misinformation when it suits officials’ political purposes.


Third are statistical studies most frequently seen in political discourse as survey or polling results. The validity of statistical surveys or polls is governed by a combination of the theories of sampling and probability. The tenets of these theories are often violated. For those not familiar with these theories there are several basic questions to ask in determining the validity of surveys and polls:[16]

  1. What were the population and sample used? Was the survey (or poll) trying to generalize its findings to the entire country, a state, or another political division (city, county, etc.)? When a sample of a population is small, the survey usually cannot generalize its results (trends) to a larger population but only to those surveyed. Doing so is also a part-to-the-whole informal logic fallacy (see Chapter 4).
  2. Were the survey questions clear and without bias? Survey questions can often be evaluated as “leading questions,” which steer respondents toward the answer the survey-taker wants them to choose. Examples: “The majority of Americans support Candidate A. Do you?” This question leads the respondent to agree with the majority. Or “Do you think Candidate A's new policy on climate change is terrible?” This question assumes the policy is terrible and leads the respondent to agree. Or “Don’t you not like the government’s policy on gun control?” This is a double negative question, where two negatives in a sentence make it positive. The question both confuses and leads the respondent: answering “yes” effectively means agreeing with the government policy. Unfortunately, in most surveys the consumer (thinker) is not given access to the survey questions and so cannot assess their potential biases.
  3. In selecting the survey sample, was an Equal Probability of Selection Method (EPSM) used? The only true EPSM is to select a simple random respondent sample from a total known population. Stratified and systematic sampling techniques are sometimes considered to meet EPSM criteria, but these are not frequently used in political surveys. What is used most in political surveys is random telephone calling, which inserts bias into the results in two main ways. First, not everyone has a telephone, so some members of the population may be missed—this violates EPSM. Second, telephone surveys allow the person who answers the call to decide whether to participate. This results in a “self-selected” (non-random) sample, also violating EPSM and adding bias to the results. The theories of sampling and probability cannot determine how much the bias in a random telephone survey affects the findings, so it is usually ignored—but it is still there.
  4. What is the size of the sample? Assuming EPSM, there are specific respondent sample sizes needed to generate acceptable confidence levels and confidence intervals in survey results. In social science, which includes political surveys, a 95 percent confidence level and 5 percent confidence interval are usually the goal. A 95 percent confidence level means it is acceptable for the survey findings to be wrong 1 out of 20 times (i.e., the survey seeks to be correct 19 out of 20 times). A 5 percent confidence interval indicates that whatever results the survey generates, the actual truth is within 5 percent on either side of those results. For example, if a survey of voter preferences found 60 percent of voters preferred Candidate A with a confidence interval of 5 percent, then the actual voter preference is somewhere between 55 percent and 65 percent (60 percent ± 5 percent). Confidence intervals are often called the survey’s “margin of error” (MOE). Again, assuming EPSM, as a standard rule surveys of a population of 1 million or even the total U.S. population of 330 million should strive for a sample size of at least 400 respondents if a 95 percent confidence level and 5 percent confidence interval are the goals. To obtain a 95 percent confidence level with a 3 percent confidence interval, most studies take a sample of roughly 1,500 respondents.
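The sample sizes and confidence intervals above follow from the standard formula for a proportion under simple random sampling. As a hedged illustration (assuming the worst-case proportion p = 0.5 and a 95 percent confidence level, z ≈ 1.96; the function names are illustrative, not from the chapter), a short calculation shows why roughly 400 respondents yield about a 5 percent margin of error:

```python
import math

Z95 = 1.96  # z-score corresponding to a 95 percent confidence level


def margin_of_error(n, p=0.5, z=Z95):
    """Margin of error for a proportion under simple random sampling.

    Uses the worst-case proportion p = 0.5 unless otherwise specified.
    """
    return z * math.sqrt(p * (1 - p) / n)


def required_sample(moe, p=0.5, z=Z95):
    """Smallest sample size that achieves the given margin of error."""
    return math.ceil((z ** 2) * p * (1 - p) / moe ** 2)


print(round(margin_of_error(400) * 100, 1))  # 4.9 -> about the 5 percent in the text
print(required_sample(0.05))                 # 385
print(required_sample(0.03))                 # 1068
```

Note the formula itself gives about 385 respondents for a 5 percent margin and about 1,068 for 3 percent; the 400 and 1,500 figures cited in the text reflect the cushion practitioners commonly add for nonresponse and design imperfections.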

The problems discussed above should make thinkers suspicious of the results of all political surveys. Unless a survey meets the criteria above, it should be disregarded. Overall, this discussion should drive home that when assessing any numerical or statistical information, the thinker must consider where the information came from and how it was analyzed. The number of respondents and who they are, the context of the situation, and the statistical methods used are all important to assess. Repeating Mark Twain: there are “Lies, Damned Lies, and Statistics.”

False Causality


To bolster their arguments, sources may claim their information presents a case of causality (cause and effect): in other words, that conditions or changes in one or more factors (independent variables) cause conditions or changes in another factor (dependent variable). For example, Chapter 6 shows that a “strong democratic country will also normally have a free-market economy.” This is not an instance of strict causality but merely a correlation revealing that most strong democratic states have free-market economies; otherwise, the statement violates the rules of causality detailed below. Additional instances of correlations in political discourse appear in Chapter 6, but the adage “correlation does not mean causation” is usually true. It is critical to apply the rules of causation whenever direct or implied statements of causation seep into information and knowledge. These rules are often misused in political discourse, resulting in misinformation. The rules of causation include:

  1. There must be time-ordering. The independent variable(s) condition or change must occur before the condition or change in the dependent variable, even if only a nanosecond before.
  2. There must be co-variation. The independent and dependent variables must move together. When the independent variable(s) move up or down (increase, decrease, or change conditions), the dependent variable must follow by also moving or changing conditions. If the variables move in the same direction, it is a direct relationship; if one moves up while the other moves down, it is an indirect (inverse) relationship.
  3. There must not be a spurious relationship. There cannot be a third variable that causes both the independent and dependent variables to move or change conditions together. The classic (but unverifiable) example is a German story claiming that one year an increase in storks caused an increase in human births. On its face, this was not a good causal relationship because storks do not play a role in human births (except in fairy tales). In this case, an unusually cold winter in Germany presented good conditions for both stork and human breeding. Thus, the cold winter was a third variable that created a spurious relationship. Identifying spurious relationships can be challenging even for experienced thinkers.
  4. There must be a theory. This is a decidedly positivist view; but there needs to be a theory that explains the relationships among the independent and dependent variables before they may be included in any cause-effect statements.[17] For example, Democratic Peace Theory offers the proposition that two democracies have never gone to war with each other (an empirical fact). Underlying this proposition is a causal mechanism, whereby when two democracies have a conflict, their similar democratic values allow them to cooperate and compromise and avoid violent conflict. Thus, this Democratic Peace Theory proposition meets the rules of causality.
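Rule 3 (spurious relationships) can be illustrated with a small simulation. In this hedged sketch (the variable names and numbers are illustrative, not from the chapter), a third variable z stands in for the cold winter: it drives both x (“storks”) and y (“births”), so x and y end up strongly correlated even though neither causes the other:

```python
import random

random.seed(42)


def pearson(a, b):
    """Pearson correlation coefficient, computed directly from deviations."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5


# z is the confounder (e.g., the unusually cold winter).
# x and y each depend on z plus independent noise, but
# neither x nor y causes the other.
z = [random.uniform(0, 10) for _ in range(1000)]
x = [v + random.gauss(0, 1) for v in z]
y = [v + random.gauss(0, 1) for v in z]

print(round(pearson(x, y), 2))  # strongly positive despite no causal link
```

Comparing x and y only within narrow bands of z makes the apparent relationship largely disappear, which is the practical test for spuriousness: control for the suspected third variable and see whether the correlation survives.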

Assessing information for instances of false causality is important in any political analysis.

Counterfeit and Hidden Agendas


Misinformation is often spread through counterfeit and hidden agendas. In politics, a counterfeit agenda involves a source making statements about policies or actions they plan to implement but never intend to follow through on. This commonly occurs when political candidates or elected leaders promise policies and actions in speeches or writings they never intend to carry out. This is a case of misinformation. In these instances, the candidate or leader has hidden agendas, meaning other goals or motives, which they do intend to implement, are driving their speech, writing, and decision-making. The motivation for counterfeit agenda statements is usually to gain votes or to convince legislators and citizens of their plans. The employment of counterfeit and hidden agendas highlights the analytic rule of “watch what they do and not what they say.”


For example, in the 2024 U.S. presidential election, one candidate promised potential voters a reduction in inflation and lower prices to improve citizens’ cost of living. Once elected, the same candidate took initial actions resulting in increased inflation, higher prices for many products, and proposed cuts to social assistance programs affecting supplemental nutrition and health care, overall increasing the cost of living. This is an extreme example of the employment of counterfeit and hidden agendas, but one common in political discourse. Thinkers must be sensitive to this type of misinformation.

Deception Detection


Politics is often considered a “contact” sport. To gain an advantage by misleading the public or opponents, political candidates and elected leaders can resort to deception (disinformation). In assessing any political information, the thinker must always be aware of deception, which can be defined as “[i]nformation…intended to manipulate the behavior of others by inducing them to accept a false or distorted perception of reality….”[18] Lying, omission, and distortion of information are common in deception campaigns. In other words, deception can entail any of the techniques for spreading misinformation discussed above and is very similar to gaslighting.


Deception has been used across history. Chinese General Sun Tzu (544-496 BCE) highlighted deception operations throughout his treatise The Art of War.[19] Operation Fortitude was the World War II Allies’ highly successful plan to deceive the Germans on the exact location of their 1944 European D-Day invasion landing.[20] Political campaigns have been known to use deception to influence final vote totals by misinforming parts of the citizenry.


A more recent case of deception occurred in the months before the 2003 U.S. invasion of Iraq. In a June 2002 speech at the U.S. Military Academy (West Point) graduation ceremony, the U.S. president outlined his administration's "preemption doctrine," a commitment to "confront the worst threats to the United States before they emerge by taking the battle to the enemy." This was the start of a deception effort to generate U.S. domestic and international support for the later invasion of Iraq. Over the next several months, the president and his advisors hinted at the possibility of a preemptive strike against Iraq in speeches and interviews. The president’s September 2002 speech to the United Nations (U.N.) General Assembly can be interpreted as suggesting the U.S. would act to stop Iraq’s transgressions (supporting terrorism with weapons of mass destruction (WMD) and cruelly abusing its own people) if the U.N. failed to do so.


By making frequent references to preemptive intervention, the president and his advisors gaslit U.S. domestic and international audiences, creating the illusion that preemption was a widely accepted solution to the threat posed by Iraq. Eventually, both the U.S. Congress and the U.S. public supported military engagement with Iraq. In October 2002, a U.S. Congressional resolution authorized the use of U.S. military force in Iraq. The resolution passed the House (296-133) and Senate (77-23). By March 2003 (just before the war started), surveys revealed that 70 percent of the U.S. public supported U.S. military intervention in Iraq. The U.S. also convinced the United Kingdom (U.K.), Poland, Spain, and Australia to join the coalition to attack Iraq. In November 2002, the U.N. Security Council unanimously passed a resolution declaring Iraq would face “serious consequences” if it failed to comply with previous resolutions mandating WMD disarmament and U.N. weapons inspections. While this U.N. resolution did not condone the later war, the U.S. president twisted its meaning to imply U.N. support for an Iraq invasion.


The spread of disinformation about the threat from Iraq was a classic deception campaign. The U.S./Coalition forces did invade and defeat the Iraqi military, but no direct Iraqi links to terrorism were found, and no Iraqi WMD were found. The six-year war and occupation cost the U.S. $100 billion and resulted in U.S./Coalition losses of 9,000 dead and 32,000 wounded. Iraq lost 16,600 military dead (the number of wounded is unknown) and an estimated 250,000 civilians killed.[21]


Even when an opponent has a well-known history of deception, a thinker may still overlook it. When the stakes are high, such as winning a war or a presidential election, a thinker must consider the possibility of opponents using deception. The possibility should not be discounted even when there is no obvious evidence of deception; if deception is well done, the thinker should not expect to see any indicators of it. The timing of the information and the bona fides (reputation) of the sources may be the first indicators that deception is taking place. Political deception can destroy trust in the government, undermine democracy, and lead to poor decision-making.


People should routinely check their information sources for deceptive efforts by opponents. Items to consider include:

1. Does the opponent have the Motive, Opportunity and Means (MOM) to conduct deception efforts?

2. Would the potential deception be consistent with Past Opponent Practices (POP)?

3. Is there concern regarding the Manipulability of Sources (MOSES)?

4. What can be learned from the Evaluation of Evidence (EVE)? (see next section)[22]
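For readers who keep source notes electronically, the four deception-screening questions above can be captured in a small structured record. The sketch below is a hypothetical illustration only (the class name, fields, and example source are the author's of this sketch, not part of the checklist itself); it simply records a yes/no answer to each question and counts how many raise a concern:

```python
from dataclasses import dataclass

@dataclass
class DeceptionCheck:
    """Answers to the four deception-screening questions
    (MOM, POP, MOSES, EVE) for one information source."""
    source: str
    mom: bool    # Motive, Opportunity, and Means to deceive?
    pop: bool    # Consistent with Past Opponent Practices?
    moses: bool  # Concern about Manipulability of Sources?
    eve: bool    # Problems found in Evaluation of Evidence?
    notes: str = ""

    def flags(self) -> int:
        """Count how many of the four checks raise a concern."""
        return sum([self.mom, self.pop, self.moses, self.eve])

# Hypothetical example: a source from an opponent with a known
# history of strategic deception.
check = DeceptionCheck(
    source="Unnamed official briefing",
    mom=True, pop=True, moses=False, eve=False,
    notes="Opponent has a history of strategic deception.",
)
print(check.flags())  # → 2 (two of the four checks flag a concern)
```

A higher count does not prove deception; it simply signals that the source deserves the closer scrutiny described in the next section.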

Checklist for Assessing Information Quality


Skills in information literacy require people to assess the truthfulness (validity) of the information they find. As can be seen from the above discussions, there is potential for bias and misinformation in almost every information source. Assessing the quality of information found is a key factor affecting the validity of any thinking project. How much confidence a thinker places in their findings depends largely on the accuracy and quality of the information used in the project. Triangulation of information should be a goal of every project. At times, however, using multiple sources to check and recheck information is not possible. In any case, having multiple sources of information on an issue is not a substitute for information that has been thoroughly assessed.


Examining the quality of the information used throughout a project helps the thinker avoid anchoring their analysis on weak information. Thinkers must strive to understand the context and conditions under which critical information used in their thinking was collected and reported. Thinkers should determine “what is known with some certainty” and “what is not yet known,” and continually assess motivations, ideologies, and biases, plus check for inadvertent errors that may arise in the observation, interpretation, and reporting of the information. General Colin Powell, former Chairman of the U.S. Joint Chiefs of Staff and U.S. Secretary of State, reportedly would ask his intelligence staffs: “What do we know?” “What do we not know?” and “What do we think?” This is good advice for any thinker or decision-maker.


Thinkers should assess and annotate the quality of all information used in a project. It is always a good idea when searching for information to make notes regarding the information’s content, strengths, and weaknesses, which allow later review by others. It is important to determine the quality of all information used in a thinking project (facts, data, evidence, logic, reasoning, etc.). Figure 3.7 provides a checklist for assessing any information.


Figure 3.7

Checklist for Assessing Information[23]

Getting Started:

___Systematically review all information for accuracy (truthfulness). Assess existence of potential misinformation sources discussed above and in Chapter 4. Specifically look for:

___Lack of Corroboration

___Gaslighting

___Circular Reporting

___Omission of Information

___Lying with Numbers

___False Causality

___Counterfeit/Hidden Agendas

___Deception

___Cognitive Biases (Chapter 4)

___Informal Logic Fallacies (Chapter 4)

___ Assess violations of good argumentation (Chapter 4).

___ Assess violations of the scientific method (directly affects validity).

Also Consider:

___Reality Check: what is accurate and inaccurate about the content of this information?

___Private Gain or Public Good: who is benefiting politically, financially, or in other ways from the distribution of this information?

___What’s Left Out: what information is omitted that affects the point of view of this information?

___Values Check: how does this information align with or contradict accepted values?

___Read Between the Lines: what ideas are implied but not stated directly in the information (i.e., what assumptions)?

___Stereotype Alert: consider the ways the information uses stereotypes to influence the thinker’s emotions.

___Solutions Too Easy: does the information hope to attract the thinker’s attention by simplifying more complex ideas or concepts?

Finally:

___Identify information that appears the most critical or compelling in constructing good arguments or solutions. Look for sufficient and strong corroboration of critical or compelling reporting.

___Consider whether ambiguous information has been interpreted and caveated [noted] properly in sources.

___Indicate a level of confidence (high, medium, or low) that can be placed on sources used in the thinking project.

___Record/Save for Later: is the overall worth or value of the material such that it should be used in the current (or later) project? Use Figure 3.8 as needed.
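For thinkers who track their checklist answers electronically, the misinformation items at the top of Figure 3.7 can be represented as a simple list of yes/no findings. This is a hypothetical sketch only (the function and variable names are the author's of this sketch, and the item strings abbreviate the figure's entries); it returns the items flagged as present for one source:

```python
# Abbreviated misinformation items from the Figure 3.7 checklist.
MISINFORMATION_CHECKS = [
    "lack of corroboration", "gaslighting", "circular reporting",
    "omission of information", "lying with numbers", "false causality",
    "counterfeit/hidden agendas", "deception",
    "cognitive biases", "informal logic fallacies",
]

def assess(findings: dict) -> list:
    """Return the checklist items flagged as present for a source.

    `findings` maps item names to True (concern found) or False.
    Items not mentioned are treated as not flagged.
    """
    return [item for item in MISINFORMATION_CHECKS if findings.get(item)]

# Hypothetical example: a source showing two warning signs.
flags = assess({"gaslighting": True, "false causality": True})
print(flags)  # → ['gaslighting', 'false causality']
```

Any flagged item sends the thinker back to the relevant discussion above (or in Chapter 4) before the source is trusted.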


Figure 3.8 provides a template for recording information quality after using Figure 3.7. Small thinking projects may only have two or three items of critical information. Larger thinking projects may uncover hundreds of items of information with facts, statements, propositions, and assumptions. It is not intended that all these items in larger projects would be recorded in Figure 3.8. Instead, only record the most critical information to be further assessed and used to generate the project’s findings. Figure 3.8 is intended to be used throughout the thinking project, adding new critical information as it is uncovered and assessed and deleting non-critical information as the project progresses.


Figure 3.8

Template for Quality of Information Checks*

Source | Critical Information Provided | Corroboration of Information | Confidence Level (H, M, L) | Comments

* Add additional rows as needed.
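Figure 3.8's columns map naturally onto a small record structure. The sketch below is a hypothetical illustration (the class name, example rows, and validation logic are the author's of this sketch, assuming the H/M/L confidence levels named in the figure); it shows one way to add critical information as it is assessed and drop non-critical rows as the project progresses:

```python
from dataclasses import dataclass

CONFIDENCE_LEVELS = {"H", "M", "L"}  # High, Medium, Low (Figure 3.8)

@dataclass
class InfoRecord:
    """One row of the Figure 3.8 quality-of-information template."""
    source: str
    critical_info: str
    corroboration: str
    confidence: str  # "H", "M", or "L"
    comments: str = ""

    def __post_init__(self):
        if self.confidence not in CONFIDENCE_LEVELS:
            raise ValueError("confidence must be H, M, or L")

# Build the table, adding rows as new critical information is assessed.
table = [
    InfoRecord("Inspection reports", "No stockpiles located",
               "Multiple independent teams", "H"),
    InfoRecord("Unnamed defector", "Claims of hidden weapons labs",
               "No corroboration found", "L",
               "Single source; treat with caution"),
]

# Later, drop rows judged non-critical or too weak to rely on.
table = [row for row in table if row.confidence != "L"]
print(len(table))  # → 1 (only the corroborated, high-confidence row remains)
```

Keeping the table in a structured form makes it easy to sort by confidence level or review only the rows that support a project's key findings.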


The main lesson of this chapter is to “question everything” until the information is assessed as valid (truthful). The material above provides the thinker with techniques for assessing information truth and validity. Chapter 4 follows with discussions of cognitive biases (heuristics) and informal logic fallacies and their debilitating effect on the quality of information and knowledge. The end of Chapter 4 includes material on constructing good arguments, whose standards are often violated in open-source material.


Notes

  1. Merriam-Webster online dictionary “misinformation,” https://www.merriam-webster.com/dictionary/misinformation. ↑

  2. Categories of Information People and their percentage of the population are author’s subjective compiled view based on a number of differing media reports addressing the subject. ↑

  3. H. Russell Bernard, Social Research Methods, Qualitative and Quantitative Approaches (Thousand Oaks, CA: Sage, 2000), 8-9.↑

  4. The epistemological origins of knowledge vary depending on the social science or philosophy references consulted. This list is a synthesis of several sources on epistemology and was originally published in Michael W. Collier, “A Pragmatic Approach to Developing Intelligence Analysts,” Defense Intelligence Journal 14, no. 2 (2005), 19.↑

  5. Bernard, 14-18.↑

  6. Samuel Huntington, The Clash of Civilizations and the Remaking of World Order (New York: Simon & Schuster, 1996). ↑

  7. James Barham, “Top Influential Think Tanks Ranked 2024,” October 2023, https://academicinfluence.com/inflection/study-guides/influential-think-tanks. Also see The Best Schools, “The 50 Most Influential Think Tanks in the United States,” August 2020, https://thebestschools.org/features/most-influential-think-tanks/ (accessed October 29, 2020). ↑

  8. Figure 3.6 compiled from a thematic content analysis by the author of several Internet sources evaluating news bias. For additional information see https://mediabiasfactcheck.com. Also see Adfontes, “Home of the Media Bias Chart,” https://adfontesmedia.com/ (accessed May 2025). ↑

  9. See Ted Hesson and Mica Rosenberg, “Trump says migrants are fueling violent crime, Here is what research shows,” Reuters, July 16, 2024, https://www.reuters.com/world/us/trump-focuses-migrants-crime-here-is-what-research-shows-2024-04-11/ (accessed August 18, 2025).↑

  10. Gaslight (1944) followed a 1938 British play and a 1940 British movie with similar plots. In late-1880s Britain, gas lights were used for street lighting and for home and business interior lighting.↑

  11. Modified from Maura Hohman, “Ask these 5-word questions if you think someone is gaslighting you, therapist (Niro Feliciano) says,” Today, May 21, 2025. ↑

  12. The author found himself in this same situation. I took all the advanced mathematics courses my rural high school offered and obtained a decent SAT math score. On entering the U.S. Coast Guard Academy, they offered a summer math refresher course where the faculty identified those of us not ready for college calculus. In our first semester as freshmen those of us not ready for calculus took a Precalculus course (nicknamed Football Math) that prepared us for calculus in our second semester and eventual success in the Academy’s intensive STEM (Science, Technology, Engineering, and Math) curriculum.↑

  13. Melissa Goldin, “President claiming Harvard University offers remedial mathematics,” May 30, 2025, Associated Press. ↑

  14. Twain attributed this saying to British Prime Minister Benjamin Disraeli, but there is no record of it in Disraeli’s writings or public statements. It has been found in British political discourse in the mid- to late-1800s.↑

  15. Perla Trevio (The Texas Tribune), Mica Rosenberg (ProPublica), et al., “Trump administration knew most Venezuelans deported from Texas to a Salvadorean prison had no U.S. convictions,” May 30, 2025, The Texas Tribune.↑

  16. For a deeper understanding of survey theory see Chapter 3 of Michael W. Collier, Security Analysis, a Critical-Thinking Approach (Richmond, KY, Eastern Kentucky University Libraries Encompass, 2023), https://encompass.eku.edu/ekuopen/6/ or https://manifold.open.umn.edu/projects/security-analysis.↑

  17. Ibid. A deeper understanding of causality is included in Chapter 3 of Collier Security Analysis, a Critical-Thinking Approach.↑

  18. Barton Whaley, “The Prevalence of Guile: Deception through Time and across Cultures and Disciplines,” essay prepared for the Foreign Denial and Deception Committee, DNI, Washington DC, February 2, 2007, reprinted in Robert M. Clark and William L. Mitchell, Deception, Counterdeception and Counterintelligence (Los Angeles, CA: Sage-CQ Press, 2019), 9. ↑

  19. Sun Tzu, The Art of War, trans. Samuel B Griffith (London: Oxford University Press, 1963).↑

  20. Colonel Michael J. Donovan, USMC, “Strategic Deception: Operation Fortitude” (Strategic Research Project, U.S. Army War College, 2002), 11-12, https://apps.dtic.mil/dtic/tr/fulltext/u2/a404434.pdf (accessed October 20, 2020).↑

  21. For a more in-depth analysis of the start of the 2003 Iraq War see Chapter 7 of Barnet D. Feingold and Michael W. Collier, Critical Belief Analysis for Security Studies (Richmond, KY, Eastern Kentucky University Libraries Encompass, 2024), https://manifold.open.umn.edu/projects/critical-belief-analysis-for-security-studies.↑

  22. Sarah Miller Beebe and Randolph H. Pherson, Cases in Intelligence Analysis, Structured Analytic Techniques in Action (Los Angeles, CA: Sage-CQ Press, 2015), 75-77.↑

  23. Modified from Mind over Media, Lesson 5, “Analyzing Propaganda with Critical Questions,” https://propaganda.mediaeducationlab.com/teachers (accessed November 9, 2022).↑

This work is licensed under a Creative Commons Attribution 4.0 International License. (see https://creativecommons.org/licenses/by-sa/4.0/).