Epistemology and Infectious Misinformation

"Oh say, what is truth? 'Tis the fairest gem... 'Tis the brightest prize to which mortals and Gods can aspire."  - LDS Hymn #272 - "Oh Say, What is Truth"

How many of our current beliefs might be wrong? If we are wrong, would we truly want to know? Are there certain personal beliefs that we consider beyond questioning? Do the benefits of our beliefs outweigh the need to know whether they are true? If our ultimate goal is to believe as many correct things as possible, how much time are we actually putting into evaluating information that may disprove our beliefs?

These are some of the questions explored by the field of epistemology.  Simply put, epistemology is the study of "what distinguishes justified belief from opinion."  It's less concerned with what is objectively true (ontology) and more concerned with the methodologies used to arrive at our beliefs. We tend to spend a lot of time justifying and confirming our beliefs but very little time examining the reliability of the methods we used to develop our beliefs in the first place. So, as much as we would like to claim otherwise, many, if not most, of our beliefs, when critically examined, are not truly based on "hard evidence." This isn't to say they aren't actually objectively true or that such evidence does not exist - they might be true and evidence might exist - just that the methodology we used to arrive at our conclusions should leave significant room for epistemic humility, or the possibility that we may be wrong, even in many of our most deeply held beliefs.

Let's take a basic example that almost everyone "knows" to be true - the Earth revolves around the Sun. While this may seem painfully obvious in our modern society, how many of us could actually explain the astronomical evidence that supports the Earth's orbit around the Sun? Beyond pointing to experts in the field or "general knowledge," could each of us truly explain the mechanics of the Earth's orbit? I know I could not, but I wholeheartedly believe it to be true, and you would have a very difficult time convincing me otherwise.

Why do I not spend the time to research the astronomical evidence for the Earth's orbit?

Simply put, there are limitations on the resources I have at my disposal (mostly time and mental capacity), so I have to prioritize which information to process and give attention to. Our brains naturally use heuristics, or shortcuts, to bypass the impossibility of having to "know" all information. We rely on others whom we consider knowledgeable, and we are subject to known cognitive biases such as anchoring, confirmation bias, groupthink, the illusory truth effect, and hundreds of others. These serve a brilliant function of freeing up mental resources for daily routines and other tasks that hopefully bring fulfillment to our lives, but they undoubtedly also lead to a distorted view of reality.

If we truly wanted to dispel all of our false beliefs, we would have to spend a considerable amount of time and energy regularly evaluating critical or opposing arguments. But we don't generally have the time, energy, desire, training, or even sufficient information to either debunk or confirm all opposing positions. We therefore rely heavily on our own natural assumptions and the assumptions of those in our immediate social circles. For the most part, this may serve us well - we feel protected and unified in our shared beliefs. Many of our deeply-held beliefs give us a sense of meaning, purpose or community such that the benefits of such beliefs may outweigh our need or desire to investigate and critically examine whether they are true.

But what happens when our biases are wrong? At what point is it dangerous or irresponsible to believe something just because it agrees with my friend's view or my own preconceived notions of reality? Is it irresponsible to believe every conspiracy theory that supports our point of view and aligns with our biases? Is it irresponsible for me to immediately dismiss all opposing theories without proper evaluation? If information is blatantly false and potentially harmful to others, what responsibility do others have to point out the flaws? People may have a right to state their opinion, but conspiracy theories that feed on fear over facts can also be incredibly harmful to society. Not all opinions are created equal.

For example, with all of the uncertainty, fear and distress around the current Coronavirus pandemic, there are a number of conspiracy theories popping up: it was produced in a Wuhan lab, it was produced in a US lab, it's a socialist conspiracy to take away our constitutional rights, it was created by the US and Israel to attack Iran, it's part of Bill Gates' evil plans to depopulate the Earth, or even that the prophet of the LDS Church was partially behind it in an attempt to make money on their "Big Pharma" investments through vaccines (and no, I'm not making any of these up).

Whether I initially believe such theories (of which I am personally highly skeptical) will likely be a byproduct of my cumulative lived experiences. How do the implications of the claim impact my personal life? What do people I respect think of this? What do sources that I trust think of this? What natural biases do I hold, perhaps without even being aware of them, that shape how I view the information presented? We don't generally think of it consciously in such terms, but our brains are naturally good at observing social cues from our immediate environment and drawing conclusions, often based on little to no actual evidence. And then our opinions are bolstered when the "opposition" (YouTube, in the case of recent COVID conspiracies) removes or "censors" this information. It's interesting that YouTube removing such sources doesn't seem to cause people to critically examine why the information might have been false or misleading and therefore removed; rather, people feel further justified in their conspiracy theory narratives.

(As a side note, while YouTube has every right as a private platform to remove such videos, and I understand the desire to remove content that is blatantly false and potentially endangers others by encouraging them to disregard public health guidelines, I think censorship of misleading and false information often backfires. The best way to combat false information is not to delete it but to present better information - but it's also hard to get most individuals to even look at a video or fact-checking site that might suggest they were wrong to trust the initial source. So YouTube seems compelled to try to prevent the spread of infectious misinformation rather than merely contain it.)

So what is a good framework for evaluating information?

This is ultimately a question that everyone has to answer for themselves. Depending on the context, I think we all have different standards of evidence for our beliefs. In a religious context, many accept beliefs as "absolute knowledge" based on faith, feelings, or fruits. Is each of these a reliable means of discerning truth? Could someone within a different belief system use the same methodology to arrive at contradictory beliefs and conclusions? Perhaps the cost of discovering that our beliefs were not based on reliable methodologies is greater than the benefit of discarding false beliefs? In a secular context, many demand scientific evidence and proofs - some of which may also be based on faulty assumptions or incomplete data.

I do not claim to be an expert in discerning truth, but the ideas around why we believe what we do are fascinating to me. I know full well that I am subject to the same cognitive biases, noted above, that impact all human beings.

My internal questions - Brian's epistemological framework

Here are some of the questions that I try to ask myself in the instances where I do try to critically examine information:

1) What are my initial thoughts before even examining evidence?  What natural biases may I hold that impact my evaluation? How does this impact the sources I am examining to either prove or disprove the claim?

2) What would be considered sufficient evidence to disprove my current beliefs?  Am I applying a higher standard of evidence for information that disproves my beliefs and a lower standard of evidence for information supporting my beliefs?

3) Is the method used to arrive at my conclusions reliable?  Could someone holding a different and contradictory belief use the same method and arrive at a different conclusion?

4) What platform is producing the information?  Is the platform credible?  Are they known to produce reliable information or have they been discredited in the past?  What is the natural bias of the source (and while some sites are worse than others, ALL sources are biased)? (With respect to media bias, there are great sites that analyze bias such as https://mediabiasfactcheck.com).

5) Is the individual making the claim credible?  What sources is the individual using to make claims? Are the claims corroborated by other reliable sources, including perhaps sources with a very different bias? Does the individual have a history of presenting credible information?  Does the person making a claim serve to benefit financially from the claim? Are they transparent about that financial benefit?

6) How confident do I feel that my conclusion is correct?  Might that confidence be higher than I can truly justify?
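Question 2 can be made concrete with a bit of arithmetic. Bayes' rule says a piece of evidence should shift my confidence by an amount determined by how likely that evidence would be if the claim were true versus false - not by whether I happen to like the conclusion. A toy sketch (the numbers are invented purely for illustration):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return updated confidence in a claim after seeing new evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Suppose I'm 70% confident in a claim, and a credible source reports
# evidence against it: evidence that would appear 80% of the time if the
# claim were false, but only 20% of the time if it were true.
posterior = bayes_update(prior=0.7, p_evidence_if_true=0.2, p_evidence_if_false=0.8)
print(round(posterior, 3))  # 0.368 - my confidence should drop substantially
```

Applying a "higher standard of evidence" to disconfirming information amounts to quietly shrinking `p_evidence_if_false` for evidence I dislike, which keeps the posterior close to my prior no matter what I see.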

These questions can help me to better evaluate my own personal beliefs and try to embrace more truth and disregard error.  At the end of the day, I will still likely be wrong on many things, but the better I can become at identifying and counteracting my personal biases, the closer I will hopefully come to the "truth."
