It’s Okay To Be Uncertain
Confirmation bias is also the most prominent way that we fail to simply follow the evidence. If we perform research and keep an open mind, our task is simple: just follow the arrows where they point. But all too often, we are seduced into following the wrong arrows. These include the cognitive distortions of focusing on “must” and “should”, black-and-white thinking, the Dunning-Kruger Effect, and labeling.
Hear it Here - https://adbl.co/3n6a2fz
Show notes and/or episode transcripts are available at https://bit.ly/social-skills-shownotes
Learn more or get a free mini-book on conversation tactics at https://bit.ly/pkconsulting
#ConfirmationBias #DavidDunning #DunningKrugerEffect #IndependentThinker #InterpersonalCommunications #JustinKruger #Kruger #ItsOkayToBeUncertain #RussellNewton #NewtonMG #PatrickKing #PatrickKingConsulting #SocialSkillsCoaching #TheIndependentThinker
Transcript
As an extension of resisting your confirmation bias, another aspect of openness and mental flexibility is to follow the evidence. Wherever it points is where you go. Inevitably, a narrative begins to unfold as you delve deeper and seek to understand. All you have to do is look in that direction, without regard to how happy or unhappy it makes you.
You might find real evidence that supports your point of view—great. But you’ll also find evidence that you don’t necessarily want to face, the kind that offers cogent and reasonable arguments against your position. Even people who have devoted themselves to fearless truth-seeking might bristle at this evidence and try to avoid it. To a point that’s understandable, but if the evidence pierces the shields of a cherished identity—religious, political, social, allegiance to a fictional movie character—they may try to sidestep it and turn it away.
You know what I’m going to say: that’s exactly the evidence you need to follow, and follow to its utmost. It’s a deceptively simple task—if you can let go.
Treat all the evidence you receive by the same standards of reliability. All of it needs to pass the same sniff test. You must be circumspect about all evidence, which means favoring high-quality information over high quantities of information.
Imagine that, after reviewing some information and perspectives, you see an arrow pointing in one direction. Call it a green arrow, to symbolize that it is correct. However, you will have to pick it out from a multitude of red arrows, which seem helpful but aren’t. Sometimes it’s more important to eliminate those red arrows, because you can never be quite sure that you are indeed following the green one.
Beware of black-and-white thinking. Black-and-white thinking is easy. That’s why people practice it. But it’s also dangerous. Starkly contrasted, right-or-wrong belief systems are the downfall of modern civilization. Unfortunately, they’re not going away anytime soon, but to maintain a path of intellectual openness, you must learn to avoid black-and-white thinking—and keep it from creeping into your own beliefs.
Black-and-white thinkers only see two options for anything: “You’re with us or against us.” “If it’s not Mars, it’s Venus.” “If I don’t follow that red arrow, it must be this red arrow.” If the evidence doesn’t point one way, it must point the other. Perhaps, for them, the middle ground doesn’t exist because being certain matters more than being right.
But that’s an error of epic proportions. If someone doesn’t like the color red, it doesn’t mean they like blue. One discovery does not necessarily rule out another, and very few things are linked by simple cause and effect.
Only a few truths are absolute, and they’re the ones provable by evidence. But all other truths—more accurately, beliefs—are more nuanced. There’s more to consider and think about when deciding what’s true and what’s not.
In their natural state, most people are not sharply one way or the other, and neither is the truth. Perhaps the tendency toward black-and-white thinking comes down to the following point.
It’s okay to be uncertain—it’s not okay to pretend you know what you don’t. Saying “Maybe” is a perfectly fine conclusion, and an opinion isn’t mandatory. But to many people, appearing unknowledgeable about current issues simply isn’t an option. Saying “I don’t know” is almost shocking to them, because they’ve internalized that statement as a sign of failure or some shortcoming. Yet all of the arrows you can currently see might be incorrect.
Many of us offer an uninformed opinion off the top of our heads to avoid that appearance. We think it’s better to have some insight—even if it’s completely off-base—about something we don’t understand than to remain quiet or express doubt. And then, as we tend to do, we stick to that stance for no good reason.
Sure, uncertainty is uncomfortable, or at least it can be. But it shouldn’t be so disarming that we try to conquer it by finding something, anything, to believe. The fear of being uncertain is why people accept conspiracy theories or the rantings of a charismatic cult leader. Those beliefs may be completely without merit, but people admire the sureness they provide. Even if the beliefs are absolute rubbish, to them, they’re better than having no beliefs at all.
Almost invariably, the information they’re getting is ill-founded, unsound, slanted, or even flat-out false. But that doesn’t matter, because they feel like they know something. It doesn’t even matter to them that they’re not right—because it’s important to feel certain.
Your biggest stumbling block in this situation is emotional: the anxiety you feel from a lack of certainty. You need to understand and believe that there’s nothing wrong with being uncertain, that ambiguity is not an affliction. Some might even say that being uncertain or ambiguous is exciting, because it opens up possibilities. In any event, being uncertain is far, far preferable to believing in something false.
De-stigmatize the dreaded three words “I don’t know.” You won’t lose points in the eyes of others. In fact, they might even appreciate that you’re the rare breed who doesn’t feel they have to have a ready set of opinions about something they don’t know anything about. It’s much, much better to be unsure than to be misguided.
Just remember this: Your search for truth is rooted in a desire to understand. You’re seeking knowledge—you are not necessarily seeking answers. The key to getting through that uncertainty is to accept the chance to test or confirm your beliefs.
Thinking “must” or “should”. This is one of the leading causes of biased and closed-minded thinking. Like other shortcomings we have discussed, it leads you to look at a set of evidence with a conclusion already in mind, based on how you picture something should occur, and to try to mold the evidence to fit that. It means expecting the world to be different than it is, which is the opposite of what you should be doing.
“Must” and “should” thoughts are beliefs that you unknowingly treat as fact. If they don’t materialize, then even after you see clear arrows pointing another way, you’ll still be hesitant to follow the evidence. For example, you could carry the belief that dogs “should” be friendly—how might this “should” hurt you if you encounter a wild dog frothing at the mouth with rabies? Shoulds and musts masquerade as evidence, and for that reason, they are a red arrow to be avoided.
Beware of the Dunning-Kruger Effect. The term was coined by Cornell psychologists David Dunning and Justin Kruger in the late ‘90s. Simply put, some people aren’t informed or knowledgeable enough to know what they don’t know. Even worse, they’re usually overconfident in their abilities because, to them, there is little nuance: only simple questions and simple answers. The more deeply they fall prey to this effect, the more confident they typically are in themselves.
Dunning-Kruger occurs in just about any setting where people assume they know best. You might see it during a chess match, where a novice feels that chess is extremely simple—while missing all the behind-the-scenes planning and nuance. It’s just not possible to follow the evidence if you have no idea what you’re looking for (and you don’t know that you don’t know).
Dunning himself recently noted that the effect isn’t necessarily fatal. Many people appear to have it simply because they don’t know the standards for success or accomplishment, so they’re essentially flying blind but giving themselves credit for keeping afloat. Once people become aware they suffer from the Dunning-Kruger effect and acknowledge it, they can reverse it by learning and putting that learning into practice.
You can take a few steps that will ameliorate the Dunning-Kruger effect. A lot of it involves things we’ve already talked about: being humble and realistic about your current state of being, not being intellectually lazy, and not thinking that you’re above anybody else in terms of intelligence or accomplishment. The world is usually not devoid of complexity, so if you feel there is little nuance, you are probably missing entire levels of analysis. To combat this, embracing self-challenge is key because it turns out that if something appears too simple to be true, you probably just don’t know what to look for.
Labeling. This is when you, quite naturally, attach labels to people, places, things, and perspectives. The problem is that a label’s very purpose is to reduce something to a single word, an abbreviation of information. These labels, usually not carefully chosen or given much thought, go on to form your beliefs. And those are the hazards with relatively accurate and descriptive labels—what about when you unwittingly use labels that are ambiguous, inconsistent, or inaccurate?
One big risk with labeling is that instead of describing specific situations or perspectives in isolation, you create negative and absolute labels. In such instances, it’s not about the situation in the moment; rather, it’s about making judgments about all future circumstances based on what you have just observed. It’s similar to when we confuse feelings for facts.
Jumping to conclusions is a theme that underlies everything that happens when you don’t follow the evidence. Sometimes there is no battling this, but the key is always to slow down. Try to honestly recall times when you have been fooled into making incorrect judgments, and when you have failed to see the whole picture. Think about interpersonal communications and how many misunderstandings you have experienced in your life, despite your best intentions and deep friendships.