Nobody gets by without cognitive biases. They are, by definition, systematic patterns of thought that operate largely outside our awareness. Cognitive biases lead us to jump to conclusions and make instant, rash assessments—and you just can’t build a meaningful relationship with practical intelligence if they’re all you depend upon. There’s reality, and then there’s the version of it that your biases create.
In order to use practical intelligence, you must have a firm grasp on reality in its objective form. There are a few specific cognitive biases we’ll take a brief look at in hopes that you’ll be aware of them and can keep them under control. I’m not going to suggest you avoid them, because sometimes they’re unavoidable. But I will say they’re ones to watch out for—to try to realize when they’re happening so you can pause and make at least a brief attempt to think more deeply when they come up. More metacognition.
The availability heuristic. The brain tends to prefer information that’s most readily available or comes to awareness rapidly. If something simply comes to mind swiftly or is easily remembered, we tend to attach an importance to it that it might not really deserve. This heuristic crowds out supporting information that might be important to consider, along with countering details that might be used to argue against it.
For example, you might see the topic “tsunamis” trending on Twitter. You follow a couple links to recent “news” reports that say tsunamis are expected to happen more often in the near future. The reports are compelling. You feel a little nervous. You become worried that you’ll be a tsunami victim. You start thinking that you haven’t prepared enough. You get to the point where you think it’s inevitable that someday you’ll get swallowed up by a tsunami and there’s nothing you can do about it.
In all this concern, you temporarily forget that you live in Kansas, a landlocked state in the middle of the United States where tsunamis never happen. Tsunamis require a large body of water; they strike island nations and coastal regions, not the prairie.
That’s the availability heuristic in a nutshell: you got spooked by a bunch of instantly accessible information that made you forget that the chances of getting swept away by a tsunami in Kansas are virtually zero. When you are asked about your fears, you answer “tsunamis” and ignore the rash of home burglaries that might be occurring, or that you are in danger of losing your job. Just because something is available or notable does not mean it is important or representative.
The gambler’s fallacy. This common cognitive bias magnifies the importance of past events in the prediction of future outcomes. The bias dictates that conditions and previous results point to the inevitability of something happening down the road—when in reality, each event is independent of the ones before it. This bias wants to create a cause-and-effect relationship where none exists. For instance, just because a coin has landed on heads one hundred times in a row doesn’t mean the next flip is any more likely to land on tails. There is no relationship between one flip and the next.
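The coin-flip claim lends itself to a quick sanity check. Here’s a minimal simulation sketch in Python (the streak length of five heads and the sample size are my own choices for illustration, not from the text): count how often the flip immediately following a run of heads comes up tails.

```python
import random

random.seed(42)  # reproducible illustration

# Simulate a long sequence of fair coin flips.
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect every flip that immediately follows a run of 5 heads.
after_streak = []
streak = 0
for f in flips:
    if streak >= 5:
        after_streak.append(f)
    streak = streak + 1 if f == "H" else 0

tails_rate = after_streak.count("T") / len(after_streak)
print(f"P(tails after 5 heads in a row) = {tails_rate:.3f}")  # close to 0.5
```

If the gambler’s fallacy were right, the tails rate would climb above one half after a streak of heads. Independence keeps it at roughly 0.5 no matter how long the streak.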
This particular cognitive bias is called “gambler’s fallacy” because it’s responsible for a lot of out-of-control gambling addictions. Somebody betting on a football game may say a certain side will win because they’ve always done so before, or because they’ve lost so many times that they are due for a win: “The Packers are due a win this week after all the tough losses, and they are going to get it against the Lions!”
Never mind that this guy would be a terrible gambler if that’s the information he used to lay a bet; the example illustrates the point. The past history of the Packers-Lions rivalry doesn’t determine this week’s outcome, and the Packers’ losses in recent weeks don’t mean their turn for a win is coming. Just because something happened doesn’t mean something else will happen. But someone employing the gambler’s fallacy would not pay those factors any mind.
Post-purchase rationalization. This cognitive bias seeks to reduce regret, and it’s based on a fairly common consumer behavior.
Say you’re shopping for home theater equipment. You go to a showroom and see a couple different models. One set of speakers is extremely expensive, features a lot of bells and whistles, and takes up a lot of space. The other’s a bit cheaper and smaller, but to the naked ear, doesn’t seem to be much different in terms of quality.
You might be persuaded to buy the bigger and more expensive one because, since it’s bigger and more expensive, it must work better. But it puts a serious dent in your bank account and is too big for your living room. And you might not even be able to hear the difference.
If you employed post-purchase rationalization, you’d convince yourself that you made the right decision, that it’s what you wanted to do all along. You’d tell yourself that you can hear the difference in sound, and you do indeed need fifteen different plugs and ports. You might know deep down inside that you went overboard, but that knowledge makes you uneasy. Regret makes you feel stupid, and no one likes that. So you talk yourself into believing you did the right thing and got exactly what you wanted. No more regret, just eating boxed macaroni and cheese for dinner for the next two months because you spent so much on new speakers.
This type of post-anything justifying behavior extends far beyond purchases. These are more commonly known as defense mechanisms, and they allow us to defend our egos from scrutiny and shame. We do this sometimes when we defend ourselves from others, but here, instead of trying to convince someone else, we are trying to convince ourselves. Sometimes, we are not even aware we are using them. That’s the scary part.
Confirmation bias is when we want to believe in a certain “fact” so badly that we only look for and agree with information that supports the belief—and ignore compelling evidence that disproves it. Instead of “I’ll believe it when I see it,” confirmation bias dictates a standpoint of “I’ll see it when I believe it.”
Confirmation bias happens all the time. Those of us with any type of political beliefs tend to consume news from sources that support our opinions and block out legitimate sources that offer countering views. If you are looking to buttress a certain stance that you believe in, you will feel that each supporting source is legitimate and thorough, while each critical source is flawed or has an agenda. There is only one view that you’ve pre-decided, and you’ll mold the evidence to fit it. You see it when you believe it.
Even scientists, who supposedly hold objectivity in high esteem, might suppress or ignore data that challenges their findings just to have a better chance of getting published in scientific journals. In fact, this is one reason an experimental technique called the “double blind” was invented—so that neither the researchers nor the participants know who is being subjected to which condition, and the researchers can’t unconsciously skew the results in a confirmatory direction.
We practice confirmation bias because, quite simply, it hurts to be wrong. Taking the emotional component out of our “truths” as much as possible and being open-minded about divergent views help us combat confirmation bias and think smarter.