Imagine someone who’s experiencing serious and debilitating health concerns. The doctors are stumped. The person starts seeking alternative medicine options, but after lots of promising treatments, the condition persists. The person finally decides that their problems must be due to a food allergy, and if only they can stop eating the offending food, they’ll get better.
They eliminate and reintroduce all the usual food culprits. The symptoms get worse. They eliminate more food groups—the resulting headaches and stomach troubles must be down to “detoxing,” they think. The symptoms get worse. They’re now more convinced than ever that their hard-to-pin-down food allergy is even more dangerous than they’d feared. They end up telling everyone about this food allergy, rearranging their entire lives to accommodate a new diet, and even educating others about the condition. Their symptoms get worse.
Many years later, a conventional doctor accidentally identifies the real problem: it has nothing to do with food at all, and is fixed within a matter of weeks with a commonly available medication. Oops.
It’s easy to see what went wrong in this example—the person simply became too invested in one particular perspective. Unable to consider alternatives, they got stuck seeing their situation in a narrow, fixed way, unable to absorb new information, adapt, question assumptions, or drop old strategies when they simply didn’t work anymore. Like a horse wearing blinkers, they had “tunnel vision” that actually prevented them from acting in their own best interests.
This isn’t to pick on alternative medicine, either—the story could have easily gone the other way, and most of us know of conventional doctors who doggedly pursue what they think is the problem, all the while missing the answer staring them in the face.
Nothing stops us from learning more than the belief that we already know what we need to know.
As human beings, we all need to make models of the world we find ourselves in. We need to make predictions, assumptions and best guesses—we would get nowhere if we didn’t! The trick, however, is to know when to abandon an old model when presented with new information. An inaccurate perspective is only a problem if we fail to adjust it when we know better.
There are other limitations to being close-minded, a big one being how it makes you a real bore! Thinking you know it all already means you can’t act with humility or curiosity. Every situation you encounter simply gets turned into extra proof of your particular viewpoint, whether it fits or not. You end up discounting other equally valid perspectives—including those that could have taught you something.
Getting overly attached to one favorite viewpoint is actually the fastest way to ensure you learn nothing. It makes you feel as if you know a lot, when in reality you’re lacking self-awareness. What’s more, how could you ever recognize that you’re mistaken if you’re not even able to acknowledge the possibility?
Another obvious downside is that thinking you know everything makes you naturally more judgmental and intolerant of others. If you’re correct, then everyone who disagrees with you must be wrong. You have put yourself in a position where all difference is bad, and anyone who disagrees with you is plainly an idiot—what else could they be, disagreeing with the obvious truth?
You become worse at communication, and less willing to communicate in the first place, since there’s not much to be gained, except perhaps the joy of lecturing others. Your original impulse came from wanting to know, but the cruel irony is that it leads you to shut yourself off entirely from genuine knowledge.
As we saw in the fixed mindset chapter, not only are you less likely to be compassionate and understanding of others, but some of this judgmental attitude will reflect back on you, and you may be so self-critical that you never dare risk trying something new and different, for fear of being wrong.
Again, when it comes to success or failure, achieving goals, and fulfilling our true potential, it’s all about mindset. It almost doesn’t matter what adversity we encounter, what data we’re faced with, what random events befall us, or the skills and qualities we’re blessed with—what matters is how we frame it all, and how that perspective influences the way we act.
Just as we saw in the chapter on self-delusion, however, it can be tricky to really know how you’re thinking, what mental models you’re using, or how limited your perspective is. This is because a really useful model of the world will seem invisible, and you’ll look at it and sincerely believe you’re looking at reality itself, and not a model of reality.
The ability to honestly appraise your thought processes and think about your thinking is what distinguishes truly conscious, intentional people. Einstein famously said that you cannot solve a problem with the same thinking that created that problem in the first place. In the same way, we never really make big changes if we only work within our self-imposed limits—we need to step outside of them, and look at the limits themselves.
In our example, the person was engaging in purpose-driven, intentional and self-directed problem solving. But it didn’t matter because the frame inside which they were doing it was inaccurate. They allowed a set of faulty assumptions to go unchallenged, and missed the real insight of the situation in front of them.
Quick, here’s a question for you: Right now, what is your dominant way of interpreting the information that comes your way (including this question)?
We all like to think that we are neutral; that we are objective data-processing machines in an objective universe, and we are simply taking in information and responding to it. But we don’t acknowledge a powerful intermediate step in which we interpret what we perceive. This interpretation is total, constant, and may happen without you being in the least aware of it.
Julia Galef’s famous TED talk outlined what she saw as two fundamentally different mindsets, or approaches to interpreting incoming data from the world. The first she called the “soldier” mindset. The soldier approaches new information with a handful of beliefs they’ve already decided on. Their aim is to defend and protect those beliefs against new data—as though conflicting information were an enemy that needed to be shot down.
This was broadly the approach taken in our example. There was one key piece of new information that the person was bombarded with: the fact that their symptoms continually worsened, despite dietary changes. Their response to this new data was to double down on the already-held belief, even twisting conflicting information until it looked like evidence for the current model.
We can see this mindset any time we encounter warring factions, be they literal tribes and nations, or simply feuding families or groups who have decided they are one another’s enemies—believers vs. atheists, liberals vs. conservatives, rich vs. poor and so on.
This reminds us of the fixed mindset—i.e., we settle on an idea before taking in new information, and new information is filtered according to what we already “know.” As an example, consider a person who holds the sexist belief, “Women are bad drivers.” One day they encounter a brilliant woman driver. This data point is a threat, so the person simply concludes, “Well, she’s kind of a masculine sort of woman anyway—clearly an exception.”