Published on:

25th Jun 2025

Chapter 2. Watch Yourself, from Practical Intelligence: Overcoming Bias & Mastering Logical Arguments

00:00:12 Today's featured book from Patrick King is Practical Intelligence

00:04:55 Two Systems of Thought

00:16:22 Battling Biases

00:28:01 Four Cognitive Biases to Watch Out For

00:36:03 Beating Cognitive Bias

00:41:52 Logical Arguments

00:52:51 Takeaways

Practical Intelligence: How to Think Critically, Deconstruct Situations, Analyze Deeply, and Never Be Fooled, by Patrick King



Hear it Here - https://bit.ly/practicalintelligenceking



https://www.amazon.com/dp/B08263KGHK



Are you ready to become an expert in practical intelligence? In this episode, we dive into Chapter 2 of Patrick King's groundbreaking book, 'Practical Intelligence.' Discover the fascinating neuroscience behind our thought processes and learn how to harness both systems of thinking for optimal decision-making.


Biases can cloud our judgment, but with the strategies outlined in this chapter, you'll learn to identify and overcome them. From improving your critical thinking skills to mastering logical arguments, this episode will empower you to think faster and more accurately.


Don't miss out on these essential tips to stay ahead of the curve! Click the link below to get your hands on a copy of 'Practical Intelligence' and start your journey towards becoming smarter ASAP: https://bit.ly/practicalintelligenceking




Transcript
Speaker:

Hello listeners, welcome to Social Skills Coaching on June 25, 2025, where you become more likeable, more charismatic, and more productive.

Speaker:

Today's featured book from Patrick King is Practical Intelligence: How to Think Critically, Deconstruct Situations, Analyze Deeply, and Never Be Fooled.

Speaker:

Our episode today comes from chapter 2, entitled Watch Yourself.

Speaker:

In this chapter, we talk about the concept of metacognition, or thinking about our thoughts and how taking that step back and looking at a different level can assist us in our day-to-day lives.

Speaker:

We'll explore how becoming aware of our thinking patterns is crucial for maintaining clear and logical thoughts.

Speaker:

So get ready to discover the battle between System 1 and System 2 thinking, if you're familiar with Daniel Kahneman, and understand why overcoming the brain's natural inclination towards quick, instinctual thoughts is vital for accurate decision-making.

Speaker:

We'll also unravel the concept of cognitive biases, those mental shortcuts that can lead us astray, and we'll learn about common pitfalls like the availability heuristic or confirmation bias.

Speaker:

Through this episode, you'll gain insights into powerful strategies to overcome these biases and improve your thinking habits.

Speaker:

And also, we'll delve into the world of logical arguments, uncovering the truth behind statements and learning to identify flaws in reasoning.

Speaker:

This is a lengthy episode, so get prepared for an enlightening journey as we equip ourselves with tools to enhance our cognitive abilities and navigate the complexities of thought.

Speaker:

And don't forget, if you get lost or need a quick summary, check out the show notes.

Speaker:

There'll be a time link there to jump straight to a summary of the highlights and takeaway points at the end of the episode.

Speaker:

Thanks for joining us today.

Speaker:

Let's get it on.

Speaker:

In the first chapter of this book, we discussed how our psyches are pre-programmed to make the wrong judgments about what we see in a world that often isn’t what it seems to be.

Speaker:

We also talked about how skepticism and critical thinking can help us vanquish those errors in discernment.

Speaker:

In this chapter, we’ll face many of the same issues, but this time, we’ll start by talking about how biological factors can lead us to the same errors in judgment.

Speaker:

We are biologically programmed to lack self-awareness in many ways, and when we don’t know what is driving us, we will often be driven right off the cliff.

Speaker:

A doctor probably shouldn’t operate unless they know the true causes of an illness; we should understand the baseline that our brains operate from in order to properly move forward.

Speaker:

The brain seems like a complex machine, and in many ways, it is.

Speaker:

But you could view it as nothing more than an electrical network housed inside your skull.

Speaker:

Information goes from neuron to neuron via electrical impulses that flicker across the synapses.

Speaker:

Nothing more than trains and relay stations.

Speaker:

Just like the electric currents flowing from power outlets, the brain impulses that control our thoughts prefer to follow the path of least resistance.

Speaker:

The brain maintains these easy passageways of thought for maximum effectiveness and speed.

Speaker:

This means that the more you think about a certain thing, the more the brain isolates and reinforces the path that thought travels along—therefore making it easier to think about that thing again in the future.

Speaker:

That’s both good and bad news.

Speaker:

It’s good in that refining those thought pathways makes certain mental tasks easier to accomplish.

Speaker:

It’s helpful for creating good habits and reinforcing positivity.

Speaker:

But it’s bad because if we latch on to negative or errant thoughts, they can become our default thought patterns, and this can be difficult to avoid.

Speaker:

This is exactly where bad habits come from, and it’s why even though we logically know the spider won’t eat us, we can’t bear to face it.

Speaker:

Our biological makeup overrules smart and clear thinking, and in effect robs us of our common sense and practical intelligence.

Speaker:

Our brains become wired for flawed thinking, and this chapter uncovers how that happens and how to reverse it.

Speaker:

You can take measures to counteract the faulty thoughts that take hold in the brain, and gain clarity of thought.

Speaker:

It takes some dedicated and deliberate monitoring of our mental processes and understanding how they work.

Speaker:

We call this practice metacognition—very simply, thinking about our thinking and watching yourself dispassionately and objectively.

Speaker:

Two Systems of Thought

Speaker:

The brain is a wonder of biology.

Speaker:

However, just like all our body parts, it gets exhausted when it’s overused.

Speaker:

To prevent exhaustion, the brain regulates some of its processes so it can conserve the energy needed to power all its functions.

Speaker:

This means it’s always seeking shortcuts, so we don’t have to think every last thing through, thereby conserving energy.

Speaker:

In reality, the brain ends up cutting corners and ignoring important information in the interest of saving precious energy.

Speaker:

In past eras, this was helpful because it allowed us to prioritize survival instincts above all else.

Speaker:

But it’s much less useful to us now.

Speaker:

The brain’s search for shortcuts has led to two systems of thought—one focused on speed and conservation of energy, and the other on accuracy and analysis.

Speaker:

This is something we must be vigilant about, especially when we are introduced to new information or concepts.

Speaker:

The brain would rather save energy for survival situations, but little does it realize that it can actually cause them through flawed thinking.

Speaker:

This concept was popularized by professor Daniel Kahneman in his seminal book Thinking, Fast and Slow.

Speaker:

Through a series of experiments, Kahneman developed a model that explains the separate processes the brain uses to absorb and react to various bits of information, imaginatively titled System 1 thinking and System 2 thinking.

Speaker:

System 1 is “fast” thinking.

Speaker:

This mode is automatic and instinctive.

Speaker:

It’s what we use when we happen upon a situation that we’re familiar with and don’t need to process that much, like recognizing a friend, riding a bicycle, or doing single-digit math calculations.

Speaker:

Since it’s intuitive, System 1 thinking is also associated with emotional reactions, like crying or laughing when seeing an old photograph.

Speaker:

The main facet of System 1 thinking is effortlessness.

Speaker:

It doesn’t require anything in the way of analysis or consideration, instead using a framework of associations that we’ve already experienced time and time again.

Speaker:

System 1 is a series of mental shortcuts—called heuristics—that help us decode situations very quickly (more on those soon).

Speaker:

And because there’s little time or effort used in System 1 thinking, it expends less energy and isn’t terribly exhausting.

Speaker:

You’re not going to need a list of pros and cons to make decisions with System 1.

Speaker:

Although System 1 is quicker, it’s aimed at doing the fast thing versus the right thing.

Speaker:

System 2, on the other hand, is “slow” thinking.

Speaker:

This style is much more contemplative and analytical, and it’s generally used when we’re absorbing a new experience or learning.

Speaker:

It’s employed for any situation that requires more mental labor and effort.

Speaker:

System 2 controls decision-making in events that could result in high consequences, like choosing a college, buying a new car, or quitting your job.

Speaker:

You also use System 2 when you’re doing something that needs more focus or effort, like driving through a foggy night, striving to hear someone speak in a noisy room, trying to recall a conversation you had a few weeks ago, or learning a complex school subject that’s new to you.

Speaker:

Skepticism and critical thinking, like the kind we discussed earlier in this book, fall under System 2.

Speaker:

Whereas System 1 thinking is fluid and instinctive, System 2 thinking is the opposite: it’s deliberate, conscious, and methodical.

Speaker:

System 1 thinking is the proverbial skydiver, where System 2 thinking is the proverbial cautious lawyer.

Speaker:

System 2 needs time and labor to process new information—and as a result, it uses more brain energy and can be tiring or draining.

Speaker:

That flustered and fatigued feeling you might get while studying or reading a book isn’t because you can’t understand it or are bored; it’s an actual biological imperative.

Speaker:

You’re using up your System 2 energy.

Speaker:

Both systems of thinking are important to us, as we use them for different situations.

Speaker:

Most times, you don’t need to stop and think about whether to go at a green light or hug a close friend you haven’t seen in a while.

Speaker:

They’re just things you automatically do, and there aren’t a lot of considerations or analyses you have to make before you do them.

Speaker:

You draw from your past experiences and make an instant decision.

Speaker:

On the other hand, if you’re reading a calculus textbook, planning a family budget, or deciding whether to ask someone to marry you, you can’t just think casually or rely on your instincts to pull you through.

Speaker:

There’s a time and place for both System 1 and 2 thinking.

Speaker:

Remember when we said the brain prefers the path of least resistance?

Speaker:

This means it favors System 1 thinking whenever possible, and you must force yourself to take a step back and switch into System 2 thinking, which is what we’re after most of the time in clear thinking.

Speaker:

System 1 thinking is ultimately quite limiting, which is unfortunate because it’s where our minds go first and foremost.

Speaker:

It makes us susceptible to accepting things and situations at first glance, not thinking skeptically, being more gullible, and overall faulty thinking.

Speaker:

It makes us impulsive and rash without considering consequences or implications.

Speaker:

It makes us think dumber.

Speaker:

For things you encounter on a regular basis or have deep familiarity with, it’s great—this is where System 1 thinking shines.

Speaker:

If you have a plethora of experience with a situation, this system can indeed help you make a good decision.

Speaker:

It’s also obviously useful when dangerous elements are present, as System 1 thinking springs you into action where analysis and careful consideration would leave you dead.

Speaker:

But in situations you’re a stranger to (most situations), the instincts of System 1 are almost worthless.

Speaker:

System 1 uses familiarity and pre-existing associations to work.

Speaker:

Your instincts are based on what you’ve already known, seen, or experienced; they won’t do you any good in a completely unfamiliar situation.

Speaker:

You can’t go straight from driving a car to operating a sailboat—even though you know how to change direction with a steering wheel, those instincts won’t help you when you’re trying to change direction with sails.

Speaker:

System 1 also presents certain road bumps in learning.

Speaker:

Remember, your brain’s preferred default mode is easy (some would say lazy).

Speaker:

When presented with new material to learn, there’s always a chance your brain will take a look at the information and attempt to reduce and simplify all the nuance out of it.

Speaker:

Perhaps Kahneman’s overall lesson is that we must fight our natural instinct to be lazy thinkers.

Speaker:

But System 2 ain’t always what it’s cracked up to be either.

Speaker:

We can’t do it all the time because it would be impractical and too time-consuming.

Speaker:

But more importantly, it’s plain exhausting, especially if you have to keep forcing yourself to do it.

Speaker:

Maintaining your self-control, which is what metacognition is, is an exhausting effort—if you use up all your resources to force yourself to do something, chances are, you’ll be less able to do it next time.

Speaker:

This leads to a side effect called ego depletion.

Speaker:

Expending the mental energy required in System 2 thinking can leave one so drained that it turns their brain into mush and prevents them from exercising it in the near future.

Speaker:

If you’re pushing yourself to read a chapter of James Joyce’s Ulysses—not the easiest work of fiction to consume, a System 2 book if there ever was one—you might not feel up to it next time unless you’ve formed a System 1-type connection with the material.

Speaker:

It’ll just be too much.

Speaker:

Hand over that cheap romance novel with Fabio’s chiseled jawline on the cover.

Speaker:

Ego depletion simply leads to mental fatigue, which doesn’t just cause System 1 thinking; it leads to incredibly poor decision-making out of stress, discomfort, and anxiety.

Speaker:

After a bruising day at work, it’d be understandable if you decided not to make dinner from scratch—which would probably taste better and be healthier—and just warm up something in the microwave.

Speaker:

(On a side note, additional studies on ego depletion have turned up inconclusive results, but we can still make a credible argument that consciously having to think about ten tasks in a day is more mentally strenuous than thinking about two tasks in a day.)

Speaker:

What can we do about this?

Speaker:

First and foremost, attempt to be aware that there are two levels of thinking, and realize that you are probably using System 1 more often than not.

Speaker:

Consciously ask yourself if you are making the right choice or merely the quick one.

Speaker:

Ask what is driving you—truth or speed.

Speaker:

In other words, use metacognition and watch yourself dispassionately and objectively.

Speaker:

Draw a contrast between the two types of thinking and become familiar with what each feels like to you.

Speaker:

Awareness is the first step to any type of change, because it lets you know where you need to go.

Speaker:

Second, and this is much more difficult, is to transform your System 2 diligence and struggles into System 1 instincts through intentional repetition.

Speaker:

Think of chess masters.

Speaker:

For extended periods of time, they have to employ System 2 thinking to understand all the complexities of the game.

Speaker:

But after years of experience, they’ve catalogued thousands of scenarios, effective strategies, and defenses.

Speaker:

With all that information now at immediate disposal, they can make most of their opening moves without thinking, as all combinations are mere patterns they have seen before, and they only have to switch to System 2 if they run into trouble in the game.

Speaker:

Simply put, exposure and consistency shift us from System 2 to System 1 thinking over time.

Speaker:

It eventually becomes the path of least resistance for you.

Speaker:

When you can turn something that requires concentrated effort into something you do habitually, it can be dramatically transformative.

Speaker:

The hardest part of any new thing—starting a diet, a new job, learning to drive—always comes at the beginning, when you’re using System 2.

Speaker:

But after a while, and it’s usually not that long, you start to feel more natural in what you’re doing, and it becomes a positive habit that you don’t have to think too much about (System 1).

Speaker:

If you want chess to be less mentally taxing, it takes time to move all the strategy and combinations from System 2 to System 1.

Speaker:

It’s exactly the same way with critical thinking.

Speaker:

You are mired in bad thinking habits in System 1.

Speaker:

Therefore, begin to intentionally slow yourself down and walk through the steps of analytical and skeptical thinking.

Speaker:

It will take time, but it will eventually become a habit, and a habit is really just another way of saying that something has moved from System 2 to System 1 thinking.

Speaker:

And eventually it’ll be something you just automatically do.

Speaker:

Battling Biases

Speaker:

Another way in which we must watch our thinking is in relation to cognitive biases.

Speaker:

In short, they are additional ways in which the brain instinctively seeks shortcuts and the path of least resistance, which then result in faulty thinking that distorts or misinterprets the truth and could subsequently lead to bad decisions.

Speaker:

A cognitive bias is a pattern of thought that favors one’s own experience, beliefs, and subjectivity over reason and objectivity.

Speaker:

We all have our own perceptions and opinions about the world and how it works.

Speaker:

These beliefs come from our own experience, and we use them to make predictions and judgments about what happens.

Speaker:

The problem with cognitive biases is that they only reflect a single, solitary person’s experience: yours.

Speaker:

(Or mine.)

Speaker:

Your memory of a certain event is colored by your own beliefs about it.

Speaker:

And let’s face it—that viewpoint is probably biased.

Speaker:

Every time a relationship breaks up, each person has their own perception of what went wrong and tends to retell the story of the breakup in ways that make themselves look better and alleviate their fault in the matter.

Speaker:

Now imagine this tendency as applied to each and every situation that has open-ended interpretations.

Speaker:

Other times, cognitive biases are a byproduct of downright incorrect thinking and interpretations of the world that, once again, are caused by the brain’s proclivity for certainty and System 1 thinking.

Speaker:

If we aren’t aware that we are operating from a sorely limited worldview and perspective, then we are doomed.

Speaker:

Biases can also come from the two ways in which we organize the limited information we have about the world: schemas and heuristics.

Speaker:

They serve to put what we know about the world into action and facilitate quick decision-making.

Speaker:

Each of them goes a long way in producing our psychological triggers.

Speaker:

Schemas.

Speaker:

A schema is a model by which we arrange and decipher the information we’re currently receiving.

Speaker:

It allows us to say, “Okay, based on these three factors I can observe, I know what this is and how to act.” Imagine a schema as a snapshot of a certain situation, and we use that snapshot to arrange unfamiliar information.

Speaker:

Introduced by psychologist Jean Piaget, schemas are contextual, and we have schemas for different types of situations.

Speaker:

Schemas develop throughout our entire lives, though they’re most prevalent when we’re learning about something for the first time.

Speaker:

But while schemas are extremely useful, they can steer us toward unwarranted biases or errors.

Speaker:

Heuristics.

Speaker:

Where schemas help (and hurt) us in learning and interpreting certain things, heuristics are more about how quickly we solve problems and make decisions.

Speaker:

A heuristic is a shortcut our minds take when choosing a certain course of action, and as with a schema, it can be helpful or harmful.

Speaker:

How do they differ?

Speaker:

Where schemas are about understanding a situation at large, heuristics are about your role in a situation and how to act within it.

Speaker:

“If this is the situation,” a heuristic says, “then I should act in this way.”

Speaker:

We make hundreds of decisions every day.

Speaker:

Most of them are small, ultimately trivial ones: what we’ll have for lunch, what radio station we’ll listen to on the way home, what grocery store we’re going to shop at, and so forth, unlike major life decisions that could have long-term consequences.

Speaker:

We simply can’t evaluate every last detail or possible ramification of small decisions.

Speaker:

It would be a waste of valuable time and mental energy.

Speaker:

That’s where heuristics come in.

Speaker:

They’re mental guidelines based on past experiences that we use to make basic daily decisions.

Speaker:

Think of heuristics as flashcards: they give us quick, abbreviated information to help us make speedy choices about daily decisions that we can’t stop and deliberate over.

Speaker:

Like the prior section discusses, our brains are crazy about System 1 thinking and using schemas and heuristics: they take less effort, energy, and time, and make everything simple.

Speaker:

Cognitive biases reinforce that preference because they encourage snap judgments and quick decisions.

Speaker:

As convenient as they are, they blind us to the complex realities that dwell underneath almost everything under the sun.

Speaker:

Cognitive biases do have their uses, however.

Speaker:

Sometimes you have to clear the decks of your brain’s queue, and to do that, certain positive applications of cognitive bias can be good things.

Speaker:

But even in those situations, you need to be aware enough to keep your bias from becoming too strong.

Speaker:

Having acknowledged this, there are four particular life scenarios where cognitive bias can actually be beneficial:

Speaker:

First.

Speaker:

When there’s too much information to absorb.

Speaker:

We live in a time when there’s a deluge of facts, data, statistics, stories, accounts—basically too much information.

Speaker:

The overload can be exhausting, and usually contains at least some bits of info that are of no use to us whatsoever.

Speaker:

We can become overwhelmed and paralyzed.

Speaker:

So it becomes necessary to filter the information, setting aside what’s irrelevant and retaining only the parts we can use.

Speaker:

Cognitive bias can help reinforce that filter, and it does so in several ways.

Speaker:

The brain tends to latch on to the most repeated or recently activated memories (something called the availability heuristic, which we’ll explain more in a bit).

Speaker:

It also tends to remember events or people that are strange or humorous, and notices more strongly when something has changed.

Speaker:

So especially when you’re in a situation where you experience a lot of repetition—like a day-to-day “grind” job—your brain tends to relate more to what it already knows, as well as when something is odd or different about it.

Speaker:

Of course, when we’re experiencing a flood of information, we could fall prey to confirmation bias and deliberately exclude anything that doesn’t support our most highly cherished beliefs.

Speaker:

And that could mean we’re missing out on something extremely important.

Speaker:

Second.

Speaker:

When we’re struggling to find “meaning” in certain things.

Speaker:

We need context for the confusing and varied events that happen in our lives and the things we observe in the world.

Speaker:

If we can’t derive any meaning or significance from them, we feel adrift and lost.

Speaker:

This puts us in an immediate state of vulnerability.

Speaker:

To avoid that, we take whatever information we’ve already filtered and try to find patterns and connections.

Speaker:

Cognitive bias has already reduced the amount of information we have.

Speaker:

Now the brain tries to find a certain story in that limited data.

Speaker:

In doing so, it relies on our personal experience, looking for past events to compare with this new happening so that it makes some kind of sense to us.

Speaker:

The risk in this, however, is that the brain may rely on outdated stereotypes we have set up or sweeping generalities and judgments we already believe.

Speaker:

Also, the brain tends to favor people or things we’re comfortable or familiar with—it considers them “better” than people or things we don’t like or don’t know much about.

Speaker:

The brain considers that information too.

Speaker:

In this situation, the cognitive bias won’t give a full picture, of course.

Speaker:

It’s all based on our very limited past experience.

Speaker:

But for a moment or two, it’s enough for our brains to develop some meaning from the situation.

Speaker:

Third.

Speaker:

When we need to act quickly.

Speaker:

Time puts us into a crunch.

Speaker:

Decisions need to be made in fast order.

Speaker:

If we let ourselves get bogged down by inactivity or don’t react swiftly enough, we can fall behind or risk our survival.

Speaker:

Cognitive biases can be helpful in that regard—although, again, not without potential hazards.

Speaker:

Our egos have a role in this action.

Speaker:

We have to feel that we’re capable of making a positive and important impact.

Speaker:

So the cognitive bias may fill us with a sense of confidence (or, more likely, overconfidence) to get the motor running.

Speaker:

In doing this, we may jump to conclusions.

Speaker:

But we’ll get stuff done and things will be in motion.

Speaker:

Sometimes this is indeed the most important factor.

Speaker:

Cognitive biases cause us to fall back on the things that are most familiar and comfortable to us.

Speaker:

We rely on the most immediate and available resources.

Speaker:

We focus on the present situation, preferring to ponder that instead of the past or the future.

Speaker:

We concentrate on things we can more easily relate to and eschew tools or assets that don’t make as much sense to us.

Speaker:

We strongly prefer solutions that look simple, thorough, and relatively risk-free, rather than answers that are overly complicated, vague, or unsafe.

Speaker:

When the clock’s running low, this may be a perfectly reasonable course of action.

Speaker:

And it’s almost entirely fueled by cognitive bias.

Speaker:

But since it comes fast and furious, there might be some cleanup required once everything’s settled down.

Speaker:

Fourth.

Speaker:

When we’re deciding what we need to remember for the future.

Speaker:

The final scenario in which cognitive bias might be of assistance concerns memory.

Speaker:

If only fragments of our constant information overload are useful to us now, then even less of it will be relevant to us in the days and years to come.

Speaker:

So again, we have to cherry-pick the things and details that we remember.

Speaker:

Our cognitive bias steps in to shape these memories.

Speaker:

This process basically involves reduction.

Speaker:

We’ll discard some of the finer specifics of things and events and form broader, more general memories.

Speaker:

We trim some of the multiple smaller events off and reshape them into a few basic key points.

Speaker:

Maybe we’ll pick out only a couple events and elevate them so they represent the whole experience.

Speaker:

In processing these new memories, our cognitive bias again defers to those that are most meaningful or familiar to the brain.

Speaker:

It will also “edit” certain memories so they become more accessible to us, but in this process, certain details might accidentally be removed or inserted—so we remember the event slightly differently than how it really happened.

Speaker:

And our memory of the experience is more affected by outside circumstances (our condition while the memory was happening, how the information’s being presented, and so forth) than by how crucial the information might be.

Speaker:

There are times when cognitive bias can help you, but biases are decidedly not the path to practical intelligence.

Speaker:

In fact, they veritably put a blindfold on you.

Speaker:

Thus, we must delve into a few of the most prominent biases to understand how to battle them when we can.

Speaker:

Four Cognitive Biases to Watch Out For

Speaker:

Nobody gets by without cognitive biases.

Speaker:

They are, by definition, patterns of thought that are undetectable.

Speaker:

Cognitive biases lead one to jump to conclusions and make instant rash assessments—and you just can’t build a meaningful relationship with practical intelligence if they’re all you depend upon.

Speaker:

There’s reality, and then there’s the version of it that your biases create.

Speaker:

In order to use practical intelligence, you must have a firm grasp on reality in its objective form.

Speaker:

There are a few specific cognitive biases we’ll take a brief look at in hopes that you’ll be aware of them and can keep them under control.

Speaker:

I’m not going to suggest you avoid them, because sometimes they’re unavoidable.

Speaker:

But I will say they’re ones to watch out for—to try to realize when they’re happening so you can pause and make at least a brief attempt to think more deeply when they come up.

Speaker:

More metacognition.

Speaker:

The availability heuristic.

Speaker:

The brain tends to prefer information that’s most readily available or comes to awareness rapidly.

Speaker:

If something simply comes to mind swiftly or is easily remembered, we tend to attach an importance to it that it might not really deserve.

Speaker:

This heuristic excludes supporting information that might be important to consider, along with countering details that might be used to argue against it.

Speaker:

For example, you might see the topic “tsunamis” trending on Twitter.

Speaker:

You follow a couple links to recent “news” reports that say tsunamis are expected to happen more often in the near future.

Speaker:

The reports are compelling.

Speaker:

You feel a little nervous.

Speaker:

You become worried that you’ll be a tsunami victim.

Speaker:

You start thinking that you haven’t prepared enough.

Speaker:

You get to the point where you think it’s inevitable that someday you’ll get swallowed up by a tsunami and there’s nothing you can do about it.

Speaker:

In all this concern, you temporarily forget that you live in Kansas, a landlocked state in the middle of the United States where tsunamis never happen.

Speaker:

Tsunamis require a large body of water and are best known for happening to small island nations.

Speaker:

That’s the availability heuristic in a nutshell: you got spooked by a bunch of instantly accessible information that made you forget that the chances of getting swept away by a tsunami in Kansas are virtually nil.

Speaker:

When you are asked about your fears, you answer “tsunamis” and ignore the rash of home burglaries that might be occurring, or that you are in danger of losing your job.

Speaker:

Just because something is available or notable does not mean it is important or representative.

Speaker:

Gambler’s fallacy.

Speaker:

This common cognitive bias magnifies the importance of past events in the prediction of future outcomes.

Speaker:

The bias dictates that conditions and previous results point to the inevitability of something happening down the road—when in reality, each subsequent event is independent of the previous.

Speaker:

This bias wants to create a cause-and-effect relationship where none exists.

Speaker:

For instance, just because a coin has flipped to the heads side one hundred times in a row doesn’t mean it’s more likely for the next flip to land on the tails side.

Speaker:

There is no relationship between each flip.
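
If you'd like to see that independence for yourself, here is a minimal Python simulation (our sketch, not something from the book; the function name prob_heads_after_run is just illustrative). It estimates the chance of heads on the flip that immediately follows a run of heads, and for a fair coin it comes out around 50 percent no matter how long the run was.

```python
import random

def prob_heads_after_run(run_len=5, n_flips=1_000_000, seed=42):
    """Empirical P(heads | the previous `run_len` flips were all heads)."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
    # Collect the flip that follows every run of `run_len` straight heads.
    next_flips = [flips[i] for i in range(run_len, n_flips)
                  if all(flips[i - run_len:i])]
    return sum(next_flips) / len(next_flips)

print(prob_heads_after_run())  # ~0.5 -- a streak of heads doesn't make tails "due"
```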

Speaker:

This particular cognitive bias is called “gambler’s fallacy” because it’s responsible for a lot of out-of-control gambling addictions.

Speaker:

Somebody betting on a football game may say a certain side will win because they’ve always done so before, or because they’ve lost so many times that they are due for a win: “The Packers are due a win this week after all the tough losses, and they are going to get it against the Lions!”

Speaker:

Never mind that this guy would be a terrible gambler if that’s the information he used to lay a bet; it illustrates the point.

Speaker:

The past history of the Packers-Lions rivalry doesn’t have anything to do with how well those teams have played in recent years.

Speaker:

The Packers’ losses in recent weeks don’t mean their turn for a win is coming.

Speaker:

Just because something happened doesn’t mean something else will happen.

Speaker:

But someone employing gambler’s fallacy would not pay those factors any mind.

Speaker:

Post-purchase rationalization.

Speaker:

This cognitive bias seeks to reduce regret, and it’s based on a fairly common consumer behavior.

Speaker:

Say you’re shopping for home theater equipment.

Speaker:

You go to a showroom and see a couple different models.

Speaker:

One set of speakers is extremely expensive, features a lot of bells and whistles, and takes up a lot of space.

Speaker:

The other’s a bit cheaper and smaller, but to the naked ear, doesn’t seem to be much different in terms of quality.

Speaker:

You might be persuaded to buy the bigger and more expensive one because, since it’s bigger and more expensive, it must work better.

Speaker:

But it puts a serious dent in your bank account and is too big for your living room.

Speaker:

And you might not even really be able to tell how well the sound’s working.

Speaker:

If you employed post-purchase rationalization, you’d convince yourself that you made the right decision, that it’s what you wanted to do all along.

Speaker:

You’d tell yourself that you can hear the difference in sound, and you do indeed need fifteen different plugs and ports.

Speaker:

You might know deep down inside that you went overboard, but that knowledge makes you uneasy.

Speaker:

Regret makes you feel stupid, and no one likes that.

Speaker:

So you talk yourself into believing you did the right thing and got exactly what you wanted.

Speaker:

No more regret, just eating boxed macaroni and cheese for dinner for the next two months because you spent so much on new speakers.

Speaker:

This type of post-anything justifying behavior extends far beyond purchases.

Speaker:

These are more commonly known as defense mechanisms, and they allow us to defend our egos from scrutiny and shame.

Speaker:

We do this sometimes when we defend ourselves from others, but here, instead of trying to convince someone else, we are trying to convince ourselves.

Speaker:

Sometimes, we are not even aware we are using them.

Speaker:

That’s the scary part.

Speaker:

Confirmation bias.

Speaker:

Confirmation bias is when we want to believe in a certain “fact” so badly that we only look for and agree with information that supports the belief—and ignore compelling evidence that disproves it.

Speaker:

Instead of “I’ll believe it when I see it,” confirmation bias dictates a standpoint of “I’ll see it when I believe it.”

Speaker:

Confirmation bias happens all the time.

Speaker:

Those of us with any type of political beliefs tend to consume news from sources that support our opinions and block out legitimate sources that offer countering views.

Speaker:

If you are looking to buttress a certain stance that you believe in, you will feel that each supporting source is legitimate and thorough, while each critical source is flawed or has an agenda.

Speaker:

There is only one view that you’ve pre-decided, and you’ll mold the evidence to fit it.

Speaker:

You see it when you believe it.

Speaker:

Even scientists who supposedly hold objectivity in high esteem might suppress or ignore data that challenges their findings just so they have a better chance of getting published by scientific journals.

Speaker:

In fact, this is why an experimental technique called the “double blind” was invented—so scientists didn’t know who they were subjecting to certain conditions, and therefore couldn’t act in confirmatory ways.

Speaker:

We practice confirmation bias because, quite simply, it hurts to be wrong.

Speaker:

Taking the emotional component out of our “truths” as much as possible and being open-minded about divergent views help us combat confirmation bias and think smarter.

Speaker:

Beating Cognitive Bias

Speaker:

So how do we work to defeat cognitive biases that show up in our thinking?

Speaker:

The list of four is helpful, but nowhere close to exhaustive.

Speaker:

The one thing you can start doing immediately, now that you know what cognitive biases are, is become aware of them in your thinking and interactions and note how they affect your sense of belief.

Speaker:

But still, that feels inadequate against some of these thought patterns that have been left unchecked our entire lives.

Speaker:

There are a few specific mental exercises that can help retrain your thinking to become clear-minded and measured.

Speaker:

Practice thinking of alternative explanations.

Speaker:

Instead of making a snap decision about why a certain thing is the way it is, try to think of multiple reasons or causes.

Speaker:

Reserve your judgment and stop jumping to conclusions.

Speaker:

You don’t need to find an answer immediately; emphasize the truth instead of speed.

Speaker:

For example, if you’re sitting in your favorite coffee shop and you notice a huge drop-off in business, you might think it’s because the quality of the coffee has declined.

Speaker:

But it could also be because more people are making their own espresso drinks, or because it’s summer and more people are doing other things outside.

Speaker:

Or perhaps it’s that the prices the store is charging are keeping people away.

Speaker:

(It’s usually that, actually.)

Speaker:

In a sense, this is like reverse storytelling.

Speaker:

You are starting with the conclusion, but you aren’t sure what’s happened.

Speaker:

Instead of jumping to conclusions, you would work backward and theorize what could have contributed to what you currently see.

Speaker:

You might try an exercise of taking a scene, a person, or any other thing, and observing five details or characteristics about it.

Speaker:

Then, for each of those details, write down five possible causes that may have led that particular detail to be the way it is.

Speaker:

Try to vary the potential causes you list, ranging from the plainly realistic to the downright bizarre.

Speaker:

This will train your ability to create a story around every detail, thus giving you twenty-five trains of thought instead of defaulting to the quickest and easiest for your brain to process.
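
If it helps to make the exercise concrete, here is one hypothetical way to lay it out, sketched in Python (ours, not the author's): five observed details, each mapped to five candidate causes, produce the twenty-five trains of thought described above.

```python
# Hypothetical worksheet: 5 observed details x 5 candidate causes = 25 stories.
observation = "my favorite coffee shop has far fewer customers"

details_to_causes = {
    "fewer cars in the lot": [
        "prices went up", "road construction nearby", "it's summer",
        "a rival shop opened", "the owner won the lottery and stopped caring",
    ],
    # ...four more observed details, each with five causes, realistic to bizarre...
}

trains_of_thought = [(detail, cause)
                     for detail, causes in details_to_causes.items()
                     for cause in causes]
print(len(trains_of_thought))  # 5 so far; 25 once all five details are filled in
```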

Speaker:

Most of us think only linearly in terms of cause and effect.

Speaker:

But that’s ineffective at best in understanding a situation.

Speaker:

Reword your statements as questions.

Speaker:

Think of something you consider a declarative, absolute truth. For example: “E-books and e-readers are killing literature.” That’s a pretty strong statement.

Speaker:

But try rephrasing it: “Are e-books and e-readers really killing literature?” The mere act of turning it into a question makes your brain start looking for answers: “Well, maybe e-readers are encouraging more people to read—that’s good.” “They may be changing how we read, but they’re not really killing how literature is made.

Speaker:

Maybe I’m just overly sentimental about physical books.” With just that one shift in your statement, you’ve opened up your mind to a new line of inquiry and exploration.

Speaker:

Get behind and challenge your assumptions.

Speaker:

Let’s say you have a very broad belief about poor people: “They’re poor because they don’t want to work.” Challenge that assumption immediately: “Do poor people just not want to work?

Speaker:

Or do they really have fewer opportunities?

Speaker:

They’ve been closing plants and stores in town for a few years now—maybe they don’t have anywhere else to go.

Speaker:

And it’s hard to get the proper training for a skilled position when you can’t afford it . . .

Speaker:

What if there is something else that causes poverty?

Speaker:

What if there are about fifty shades of gray to this matter?” The harsh truth is that whatever you think you know about a topic, especially if it involves people’s thoughts and motivations, you probably know only about 10 percent of what’s truly happening.

Speaker:

It’s always best to be proactive about challenging your assumptions through self-interrogation and especially through valid news and information sources—including people who have deep experience in the subject you’re thinking of.

Speaker:

It’s uncertain where many of our assumptions come from anyway, so it’s good to reevaluate them from time to time.

Speaker:

Remove pride, ego, and your need to be right.

Speaker:

The truth is a separate pursuit entirely from each of those things, and sometimes there is a stark contrast because you want to feel a certain way about yourself, especially in front of others.

Speaker:

Truth becomes a lot easier to discern when you take your emotional rewards (and punishments) out of the equation and simply try to determine what’s real.

Speaker:

If you face opposition, it’s just going to cause you to dig your heels in and deny, defend, and stonewall.

Speaker:

You’ll be seduced into caring more about dominating someone than understanding.

Speaker:

You’ll want to avoid that sour feeling of shame when conceding defeat to someone—anyone.

Speaker:

Even if you’re right, very few people make friends by saying “I told you so.”

Speaker:

Picture how a desperately stubborn person would act—is that similar to how you are acting?

Speaker:

Could anyone make an honest comparison between the two?

Speaker:

Hopefully not.

Speaker:

Even more so, explore being wrong and understand the feelings that outcome evokes.

Speaker:

Play out scenarios where you are indeed wrong.

Speaker:

What feelings will you experience?

Speaker:

There may be embarrassment, anger, humiliation, or shame—but do they affect the world or your life?

Speaker:

Only if you let them.

Speaker:

Logical Arguments

Speaker:

A final element of watching yourself from afar (not dissimilar to a voyeur) is recognizing the naked truth of what’s being said.

Speaker:

This is deceptively difficult, because just like with cognitive biases, by nature, illogical arguments go unnoticed.

Speaker:

It’s rare that we dissect statements just to understand their logical underpinnings, and that makes for habitually sloppy arguments and poor understanding.

Speaker:

Committing these errors is easier than you might think, and it’s equally frustrating to watch people make them on a daily basis.

Speaker:

There’s a funny, if somewhat cynical, piece of “advice” for people who are a little unsettled about speaking in public: “If you can’t dazzle them with brilliance, baffle them with B.S.” In this context, “B.S.” does not stand for “Bachelor of Science.”

Speaker:

We’ve all been in conversations at least once or twice in our lives in which we realize that the person we’re speaking with is making not one shred of sense.

Speaker:

They might be talking about a crazy theory they have or speaking from a viewpoint that has no basis in reality.

Speaker:

Someone may try to convince you that things are a certain way, but for whatever reason, their words don’t add up.

Speaker:

It causes a cognitive clog in your brain, and you might end up feeling disoriented or lost.

Speaker:

They probably think they’re making sense—they don’t think they’re trying to baffle you with B.S.

Speaker:

But on the other hand, maybe they are.

Speaker:

They might be trying to convolute your thinking with distorted logic and crazy talk.

Speaker:

Whatever the case, you can’t quite put your finger on what’s wrong, and thus you can’t form a rebuttal.

Speaker:

That’s annoying, to say the least.

Speaker:

Someone has gotten the last word on something you’re pretty sure is a bad argument, but you can’t figure out exactly why.

Speaker:

It’s important for you to know that, when these bits of nuttiness are hurled in your direction, it’s not your brain’s fault that you’re not getting them.

Speaker:

The problem isn’t with your comprehension or ability to think—in fact, it’s the opposite.

Speaker:

You’re dealing with someone who is defying the laws of logic, and while your ears are taking it all in, your brain’s not having any of it.

Speaker:

Rightly so.

Speaker:

For the most part, this happens by accident in normal, everyday conversations where people are well-intentioned.

Speaker:

People might be so eager to push a stance that they haven’t done their due diligence, or someone doesn’t pay attention to details and only wants simple sound bites.

Speaker:

We’ve all done it before.

Speaker:

We get caught up in making a firm point, get flustered if we’re not convincing enough, and end up making statements that don’t seem to make any sense, because they don’t.

Speaker:

We spitball on earlier statements in an attempt to salvage an argument, and hope they aren’t picked apart.

Speaker:

This is a situation where it’s beneficial to understand the basic nature of logical thinking and construction.

Speaker:

In the world we live in, this understanding is a crucial mental skill to develop.

Speaker:

It helps us ferret out the truth and process problems.

Speaker:

It imparts the ability to parse arguments and statements and know if they need to be questioned further.

Speaker:

In a way, it’s a mental model in itself—actually more of a mental supermodel.

Speaker:

Dissecting logical arguments sounds complicated—something you’d need an advanced technical or philosophy degree for—but the foundation of logical thinking is actually pretty easy to understand.

Speaker:

The concepts are straightforward.

Speaker:

They use sentence structure and equations to illustrate how certain ways of thinking are more effective than others.

Speaker:

Understanding them breaks down to assessing the different kinds of statements people make in explaining a concept or an argument.

Speaker:

As a quick example, a friend may be trying to remember their shoe size.

Speaker:

They say, “If I am wearing sandals, they will be size nine.” So far, so good.

Speaker:

And then they say, “Therefore, if I am wearing size nine, I am wearing sandals.” Hopefully an alarm has been set off in your brain.

Speaker:

That doesn’t logically add up, and you’re about to learn why.

Speaker:

Conditional statements X -> Y.

Speaker:

We’ll use a conditional statement as the core example for all these arguments: “If you feed my dog kibbles, then he’ll be friendly to you.” Just to make things easy to understand, let’s pretend in this discussion that this statement is always true.

Speaker:

This is called a conditional statement because it says, “If this condition is met, then this result will 100 percent happen.” The condition is your feeding your friend’s dog kibbles.

Speaker:

The result is that the dog will be friendly to you.

Speaker:

There is a straight cause-and-effect relationship between the condition and the result, and it only functions in one direction.

Speaker:

Once again, we’re pretending this will always be the case—every time you give this dog a kibble, he’s going to love you.

Speaker:

Using this as a fact, the statement is sound.

Speaker:

We also call the relationship between the condition and the result one of premise and conclusion—broader terms that can be used for other statements.

Speaker:

If a certain premise is true, then you can expect the conclusion or outcome to be true as well.

Speaker:

These types of statements generally don’t present as issues, unless someone is trying to pass off that the conclusion will always be true when it isn’t.

Speaker:

It’s when you start to play with it that problems arise.

Speaker:

Converse statements Y -> X.

Speaker:

Now, consider this statement: “If my dog’s friendly to you, it’s because you fed him kibbles.”

Speaker:

Well—that’s certainly a possibility, since we’ve determined that feeding the dog kibbles is a surefire way to win his friendliness.

Speaker:

But is it the only way to make the dog friendly?

Speaker:

Maybe you petted him.

Speaker:

Perhaps you spoke to him in a gentle, friendly tone of voice.

Speaker:

Maybe you played a game of fetch with him that made him extremely happy, and he returned his happiness with affection.

Speaker:

Maybe the dog is in a good mood.

Speaker:

Dogs do that.

Speaker:

This is an example of a converse statement: it reverses the conclusion and the premise, or the result and the condition—it is saying that the prerequisite is true if the end result is true.

Speaker:

And it’s turned the statement into a logical flaw.

Speaker:

It’s true that feeding the dog kibbles will make him your friend.

Speaker:

But there’s no indication that he’s friends with you strictly because you fed him kibbles.

Speaker:

There are other ways you can make a dog friendly to you.

Speaker:

You’ve just caught someone.

Speaker:

Remember, a statement only has cause and effect in one direction—from condition to result, and not the other way around.

Speaker:

A converse statement is the direct parent of something called the false syllogism—basically, a false premise.

Speaker:

Its fallacy is also exposed in making leaps of judgment based on misunderstood connections, like this:

Speaker:

• Dogs love kibbles.

Speaker:

• Monkeys love kibbles.

Speaker:

• Therefore, dogs are monkeys.

Speaker:

In this statement, the two premises might be true.

Speaker:

But the fact that both dogs and monkeys like kibbles doesn’t mean they’re the same thing.

Speaker:

The premise used for establishing the conclusion—mutual kibble love—is therefore false, as is the conclusion.
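
A tiny set-based sketch (our illustration, not from the book) makes the flaw easy to see: two groups can both sit inside "things that love kibbles" without being the same group.

```python
# Both premises hold, yet the conclusion "dogs are monkeys" does not follow.
dogs = {"beagle", "corgi"}
monkeys = {"capuchin", "macaque"}
kibble_lovers = dogs | monkeys | {"raccoon"}  # plenty of other kibble fans too

assert dogs <= kibble_lovers     # premise 1: dogs love kibbles
assert monkeys <= kibble_lovers  # premise 2: monkeys love kibbles
assert dogs != monkeys           # sharing a property is not being the same thing
```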

Speaker:

Inverse statements Not X -> Not Y.

Speaker:

Okay, let’s try this one on for size: “If you don’t feed my dog kibbles, he won’t be friendly to you.”

Speaker:

Really?

Speaker:

That’s the kind of dog you have?

Speaker:

If I don’t feed him kibbles—if I’ve run out or, you know, just don’t carry kibbles on me out of habit—then he’s going to turn on me?

Speaker:

What an ingrate.

Speaker:

This is an inverse statement.

Speaker:

It preserves the premise-conclusion relationship of the original statement but turns it into a negative: “If this doesn’t happen, then this won’t happen as a result.” It assumes a deeper relationship between the two than actually exists.

Speaker:

Cause and effect certainly doesn’t work that way: the lack of a cause doesn’t guarantee the lack of an effect.

Speaker:

Inverse statements are trickier, because not all of them are wrong.

Speaker:

Sometimes they’re right: “If you don’t brush your teeth, then they won’t be healthy.” Well, that’s true.

Speaker:

But it leaves out that there are other ways to make your teeth unhealthy.

Speaker:

Constantly eating food that’s bad for your teeth, for example (even if you do brush).

Speaker:

It could very well be that the dog rejects all who do not bring him kibbles.

Speaker:

I don’t know this particular dog’s neurosis when it comes to being fed kibbles at the appropriate time; I suppose it’s possible lack of kibbles turns him into a hostile, nervous wreck.

Speaker:

Still, the dog may be unfriendly for other reasons.

Speaker:

Maybe he just got back from chasing a car that he didn’t catch, so he’s a little disappointed.

Speaker:

Maybe he’s in a bad mood.

Speaker:

Maybe you’ve insulted him.

Speaker:

Maybe he was recently neutered.

Speaker:

There are plenty of things that can tick this dog off besides kibble deprivation.

Speaker:

So while certain inverse statements might be right, not all of them will be.

Speaker:

Be extra cautious and don’t take them at face value.

Speaker:

Many things will try to pass themselves off as true statements, but you can begin to see that most of them are logical flaws.

Speaker:

Contrapositive statements Not Y -> Not X.

Speaker:

These are statements that negate both the premise and the conclusion—both backward and forward.

Speaker:

If the original conditional statement is correct, then the contrapositive is also always true, unlike the converse or inverse statements.

Speaker:

This type of relationship does exist both ways, because it’s about a negative.

Speaker:

In our trusty dog food analogy, the contrapositive would be “If my dog is unfriendly to you, then you didn’t give him kibbles.” This is true.

Speaker:

There could be many reasons why the dog is being a jerk (see above).

Speaker:

But one thing’s for sure: if he’s unfriendly, then for sure you haven’t given him any of his cure-all kibbles.

Speaker:

If you did, the dog would be more agreeable.

Speaker:

But he isn’t, so you haven’t.

Speaker:

Remember, that part is a given, so if the result is not true, then the given is also not true.

Speaker:

Another quick example: if you go swimming, you will be wet.

Speaker:

What might the contrapositive statement sound like?

Speaker:

If you are not wet, you did not go swimming.

Speaker:

That certainly seems to make sense.

Speaker:

It can take a bit of effort to decipher these types of logical statements, but once you do, you’ll find that you can understand the truth of matters instantly.
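
If you'd like to verify all four forms at once, here is a short Python sketch (ours, not from the book) that enumerates the full truth table: the contrapositive always agrees with the original conditional, while the converse and inverse only agree with each other.

```python
from itertools import product

def implies(p, q):
    # "if p then q" is false only when p is true and q is false
    return (not p) or q

for X, Y in product([True, False], repeat=2):
    conditional = implies(X, Y)             # X -> Y
    converse = implies(Y, X)                # Y -> X
    inverse = implies(not X, not Y)         # not X -> not Y
    contrapositive = implies(not Y, not X)  # not Y -> not X
    assert conditional == contrapositive    # these two always match
    assert converse == inverse              # so do these two...
    print(X, Y, conditional, converse, inverse, contrapositive)

# ...but with X=False, Y=True the converse and inverse are False while the
# conditional is True, so neither is equivalent to the original statement.
```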


Speaker:

Takeaways:

Speaker:

• This chapter has a tall task—to get you to think about your thinking.

Speaker:

When we’re not engaging in what is known as metacognition, it’s easy to veer off the path of clear thought.

Speaker:

You must become aware of your thought patterns and where you tend to stray.

Speaker:

Watch yourself and try to evaluate what’s happening inside your brain.

Speaker:

• This process begins with System 1 and 2 thinking, as conceived by Daniel Kahneman.

Speaker:

System 1 thinking is quick, instinctual, and decisive—and also often incorrect.

Speaker:

System 2 thinking is measured, calm, and analytical—and far slower and more difficult.

Speaker:

Unfortunately, the brain operates on the principle of least resistance, so while System 1 thinking is first and foremost, we want to get into the habit of System 2 thinking on a consistent basis.

Speaker:

The easier and more familiar a task becomes, the more instinctual and quick it can be, so the way to clearer thinking is consistent repetition and practice.

Speaker:

• Cognitive biases are a similar concept, where we leap to conclusions because they appear to fit schema or heuristics we are familiar with, or are simply in line with our personal experiences.

Speaker:

These can also be effective, but often wrong.

Speaker:

Notable biases include: availability heuristic (I can remember it, so that means it is important), gambler’s fallacy (X happened, which means Y must happen), post-purchase rationalization (I made a good decision . . .), and confirmation bias (I only read what I want to read).

Speaker:

• How can you overcome cognitive biases, aside from simple awareness and metacognition?

Speaker:

There are four keys: alternative explanations and reverse storytelling, rewording statements and assumptions as questions, getting behind your implicit assumptions, and removing pride and ego from the equation.

Speaker:

• Finally, it’s important to understand logical arguments—and especially illogical arguments.

Speaker:

We hear these every day but may not be able to pick out their logical flaws.

Speaker:

You can think of these as a combination of math and argumentation that allows us to see reality versus what we merely see or hear.

Speaker:

There is the conditional statement (X -> Y, true), the converse statement (Y -> X, usually a flaw), the inverse statement (Not X -> Not Y, usually a flaw), and the contrapositive statement (Not Y -> Not X, true).

Speaker:

It’s not just word games; it’s understanding the foundations upon which real and false arguments are built.

Speaker:

Alright, that brings us to the end of another episode, and that was a tough one.

Speaker:

A lot of material in there; thanks for staying on to the very end.

Speaker:

You know, by understanding the dual processing nature of our brains, we can learn to recognize when our thoughts may be leading us astray due to those cognitive biases we all have.

Speaker:

And overcoming those biases is a challenging task, but with awareness and commitment, we can improve our decision-making abilities and navigate towards clearer thought processes.

Speaker:

And remember that practice makes perfect in all endeavors of life. By consistently and consciously engaging in System 2 thinking and analyzing the logical foundations upon which arguments are built, we can train ourselves, gradually, away from the common pitfalls of incorrect reasoning.

Speaker:

This not only makes us better thinkers, but also helps us become more adept at discerning truth amidst an ever-increasing sea of information that surrounds us daily.

Speaker:

We'll leave you with a quote by Benjamin Franklin to help inspire your day.

Speaker:

Tell me, and I forget.

Speaker:

Teach me, and I may remember.

Speaker:

Involve me, and I learn.


About the Podcast

Social Skills Coaching
Become More Likable, Productive, and Charismatic
While everyone wants to make themselves and their lives better, it has been hard to find specific, actionable steps to accomplish that. Until now...

Patrick King is a Social Interaction Specialist; in other words, a dating, online dating, image, communication, and social skills coach based in San Francisco, California. He’s also a #1 Amazon best-selling dating and relationships author with the most popular online dating book on the market and writes frequently on dating, love, sex, and relationships.

He focuses on using his emotional intelligence and understanding of human interaction to break down emotional barriers, instill confidence, and equip people with the tools they need for success. No pickup artistry and no gimmicks, simply a thorough mastery of human psychology delivered with a dose of real talk.

About your host


Russell Newton