I’ve written a bit about confirmation bias before, but this will be a deeper dive. You probably inherently know what “confirmation bias” means, but let’s start with a quick definition. Per a pretty detailed Wikipedia entry, it’s the tendency to “search for, interpret, favor, and recall information in a way that confirms one’s pre-existing beliefs, while giving considerably less consideration to alternative possibilities.”
Right. Phrased another way: life for most of us right now.
Confirmation bias has always been one of the most powerful of the cognitive biases. I’ve even seen claims that once a person has formed a belief about you, it takes 144 exposures to contrary evidence before they’ll change it. Think on that for a second. If your first week at a new job is crappy, you maybe gotta spend 1–2 full years not being crappy before your boss is like “You know what? Nicky is pretty good after all…” So one or two bad days can cost you 10–12 weeks of being put in a box. That’s nuts. It’s also how the human brain processes information.
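If you want to sanity-check that “1–2 full years” figure, here’s a quick back-of-envelope sketch in Python. The exposure rate is my assumption, not anything from the research: say a boss registers roughly 2–3 meaningful impressions of you per week.

```python
# Back-of-envelope math for the "144 exposures" claim above.
# Assumption (mine, not the research's): a boss gets roughly
# 2-3 meaningful impressions of you per week.

EXPOSURES_NEEDED = 144  # exposures to contrary evidence, per the claim

def weeks_to_change_a_mind(exposures_per_week):
    """How many weeks of counter-evidence before the belief flips."""
    return EXPOSURES_NEEDED / exposures_per_week

for rate in (2, 3):
    weeks = weeks_to_change_a_mind(rate)
    print(f"{rate} exposures/week -> {weeks:.0f} weeks (~{weeks / 52:.1f} years)")
```

At 2–3 exposures a week, 144 exposures lands at roughly 48–72 weeks, which is right in that 1–1.5 year window, so the rough math holds up.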
How did we get here, and can we change it? Let’s explore.
Confirmation bias in a new New Yorker article
Read this — “Why Facts Don’t Change Our Minds” — last night. Elizabeth Kolbert is killing targets over there. She previously wrote a really interesting thing on the rise of automation.
There are a million and five interesting things in this article right here, but here’s one pull quote:
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats — the human equivalent of the cat around the corner — it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”
Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
Ruh roh. Stuff just got real.
“The positions we’re blind about are our own”
This is true in almost every possible context: political leanings, for sure. Sex? Yes. How we think about and relate to work? Unquestionably. Relationships and past issues? 837 percent in the affirmative. Many people almost completely lack any form of self-awareness, and I’d argue a few different things make it worse:
Social Media: Very trite to say this, admittedly. But social media is used by a lot of first-world peeps, and it does sort us into specific bubbles based on algorithms. It’s hard to encounter new positions, or develop empathy, when those platforms essentially morph into your news and communication source.
Work: We all spend a bunch of time at work. (The hard ceiling should be 55 hours/week, but because we deify the workaholic and seek self-worth and relevance through work, we often spend way more than that on it.) Here’s the problem with spending a lot of time on work, aside from health concerns: work actually makes us stupider. Most jobs are a bunch of digital paper-pushing re-animated as hair-on-fire urgency projects that you’re eventually evaluated against as “KPIs.” Some dudes write books now calling work “chimp rape,” and while maybe that’s a little far, not much thinking is happening in most offices. It’s rush rush rush project project project.
How exactly are we going to overcome confirmation bias — not be blind to our own positions — if we spend most of our day in dumbed-down situations clicking “like” on stuff we already believed anyway?
Confirmation bias and the question of the modern age
I’d argue the currency of the modern age is “how to convince someone” — of anything, really. Of course, if you’ve seen any of the 17-car pile-ups on Facebook in the last few years, you know that’s pretty hard. People stick to their poles, embrace their confirmation bias, and yelp about what they know and believe. Facts don’t matter. It’s the age of opinions now, or maybe Post-Truth.
One of the issues is this: there is a lot of nervousness and confusion about the world right now. Politically, economically, salary-wise, job force-wise, etc. Confirmation bias, or the herding of peeps into like-minded pens, kind of seems like a massive Freudian rationalization so that we don’t have to have real conversations about issues. This is logical: we are nervous people right now. But growth typically comes from friction/challenge/discomfort, not from “Everything is how I like it and want to see it.” We’re obsessed with growth at a business level, but maybe we’re whiffing on that target on a personal/community level.
Can we get better about confirmation bias?
Probably not. It’s very deep-seated.
The most obvious advice would be “get comfortable with ideas opposite your own” and/or “spend less time with people like yourself.” Unfortunately, that’s really hard for people in their day-to-day life. When a company talks about “culture,” it’s usually these big, lofty words. That’s all bullshit. What the decision-makers mean by “culture” is usually “like-minded people working on stuff they want to work on and getting perks/compensation for it.” Like-minded people running a place breeds homophily, and that seeps into hiring. You may hate some of your co-workers, but all of you are more alike than you really want to admit. In fact, that’s one reason collaboration is often so challenging.
You could start with a micro-level experiment: try to get outside your comfort zone and preconceived notions once per day. If you stuck to that, you’d hit 365 confirmation-bias-reducing moments in a given year. That’s way better than most people manage. And hey — developing additional self-awareness and moving away from confirmation-bias blindness could actually make you richer. So that’s a plus.
Obviously this is a very nuanced topic — entire sections of advanced textbooks are dedicated to it — but is there anything else you’d add on confirmation bias?