Confirmation Bias

If confirmation bias had an image, it would be something like this:

How many black dots do you see in there? If you’re like most people, chances are you see 1, maybe 2. Until you move your eyes. And then you see another 1, maybe 2. And the first ones you saw are gone.

There are 12 black dots in that image. You can probably only see a fraction of them at a time. And yet your mind convinces you that it can see the whole image perfectly… so much so that it helpfully fills in the grey intersections where the other dots are, leaving them empty. Don’t mind them. Nothing to see there.

Our minds are so good at pattern recognition that they will ignore data about reality to complete the patterns they perceive. And if you’re one of the rare people who can see all 12 at once, don’t worry: there are plenty of other things that will fool you instead.

If your brain can trick you into seeing only one or two of the dots in this image at a time, rest assured it can trick you into thinking you know more than you actually do about data that isn’t even all in front of you at once, let alone about all the data in the world on that topic.

That’s the trickiest half of confirmation bias. Not just focusing on data that confirms what you believe, but ignoring evidence that goes against what you believe, so thoroughly that something in your mind filters it from your senses or memory before it even reaches “you.”

There’s nothing wrong with having a belief without having all the data about it. We’re imperfect beings, and can’t go through life having 0% confidence in things just because of unknown unknowns.

But believing something isn’t the same as being 100% or even 90% confident that it’s true. You can believe in something and acknowledge that you’re only 70% confident in it, or 53.8% confident, as long as you know of things that would increase or decrease your confidence if brought to your attention.
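If it helps to see what that looks like with numbers, here’s a rough sketch using Bayes’ rule. The essay doesn’t prescribe this, and the likelihood values below are invented for illustration, but it shows how a 70% confidence is supposed to move when evidence points one way or the other.

```python
# A minimal sketch (not from the essay) of holding a belief at 70% confidence
# and letting evidence move that number. The likelihoods are made up.

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the posterior confidence in a belief after one piece of evidence (Bayes' rule)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

confidence = 0.70                            # "I believe this, but I'm only 70% confident."
confidence = update(confidence, 0.8, 0.4)    # supporting evidence: rises to ~0.82
confidence = update(confidence, 0.1, 0.6)    # contrary evidence: falls to ~0.44
print(f"{confidence:.2f}")
```

The point isn’t the arithmetic; it’s that knowing in advance which observations would push the number up or down is what separates a calibrated belief from a certainty you merely feel.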

Unfortunately, most people don’t think that way. They don’t talk about their beliefs in probabilities. Even if they acknowledge that they “might be wrong,” they’re confident that what they think is true. And research has shown people to be overconfident in their beliefs again and again and again.

Thanks to empiricism and reason, we do have good reasons to believe certain scientific and philosophical ideas. They’ve been rigorously tested and used to make correct predictions about the world. They’ve been used to change things in our external, shared reality. It’s okay to be confident in things like “I exist” and “things in the world can be measured.”

But things like political beliefs? Beliefs about people you’ve never met? Beliefs about systems you’ve never studied?

Lower your confidence in all of those. All of them.

When you’ve reached the point where you know what you value (Equality, Justice, Health, etc.) but aren’t quite sure which actions will actually achieve the best balance of those values in the real world, you’re a bit closer to understanding what you think you know and why you think you know it.

If you’ve reached solipsism, dial it back. That way lies madness, and insufferable, pointless arguments.