A few years ago, I found myself in a debate — not an angry, performative one, but the old-fashioned kind. A friend and I were discussing free speech, and though we approached it from different angles, we were both respectful, curious, even amused by our disagreements.
But something strange happened. Halfway through, I felt a tug at the edge of my mind — not a new argument or flash of insight, but a small, embarrassing thought: What if I'm wrong? For a second, I hesitated. I felt a familiar temptation to double down, to find firmer ground. But instead, I said the thing you’re never supposed to say in debates:
"I'm not sure."
He paused, smiled, and said: "Neither am I."
It was one of the best conversations I’ve had.
We live in a culture increasingly allergic to that kind of honesty. Social media rewards certainty. Institutions reward orthodoxy. The public square is filled with slogans, not caveats. In that context, epistemic humility — the idea that we should remain aware of the limits of our knowledge — can feel like weakness. Yet it may be our last intellectual defence against dogma.
Bertrand Russell is often credited with the line: “The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.” The wording is a popular paraphrase (Russell’s own version was that “the stupid are cocksure while the intelligent are full of doubt”), and it gets quoted often, but I suspect we don’t quite grasp the depth of its warning. Russell isn’t simply urging us to be modest. He’s describing a dynamic — a tragic asymmetry — in how certainty operates in public life.
The more nuanced you are, the more doubts you hold, the less likely you are to shout. And in an age of shouting, that can be fatal.
We saw this dynamic in full force during the pandemic. One could watch, in real time, as uncertainty hardened into orthodoxy. Experts who expressed caution were cast out. Dissenting scientists were labelled dangerous. And citizens, understandably frightened, found comfort in moral certainty — even when that certainty demanded silencing others, closing schools, or isolating the elderly.
This isn’t to relitigate Covid policy. It’s to point out that in moments of collective fear, epistemic humility becomes especially difficult — and especially important. When the stakes are high, the incentive to bluff confidence — to pretend we know, to act as if we’ve got it all figured out — can override the much harder task of living with ambiguity.
And that’s not just a pandemic problem. Increasingly, politics, media, and online culture demand the same bluff. The appearance of certainty has become a kind of status symbol — a social credit system where doubt is penalised. Express hesitation about a policy or protest, and you’re branded an enemy. Ask a question at the wrong time, and you’re lumped in with extremists. Even universities, once proud bastions of critical thought, now hesitate to platform difficult conversations for fear of reputational harm.
But knowledge doesn't work this way. Nor does wisdom. Philosophy, science, literature — these are traditions built not on the assertion of truth, but on the careful admission of ignorance. Socrates didn’t claim to know more than his peers — he claimed to know less, and thereby understood more.
Epistemic humility doesn’t mean we never act. It doesn’t mean paralysis. It means remembering that convictions must remain open to revision — not because we’re unsure of everything, but because our own minds are fallible. Because the cost of certainty, when wrong, can be far higher than the discomfort of doubt.
And that’s the paradox. In a world increasingly obsessed with being right, it is often the person willing to say “I don’t know” who should be trusted most.
It may not win arguments on social media. But it might just save public discourse.
Further reading
Intellectual Humility by Michael P. Lynch et al.
A multidisciplinary exploration of what it means to think with humility — from philosophy to cognitive science.
On Bullshit by Harry G. Frankfurt
A brief but razor-sharp meditation on why indifference to truth is more dangerous than lying.
The Righteous Mind by Jonathan Haidt
Essential reading on moral psychology and why intelligent people can still fall into tribal thinking.
Being Wrong by Kathryn Schulz
A beautifully written look at why we hate being wrong — and what it reveals about us.
Mistakes Were Made (But Not by Me) by Carol Tavris & Elliot Aronson
An insightful and often funny book about cognitive dissonance, self-justification, and the human resistance to admitting fault.