How do you know you’re “right”?
In her TED Talk, Kathryn Schulz asks “What does it feel like to be wrong?” Take a moment right now to answer that for yourself.
It’s worth doing for what you’ll learn.
Most people interpret the question as “How did you feel when you realized you were wrong?” They answer something like “I felt embarrassed,” or “I relished learning something new.” Schulz points out that this is the answer to a different question: “How do you feel when you discover you are wrong?”
Being wrong is what we are before we discover we are wrong, when what we feel is… like we are right!
Being wrong is always preceded by feeling we are right. And we’re all often wrong. So the feeling of being right is weak evidence we really are.
We’re all aware that we have cognitive biases that influence how we think, thanks to the work of Amos Tversky and Daniel Kahneman and of public intellectuals and researchers like Dan Ariely and Richard Thaler (whose Misbehaving is a romp through the genesis of Behavioral Economics).
How do we avoid being misled by these biases? How can we become better thinkers?
Knowledge and Belief
Tim Urban, who writes the hilarious and insightful Wait But Why, illustrates where we go when we stray from a balance of conviction and knowledge.
He observes that:
One of the biggest impediments to this [balance] is when you start to identify with certain beliefs, stances, or ideologies. Once that happens, your Primitive Mind enters the equation and will do whatever possible to keep you from changing your mind, which cripples your ability to learn.
He is inspired in part by Paul Graham’s essay, which calls out two situations in which we should question our beliefs especially diligently:
When our belief is tied to our identity. We are likely to believe arguments that tie into related belief systems or our tribal affiliation.
When it’s about a subject on which everyone feels entitled to an opinion, say politics, religion, or design, as opposed to something specialized and arcane, like chemistry. Here our opinion may well be based on very little information, and we become prey to the Dunning-Kruger effect, fooling ourselves into thinking a subject we know little about is far simpler than it really is.
Methods
How can we improve our chances of not being duped by our own biases? Among the heuristics we can try:
Start with the realization that belief is not a good indicator of correctness.
Be willing to question our beliefs.
Don’t discount counter-arguments out of hand; listen to them carefully, especially when those positions make us uncomfortable or angry.
Cultivate an evidence-based, scientific mindset. But keep in mind that scientists are prone to their own blindspots, such as confirmation bias.
Strive for an “iconoclastic/skeptical/inquisitive” attitude. Asking “Why?” a lot helps cut through our own preconceptions.
Check our gut for how our belief makes us feel and how we’d feel if the opposite is true. If our stance makes us happy and the opposite unnerves us, we need to pay close attention as we’ve got skin in the game.
If what we believe is not beneficial to us, we are more likely to be objective. As Upton Sinclair said:
It is difficult to get a man to understand something when his salary depends upon his not understanding it.
Celebrate changing our minds. If we consider it a triumph and a point of pride to learn that something is different from what we’d previously concluded, we will be on the lookout for further opportunities to evolve and perfect our beliefs.
Have strong opinions, loosely held, a framing due to Paul Saffo.
Beyond “Correct”: Complexity, Leadership, and Subjectivity
Often, though, being “right” isn’t the right goal.
In many real-world situations there is no right answer; there are just different paths, too complex to fully evaluate. Jaron Lanier made the point that while we all know that getting an acceptable stream of water in the shower requires a subtle blending of the hot and cold taps, we somehow believe that in really complex realms like policy and politics the best solutions come from adopting a single, simple precept, like no government or big government, turning just one tap to full tilt. Most problems are more nuanced than we give them credit for when we follow ideology or bias.
Moreover, in many situations inaction and indecision are worse than choosing an imperfect path. This is especially true when leading. Deploying people in a concerted and organized way requires clear directions, and those directions must be chosen even under uncertainty and time pressure. As Voltaire put it, “don’t let the perfect be the enemy of the good.”
Lastly, subjectivity matters more than objectivity in many life situations. We are individuals, and the ways in which we experience the world are based on our subjective values and affections. Our kids, our friends, our values are all valid and unavoidable priorities and starting points for our decisions. Nevertheless, awareness that one is making decisions for oneself, based on selfish and subjective considerations, can help us respect others’ decisions and not impose ours on them.
Some Fresh Perspectives
There’s obviously a near-infinite amount written on this subject, including nearly everything in psychology and philosophy. I happen to be enjoying the following. Maybe you will as well.
Clearerthinking.org publishes articles, podcasts and tools to help “close the gap between insights from research about human behavior and actions in the real world.”
Zeynep Tufekci, in her fascinating Substack on real-world issues, also discusses practical applications of her ideas on theories of knowledge and what she calls metaepistemology.
Matt Yglesias writes about political science and economics with an admirably iconoclastic and evidence-centric pen.
Paul Graham, co-founder of Y Combinator, is an insightful essayist on technology, innovation, and company creation.
Tim Urban’s Wait But Why is quirky and entertaining, and has spot-on answers to life’s most complicated questions, such as How to Pick Your Life Partner and What Makes You You.
Marc Meyer is a long-time Silicon Valley technologist, founder (6 startups, 4 exits, 1 IPO), executive, investor, advisor, teacher and coach. He has invested in and advised over 100 companies, chairs the Advisor Council at Berkeley SkyDeck, is on the Selection Committee of HBS Alumni Angels, advises at multiple accelerators, and has an Executive Coaching practice helping leaders achieve their greatest potential.