Why I like people with opinions
Or: Are more opinions and stronger selection pressure strictly superior?
[Epistemic status: have thought about this regularly; reading Sean Carroll’s “Something Deeply Hidden” was just the final drop in the bucket - wrote this up in half an hour, generally decently confident this is pointing at something important]
Sean Carroll explains things well because he’s got a clear idea of how they work. He stands in for his current best guess of what physics looks like. That eases the explanation because, in his mind, there’s no confusion. He’s bought into it.
Luckily, humans seem to have this skill of metacognition, where you can fully buy into something and be aware of the fact that you might be wrong, too. This metacognition means Sean will update his beliefs if presented with the right kind of evidence.
Now, why does this seem so much better? If we look at the evolution of ideas, brains are just vehicles. They carry ideas around, and the ideas clash whenever brains encounter one another. The strongest idea wins. What counts as strength locally is a social matter.
I want to live in the social reality where ideas are judged for their value in improving the world. Only secondarily do I want to judge them based on their truth value. I think the two differ only marginally in importance, but the ordering does imply something fundamentally different about how we interact.
Why do I think we should judge ideas coming from single brains differently from what is assumed fact, or common knowledge? I think in domains where consensus doesn’t exist, where there is still room for the evolution of ideas, the process of seeking truth looks different than in domains where questions are a matter of fact. In most cases, truth is obscure and we can only ever edge closer to it, asymptotically.
Single agents have a hard time letting beliefs clash because they need to maintain at least an illusion of coherence to themselves to operate in a fit manner. So what single agents do well is steelman ideas: given their knowledge of the world and processing capacity, they can produce the strongest possible version of an idea.
Having the strongest possible version of an idea usually results from bulletproofing it - one can do so internally to some extent. But bulletproofing your idea is usually far more efficient when you interface with a more independent belief structure. Information bottlenecks will force you to find ever better formulations of your idea, making you ever better at communicating clearly. Ideally, we even arrive at the point of formalising our ideas to enable testing.
Here’s where having opinions comes in handy. Having opinions forces you to enter the arena of ideas if you want to stand up for who you are. That, in turn, forces you to clarify your ideas so that they attract any attention at all; otherwise, you’ll just vanish into nothingness. Given a social context in which beliefs are judged for their utility and truth value, if your idea appears superior on the battleground, our well of knowledge improves in quality.
Everybody wins, even if you lose. Having opinions allows others to grapple with them. Suspending judgement and waiting for insight to be bestowed upon you doesn’t generate data, and your ideas will become stale. You might encounter great insights occasionally and feel good about it. But in comparison to those who go into the battleground, you will stagnate.
Musings regarding evolution: assuming strong selection pressures increase robustness and fitness, is it always better to have more ideas? Assuming that self-selection effects sort battlegrounds into different levels of quality, you just need to make an effort to find your battleground, always pushing the envelope a little bit, but not too much. If you get demolished without seeing what happened, that’s of no use to you. If you win without having gained an additional insight, that’s of no use to you either.
Find your battleground. Let evolution do its thing, even if it feels like it’s taking you apart. Long-term, it’s for the best. Probably for you too, if you have sufficient resources to change social circles whenever you level up too far; that way, too many bad ideas are unlikely to reach you and bother you too much. Embrace your bubble but stay curious - if everybody keeps a toe on the outside and does some epistemic spot checking, that’s likely to result in a healthy battleground.
Don’t be too open-minded. Protect your battleground. Cherish it, and the gladiators. Learn how to be a vessel for ideas, how to be untouched by the violence when they break apart. Learn to seek the truth, but understand that you’re not going to do it by yourself. That’s why you will often have to accept “practically useful” as a proxy for truth. It’ll eventually get you closer to it if your social incentives are aligned appropriately. Also, choose your environment carefully; you won’t always end up in fair fights.