Skepticism is necessary to have an accurate worldview. You can’t simply believe something because somebody told you. You have to doubt, often. At the beginning of a philosophical journey, you must even doubt yourself – are you perhaps an unreliable narrator? Can your own mind be trusted?
In the last several years, this perspective has become crystallized in my mind and developed into something like a life motto. It’s a simple principle:
Everybody is wrong about everything all the time.
The more I interact with people, the more this principle is affirmed. There are exceptions, of course, but it’s an incredibly reliable rule of thumb.
It’s difficult and time-consuming to study something deeply. Philosophy is often tedious. But without deep knowledge of a topic – including a metaphysical theory and epistemological justification – I just can’t see how anybody can understand anything. They might know various facts from a textbook, but that doesn’t mean they actually have any clue what they’re talking about (see virtually any college undergraduate as an example).
So, it makes sense to assume people are wrong. And because of the sinister nature of philosophy, they’re probably wrong about everything. When very foundational beliefs are inaccurate, all of the beliefs which follow are likely inaccurate. It’s like the root of a tree rotting, or the pillars of a house crumbling. For example, if somebody believes that “the government” exists independent of individuals, their entire political theory will include errors throughout. Whether or not they view taxation as theft will determine a massive amount of other beliefs – if they’re wrong, their entire political worldview becomes poisoned (as that belief is justification for a myriad of other beliefs).
Let me be clear: I am in no way making the case for intellectual dismissal. I’m not saying “throw their ideas out without evaluation.” I’m really saying the opposite – evaluate the ideas purely on their merit, without any connection to the person communicating them. When you don’t trust people or give their ideas special treatment because of their “expertise”, you’ll discover that nearly everybody’s worldview is fuzzy and ill-justified.
To me, it appears that the majority of people believe something by happenstance – by chronology and geography. They believe the ideas they heard first – in school or from their parents and families. They end up believing what their neighbors believe, or what their broader culture teaches. Most people are entirely unaware of the silent presuppositions in their culture – they’ve never experienced a contrast. The unquestioned social norms of a man born in New York will be wildly different from those of a man born in Tokyo. If these beliefs are never examined or rigorously challenged, we’ve no reason to believe they’re accurate. It seems most sensible to simply assume they are wrong and unjustified, unless proven otherwise.
We must go one step deeper. The assumption of error should also be paired with another friendly principle: the assumption of confusion. Not only are most people wrong, but they think they’re right. They are confused. Rare is the man who is open-minded about what he doesn’t know. Common is the man who will passionately defend his unjustified beliefs. Remember this when you listen to people argue, and things become crystal clear. It’s the blind arguing with the blind about the color of the sky.
I realize this sounds curmudgeonly; that’s because it is. But if you aren’t concerned about social condemnation, then you’ll quickly realize the accuracy of this perspective. The same is true professionally: most people seem to be fakers who are excellent at giving the illusion of productivity and competence. Though, by comparison, there seem to be many more competent professionals than competent thinkers.
Don’t get me wrong: I am not saying that most people are stupid. I don’t have any firm conclusions about the average person’s capacity for accurate beliefs. I’m not judging intelligence, but rather the accuracy of people’s worldviews and the independence of their thought. If I had to guess, I’d say most people are perfectly capable of critical thinking. The problem, appropriately enough, is the unchallenged beliefs they hold.
For example, we’re taught from childhood to respect authority, whether it’s the teacher, the cop, the parent, etc. The same happens in adulthood, where “the experts” become unquestioned authority figures. If somebody has a PhD, well, by golly, of course they know what they are talking about! They couldn’t have become a professor otherwise!
These beliefs, when critically evaluated, start looking shaky. Ever wonder why so many “experts” disagree on any given topic? Why professional economists claim such radically different things? It’s because, necessarily, a large share of them are wrong – and they are wrong because they don’t know what they are talking about. I am convinced that your average PhD in economics doesn’t understand the basics. Surely some do, but I think the majority do not. Book knowledge – the understanding of “facts” and the opinions of other thinkers – does not constitute understanding. Under pressure, I think your average PhD will start revealing the contradictions and leaps of faith in his worldview.
How, then, can people who don’t know what they are talking about become teachers and professors? The answer is simple: they know a little bit more information than average. They have a slight edge of knowledge, which gives the illusion of depth to people who cannot evaluate the ideas themselves. It’s like a race where you don’t know by how much the winners won – turns out, in the world of ideas, it’s usually only by a foot. The high school history teacher needs to understand the textbook just a hair better than the students – he doesn’t need a deep, abstract understanding of his subject matter. The same is true in college or in the workplace. The difference in real knowledge between your typical authority figure and regular folks is much smaller than we’ve been taught, and in some cases, it’s razor-thin.
Other times, it’s not even that an expert is wrong – it’s that they don’t care about the truth. Paul Krugman influences a lot of economic thinking, and he is a political hack. Thomas Piketty is now a household name for his fraudulent book about economics; he’s a liar, plain and simple. The man is disingenuous, with a political agenda, and he cooked the books. Yet, somehow, he is still regarded as an expert.
But don’t take my word for it. Study a topic deeply (especially in the soft sciences) – find and evaluate all the contrarian “heterodox” schools of thought you can – and then judge the “mainstream” consensus. Chances are, you’ll start to see some large holes. To use a Wizard of Oz analogy, get up the courage and curiosity to peek behind the curtain. You may be shocked by what you find.
In my defense, I didn’t always think this way. It’s only through conversation and experience that I started to doubt people’s authority. I used to be a flag-waving patriot before I started questioning my beliefs about political authority. Now, it seems clear as day: politicians are windbags, full of hot air, confusion, and lies.
Most people aren’t as ill-intentioned as politicians. But, if we’re being honest, it seems reasonable to assume everybody’s worldview is equally inaccurate from the beginning, until proven otherwise.