Skepticism is necessary to have an accurate worldview. You can't simply believe something because somebody told you. You have to doubt, and often. At the beginning of a philosophic journey, you must even doubt yourself. Are you perhaps an unreliable narrator? Can your own mind be trusted?

In the last several years, this perspective has crystallized in my mind, and it's developed into something like a life motto. It's a simple principle, and here it is: everybody is wrong about everything all the time. The more I interact with people, the more this principle is affirmed. There are exceptions, of course, but it's an incredibly reliable rule of thumb.

It is difficult and time-consuming to study something deeply. Philosophy is often tedious, but without deep knowledge of a topic, including a metaphysical and epistemological justification, I can't see how anybody can understand anything. They might know various facts from a textbook, but that doesn't mean they actually have a clue what they're talking about. See virtually any college undergraduate as an example.

So it makes sense to assume people are wrong. And because of the sinister nature of philosophy, they're probably wrong about everything. When foundational beliefs are inaccurate, all of the beliefs which follow are likely inaccurate. It's like the root of a tree rotting, or the pillars of a house crumbling. For example, if somebody believes that the government exists independent of individuals, their entire political theory will include errors throughout. Whether or not they view taxation as theft will determine a massive number of other beliefs. If they're wrong, their entire political worldview becomes poisoned, because their conception of taxation is a justification for a myriad of other beliefs.

So let me be clear: I'm in no way making the case for intellectual dismissal. I'm not saying to throw out people's ideas without evaluation. I'm really saying the opposite.
Evaluate ideas purely on their merit, without any connection to the person communicating them. When you don't trust people or give their ideas special treatment because of their expertise, you'll discover that nearly everybody's worldview is fuzzy and ill-justified.

To me, it appears that the majority of people believe things by happenstance, by chronology and geography. In other words, they believe the ideas they heard first, in school or from their parents and families. They end up believing what their neighbors believe, or what their broader culture teaches. Most people are entirely unaware of the silent presuppositions in their culture. They've never experienced a contrast to their own worldview. The unquestioned social norms of a man born in New York will be wildly different from those of one born in Tokyo. As these beliefs are never examined or rigorously challenged, we have no reason to believe they're accurate. It seems most sensible to simply assume they're wrong and unjustified unless proven otherwise.

But we must go one step deeper. The assumption of error should be paired with another friendly principle: the assumption of confusion. Not only are most people wrong, but they think they're right. They're confused. Rare is the man who is open-minded about what he doesn't know. Common is the man who will passionately defend his unjustified beliefs. Remember this when you listen to people argue, and things become crystal clear. It's the blind arguing with the blind about the color of the sky.

Now, I realize this sounds curmudgeonly, and that's because it is. But if you aren't concerned about social condemnation, you'll quickly realize the accuracy of this perspective. The same is true professionally. Most people seem to be fakers who are excellent at giving the illusion of productivity and competence. Though by comparison, there seem to be many more competent professionals than competent thinkers.
And again, don't get me wrong: I'm not saying that most people are stupid. I don't have any firm conclusions about the average person's capacity for accurate beliefs. I'm not judging their intelligence, but rather the accuracy of their worldviews and the independence of their thought. The problem is, appropriately enough, the unchallenged beliefs they hold.

For example, we're taught from childhood to respect authority, whether it's the teacher, the cop, the parent, etc. The same happens in adulthood, where the experts become unquestioned authority figures. If somebody has a PhD, well, by golly, of course they know what they're talking about. They couldn't have become a professor otherwise. These beliefs, when critically evaluated and not taken for granted, start looking very shaky.

Ever wonder why so many experts disagree on any given topic? Why professional economists claim such radically different things? It's because, necessarily, a large portion of them are wrong. And they are wrong because they don't know what they're talking about. I am convinced that your average PhD in economics does not understand the basics. Surely some do, but I think the majority do not. Book knowledge, the memorization of facts and the opinions of other thinkers, does not constitute understanding. Under pressure, your average PhD will start revealing the contradictions and leaps of faith in his worldview.

So how, then, can people who don't know what they're talking about become teachers and professors? The answer is simple: they know a little bit more information than average. They have a slight edge of knowledge, which gives the illusion of depth to people who cannot evaluate the ideas for themselves. It's like a race where you don't know by how much the winners won. It turns out, in the world of ideas, it's usually about a foot. The high school history teacher needs to understand the textbook just a hair more than the students.
He doesn't need a deep, abstract understanding of the subject matter, and the same is true in college and in the workplace. The difference in real knowledge between your typical authority figure and regular folks is much smaller than we've been taught, and in some cases it's razor thin or even non-existent.

Other times, it's not even that an expert is wrong; it's that he doesn't care about the truth. Paul Krugman, for example, influences a lot of economic thinking, and he is a political hack. Thomas Piketty is now a household name for his fraudulent book about economics. He is a liar, plain and simple. I have links in this article which demonstrate it. The man is disingenuous, with a political agenda, and he cooked the books. Yet somehow, he's still regarded as an expert, even an expert's expert, one of the most influential economists in existence.

If this sounds preposterous, don't take my word for it. Study a topic deeply, especially in the soft sciences, and find and evaluate all of the contrarian schools of thought you can. Then judge the mainstream consensus. Chances are, you'll start to see some large holes. To use a Wizard of Oz analogy: get up the courage and curiosity to peek behind the curtain, and you may be shocked at what you find.

Now, in my defense, I didn't always believe this way. It's only through conversation and experience that I started to doubt people's authority. I used to be a flag-waving patriot, for example, before I started questioning my beliefs about political authority. But now it seems clear as day: politicians are windbags. They're full of hot air, confusion, and intentional lies. Most people aren't as ill-intentioned as politicians, but if we're being honest, it seems reasonable to assume that everybody's worldview is equally inaccurate from the beginning, until proven otherwise.