The Joe Rogan Effect
The Illusion of Expertise in the Age of Overconfidence, Information Bubbles & Internet Idiocy
The Epidemic of Internet-Educated Idiots
We live in an age where expertise is optional, but confidence is at an all-time high.
Spend five minutes in any conversation about politics, health, psychology, or finance, and you’ll inevitably meet someone who is absolutely convinced they know what they’re talking about.
They’ve “done their research.”
They’ve read “a lot” about this.
They “heard on a podcast” that XYZ is true.
And they will die on this hill.
It doesn’t matter that the actual experts—people who have spent decades studying the field—are saying something different. These folks have read the books, watched the documentaries, scrolled the Reddit threads, and now they just know.
This is what I call The Joe Rogan Effect—the idea that if you read a few books, Google some articles, and listen to a couple of interviews with contrarian thinkers, you can suddenly think outside the box.
Except there’s one problem.
You can’t think outside the box if you never adequately learned what’s in the box.
The Box and Beyond: What Real Expertise Looks Like
A real expert does two things:
1️⃣ They know what’s inside the box—the foundational, textbook knowledge of their field. This is the stuff that is taught in universities, the widely accepted research, the consensus-built understanding that forms the basis of the field.
2️⃣ They know what’s outside the box—the emerging theories, cutting-edge ideas, and unconventional perspectives that challenge the status quo.
Here’s the key distinction: You can’t effectively challenge something you never understood in the first place.
Most self-proclaimed experts skip step one entirely. They don’t have the deep foundation of their subject—they just start absorbing random theories, contrarian viewpoints, and cherry-picked data that fit their worldview.
And yet, because they are smart and successful in other areas, they believe they can shortcut the expertise process. They mistake being well-informed for being knowledgeable.
This is why you see doctors acting like geopolitical analysts.
Fitness influencers trying to dissect macroeconomic policy.
Bitcoin bros reinventing the field of finance.
People who have read two books on trauma suddenly positioning themselves as healers.
It’s the same pattern: a surface-level understanding packaged with unearned confidence.
The Joe Rogan Effect in Action
I have close friends who have spent their entire careers in finance. They’ve studied markets for decades, worked in investment banking, and understand economic cycles inside and out.
I also have family members who bought some Bitcoin in 2017, read a few books, and now believe they have cracked the code of the financial system.
Guess which ones talk about finance with absolute certainty?
I know people who are actual geopolitical analysts—trained professionals who have spent years studying history, war strategy, and foreign policy.
I also know people who have traveled to Russia a few times, watched a couple of YouTube documentaries, and now think they understand international diplomacy.
Guess which ones are the loudest?
This is what happens when people confuse information with understanding.
The biggest symptom? The illusion of certainty.
The less someone knows, the more convinced they are that they are right.
The more someone actually knows, the more nuanced, cautious, and self-aware they become.
Or as Osho put it:
“The less a person knows, the more stubbornly they know it.”
How Real Experts Think (And Why Fake Experts Are So Sure of Themselves)
There’s a dangerous paradox of expertise:
When you first start learning about a subject, everything seems simple. You see patterns. You spot what seem like obvious flaws in the mainstream narrative. You think you’ve figured it out.
The deeper you go, the more complexity you uncover. You start to realize that there are no simple answers, that every issue has layers of nuance, and that no single viewpoint is entirely correct.
The more you know, the more you realize how little you actually know. This is the difference between real experts and fake ones.
This is why Nobel Prize-winning physicists often say things like:
"We don’t fully understand quantum mechanics."
And why some guy with a TikTok account will confidently say:
"Quantum mechanics proves that you can manifest money into your life by thinking about it!"
The Bias Bubble: Why “Doing Your Own Research” Usually Makes You More Wrong
The second big problem? Most people don’t actually do research. They just reinforce their existing beliefs.
Research isn’t about proving yourself right. It’s about trying to prove yourself wrong.
A true researcher spends time looking at opposing arguments. They search for the strongest counter-evidence to their viewpoint. They don’t just confirm their bias; they actively try to disprove it.
But that’s uncomfortable. That takes effort.
Most people do what’s easier:
They find sources that align with their existing worldview.
They ignore or discredit anything that contradicts it.
They mistake selective reinforcement for deep knowledge.
And in the internet age, this is worse than ever, because you can find “research” to back up literally any belief.
You think the earth is flat? There’s a documentary for that.
You believe COVID is a hoax? There’s a Reddit thread with thousands of “sources.”
You think drinking turpentine cures cancer? Someone on Instagram is selling a course about it.
The internet doesn’t just provide information. It provides confirmation—endless streams of it.
And once people find their bubble, they stop looking outside of it.
How to Not Be Another Internet-Educated Idiot
If you want to actually understand something—rather than just feel like you do—here’s what you need to do:
1️⃣ Learn the box first. Before you question a field, actually study it properly. Read the textbooks, learn the foundations, and understand what the mainstream consensus is before you dismiss it.
2️⃣ Seek out counterarguments. Don’t just read what aligns with your beliefs. Read the smartest people who disagree with you. Instead of trying to prove yourself right, look for evidence that you might be wrong.
3️⃣ Notice your own certainty. The more absolutely sure you are about a complex topic, the more likely it is that you don’t actually understand it. Confidence should be proportional to depth of understanding.
4️⃣ Admit what you don’t know. Real intelligence isn’t about having answers. It’s about knowing what you don’t know and being open to learning more.
Final Thought: The Loudest People Know the Least
Here’s a good rule of thumb:
The loudest, most self-righteous, most absolutely certain person in the room almost always knows the least.
The quiet one who listens more than they speak? Who admits when they’re uncertain? Who understands that reality is rarely black-and-white?
That’s probably the person you should actually listen to.
Unfortunately, we live in a time where social media amplifies certainty over wisdom, where the misinformed have the most confidence, and where nuance has been replaced by clickbait opinions.
So the next time someone says, “I did my research,” ask yourself:
Did they really?
Or did they just create a more sophisticated version of their own information bubble?