My uncle is intelligent, educated, and successful. He runs a business, reads extensively, and can hold sophisticated conversations about history, economics, and technology.
He also believes:
- COVID-19 was created in a lab as a bioweapon
- The 2020 election was stolen
- A global elite controls world events through secret organizations
- Vaccines contain tracking microchips
- Climate change is a hoax to implement global government
When I try to discuss evidence, he responds with:
- “That’s what they want you to think.”
- “Follow the money.”
- “The mainstream media is controlled.”
- “Do your own research.”
How does a smart, rational person come to believe things that seem obviously false to me?
More importantly: How can I be sure I’m not doing the same thing in areas where I’m wrong?
This is the question that haunts me about conspiracy theories. Not “Why are they stupid?” but “What psychological mechanisms make conspiracy thinking so seductive, and how can I avoid them myself?”
Because here’s the uncomfortable truth: The same cognitive processes that lead to conspiracy theories are universal human traits.
Understanding conspiracy psychology isn’t just about understanding “those people.” It’s about understanding how all of us think, and how we can all be wrong with total confidence.
What Is a Conspiracy Theory? (And What Isn’t)
Let’s start by defining terms.
Actual Conspiracies vs. Conspiracy Theories
Real conspiracies exist:
- Watergate (government officials broke the law and covered it up)
- Tuskegee syphilis experiments (U.S. government lied to Black men and withheld treatment)
- COINTELPRO (FBI infiltrated and disrupted political groups)
- Tobacco industry’s lies about cancer
- Big Pharma hiding negative drug trial data
These share features:
- Limited number of participants (conspiracies are hard to maintain with many people)
- Documentary evidence emerged
- Whistleblowers came forward
- Investigations revealed the truth
- They were eventually proven real
Conspiracy theories are different:
- Involve massive numbers of alleged conspirators (thousands or millions)
- Lack credible evidence
- Rely on “absence of evidence is evidence” logic
- Resist falsification (every counter-evidence is part of the conspiracy)
- Grow more elaborate over time
The Characteristics of Conspiracy Theories
Psychologist Rob Brotherton identifies key features:
1. Pattern-seeking in randomness
- “There are no coincidences”
- Seeing connections where none exist
2. Agency detection
- Attributing events to intentional actors rather than chance or complex systems
- “Someone must be responsible”
3. Proportionality bias
- Big events must have big causes
- “A virus couldn’t cause a pandemic; it must be engineered”
4. Immune to evidence
- Evidence against the theory is dismissed as “part of the cover-up”
- Unfalsifiable
5. Escalating commitment
- Once invested, people add layers to explain contradictions
- The theory grows more complex, not simpler
Understanding these features helps us see: conspiracy thinking isn’t a simple error. It’s a systematic pattern of reasoning that feels compelling but leads to false conclusions.
The Psychological Needs Conspiracy Theories Fulfill
People don’t believe conspiracy theories because they’re stupid. They believe because conspiracy theories meet deep psychological needs.
Need #1: Understanding and Certainty
The world is complex, chaotic, and often senseless.
- Why did a pandemic happen? (Complex evolutionary biology, ecology, globalization)
- Why is the economy struggling? (Hundreds of interacting factors)
- Why do bad things happen to good people? (Random chance, systems, luck)
The honest answer to many questions is: “It’s complicated. Multiple factors. Uncertainty.”
This is psychologically unsatisfying.
Conspiracy theories offer:
- Simple explanations (“It was planned”)
- Clear villains (“The elites did it”)
- Illusion of understanding (complex world reduced to intentional actors)
Research shows:
People gravitate toward conspiracy theories when feeling a lack of control or understanding.
After major events (9/11, COVID, mass shootings), conspiracy theories spike.
Why? Because randomness and complexity are terrifying. A malevolent plan is actually more comforting than chaos.
At least with a plan, there’s an explanation. With chaos, there’s just… chaos.
Need #2: Feeling Special and In-the-Know
Conspiracy theories make you a hero in your own narrative.
You’re not just a regular person. You’re:
- Someone who “sees through the lies”
- Part of an enlightened minority
- A truth-seeker in a world of sheep
- Brave enough to question authority
This is deeply appealing.
Research on conspiracy belief shows:
People who feel powerless, marginalized, or unrecognized are more likely to believe conspiracies.
Why?
Conspiracy belief restores a sense of importance:
- “I know something most people don’t”
- “I’m smarter than the masses who believe the official story”
- “I have special insight”
This is the “special knowledge” appeal.
It’s the same appeal as:
- Secret societies
- Insider trading
- Exclusive clubs
- Spiritual enlightenment
Knowing secrets makes you feel significant.
Need #3: Protecting Identity and Worldview
Conspiracy theories protect your sense of who you are and how the world works.
Example:
If you believe:
- “Hard work leads to success”
- “Meritocracy is real”
- “The system is fair”
Then data showing:
- Systemic inequality
- Inherited privilege
- Randomness in outcomes
This threatens your worldview.
Conspiracy theory solution:
“The system would be fair, but a secret cabal has corrupted it. The problem isn’t the system I believe in; it’s a conspiracy against it.”
This preserves your worldview while acknowledging problems.
Another example:
If your identity is built on:
- Being a patriot
- Supporting your political party
- Trusting certain leaders
And that leader loses an election:
Option 1: “My candidate lost. Maybe they weren’t as popular as I thought.”
(This requires updating your beliefs and admitting you were wrong.)
Option 2: “My candidate didn’t lose. The election was stolen.”
(This preserves your beliefs and identity.)
Conspiracy theories protect self-concept.
They allow you to maintain your beliefs even when reality contradicts them.
Need #4: Tribal Belonging
Believing conspiracy theories bonds you to a community.
You gain:
- In-group identity (“We’re the awakened ones”)
- Shared language and references
- Mutual validation
- Purpose (fighting the conspiracy)
- Social support
Online communities dedicated to conspiracy theories provide:
- Acceptance
- Friendship
- Status (based on knowledge and contribution)
- Meaning
For people who feel alienated, disconnected, or unappreciated, this is powerful.
Research shows:
Social isolation and loneliness predict conspiracy belief.
Conspiracy communities fill a social void.
Leaving the conspiracy means losing your community. That’s hard.
Need #5: Explaining Suffering and Injustice
Bad things happen. Inequality exists. Suffering is real.
How do you make sense of it?
Option 1: “The world is complex. Systems are flawed. Randomness plays a huge role. Many problems have no clear solution.”
(Honest, but demoralizing.)
Option 2: “A shadowy group is deliberately causing suffering for their benefit. If we expose and defeat them, things will get better.”
(False, but energizing.)
Conspiracy theories turn suffering into a story with villains, victims, and potential heroes.
This is psychologically easier than accepting complexity and ambiguity.
The Cognitive Biases That Fuel Conspiracy Thinking
Conspiracy theories aren’t just about emotional needs. They’re also fueled by universal cognitive biases: mental shortcuts that usually help us but can lead us astray.
Bias #1: Pattern Recognition (Apophenia)
Humans are wired to detect patterns.
This helped our ancestors:
- Recognize animal tracks
- Predict weather
- Identify edible plants
But pattern detection is oversensitive.
We see faces in clouds, hear messages in static, find meaning in coincidences.
In conspiracy thinking:
- “Multiple celebrities died this year? That’s suspicious!” (ignoring base rates: people die all the time)
- “The number 666 appears in this legislation? Satanic conspiracy!” (ignoring that numbers appear everywhere)
We’re biased toward false positives (seeing patterns that aren’t there) over false negatives (missing real patterns).
Evolutionarily, this made sense:
- Better to mistake a shadow for a predator (false positive) than miss a real predator (false negative)
But in the modern world, it leads to seeing conspiracies in randomness.
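To make that concrete, here’s a toy simulation (plain Python, illustrative numbers only): it counts how often a “suspicious-looking” streak of seven identical results shows up somewhere in 200 fair coin flips.

```python
import random

random.seed(42)

def longest_streak(flips):
    """Length of the longest run of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# In 200 fair coin flips, how often does a streak of 7+ appear by chance?
trials = 10_000
hits = 0
for _ in range(trials):
    flips = [random.choice("HT") for _ in range(200)]
    if longest_streak(flips) >= 7:
        hits += 1

print(f"Streak of 7+ in 200 flips: {hits / trials:.0%} of trials")
```

Long streaks feel engineered, but they appear in the majority of purely random runs. That’s the pattern detector firing on noise.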
Bias #2: Confirmation Bias
We seek, interpret, and remember information that confirms our existing beliefs.
Example:
Belief: “Vaccines are dangerous.”
Search behavior:
- Google “vaccine dangers”
- Read articles that confirm dangers
- Ignore articles about safety
- Remember the one story about a bad reaction
- Forget the millions with no reaction
Result: Your belief strengthens, even though the evidence overwhelmingly contradicts it.
Confirmation bias is universal.
I do it. You do it. Everyone does it.
In conspiracy thinking, it’s supercharged:
- You look for evidence of the conspiracy
- Everything that fits is “proof”
- Everything that doesn’t fit is ignored or dismissed
- Over time, you build a self-reinforcing belief system
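The self-reinforcing loop above can be sketched as a toy model (not real cognitive science, just an illustration of the filtering): evidence arrives from a genuinely 50/50 process, but the biased observer only registers the confirming pieces.

```python
import random

random.seed(2)

# Evidence arrives as 1000 draws from a fair 50/50 process.
# True = "confirms my belief", False = "contradicts it".
flips = [random.random() < 0.5 for _ in range(1000)]

# An honest reader counts everything.
fair_confidence = sum(flips) / len(flips)

# A biased reader discards every disconfirming draw before counting.
seen = [f for f in flips if f]
biased_confidence = sum(seen) / max(len(seen), 1)

print(f"Counting all evidence:    {fair_confidence:.2f}")
print(f"Ignoring disconfirmation: {biased_confidence:.2f}")
```

The filtered stream yields 100% “confirmation” no matter what the world actually looks like: filter your inputs and any belief becomes certain.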
Bias #3: Proportionality Bias
We expect big events to have big causes.
Example:
President Kennedy was assassinated.
Proportional explanation: “A vast conspiracy involving the CIA, Mafia, and government officials orchestrated his death.”
Disproportionate explanation: “A lone, mentally unstable man with a rifle shot him.”
The second is true, but feels unsatisfying.
A world-historical figure killed by a nobody? Doesn’t feel right.
Similarly:
- COVID pandemic killed millions → “Must be bioengineered!”
- (Rather than: a virus jumped species, as has happened countless times)
Our intuition says: big effects need big causes.
But reality often involves:
- Small causes with cascading effects
- Randomness and accident
- Systemic complexity
Proportionality bias makes conspiracy theories feel more plausible than mundane truth.
Bias #4: Conjunction Fallacy
We judge specific scenarios as more likely than general ones, even when that’s logically impossible.
Classic example (Tversky & Kahneman):
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy and was deeply concerned with discrimination and social justice.
Which is more probable?
A) Linda is a bank teller
B) Linda is a bank teller and is active in the feminist movement
Most people say B.
But that’s logically impossible. B is a subset of A. It can’t be more probable.
In conspiracy thinking:
“The government is corrupt” (general)
vs.
“The government is secretly controlled by a cabal of elites who orchestrated 9/11, COVID, and control the media” (specific)
The specific story feels more probable because it’s vivid and detailed.
But logically, it’s less probable.
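You can see why in a small simulation (the probabilities here are made up purely to illustrate the logic): in any population, the people who satisfy a conjunction are a subset of the people who satisfy either conjunct alone.

```python
import random

random.seed(0)

# Hypothetical population: each person is independently a bank teller
# with probability 0.05 and a feminist activist with probability 0.30.
# (Made-up numbers; only the subset relationship matters.)
N = 100_000
teller = 0
teller_and_activist = 0
for _ in range(N):
    is_teller = random.random() < 0.05
    is_activist = random.random() < 0.30
    teller += is_teller
    teller_and_activist += is_teller and is_activist

# The conjunction can never be more common than either conjunct alone.
assert teller_and_activist <= teller
print(f"P(teller) ≈ {teller / N:.3f}, "
      f"P(teller AND activist) ≈ {teller_and_activist / N:.3f}")
```

Every extra detail a conspiracy narrative adds makes it feel more vivid, and makes it strictly less probable.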
Bias #5: Illusory Correlation
We perceive relationships between events that aren’t actually related.
Example:
“Every time there’s a mass shooting, it happens during a political controversy. They’re distracting us!”
Reality: Mass shootings happen regularly. Political controversies happen regularly. Sometimes they overlap. That’s coincidence, not causation.
Or:
“This celebrity spoke out against the elite, and now they’re dead. They were silenced!”
Reality: Many celebrities speak out. Many celebrities die (often from lifestyle factors). Sometimes these overlap.
We remember the hits (when timing aligns) and forget the misses (when it doesn’t).
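How often should two unrelated event streams overlap by chance alone? A quick sketch (made-up counts, purely illustrative): scatter 40 “controversy” days and 25 “incident” days at random across a year, independently, and count the collisions.

```python
import random

random.seed(1)

# Two independent streams of events over a 365-day year:
# 40 "political controversy" days and 25 "tragic incident" days,
# placed at random. Neither causes the other.
days, years = 365, 10_000
total_overlap = 0
for _ in range(years):
    controversy = {random.randrange(days) for _ in range(40)}
    incident = {random.randrange(days) for _ in range(25)}
    total_overlap += len(controversy & incident)

avg = total_overlap / years
print(f"Average days per year when both happen by pure chance: {avg:.2f}")
```

With these numbers, pure chance produces a couple of overlaps every single year. Each one feels like a “hit” if you’re looking for a pattern and counting only the coincidences.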
Bias #6: Motivated Reasoning
We reason to reach conclusions we’re motivated to reach, not to find truth.
Example:
If you want to believe vaccines are safe:
- You scrutinize anti-vax arguments critically
- You accept pro-vax arguments uncritically
- You find flaws in anti-vax studies
- You overlook flaws in pro-vax studies
If you want to believe vaccines are dangerous:
- You scrutinize pro-vax arguments critically
- You accept anti-vax arguments uncritically
- You find flaws in pro-vax studies
- You overlook flaws in anti-vax studies
Same evidence. Opposite conclusions. Based on motivation.
In conspiracy thinking:
People are motivated to believe because it:
- Protects their identity
- Provides community
- Explains suffering
- Makes them feel special
So they reason in ways that support the conspiracy, not in ways that seek truth.
Why Smart People Believe: Intelligence Doesn’t Protect You
Here’s the part that should scare you:
Intelligence doesn’t protect against conspiracy thinking. It might even make it worse.
Research Findings
Studies show:
Higher intelligence correlates with:
- Less belief in some conspiracy theories
- But greater skill at rationalizing beliefs once they’ve formed
What this means:
Smart people are better at:
- Generating justifications for their beliefs
- Finding supporting evidence (even weak evidence)
- Creating elaborate explanations
- Defending against counter-arguments
But they’re not better at:
- Recognizing when they’re wrong
- Updating beliefs with evidence
- Avoiding motivated reasoning
Intelligence is a tool. It can be used to find truth OR to defend falsehoods.
Smart people don’t believe fewer false things. They’re just better at defending the false things they believe.
The Backfire Effect
Providing evidence against conspiracy theories can backfire.
Research (though recently debated) suggests:
When people are presented with strong evidence contradicting their beliefs, they sometimes believe more strongly afterward.
Why?
1. Psychological reactance:
- Feeling pressured makes them resist
- “Don’t tell me what to think!”
2. Identity threat:
- Evidence against belief is experienced as attack on self
- Defense mechanisms activate
3. Motivated counter-arguing:
- They generate reasons why the evidence is wrong
- Each counter-argument strengthens conviction
This is why arguing with conspiracy theorists often fails.
The Social Dynamics: Echo Chambers and Radicalization
Conspiracy theories aren’t just individual psychological phenomena. They’re social.
The Internet Changed Everything
Pre-internet:
- If you believed a fringe theory, you were probably alone
- Social pressure moderated extreme views
- Access to alternative information was limited
Post-internet:
- You can find communities of like-minded believers
- Algorithms amplify content that engages you (often extreme content)
- You can live entirely within your information bubble
The Echo Chamber Effect
Once in a conspiracy community:
1. Confirmation bias on steroids:
- Your entire feed is pro-conspiracy
- Counter-evidence is absent or mocked
2. Social proof:
- “All these people believe it. They can’t all be wrong.”
3. Identity formation:
- Your in-group is conspiracy believers
- Your out-group is “sheep” who believe mainstream narratives
4. Escalating commitment:
- As you invest time and social capital, it’s harder to leave
- Admitting you were wrong means losing community
5. Radicalization:
- Exposure to more extreme versions of the theory
- Normalization of previously unthinkable beliefs
This is how someone goes from:
“I’m skeptical about pharmaceutical companies”
to
“Vaccines contain microchips for population control”
to
“We must take up arms against the medical establishment”
It’s a gradual process, fueled by social dynamics.
The Grifters and Bad Actors
Not everyone in conspiracy spaces is a true believer.
Some are:
- Grifters (selling books, supplements, speaking engagements)
- Foreign actors (spreading disinformation to divide societies)
- Trolls (enjoying chaos)
- Opportunists (using conspiracy theories for political gain)
These actors weaponize psychological vulnerabilities:
- They know what makes content viral (fear, outrage, belonging)
- They understand motivated reasoning
- They optimize for engagement, not truth
The result: Conspiracy theories spread faster and wider than ever before.
The Real Harms of Conspiracy Thinking
“Who cares if people believe weird things? Let them.”
I used to think this. I don’t anymore.
Harm #1: Public Health Disasters
Anti-vaccine conspiracy theories have led to:
- Measles outbreaks (in countries where measles had previously been eliminated)
- Polio resurgence in some regions
- Lower COVID vaccination rates → more deaths
COVID conspiracy theories led to:
- Refusal to mask or distance
- Rejection of treatments
- Spikes in poison-control calls from ivermectin misuse
- Hundreds of thousands of preventable deaths
This isn’t abstract. People die.
Harm #2: Political Violence
Conspiracy theories have motivated:
- January 6th Capitol attack (election fraud conspiracy)
- Pizzagate shooting (child trafficking conspiracy)
- QAnon-related violence
- Attacks on 5G towers (driven by the conspiracy theory that 5G spreads COVID)
When you believe a satanic cabal is trafficking children, violence seems justified.
When you believe elections are stolen, overthrowing government seems patriotic.
Conspiracy theories radicalize and justify violence.
Harm #3: Erosion of Trust
Conspiracy thinking erodes trust in:
- Science
- Medicine
- Journalism
- Democracy
- Institutions
Once trust is gone, collective action becomes impossible:
- Can’t respond to pandemics (people won’t comply)
- Can’t address climate change (people deny it exists)
- Can’t have functional democracy (people reject election results)
Society depends on shared reality. Conspiracy theories fracture it.
Harm #4: Personal Costs
For individuals, conspiracy belief often leads to:
- Damaged relationships (alienating family and friends)
- Financial costs (buying into scams)
- Mental health decline (paranoia, anxiety)
- Lost opportunities (refusing medical treatment, avoiding education)
I’ve watched people lose jobs, marriages, and relationships over conspiracy theories.
It’s tragic.
How to Think Better: Protecting Yourself from Conspiracy Thinking
If you’ve read this far, you might be thinking:
“How do I know I’m not the one in a conspiracy theory? How do I avoid these biases?”
Good questions. Here’s what helps.
Strategy #1: Epistemic Humility
Recognize the limits of your knowledge.
Instead of: “I know this is true.”
Try: “Based on current evidence, I believe this is likely true, but I could be wrong.”
Practice saying:
- “I don’t know.”
- “That’s outside my expertise.”
- “I could be wrong about this.”
This doesn’t mean you can’t have beliefs. It means holding them loosely.
Strategy #2: Actively Seek Disconfirming Evidence
Fight confirmation bias deliberately.
When you believe something, ask:
- “What would convince me I’m wrong?”
- “What’s the strongest counter-argument?”
- “What do smart people who disagree say?”
Then actually engage with those arguments.
Not to defeat them. To understand them.
Strategy #3: Diversify Information Sources
Don’t live in an echo chamber.
Follow:
- People who disagree with you
- Experts in the field (not YouTube commentators)
- Multiple news sources across the political spectrum
- Primary sources (not summaries)
Expose yourself to discomfort.
If you never encounter views that challenge yours, you’re probably in a bubble.
Strategy #4: Check Your Motivated Reasoning
Before sharing or believing something, ask:
“Do I believe this because:
- The evidence is strong?
- Or because I want it to be true?”
If the answer is “I want it to be true,” dig deeper.
Motivated reasoning is hardest to spot in yourself.
Strategy #5: Understand Base Rates and Probability
Many conspiracy theories ignore base rates.
Example:
“Healthy athletes are dying from heart attacks! It must be the vaccine!”
Base rate: How many athletes died of heart attacks before vaccines? (Answer: quite a few; it’s called sudden cardiac arrest and has always happened)
Learning basic statistics helps you avoid being misled by anecdotes.
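A base-rate check can be a two-line calculation. The numbers below are illustrative assumptions, not real epidemiology; the point is the habit of asking “how many cases would we expect with no new cause at all?”

```python
# Back-of-envelope base-rate check. All numbers are made-up assumptions.
athletes = 2_000_000   # hypothetical pool of competitive athletes
one_case_per = 50_000  # assumed: one sudden cardiac arrest per 50,000 athletes/year

expected_per_year = athletes / one_case_per
print(f"Expected cases per year from the base rate alone: {expected_per_year:.0f}")
# → 40 cases per year are expected before any new cause enters the picture
```

If a headline reports a handful of cases, the first question is whether that even exceeds the background number, not what exotic cause might explain it.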
Strategy #6: Trust (But Verify) Expertise
You can’t personally research everything.
You have to trust experts sometimes. But wisely.
Good heuristics:
- Trust scientific consensus (not individual scientists)
- Trust peer-reviewed research (not preprints or blogs)
- Trust people with relevant credentials (not “I did my own research”)
- Trust people who admit uncertainty (not those with unshakeable confidence)
Skepticism is good. But so is recognizing your limits.
Strategy #7: Notice Emotional Reactions
Conspiracy theories trigger strong emotions:
- Fear
- Anger
- Outrage
- Righteousness
When you feel strong emotions, pause.
Ask:
- “Am I being manipulated?”
- “Is this designed to provoke outrage?”
- “What is this making me feel, and why?”
Emotional hijacking impairs critical thinking.
Strategy #8: Be Willing to Change Your Mind
The strongest protection against false beliefs: updating when you get new evidence.
This requires:
- Intellectual humility
- Detaching beliefs from identity
- Seeing changing your mind as strength, not weakness
I’ve changed my mind about:
- Nutrition science (what I thought was healthy wasn’t)
- Economic policy (complexities I didn’t appreciate)
- Specific psychological findings (overturned by the replication crisis)
Each time, it was uncomfortable. But I’m better calibrated now.
How to Talk to Someone Deep in Conspiracy Thinking
What if someone you care about is deep into conspiracy theories?
Can you help them?
What Doesn’t Work
1. Arguing with evidence:
- They’ll dismiss it as “part of the conspiracy”
- Backfire effect might make them believe more strongly
2. Mocking or shaming:
- Drives them deeper into the community
- Makes you the enemy
3. Cutting them off:
- Confirms their narrative (“The truth isolated me”)
- Removes your influence
What Might Work
1. Maintain the relationship:
- Stay connected
- Show you care about them, not just their beliefs
- Be a tether to reality
2. Ask genuine questions:
- Not “How can you believe this?”
- But “What convinced you? What would change your mind?”
- Sometimes, articulating their reasoning reveals holes
3. Focus on shared values:
- “We both want to protect people.”
- “We both care about truth.”
- Find common ground
4. Plant seeds of doubt:
- Not through argument, but through questions
- “Has this source been wrong before?”
- “What do experts in this field say?”
5. Offer alternative community:
- Help them find belonging elsewhere
- Leaving the conspiracy means leaving community
- Provide alternatives
6. Be patient:
- Changing deeply held beliefs takes time
- Expect setbacks
- Small progress is still progress
7. Know when to protect yourself:
- If the relationship is abusive or toxic, distance is okay
- You can’t help someone who doesn’t want help
- Protect your own mental health
My Personal Reckoning: Where Am I Wrong?
Writing this, I’m forced to ask:
Where am I the conspiracy theorist?
Where do I:
- Believe things that make me feel good rather than things that are true?
- Dismiss evidence that contradicts my worldview?
- Surround myself with people who reinforce my beliefs?
I don’t know. That’s the scary part.
But I try to:
- Read people who disagree with me
- Update beliefs when I encounter strong evidence
- Admit uncertainty
- Recognize my biases
Am I successful? Probably not as much as I think.
That’s the human condition.
Final Thoughts: Compassion and Vigilance
Conspiracy theories are seductive because they meet real psychological needs:
- Understanding
- Belonging
- Significance
- Identity protection
These needs are universal.
That means conspiracy thinking is a universal vulnerability.
Not “those crazy people.” All of us.
The question isn’t “Am I immune?” (You’re not.)
The question is “How do I protect myself while maintaining compassion for those who’ve fallen into it?”
My answer:
1. Epistemic humility: Recognize I could be wrong.
2. Intellectual honesty: Seek truth, not comfort.
3. Compassion: Understand that conspiracy believers are meeting needs, not just being irrational.
4. Vigilance: Constantly check my reasoning, sources, and motivations.
5. Community: Surround myself with people who value truth and will call me out.
6. Responsibility: Speak up against harmful conspiracy theories, but with empathy.
The world is complex, uncertain, and often unjust.
Conspiracy theories offer simple answers.
But simple answers to complex questions are almost always wrong.
The truth is hard, uncomfortable, and uncertain.
But it’s still worth seeking.
Have you encountered conspiracy thinking in your life? How do you navigate it? Where do you check your own reasoning? I’d love to hear your thoughts.