I used to think Tailwind CSS was terrible.

Not because I’d used it extensively. I’d tried it for maybe an hour, felt uncomfortable, and decided it was “just inline styles with extra steps.”

Then I spent the next six months seeing only evidence that confirmed my belief:

  • Blog posts criticizing Tailwind? Bookmarked and shared.
  • Tweets praising Tailwind? Scrolled past or found reasons to dismiss them.
  • Projects struggling with Tailwind? “See, I knew it was problematic!”
  • Projects thriving with Tailwind? “They would’ve been fine with CSS modules.”

I wasn’t evaluating Tailwind objectively. I was collecting ammunition to defend a conclusion I’d already made.

This is confirmation bias: the tendency to seek, interpret, and remember information that confirms what we already believe.

And in the age of social media—where algorithms serve us exactly what we want to see—it’s become dangerously easy to never encounter a challenging thought again.

What is Confirmation Bias?

Confirmation bias is the tendency to:

  1. Seek information that confirms our existing beliefs
  2. Interpret ambiguous information as supporting our beliefs
  3. Remember information that confirms our beliefs more readily
  4. Dismiss or forget information that contradicts our beliefs

It’s not conscious deception. It’s a cognitive shortcut that made sense in evolutionary terms but creates problems in the modern information landscape.

Why it evolved:

In a dangerous environment, having strong beliefs that you could act on quickly was survival-positive. Questioning everything is slow. Conviction is fast.

But what helped us survive saber-tooth tigers is now preventing us from thinking clearly about technology choices, business decisions, and pretty much everything else.

The Social Media Amplifier

Confirmation bias has always existed, but social media has turned it into a supercharged, algorithmic echo chamber.

Here’s how:

The Algorithm Knows What You Want

Every platform optimizes for engagement.

How they do it:

  • You click on articles criticizing remote work → Feed shows more anti-remote content
  • You watch videos about microservices → Algorithm recommends more microservices content
  • You like tweets from functional programming advocates → Timeline fills with FP evangelism

The result: Your feed becomes a hall of mirrors, reflecting your existing beliefs back at you.

You think you’re staying informed. You’re actually being algorithmically isolated from disagreement.

The Filter Bubble Effect

Eli Pariser coined the term “filter bubble” in 2011, and it’s only gotten worse.

The bubble:

If you believe “serverless is the future,” your bubble will show you:

  • Success stories of serverless migrations
  • Cost savings from going serverless
  • Thought leaders praising serverless
  • Frameworks built for serverless

Your bubble will NOT show you:

  • Companies leaving serverless for traditional servers
  • Cost disasters from serverless
  • Limitations and gotchas
  • Thoughtful critiques of serverless hype

You don’t even know what you’re not seeing.

The Engagement Trap

Platforms profit from engagement. Outrage and validation drive engagement.

What gets promoted:

  • Content that makes you angry (outrage drives shares)
  • Content that validates you (validation drives likes)
  • Content that confirms your beliefs (comfort drives time-on-site)

What gets buried:

  • Nuanced takes (boring, low engagement)
  • Moderate positions (no strong emotions)
  • Content that challenges you (uncomfortable, people leave)

The algorithm isn’t trying to inform you. It’s trying to keep you scrolling.

The Echo Chamber Effect

You follow people who think like you. They share content that reinforces shared beliefs. You share it too. The cycle continues.

The tech Twitter echo chambers:

The “JavaScript is bloated” chamber: Everyone agrees frameworks are too complex, vanilla JS is better, dependencies are dangerous.

The “React is the only answer” chamber: Everyone agrees React is best, other frameworks are niche, component architecture is the future.

The “TypeScript is essential” chamber: Everyone agrees JavaScript without types is reckless, TypeScript saves projects, untyped code is legacy.

Notice: These beliefs contradict each other. But within each chamber, everyone agrees, so each group thinks they’re objectively correct.

How Confirmation Bias Manifests in Tech

Let me show you how this plays out for developers and founders:

Example 1: The Framework Wars

The pattern:

  1. You choose React for your first project
  2. You invest time learning React
  3. Your identity becomes “a React developer”
  4. Now you seek evidence that React is the best choice

What you see:

  • React job postings (confirms it’s in-demand)
  • Successful companies using React (confirms it’s enterprise-ready)
  • New React features (confirms it’s actively developed)
  • Blog posts praising React (confirms it’s well-designed)

What you ignore:

  • Companies migrating away from React
  • Performance issues at scale
  • Developer experience complaints
  • Successful projects built without React

Why it happens: Admitting another framework might be better feels like admitting you wasted time learning the wrong thing.

Example 2: The Architecture Religious Wars

The belief: “Microservices are always better than monoliths.”

How confirmation bias reinforces it:

Seek:

  • Case studies of successful microservice migrations
  • Articles about monolith limitations
  • Talks from companies like Netflix and Uber

Interpret:

  • “This company has scaling issues” → “They should’ve used microservices”
  • “This startup is growing fast” → “Must be because of their microservice architecture”

Remember:

  • That blog post about how microservices saved a company
  • That conference talk about monolith problems

Forget:

  • The companies that tried microservices and reverted
  • The successful companies still running monoliths
  • The nuanced take that it depends on context

Result: You now “know” microservices are better, and you have a mental catalog of evidence to prove it.

Example 3: The “10x Developer” Myth

The belief: “The best developers are 10x more productive.”

The confirmation bias:

What you notice:

  • That senior dev who ships features fast (ignoring they’re working on tasks they’ve done 100 times)
  • That GitHub profile with tons of commits (ignoring most are trivial or from automated tools)
  • That founder who built an app in a weekend (ignoring the 10 years of experience that enabled it)

What you don’t notice:

  • All the collaborative work that makes “10x” developers productive (code review, pair programming, mentorship)
  • The “slow” developers who prevent bugs and write maintainable code
  • The teams without “stars” that still ship consistently

Result: You believe in the 10x myth, hire for it, create toxic competitive culture, and wonder why teams underperform.

Example 4: The Technology Choice Justification

The scenario:

Your startup chose MongoDB in 2018.

Confirmation bias in action:

2019: “MongoDB is scaling great! Best decision ever.”
2020: “This schema flexibility is really helpful.” (Ignoring the times it caused bugs)
2021: “Look at this MongoDB success story!” (Ignoring PostgreSQL success stories)
2022: Data consistency issues emerge.
2023: “All databases have these issues. MongoDB is still the right choice.”

Reality: PostgreSQL might have been a better fit, but you’ve spent five years collecting evidence that MongoDB was right, so switching feels like admitting failure.

Alternative universe: You chose PostgreSQL in 2018 and spent five years collecting evidence that it was the right choice.

The truth: Both would’ve worked fine. But confirmation bias made each choice seem obviously correct in hindsight.

Real-World Consequences

This isn’t just academic. Confirmation bias costs real time and money:

Consequence 1: Bad Technical Decisions

Example:

You believe “premature optimization is the root of all evil” (a real Knuth quote, though he prefaced it with “we should forget about small efficiencies, say about 97% of the time”).

Because of confirmation bias:

  • You notice when optimization efforts were wasted
  • You ignore when lack of optimization caused performance disasters
  • You interpret slow code as “we’ll optimize later”
  • You remember the blog posts about over-engineered code

Result: You ship code that’s unnecessarily slow because you’ve trained yourself to see all optimization as premature.
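To make that concrete, here’s an illustrative (and deliberately tiny) example of an “optimization” that isn’t premature at all: swapping a list membership check for a set. The function names are mine, purely for the sketch.

```python
# Illustrative only: a "we'll optimize later" pattern that's cheap to fix now.

def dedupe_slow(items):
    """O(n^2): each membership check scans the whole list."""
    seen = []
    result = []
    for item in items:
        if item not in seen:  # linear scan every time
            seen.append(item)
            result.append(item)
    return result

def dedupe_fast(items):
    """O(n): set membership is a hash lookup, constant time on average."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:  # hash lookup
            seen.add(item)
            result.append(item)
    return result

print(dedupe_fast([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Both functions are “correct,” and on ten items nobody notices. On a million items, one of them is a performance disaster, and fixing it took one line of forethought, not a premature-optimization crusade.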

Consequence 2: Missed Opportunities

Example:

You believe “AI is just hype.”

Your confirmation bias:

  • You seek articles criticizing AI limitations
  • You dismiss GPT-4 successes as cherry-picked
  • You remember AI failures
  • You ignore companies building successful AI products

Result: You miss the actual opportunity to build valuable AI-enhanced products while you wait for the “hype” to die.

Consequence 3: Team Conflict

The scenario:

Half your team loves TypeScript. Half your team prefers JavaScript.

What happens:

Each side:

  • Shares articles supporting their position
  • Dismisses the other side’s evidence as biased
  • Interprets problems as validation of their view
  • Remembers times the other approach failed

Result: Endless debates, no progress, team division.

The truth: Both can work. But confirmation bias makes compromise impossible.

Consequence 4: Cargo Culting

Example:

You read that Google uses monorepos.

Confirmation bias kicks in:

  • You seek more articles about monorepo benefits
  • You interpret your current multi-repo pain as evidence you need a monorepo
  • You remember the success stories
  • You ignore the articles about monorepo complexity

Result: You spend three months migrating to a monorepo, copying Google’s approach without Google’s tooling or scale, and create more problems than you solved.

How to Combat Confirmation Bias

You can’t eliminate confirmation bias—it’s hardwired. But you can build practices that counteract it:

Strategy 1: Actively Seek Disconfirming Evidence

Don’t just consume content that agrees with you. Hunt for disagreement.

Practical implementation:

For every major belief, actively search for the opposite view.

Example:

Belief: “Tailwind is the best CSS approach.”

Deliberate search:

  • Google: “Why Tailwind CSS is bad”
  • Google: “Alternatives to Tailwind”
  • Twitter search: “Tailwind criticism”
  • Reddit: r/css threads critical of Tailwind

Then: Actually read them. Seriously consider the arguments.

Goal: Not to change your mind necessarily, but to understand the strongest counter-arguments.

Strategy 2: The “Steelman” Exercise

The opposite of a strawman argument is the “steelman”—representing the opposing view in its strongest, most charitable form.

How to practice:

Pick a technology you disagree with. Write out the BEST possible argument for using it.

Example:

I believe: “NoSQL databases are overhyped.”

Steelman of the opposite:

“NoSQL databases excel in specific use cases: handling massive write throughput, storing genuinely schemaless data, scaling horizontally beyond what an RDBMS can economically achieve, and providing flexible data models for rapidly evolving products. For companies like Facebook or Netflix with billions of records and specific access patterns, NoSQL isn’t hype—it’s the right tool for the job.”

What this does: Forces you to understand why smart people might disagree with you, instead of dismissing them.

Strategy 3: Diversify Your Information Diet

Follow people who think differently than you.

If you’re a React fan:

  • Follow Vue, Svelte, and Solid advocates
  • Read articles praising other frameworks
  • Join Discord servers for other communities

If you believe in microservices:

  • Follow people building successful monoliths
  • Read “The Majestic Monolith”
  • Study companies that chose monoliths intentionally

If you’re a TypeScript evangelist:

  • Follow developers who ship great products in plain JavaScript
  • Read arguments against type systems
  • Study dynamic language successes

The goal: Not to change your tech choices, but to understand the full landscape instead of just your corner of it.

Strategy 4: The “What Would Change My Mind?” Test

For any strong belief, ask: “What evidence would convince me I’m wrong?”

If the answer is “nothing,” you’re not holding a belief—you’re holding a dogma.

Example:

Belief: “Remote work is better than office work.”

What would change my mind:

  • Data showing remote teams consistently underperform in-person teams across multiple studies
  • Personal experience of significantly better collaboration in-office
  • Evidence that specific types of work (e.g., creative brainstorming) measurably suffer remotely

Having clear criteria makes you intellectually honest.

If you encounter that evidence, you have to update your belief—or admit you’re being irrational.

Strategy 5: Track Your Predictions

Confirmation bias thrives on fuzzy thinking. Combat it with precise predictions.

How it works:

Write down specific, testable predictions based on your beliefs.

Example:

Belief: “Using TypeScript will reduce bugs in our codebase.”

Prediction: “After migrating to TypeScript, we’ll see a 30% reduction in production bugs within 6 months.”

Result after 6 months: Check the data. Were you right?

  • If yes: Great! Your belief was validated with evidence.
  • If no: Update your belief. Maybe TypeScript helps but not as much as you thought, or maybe you need different practices.

What this prevents: Vague claims like “TypeScript makes code better” that can never be proven wrong.
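If you want to actually practice this, the mechanics are simple enough to fit in a few lines. Here’s a minimal sketch of a prediction log (the class and function names are hypothetical, not from any real tool): record a specific claim, a deadline, and your confidence, then periodically list the predictions that have come due and are still unscored.

```python
# A minimal, hypothetical prediction log -- a sketch, not a real tool.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Prediction:
    claim: str                     # specific, testable statement
    due: date                      # when to check it
    confidence: float              # how sure you were, 0.0-1.0
    outcome: Optional[bool] = None # filled in at review time

def due_for_review(predictions: List[Prediction], today: date) -> List[Prediction]:
    """Return predictions whose deadline has passed and aren't scored yet."""
    return [p for p in predictions if p.due <= today and p.outcome is None]

log = [
    Prediction("TS migration cuts production bugs 30% within 6 months",
               due=date(2024, 6, 1), confidence=0.8),
]

for p in due_for_review(log, today=date(2024, 7, 1)):
    print(f"Time to score: {p.claim} (you were {p.confidence:.0%} confident)")
```

The tool doesn’t matter—a spreadsheet works just as well. What matters is that the claim is written down before the outcome is known, so confirmation bias can’t quietly rewrite what you “always thought.”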

Strategy 6: Join Opposition Communities

This sounds uncomfortable, but it’s incredibly effective.

Example:

If you love React, spend a month actively participating in the Svelte community.

Not to troll. To genuinely learn why people prefer it.

What you’ll discover:

  • Smart people have different preferences
  • There are valid reasons to choose differently
  • Your framework isn’t objectively superior
  • Every tool has tradeoffs

Bonus: You might actually learn something valuable about your own preferred tool.

Strategy 7: The “Yes, And” Practice

When someone disagrees with you, resist the urge to say “Yes, but…”

Instead say “Yes, and…”

Example:

Them: “I think GraphQL adds unnecessary complexity.”

Confirmation bias response: “Yes, but that complexity is worth it for the flexibility.”

“Yes, and” response: “Yes, and I can see how for simpler apps, REST might be more straightforward. Where do you think the tradeoff tips?”

What this does: Opens conversation instead of defending your position. You might learn something.

Strategy 8: Consume Primary Sources

Don’t just read opinions about things. Try them yourself.

The pattern:

Most people form opinions based on other people’s opinions, creating a game of telephone.

Better approach:

Want to evaluate a technology? Build something with it for a week.

Example:

Instead of reading 10 articles about “Svelte vs React,” build the same small app in both. Form your own opinion based on direct experience.

You’ll still have bias, but it’ll be based on reality, not someone else’s blog post.

The Social Media Detox Strategy

Here’s how to use social media without drowning in confirmation bias:

Tactic 1: Curate Your Follow List Ruthlessly

What to follow:

  • People who disagree with you respectfully
  • People who think differently
  • People from different backgrounds
  • People who make you uncomfortable (in a challenging, not toxic way)

What to unfollow:

  • People who only validate your beliefs
  • Engagement bait accounts
  • Outrage merchants
  • Echo chamber amplifiers

Tactic 2: Use Algorithmic Feeds Less

Instead of the algorithmic “For You” feed:

  • Use chronological feeds when available
  • Use lists to segment topics
  • Use RSS for blogs (no algorithm)
  • Deliberately search for opposing views

Why: Algorithms optimize for engagement, not truth or balance.
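RSS is a good illustration of “no algorithm”: a feed is just an XML document in publication order, and reading one needs nothing beyond the standard library. Here’s a sketch that parses a feed and pulls out item titles—the feed is embedded as a string for illustration (with hypothetical example.com entries); in practice you’d fetch the feed URL first.

```python
# Parse an RSS 2.0 feed with the stdlib -- document order, no engagement ranking.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """\
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>A critique of my favorite framework</title>
          <link>https://example.com/critique</link></item>
    <item><title>Why I was wrong about NoSQL</title>
          <link>https://example.com/wrong</link></item>
  </channel>
</rss>"""

def item_titles(feed_xml: str) -> list:
    """Return item titles exactly as the publisher ordered them."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(SAMPLE_FEED))
```

Whatever the blog published is what you see, in the order it was published—no engagement signal ever touches the list.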

Tactic 3: The “Opposing View” Ritual

For every article you share that confirms your beliefs, share one that challenges them.

Example:

Share an article praising TypeScript? Also share a thoughtful critique of TypeScript.

What this does:

  • Signals intellectual honesty
  • Exposes your followers to diverse views
  • Forces you to engage with counter-arguments

Tactic 4: Set Consumption Limits

Unlimited social media consumption = unlimited confirmation bias.

My rules:

  • Twitter: 30 minutes/day max
  • Reddit: Only for specific questions, not browsing
  • LinkedIn: 15 minutes/day
  • Hacker News: Morning coffee only

When you limit time, you’re more selective about what you consume, and less likely to fall into echo chambers.

Tactic 5: Actively Engage with Disagreement

When you see a take you disagree with, resist the urge to:

  • Quote tweet with a dunk
  • Leave an angry comment
  • Dismiss and scroll

Instead:

  • Read it fully
  • Try to understand the reasoning
  • Engage respectfully if you have something to contribute
  • Or just let it sit in your brain

Example:

See a tweet: “TypeScript is unnecessary complexity.”

Bad response: “Tell me you don’t work on real projects without telling me…”

Good response: “Interesting take. What size/type of projects have you found work better without TypeScript? Genuinely curious about the tradeoff point.”

The Cost of Being Wrong Less

Here’s the paradox: Fighting confirmation bias makes you less certain, but more often correct.

Most people optimize for confidence. They want to feel right.

But optimizing for confidence leads to echo chambers, dogma, and being very confidently wrong.

Better goal: Optimize for accuracy.

That means:

  • Holding beliefs loosely
  • Updating quickly when you’re wrong
  • Being comfortable with “I don’t know”
  • Seeking truth over validation

The developers I respect most aren’t the most confident. They’re the most curious.

They say things like:

  • “I prefer X, but I understand why you’d choose Y.”
  • “I used to think X, but this changed my mind…”
  • “I’m not sure. Let’s try both and see.”
  • “That’s a good point. I hadn’t considered that.”

They’re not wishy-washy. They’re intellectually honest.

Final Thoughts: Breaking the Echo Chamber

Social media promised to connect us to the world’s information.

Instead, it connected us to customized versions of the world designed to keep us comfortable.

Confirmation bias isn’t a bug in social media. It’s the feature.

Platforms profit when you’re engaged. You’re most engaged when you’re validated or outraged. So that’s what they serve you.

Breaking free requires intention:

  • Actively seek disconfirming evidence
  • Follow people who challenge you
  • Build products with technologies you’re skeptical of
  • Hold beliefs loosely
  • Update quickly

Your goal shouldn’t be to eliminate bias—that’s impossible.

Your goal should be to recognize it, account for it, and build practices that counteract it.

These days I actually prefer Tailwind, but for the right reasons. I’ve used it extensively, I’ve tried alternatives, I understand the tradeoffs, and I know when NOT to use it.

That’s a belief based on experience, not confirmation bias.

The question isn’t whether you have biases. You do. We all do.

The question is: Are you aware of them? And are you doing anything about it?


What beliefs are you holding too tightly? What evidence would change your mind? What opposing views have you been avoiding? Let me know—I’d genuinely love to hear.