I once deployed code to production because a VP told me to, even though I knew it would break things.

I was a junior engineer. They were a VP of Engineering. They said, “Ship it now. We need this for the demo tomorrow.”

I tried to explain: “The tests are failing. The database migration isn’t ready. This will cause data corruption.”

They responded: “I understand your concerns, but I’ve been doing this for 20 years. Trust me. Ship it.”

So I did.

It broke spectacularly. We corrupted customer data. We spent 72 hours doing manual recovery. Customers were furious.

In the post-mortem, my manager asked: “Why did you ship code you knew was broken?”

My answer: “Because the VP told me to.”

That’s authority bias. And it’s not just about following bad orders. It’s about how we surrender our own judgment to people in positions of power—even when they’re clearly wrong.

What is Authority Bias?

Authority bias is the tendency to attribute greater accuracy and weight to the opinion of an authority figure, regardless of the actual content of that opinion.

In other words: We believe people in positions of authority are more likely to be correct, simply because they’re authorities—not because their argument is better.

The mechanism:

  • Person with title/credentials/status makes a claim
  • We assume they know what they’re talking about
  • We don’t critically evaluate their reasoning
  • We defer to their judgment over our own
  • We follow their direction even when it contradicts our better judgment

The problem: Authority figures are often wrong. Their expertise in one domain doesn’t transfer to another. Their incentives might not align with truth. And blind deference leads to terrible outcomes.

The Milgram Experiment: The Dark Side of Obedience

The most famous demonstration of authority bias comes from Stanley Milgram’s experiments in the 1960s.

The setup:

Participants were told they were part of a learning experiment. Their job: administer electric shocks to a “learner” (actually an actor) every time they got a question wrong.

The shocks started at 15 volts and increased by 15 volts with each wrong answer, up to a maximum of 450 volts (labeled “Danger: Severe Shock” and “XXX”).

The learner was in another room. As the shocks increased, participants heard:

  • 75V: Grunts of pain
  • 120V: Complaints of pain
  • 150V: Demands to be released
  • 270V: Agonized screams
  • 330V: Silence (implying unconsciousness or worse)

The authority figure: A researcher in a lab coat told participants to continue whenever they hesitated.

The script:

  • Participant: “I don’t think I should continue.”
  • Researcher: “Please continue.”
  • Participant: “But he’s in pain.”
  • Researcher: “The experiment requires that you continue.”
  • Participant: “What if something happens to him?”
  • Researcher: “I take full responsibility. Please continue.”

The question: How many participants would go all the way to 450 volts?

Before the experiment, experts predicted: Less than 1%. Maybe some sadists. But normal people wouldn’t hurt someone just because an authority figure told them to.

The actual result: 65% of participants went all the way to 450 volts.

Not 1%. Not 10%. 65%.

Two-thirds of ordinary people were willing to administer potentially lethal shocks to an innocent person simply because someone in a lab coat told them to.

Let that sink in.

Why the Milgram Results Are Terrifying

This wasn’t a fringe study with weird methodology. It’s been replicated across:

  • Different countries
  • Different cultures
  • Different time periods
  • Different contexts

The pattern holds: When an authority figure gives orders, most people obey—even when those orders contradict their moral compass.

The implications:

This explains:

  • How ordinary people commit atrocities during war
  • How corporate fraud happens (everyone was “just following orders”)
  • How medical errors occur (nurses don’t question wrong prescriptions)
  • How engineering disasters happen (juniors don’t speak up when seniors are wrong)

We like to think: “I would never do that. I’d stand up to authority. I’d do the right thing.”

The data says: No, you probably wouldn’t. Neither would I.

Authority Bias in Tech and Startups

You don’t need lab coats and electric shocks to see authority bias. It’s everywhere in professional life:

Example 1: The Senior Engineer Who’s Always Right

The scenario:

In an architecture review, a senior engineer proposes using microservices for a new project.

You’re a mid-level engineer. You think a monolith would be better for this use case:

  • The team is small (4 people)
  • The system isn’t that complex
  • You don’t have the infrastructure for microservices
  • Deployment and debugging will be a nightmare

But you don’t speak up because:

  • They’re senior
  • They’ve been at the company for 8 years
  • They’ve built systems before
  • They have “Staff Engineer” in their title

The team builds microservices. It’s a disaster. Deployment is slow. Debugging is impossible. The complexity kills velocity.

Two years later, the team rewrites it as a monolith.

Authority bias in action: Title and experience trumped sound reasoning.

Example 2: The CEO’s Bad Idea

The scenario:

The CEO wants to add a feature. Everyone on the product team knows it’s a bad idea:

  • User research doesn’t support it
  • It doesn’t align with the product vision
  • It’s technically complex
  • It’ll distract from higher-impact work

But nobody pushes back hard because:

  • They’re the CEO
  • They founded the company
  • They’ve “been right before”
  • They’re the boss

The feature gets built. It takes 3 months. Adoption is 2%. It gets deprecated 6 months later.

Authority bias in action: Position trumped evidence.

Example 3: The Consultant Who Must Know Best

The scenario:

A company hires McKinsey (or Deloitte, or BCG) to advise on strategy.

The consultants (fresh MBAs with no industry experience) recommend a major reorganization.

Long-time employees object: “This doesn’t make sense for our culture. We tried something similar before and it failed. This will destroy team cohesion.”

Management responds: “But McKinsey recommended it. They’re experts. They’ve worked with Fortune 500 companies.”

The reorg happens. It’s a disaster. Top performers leave. Productivity tanks. The company spends two years recovering.

Authority bias in action: Brand name and price tag trumped institutional knowledge.

Example 4: The Framework Everyone’s Using

The scenario:

A famous tech company (let’s say Google) open-sources a framework. Everyone starts using it.

You look at it and think: “This seems overly complex for our needs. It’s designed for Google-scale problems, which we don’t have.”

But you adopt it anyway because:

  • “Google uses it, so it must be good”
  • “All the big companies are moving to this”
  • “The creator gave a talk at a major conference”

Your team spends 6 months learning it, only to find it’s massive overkill for your use case. You could’ve solved the problem with something simpler.

Authority bias in action: “Google does it” became the reasoning, not “it’s right for our context.”

Example 5: The Doctor Who Dismissed the Nurse

This one’s from healthcare, but it’s relevant:

A nurse notices something wrong with a patient. They tell the doctor. The doctor dismisses it: “I’m the doctor. I know what I’m doing.”

The nurse is right. The patient’s condition worsens. By the time the doctor realizes the mistake, it’s too late.

Why this matters for tech: The same pattern happens with junior engineers and senior engineers, QA and developers, support teams and product teams.

Authority bias causes us to ignore valuable input from people with less formal authority.

The Anatomy of Authority Bias

Why are we so susceptible to this? Several factors:

1. Heuristic Efficiency

The logic: “Evaluating every argument on its merits is cognitively expensive. Using authority as a shortcut is efficient.”

Usually, experts do know more. Usually, experienced people have better judgment. Usually, people in positions of power got there for a reason.

So deferring to authority is often a good heuristic.

The problem: It becomes a reflex. We stop evaluating the argument and just check the title.

2. Social Proof and Legitimacy

The logic: “If they’re in a position of authority, society has validated them. Who am I to question that?”

The problem: Authority doesn’t equal correctness. People get positions through politics, luck, timing, and yes, competence. But competence in one area doesn’t mean competence in all areas.

3. Fear of Consequences

The logic: “If I question authority, there might be negative consequences. Easier to just comply.”

The reality: This is often true. Questioning your boss can hurt your career. Disagreeing with a senior can make you look difficult.

The problem: This creates systems where bad decisions go unchallenged.

4. Cognitive Dissonance Reduction

The logic: “I trust this authority figure. They’re saying X. If I believe X is wrong, I’d have to question my trust in them. Easier to just believe X.”

The problem: We rationalize authority’s position rather than critically evaluate it.

5. Diffusion of Responsibility

The logic: “They’re the authority. It’s their job to be right. I’m just following orders. If it’s wrong, it’s on them.”

The problem: This is how ethical disasters happen. Everyone abdicates responsibility to the authority figure.

Real-World Authority Bias Disasters

Let’s look at some historical examples:

The Challenger Disaster (Again)

I mentioned this in the groupthink article, but it’s also a textbook case of authority bias.

Engineers warned that O-rings could fail in cold temperatures. Their data was clear. Their concerns were serious.

But NASA management (the authority) decided to launch anyway.

Why did the engineers ultimately defer? Authority bias. “Management must know something we don’t.” “They’ve made these decisions before.” “Our job is to provide data, not to make the call.”

Result: Seven astronauts died.

The 2008 Financial Crisis

Rating agencies (Moody’s, S&P, Fitch) gave AAA ratings to mortgage-backed securities that were actually toxic.

Why did investors buy them? Authority bias. “The rating agencies are experts. If they say it’s AAA, it must be safe.”

Individual analysts at investment firms had doubts. But the rating agencies were the authority. Their judgment was trusted over individual skepticism.

Result: Global financial collapse.

The Boeing 737 MAX

Engineers raised concerns about the MCAS system. Pilots raised concerns about training. Regulators in some countries raised concerns about safety.

But Boeing (the authority in aviation) said it was fine. The FAA (the authority in regulation) approved it.

Other countries deferred to the FAA’s judgment. Airlines deferred to Boeing’s expertise.

Result: Two crashes. 346 people dead.

Theranos (Again)

A board full of powerful, respected people (former Secretaries of State, retired generals, prominent businesspeople) oversaw a company making fraudulent claims.

Why didn’t they catch it? Authority bias worked in two directions:

  1. Investors and partners trusted the board: “With that board, it must be legit.”
  2. The board trusted Elizabeth Holmes: “She’s the CEO. She’s the expert in her technology. We defer to her.”

Result: Massive fraud. Patients put at risk.

How Authority Bias Shows Up Daily in Tech

In Code Review

Senior engineer: “This is the right approach.”

Junior engineer: (has concerns but doesn’t voice them because the senior must know better)

Result: Suboptimal code merged. Technical debt accumulates.

In Product Decisions

Product Manager: “Users want feature X.”

Designer: (has user research showing otherwise but doesn’t push back because PM is the “product expert”)

Result: Feature built that users don’t want.

In Incident Response

On-call lead: “The issue is with the database.”

Junior engineer: (thinks it’s actually a caching issue but defers to the lead’s experience)

Result: Two hours wasted investigating the wrong thing.

In Career Advice

Manager: “You should focus on learning technology Y.”

Employee: (knows that technology Y isn’t relevant to their goals but assumes manager knows best)

Result: Time wasted on skills that don’t advance their actual career.

In Salary Negotiations

HR: “This is a competitive offer for your level.”

Candidate: (has data showing it’s below market but assumes HR knows the market better)

Result: Accepting below-market compensation.

How to Recognize Authority Bias in Real-Time

Warning signs that you’re deferring to authority rather than evaluating the argument:

Warning Sign #1: Your Reasoning is “Because They Said So”

If your explanation for a decision is:

  • “The VP wants it”
  • “Google does it this way”
  • “The consultant recommended it”
  • “The senior engineer said so”

That’s not reasoning. That’s authority bias.

Better: “Here’s why this makes sense…” with actual arguments.

Warning Sign #2: You’re Uncomfortable Disagreeing

You have a different opinion. You have good reasons. But you’re hesitant to voice them because of the other person’s title or status.

That discomfort is authority bias.

Warning Sign #3: You Stop Asking Questions

You don’t understand the reasoning. But you assume:

  • “They must know something I don’t”
  • “I’m probably just not seeing the big picture”
  • “I don’t want to look stupid by asking”

Healthy alternative: Ask questions until you understand. If you can’t understand, maybe the reasoning is bad.

Warning Sign #4: You Rationalize to Defend Authority’s Position

Authority figure makes a claim that seems wrong. Instead of investigating, you come up with reasons why they might be right.

You’re doing their work for them. You’re bending your thinking to fit their conclusion.

Warning Sign #5: You Feel Powerless

“I’m just a [junior engineer / individual contributor / contractor]. I can’t push back on [senior engineer / VP / CEO].”

That feeling of powerlessness is often authority bias masquerading as realism.

How to Combat Authority Bias

You can push back on authority bias without being insubordinate:

Strategy 1: Evaluate Arguments, Not Titles

The practice: Separate the claim from who’s making it.

Technique: Imagine the same argument coming from a junior engineer. Would it still convince you?

If yes, the argument is good.

If no, you’re being influenced by authority bias.

Example:

CTO says: “We should rewrite the entire codebase in Rust.”

Ask yourself: If a junior engineer said this, would I think it was a good idea?

If not, ask the CTO for their reasoning. Evaluate the reasoning, not the title.

Strategy 2: Ask for the Reasoning

The practice: Don’t accept “because I said so.” Ask for the underlying logic.

How:

  • “Can you walk me through your thinking?”
  • “What data are you basing this on?”
  • “What are the tradeoffs we’re considering?”
  • “What alternatives did you consider?”

Why this works: Forces explicit reasoning. Either they have good reasons (great, now you understand) or they don’t (revealing that authority was their only argument).

Example:

Manager: “We should prioritize Feature X.”

You: “Can you help me understand the reasoning? What user feedback or data is this based on?”

If they have good data, you learn something. If they don’t, the weakness becomes apparent.

Strategy 3: Use “I’m Trying to Understand” Framing

The practice: Frame disagreement as seeking understanding, not challenging authority.

Instead of: “I disagree with this approach.”

Try: “I’m trying to understand this approach. My concern is X. How should I think about that?”

Why this works: Non-threatening. You’re not saying they’re wrong. You’re saying you want to understand. This makes them explain their reasoning without getting defensive.

Example:

Senior engineer: “We should use MongoDB for this.”

You: “I’m trying to understand the database choice. My understanding is our data is highly relational. How should I think about the tradeoffs between MongoDB and Postgres here?”

Strategy 4: Bring Data

The practice: Counter authority with evidence.

Example:

VP: “We should build mobile apps before web.”

You: “I looked at our analytics. 87% of our traffic is web. Only 13% is mobile. Here’s the data. Given that, does it still make sense to prioritize mobile?”

Why this works: Data is harder to dismiss than opinion. You’re not pitting your judgment against theirs. You’re bringing external evidence.
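
The analysis behind that kind of claim doesn’t have to be fancy. Here’s a minimal sketch of how you might compute the web/mobile split from a raw analytics export, assuming a hypothetical events.csv with a platform column (the file and column names are invented for illustration):

```python
import csv
from collections import Counter

# Minimal sketch: count traffic by platform from a hypothetical analytics export.
# Assumes events.csv has a "platform" column with values like "web" or "mobile".
counts = Counter()
with open("events.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row["platform"]] += 1

total = sum(counts.values())
for platform, n in counts.most_common():
    print(f"{platform}: {n} events ({n / total:.1%})")
```

A concrete number like “87% of events are web” is much harder to wave away than “I think most of our users are on web.”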

Strategy 5: Find Allies

The practice: If you’re uncomfortable pushing back alone, find others who share your concerns.

Example:

You notice a decision seems wrong. You’re junior. You’re hesitant to speak up.

Approach: Talk to peers or other seniors you trust. If they share your concerns, you can raise it together or they can raise it.

Why this works: Multiple voices are harder to dismiss than one. And senior allies can challenge authority without the same career risk.

Strategy 6: Use Hypotheticals and Questions

The practice: Raise concerns indirectly through questions and scenarios.

Instead of: “This will fail.”

Try: “What happens if X goes wrong?” or “Have we considered the scenario where Y?”

Example:

CEO wants to launch without testing.

Instead of: “We shouldn’t launch. This is reckless.”

Try: “What’s our rollback plan if we discover critical bugs in production?” or “What’s the worst-case scenario if this breaks?”

This raises concerns without direct confrontation.

Strategy 7: Appeal to Shared Goals

The practice: Frame your concern in terms of goals the authority cares about.

Example:

VP wants to ship fast. You have quality concerns.

Instead of: “We need more time for quality.”

Try: “I share the goal of shipping fast. My concern is that if we ship with these bugs, we’ll spend three weeks doing customer support and hotfixes, which will actually delay our next release. Would it make sense to spend two more days on testing to avoid that?”

Why this works: You’re aligned on goals. You’re just proposing a better path to achieve them.

Strategy 8: Document Your Concerns

The practice: If you’re being overruled, document your concerns in writing.

Example:

You’re told to deploy broken code. You do it (because sometimes you have to). But first, you send an email:

“Per our conversation, I’m deploying the update as requested. For the record, I want to note my concerns:

  • The tests are failing on edge case X
  • This could cause data corruption for scenario Y
  • My recommendation was to wait until Z is fixed

I’ll proceed with the deployment as directed.”

Why this works:

  • Creates a record
  • Makes your position clear
  • Protects you if things go wrong
  • Sometimes, putting it in writing makes the authority reconsider

The Balance: When to Defer vs. When to Push Back

Authority bias is a bias, but expertise is real. How do you know when to defer and when to push back?

Defer When:

1. They Have Relevant Expertise

Deferring to a security expert on security issues makes sense. Deferring to a marketing person on technical architecture doesn’t.

2. You Lack Context

They might have information you don’t. Asking questions is good. Assuming you’re right despite lacking context is arrogance.

3. The Decision is Reversible

If you can easily undo it, deferring for the sake of speed/team cohesion might be worth it.

4. The Stakes Are Low

Pick your battles. Not every decision needs pushback.

Push Back When:

1. Safety, Security, or Ethics Are at Risk

If someone is telling you to do something dangerous, unethical, or illegal, push back. Hard.

2. They’re Outside Their Domain of Expertise

CEO has opinions on technical architecture? Those opinions aren’t automatically more valid than an engineer’s.

3. The Decision is Expensive to Reverse

Database choice, architecture decisions, major rewrites—these are hard to undo. Worth pushing back if you have concerns.

4. You Have Evidence They’re Wrong

Data, past experience, user research—if you have evidence, share it.

5. Their Incentives Don’t Align with Truth

Sometimes authorities are motivated by politics, ego, or covering mistakes. Be aware when their incentives might bias their judgment.

Advanced Technique: The “Steel Man” Approach

When you disagree with an authority, steel man their argument:

Steel manning: Construct the strongest possible version of their argument before you critique it.

Why this works:

  • Shows you’re engaging in good faith
  • Forces you to actually understand their position
  • Often, you’ll realize they have better reasons than you thought
  • When you do disagree, your disagreement is more credible

Example:

Authority says: “We should use microservices.”

Steel man: “I understand the reasoning for microservices: independent scaling, team autonomy, tech stack flexibility, and fault isolation. For a large organization with multiple teams, these benefits can outweigh the complexity costs.”

Then push back: “Given that we’re a 4-person team building a relatively simple system, and we don’t currently need those benefits, would it make sense to start with a monolith and split into microservices later if needed?”

You’ve shown you understand the argument. Now your alternative is more credible.

Personal Stories: When I Failed to Push Back

The Database Decision I Regretted

Early in my career, a senior architect decided we should use Cassandra for a new project.

I had concerns. Our queries were relational. Our team didn’t know Cassandra. We didn’t need the scale.

But I didn’t push back because: “They’re the architect. They must know better than me.”

We spent 9 months fighting with Cassandra. Queries that would’ve been simple in Postgres required complex workarounds. We eventually migrated to Postgres.

Lesson: I had valid concerns. I should’ve voiced them, even to a senior person.

The Feature I Knew Would Fail

A product manager insisted we build a feature. User research showed lukewarm interest. The implementation was complex. ROI seemed low.

I raised concerns once. They said “Let’s build it and see.”

I didn’t push back harder because: “They’re the product expert. Maybe they see something I don’t.”

We built it. Three months of work. Adoption was 1%. It was deprecated six months later.

Lesson: I had data. I should’ve pushed harder.

The Deployment I Should’ve Refused

This is the story from the opening. VP told me to deploy broken code.

I knew it would break. I said so. They insisted. I complied.

What I should’ve done:

  • Documented my concerns in writing
  • Asked for explicit written confirmation of the order
  • Escalated to my manager
  • Been willing to refuse

Lesson: When the stakes are high (customer data), push back harder. Be willing to escalate. Be willing to say no.

When Authority is Right (Even If You Disagree)

Important caveat: Sometimes authority figures are right and you’re wrong.

Signs you might be wrong:

1. Multiple Experts Agree

If several experienced people with different perspectives all agree, and you disagree, you might be missing something.

2. You Lack Domain Knowledge

If you’re a backend engineer disagreeing with a UX designer about user experience, maybe defer.

3. They Have Information You Don’t

Sometimes leaders have context that explains their decisions. Worth asking about.

4. Your Reasoning is Emotional

If your objection is “I just don’t like it” rather than “here’s why it’s wrong,” you might be wrong.

The key: Push back enough to understand. If they have good reasons, update your beliefs.

Building a Culture That Resists Authority Bias

If you’re a leader, you can create environments where authority bias is less prevalent:

1. Reward Dissent

When someone challenges your decision, thank them publicly.

“I really appreciate [person] pushing back on this. This is exactly the kind of critical thinking we need.”

2. Admit When You’re Wrong

Change your mind publicly when presented with better evidence.

“I thought we should do X. After hearing [person]’s concerns and seeing the data, I’ve changed my mind. We’re doing Y.”

3. Ask for Disagreement

“I think we should do X. Who disagrees? I want to hear the strongest arguments against this.”

4. Create Explicit Permission to Disagree

“If you think I’m wrong about something, I want you to tell me. I won’t be offended. I want to make the best decision, not be right.”

5. Implement Decision-Making Frameworks

Use structured decision-making that evaluates arguments on merit, not on who proposes them.
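
One lightweight option is a weighted decision matrix: the team agrees on criteria and weights first, scores each option against them, and only then looks at who proposed what. Here’s a minimal sketch in Python (the criteria, weights, and scores are invented for illustration):

```python
# Minimal sketch of a weighted decision matrix.
# Criteria, weights, and all scores below are invented for illustration.
criteria_weights = {"fits_team_size": 0.4, "operational_cost": 0.3, "time_to_ship": 0.3}

# Each option is scored 1-5 per criterion, with no record of who proposed it.
options = {
    "monolith":      {"fits_team_size": 5, "operational_cost": 4, "time_to_ship": 5},
    "microservices": {"fits_team_size": 2, "operational_cost": 2, "time_to_ship": 3},
}

def weighted_score(scores):
    # Sum of (weight x score) across the agreed criteria.
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank options by score; the argument is the criteria, not the proposer's title.
for name, scores in sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The point isn’t the arithmetic. It’s that the discussion centers on criteria everyone agreed to in advance, not on whose title is attached to which option.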

6. Rotate Authority

Have junior people run meetings, make decisions, present to leadership. This breaks the automatic deference pattern.

Final Thoughts: Question Authority (Respectfully)

Here’s what I’ve learned:

Authority is not the same as correctness.

People in positions of power are often experienced, often knowledgeable, often right. But not always. And blind deference leads to disasters.

The Milgram experiment’s lesson isn’t “authority is bad.”

It’s that ordinary people, when faced with authority, will do things they know are wrong.

The antidote isn’t rebellion for rebellion’s sake. It’s critical thinking. It’s evaluating arguments on their merits. It’s having the courage to speak up when something seems wrong.

In your career, you’ll face moments where authority and your judgment conflict.

A senior engineer will propose something you think is wrong. A manager will ask you to do something you think is unethical. A CEO will make a decision you think will harm the company.

In those moments, remember:

  • You might be wrong (stay humble)
  • But you might be right (stay brave)
  • Your job is to think critically, not obey blindly
  • Good leaders want pushback, not yes-men
  • Your voice matters, even if you’re junior

The next time someone says “do this because I’m the [senior/VP/CEO],” pause.

Ask yourself: “Do I understand why? Do I agree with the reasoning? If I didn’t know their title, would this still make sense?”

And if the answer is no, find a respectful way to push back.

Because blind obedience isn’t loyalty. It’s how good people enable bad decisions.

And we can do better.


Have you experienced authority bias in your work? When have you pushed back on authority? When do you wish you had? I’d love to hear your stories.