In 1961, Yale psychologist Stanley Milgram placed an ad in a New Haven newspaper: “We will pay you $4.00 for one hour of your time.”

Participants arrived at Yale’s psychology lab, believing they were taking part in a study about memory and learning.

They were told they would be the “teacher.” Another participant (actually an actor) would be the “learner.”

The teacher’s job: deliver electric shocks to the learner every time they answered a question incorrectly.

The shocks started at 15 volts and increased in 15-volt increments, up to 450 volts—marked “XXX” and potentially lethal.

Milgram wanted to answer a question haunting post-Holocaust psychology:

“Would ordinary people follow orders to hurt an innocent person?”

The answer was terrifying.

The Setup

Participants watched as the “learner” (the actor) was strapped into a chair with electrodes attached to their wrists.

The teacher sat in front of an imposing shock generator with 30 switches, rising in 15-volt steps and grouped under increasingly ominous labels:

  • 15V: “Slight Shock”
  • 135V: “Strong Shock”
  • 255V: “Intense Shock”
  • 375V: “Danger: Severe Shock”
  • 435-450V: “XXX”

The experiment began. The learner answered questions. When they got one wrong, the teacher pressed a button to deliver a shock.

With each wrong answer, the voltage increased.

What Happened

At 75 Volts

The learner grunted in discomfort.

At 120 Volts

The learner shouted that the shocks were becoming painful.

At 150 Volts

The learner demanded to be released: “Get me out of here! I refuse to go on!”

The teacher hesitated. They looked at the experimenter, a man in a gray lab coat.

The experimenter calmly said:

  • “Please continue.”
  • “The experiment requires that you continue.”
  • “It is absolutely essential that you continue.”
  • “You have no other choice, you must go on.”

At 300 Volts

The learner pounded on the wall and screamed in agony.

At 330 Volts

The learner stopped responding entirely. Silence.

The teachers were visibly distressed. They sweated, trembled, argued with the experimenter. Some laughed nervously—a stress response.

But when the man in the lab coat said, “Continue,” most did.

At 450 Volts

The maximum. The “XXX” zone. Beyond the “Danger: Severe Shock” label.

Silence from the learner’s room.

The Results

Before running the study, Milgram asked psychiatrists and colleagues to predict how many participants would go all the way to 450 volts.

Their consensus: only 1-2%. Only sadists and psychopaths would deliver the maximum shock.

The actual result?

65% of participants delivered the full 450 volts.

100% of participants continued to at least 300 volts, the very top of the “Intense Shock” range.

These weren’t monsters. They were ordinary people:

  • Teachers
  • Engineers
  • Salespeople
  • Postal workers

Two out of every three people were willing to potentially kill a stranger because someone in a lab coat told them to.

Why Did They Obey?

Milgram identified key factors that drove obedience:

1. Authority

The experimenter wore a lab coat, represented Yale, and spoke with confidence. Authority figures command obedience.

2. Gradual Escalation

It started at 15 volts. Each increase was small. By the time they reached dangerous levels, they were already committed.

3. Diffusion of Responsibility

“I’m just following orders. The experimenter is responsible, not me.”

4. Institutional Legitimacy

Yale University. A scientific study. Surely they wouldn’t allow harm to come to someone.

5. Physical Distance

The teacher couldn’t see the learner’s face. It’s easier to hurt someone you can’t see.

6. Lack of a Clear Exit

No one explicitly said, “You can leave.” The social pressure to continue was immense.

Variations on the Experiment

Milgram ran variations to test what increased or decreased obedience:

When the experimenter left the room and gave orders by phone:

  • Obedience dropped to 21%

When the teacher had to physically hold the learner’s hand on a shock plate:

  • Obedience dropped to 30%

When two experimenters gave conflicting orders:

  • Obedience dropped to 0% (participants used the conflict as an excuse to stop)

When the experiment was moved from Yale to a run-down office building:

  • Obedience dropped to 48% (still disturbingly high)

When another “teacher” (actor) refused to continue:

  • Obedience dropped to 10%

The lesson: Authority, distance, and lack of dissenting voices dramatically increase compliance.

The Ethical Controversy

Milgram’s experiment was groundbreaking—and deeply unethical by today’s standards.

Participants believed they were harming someone. They experienced severe stress:

  • Trembling
  • Sweating
  • Stuttering
  • Nervous laughter
  • A few had full-blown seizures

Milgram himself wrote: “I observed a mature and initially poised businessman enter the laboratory smiling and confident. Within 20 minutes he was reduced to a twitching, stuttering wreck.”

Debriefing was delayed and often incomplete; some participants did not learn the full truth for months. Many struggled with the realization that they had been willing to potentially kill someone.

This experiment could not be conducted today. Modern ethics boards would never approve it.

In the Real World

The Milgram Experiment helps explain real-world atrocities:

The Holocaust

Milgram was Jewish and conducted the study in response to Adolf Eichmann’s trial. Eichmann orchestrated the deportation of millions of Jews to death camps.

His defense? “I was just following orders.”

Milgram wanted to test if ordinary people really would follow immoral orders. His experiment confirmed it.

My Lai Massacre (1968)

U.S. soldiers killed hundreds of unarmed Vietnamese civilians—men, women, children—during the Vietnam War.

When asked why, soldiers said: “I was following orders.”

Corporate Fraud

Enron, Wells Fargo, Theranos—employees carried out unethical or illegal actions because leadership ordered it.

Medical Experiments

Tuskegee Syphilis Study, Nazi medical experiments—doctors and nurses participated because authority figures sanctioned it.

In Tech and Software

The Milgram dynamic appears constantly in technology:

Unethical Product Features

“Build the Dark Pattern”

PM: "Add this manipulative subscription flow."
Engineer: "This tricks users into paying."
PM: "Leadership approved it. Just build it."
Engineer: *builds it*

Authority diffuses responsibility. “I’m just the developer.”
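
To make the dynamic concrete, here is a minimal, hypothetical sketch of what such a flow looks like in code. Every identifier is invented for illustration; no real product’s code is being quoted:

interface CheckoutOptions {
  autoRenew: boolean;      // silently re-bills the user every cycle
  trialDays: number;       // "free" trial that converts without warning
  showCancelLink: boolean; // whether cancellation is even visible
}

// The dark pattern: defaults chosen against the user's interest.
// Nothing here is a bug. Each value is a deliberate product decision
// that some engineer had to type out, review, and ship.
const darkDefaults: CheckoutOptions = {
  autoRenew: true,       // pre-checked; most users never notice
  trialDays: 7,          // converts to paid on day 8, with no reminder
  showCancelLink: false, // cancellation exists, but only via support
};

// The honest version differs by a few values, not by architecture.
const honestDefaults: CheckoutOptions = {
  autoRenew: false,
  trialDays: 7,
  showCancelLink: true,
};

console.log(darkDefaults, honestDefaults);

The point of the sketch: the manipulative flow and the honest flow are the same code. The difference is a handful of values that a specific person chose, which is exactly why “I’m just the developer” doesn’t hold.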

Surveillance and Privacy Violations

“Log Everything”

Manager: "We need to track every user action."
Engineer: "That's invasive and probably illegal."
Manager: "Legal approved it. Implement it."
Engineer: *implements it*

The engineer becomes the executioner of privacy.
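
For contrast, here is a hypothetical sketch of the two designs the engineer is choosing between: indiscriminate tracking versus consent-gated, minimized logging. The function names and event shape are invented for illustration:

interface TrackedEvent {
  name: string;
  userId?: string;
  payload: Record<string, unknown>;
}

// "Log everything": every action, tied to a user ID, kept forever.
function trackEverything(event: TrackedEvent): void {
  sendToWarehouse(event);
}

// The version the engineer could push for instead: no consent, no record,
// and direct identifiers dropped by default.
function trackWithConsent(event: TrackedEvent, hasConsent: boolean): void {
  if (!hasConsent) return;
  const { name, payload } = event; // deliberately discard userId
  sendToWarehouse({ name, payload });
}

// Stand-in for a real analytics pipeline.
function sendToWarehouse(event: Partial<TrackedEvent>): void {
  console.log("stored:", JSON.stringify(event));
}

Both functions take the same few minutes to write. Which one ships is a decision, not an inevitability.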

Ignoring Security Vulnerabilities

“Ship It Anyway”

Security team: "This has a critical vulnerability."
Leadership: "We need to hit the deadline. Ship it."
Team: *ships it*

Authority overrides safety.
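
One countermeasure is to encode the safety rule somewhere a deadline can’t quietly override it. Below is a hypothetical release gate; the vulnerability-report format is invented, and real scanners differ in detail:

interface Finding {
  id: string;
  severity: "low" | "medium" | "high" | "critical";
}

// A policy written down in code is harder to wave away in a hallway
// conversation than a policy that lives only in someone's judgment.
function releaseAllowed(findings: Finding[]): boolean {
  const blockers = findings.filter((f) => f.severity === "critical");
  for (const f of blockers) {
    console.error(`release blocked by critical finding ${f.id}`);
  }
  return blockers.length === 0;
}

// Usage: fail the pipeline instead of leaving the call to whoever
// is under the most deadline pressure that day.
const report: Finding[] = [{ id: "EXAMPLE-001", severity: "critical" }];
if (!releaseAllowed(report)) {
  throw new Error("release blocked: fix critical findings first");
}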

Manipulative AI

“Make the Algorithm Addictive”

Data scientist: "This algorithm maximizes engagement by exploiting anxiety."
Leadership: "Engagement is our KPI. Deploy it."
Data scientist: *deploys it*

Facebook, TikTok, Twitter: engagement algorithms built to manipulate, deployed because someone in authority said so.
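
Reduced to a toy example, the ethical question often lives in a single term of the objective. The weights and field names below are invented; no real platform’s ranking function is public:

interface Candidate {
  predictedEngagement: number; // e.g., probability of a click or rewatch
  predictedRegret: number;     // e.g., probability the user later mutes or reports it
}

// "Engagement is our KPI," reduced to one line.
const scoreForKpi = (c: Candidate): number => c.predictedEngagement;

// The same ranker with a harm penalty. Whether this second weight is
// allowed to be non-zero is precisely the argument the data scientist lost.
const scoreWithGuardrail = (c: Candidate): number =>
  c.predictedEngagement - 2.0 * c.predictedRegret;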

Layoffs and Cruelty

“Fire Them All by Email”

HR: "This is brutal. People deserve better."
Exec: "Do it this way. It's more efficient."
HR: *sends mass termination emails*

The “Just Following Orders” Defense in Tech

After the Cambridge Analytica scandal, Facebook engineers said: “I was just doing my job.”

After Uber’s toxic culture was exposed, employees said: “Leadership set the tone.”

After crypto scams, developers said: “I just wrote the smart contracts.”

The Milgram Experiment shows why this defense comes so easily, and why it is so dangerous.

How to Resist Unethical Orders

1. Recognize the Pattern

If you’re uncomfortable, ask yourself:

  • “Am I doing this because it’s right, or because someone told me to?”
  • “Would I do this if no one was watching?”

2. Find Allies

In Milgram’s variations, obedience dropped to 10% when another person refused. Be that person. And find others who agree.

3. Question Authority

Authority isn’t always right. Challenge it by asking:

  • “Why are we doing this?”
  • “What are the consequences?”
  • “Is this ethical?”

4. Close the Distance to the Harm

In Milgram’s experiment, obedience dropped when teachers saw the learner’s face.

In tech: See the users you’re harming. Read the support tickets. Watch the user interviews. Make the harm real.

5. Establish Your Own Moral Line

Decide in advance what you won’t do, regardless of who orders it:

  • “I won’t build features designed to manipulate.”
  • “I won’t ignore security vulnerabilities.”
  • “I won’t deploy harmful algorithms.”

6. Exit if Necessary

Sometimes the only ethical choice is to refuse and leave.

What Engineers Should Learn

1. You Are Responsible

“I was just following orders” didn’t work at Nuremberg. It won’t work for you.

2. Code Has Consequences

Your code can:

  • Manipulate users
  • Violate privacy
  • Enable fraud
  • Cause harm

You’re not just a “neutral implementer.”

3. Authority Isn’t Ethics

Your manager, your CEO, your investors—none of them absolve you of moral responsibility.

4. Dissent Is Necessary

If something feels wrong, say so. You might be the only one willing to speak up.

5. Build Ethical Muscle

Practice saying “no” to small unethical requests. It makes it easier to refuse big ones.

The Uncomfortable Truth

Most of us would press the button.

It’s easy to say, “I would never obey an immoral order.” Milgram’s results suggest otherwise.

65% of people—educated, normal, moral people—delivered potentially lethal shocks.

The real question isn’t whether you’re immune to authority. You’re not.

The question is:

“What am I doing now to prepare for the moment when authority asks me to do something wrong?”

Key Takeaways

  • ✅ Ordinary people obey authority even when it means harming others
  • ✅ Gradual escalation makes immoral actions easier to justify
  • ✅ Institutional legitimacy increases obedience
  • ✅ Distance from victims makes cruelty easier
  • ✅ Dissenting voices dramatically reduce obedience
  • ✅ “Just following orders” is real—and not an excuse

Stanley Milgram wanted to understand how the Holocaust happened.

He discovered the answer wasn’t “Germans are uniquely evil.”

The answer was: Any of us, under the right conditions, would become the executioner.

In tech, we’re constantly given “orders” by management, investors, and market pressure.

Build the dark pattern. Ship the buggy code. Ignore the privacy violation. Deploy the manipulative algorithm.

When that moment comes—and it will—will you press the button?

Or will you be the person who refuses?