The Ultimatum Game: Are Humans Really Rational?

Imagine this: I give you $100 and ask you to propose how to split it with a stranger. There’s one catch — if the stranger rejects your offer, neither of you gets anything.

Game theory predicts you’ll offer $1 and keep $99. After all, $1 is better than $0, so the stranger should accept.

In reality? Most people offer $40-50, and offers below $30 are frequently rejected.

Welcome to the Ultimatum Game — the experiment that shattered economists’ assumptions about human rationality.


The Setup

The Ultimatum Game is deceptively simple:

Players:

  • Proposer: Receives a sum of money (say $100) and proposes how to split it
  • Responder: Can accept or reject the proposed split

Rules:

  1. Proposer suggests a split (e.g., $70 for me, $30 for you)
  2. Responder either accepts (split happens) or rejects (both get $0)
  3. Game ends

One shot: No negotiation, no repetition, complete anonymity.

```mermaid
graph TD
    A[100 to split] --> B[Proposer offers X, 100-X]
    B --> C[Responder decides]
    C -->|Accept| D[Proposer gets X<br/>Responder gets 100-X]
    C -->|Reject| E[Both get 0]
```

The Game Theory Prediction

Let’s use backward induction (from the previous post).

Step 1: Responder’s decision

Suppose Proposer offers $X to Responder.

  • Accept → Get $X
  • Reject → Get $0

For any X > 0, accepting yields $X while rejecting yields $0, so a purely self-interested responder should accept every positive offer.

Step 2: Proposer’s decision

Knowing the responder accepts anything positive, the proposer should offer the smallest possible amount (say $1 or $0.01) and keep the rest.

Subgame Perfect Nash Equilibrium: (Offer $1, Accept)

This is the prediction of perfectly rational, self-interested agents.
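
To see the backward-induction argument end to end, here is a minimal Python sketch (the function names and the $1 granularity are my own, purely illustrative) that solves the game for a purely self-interested responder:

```python
# Backward induction for the one-shot Ultimatum Game with a purely
# self-interested responder. Amounts are whole dollars out of a $100 pot.

POT = 100

def responder_accepts(offer: int) -> bool:
    """Step 1: a self-interested responder accepts any positive offer."""
    return offer > 0

def best_proposal():
    """Step 2: the proposer maximizes their own payoff, anticipating Step 1."""
    best_offer, best_payoff = 0, 0
    for offer in range(POT + 1):          # offer = amount given to the responder
        proposer_payoff = (POT - offer) if responder_accepts(offer) else 0
        if proposer_payoff > best_payoff:
            best_offer, best_payoff = offer, proposer_payoff
    return best_offer, best_payoff

print(best_proposal())  # (1, 99): offer $1, keep $99
```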


What Actually Happens

Thousands of experiments across dozens of cultures show:

Typical Proposer Behavior:

  • Modal offer: 50% (the most common proposal is an equal split)
  • Mean offer: 40-45% (average offers cluster around fairness)
  • Offers below 30%: Rare (only about 10-20% of proposers offer this little)

Typical Responder Behavior:

  • Offers of 40-50%: Almost always accepted (>95%)
  • Offers of 30%: Accepted about 50% of the time
  • Offers below 20%: Frequently rejected (40-60% rejection rate)

```mermaid
graph LR
    A[Offer Distribution] --> B[50%: 35% of proposers]
    A --> C[40-49%: 45% of proposers]
    A --> D[30-39%: 15% of proposers]
    A --> E["<30%: 5% of proposers"]
    F[Rejection Rates] --> G[50% offer: 0-5%]
    F --> H[40% offer: 5-10%]
    F --> I[30% offer: 30-50%]
    F --> J[20% offer: 50-70%]
    F --> K[10% offer: 60-80%]
```

People are systematically rejecting free money to punish unfairness.


Why Do People Reject?

Explanation 1: Fairness Preferences

Humans have a preference for fairness that goes beyond pure monetary self-interest.

Responders aren’t just maximizing money — they’re maximizing:

  • Money received
  • Minus the disutility from accepting an unfair offer

This can be modeled as:

Utility = Own payoff - α × (Unfairness penalty)

Where unfairness is measured by the difference between what you get and what’s “fair.”

If the unfairness penalty is large enough, rejecting becomes rational even though it costs you money.
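
Here is a rough Python sketch of that idea, in the spirit of inequity-aversion models. The specific α value and the definition of the "fair" share as an equal split are assumptions for illustration, not parameters from any particular study:

```python
# Fairness-sensitive responder: utility = payoff - alpha * unfairness,
# where unfairness is how far the offer falls below an equal split.

def responder_utility(offer: float, pot: float = 100, alpha: float = 1.5) -> float:
    fair_share = pot / 2
    unfairness = max(fair_share - offer, 0)   # only disadvantageous splits hurt
    return offer - alpha * unfairness

def responds(offer: float) -> str:
    # Rejecting yields utility 0, so accept only if utility is strictly positive.
    return "accept" if responder_utility(offer) > 0 else "reject"

for offer in (50, 40, 30, 20, 10):
    print(offer, responds(offer))
# With alpha = 1.5, offers of $30 and below are rejected:
# utility(30) = 30 - 1.5 * 20 = 0, utility(20) = 20 - 1.5 * 30 = -25
```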

Explanation 2: Negative Reciprocity

Humans have a strong urge to punish perceived violations of norms, even at a cost to themselves.

This is called altruistic punishment — you pay to punish someone who wronged you, even though you gain nothing materially.

Evolutionary perspective: This trait may have evolved because it enforces cooperation in groups. Societies with punishers thrive; free-riders get punished.

Explanation 3: Emotions (Anger)

Brain imaging studies show that unfair offers activate the anterior insula — a brain region associated with disgust and anger.

When the emotional response (anger at unfairness) is stronger than the rational calculation (money is better than nothing), people reject.

“I’d rather we both get nothing than let you take advantage of me.”

```mermaid
graph TD
    A[Unfair Offer] --> B[Emotional Response: Anger/Disgust]
    A --> C["Rational Calculation: Money > 0"]
    B --> D{Which is stronger?}
    C --> D
    D -->|Emotion wins| E[REJECT]
    D -->|Reason wins| F[ACCEPT]
    G[Cultural Norms] -.->|Shape| D
    H[Brain: Anterior Insula] -.->|Activated by| B
```

Cross-Cultural Variations

Fascinating finding: Fairness norms vary across cultures.

Small-scale societies study (Henrich et al., 2001):

| Society | Mean Offer | Description |
|---------|------------|-------------|
| Machiguenga (Peru) | 26% | Highly self-sufficient, little cooperation beyond family |
| U.S. Students | 44% | WEIRD (Western, Educated, Industrialized, Rich, Democratic) |
| Ache (Paraguay) | 51% | Extensive food sharing norms |
| Lamalera (Indonesia) | 58% | Whale hunters with strict sharing norms |

Key insight: Societies with stronger cooperation norms and market integration make fairer offers.

Implication: Fairness isn’t universal — it’s learned through culture and social structures.


The Dictator Game: Isolating Fairness

To separate “fear of rejection” from “fairness preferences,” researchers created the Dictator Game:

  • Proposer splits the money
  • Responder has no choice — must accept

Prediction if people are selfish: Give $0

Reality:

  • Mean offers: 20-30% (lower than Ultimatum, but still positive)
  • About 30-40% give nothing
  • About 20% give half

Conclusion: People have genuine fairness preferences, not just fear of rejection. But the fear of rejection in the Ultimatum Game does push offers higher.

```mermaid
graph LR
    A[Ultimatum Game: 40-45% average] --> B[Fear of rejection]
    A --> C[Fairness preference]
    D[Dictator Game: 20-30% average] --> C
    B --> E[Difference ≈ 15-20%]
    E --> F[Strategic generosity]
    C --> G[True fairness preference]
```

Variations That Test Rationality

1. Ultimatum Game with High Stakes

Question: Do people still reject when it’s $1,000 or $10,000?

Result: Rejection rates decrease slightly, but still occur. Even with life-changing sums, some people reject offers they consider unfair.

Conclusion: Fairness concerns aren’t trivially overcome by higher stakes.


2. Responder Competition

Setup: Multiple responders compete; if one rejects, the proposer can offer the same split to another.

Result: Offers drop significantly. Responders have less leverage, so proposers offer less.

Insight: Market competition reduces fairness — just as economic theory predicts!


3. Proposer Earned the Money

Setup: Proposer “earns” the money by performing a task before splitting.

Result: Offers decrease. Proposers feel more entitled to keep a larger share if they earned it.

Insight: Context and entitlement matter for fairness judgments.


What Does This Mean for Economics?

The Ultimatum Game (and related experiments) sparked the field of behavioral economics, which challenges traditional assumptions:

Traditional Assumption:

  • Humans are rational (maximize own payoff)
  • Humans are selfish (don’t care about others’ payoffs)
  • Preferences are stable and context-independent

Behavioral Reality:

  • Humans care about fairness
  • Humans engage in altruistic punishment
  • Preferences are context-dependent (framing, entitlement, social norms)
  • Emotions matter (anger, disgust, pride)

Implication for policy: Models based on pure self-interest may fail to predict behavior. Fairness, reciprocity, and social norms must be incorporated.


Brain Science: The Rational vs Emotional Battle

fMRI studies reveal the neural underpinnings:

  • Unfair offers activate: Anterior insula (disgust/anger) and dorsolateral prefrontal cortex (reasoning)
  • Acceptance predicted by: Higher prefrontal cortex activity (reasoning wins)
  • Rejection predicted by: Higher insula activity (emotion wins)

Damage to prefrontal cortex → More rejections (less ability to override emotional impulse)

This suggests: Accepting unfair offers requires cognitive control to suppress the emotional desire to punish.

```mermaid
graph TD
    A[Unfair Offer] --> B[Anterior Insula: Disgust/Anger]
    A --> C[Dorsolateral Prefrontal Cortex: Reasoning]
    B -->|Emotional impulse| D[Reject!]
    C -->|Cognitive control| E[Accept for the money]
    F{Decision} --> D
    F --> E
    G[Prefrontal Damage] -.->|Weakens| C
    G -.->|Leads to| D
```

Strategic Proposers: Anticipating Rejections

Smart proposers understand responder psychology.

If proposers know that offers below 30% are often rejected:

  • Offering 20% → Expected value = 0.3 × $80 = $24 (if 70% rejection)
  • Offering 40% → Expected value = 0.95 × $60 = $57 (if 5% rejection)

Offering 40% is strategically optimal given responder fairness preferences!

Lesson: Rational play isn’t about maximizing immediate gain — it’s about maximizing expected gain given how others actually behave.
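
A small Python sketch makes the comparison explicit. The acceptance probabilities below are illustrative guesses chosen to match the two examples above, not measured rates:

```python
# Expected-payoff comparison for a proposer who anticipates fairness-driven rejections.
# Acceptance probabilities are illustrative assumptions, not experimental data.

POT = 100
acceptance_prob = {10: 0.30, 20: 0.30, 30: 0.60, 40: 0.95, 50: 0.98}

def expected_payoff(offer: int) -> float:
    return acceptance_prob[offer] * (POT - offer)

for offer in sorted(acceptance_prob):
    print(f"offer {offer}: expected payoff = {expected_payoff(offer):.1f}")
# offer 20: 0.30 * 80 = 24.0, offer 40: 0.95 * 60 = 57.0  -> offering 40 wins
```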


Implications for the Real World

1. Negotiations

Don’t open with an extreme lowball. The other side may walk away out of principle, even if your offer is better than their alternative.

Example: Salary negotiations, labor disputes, international treaties.

2. Business Relationships

Fair treatment matters for long-term relationships. Customers may boycott or leave negative reviews if they feel exploited, even when switching costs money.

3. Public Policy

Policies perceived as unfair face resistance, even if they’re economically efficient. Political legitimacy requires perceived fairness.

Punishment isn’t just about deterrence — people demand that wrongdoers be punished, even when it’s costly (prison expenses).


Critiques and Limitations

Critique 1: Lab Experiments ≠ Real Life

Counter: Field experiments show similar results. Fairness concerns persist outside the lab.

Critique 2: Cultural WEIRD Bias

Counter: Cross-cultural studies show variation, but fairness concerns appear universal (though the threshold varies).

Critique 3: Stakes Are Too Low

Counter: High-stakes replications show similar patterns.

Critique 4: People Learn to Be Selfish

Counter: Experienced participants (repeated games with different partners) still exhibit fairness preferences, though they decrease slightly.


Key Takeaways

  1. Game theory prediction: Proposers offer $1, responders accept → Reality: Offers average 40-45%, low offers are rejected
  2. Humans have fairness preferences that aren’t captured by pure self-interest models
  3. Altruistic punishment: People pay to punish unfairness
  4. Emotions matter: Anger and disgust drive rejections
  5. Culture shapes fairness: Cooperation norms affect ultimatum behavior
  6. Strategic proposers anticipate rejections and offer more to avoid them
  7. Behavioral economics emerged from ultimatum game findings

Practice Problem

You’re playing an Ultimatum Game with $100. You know from previous experiments that:

  • Offers of ≥$40 are accepted 95% of the time
  • Offers of $30 are accepted 50% of the time
  • Offers of $20 are accepted 20% of the time

What offer maximizes your expected payoff?

Solution

Calculate expected values:

Offer $40 (keep $60):

  • EV = 0.95 × $60 = $57

Offer $30 (keep $70):

  • EV = 0.50 × $70 = $35

Offer $20 (keep $80):

  • EV = 0.20 × $80 = $16

(Offering more than $40 is also accepted 95% of the time but leaves you less to keep, so it can't do better.)

Optimal offer: $40, keeping $60 for yourself.

Expected payoff: $57

This shows that anticipating fairness preferences leads to fairer offers even for selfish proposers.


What’s Next?

The Ultimatum Game reveals that humans aren’t the rational calculators economists once assumed. But in some strategic contexts, cold rationality reigns supreme.

Next, we’ll explore Auction Theory — where billions of dollars are won and lost based on mathematical strategy, and where even small mistakes can cost you everything.


This post is part of the Game Theory Series, where we explore the mathematics of strategic decision-making.