In the early 1900s, British colonial India faced a venomous problem: too many cobras slithering through the streets of Delhi. The British government, determined to reduce the cobra population, came up with what seemed like a brilliant solution—offer a bounty for every dead cobra brought in.

Initially, the program worked. People killed cobras and collected their rewards. The cobra population appeared to decline. Success!

Or so they thought.

The Twist

Enterprising locals quickly realized they could breed cobras specifically to kill them and collect the bounty. Why hunt dangerous snakes in the wild when you could farm them at home?

When the British discovered this scheme, they were furious. They immediately cancelled the bounty program.

And then things got worse.

With the bounty gone, the cobra breeders had no use for their now-worthless snakes. They released them into the wild. The cobra population exploded beyond its original numbers.

The solution had made the problem significantly worse.

What Is the Cobra Effect?

The Cobra Effect describes any well-intentioned solution that actually makes the problem worse due to perverse incentives or unintended consequences.

It’s named after this incident in colonial India, but the phenomenon appears everywhere:

Modern Examples

1. Rat Tails in Hanoi

  • The French colonial administration in Hanoi offered a bounty per rat tail to reduce the rat population
  • Rat catchers cut off the tails and released the live rats so they could keep breeding
  • Result: More rats, minus their tails

2. Per-Bug Bonuses Gone Wrong

  • Some companies offered payment per bug fixed
  • Developers intentionally introduced bugs to fix later
  • Quality decreased while “productivity” increased

3. Hospital Wait Time Targets

  • UK hospitals were penalized for long A&E (emergency department) wait times
  • Solution: Patients left in ambulances outside, not officially “waiting”
  • Statistics improved, patient care didn’t

4. Welfare Cliffs

  • Benefits cut off sharply at income thresholds
  • People turn down raises or work fewer hours to keep benefits
  • Designed to help people out of poverty, these cliffs trap them in it instead

Why It Happens

The Cobra Effect occurs when:

  1. Incentives are poorly designed - Rewarding outputs instead of outcomes
  2. People game the system - Finding loopholes in the rules
  3. Second-order effects are ignored - Not thinking beyond the immediate impact
  4. Measurements become targets - Goodhart’s Law: “When a measure becomes a target, it ceases to be a good measure”
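
To make the incentive failure concrete, here is a minimal toy model of the bounty story; every number in it is invented purely for illustration:

```python
# Toy model of a perverse incentive: a bounty paid per dead cobra.
# Every number here is invented purely for illustration.

BOUNTY = 10.0        # reward per dead cobra handed in
COST_TO_HUNT = 8.0   # effort to catch one wild cobra
COST_TO_BREED = 2.0  # effort to raise one cobra at home

wild_cobras = 1000   # population before the program
farmed_cobras = 0    # breeding stock built up to farm the bounty

# Phase 1: bounty active. Rational actors take the cheaper path to the reward.
for month in range(12):
    if BOUNTY > COST_TO_BREED and COST_TO_BREED < COST_TO_HUNT:
        farmed_cobras += 200                    # grow the breeding stock, hand in the surplus
    else:
        wild_cobras = max(0, wild_cobras - 50)  # only then does hunting reduce wild snakes

# Phase 2: bounty cancelled. The breeding stock is worthless and gets released.
wild_cobras += farmed_cobras
farmed_cobras = 0

print(f"Wild cobras after the program: {wild_cobras}")  # more than the original 1000
```

The model is crude, but the shape is the point: as long as producing the rewarded proxy is cheaper than producing the real outcome, the proxy is what you'll get.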

In Software Engineering

The Cobra Effect is rampant in tech:

Lines of Code as Productivity Metric

Engineers write bloated, verbose code instead of elegant solutions
Result: More code, worse quality
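
A toy illustration (both snippets are invented): the two functions below behave identically, but the padded one scores several times higher on a lines-of-code metric:

```python
# Concise version: does the job in two lines.
def total_price(prices, tax_rate):
    return sum(prices) * (1 + tax_rate)

# Padded version: identical behavior, written to look "productive" by line count.
def total_price_padded(prices, tax_rate):
    subtotal = 0
    for price in prices:
        current_price = price
        subtotal = subtotal + current_price
    tax_amount = subtotal * tax_rate
    grand_total = subtotal + tax_amount
    final_result = grand_total
    return final_result
```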

Closing Bug Tickets

Bugs marked "cannot reproduce" or "working as intended" to improve metrics
Result: Real issues persist, numbers look good

Code Review Metrics

Measuring reviews by speed or number approved
Result: Rubber-stamping increases, thoroughness decreases

Sprint Velocity Optimization

Teams inflate story point estimates to show "improvement"
Result: Meaningless metrics, no actual productivity gain

How to Avoid the Cobra Effect

1. Think Second-Order

Don’t just ask “Will this work?” Ask “How might people game this?” and “What happens next?”

2. Align Incentives with Outcomes

Reward the result you want, not the activity that supposedly produces it.

  • ❌ Bad: Pay per line of code
  • ✅ Good: Reward working, maintainable features

3. Monitor for Gaming

Track not just your primary metric, but related indicators that reveal gaming.
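
As a sketch of what that can look like in practice (the metric names, pairings, and threshold below are hypothetical, not a prescription), pair each headline metric with a guardrail metric and flag divergence:

```python
# Hypothetical sketch: pair each headline metric with a guardrail metric that
# would suffer if the headline number were being gamed, then flag divergence.

METRIC_PAIRS = {
    # headline (higher looks better)   guardrail (higher means trouble)
    "bugs_closed_per_week":            "bugs_reopened_per_week",
    "reviews_approved_per_week":       "post_merge_defect_rate",
    "story_points_completed":          "escaped_defects_per_release",
}

def flag_possible_gaming(current: dict, previous: dict, threshold: float = 0.2) -> list:
    """Return pairs where the headline metric jumped while its guardrail also worsened."""
    warnings = []
    for headline, guardrail in METRIC_PAIRS.items():
        headline_change = (current[headline] - previous[headline]) / max(previous[headline], 1e-9)
        guardrail_change = (current[guardrail] - previous[guardrail]) / max(previous[guardrail], 1e-9)
        # A headline metric "improving" while its guardrail degrades is exactly
        # the pattern the Cobra Effect predicts.
        if headline_change > threshold and guardrail_change > threshold:
            warnings.append(f"{headline} is up, but so is {guardrail}: check for gaming")
    return warnings
```

The specific check matters less than the habit: every metric you reward should ship with at least one counter-metric that would expose the easiest way to game it.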

4. Build in Feedback Loops

Create mechanisms to detect when solutions are backfiring—and be willing to change course.

5. Expect the Unexpected

Complex systems produce complex responses. Your solution will have side effects. Plan for iteration.

The Deeper Lesson

The Cobra Effect teaches us that systems fight back. People are creative, adaptive, and self-interested. They respond to incentives in ways you don’t expect.

The British in India learned this the hard way. But we keep making the same mistake—in policy, in business, in software development—because we:

  • Underestimate human ingenuity in finding shortcuts
  • Overestimate our ability to predict behavior
  • Focus on immediate, visible problems while ignoring hidden dynamics

The Programmer’s Perspective

As engineers, we deal with complex systems daily. We should know better than anyone that second-order effects matter.

When you introduce a new metric, a new policy, a new “efficiency improvement”—pause. Ask yourself:

“What happens when clever people try to optimize for this?”

Because they will. And if the incentives are wrong, you’ll end up with more cobras than you started with.

Key Takeaways

  • ✅ Well-intentioned solutions can make problems worse
  • ✅ Incentives shape behavior in unexpected ways
  • ✅ Measure outcomes, not proxies
  • ✅ Gaming the system is human nature—design for it
  • ✅ Complex systems require thinking in second-order effects

The cobra breeders weren’t villains—they were rational actors responding to incentives. The real problem was a system that rewarded the wrong thing.

When you design metrics, policies, or systems—make sure you’re not accidentally breeding cobras.