The Laws of Memetics v0.1

This is my attempt at formalizing a set of laws that govern most of what we understand about how and why memes spread. I've tried to express each one as succinctly and understandably as possible without losing signal.


The Substrate

Law Zero: Scarcity

Attention is finite and zero-sum. Every meme that wins attention takes it from another.

You can't think about two conflicting ideas at the same time. This means ideas compete. When one wins your attention, others lose it.


The Fundamental Laws

First Law: Triangulated Desire

Memes spread through mimetic desire: we want what others we admire or envy want. The model, not the object, drives transmission.

We don't believe things because they're true. We believe them because people we want to be like believe them. The same claim gets opposite reactions depending on who says it. It was never about the claim.

Second Law: Arousal as Energy

Emotional arousal is the transmission energy of memes. High-arousal states (anger, awe, fear, disgust) accelerate spread; low-arousal states (nuance, sadness, contentment) slow it down.

This is why outrage goes viral and corrections don't. When you're angry, you share. When you're contemplative, you don't. Arousal is the fuel.

Third Law: Integration Lock

Once a meme becomes part of your identity or your explanatory worldview, evidence stops mattering. Attack the meme, and you attack the person.

Two ways this happens:

  1. The belief becomes load-bearing: people defend it even when they know the correction is true.
  2. Persecution makes it worse: being attacked for a belief makes you hold it tighter.


The Derived Laws

Fourth Law: Correction Asymmetry

Retractions inherit all the disadvantages the original claim avoided, and face active resistance from people who've already integrated the claim.

The original claim had novelty and emotional punch, and it made you look smart for sharing it. The correction? You already know the claim (boring), nuance doesn't get people fired up (low energy), and sharing it either means "I was fooled" or "I'm being pedantic" (status cost). Plus it threatens beliefs people have already woven into their identity. The game is rigged from the start.

Fifth Law: Refraction

When memes cross from one group to another, they change. The further the groups are from each other, the more the meme transforms.

Ideas get simpler as they travel. Ambiguity gets resolved in favor of what the new group already believes. The meme picks up tribal markers: it becomes "our" version versus "their" version. Keeping an idea intact across group boundaries takes serious effort (institutions, sacred texts, formal training). Without that effort, drift is inevitable.


The Regime Laws

Sixth Law: The Utility Exception

Ideas that spread peer-to-peer are selected for desirability. Ideas passed down through generations are selected for usefulness. When these conflict, desirability wins in the short term, but usefulness wins over time.

Conspiracy theories spread faster than hand-washing advice. But hand-washing persists across centuries because ideas that kill your kids don't get passed down. The problem: modern technology massively amplifies peer-to-peer spread while weakening generational transmission. We've tilted the field toward virality and away from truth.

Seventh Law: Institutional Capture

Once a meme gets written into law, curriculum, or official policy, it no longer needs to spread on its own merits. It replicates through compliance.

This creates zombie ideas: beliefs that would lose in open competition but persist because the institution requires them. Academic theories no one believes but everyone cites. Regulations no one defends but no one repeals. Corporate practices that survive every reorganization. These ideas aren't alive. They're undead, animated by enforcement rather than desire.


The Mnemonic

  0. Attention is scarce
  1. We want what models want
  2. Arousal drives spread
  3. Integration prevents removal
  4. Corrections fail structurally
  5. Meaning drifts across boundaries
  6. Utility competes with desire (and loses short-term)
  7. Enforcement escapes selection (and calcifies)

What This Explains


What This Predicts

  1. Pre-bunking beats debunking. Getting there first with good information works better than correcting bad information after it's integrated.

  2. Model trumps message. You can better predict what people will believe by knowing who said it than by knowing what was said.

  3. Corrections can work if they satisfy the same needs. A correction from a trusted source, with an alternative explanation, that doesn't threaten identity, can actually displace a false belief. It just has to do everything the original did, plus be true.

  4. Expert immunity is domain-specific. Being an expert protects you in your field. Everywhere else, you're just as vulnerable as anyone. Brilliant physicists fall for diet fads.

  5. Institutional collapse triggers rapid idea change. When the enforcement mechanisms break down, zombie ideas suddenly have to compete again. This is when paradigm shifts happen.
