

AI Is Not Just Compression. It Is Coherence.

Responding to: “AI Is Compression: Why Human Judgment and Agency Will Matter More Than Coding in 2026”
by Clarencer R. Mercer — Medium, Feb 24, 2026


The phrase “AI is compression” is clever, sticky and timely.

It captures something important about large language models: they distill staggering amounts of human text into a smaller parametric space, and then unfold that compression back into language on demand. They do not remember every sentence; they learn structure.
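The intuition is easy to demonstrate with a toy sketch. The snippet below uses classical `zlib` compression as a stand-in for learned compression — an analogy only, not how LLMs actually work — to show why structured text is radically compressible while structureless noise is not:

```python
import random
import string
import zlib

# Structured text: repeated patterns, a crude proxy for the
# regularities of natural language.
structured = ("The model learns the pattern. " * 200).encode()

# Random text of the same length: no structure to exploit.
random.seed(0)
unstructured = "".join(
    random.choices(string.ascii_letters + " ", k=len(structured))
).encode()

def ratio(data: bytes) -> float:
    """Compressed size as a fraction of original size (lower = more compressible)."""
    return len(zlib.compress(data)) / len(data)

# Structure compresses dramatically; noise barely compresses at all.
print(f"structured:   {ratio(structured):.3f}")
print(f"unstructured: {ratio(unstructured):.3f}")
```

The repeated sentence shrinks to a few percent of its original size, while the random string stays near full size. That gap — structure versus noise — is the whole force of the compression metaphor.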

Clarencer Mercer’s essay makes that case well. If AI is compression, the crucial question becomes: what remains incompressible? Judgment, taste, risk, long-horizon thinking. That is a useful way to talk about how human work shifts in an AI-heavy world.

But there is a deeper shift underneath this metaphor that “compression” cannot see.

Compression is what the model does internally.
Coherence is what it does to the world.

And that difference matters.


1. What “AI is compression” gets right

The essay is strongest where it is most concrete:

  • It shows clearly how LLMs compress patterns rather than memorize examples.
  • It highlights that “compressible work” becomes cheap.
  • It argues that human value moves toward problem selection, constraint design and risk ownership.

On this level, the article is almost certainly right. If your value is mostly producing standard forms of text or code, the leverage curve will eat you.

Where the piece becomes less precise is when it treats “compression” as the main story of AI instead of as one visible layer of a much larger architectural change.


2. The limits of the compression frame

Compression is an internal property of the model.
Civilization is an external property of our systems.

The compression metaphor implicitly says:

  • AI is a super-compressor of past information.
  • Humans decide what to do with that compressed past.

The risk is that it keeps our attention on content instead of on the climate that emerges when these systems saturate our environment.

Three blind spots appear:

  1. Compression ignores the thermodynamics of attention.

    Faster execution increases audit load, switching cost and residual tension. Without redesign, AI leverage becomes cognitive exhaustion.
  2. Compression focuses on individuals, not systems.

    It says nothing about trust, institutions or shared reality when coherence is no longer enforced by slow channels.
  3. Compression treats AI as a tool, not a climate.

    AI is shifting from “thing you call” to “background presence that shapes how everything feels and flows.”

Compression describes the zip file.
It does not describe the weather.

3. Transformers as coherence infrastructure

Zooming out reveals a different picture.

Transformers do not just compress data; they stabilize patterns across previously fragmented information spaces. They are coherence infrastructure:

  • They make consistent narratives cheap across millions of documents.
  • They smooth gaps in language, policy and communication.
  • They unify interfaces and documentation into one continuous surface.

AI is not just compression of the past — it is real-time coherence over the present.


4. The missing layer: attention, stress and reversibility

If AI becomes infrastructure, the deeper question becomes:

What happens to human attention?

  • Attention is finite.
  • Irreversible stress accumulates when systems require constant prediction and defense.
  • Technologies either amplify irreversibility or absorb it.

A humane standard:

  • A system is future-compatible only if stress remains reversible.
  • A system is humane only if it carries coherence for people instead of forcing each person to carry coherence against the system.

The real divide is not between compressible and incompressible workers; it is between extractive infrastructure and carrying infrastructure.


5. AI as climate, not competitor

AI is not a mind.
AI is climate.

A polluted informational climate erodes trust, cognition and resilience.

So the core questions become:

  • Does AI stabilize attention or fragment it?
  • Does it reduce noise or amplify it?
  • Does it keep stress reversible or lock people into acceleration?

The compression metaphor is for individual survival.
The climate metaphor is for civilizational survival.


6. Beyond “becoming incompressible”

“AI is compression” is useful for career strategy:

  • Automate the trivially compressible.
  • Invest in judgment, taste and cross-domain synthesis.
  • Design workflows instead of executing steps.

But as AI becomes infrastructure, the frame becomes too small.

We need to ask:

  • How do we lower semantic entropy in institutions?
  • How do we design interfaces that calm instead of agitate?
  • How can AI act as a warm, stabilizing background rather than a cold amplifier of pressure?

These are questions about engineering coherence.


Conclusion

If AI is only compression, the future becomes a sorting mechanism.

If AI is also coherence, the future becomes an environment where technology quietly carries complexity that used to sit on individual nervous systems.

Mercer is right that “what cannot be compressed becomes power.”
But the greater task is to build systems where power is measured not by compression but by how much human coherence they can safely carry.

That is the difference between an AI career strategy and an AI civilization strategy.
We are already late to begin the second.
