Is There an Art to Suffering in an Age of Acceleration?

News & Insights

Feb 17, 2026

10 Min Read

We keep getting stuck in the same argument.

Someone posts an AI-generated image or video and the comments fill up like a bathtub:

“Is this art?”
“Is it real?”
“Is it theft?”
“Is it lazy?”
“Is it inevitable?”

And look — those questions are not stupid. They’re human. They’re the kinds of questions we ask when we feel something shifting under our feet and we want to grab onto a handle that feels morally legible. Or comfortable.

But I think perhaps we’re grasping at the wrong handles.

Because the “is it real?” debate is not the center of the story. It’s one of the costumes the story wears during these interesting times.

The center of the story is scarcity. More specifically: what happens when scarcity collapses. When scarcity collapses, pricing power moves. When pricing power moves, people tend to feel it.

Not as a headline, but as an internal sensation: a creeping loss of safety, a suspicion that the game you built your life around is being rewritten and nobody is handing you the new rulebook.

Helpful navigation

non-technical definitions

Economic rent

The extra money you earn because something about your position is rare.

If only five people have a skill, those five people can charge more. If suddenly there are five thousand people with the same skill (or a piece of software that can do it instantly), prices fall.

That extra income you were making because you were scarce? That’s rent.

It’s not evil. It’s not holy. It’s just what happens when scarcity exists.

When technology reduces scarcity, rent compresses. That compression is what a lot of people experience as “displacement,” even before anyone is formally replaced.
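A toy model makes the compression concrete. The pricing form and the numbers below are invented for illustration; the idea is just that the scarcity premium gets split across everyone who can supply the skill:

```python
# Toy illustration of rent compression (assumed form, made-up numbers):
#   price = base_cost + scarcity_premium / n_suppliers

def market_price(base_cost: float, scarcity_premium: float, n_suppliers: int) -> float:
    """What one supplier can charge when n_suppliers offer the same skill."""
    return base_cost + scarcity_premium / n_suppliers

scarce_price = market_price(base_cost=50, scarcity_premium=500, n_suppliers=5)        # 150.0
abundant_price = market_price(base_cost=50, scarcity_premium=500, n_suppliers=5_000)  # ~50.1

rent_per_unit = scarce_price - abundant_price  # the income that compression erases
```

The work itself didn't change between the two calls; only the headcount did. That gap is the rent.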

Marginal cost

The cost to make one more unit or thing.

One more burger, spreadsheet, minute of video, legal draft, one more “concept,” one more version, etc.

When marginal cost is high, decisions are careful and budgets are cautious. You're justifying the next unit.

When marginal cost collapses, the question flips.

It stops being, “Should we do this?” and becomes, “Why wouldn’t we?”

Production function

A fancy way of saying: “How is output made?”

Traditionally, output comes from labor + capital + materials.

When software and robotics can substitute for labor on tasks, labor’s leverage changes — not because any of us are less human, but because the bottleneck moves.

This type of change isn’t new

It just feels like someone hit fast forward.

We’ve been here before, structurally.

Industrialization didn’t just add machines, it reorganized what “skill” meant. Mechanized agriculture didn’t just improve yields, it collapsed the need for large portions of farm labor. Containerization didn’t just optimize shipping, it erased entire ecosystems of work at ports. Software and the internet did it again.

AI touches media, law, finance, logistics, education, manufacturing, and customer service all at once, while robotics extends it from cognition to action. The difference this time is speed: the shift is compressed relative to prior transitions.

When systems move faster than identity can metabolize, people don’t just adapt; they suffer.

The global layer

Reality is perception

We also tend to talk about “AI disruption” like it’s one unified experience. The underlying incentives are similar, but the friction differs a great deal depending on where you live.

In the United States, capital moves fast and labor protections are relatively thin (hiring and firing are comparatively easy). That means volatility is sharp, both upside (hiring in hot markets) and downside (industry-wide layoffs and reorganizations). The system tends to reward speed, even when speed isn’t ideal.

Our read on parts of Europe is that safety nets are stronger and regulation is heavier. Adoption can be slower, but the tension may last longer: a drawn-out negotiation with a lot of politics rather than a sudden drop. That can feel like drag if you’re trying to move quickly, but it can also function like a shock absorber.

In East Asia, demographic reality changes the vibe. Aging populations and labor shortages turn automation into a necessity. In that context, robotics means a lot more than margin expansion.

With a global perspective, the stakes can appear existential in a different way. Many development models have relied on labor-cost advantage and manufacturing-led growth. If countries can automate more of their own production and shorten supply chains with fewer workers, that ladder gets shorter. In that context, wage compression destabilizes paths that entire countries have depended on.

Adaptation is and will be uneven. We expect uneven adaptation to turn into politics and increased regulation, because that is what happens when pain is distributed unevenly.

Conflicted times

We see five camps

It's easy to come away from the public AI debate, or from our own experiences, with a dualistic good-vs.-bad view of AI. We don't think that framing is very useful. We've tried to distill our thoughts and experiences into five camps.

We challenge you to understand which camp you’re in at any given moment.

The Embracers

They see acceleration as progress. They look at increased efficiency / collapsing marginal cost and say, “That expands the market, increases output, and lowers barriers.”

Often, they’re right about growth, but their blind spot is often governance. They tend to assume scale is free. It isn’t. Scale sometimes sends an invoice later, and that invoice is usually labeled “trust.” We have seen major data breaches caused simply by a lack of governance as AI tools are adopted.

The Fearful

They see wages compressing, ladders thinning, and identity destabilizing. They are not irrational. They are responding to real signals.

Their blind spot is believing that the current scarcity regime is permanent. In labor and skill, scarcity is more like a lease than a deed.

The Denialists

They say, “It’s not real. It won’t scale. It’s inferior.” Sometimes they’re making technical points; sometimes they’re operating from a psychological shelter.

Their blind spot is that diffusion doesn’t require perfection. It only requires “good enough” plus distribution. The world runs on “good enough” more often than we like to admit.

Conflicted Adopters

They see the shift and feel the compression. They insist on discipline: standards, auditability, transparency, constraints. They don’t pretend removing friction is morally neutral. They know it changes livelihoods.

They know pretending the shift isn’t happening is worse. This camp builds with one hand and holds a fire extinguisher with the other.

The Observers

No hot takes, identity panic, or outrage. They watch, map incentives, and learn. They wait to speak until they understand.

Their blind spot is over-detachment: watching so long they never act. But their advantage may well be timing.

They understand something most people resist: production functions move before perceptions catch up.

A concrete example

Our own

Consider stereoscopic 3D.

Historically you’ve had three paths:

  • Shoot native 3D: high cost, high complexity, and high friction ($15-$20 million+ for feature films).

  • Manual conversion: skilled labor, expensive iteration, and quality variance ($10-$15 million+ for feature films).

  • Automate: segmentation, depth estimation, stereo synthesis, and all the reliability work required.

If automation becomes “good enough,” the question changes. It stops being: “Is this title worth the budget?”

It becomes: “Why wouldn’t we convert the whole catalog?” That is threshold collapse.
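A back-of-envelope sketch shows the flip. Every number below is invented; the point is that a per-title budget gate filters hard at manual-conversion prices and stops filtering at all once the cost collapses:

```python
# Back-of-envelope sketch of threshold collapse. All numbers are invented.

def titles_worth_converting(catalog_values: list[float], cost_per_title: float) -> int:
    """Count titles whose expected 3D revenue exceeds the conversion cost."""
    return sum(1 for value in catalog_values if value > cost_per_title)

# A 1,000-title catalog whose expected per-title value tails off quickly.
catalog = [1_000_000 / rank for rank in range(1, 1_001)]  # $1M down to $1k

manual = titles_worth_converting(catalog, cost_per_title=12_000_000)  # nothing clears the bar
automated = titles_worth_converting(catalog, cost_per_title=500)      # the whole catalog does
```

The gate didn't get smarter or dumber; the cost just dropped below almost every title's value, so the question stopped being a filter.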

The scarce things become:

  • distribution

  • curation

  • trust

  • comfort

  • governance

  • the ability to say, with evidence, “this is safe and good,” not just “this exists.”

Scarcity doesn’t disappear. It relocates, and whoever controls the scarce layer tends to capture the rent in our view. We think governance will be the critical unlock for scaled AI abundance.

Governance is the new rent

When execution becomes cheap, governance becomes scarce.

Governance is not a vibe. It’s the unsexy standards and machinery.

For us, it's quality standards, audit logs, reproducibility, failure detection, constraint enforcement, stop-loss triggers, provenance, evidence artifacts, license attributions, etc.

In other words: “Show me the work” — especially if things go wrong.
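As one hypothetical example of what “show me the work” can mean in practice (the field names and the model version string are ours, not any standard), an evidence artifact can be as small as a hashed record tying an output to its inputs and model version:

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical minimal evidence artifact; field names are illustrative, not a standard.
def audit_record(output_id: str, model_version: str, inputs: dict) -> dict:
    """Record who made what, from what, with what, plus a checksum for tamper-evidence."""
    record = {
        "output_id": output_id,
        "model_version": model_version,
        "inputs": inputs,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Checksum over a canonical serialization so any later edit is detectable.
    canonical = json.dumps(record, sort_keys=True).encode()
    record["checksum"] = hashlib.sha256(canonical).hexdigest()
    return record

entry = audit_record("shot_042_stereo", "depth-model-v3", {"source": "shot_042.exr"})
```

If something goes wrong downstream, a record like this is the difference between “we can trace it” and “we have no idea.”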

The danger of low marginal cost is not just that more gets made. It’s that more gets made without discipline. That’s the difference between abundance and a landfill.

In 3D/VR, the thing people underestimate is how quickly a landfill becomes a trust crisis, and how quickly a trust crisis becomes consumer cynicism. In other markets, it shows up as platform clampdowns and policy responses. Automation without standards degrades trust.

When we say “governance becomes the new rent,” we're not trying to be dramatic or lead you through a funnel to discover our differentiators. We're describing our perception of the new choke point. Automation with governance becomes infrastructure.

Delegation isn’t the threat

Abdication is.

We have a history of delegating cognition. Books replaced memory, calculators replaced manual arithmetic, search replaced manual archives, etc.

Delegation is not itself abdication. Using AI to enhance judgment preserves agency; using AI to avoid judgment erodes it. That distinction matters more now because weak judgment can be amplified at scale. That’s not meant as a moral claim so much as a practical one.

When the dollar or time cost of producing “another one” approaches zero, the cost of producing nonsense approaches zero too. Governance is the set of guardrails that will determine whether we end up with abundance or a landfill.

Accept things we can't change, change things we can

And wisdom to know the difference.

We do not control diffusion curves, capital cycles, hardware economics, or the fact that marginal costs are collapsing across more and more domains.

We do control our standards: whether we build systems that can be audited and improved, whether we tie our identity to scarcity rents, whether we become bitter or disciplined, whether we respond early or react late.

Suffering increases when we moralize scarcity, deny incentives, and react before understanding. We end up paying a confusion tax in wasted energy.

Suffering increases with the gap between reality and our acceptance of it. But clarity doesn’t remove the need to adapt. It just removes the unnecessary suffering layered on top of the adaptation cost.

We may still feel pressure. We may still feel compressed. We may still face instability. But we suffer less when we stop taking the bottleneck shift personally and spend that energy by asking a question:

“Where is the scarcity moving, and what does it ask of me?” That question led to the start of Anelo.
