When Fear Holds the Wheel: How Some Institutions Use AI to Pretend, Not Perform

“We live in a time when the institutions that run society are struggling to cope with the complexities of the real world, and so they retreat into simplified versions of it.”

Adam Curtis, HyperNormalisation (2016)

Complexity makes people nervous. And when institutions get overwhelmed, they don’t reach for transparency — they reach for control. Or at least the appearance of control.

That’s where AI enters the story. Not as a strategy. As a sedative.

The Illusion of Progress

Bureaucracies don’t always lie to others. Sometimes, they lie to themselves.

Internal narratives get built to calm people down. “We’ve got AI on it” is the new “we’re looking into it.” It doesn’t mean something is happening — it means something sounds like it is.

Here’s the thing: AI isn’t the problem. But in dysfunctional cultures, it’s quickly repurposed. It stops being a tool and becomes theatre.

You get slides. Demos. Pilots with no plan to scale. Strategy decks full of buzzwords. Dashboards that look slick but tell you what you already knew.

AI becomes a performance. Not a product.

What’s Really Going On? Fear.

In healthy organisations, AI is used to test hypotheses, challenge assumptions, and build better systems.

In unhealthy ones, AI is used to:

  • Look modern without changing anything
  • Delay decisions that require courage
  • Defer accountability until “the model is ready” or “the technology matures”

At the root of all this? Fear.

Fear of falling behind competitors.
Fear of choosing the wrong vendor.
Fear of looking indecisive.
Fear of being blamed.

So instead of addressing the fear, the institution builds a story around it. And AI becomes the main character.

The Alternative: What Healthy Looks Like

Let’s flip it.

Healthy institutions:

  • Use AI to illuminate uncomfortable truths, not hide them
  • Accept ambiguity and iterate instead of rushing to signal control
  • Measure impact, not optics

These are places where leaders are secure enough to say:

“Progress doesn’t come from watching AI happen to others; it comes from putting AI to work — with clarity, with purpose, and with skin in the game.”

Final Thought: AI Mirrors Culture

AI doesn’t fix dysfunction — it reflects it.

If your culture is fearful, AI will be used to reassure, not to effect change.
If your culture is curious, AI becomes an engine for learning and growth.

So if your AI strategy looks great in presentations but isn’t shifting how the organisation actually works, it’s worth asking:

Are we executing a real strategy — or just trying to signal one?