The meeting is running. Someone asks the CEO for a decision. He doesn't pause to think. He doesn't consult his team. He opens his laptop, types the scenario into an AI tool, and reads out whatever comes back.

If this feels like an exaggeration, you're about three years behind.

A 3Gem market research study of 200 UK business leaders found 62% use AI to make "most decisions." Not as a reference point. Not as a second opinion. As the decision engine itself.

This number should alarm you. Not because AI is unreliable. Because 62% of leaders have stopped doing the work.

[Image: a business executive sitting back passively while an AI interface makes decisions on his behalf]

The Numbers Don't Lie

The same study found 70% of those leaders second-guessed their own judgment when AI gave a conflicting answer. Nearly half (46%) said they rely on AI more than on their own colleagues.

Read it again: almost half of senior leaders trust a language model over the people who know their business, their customers, and their team.

Carnegie Mellon and Microsoft published research showing that the more workers trust AI outputs, the less critical thinking they apply to them. The pattern repeats wherever researchers look. Delegate your thinking, and you stop thinking.

This isn't a new problem. It's a new flavour of an old one. In 2025, 27% of those same executives used AI to make termination decisions. Think about the person on the receiving end of one of those. Their livelihood, decided by an algorithm, delivered by someone who abdicated the weight of the decision entirely.

What You're Losing

The MIT Media Lab published a study warning of cognitive atrophy from excessive reliance on AI-driven solutions. Your brain weakens in the areas you stop exercising.

Karen Thornber, a professor at Harvard, drew the parallel with GPS navigation. We travel more than ever, yet our spatial memory has declined because we never need to build it. The Harvard Gazette described it clearly: skills like discernment, evaluation, and reflection become more valuable as AI proliferates... precisely because they're the skills most at risk.

[Image: a split brain, one half active and glowing with neural connections, the other dim and atrophied from AI over-reliance]

Your judgment works the same way. Wrestle with hard problems and you sharpen. Skip the wrestling by delegating to a language model, and your ability to wrestle deteriorates. Not more efficient. Worse.

I've seen this in engineering teams. Engineers who use AI to write all their code eventually stop being able to read it critically. They lose the intuition for what looks wrong. When something breaks and the AI doesn't have the answer, they're stuck. The tool removed the need to develop the skill, and now the skill is gone.

Leaders face the same atrophy. Stop forming your own views and you lose the judgment needed for decisions no AI handles well: the ambiguous calls, the interpersonal situations, the ethical grey areas where character and context matter more than pattern-matching. When you need it most, it won't be there.

The Leadership Problem Is Bigger Than You Think

Leaders who stop thinking independently stop leading. They broadcast.

I wrote about a related pattern in a post on prompts and leadership: the quality of your AI output reflects the quality of your thinking. Vague prompts produce vague output. Poor framing produces poor answers.

The deeper version of the same problem: when you hand a decision to AI before forming any view yourself, you've stopped leading. You're not thinking through the situation. You're transcribing a response.

Your team reads you. They know when you've genuinely thought something through versus when you're reciting output. This difference matters for trust. It matters for credibility. It matters because your team's willingness to execute a direction depends partly on whether they believe you understood the situation before deciding.

When a leader changes direction mid-project because "the AI suggested it," without any explanation of the context or reasoning, the team doesn't follow. They comply. There's a difference. Compliance without understanding is fragile. It collapses under pressure.

Leadership consultant Ben Morton frames it this way: if you outsource your thinking, the decisions coming back belong to whoever designed the model. Not to you.

You're not the decision-maker. You're the distribution channel.

The 99.5% Problem

My research found 99.5% of survey respondents said they've experienced one or more types of bad boss. The number one thing bad bosses have in common? They don't engage with the people they lead.

Outsourcing your thinking to AI is a new form of the same disengagement. It looks productive. You're moving fast, processing information, generating answers. But the people around you don't feel led. They feel processed.

The trust gap this creates compounds over time. Teams stop bringing you real problems because they sense you're not genuinely engaging with them. They work around you. They escalate past you. They check out.

AI Is a Tool, Not a Brain Transplant

AI is genuinely useful. I use it every day. It catches errors I miss, summarises material faster than I could read it, and surfaces options I might not have considered. All of it is legitimate.

The problem isn't using AI. It's using AI instead of thinking.

The difference is sequence. Bring your own perspective first. Form a view. Then use AI to stress-test it, find gaps, or speed up execution. The thinking happens before the tool, not in place of it.

Think of it as a research assistant, not an oracle. You still need to understand the question before you ask it, evaluate the answer before you use it, and own the decision before you announce it.

[Image: a leader thinking independently at a whiteboard, laptop closed, team watching with engagement]

Three Habits Worth Building

Write your view first. Before using AI for any significant decision, write your position in one sentence. One sentence. If you're unable to produce it, you need more thinking time, not a faster tool. The inability to articulate a position is a signal, not a prompt to skip ahead.

Treat AI output as a first draft. AI gives you a starting point. Your job is to edit, challenge, and improve it. If you read the output and nod along, you've handed over your role. The value is in the push-back you bring, not the content it generates.

Keep human decisions human. Any decision affecting a person directly stays with you. Performance conversations, restructures, team changes... these belong to the people accountable for them. AI should inform, not decide. The person on the receiving end deserves a human who understood the situation, not an algorithm dressed up in leadership clothes.

What the Best Leaders Are Doing

The leaders I respect most use AI aggressively for leverage... but they reserve deep engagement for the decisions that deserve it. They've set a clear rule for themselves: AI handles the volume, humans handle the weight.

They use AI for research, draft generation, summarisation, and option generation. They don't use it to avoid the uncomfortable cognitive work of forming a view, weighing trade-offs, and sitting with uncertainty long enough to develop genuine judgment.

The result: they're faster on the low-stakes work and better on the high-stakes decisions. Their teams trust them. They keep getting sharper.

The leaders who outsource everything get the opposite result. Faster on everything, worse where it counts.

The Real Risk

The tools will keep improving. Your judgment won't improve on its own. You have to exercise it. Make calls without AI input. Build your own view before asking for a second opinion. Stay with a problem long enough to genuinely think it through.

The leaders who come out of the AI era with sharp judgment are the ones who used AI as a tool without letting it atrophy the muscle.

What's the last decision you made where the thinking was entirely yours?