2026.15: Myth 5 - Executives Are Hungry for AI Transformation
Happy Friday, from Calgary.
This is Part 5. The last myth. The one that ties all the others together.
Over the last five weeks we’ve covered…
Myth 1 - Fixing the Foundation
Myth 4 - Why Your Employees Are Already Building the Future
Every one of those traces back to this.
What leaders say vs. what they actually do.
Let's break it down.
The Signal:
CEOs Are Saying One Thing and Doing Another. The Numbers Don't Lie.
BCG's 2026 AI Radar report surveyed 640 CEOs and 2,360 senior leaders. 82% are more optimistic about AI than a year ago. AI is a top-3 strategic priority for 2 out of 3 CEOs. Half believe their job stability depends on getting AI right this year.
That sounds like commitment. Then you read the rest of the data.
60% of those same CEOs admitted they've intentionally slowed AI implementation over concerns about errors and malfunctions. Only 6% plan to scale back spending if AI fails to deliver. They're slowing down the work while refusing to cut the budget. Motion without movement.
EY's survey of 500 US senior leaders found that 96% report AI-driven productivity gains. But 65% admit they can't tie those gains to AI adoption. They're reporting results they can't measure.
The spending gap is just as wide. In 2024, 65% of executives predicted they'd invest at least $1 million in AI the following year. Only 58% actually did. 34% predicted $10 million or more. Only 23% hit that number.
And the clock is ticking. An HBR piece from this month found that 71% of global CIOs said AI budgets would be frozen or cut if value can't be demonstrated within 2 years.
BCG's own "Widening AI Gap" report found that deeply engaged C-level leaders are 12x more likely to be in the top 5% of companies winning with AI. Which means 95% of companies aren't winning. The difference isn't the technology. It's leadership.
The Scale:
Myth 5: Executives Are Hungry for AI Transformation.
Reality: They're hungry to talk about it.
Andriole was blunt about this in 2017. The number of executives who really want to transform is small. The gap between what leaders say and what they do is wide. Nothing has changed.
Today, executive AI talk creates pressure. Subordinates do something performative. Something leadership can point to on the quarterly call. The result: a lot of motion and very little movement.
This is a systems problem. Donella Meadows identified information flow as one of the most important characteristics of a healthy system. When information moves accurately from the edges to the center, the system adapts. When it doesn't, the system stagnates.
In most organizations, the information flow around AI is broken. Signals from frontline employees never reach the people making strategy decisions. Honest feedback gets punished. Telling leadership what they want to hear gets rewarded. Every layer of management filters the signal until it's useless.
Kim Scott called this the absence of radical candor. Without it, the information that would drive real change gets sanitized before it reaches the people who need it most.
VG's Three Box framework explains why it persists. Box 1, managing the present, is where all the reporting happens. It's what executives are evaluated on. Box 3, creating the future, requires admitting what's not working. But executives who built their careers in Box 1 have every incentive to protect it. So they fund Box 3 initiatives without doing the Box 2 work. Those initiatives get strangled by Box 1 operating logic. Then leadership blames execution. "The team couldn't deliver." The team was never set up to succeed.
George Westerman of MIT captured the result: when transformation is done wrong, all you have is a really fast caterpillar. That's what most AI "transformation" looks like right now. Fast caterpillars. Multiple initiatives in flight. No vision of a butterfly in sight.
Meadows argued that the most powerful place to intervene in a system is at the level of its goals. If the real, unstated goal of your AI strategy is "say the right words on the quarterly call," the system will produce exactly that. Words. Not results.
If you want different behavior, change the goal. Make it "build the capacity to adapt" instead of "implement AI." The tool is secondary. The muscle is primary.
Because it's hard, it's worth doing.
The Deep Dive:
This week's deep dive closes the series. It covers the full anatomy of performative transformation, why the information flow breaks, what the duty to dissent looks like in practice, and the 5 steps to make the goal real instead of rhetorical.
Read: Executives Love Talking About AI. The Numbers Say They’re Faking It.
Thanks for reading!
That's the series. 5 myths. 5 weeks. If you've been here from the start, thank you. If you jumped in at the end, go back to Myth 1 - Fixing the Foundation.
The full Five Myths About AI Transformation paper is where this all started. If you haven’t read it yet, start there.
I'd love to hear from you: which myth hit the hardest? Which one are you living inside right now? Leave a comment. The conversation doesn't end here. It starts.
See you next Friday.
Best,
JT