Apple has the best silicon on the planet. The most loyal user base in consumer tech. Margins that make every other hardware company weep. And after three years of trying, they still haven't shipped a working AI assistant.
Not a technology problem. A culture problem.
Let me walk you through what happened at Apple, and why it should make every tech leader think hard about their own organisation.

Three Years of Broken Promises
Cast your mind back to WWDC 2024. Apple took the stage and announced Apple Intelligence. A smarter Siri. Context-aware, able to take action across apps, integrated with ChatGPT for the things it couldn't handle alone.
The crowd responded. The press responded. The stock responded.
Then came the first delay. Spring 2025. Internal testing showed "high error rates." Not ready.
Then another delay. Target shifted to 2026.
Then, this month, a story emerged: Apple sent roughly 200 of its Siri engineers to a multi-week AI coding bootcamp. Not to ship AI... to learn how to use AI tools to write code. Because the Siri team had earned a reputation inside Apple as "a laggard" for resisting the very tools they were supposed to be building.
Let that land.
The team building AI products didn't use AI tools. And Apple spent years not noticing, or not caring enough to fix it.
Two Teams. One War.
Apple's AI problems don't come from a shortage of talent or capital. They come from two powerful teams inside the same company pulling in opposite directions.
On one side: Craig Federighi's software engineering group. Tight, opinionated, ships products. They've delivered macOS and iOS on a clockwork annual cycle for over a decade.
On the other: John Giannandrea's AI/ML group. Giannandrea came from Google in 2018, hired specifically to close Apple's AI gap. His team brought different management styles, different priorities, and a different culture.
The two groups never meshed.
The Information's investigation... which Apple's own leadership didn't publicly dispute... described "long-running tensions" and "contrasting management styles and work cultures" leading to "growing dysfunction." During one voice control project, senior leaders in Federighi's group "openly expressed frustration" with their counterparts in the AI group, who were seen as "hesitant and risk-averse."
In December 2025, Giannandrea stepped down as AI chief. Apple began breaking up the AI/ML organisation entirely, scattering pieces to other parts of the company.
Not a restructuring. An admission.
Two teams sharing a company didn't share a culture. When the pressure came, they broke instead of building.

The Privacy Problem Nobody Names
There's a deeper tension buried in all of this. Worth naming.
Apple built its brand on privacy. "What happens on your iPhone, stays on your iPhone." Not a slogan. A genuine engineering philosophy and a meaningful differentiator from competitors.
AI needs data. Good AI needs a lot of it, processed in large data centres, with feedback loops improving models over time. Privacy and AI's data appetite pull against each other.
Apple's original plan was a privacy-respecting two-model system: a small on-device model for sensitive tasks, a larger cloud model for everything else. Clean. On-brand.
It didn't work. The technical complexity was too high, the error rates unacceptable.
So Apple pivoted to a single large cloud model. Then, by some accounts, struck a deal with Google to run Gemini under the hood... the same Google they compete with in search, browsers, and nearly everything else.
A painful pivot for a company whose identity was built on not being Google.
The culture resisted. Of course it did.
Fifty Thousand Outdated GPUs
Here's a detail from the reporting that didn't get enough attention.
While OpenAI and Google were scaling AI on infrastructure running hundreds of thousands of GPUs, Apple's data centres reportedly contained "only about 50,000 outdated GPUs." Leadership only approved partial funding for newer GPU upgrades.
This isn't a company short of money. Apple generates more profit in a quarter than most companies generate in a decade.
This is a company whose culture didn't treat AI infrastructure as a priority until it was too late. The budget decisions reflect what leadership valued. And for years, what they valued wasn't AI.
You get the roadmap your culture builds... not the one your strategy deck promises.
What Your Organisation Should Learn
I'm not writing this to pile on Apple. They have smart people working on hard problems. They'll get there.
I'm writing this because I see the same patterns in smaller organisations constantly.
Two teams with different management styles, forced to collaborate on a product neither fully owns. A leadership culture that says "yes" to a strategic direction while budget decisions quietly say "no." An engineering team resisting the tools it's supposed to be building. An identity assumption... "we're the privacy company," "we're the enterprise company," "we do things properly"... that blocks change more stubbornly than any technical challenge.
These aren't Apple problems. They're organisational problems. Apple has the scale to make them visible.

If you're leading an AI initiative right now, here are the questions worth sitting with:
Are your teams aligned on what matters? Not on paper. In practice. Shared incentives, shared metrics, shared working rhythms. Or are they nominally collaborating while guarding territory?
Do your budget decisions match your stated priorities? If AI is the future and the infrastructure budget stays flat, you've told your team the truth by accident. They'll act on what you do, not what you say.
Is your culture's identity in conflict with what AI requires? Some organisations are built on a model AI will undermine. Acknowledging this openly, and making deliberate choices about it, is far better than pretending the conflict doesn't exist.
Are your people using the tools they're supposed to be building? The Siri team not using AI coding tools isn't irony. It's a signal. If your team doesn't believe in what they're building enough to use it themselves, you don't have a technology problem. You have a culture problem.
The Question Worth Asking
The line attributed to Peter Drucker gets repeated so often it's become wallpaper: "Culture eats strategy for breakfast." People nod and go back to updating their roadmaps.
Apple's AI story is what ignoring that warning looks like at billion-dollar scale.
The question for you isn't whether your roadmap is ambitious enough. Most roadmaps are ambitious. The question is whether your culture will let you ship it.
If your AI initiative feels stuck... don't look at the technology first. Look at the meeting where two teams stopped talking to each other. Look at the budget line that didn't move. Look at the engineer who rolled their eyes when the pilot was announced.
Your roadmap is being eaten. And it started well before the first delay.
I've spent years helping organisations close the gap between strategy and culture, through Step It Up HR. If this sounds familiar in your organisation, that's where I'd start.
What's the biggest culture gap you've seen slow down a technology rollout? I'd genuinely like to hear.