Three years into widespread AI adoption, enterprise budgets tell a cautionary tale: organisations are spending significantly on AI licensing whilst actual usage rates suggest they're not getting the return they expected.

The numbers are stark. Microsoft has sold 15 million Copilot seats for Microsoft 365, yet daily usage rates hover around 30%. When adoption leaders survey enterprises, the finding is consistent: companies can tell you how many seats they've bought, but they struggle to say how many people actually use them.

In one financial services firm I worked with recently, senior leadership had active Copilot licences for three months. Not one of them had opened the tool.

This isn't an isolated story. It's the pattern I see every week.

The uncomfortable truth: buying AI doesn't equal using AI. Licensing is procurement. Adoption is change management. They're different problems entirely.

Why Adoption Fails: Three Barriers

According to recent enterprise research, the three most cited barriers to AI adoption are: data governance concerns, insufficient change management budget, and the absence of internal AI Champions who can demonstrate workflows to non-technical employees.

But beneath these barriers is a simpler problem: no one's designed the why.

Most organisations approach AI adoption like they approach software rollouts. Buy the tool. Train people on the features. Expect adoption. But AI is fundamentally different. People don't adopt tools because they're available. They adopt tools because they solve a specific problem better than the current alternative.

Consider the typical Copilot rollout. IT buys 1,000 seats. HR creates a 2-hour training session on how to use it. Employees learn that Copilot can draft emails, summarise meetings, and generate code. Then they go back to their workflow — which works fine without it — and Copilot sits dormant.

The training taught features. It didn't teach workflows. It didn't show how Copilot reduces time spent on the work people actually do.

This is the gap. And it's expensive.

What Adoption Looks Like in Practice

The organisations I work with that have moved beyond the adoption plateau didn't do anything magical. They did something methodical:

First, they identified specific, low-risk use cases where AI genuinely saves time. Not "everything Copilot can do." One or two workflows where the time savings are measurable and obvious. For a marketing team: Copilot drafts first-pass social media copy, the team refines it, then publishes. Time saved: 30 minutes per week. Multiplied across a team, that's real time.

Second, they designated an internal champion — not a vendor, not IT, but someone from that team who learned the tool deeply and showed colleagues how to use it in their actual workflow.

Third, they measured. Not "how many people have licences" but "how many people actually used it this week?" and "did the promised time savings actually happen?" If the answer was no, they adjusted the workflow or the tool.

Fourth, they iterated. After three months, they either scaled what worked or shut down what didn't.
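The time-savings claim in the first step is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, with the team size and working weeks as illustrative assumptions rather than figures from any client:

```python
# Back-of-the-envelope value of one small AI workflow.
# Team size and working weeks are illustrative assumptions.

minutes_saved_per_person_per_week = 30   # from the marketing example
team_size = 10
working_weeks_per_year = 46              # allowing for leave and holidays

hours_saved_per_year = (
    minutes_saved_per_person_per_week * team_size * working_weeks_per_year / 60
)
print(f"Hours saved per year: {hours_saved_per_year:.0f}")  # 230 hours
```

Roughly 230 hours a year from one workflow on one team: a concrete, defensible number a pilot can be judged against.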

This approach is unsexy. It doesn't make headlines. But it's what turns procurement into adoption.

The Hype Cycle Is Real — and Predictable

Every major AI announcement follows the same pattern. Peak excitement. Maximum hope. Minimum critical thinking. Then, three months later, organisations start deploying. They discover that benchmark performance doesn't translate to business process performance. They hit the disillusionment valley.

And that's where most AI deployments live right now — in that valley between hype and pragmatism.

The organisations that succeed are the ones who navigate this valley deliberately. They ask uncomfortable questions: What exactly will this solve? Who's going to use it? How will we know it's working? What if it doesn't?

These questions sound obvious. Most organisations skip them entirely.

What to Do Differently

Start with change management, not the tool

Before you buy a single licence, design the future state. What workflows will actually change? Who does those workflows? What training do they need? What are the barriers? Address these first. Most organisations get this backwards — they buy the tool, then try to figure out how to make it fit existing processes. That's why 70% of seats stay inactive.

Measure usage, not seats

Track actual engagement — not licences purchased, but people who used the tool last week. A 3,000-seat Copilot deployment with 900 weekly active users (30% usage) isn't a success story. It's a red flag. Something about the training, the workflow, the tool fit, or the motivation isn't working. Most CFOs won't see this flag until Quarter 2 or 3, when the spend justification comes around and no one can explain the ROI.
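Most of this measurement can be automated. A minimal sketch, assuming per-user activity records with a last-used date; the record format here is illustrative, though the Microsoft 365 admin centre's Copilot usage report exposes a comparable last-activity date per user:

```python
from datetime import date, timedelta

# Sketch: weekly active usage from per-user activity records.
# The (user_id, last_used) record format is an assumption.

def weekly_active_rate(records, seats, today):
    """Share of purchased seats used in the last 7 days."""
    cutoff = today - timedelta(days=7)
    active = {user for user, last_used in records if last_used >= cutoff}
    return len(active) / seats

records = [
    ("alice", date(2026, 3, 2)),
    ("bob",   date(2026, 2, 1)),   # lapsed: last used over a month ago
    ("carol", date(2026, 3, 4)),
]
rate = weekly_active_rate(records, seats=10, today=date(2026, 3, 5))
print(f"Weekly active usage: {rate:.0%}")   # 2 of 10 seats → 20%
```

The point isn't the code; it's that "weekly active users divided by seats purchased" is a single number a CFO can track quarter over quarter.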

Designate internal experts

Not consultants. Not vendors. People from your own teams who learn the tool deeply and can explain it in language their colleagues understand. External consultants are useful for strategy and design. But adoption happens one person at a time, in one department at a time, through conversations with someone they trust.

Start small

Pick one or two workflows. Get them right. Then scale. A full enterprise rollout that doesn't work is a disaster. A two-team pilot that works is a template for the next two teams.

Accept the disillusionment valley

You'll spend time here. That's normal. The question is whether you're learning and iterating, or slowly admitting failure. The organisations that succeed treat low adoption as data — not as evidence that AI doesn't work, but as evidence that something in the approach needs to change.

The Real Cost

An organisation with 3,000 Copilot licences, 30% weekly active usage, and no clear productivity outcome is paying for roughly 2,100 inactive seats. At a list price of around £24 per seat per month, that works out to more than £600,000 per year on unused AI capacity. Multiply that across enterprises, and you're talking about billions in wasted spend globally.
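The arithmetic is worth making explicit. A minimal sketch, assuming Microsoft 365 Copilot's published list price of about $30 (roughly £24) per user per month; at a heavily discounted contract rate the figure shrinks, but the structure of the calculation is the same:

```python
# Rough cost of inactive seats. The per-seat price is an assumption
# based on Microsoft 365 Copilot's published list price (about $30,
# roughly £24, per user per month); substitute your contract rate.

seats = 3000
weekly_active_users = 900           # 30% of seats
price_per_seat_per_month_gbp = 24   # assumed list price

inactive_seats = seats - weekly_active_users
wasted_per_month = inactive_seats * price_per_seat_per_month_gbp
wasted_per_year = wasted_per_month * 12

print(f"Inactive seats: {inactive_seats}")
print(f"Unused spend: £{wasted_per_month:,}/month, £{wasted_per_year:,}/year")
```

Every inactive seat is a recurring cost with no offsetting benefit, which is why the usage number, not the seat count, is the one to watch.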

But the real cost isn't the money. It's the cynicism. After one failed AI rollout, the next one faces immediate resistance: "We tried that. Didn't work."

That resistance is justified — because the organisation didn't do the change management work. And now, when a genuinely transformative AI capability arrives, the organisation will be swimming upstream against justified scepticism.

That's the real cost. Not the wasted licence fees, but the lost opportunity for genuine transformation because the work wasn't done right the first time.

The Question Forward

AI adoption isn't inevitable. It's optional. And expensive if done wrong.

The organisations that will actually benefit from AI in 2026 and beyond aren't the ones who bought the most seats. They're the ones who made deliberate choices about which problems AI actually solves, how they'll know it's working, and who's going to champion it internally.

Where is your organisation in this journey? And more importantly: are you measuring the right things?

Sources: Microsoft Copilot Adoption Statistics & Trends (2026) · No Jitter: 4 obstacles impede paid Microsoft 365 Copilot adoption · Larridin: Enterprise AI Adoption in 2026 · Medium: The $500 Billion Mistake (February 2026)