Service Magic

Change Management for MSPs: How to Roll Out AI Without the Pushback

Written by Matt Linn | Apr 6, 2026 4:30:00 AM

Rolling out AI to your service desk is not, at its core, a technology problem. The integration works. The accuracy is there. What breaks down (almost every time) is the human side of the equation.

Here's the part that surprises most MSP owners: your own techs are often harder to win over than your clients. End users just want faster service. Your help desk team, on the other hand, has habits, anxieties, and a deep familiarity with how things have always worked. Getting them on board takes a different approach than flipping a switch and hoping for the best.

To make this concrete, we talked with Tom Wrigglesworth, a senior technician at Netteam tX Ltd, an MSP serving the UK hospitality industry. Tom shared his rollout story in a recent AI Service Unleashed session, and his experience is a good blueprint for what works, and what doesn't, when humans are the hardest part of the equation.

 

Why Rollouts Fail (It's Not the Technology)

Ask MSPs what broke during their AI rollout and you'll hear the same things: trust issues, unclear ownership, no communication plan, and tech anxiety about job security. Rarely does anyone say the AI itself was the problem.

Tom's experience at Netteam tX illustrates this well. When they introduced AI-assisted ticketing, they expected pushback from clients. Instead, the clients were largely receptive; they just wanted their issues resolved faster. The resistance came from within.

"We expected everybody to go, oh, absolutely not. How dare you. But the end users were really on board for it. The problem was the actual help desk." — Tom Wrigglesworth, Netteam tX

 

And when trust does break with clients, it moves fast. Tom described one scenario where a single skeptical user convinced their CEO that the MSP had handed everything over to a robot. That perception spread through the organization before anyone had a chance to address it. Once that kind of distrust takes hold, it's extremely difficult to undo.

The lesson: the window for setting the right expectations (both internally and externally) is before launch, not after.

 

Start Internal: Win Your Team Before You Win Your Clients

The first rule of a successful AI rollout is to never make your clients the beta testers.

Introduce AI on your internal service tickets first. Your techs submit tickets when they have IT issues too, so internal tickets are a low-stakes environment for learning how the AI behaves, what it gets right, and where it needs refinement.

From there, roll out in waves. Don't try to turn everything on at once. A sequenced approach gives people time to adjust, builds confidence, and lets you catch problems before they become client-facing issues. Netteam tX started with email triage only, no client-facing AI interaction, just behind-the-scenes categorization and prioritization. That gave their techs time to see the output, trust the accuracy, and get comfortable before the next phase.

A rotation model works well here: the first person on the rotation learns the tool deeply, then trains the next person, who trains the next. You're building internal champions rather than mandating adoption from the top down.

One more thing: find your change drivers. Every organization has them — the operationally sharp people who are always curious about new tools and willing to try things. Put them in front of this first. Let them become your internal proof points.

 

Find the Carrot for Your Techs

Explaining why you're introducing AI is necessary but not sufficient. Techs need to feel it: they need a moment where it clicks that this is actually making their job better, not threatening it.

For Netteam's help desk, that moment was Timepad's AI-written time entries. Tom showed his team that they could click a single button and have the AI write up everything they'd done on a ticket. That one demo was enough to break through the skepticism.

"That just blew everybody's head off when I showed them. I think that's what sold it to the help desk." — Tom Wrigglesworth, Netteam tX

The carrot doesn't have to be time entries. It could be clean ticket titles that finally make the queue readable at a glance (Tom mentioned this too: titles that actually reflect the issue, so searching three months later isn't a nightmare). It could be auto-routing that means techs stop getting tickets outside their skillset. Find the thing that solves a pain your team already feels, and lead with that.

Critically: don't lead with the business case when talking to techs. Don't talk about efficiency ratios or cost savings. Talk about what their Tuesday afternoon looks like when the AI is handling the administrative noise.

 

The Three Promises You Need to Make to Your Customers

Customers who aren't told what's coming will see an agent interact with them and immediately wonder what happened to their MSP. That confusion turns to distrust fast. Proactive communication is the single most important thing you can do before launch.

A framework developed with GadellNet, another MSP that has thought carefully about this, centers on three promises:

  • You still have a human support team. AI is improving how we deliver service, not replacing the people who support you day in and day out.
  • You will get faster, more consistent service. Don't be vague about this. You're doing this for a reason; say it plainly.
  • We will monitor and continuously improve this over time. Just as you continuously train your human team, you'll be refining how the AI works on your behalf.

 

A note on framing: don't tell customers you're "updating the tool." That implies you launched something unfinished. Instead, frame it as continuous improvement: a mature, intentional process of making your service delivery better. One AISU session attendee named William put it well: "I'd frame it as we are improving our process as we onboard and roll out this new tool, not we are updating the tool." That framing lands very differently.

Thread provides a set of ready-to-use communication assets, including an email template, a one-page FAQ, and a 30/60-day success email that you can white-label and send to clients ahead of rollout. Having something professional and clear in clients' inboxes before launch does a lot of the trust-building work for you.

 

Roll Out in Waves, Then Share the Wins

Once you've communicated with clients and your team is trained, start with a controlled subset of customers: ideally ones you have a strong relationship with, who submit tickets regularly and are generally collaborative. Not your biggest account. Not your most difficult client. The ones where the feedback loop is healthy.

Be explicit with both techs and clients about what the AI is handling and what humans still own. Ambiguity here is where anxiety lives. If AI is doing first response and triage, say so. If a human is always reviewing before anything escalates, say that too. Clarity on ownership prevents the perception that the AI is running unsupervised.

And when things go well, share it. When you hit a zero-touch resolution, tell your team. When a client responds positively to the AI interaction, share that in your internal channels. Proof compounds. The more people see it working, the faster the remaining skeptics come around.

Tom described the moment when Thread stopped working briefly at Netteam tX and two of his technicians got nervous. That's when you know adoption has happened. The tool going down was more disruptive than the tool going live.

 

The Methodology in One Line

Start small. Prove it. Expand.

Change management for AI is a process. The MSPs that do it well treat the rollout as a communication exercise as much as a technical one. They sequence carefully, find the internal champions, set honest expectations with clients, and keep sharing results as the proof accumulates.

The technology is ready. The question is whether your organization is set up to adopt it well. If you follow the framework above, the answer is yes, and faster than you'd expect. Netteam tX was in full deployment within weeks, with techs who once needed convincing now getting nervous when the tool goes offline.

That's the goal: not just adoption, but dependency. The kind that comes from something that actually makes the work better.