
Here is a number that should alarm every Australian operations manager who has just signed off on an AI tool: companies spend 93% of their AI budget on technology and infrastructure, and only 7% on people-related initiatives like training and change management (HBR, November 2025).
The result is predictable. According to Deloitte Australia's November 2025 report, 66% of Australian SMBs now use AI in some form, but only 5% are fully enabled to realise its potential benefits. That is a staggering gap between buying AI and actually getting value from it.
The missing piece is not the technology. It is the people.
Having worked on large-scale technology rollouts for enterprise operations including BHP and Rio Tinto, where data platforms had to be adopted by hundreds of operators and analysts across multiple sites, I have seen this pattern play out repeatedly. The tools that succeed are not always the best tools. They are the ones with the best adoption strategy behind them.
This guide gives you the practical, step-by-step playbook for getting resistant teams to not just tolerate AI, but actively use it -- based on research from McKinsey, HBR, Deloitte, and Australia's Department of Industry.
Part 3 of 4 in our AI Launch Series. This post covers user adoption after you have verified quality and understood what makes AI launches different.
The instinctive assumption is that employees fear losing their jobs. But Australian data tells a different story. The BizCover Australian Small Business AI Report (2025) found that 72% of employees view AI as an opportunity to enhance their roles, not replace them. Cost and job loss concerns are notably absent from the top barriers.
So what is actually going on?
Understanding the specific flavour of resistance in your team determines your entire adoption strategy. A skills gap requires training. A trust gap requires transparency. A leadership gap requires visible executive usage.
Research published in HBR (November 2025) found that employees who receive hands-on AI training report 144% higher trust in their employer's AI tools compared to those who receive none. Workers with high trust save approximately 2 hours per week using the same tools as low-trust peers.
But here is the catch: 30% of employees have received zero AI training, and 61% have spent less than five hours learning about AI (Slack Workforce Lab, 2024). Most businesses are asking people to trust tools they have never been properly taught to use.
After studying the research and drawing on enterprise rollout experience, I have found that the adoption approach that works in Australian SMBs follows a specific sequence. Skip a step and adoption stalls.
Research supports this sequence. McKinsey (2025) found that 48% of employees would use AI tools more often if they received formal training, and 45% would increase usage if AI were integrated into daily workflows. You need both -- and in that order.
The first AI use case your team encounters will shape their attitude toward every subsequent one. Choose poorly and you create sceptics. Choose well and you create advocates.
The worst first use case is anything that requires people to change how they work before they see value. The best first use case delivers visible time savings within the first week, using a tool that lives inside software they already know.
Research consistently shows that 10-20% adoption of any innovation triggers rapid acceptance by the majority (Strategico Consultants, 2025). You do not need everyone on board. You need the right 2-3 people.
Champions are not necessarily your most technical people. They are:
When champions model adoption in their own work and talk openly about what they struggled with, they create psychological safety for others to try (and fail, and try again).
From enterprise experience: On large-scale data platform rollouts across mining operations, the teams that adopted fastest were never the ones with the best training materials. They were the ones with a respected operator on the floor who said, "Look, I was sceptical too, but this actually saves me an hour a day." Peer influence beats PowerPoint every single time.
Forget the corporate "AI awareness session" with 40 slides about machine learning. Research from HBR (November 2025) found that workers given hands-on practice opportunities are 72% more likely to report high trust in AI tools.
Instead of telling people what AI can do, show them -- on their own data, with their own tasks.
This approach works because it addresses the three main objections simultaneously: it proves the tool is useful (skills gap), shows the output transparently (trust gap), and integrates with their actual work (workflow gap).
This conversation deserves honesty, not corporate spin. Here is how to handle it with integrity.
What to say:
What NOT to say:
The reframing that works: Position AI adoption as a career skill, not a company mandate. Employees who build AI fluency are more employable everywhere. This shifts the motivation from "do this for the company" to "do this for yourself."
| Task | Current process | With AI | Improvement |
|---|---|---|---|
| Email drafting | Write from scratch (20 min) | Edit AI draft (5 min) | 75% faster |
| Data entry | Manual keying (2 hrs/day) | AI extracts, human validates (30 min) | 75% faster |
| Report compilation | Pull data from 4 systems (3 hrs) | Auto-compiled, human analyses (45 min) | 75% faster |
| Inventory forecasting | Spreadsheet guesswork (4 hrs/week) | AI forecast, human reviews exceptions (1 hr) | 75% faster |
The pattern is clear: AI handles the mechanical parts. Humans handle the judgment. The role shifts from "doer" to "reviewer and decision-maker," which is typically a more senior function.
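If you want to sanity-check the Improvement column, the arithmetic is simple. Here is a minimal sketch using the times from the table above (converted to minutes); swap in your own tasks and timings to see where your biggest wins are.

```python
# Quick check of the "Improvement" column: percentage of time saved per task.
# Times are the figures from the table above, converted to minutes.
tasks = {
    "Email drafting": (20, 5),            # minutes per email
    "Data entry": (120, 30),              # minutes per day
    "Report compilation": (180, 45),      # minutes per report
    "Inventory forecasting": (240, 60),   # minutes per week
}

for task, (before, after) in tasks.items():
    saved_pct = (before - after) / before * 100
    print(f"{task}: {before} min -> {after} min = {saved_pct:.0f}% faster")
# Each task lands at 75% faster, which is where the Improvement column comes from.
```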
McKinsey's 2025 research is unambiguous: when employees receive adequate training, they use AI tools more frequently as skill levels rise. But "training" does not mean a two-hour workshop and a PDF.
The critical insight: weekly manager check-ins raise trust scores by approximately 60% (HBR, November 2025). This is not about monitoring compliance. It is about signalling that AI use is valued, supported, and expected.
Consider a typical Australian SMB with an operations manager responsible for a sales team of 8. The team sends roughly 40 outbound emails per person per day. Each email takes 10-15 minutes to draft from scratch. The ops manager wants to introduce AI-assisted email drafting to reclaim time for actual selling.
Tool selection: A generative AI assistant integrated into the existing email client (Microsoft Copilot in Outlook at $30/user/month, or a standalone tool like Superhuman AI). The key is the tool must live where the team already works -- no new logins, no new tabs.
Budget: For 8 users at approximately $30/month each, total cost is $240/month ($2,880/year).
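Whether $240 a month is cheap or expensive depends on the hours it buys back. Below is a rough payback sketch: the licence cost and team size come from the scenario above, while the loaded hourly cost, AI-drafted emails per day, and minutes saved per email are illustrative assumptions you should replace with your own figures.

```python
# Rough payback sketch for the email-drafting scenario above.
# Licence cost and team size come from the scenario; everything marked
# ASSUMPTION is illustrative and should be replaced with your own numbers.
users = 8
licence_per_user_month = 30        # AUD, from the scenario
ai_emails_per_day = 10             # ASSUMPTION: emails drafted with AI, per person per day
minutes_saved_per_email = 8        # ASSUMPTION: e.g. 12 min from scratch vs ~4 min editing a draft
loaded_hourly_cost = 60            # AUD, ASSUMPTION: salary plus on-costs
workdays_per_month = 21

monthly_cost = users * licence_per_user_month
hours_saved = users * ai_emails_per_day * minutes_saved_per_email * workdays_per_month / 60
monthly_value = hours_saved * loaded_hourly_cost

print(f"Monthly licence cost: ${monthly_cost:,.0f}")
print(f"Hours reclaimed per month: {hours_saved:,.0f}")
print(f"Value of reclaimed time: ${monthly_value:,.0f}")
print(f"Return per dollar spent: {monthly_value / monthly_cost:.1f}x")
```

Even with conservative assumptions, the licence cost is rarely the deciding factor; the real cost of a failed rollout is the reclaimed hours that never materialise.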
Week 1 -- Champion Selection and Early Access
The ops manager identifies two champions: one top performer who is naturally curious, and one mid-performer who is vocal and respected. Both get access a week early with one instruction: "Try using it for your actual emails this week. Note what works and what doesn't."
By Friday, both champions have found their groove. The top performer uses it for prospecting emails; the mid-performer finds it brilliant for follow-up sequences after demos.
Week 2 -- The Live Demo (Not a Training Session)
The ops manager books a 30-minute team meeting. No slides. Instead, one champion opens their laptop and shares their screen:
The team sees the real output, including the imperfections, and watches the edit process. This takes 2 minutes per email instead of 12. The visual impact is immediate.
Week 3 -- Paired Practice with Low Pressure
Each team member gets access. The ops manager pairs each person with a champion for their first three emails. No mandate to use it for everything -- just "try it for three emails today and see what you think."
Common pushback at this stage:
Week 4 -- The Daily Challenge and Measurement
The ops manager sets a light challenge: "Use AI to draft at least 5 emails per day this week." No punishment for missing it. At Friday's sales meeting, they share early results.
Weeks 5-8 -- Embed and Expand
By week 5, most of the team is using AI for at least half their emails. The ops manager updates the team's email SOP: "Step 1: Generate AI draft. Step 2: Personalise and review. Step 3: Send."
The two holdouts (there are always 1-2) get individual attention. The ops manager asks: "What specifically is not working for you?" Often the answer reveals a solvable problem -- the tool does not handle their specific email type well, or they need a different prompt approach.
Consider a warehouse manager at an Australian wholesale distribution business with 65 employees. The procurement team of 4 currently uses spreadsheets to forecast demand, placing orders based on gut feel and last year's numbers. Stockouts cost the business roughly $15,000/month in lost sales. Overstocking ties up $200,000 in working capital.
Tool selection: An AI-powered demand forecasting module that integrates with their existing inventory system (such as Cin7 with forecasting add-ons, or a standalone tool like Inventoro that connects to Xero/MYOB). The tool analyses historical sales, seasonality, lead times, and supplier reliability.
Budget: Approximately $300-800/month depending on SKU count and features.
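None of these vendors publish their forecasting models, so the sketch below is not how Cin7 or Inventoro actually work. It is a deliberately simple seasonal baseline -- next month's demand estimated from the same month last year, scaled by recent trend -- that illustrates the kind of signal these tools extract from historical sales.

```python
# A deliberately simple seasonal baseline -- NOT any vendor's actual method --
# illustrating the kind of signal forecasting tools pull from historical sales:
# seasonality plus recent trend.
def seasonal_forecast(monthly_sales, season_length=12):
    """Forecast next month as last year's same month, scaled by recent trend.

    monthly_sales: list of unit sales, oldest first, at least two full seasons long.
    """
    if len(monthly_sales) < 2 * season_length:
        raise ValueError("Need at least two full seasons of history")

    same_month_last_year = monthly_sales[-season_length]
    recent = sum(monthly_sales[-3:]) / 3                        # last 3 months
    year_ago = sum(monthly_sales[-season_length - 3:-season_length]) / 3
    trend = recent / year_ago if year_ago else 1.0              # growth vs a year ago

    return same_month_last_year * trend

# Example: 24 months of sales for one SKU (illustrative numbers only)
history = [120, 110, 130, 150, 170, 200, 210, 190, 160, 140, 130, 125,
           130, 120, 145, 165, 185, 220, 230, 205, 175, 150, 140, 135]
print(f"Next month's forecast: {seasonal_forecast(history):.0f} units")
```

Commercial tools layer lead times, supplier reliability and promotions on top of this, but the core idea is the same: let the sales history speak before anyone places an order.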
Procurement teams are often the most sceptical because they have deep domain knowledge. They know their suppliers, their seasonal patterns, their weird one-off customers. Telling them "the AI knows better" is a guaranteed way to kill adoption.
Phase 1: Shadow Mode (Weeks 1-2)
The warehouse manager does not replace the existing process. Instead, AI runs in parallel.
This is non-threatening. The team keeps full control. But they start seeing patterns: "The AI caught that seasonal spike two weeks earlier than we did."
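During shadow mode the only job is scorekeeping: record the team's forecast, the AI's forecast, and what actually happened, then compare the two each week. Here is a minimal sketch of that comparison using mean absolute percentage error (MAPE); the SKU names and figures are invented for illustration.

```python
# Shadow-mode scorekeeping: compare the team's manual forecast and the AI's
# forecast against actual demand, using mean absolute percentage error (MAPE).
# SKU names and figures are made up for illustration.
def mape(forecasts, actuals):
    """Mean absolute percentage error, in percent (lower is better)."""
    errors = [abs(f - a) / a for f, a in zip(forecasts, actuals) if a != 0]
    return 100 * sum(errors) / len(errors)

# One row per SKU for the period: (manual forecast, AI forecast, actual demand)
shadow_log = {
    "SKU-1042": (400, 455, 470),
    "SKU-2211": (120, 95, 90),
    "SKU-3307": (800, 760, 745),
    "SKU-4518": (60, 80, 85),
}

manual, ai, actual = zip(*shadow_log.values())
print(f"Manual forecast MAPE: {mape(manual, actual):.1f}%")
print(f"AI forecast MAPE:     {mape(ai, actual):.1f}%")
# Review these side by side each week; the team keeps ordering exactly as before.
```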
Phase 2: Advisory Mode (Weeks 3-4)
Now the team receives AI forecasts before they place orders. They can accept, modify, or reject each recommendation. The key rule: no one is forced to follow the AI's suggestion.
The warehouse manager tracks three metrics:
Phase 3: Assisted Mode (Weeks 5-8)
Based on Phase 2 data, the team and warehouse manager review results together. Typically, the pattern looks something like this:
| Metric | Manual Forecasting | AI-Assisted | Improvement |
|---|---|---|---|
| Forecast accuracy | 65-70% | 82-88% | +15-20 percentage points |
| Stockout frequency | 12 per month | 3-4 per month | 70% fewer |
| Excess inventory value | $200,000 | $130,000 | $70,000 freed |
| Time on forecasting | 16 hrs/week | 4 hrs/week | 75% less |
At this point, the procurement team has seen the evidence with their own data. The conversation shifts from "Do we trust the AI?" to "Which categories should we let it handle automatically, and which do we want to keep reviewing manually?"
Phase 4: Ownership (Months 2-3)
The procurement team now owns the AI tool. They decide which product categories get automated ordering and which get human review. The warehouse manager's role shifts from "driving adoption" to "supporting the team's decisions about how to use the tool."
The critical factor that made this work: the team was never told to trust the AI. They were shown the data and allowed to make their own decision. Autonomy builds trust faster than mandates.
Use this checklist during the first 8 weeks of any AI rollout.
Do not expect 100% adoption in week one. Research and practical experience suggest the following timeline for most Australian SMBs.
Note: 100% adoption is not the goal. 80-90% sustained usage is a realistic and excellent outcome. The remaining 10-20% often have legitimate reasons (different role requirements, accessibility needs) that should be respected.
The gap between buying AI and actually using AI is where Australian businesses are leaving the most value on the table. Deloitte Australia (2025) found that moving from basic to intermediate AI use delivers a 45% increase in profitability, and moving from intermediate to fully enabled use delivers a 111% increase.
The difference between a wasted AI investment and a transformative one is not the technology. It is whether your team actually uses it. And whether they use it depends entirely on how you roll it out.
Need help building an adoption strategy for your specific team? Book a free 30-minute consultation and we will map out a practical plan based on your tools, team size, and industry.
This is Part 3 of 4 in our series on launching AI successfully in Australian SMBs:
Related Reading:
Sources: Research synthesised from Deloitte Australia "The AI Edge for Small Business" (November 2025), Harvard Business Review "Overcoming the Organizational Barriers to AI Adoption" (November 2025) and "Workers Don't Trust AI" (November 2025), McKinsey "Reconfiguring Work: Change Management in the Age of Gen AI" (2025), Australia Department of Industry AI Adoption Tracker (Q1 2025), BizCover Australian Small Business AI Report (2025), Slack Workforce Lab (2024), and Strategico Consultants change champion research (2025).