Richard Batt
How to Train Your Team to Use AI in One Day (Not One Quarter)
Tags: Leadership, Productivity
A financial services company rolled out ChatGPT access to 85 employees in September. By January, only 14 were actively using it. The rest had access but no idea what to do with it. The company spent nothing on training. They assumed smart people would figure it out. Smart people didn't. They were busy. And intimidated.
Three months later, a different company in the same industry did the same rollout with a single four-hour training session. Nine months in, 73 employees were using AI daily. The difference wasn't the tools. It was the training.
Key Takeaways
- Effective AI training takes 4-8 hours (one day or two half-days), not months
- Only 5% of workers are AI-fluent without training; the other 95% need structure
- Three elements drive adoption: concept introduction, real-task practice, and a prompt library they can copy
- The 80/20 rule: context + format + examples covers 80% of daily use
- A one-page prompt library available after training increases adoption by 6-8x
Why Training Fails (And What Actually Works)
Most companies try one of two approaches. Both fail.
Approach 1: "We'll send everyone a link to an online course." Result: 3% completion. People are busy. Self-directed learning only works for people who already want to learn.
Approach 2: "We'll hire a consultant to teach them everything about AI and transformers and large language models." Result: people understand the theory. They can't apply it. They remember nothing a week later.
What actually works is tighter: show them how to use AI to do their job better, immediately, using real examples from their role, in a single session.
The One-Day Training Structure (4-8 Hours)
This is the framework I use with clients. It's built on three elements: concept (30%), practice (50%), and tools (20%).
Morning Session (2-3 Hours): Foundation + Real Examples
Start at 9am. Finish by noon.
9:00-9:15 | Context (Not Hype)
Don't start with "AI is transforming the world." Start with a specific example they'll recognize.
"Last month, Sarah in your team used AI to rewrite an email in five seconds instead of 15 minutes. That's not speculation. That's one of your teammates. You can do the same thing, today, in this room."
Cover three things, and only three:
What AI tools do (they predict the next word, very fast, based on patterns in text). Don't explain transformers or neural networks. You don't need to.
What they're actually good at (rewriting, summarizing, generating options, format conversion, organizing information). Give one example per use case from their industry.
What they're not good at (real-time data, specialist knowledge they don't have, physical tasks, anything that requires judgment calls). Be clear about limitations. Trust builds fast when people see you're honest about boundaries.
9:15-9:45 | The Three Prompt Principles
This is the core. Everything else is detail.
Principle 1: Context. "Tell AI what role you play and what you're trying to do." Without context, AI guesses. With context, AI nails it.
Bad prompt: "Write an email." (AI doesn't know why or to whom.)
Good prompt: "You are a project manager following up with a contractor who missed a deadline. Write a professional but direct email reminding them of the deadline and asking for a revised timeline. Keep it under 150 words."
Principle 2: Format. "Tell AI how you want the output structured." JSON, bullet points, a table, a step-by-step list. AI will deliver exactly what you ask for.
Example: "Give me a 10-point checklist in this format: [number]. [Description] - [Time in minutes]."
Principle 3: Examples. "Show AI what good looks like." Two examples of the output you want, and AI will match that style and quality.
Example: "Here's an example of the tone I want:\n\nExample 1: [your example]\n\nExample 2: [your example]\n\nNow write the same thing for..."
That's it. Context + Format + Examples = 80% of what you'll ever need. Spend 20 minutes on this and people get it.
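If your team includes developers, the three principles translate directly into a reusable helper. This is a hypothetical sketch (the function name and parameters are illustrative, not part of any workshop material or vendor API):

```python
def build_prompt(context, task, output_format=None, examples=None):
    """Assemble a prompt from the three principles: context, format, examples.

    Hypothetical helper for illustration; parameter names are not a standard.
    """
    # Principle 1: context always comes first, followed by the task itself.
    parts = [f"Context: {context}", f"Task: {task}"]
    # Principle 2: state the output structure explicitly, if one is wanted.
    if output_format:
        parts.append(f"Format: {output_format}")
    # Principle 3: show what good looks like with one or two examples.
    if examples:
        shots = "\n\n".join(f"Example {i}: {e}" for i, e in enumerate(examples, 1))
        parts.append(f"Here is what good looks like:\n\n{shots}")
    return "\n\n".join(parts)

prompt = build_prompt(
    context="You are a project manager following up with a contractor who missed a deadline.",
    task="Write a professional but direct email asking for a revised timeline, under 150 words.",
    output_format="Plain-text email, no subject line.",
    examples=["Hi Sam, quick nudge on the Q2 report. Can you send a revised date by Friday?"],
)
print(prompt)
```

The point of the sketch is the shape, not the code: every prompt is the same three blocks in the same order, so weaker prompts are easy to spot (a missing block) and easy to fix.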
9:45-11:00 | Live Practice (75 Minutes)
Stop talking. Start doing.
Pick three real tasks from the room's job descriptions. Could be: rewrite a product description, create interview questions, summarize a customer complaint, turn meeting notes into action items.
Do the first one live in front of them. Type the prompt, run it, show the output, edit if needed. Narrate what you're doing: "I'm adding context here so AI understands the audience. Now I'm specifying the format because I want a table, not paragraphs."
Do the second one with them. "What context should we give AI? What format do you want? Show me an example of what good looks like." They write the prompt with you. They see it work.
Do the third one with them in pairs. They write the prompt, they run it, they report back. This is where confidence builds.
11:00-11:30 | Q&A + Concerns
People will ask: "Will this replace my job?" (No, it'll change your job.) "Is this data secure?" (That's a policy decision, not a technical one.) "What if the output is wrong?" (You review it, always.) "Where do I start?" (With the prompt library they'll get after this session.)
Answer in 30 seconds or less. Move on.
Afternoon Session (2-3 Hours): Building Their Toolkit
Break for lunch. Reconvene at 1pm.
1:00-2:30 | Role-Based Deep Dive
If your team is mixed (managers, analysts, writers, salespeople), split them by role for 90 minutes. If everyone does similar work, do one group.
For each role, build a prompt library together. Start with prompts they'll use this week:
Salespeople: email templates (follow-up, objection handling, proposal recap), call notes summarization, pipeline analysis prompts.
Managers: one-on-one preparation, feedback writing, project status summaries, goal-setting frameworks.
Analysts: data interpretation, report writing, insight generation, anomaly detection.
Writers: headline generation, outline creation, tone adjustments, SEO optimization.
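For teams that want their prompt library in version control rather than on a laminated page, it can start as something as simple as a dictionary of templates keyed by role. A minimal sketch, with hypothetical role names, template names, and placeholders:

```python
# Hypothetical prompt library: role -> named templates with {placeholders}.
# The roles and fields here are examples, not a prescribed schema.
PROMPT_LIBRARY = {
    "sales": {
        "follow_up_email": (
            "You are a salesperson following up after a call with {contact} "
            "about {product}. Write a friendly follow-up email under 120 words "
            "that recaps {key_point} and proposes one concrete next step."
        ),
    },
    "manager": {
        "one_on_one_prep": (
            "You are a manager preparing a one-on-one with {report_name}. "
            "List 5 open questions about {current_project} as numbered bullets."
        ),
    },
}

def render(role, name, **fields):
    """Fill a template's placeholders; raises KeyError if a field is missing."""
    return PROMPT_LIBRARY[role][name].format(**fields)

print(render("manager", "one_on_one_prep",
             report_name="Sarah", current_project="the Q3 migration"))
```

Keeping templates as named, parameterised strings mirrors the training: the context lives in the template, and the person only fills in the specifics for today's task.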
Don't write the prompts for them. Ask questions that make them write the prompts: "What context should AI have? What format makes sense? What does good look like?" They'll write better prompts when they're thinking about their own work.
2:30-3:00 | Implementation Plan
Give them a one-page template: "My AI Adoption Plan." Three sections.
This week: one task I'm automating (and my prompt for it).
This month: two more tasks I'm planning.
Questions I'll ask my manager if something breaks.
They write it in 15 minutes. It takes the abstract training and makes it concrete. On Monday morning, they know exactly what they're doing.
3:00-3:30 | Hands-On Lab
Everyone opens an AI tool. They run their "this week" prompt. It works (it usually does). They see it immediately. Questions drop. Confidence rises.
What They Take Home
At the end of the day, every person leaves with:
A one-page prompt cheat sheet: "Context + Format + Examples" with three examples ready to copy and edit.
Their role-specific prompt library (5-8 templates they wrote together).
Their adoption plan for the coming month.
A Slack channel or email list where they can ask questions if something breaks.
That's not optional. Without these, 60% of them will forget the training by Friday. With them, 80% will be using AI weekly by month two.
Measuring If Training Worked
Three weeks later, check three things:
Are people using the tools? (Check access logs, or just ask in a one-on-one: "Have you used the AI prompts yet?")
Are they using them the way you trained them? (Ask one person to show you a prompt they wrote. Is it following the context + format + examples pattern?)
Are they generating new prompts, or just copying the templates? (Templates are fine for month one. By month three, they should be writing their own variations.)
If the answer to any of these is no, you need a follow-up session (30 minutes, same format). But if you structure the first day right, you won't need it.
Common Mistakes (And How to Avoid Them)
Mistake 1: Mixing everyone together
A salesperson and an analyst learn different things. Split by role if you have more than 15 people in the room. It doubles engagement and halves confusion.
Mistake 2: Talking too much about how AI works
Nobody cares about transformers. They care about whether they can use this tool today. Spend 5 minutes on "how it works," 45 minutes on "how you use it."
Mistake 3: Not giving them prompts to take home
Verbal training evaporates. Written prompts stick. Print them. Send them. Make them impossible to lose.
Mistake 4: Training without access or tools set up
Before the session: every person has a login to the tool they're training on. No setup time on the day. No password resets during training. No friction.
Mistake 5: One-shot training with no follow-up
Training doesn't stick on day one. You need: an initial session (one day), a 30-minute follow-up (two weeks later), and open office hours (every Friday for a month). Adoption compounds over three months, not three days.
Frequently Asked Questions
Can we do this online instead of in-person?
Yes. It's slightly less effective (fewer questions, harder to spot confusion), but it still works. Use breakout rooms for role-based sessions. Make sure everyone has their camera on. The live examples lose impact when you're screen-sharing instead of standing next to people, but it still lands.
What if people are worried about AI replacing their job?
Address it directly at the start: "I'm not here to replace any of you. I'm here to give you a tool that handles the tedious parts of your job so you can focus on the judgment calls only you can make." Then show an example where AI automates the boring bit (reformatting, summarizing) and leaves the thinking to them. The fear usually goes away fast.
How long does adoption actually take?
Week 1: they use it in the lab and the training. Weeks 2-3: quiet. They're thinking about it but haven't started. Week 4: 40% are using it. Week 8: 65% are using it regularly. Month 6: 75% are fluent, 15% use it occasionally, and 10% still aren't engaged. Those 10% won't engage no matter what. Don't waste energy on them.
Should we train managers separately?
Yes. One session for individual contributors, one for managers. Managers need to know: how to coach their team on AI, what to look for in quality checks, how to spot when someone's using AI as a crutch instead of a tool. Different skillset.
What if we have 300 people?
Train 30 champions first (4 hours). Then have them run 30-minute sessions with their teams, using the prompt library and adoption plan template you give them. Decentralized training takes longer to launch, but it scales better and builds ownership. By week three, your champions are teaching their teams. Your organization is AI-fluent by month two.
Richard Batt has delivered 120+ AI and automation projects across 15+ industries. He helps businesses deploy AI that actually works, with battle-tested tools, templates, and implementation roadmaps. Featured in InfoWorld and WSJ.
Put This Into Practice
I use versions of these approaches with my clients every week. The full templates, prompts, and implementation guides, covering the edge cases and variations you will hit in practice, are available inside the AI Ops Vault. It is your AI department for £97/month.
Want a personalised implementation plan first? Book your AI Roadmap session and I will map the fastest path from where you are now to working AI automation.