Your School Doesn't Need AI Training. It Needs a PLC Cycle.
You've probably already had the PD day.
Someone stood at the front of the room, walked through a tool, showed a few impressive outputs, and gave everyone 15 minutes to try a prompt. Maybe it was ChatGPT. Maybe it was MagicSchool. Maybe it was something the district purchased over the summer.
Teachers left with a login. Some felt energized. Some felt overwhelmed. Most felt both.
And then Monday came. And the tool sat unused — not because teachers didn't want to try it, but because there was nothing on the other side of the training. No structure for practice. No protected time to come back and try again. No one to ask when a prompt didn't work. No clarity about what was safe to put into the tool and what wasn't.
I wrote earlier about why AI implementation is a change management problem, not a training problem. This article is about what to do about it — and it starts with the PLC time you already have.
The Training Model Doesn't Work for AI
The training model assumes the barrier is skill. If we just show teachers how the tool works, they'll use it.
But skill isn't the barrier. The real barriers are:
Trust. Teachers don't know if it's safe to experiment with something they don't fully understand — especially in front of colleagues.
Privacy confusion. Most educators know they shouldn't put student names into AI tools. But they don't have clear decision rules for the gray areas. The uncertainty creates paralysis — and paralysis looks like resistance.
No ongoing structure. A one-day PD has no follow-up loop. Without that loop, even the motivated teachers lose momentum within weeks.
Unclear purpose. "Explore AI" is not a goal. Teachers need a specific, bounded task — something concrete enough to try this week and evaluate next week.
Training addresses none of these. Change management addresses all of them.
You've sat in this meeting. I've sat in this meeting. The format isn't new. The result isn't either.
PLCs Are Already Built for This
What AI integration needs is exactly what effective PLCs already provide:
Protected time to practice, not just watch a demo.
Collaborative accountability — trying something together is less risky than trying it alone.
A feedback loop — what worked, what didn't, what to try next week.
Facilitator support — someone holding the structure so the exploration doesn't drift into complaints or chaos.
When you position AI exploration inside the PLC cycle, you're not adding a new initiative. You're giving an existing structure a specific, timely, practical focus.
AI integration isn't an event. It's a cycle. PLCs are already built to run cycles.
Why NotebookLM Is the Right Starting Point
Google's NotebookLM works as a first tool because it eliminates the biggest fears before they become barriers:
It works with teachers' own materials — you upload a document, and the AI responds only from that source. Every output can be verified against what was uploaded. No hallucinations from unknown sources.
It doesn't require student data. Teachers are the users, not students. No names, no assessment scores, no PII.
It runs inside Google Workspace — no new account, no IT request, no app approval process.
It has bounded, practical use cases — summarize a text, generate discussion questions, create a study guide. Specific enough to try this week. The skills teachers build here — evaluating, verifying, applying AI outputs collaboratively — transfer to whatever tool comes next.
Where This Course Came From
Before building this course, I spent time working on the implementation side of edtech — helping schools adopt AI-powered tools. I saw what happens when the conditions for teacher support are right, and I saw what happens when they're not.
Sometimes the gap was the technology itself. Sometimes it was the structure around it. But the pattern I kept seeing was that even when the tool worked well, schools without a sustained support system couldn't make it stick. And when the support system was strong, teachers found ways to make even imperfect tools useful.
When leaders stayed involved, when teachers had protected time to practice, when support continued throughout the year rather than ending after a launch event — teachers experimented, reflected, and actually integrated the tool into their practice. That didn't happen by accident. It happened because people on the ground — district teams and the educators working alongside them — did the hard, daily work of making implementation real.
That experience led me back to building training content for schools. This course was my first project after that transition — built because I saw what was missing and wanted to make it available to any school leader who needed it.
What the Course Covers
The course follows a gradual release arc — from leader-led exploration to team-led practice to sustainable independent cycles.
Module 1: Why AI, Why Now — and Why Through a PLC?
Establishes the case for AI exploration, introduces NotebookLM, and uses the VOYAGE Horizons diagnostic framework to identify what's actually slowing AI adoption in your building. Is it skill? Will? System? Resources? Leaders try NotebookLM themselves before asking anyone else to. Because you shouldn't ask your staff to be beginners at something you haven't tried yet.
Module 2: Privacy, Norms, and Guided Practice
Safety comes first. Before anyone touches a tool with real content, the team co-creates norms, establishes privacy guardrails — including what I call the Golden Rule: never put student data into an AI tool — and does structured "I do / We do" practice in a public sandbox. Includes scripting for psychological safety and a Tough Conversations Cheat Sheet for handling the skeptical comments that can derail a session if you're not ready for them.
The goal isn't to convince everyone. It's to protect professional trust while keeping the collective learning on track.
Module 3: You Do Together
Teams practice independently with light-touch support. They build a grounded notebook using approved instructional texts, keep web access off so the AI stays anchored to the source material, and verify outputs using assigned roles: Driver, Source Checker, Skeptic, and Recorder. Each session produces one usable artifact — something a teacher can actually use in their classroom that week.
Verification happens collaboratively, not as a compliance checkbox. That's the difference between "we checked the box" and "we trust what we produced."
Module 4: Keep the Cycle Going
The handoff. PLC teams continue the same routine — focus, evidence, verification, one usable artifact — while rotating the instructional strategy. NotebookLM becomes optional. The PLC cycle is the engine.
This is the part most AI PD never reaches. Not because it's hard to design, but because the training model doesn't have a structure for what happens after the workshop ends. The PLC model does.
Start the Course
The course is self-paced, includes downloadable facilitator materials and planning tools, and doesn't require a sign-up to start. It's free — because the barrier to AI exploration shouldn't be the price of the training about it.
Your teachers don't need another AI demo day. They need a structure that supports them all year. This course gives you that structure, and it starts with the PLC time you already have.
→ Start the Free Course: Using PLCs as a Launchpad for AI Integration
Want to check your PLC readiness first? Take the free diagnostic to see where your PLCs are breaking down. Or explore the full VOYAGE Horizons framework for school leaders building or rebuilding their PLC structures.