For each module, give Claude the one thing students should be able to do after completing it, then ask it to design a practice exercise where students apply that skill to their own real business or teaching situation. Exercises built around a student’s actual context are completed at far higher rates than generic practice problems.
Why Generic Exercises Don’t Get Done
Most course exercises fail for one reason: they’re abstract. “Complete this worksheet” or “apply the framework to a hypothetical scenario” gives students no reason to engage beyond compliance. When the exercise has nothing to do with their actual situation, it feels like homework in the worst sense — obligatory busywork that can be skipped without consequence.
The exercises that actually get completed are the ones where students can see immediately why doing the work benefits them. Ask a coach to map their own client onboarding process, and they’ll do it. Ask them to map a fictional character’s onboarding process, and they’ll skip it. The real-context requirement is the difference.
The Prompt Structure That Works
Give Claude three things: the module’s learning objective, a brief description of your typical student, and the format preference (written reflection, filled-in template, action step, or short project). Then ask: “Design a practice exercise for this module that requires students to apply the concept to their own real business or teaching situation. The exercise should take 15 to 30 minutes and produce something the student can actually use.”
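For anyone wiring this into a workflow rather than typing prompts by hand, the three inputs can be assembled into a reusable template. This is a minimal sketch; the function name and field labels are illustrative, not part of any official tool.

```python
def build_exercise_prompt(objective: str, student_profile: str, fmt: str) -> str:
    """Assemble the module objective, student description, and format
    preference into the exercise-design prompt described above.

    Hypothetical helper for illustration; the closing request wording
    follows the article's recommended phrasing.
    """
    return (
        f"Module learning objective: {objective}\n"
        f"Typical student: {student_profile}\n"
        f"Preferred exercise format: {fmt}\n\n"
        "Design a practice exercise for this module that requires students "
        "to apply the concept to their own real business or teaching "
        "situation. The exercise should take 15 to 30 minutes and produce "
        "something the student can actually use."
    )

prompt = build_exercise_prompt(
    "Map a repeatable client onboarding process",
    "Independent coaches running a small service business",
    "filled-in template",
)
```

The resulting string can be pasted into Claude directly or sent through whatever interface you already use, so each module's exercise prompt stays consistent while only the three inputs change.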
That last phrase — “something the student can actually use” — is the key. Claude will generate exercises with tangible outputs: a draft email, a filled-in framework, a list of five examples from their own experience, a revised lesson plan. These outputs double as portfolio pieces and implementation tools, which means completing the exercise creates direct value beyond the learning itself.
Ask Claude to also write the exercise debrief — a set of questions students should reflect on after completing it. Good debrief questions sound like: “What surprised you about doing this? What felt harder than expected? What would you change now that you’ve worked through it?” These prompts turn a solo exercise into community discussion material.
What This Means for Educators
Courses with strong exercises generate stronger testimonials. Students who complete the exercises have concrete outcomes to report — and those outcomes become the evidence that your course works. A student who did the module exercise and got a tangible result will write a testimonial ten times more compelling than one from a student who just watched the videos.
Building AI-generated exercises into every module also reduces the live session Q&A burden. When students have already worked through a practice exercise before the live call, they arrive with specific questions about their real situation rather than generic questions about the concept. Your live time becomes more valuable.
The Simple Rule
Every module should produce something. Let AI design the exercise so every lesson ends with a student holding a real output, not just a filled notebook. Outputs become testimonials. Testimonials become enrolments.
