Tell AI exactly what you liked, what missed the mark, and what to change — then ask it to try again. Specific feedback produces dramatically better results.
Yes — use custom instructions, saved prompts, and brand voice documents to make AI consistently produce content that sounds like you.
Focus on checking specific claims — statistics, tool features, and step-by-step instructions. Skip fact-checking general advice and opinions.
Never auto-publish AI-written student assessments, legal or financial guidance, personal feedback, or anything with specific claims your students will act on.
Tell AI to write for action, not information. Every piece of content should end with something the student can do, try, or build right away.
AI uses a degree of randomness in every response, so the same prompt can produce slightly different output each time — like asking the same student the same question on different days.
Good AI output is specific, action-oriented, and sounds like you. Bad AI output is generic, vague, and could have been written for anyone.
Give AI a detailed briefing about your audience, your niche, and your teaching style at the start of every session so it writes for your people.
No — always review AI lesson content before publishing. Even great AI output needs a human check for accuracy, voice, and student safety.
Plan for 5-15 minutes of editing per piece. If you are spending longer, your prompt needs work, not more editing time.