Lesson Plans for Teachers
Structured, adaptable frameworks for introducing AI in the classroom. Pick a lesson, adjust for your students, and teach with confidence.
Lesson 1: What Is AI? (And What It Isn't)
Students explore what artificial intelligence means through hands-on activities and discussion, building a foundational understanding before they ever touch a tool.
Learning Objectives
- Define AI in their own words
- Distinguish between things AI can and cannot do
- Identify 3 examples of AI they already use in daily life
Activity Overview
Part 1 (15 min): Sorting game — students sort cards into "AI Can Do This" and "AI Cannot Do This" categories (e.g., "recognize a cat in a photo" vs. "feel happy"). Discuss surprising answers as a class.
Part 2 (15 min): AI Scavenger Hunt — students list AI they encounter daily (voice assistants, autocorrect, recommendation algorithms, etc.).
Part 3 (15 min): Group discussion: "If AI can do all these things, what makes humans special?" Students write one sentence summarizing what they learned.
Lesson 2: Prompt Engineering 101
Students learn that the quality of AI output depends on the quality of input. Through structured experiments, they discover how changing a prompt changes the result.
Learning Objectives
- Write clear, specific prompts that produce useful results
- Compare vague vs. detailed prompts and analyze the difference in output
- Understand that AI responds to instructions — it doesn't read minds
Activity Overview
Part 1 (10 min): The "Peanut Butter Sandwich" exercise. Students write step-by-step instructions for making a sandwich. The teacher follows them literally, demonstrating how computers do exactly what they're told and nothing more. Connect the results to AI prompting.
Part 2 (25 min): Prompt Lab. Students get a worksheet with 5 "bad" prompts. They rewrite each one to be more specific, test both versions, and document the differences. Example: "Tell me about space" → "Explain why Mars appears red, in 3 sentences, for a 7th grader."
Part 3 (15 min): Share best prompt transformations. Class votes on the most improved prompt. Discuss: "Why does being specific help?"
Lesson 3: The Fact-Check Challenge
Students learn that AI can be confidently wrong. They practice critical evaluation skills by finding errors in AI-generated content.
Learning Objectives
- Understand what AI "hallucination" means and why it happens
- Use reliable sources to verify AI-generated claims
- Develop a healthy skepticism toward AI-generated content
Activity Overview
Part 1 (10 min): Teacher shows 3 AI-generated "facts" about a familiar topic — one true, one half-true, one completely made up. Students vote on which is which before the reveal.
Part 2 (25 min): In pairs, students ask AI to explain a topic they're studying. They check every specific claim against at least 2 other sources (textbook, encyclopedia, official website). Worksheet: What was right? What was wrong? What was misleading?
Part 3 (15 min): Class debrief. How many errors did each pair find? Discuss: "Why does AI sound so sure even when it's wrong?" and "What's a good process for checking AI's work?"
Lesson 4: AI Ethics Debate
Students engage with real ethical dilemmas around AI use in schools, work, and society. They research, argue, and form their own positions on complex issues.
Learning Objectives
- Articulate multiple perspectives on AI ethics topics
- Construct evidence-based arguments for a position
- Recognize that AI ethics questions often don't have simple answers
Activity Overview
Part 1 (10 min): Present debate topics. Options: "Should AI-generated art win competitions?", "Should schools ban AI tools?", "Should companies have to label AI content?", "Is it ethical to use AI to write college application essays?"
Part 2 (25 min): Students are assigned (or choose) sides. They research using both AI and traditional sources, prepare 3 key arguments, and anticipate counterarguments.
Part 3 (20 min): Structured debate: opening statements (3 min per side), rebuttals (2 min per side), open Q&A (5 min). Class reflection: "Did anyone change their mind? Why?"
Lesson 5: Build Your Own AI Policy
Students draft an AI acceptable use policy for their school, applying everything they've learned about ethics, safety, and practical use. Real-world policy design meets AI literacy.
Learning Objectives
- Analyze existing AI policies from real schools and organizations
- Balance competing interests (learning vs. shortcuts, access vs. safety)
- Produce a clear, enforceable policy document
Activity Overview
Part 1 (15 min): Review 2-3 real school AI policies. What do students agree with? What's missing? What's unfair?
Part 2 (30 min): In groups of 3-4, draft a policy covering: when AI use is allowed, when it's not, citation/disclosure requirements, consequences, and support for students who need help understanding AI tools.
Part 3 (15 min): Groups present their drafts. The class votes on the best policy elements to combine into a final class-endorsed policy. Optionally, students present the result to school administration.
Need something specific?
We're adding new lesson plans regularly. Let us know what grade level, subject, or topic you'd like us to cover next.
Request a Lesson Plan