Helping Students Thrive in a Changing World
We want every student to learn in a welcoming environment that prepares them for future success.
As technology like Artificial Intelligence (AI) becomes part of everyday life, YRDSB is ensuring students understand how to use these tools safely, responsibly, and in ways that support learning.
What is AI?
Artificial Intelligence (AI) is technology that can mimic intelligent human behaviour, such as reasoning, learning and problem-solving.
How AI Supports Learning
Used effectively, AI can support, not replace, student learning and classwork.
- Brainstorming and sparking creativity. AI can help students think of different topics, angles or ideas to get started on a project.
- Study support. AI can create clear study notes or quizzes to help students prepare and practice for tests, and it can explain or simplify topics they may be struggling with.
- Checking student work (not doing it for them). AI can help students find grammar mistakes and identify what might be missing from an essay's argument.
- Learning new skills. AI can provide lessons and practice questions for students who want to learn a new language or skill, like coding.
- Organizing information. AI can help organize and summarize study notes or research.
- Research help. AI can summarize information or suggest sources for assignments. Students should always double-check the facts.
- Seeing ideas differently. AI can help students explore different ideas, viewpoints and perspectives.
Talking About AI at Home
Families play an important role in helping students think critically about technology. Open conversations at home can support responsible AI use, curiosity, good decision-making and safe practices.
The questions below are optional conversation starters to help guide discussions about learning with AI and using it responsibly.
Understanding AI
- Have you heard about AI tools at school or from friends, apps or social media?
- Have you used any AI tools? What was that experience like?
- What’s the most interesting or concerning thing about AI to you?
- Can you show me how you’re using an AI tool for schoolwork or a project?
Thinking Critically About AI
- How do you know whether information from an AI tool is accurate?
- Do you usually check other sources to confirm what an AI tool tells you?
- Do you think AI answers are always fair or complete? What might be missing?
- If something from an AI tool doesn’t seem right, what should you do?
Using AI Responsibly
- When do you think AI is a helpful learning tool, and when might it replace your own learning?
- If an AI tool does most of the work for you, what do you miss out on?
- How should you tell your teacher if you used AI to help with an assignment?
- What kinds of personal information should you avoid sharing with AI tools?
Frequently Asked Questions
Students in Grades 7-12 will have access to Board-approved enterprise AI tools, such as Gemini, which meet strict security and privacy standards.
Unlike public AI platforms, enterprise tools:
- Protect student privacy by not using personal data for training
- Meet strict security requirements set by the Board
- Provide a safe, monitored environment for learning
Students will be taught how to use AI responsibly, effectively and safely to support, not replace, their learning and classwork. They will learn to think critically, be honest about how they use AI, and protect their privacy and safety.
Read the guidelines for students, families and educators to learn more.
Academic integrity is an important part of the YRDSB AI guidelines. Students will be taught that they must do their own work, give credit when using AI and never submit AI-generated work as their own.
Students will be held accountable for any incidents of plagiarism or academic dishonesty under the Board’s Equitable Assessment, Evaluation and Communication of Student Learning and Achievement Policy and Procedure.
In Kindergarten to Grade 6, students will learn about AI: what it is, how it works and how to think critically about it. They will not use AI chatbot tools. Some classroom tools (like Canva) may include AI features. In these cases, parent consent may be required.
In Grades 7-12, students will have access to Board-approved enterprise AI tools, including Gemini, which meet strict security and privacy standards. These tools do not use student data to train AI models and provide a safe, controlled environment for learning.
Critical thinking is an important part of the AI guidelines. Students will learn how to spot mistakes, misinformation, bias or “hallucinations” (when AI makes things up). They will be taught to review and think critically about AI-generated content and to verify facts.