Speak with your instructor before using generative AI tools, such as ChatGPT, to help complete assignments.
If your instructor has not specified that you may use ChatGPT or other generative AI technology, or has specifically stated that you cannot use these tools, using them to complete part or all of your assignment will be considered academic misconduct.
If your instructor has permitted the use of generative AI tools, make sure you understand exactly what is permitted for each specific assignment.
Having a conversation about AI with your instructor can help you get the most out of your education. It gives you a chance to understand what is and is not allowed, and it supports your learning of both course concepts and AI tools. It is also a good opportunity to show that you are conscious of the ethical impacts of AI tools, which builds trust and promotes academic integrity. It is important to discuss:
If you’ve used AI tools—whether for school, creative projects, or just out of curiosity—consider sharing your experience with your instructor. You may already be using AI to brainstorm, organize ideas, or check grammar, and you might have insights, questions, or concerns worth exploring. On the other hand, if you haven’t used AI tools before, or if you’re unsure how they work or feel uncomfortable using them for an assignment, that perspective is also valuable to share.
Having an open conversation can lead to a clearer understanding of how AI can—or shouldn’t—be used in your coursework. It’s also a chance to build trust, set shared expectations, and ensure that these tools are used in ways that support your learning. Your perspective matters.
Ask your instructor what their expectations are with respect to AI use for the course, or for each assignment and test. If you are unsure whether use of AI is allowed, consult your instructor.
Instructors have different guidelines depending on the task, and it’s better to ask than risk unintentionally crossing a line. Having these conversations shows that you’re engaged, responsible, and committed to doing your best work with integrity.
Understanding how GenAI tools generate content, how they’re trained, and how to identify misleading or false information can give you insight into why some people are cautious about their use in academic settings. These tools don’t always produce accurate or trustworthy results, and that can raise concerns about fairness, integrity, and learning.
By learning how AI works, you can better understand your peers', instructors', and academic institution's perspectives. It also helps you make more informed decisions about when and how to use AI responsibly.
It is valuable for you to familiarize yourself with the core principles of academic integrity and review JIBC’s Academic Integrity Policy. AI tools can blur the lines between your own work and machine-generated content, which is why understanding where AI use overlaps with academic integrity is essential. Knowing these boundaries helps you avoid unintentional misconduct, maintain your credibility, and ensure that your learning remains authentic.
If you think your use of AI could be considered academic misconduct, it’s best to talk to your instructor before using AI for any coursework. This includes, but is not limited to, using AI to generate study notes, review class material, or assist with completing assignments.
If you are allowed to use AI in your coursework, it’s important to clearly acknowledge where and how you used it. If you are using quotations or paraphrased information generated by AI in an assignment, you must cite the information in APA Style. This helps maintain academic integrity and gives credit for the contributions AI tools make to your work.
See the Citing AI page for examples.
While AI tools can be helpful in many ways, it’s important to recognize that not all uses of AI are appropriate for academic work. Using AI in ways that do not align with your course’s expectations or academic integrity standards can lead to serious consequences.
This section will help you understand what counts as appropriate AI use, why it matters, and how to avoid unintentional misuse. Knowing the boundaries will help you use AI responsibly and maintain your academic credibility.
Reasoning: When instructors give explicit permission to use AI tools for specific assignments or tasks, it ensures that AI use aligns with course objectives and maintains academic integrity.
Example: Your instructor says you can use ChatGPT to brainstorm a research question for your assignment. You will then need to develop the question further yourself or with a friend.
Reasoning: AI tools can support effective exam preparation by helping students quiz themselves, explain confusing topics, and organize study material. When used actively (not passively copying), AI can reinforce understanding and increase retention.
Example: Generative AI can create quizzes and questions you can use to practice and prepare for exams and organize study material.
Reasoning: AI tools can support learning and writing skills by providing suggestions, explanations, and practice prompts. This helps students improve grammar, vocabulary, structure, and fluency, especially in a low-stakes setting.
Example: AI tools can support writing skills; for instance, you can use a writing assistant tool to rephrase your sentences and compare the results to better understand tone and word choice.
Reasoning: AI can help students plan study sessions, organize content, and clarify key concepts. It supports time management and active review techniques.
Example: AI tools can break down a syllabus into weekly study goals.
Reasoning: AI can act as a creative partner to help students explore ideas for assignments, projects, or presentations. It encourages divergent thinking and helps overcome creative blocks.
Example: AI tools can be used to brainstorm possible angles for a multimedia project on climate change, to generate themes for a short story in a writing class, or to explore different ways to visually represent data in a group presentation.
While AI tools can support learning, using them in ways that bypass academic expectations, misrepresent your work, or conceal the source of ideas is considered inappropriate and dishonest. These kinds of misuse can undermine your learning and lead to academic misconduct.
This section outlines common examples of inappropriate AI use, explains why they cross the line, and helps you recognize behaviors to avoid. Understanding these boundaries will help you protect your academic integrity and make informed, ethical choices.
Any use of an AI tool without your instructor’s permission is an act of academic misconduct.
Reasoning: Using AI without your instructor’s permission violates course rules. Even if the tool seems helpful, unapproved use may count as academic misconduct, especially if it influences your submitted work.
Example: Using AI to generate ideas or outline a take-home essay, even though the instructor has clearly stated that all work must be completed independently without external tools.
Reasoning: If you did not get permission from your instructor to use an AI tool for your assignment and do so anyway, you are using an unauthorized aid. Because you are submitting an assignment that was not done by you, you are misrepresenting what you can do and what you know. This is considered cheating.
Example: Copying questions from an online quiz into an AI tool to get the correct answers and then submitting them as your own, or using AI tools to generate a full research paper and submitting it as if you had researched and written it yourself.
Reasoning: Plagiarism means presenting the ideas and words of others as your own without giving proper credit to the original sources. If you are submitting an assignment that was created by an AI tool as your own creation, you are presenting the ideas of others, even if this "other" is not a human being.
Example: Using AI tools to write sections of a paper, then submitting the paper without editing or citing that the text was created by a tool, or borrowing AI-generated ideas for a discussion post but not citing or attributing the source.
Reasoning: Text-generating AI tools, such as ChatGPT, sometimes make up information and references to sources that do not exist. This is commonly referred to as AI hallucination. If you submit an assignment that contains information, research, or data that is made up and/or references that don't exist, you are committing academic misconduct, as well as submitting false claims and spreading misinformation.
Example: Asking AI tools to summarize a journal article without verifying the content, leading to incorrect interpretations in your assignment, or using AI to explain a scientific concept and citing it without confirming its accuracy or academic credibility.
Reasoning: AI tools use content from the internet to generate their output. In Canada, content in a fixed form is automatically copyrighted. For example, if you prompt a text-generating AI tool like ChatGPT to create a song similar to Leonard Cohen's "Anthem", or ask an image-generating AI tool like DALL-E to create an image using the style of a contemporary artist, you may be infringing copyright as AI tools draw from the existing works and reproduce derivatives of them. AI can also generate text that resembles existing works, even unintentionally. All of this can lead to academic or even legal consequences.
Example: Asking AI tools to rewrite a chapter of a popular novel and submitting it as your own creative writing, or using AI tools to generate a presentation based on proprietary research slides from a paywalled source.

Unless otherwise noted, this guide is licensed under a CC BY-SA 4.0 (Creative Commons Attribution-ShareAlike 4.0 International License).