Clarity, transparency and learning first: SFU faculty share their strategies for responding to AI in the classroom
As AI reshapes higher education, SFU instructors are exploring diverse ways to address its use in their courses. Their syllabus statements set boundaries while opening dialogue about what responsible and meaningful use looks like across disciplines.
Computing science lecturer Victor Cheung encourages students to use AI to support their learning but cautions against relying on it for assignments, a balance that sparks valuable discussion.
"It's a tool: use it to support your learning, not stunt it. If students are struggling with a concept, I encourage them to use AI to deepen their understanding, not replace it. Otherwise, they risk putting their future careers in jeopardy by becoming replaceable by the very tool on which they have come to rely completely. What's powerful about this approach is that it gets students to think critically about how and when they use AI. Our class has many conversations about what's appropriate and what isn't, and for me, that's the real goal: helping students become critical thinkers about AI and the world."
See Cheung's syllabus statement.
Use of ChatGPT or Other AI Tools
As mentioned in class, we are aware of these tools. I see them as helpers/tutors from which you can draw inspiration. However, these tools are not reliable, and they tend to be overly confident about their answers, which are sometimes incorrect. It is also very easy to grow a reliance on them and put yourself at risk of not actually learning anything, or even of committing academic dishonesty. Other issues include:
- When it comes to uncertainty, you won't know how to determine what is correct.
- If you need to modify or fine-tune your answer, you won't know what to do.
- You will not be able to learn the materials and thus will not be able to apply what you learn in situations where no external help is available, for example, during exams and interviews.
For introductory-level courses and less sophisticated questions, it is likely that you'll get an almost perfect answer from these tools. But keep in mind that if you can get the answer, everyone else can get it too. The bottom line is: if you ask these tools to give you an answer and you use that answer as yours, you are committing academic dishonesty by claiming work that is not done by you as yours. In particular, do not copy and paste the questions, in part or in whole, and have these tools give you the answers.
Note that different instructors might have different policies regarding the use of these tools. Check with them before you proceed with assignments from other courses.
Biomedical physiology and kinesiology professor Dave Clarke takes a similar approach, drawing clear lines around when AI use is appropriate and when it is not.
"I tell students that AI can be a valuable tool for generating ideas and providing direction. But if it's used to produce entire responses, it short-circuits the learning process. In our discipline, the ability to find, read, analyze, and understand verified knowledge is foundational. If students bypass that process, they will leave our program ill-prepared to address real-world challenges, and ultimately, that's a detriment to everyone."
See Clarke's syllabus statement.
Use of Generative AI software: Generative AI can help you generate ideas and possibly help you find sources of information. It can be used as a tool, but the standards of academic work apply. Your responses must be written in your own words and refer specifically to the information provided in the question, reference materials, scholarly references, and the course lecture notes. In citing a source, you are affirming that you have read the source, understood the material, and cited it faithfully. Citing generative AI software as a source of information is inappropriate; it is a pattern matcher, not "information."
For English lecturer Nicky Didicher, the priority is helping students grapple with the reality of AI. She invites them to use it in their assignments, so long as they are transparent about their prompts and revision process.
"Whatever their future career, our students are going to be using AI. One way I try to help them build this skill is by offering the option to complete creative writing assignments with AI, on the condition that they document their process and don't let the tool write the entire piece. In a larger, lower-division class, this approach works well because it opens space for dialogue about transparency and responsibility in AI use. It's not a perfect system; I still find some students turning to AI even when they've chosen the 'no AI' stream. But what it does do is help students confront the impact of AI on their futures as learners and workers."
See Didicher's syllabus statement.
You are permitted to use text-generating AI such as Microsoft Copilot, ChatGPT, Grammarly, or Gemini Pro as writing support for your report and analysis assignment, provided you acknowledge which AI(s) you use and how, and your own contributions to the work are more significant than those of the AI. You are required to use AI as writing support for your plan and sample assignment; see details here.
You should be aware that ChatGPT and other text-generating AIs fabricate information such as facts and quotations (so avoid using them for research, or make sure to check everything they give you). You should also be aware that if you put your work into an AI other than Copilot accessed through SFU's Microsoft 365, you are giving your intellectual property to the company without credit or compensation.
Health sciences professor Ralph Pantophlet also emphasizes building a culture of dialogue about AI use.
"I tell my students that if they want to use AI, I'm open to the possibility, but they first need to think through how they plan to use it and get permission. This signals to them that I'm not here to police or forbid behaviors, but rather to create a collaborative classroom environment. Over the term, if I notice that a student or group seems to be relying too heavily on AI, my TAs or I will have a conversation with them. Almost always, the behavior changes, and I believe that's because I've been transparent and collaborative with them from the start."
See Pantophlet's syllabus statement.
Generative AI tools: Students may use ChatGPT and other generative AI assistants for assignments in this class but must receive permission from the instructor first. Permission must be requested via Canvas email at least seven days ahead of the assignment due date and include the following:
a) how the student plans to use these tools specifically, and
b) how the student will indicate the use of the tool in their work.
Failure to obtain permission from the instructor will be considered a violation of SFU's Academic Integrity policies and will be investigated accordingly.
Health sciences professor David Whitehurst permits the use of AI with proper attribution but emphasizes that students are equally welcome to avoid it. His main goal, he explains, is to acknowledge that AI use is a new and evolving practice.
"It's important for me to let students know that, at least for now, there is no expectation that they must use AI. As a student, I would have been a slow adopter, just as I am as a researcher and instructor, and I wanted to ensure that students like me wouldn't feel anxious or disadvantaged. My priority is to provide clarity. I know that best practices will continue to evolve and that there will be better ways to integrate AI into assignments. But for now, what matters most is that I've set clear expectations, outlined responsibilities, and acknowledged that AI is something we are all still learning to navigate."
See Whitehurst's syllabus statement.
Guidance about the use of generative AI software (e.g., ChatGPT)
Much of the first paragraph has been adapted from the guidance developed by the University of Jyväskylä (; last accessed August 25th, 2025).
AI-based text editors or text generators, so-called large language models, are interactive AI applications that produce text based on user input. There has been much discussion about the use of language models in higher education, as they offer an obvious opportunity to cheat by writing essay answers with the help of AI or by rewriting plagiarized text. However, language models are likely to be integrated with word processor software in the future and may well be standard applications you will use in your employment following graduation. Therefore, there is no outright ban on their use for Midterm #1. The use of generative artificial intelligence tools is permitted for this assignment, with proper attribution (see below).
One thing that generative AI does not change is that you, the student, are always responsible for the text you are submitting. As explained in the course syllabus, the instructor and Teaching Assistants will assume that students are familiar with the norms of plagiarism; all students should be aware of SFU's Academic Honesty and Student Conduct Policies ().
The use of generative AI is not mandatory, nor is it necessarily recommended. A key aspect of Unit 1 (and, therefore, of Midterm #1) is to develop an understanding of foundational concepts that will be necessary to do well in the other HSCI 206 assignments (e.g., the in-person quizzes and the in-person final examination). AI is a means of gathering information and, in that way, it is no different from the Internet, the books in the library, or a conversation with the instructor during Office Hours. It is your decision how you use available information and how best to develop your knowledge of economics throughout the semester.
Attribution of AI tools in your assignment
When using material generated by an AI tool in an assignment, it is important that your use is transparent and appropriately referenced. Once you have selected your referencing style (see page 1 of the assignment), it is advisable to check what guidelines it provides for citing AI-generated materials.
In addition to appropriate referencing within the assignment text, you are required to include (i) the prompt(s) used to generate a response; (ii) the name and version of the AI software used; (iii) the date(s) the prompt was used; and (iv) a copy of the response(s) generated by the software. All of this information should be provided in a separate section at the end of the assignment.
These examples show that there is no one-size-fits-all approach to AI in teaching. What unites these instructors is their focus on clarity, transparency, and student learning, whether that means encouraging experimentation, setting limits, or requiring disclosure.
For more information on SFU-wide guidance on AI in learning and teaching, view the Artificial Intelligence in Learning and Teaching Task Force page, part of SFU's AI Strategy.