Guidelines for the Use of AI in Teaching at EPFL

Version 1 April 2025
The goal of these guidelines is to drive responsible AI innovation in teaching and learning at EPFL by taking an evidence-based approach and by providing a clear framework that enables teachers to make informed choices within well-defined boundary conditions. The guidelines are structured as follows:

Opportunities and ideas for the use of genAI in teaching and learning

Below, teachers will find a concise summary of opportunities for AI innovation in teaching and learning, organized by type of activity, including lectures, exercises, projects and assessments. These ideas are just a starting point; they will be updated as new research findings emerge and will be further detailed in a dedicated practical teaching guide with concrete steps. We emphasize that these ideas are meant to support teachers by easing their high workloads and creating new learning opportunities for students. However, they are not intended to replace teaching assistants or other essential resources, which are now more critical than ever in addressing the challenges these new technologies bring to higher education.

It is essential that all applications are thoroughly tested before being integrated into the learning process. The LEARN Center, together with CEDE and CAPE, offers to accompany teachers in innovating (e.g., through the DRIL fund), supporting an evidence-based approach to teaching and learning. Finally, all implementations should take into account the considerations presented in section 2.

Lectures & Exercises

  • Providing effective feedback is the teacher behaviour most often identified as strongly associated with student learning gains (e.g., Wisniewski, Zierer & Hattie, 2020). AI can serve as an innovative learning instrument, for example by providing adaptive, real-time feedback to students. The learning effect of AI-generated feedback in large-scale classes has been successfully demonstrated (e.g., Pardos & Bhandari, 2024). However, teachers should also know that in 2023, EPFL students had reservations about feedback from AI, so a human-in-the-loop approach is advised (Nazaretsky et al., 2024).
  • There is substantial evidence (e.g., Freeman et al., 2014) that students learn more in active learning environments, which encourage students to actively process information (rather than passively receiving explanations). AI could promote active processing by acting as a tutor, or by letting students test their understanding of concepts (“learning by explaining”, successfully tested by Prof. Ola Svensson in the Advanced Algorithms course).
  • You can customize your own AI assistant by providing course documents, either via prompts or through Retrieval Augmented Generation (RAG). You can experiment with RAG and assistant customization through custom GPTs (note that you should not ask students to subscribe to any third-party service to access a custom assistant that you created), locally on your own computer with Msty, or by contacting CEDE to discuss the possibility of a student-facing AI assistant for your course. Such an assistant can, for example, be instructed to answer questions about the course, to quiz students, or to engage in role-playing scenarios.
  • Teachers can use AI as a support for lecture or exercise preparation, for example to support slide design and clarity, or to help create engaging exercises, hints and quizzes. An interesting tool to explore is NotebookLM.
  • AI tools can be used to create a more inclusive environment by tailoring content delivery to diverse needs, including sensory impairments, e.g., generating alt text for images (aiaiapps at EPFL in active development), creating auditory material based on text (e.g., a podcast with NotebookLM), or transcribing lectures into text.
  • AI can be used for research, content analysis and summarization. Possible applications for lectures are preparatory literature analysis and analysis of student feedback with the objective of improving lectures.
    • AI tools like Elicit, Perplexity, Consensus and deep research can assist with research and literature analysis. The EPFL library is actively researching the topic of AI-assisted literature search and analysis and will issue further recommendations.
    • If you wish to automate content analysis, you can use an application programming interface (API) to interact with remotely hosted models (e.g., OpenAI API, Infomaniak API, etc.) or with locally hosted models (e.g., Ollama). Alternatively, you can integrate AI tools into spreadsheets (e.g., GPT for Sheets).
    • When doing systematic content analysis, consider incorporating a metric to test whether AI tools perform well on your task. You could analyze a subset of the data manually and compare your results with the results of the AI tool.
    • It is important to note that, as a general rule, regulated data, such as licensed content, data subject to official secrecy, or identifiable personal data, should not be processed using AI service providers that fail to meet legal or institutional data protection standards (see below).
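The validation step suggested above can be sketched in a few lines of Python. In this illustration, the model call is replaced by a trivial keyword heuristic (a stand-in so the example runs without any external service; a real setup would call, e.g., a locally hosted model through Ollama's API), and agreement with a manually coded subset is measured. All feedback comments and labels are hypothetical.

```python
# Sketch: validating automated content analysis against a manually coded subset.
# classify() is a placeholder for an AI model call; here it is a simple keyword
# heuristic so the example runs without any external service.

def classify(comment: str) -> str:
    """Stand-in for an AI model call; returns 'positive' or 'negative'."""
    negative_cues = ("confusing", "too fast", "unclear")
    return "negative" if any(cue in comment.lower() for cue in negative_cues) else "positive"

def agreement(manual: list[str], automatic: list[str]) -> float:
    """Fraction of items where the AI labels agree with the manual coding."""
    matches = sum(m == a for m, a in zip(manual, automatic))
    return matches / len(manual)

# A small, manually coded subset of (hypothetical) course feedback.
subset = [
    ("The examples were great", "positive"),
    ("The pace was too fast", "negative"),
    ("Slides were unclear at times", "negative"),
    ("Exercises matched the lectures well", "positive"),
]
manual_labels = [label for _, label in subset]
ai_labels = [classify(comment) for comment, _ in subset]
score = agreement(manual_labels, ai_labels)
print(f"Agreement on manually coded subset: {score:.0%}")
```

If agreement on the subset is low, the prompt or model should be revised before trusting the automated analysis on the full dataset.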


Projects & Assessment

  • Similar to lectures and exercises, AI can support teachers in preparing projects and assessments.
  • AI excels at adapting and personalizing material, allowing content to be tailored to different competency levels and diverse backgrounds. This capability can be used to create more relevant and engaging project assignments for students.
  • For assessments, AI tools can assist in creating exam questions and/or improve their clarity to ensure students understand them correctly.
  • Retrieval Augmented Generation (RAG) can also facilitate the development of a student-facing assistant capable of answering questions regarding projects and assessments based on course and institutional documents, supporting active learning of students (e.g. Freeman et al., 2014). Additionally, such an assistant could guide students on effective (learning) strategies and resources for exam preparation and working on projects.
  • As discussed in the section on lectures and exercises, AI-assisted formative feedback is a promising application. This is particularly relevant for projects and assessments, where AI tools can support teachers and teaching assistants in delivering the large-scale feedback that is necessary here. Beyond providing feedback, AI is also being tested for assisting exam grading, though many challenges remain to be addressed (e.g., Kortemeyer & Nöhl, 2024).
    • It is important to recognize that AI-assisted exam grading can be considered automated individual decision-making, requiring transparency about AI’s role in the process and explainability of its decisions. Additionally, student submissions may be considered regulated data. Currently, the Data Protection Office has not issued official recommendations on the use of AI tools for processing student submissions. Running a large language model (LLM) locally on your computer or on EPFL infrastructure is currently the safest, though technically most complex, approach. CEDE can give further advice on current possibilities.
  • The emergence of powerful AI tools also raises critical questions about assignment design, given that students can use these tools to complete their work. We recommend a careful reconsideration of assessment design, as a substantial portion of the skills and knowledge assessed at EPFL is susceptible to AI-assisted completion (Borges et al., 2024).
  • Several strategies can be employed regarding student use of AI tools in assessments:
    • Mitigating AI misuse through alternative question types, oral interviews, proctored exams, and video diaries documenting project progress (e.g., Nikolic et al., 2024).
    • Guiding students towards appropriate AI use instead of prohibiting it. Nikolic et al. (2024) outline various strategies for this. For example, an authentic assessment could ask students to use AI tools and then critically evaluate and improve the output.
    • Ultimately, teachers decide whether and how students should use AI tools for projects and assessments. The conditions and rules for AI tool usage must be explicitly communicated to students. Furthermore, it is important to take note of the rules outlined below to avoid AI plagiarism.
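The retrieval step behind the RAG-based assistants mentioned above can be illustrated with a minimal sketch. Real systems rank passages with embedding vectors; plain word overlap is used here so the example runs without any model, and all course passages are hypothetical.

```python
# Sketch: the retrieval step of a Retrieval Augmented Generation (RAG) assistant.
# Course documents are split into passages; a student question retrieves the most
# relevant passage, which would then be sent to an LLM together with the question.

def overlap_score(question: str, passage: str) -> int:
    """Number of distinct question words that also appear in the passage."""
    return len(set(question.lower().split()) & set(passage.lower().split()))

def retrieve(question: str, passages: list[str]) -> str:
    """Return the passage most relevant to the question."""
    return max(passages, key=lambda p: overlap_score(question, p))

passages = [
    "The final project report is due in week 14 and counts for 40 percent of the grade.",
    "Office hours take place every Tuesday afternoon in room INM 202.",
    "The midterm exam covers lectures 1 to 6 and is closed book.",
]
context = retrieve("When is the project report due?", passages)
# The assistant would now prompt the LLM with `context` plus the student's question,
# instructing it to answer only from the retrieved course material.
print(context)
```

Grounding the LLM's answer in retrieved course documents is what keeps such an assistant's responses tied to the actual course content rather than to the model's general knowledge.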

More information on generative AI Tools and student assessment is provided in the teaching guide.

Important considerations for the responsible implementation of genAI

Here, we outline important considerations for using AI tools in education. We address data protection, risk and responsibility, the setting of appropriate goals, and the promotion of equity and fairness, based on our current knowledge of existing regulations. For further reading: the European Commission has also published guidelines for educators on the ethical use of AI in teaching and learning.

  • Whether you can process data with AI in the context of your role as a teacher at EPFL depends on the nature of the data, the AI service provider, and the existence of a data processing agreement.
  • As a general rule, regulated data, such as licensed content (e.g., copyright, intellectual property), data subject to official secrecy, or identifiable personal data, should not be processed using AI service providers that fail to meet legal or institutional data protection standards. You should assume that these restrictions apply to all materials collected from or created by others and not rendered publicly accessible, including student-produced work like essays. 
  • However, when the appropriate level of data protection is ensured, AI tools can, in principle, be used for processing regulated data. There are several aspects of data protection to be considered, including whether the data (e.g., prompts) are used for training or advertising purposes, and whether the necessary data protection measures can be guaranteed, either through a data processing agreement or through local hosting. Here are some recommendations:
    • Most companies offering generative AI services collect and use your data by default, but you can disable this option.
    • Enterprise licences from these providers typically do not use your data by default, and they offer additional protection measures (e.g., encryption). 
    • However, enterprise licenses such as Microsoft 365 Copilot accessed via your EPFL account are currently not a secure solution for processing regulated data: EPFL has not signed a data processing agreement that would guarantee that data protection measures align with institutional needs or with Swiss legislation on personal data protection.
    • Infomaniak AI tools provide GUI and API access to open-source models including Llama and Whisper, while ensuring that data protection complies with European and Swiss laws. Everything is hosted in Switzerland on certified infrastructure.
    • If you need more control, you can use LLMs that are hosted and controlled by you or by the institution. You can download and run open-weights models (e.g., Llama, Mistral, DeepSeek R1, or Whisper for audio transcription) on your own computer with easy-to-use tools like Ollama and Msty.
  • Although there are many aspects to consider for processing regulated data, you can use any preferred AI tool for tasks such as processing publicly accessible data or open-access content, handling unregulated or correctly anonymized and therefore no longer personal data, or for generating responses to new or openly shared prompts.
  • AI can produce inaccurate or biased content, and can sometimes lead to unintentional plagiarism of existing works. 
  • Users are responsible for verifying their AI-generated content to avoid unintentional plagiarism and should back up factual statements with reliable sources. 
  • Concerning student assessments, the following rules apply:
    • EPFL rules (Lex 1.3.3, Article 4) require that all assessment material that is not the student’s personal and original contribution must be recognizable as such. Therefore, the use of AI tools should be disclosed in a statement. The use of AI-generated content in assignments without proper attribution is considered AI plagiarism. 
    • Tools that specifically aim to detect whether content was AI-generated (e.g., Turnitin) are not admissible as stand-alone evidence of AI plagiarism due to their high risk of false positives (e.g., Farrelly & Baker, 2023). Other irrefutable evidence must be presented to substantiate claims of academic dishonesty.

See the teaching guide chapter on generative AI Tools and Student Assessment for more details.
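The anonymization mentioned above, which turns personal data into data that may be processed with any AI tool, can be sketched as a simple redaction pass. The patterns below (e-mail addresses and six-digit ID numbers) are illustrative only; proper anonymization of regulated data requires a more thorough procedure and, in case of doubt, advice from the Data Protection Office.

```python
import re

# Sketch: removing direct identifiers from text before it is sent to an external
# AI service. Only e-mail addresses and six-digit ID numbers are redacted here;
# this is an illustrative subset, not a complete anonymization procedure.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
ID_NUMBER = re.compile(r"\b\d{6}\b")

def redact(text: str) -> str:
    """Replace e-mail addresses and six-digit IDs with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    return ID_NUMBER.sub("[ID]", text)

sample = "Student 123456 (jane.doe@example.com) asked for an extension."
print(redact(sample))  # Student [ID] ([EMAIL]) asked for an extension.
```

Note that redacting direct identifiers does not guarantee anonymity: free text can still identify a person through context, so the output should be checked before processing.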

  • The environmental footprint of using AI tools, compared to traditional methods, can be significant. Before using these tools, carefully weigh the benefits against the costs. A calculator is available here to get an idea of the environmental costs.
  • Define your goals and expectations clearly before using AI tools. Evaluate whether the obtained results align with these goals.
  • For example, a goal could be to use AI tools as a tutor to improve student learning of a difficult concept. You could plan an appropriate test of learning outcomes and evaluate whether the use of AI tools improved these outcomes.
  • The LEARN center can provide resources to further support an evidence-based approach.
  • Concerning students: when they rely on AI tools to complete exercises, projects and assessments, they may bypass the learning process (e.g., Bastani et al., 2024) and miss out on developing essential skills. They may perform well on tasks when they have access to AI tools, but if they have not actually learned the underlying concepts, they will perform poorly on formal assessments that do not allow AI assistance. Therefore, for students, the goal should always be to complement, never to replace, the learning process. See the teaching guide chapter on generative AI Tools and Student Assessment for more tips on how to address this.

Although AI tools are ideally used to create more inclusive environments by tailoring content delivery to diverse needs and contributing to impartiality, their use can also lead to exclusion in some cases. Ensure equity and fairness when these tools are integrated for learning or evaluation purposes. For example, ensure that all students have equal access (e.g., you should not require students to subscribe to a third-party provider), and keep track of potential biases in these tools that could negatively affect students from different backgrounds.