The vast majority of students now use generative artificial intelligence (GenAI) programs on a regular basis. Can teachers get students past the principle of least effort and turn these programs into educational tools?
They read your questions without judgement, explain the answers without batting an eyelid, and will reformulate something a dozen times if needed. What's more, they're flexible and always available, and can save time on a number of tasks. It's therefore no surprise that large language models (LLMs) - above all ChatGPT - have won over students around the world. According to a survey by the Digital Education Council, which obtained responses from 3,839 bachelor's, master's and PhD students in 16 countries, 86% of students use GenAI for their studies, primarily to look up information. At EPFL, 79% of students and 61.5% of teachers use the technology, based on two surveys carried out in 2024. "I see three ways in which GenAI can be a useful tool for education: to help us understand how students learn, to automate tasks, and to personalize the learning experience through student-specific feedback and exercises. And I think the first and last ways are the most interesting," says Tanja Käser, a tenure-track assistant professor at EPFL and head of the School's Machine Learning for Education Laboratory (ML4ED).
Trading places
Ola Svensson, an EPFL associate professor, uses ChatGPT's language model as part of his teaching material. He gives a bachelor's-level class on algorithms to over 500 students. To help make the class engaging and interactive, he has students step into the role of the teacher - through the use of GenAI. "You learn something better if you have to explain it, but studies have shown that in large classes, only the top students put in that kind of effort," says Svensson. "So I created a chatbot that asks questions instead of giving answers, forcing students to do the explaining. The chatbot then gives them feedback on their explanations. Students seem to really like this approach and the data show that with this method, as opposed to reading an explanation of a concept, students answer quiz questions more quickly and accurately."
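The interaction pattern Svensson describes - the bot asks, the student explains, the bot responds with feedback - can be sketched in a few lines. Svensson's actual chatbot is LLM-based; the rule-based stand-in below, with its invented concept names and checklist feedback, is only a toy illustration of the flipped roles.

```python
# Toy sketch of a "question-first" tutoring bot: instead of answering,
# it poses a question and scores the student's explanation against a
# checklist of key points. (The real chatbot uses an LLM; the concept,
# question, and key points here are illustrative assumptions.)

PROMPTS = {
    "binary search": {
        "question": "Explain why binary search runs in O(log n) time.",
        "key_points": ["sorted", "half", "middle"],
    },
}

def feedback(concept: str, explanation: str) -> str:
    """Return feedback listing which key points the explanation covers."""
    spec = PROMPTS[concept]
    text = explanation.lower()
    missing = [kp for kp in spec["key_points"] if kp not in text]
    if not missing:
        return "Good explanation: all key points covered."
    return "Partial explanation. Consider mentioning: " + ", ".join(missing)

# Example exchange: the bot asks, the student explains, the bot replies.
print(PROMPTS["binary search"]["question"])
print(feedback("binary search",
               "Each step looks at the middle element of a sorted array "
               "and discards half of the remaining elements."))
```

An LLM replaces the keyword checklist with a genuine reading of the explanation, but the conversational structure - question first, feedback after - stays the same.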
You learn something better if you have to explain it, but studies have shown that in large classes, only the top students put in that kind of effort.
Human input is essential
Feedback is a critical part of the learning process, but giving good feedback takes skill. Patrick Jermann, head of EPFL's Center for Digital Education, is working with ML4ED on a project in this area. "We're developing a language model for use at EPFL that can help teaching assistants tutor students," he says. "The model is designed to help them formulate answers that are effective from a teaching perspective. The goal isn't to replace teaching assistants, but to make them better at what they do."
In initiatives like these, researchers use LLMs that have been enhanced with retrieval-augmented generation (RAG) technology. This technology selects the most relevant data source (scientific literature, class materials, student exercises, etc.) for responding to a given prompt, resulting in more reliable output from the GenAI program. However, input from a human with subject-specific expertise is crucial, not only to feed the RAG database but also to ensure quality feedback is given and subsequently taken on board. A recent ML4ED study found that students tend to place less trust in feedback that's given by an AI program.
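The retrieval step the article describes - picking the most relevant source before the model answers - can be shown in miniature. This is a simplified sketch, not how the EPFL systems are built: real RAG pipelines use embedding models and a vector database, and the sources and overlap scoring below are invented for illustration.

```python
# Minimal sketch of RAG's retrieval step: score each candidate source
# against the prompt by word overlap, then prepend the best match as
# context before the question is sent to the LLM. (Real deployments
# use embeddings and a vector store; these sources are made up.)

SOURCES = {
    "lecture_notes": "dynamic programming memoization overlapping subproblems",
    "exercise_sheet": "graph traversal breadth first search queue",
    "paper_abstract": "retrieval augmented generation grounds model output",
}

def retrieve(prompt: str) -> str:
    """Pick the source whose words overlap most with the prompt."""
    words = set(prompt.lower().split())
    return max(SOURCES, key=lambda name: len(words & set(SOURCES[name].split())))

def build_prompt(question: str) -> str:
    """Augment the question with the retrieved context for the LLM."""
    best = retrieve(question)
    return f"Context ({best}): {SOURCES[best]}\n\nQuestion: {question}"

print(build_prompt("explain memoization and overlapping subproblems"))
```

Grounding the prompt in a vetted source - class materials rather than the open web - is what makes the output more reliable; the human expert's job is to decide what goes into that source database in the first place.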
Learning doesn't happen without effort
That said, when students are in a hurry, they tend to forget that AI output should be taken with a pinch of salt. Francesco Mondada, a robotics professor and the academic director of EPFL's LEARN Center for Learning Sciences, lets students use programs like ChatGPT during their exams. "The quality of the answers from students who reported using AI was highly correlated with that of the answers from ChatGPT," he says. "Where the program got something wrong, the students did too - even though I had told them beforehand that I would be using ChatGPT to prepare the exam. That shows just how important it is to teach students to use GenAI systems appropriately, as early as possible in their schooling." The LEARN Center has developed interactive training programs on AI along with resources for high-school teachers. Within EPFL, the LEARN Center coordinates various initiatives to translate the results of experiments conducted on campus and elsewhere into practical advice and training for the teaching community.
GenAI programs can be effective educational tools only if they're employed correctly. In his class last year, Mondada saw that students who used ChatGPT for their exercises went through the material faster but didn't learn it as well. "The problem is that they don't acquire the necessary knowledge and skills," he says. "Learning something new requires an effort - there's no getting around it. That means we need to think about the bigger picture. Teachers should evaluate students' learning process - and not just their answers to specific questions - by, for example, having them work on projects with assessments performed along the way." That's one way of making sure students obtain a solid foundation they can rely on when facing complex problems - problems that ChatGPT will reformulate a dozen times without providing the right answer. For now.
To support teachers in the use of generative AI for educational purposes, the Center for Digital Education (CEDE), the Teaching Support Center (CAPE), and the LEARN Center, in collaboration with the AI Center, have compiled a selection of resources on this webpage. In addition to recommendations for the responsible use of AI, teachers will find concrete examples of classroom applications, a selection of suitable tools, and an overview of research conducted at EPFL related to AI and education.