The English version of this editorial was translated by UiO GPT.
The conversation about AI so far has largely been characterized by the perception of AI as a tool for cheating on exams. To the extent that there are guidelines for the use of AI in teaching and assessment here at the University of Oslo (UiO), they often focus on how students should not use AI. But AI is not just a cheater's tool; it is also an important tool for learning. We have an obligation to help students understand how they can use AI to enhance their learning outcomes and distinguish between good and bad ways to use AI.
There seems to be a discrepancy between students' and educators' use and perception of AI as a learning tool. On the one hand, there are probably many students who are already using AI in their studies, but due to a lack of guidelines for use, they are cautious about discussing it openly for fear of being labeled as cheaters. On the other hand, my impression is that many educators barely engage with AI. They do not consider it relevant to their own teaching and perhaps operate with a general recommendation that students should not use it either. But both students and educators are calling for more support from the faculty and clearer guidelines for the use of AI in teaching and assessment. This is an important motivation for establishing a working group on AI in teaching and assessment at the Faculty of Social Sciences.
The working group includes members from the academic and administrative staff, EILIN, and the student body, and has been given a fairly broad mandate, developed by the faculty's educational leaders in collaboration with me. The working group is to:
- Explore AI's characteristics as a learning tool relevant to the Faculty of Social Sciences
- Assess students' need for and use of AI
- Evaluate whether there is a need to expand the set of AI tools allowed for use in teaching at UiO and outline a possible process for doing so
- Investigate the variety of guidelines for AI use that already exist at UiO, and
- Assess needs and develop guidelines for the use of AI in teaching and assessment at the Faculty of Social Sciences.
Many would probably like clear and unambiguous guidelines at an overarching level for what can and cannot be done with AI in most teaching-related situations. But the use of AI is an academic question that must be viewed in light of what we want to achieve in a given teaching situation, and of whether, and if so how, AI can help us achieve the desired learning outcomes. This means that guidelines for AI use must be developed at the course level rather than for the faculty as a whole. Each educator must assess how AI can contribute to the desired learning outcomes and organize teaching and assessment accordingly. It also means that students will have to navigate variation in what counts as permitted AI use across courses and teaching situations, and actively seek out information about permissible and recommended AI use, for example in program and course descriptions.
It is also important to remember that AI use has significant environmental consequences, and these should be taken into account when we assess how to use it.
The working group will help ensure that both educators and students can safely explore the use of AI in their work, thereby taking a small step towards a teaching environment adapted to a world with AI.
An opportunity for employees at the Faculty of Social Sciences to explore AI already presents itself on February 3rd, when Omid Mirmotahari and Yngvar Berg from the Department of Informatics visit the Faculty. In collaboration with LINK, they have developed a workshop on using AI in assessment and feedback. The workshop runs from 9 AM to 12 PM, and anyone who wishes to participate can contact Kaja Braathen at EILIN at kaja.braathen@sv.uio.no.