Guidance on Artificial Intelligence (AI) in education and research

There are currently several types of AI tools able to generate text, images and moving media. Contemporary AI tools, such as AI chatbots, are already used to some degree by students, teachers and researchers. This guidance is aimed at teachers, researchers and students at the Swedish Defence University.

A tool such as a text-oriented AI bot can formulate shorter or longer texts. The text generated depends on the prompt (question) put to it and on the AI tool in question. There are valuable use cases, for example using an AI bot to discuss a question, to get ideas for texts, or to compile texts such as a subject overview. It is important, however, to be critical of the output and always to indicate clearly when and how text has been AI-generated.

AI tools and examination

There is a clear risk that students will use AI tools in connection with examinations in a way that is not only problematic but clearly wrong. It must therefore be communicated to students when and how they are allowed to use AI tools within a course, including in regard to examinations. Teachers and students are encouraged to discuss ethics and academic integrity in relation to AI and higher education.

The Swedish Defence University recommends that the question of AI in higher education be discussed within the respective disciplines, both in subject councils (ämnesråd) and collegially, and that guidelines be formulated, at the very least at course level. The guidelines should include the following aspects:

  • that it is not permitted to use text generated by AI and submit it as one’s own. Doing so is an attempt to mislead the examiner and is therefore regarded as cheating.
  • if a student is allowed to use AI tools to discuss or improve a text, the student should account for this in accordance with the instructions given by the course co-ordinator/teacher.
  • it should be clear from the instructions whether or not a student is allowed to use AI tools in relation to peer assessment or when preparing to publicly discuss another student’s work. If it is allowed, the student should be asked to describe how the AI tool was used.

The Swedish Defence University also recommends that the different subjects/disciplines, both at a general level and within specific courses, strive to develop and improve the examination formats used, in order to prevent unsanctioned use of AI in examinations.

Examples of measures are:

  • to reconsider the examination formats used, bearing in mind that this needs to be done in a way that maintains constructive alignment.
  • to vary the examination formats used. It is pedagogically preferable if a student, during a course or a program, gets the opportunity to showcase his or her knowledge, skills and abilities in a variety of ways, both orally and in writing.
  • when using take-home examinations, also to use other formats, for example some kind of oral follow-up examination.
  • when a student is working on a written paper over a longer period of time, such as a degree project, to have a number of checkpoints where the student recounts how the text and/or solutions to problems have been created. It is, however, important to be attentive to how such a practice affects the supervisor’s role.
  • to use formative assessment to a greater degree, rather than relying solely on summative examinations at the end of courses.
  • to work with progression both within and between courses. One way to do this is to construct assignments that build on knowledge and abilities that have previously been examined; this makes it harder for a student who has cheated previously to pass yet another course.
  • to explain and emphasize to students that examinations are more than mere control tools: they are in fact an important learning activity and as such support the student’s learning process.
  • to explain and emphasize to students that the writing process is in itself an important and crucial part of the learning process.

Suspicion of an attempt to mislead during an examination

If it is suspected that a student has used an AI tool in an unauthorised manner in connection with an examination, the offence must be investigated, and if there is a well-founded suspicion of an attempt to mislead during an examination (cheating), it must be reported in the same way as in other disciplinary cases.

Guidelines for disciplinary cases and expulsion from studies (PDF, 280.7 kB).

There are currently several services that claim to determine whether or not a text is AI-generated. What these services do is analyse the word order in a section of text, so the result is a probability that the text is AI-generated, not a definitive determination. There are currently no reliable services for detecting AI-generated text. There are also tools designed to alter texts to conceal the use of AI. It is therefore very difficult to detect whether or not students are using AI bots.

It is also important to note that the Swedish Defence University does not have agreements with any services for detecting AI-generated text. Since students own the copyright to their texts, we should not upload student work to these services: it is unclear what happens to the text once uploaded, and the results the services provide are of limited value. The Swedish Defence University currently has an agreement with an anti-plagiarism service, Ouriginal. Like others in the market, the supplier is working on the ability to detect AI-generated text.

Use of AI tools by teachers and students in courses

AI chatbots are likely to become increasingly integrated into search engines and software we already use, such as Word. Our students will thus have to relate to and use AI tools in their future professional lives. It is therefore desirable for teachers to consider, and discuss with other teachers, when and how new tools such as AI chatbots can be used in teaching. Examples could include working with both colleagues and students to:

  • analyse and reflect on the benefits and problems of AI tools and the texts they generate.
  • critically analyse responses from AI tools and make students aware of the risk of inaccuracy and bias.
  • reflect on bias and how different perspectives are expressed in the automated responses.
  • compare AI tools’ responses with those written by experts.
  • reflect on how different forms of knowledge are expressed and how they are valued now that machines can independently produce text.
  • reflect on how AI technology can be involved in creating and maintaining different forms of social and global inequalities.
  • reflect on the development and use of AI from a sustainability perspective.

AI tools such as chatbots can also help teachers in areas such as lesson planning and the development of teaching materials. It is important to note, however, that the companies behind the chatbots typically do not disclose the exact databases and material on which the chatbots are trained. AI-generated material should therefore always be carefully scrutinized before being used. Another aspect to consider is that submitted texts may be used for other purposes; sensitive information should therefore never be sent to a chatbot.

Use of AI chatbots by researchers in research and for research applications

Publishers and research funders have different rules on the extent to which the use of AI chatbots is authorised. Unauthorised use could, in some contexts, probably be classified as research misconduct. For the time being, researchers are therefore urged to exercise caution when using AI chatbots in the writing of research articles and research applications. If they are used, it is important to be transparent about what was used and how. Check with the publisher or research funder concerned what currently applies to the use of AI chatbots. Doctoral students and researchers should have the expertise to develop research questions themselves and are advised against using AI chatbots for this purpose. Just as in a teaching context, it is important never to share sensitive research information with an AI tool.