Using generative AI in exams

Starting in the winter semester of 2023/24, the Senate's recommendations for the use of artificial intelligence (AI) in exams will take effect.

Generative AI systems may generally be used as aids in unsupervised written examinations such as term papers, seminar papers, and theses, provided that their use does not conflict with the actual purpose of the examination. The decision on this is at the discretion of the respective examiner or module supervisor.

On this page, students, lecturers, and module supervisors can find more information about this.

FAQ for students | FAQ for lecturers
Declaration on the use of generative AI | Designing coursework and examinations

FAQ for students


The Senate's recommendations apply to exams registered on or after 01 October 2023.

Generative AI refers to artificial intelligence (AI)-based systems that use statistical techniques to generate new data similar to their training data. In other words, generative AI systems can mimic their training data. The output can be, for example, text, images, or even program code.

Well-known examples are ChatGPT, a chatbot for generating text; Midjourney, which generates images from text input; and Perplexity, a chat-based search engine that provides source references.

It is not always easy to judge whether generative AI is behind a piece of software or a service. The spelling and grammar checker of an office program, facial recognition on a smartphone, or an email spam filter can also be AI-based. But as a rule of thumb: If the software or service can produce an entire text, image, program code, etc., in response to a short input (called a prompt), then it is probably generative AI. An Internet search can also tell you more about the program or service.

If you are unsure whether a piece of software or a service you have used counts as generative AI and must be declared, contact your examiner.

Whether you may use generative AI is decided by your examiner. The university's Senate has recommended allowing generative AI systems as aids in unsupervised written examinations, e.g., in term papers, seminar papers, and theses, but the final decision is made by each examiner. Only they can judge whether, when generative AI is used, they can still assess whether you meet the qualification goals.

As a rule, your examiner will ask you to submit a declaration on the AI systems you have used. In it, you describe which system you used, for which step of your work, and how. This is intended not only as information for the examiners, but also as a reflection tool for yourself. We have created an example of this declaration, but clarify beforehand whether you should use this template or one of your department's own templates.

If the use of an AI system was not part of the assignment, then the use or non-use as such must not affect the assessment.

The Senate’s recommendations apply only to examinations in the Bachelor’s and Master’s degree programs. Regulations for doctoral candidates are still under discussion. 

You may only do so if the use of AI systems is part of the exam task. Otherwise, this is not allowed and constitutes an attempt to cheat.

How you prepare for an oral exam is up to you. However, you should be aware that generative AI can also provide factually incorrect answers.

It is true that generative AI systems can produce factually incorrect output. This is partly because AI systems cannot really think: they produce data that closely resembles their training data, but they cannot judge whether that output is actually meaningful or factually correct.

Nevertheless, it is possible to work with these systems in an academically sound way if you follow the rules of good research practice, which include, in addition to strict honesty, the requirement to consistently question and verify all results yourself.

At the University of Hohenheim, you are already in the right place: in many degree programs, topics related to artificial intelligence and learning systems are already part of the curriculum. The AIDAHO certificate, which lets you acquire and demonstrate AI skills, is also open to all students. The inter-university project ABBA is aimed specifically at students of Business Administration and Economics.

When you use AI systems, data is usually transmitted as well. This starts as soon as you register for a service, if you need your own account to use it. However, it is not always clear to you as a user whether the provider handles your personal data in accordance with the General Data Protection Regulation (GDPR). You should therefore exercise a certain amount of caution and always check the provider's terms and conditions and privacy policy. In general: disclose as little information about yourself as possible.

However, you must always bear this in mind: You may not pass on data from third parties without their consent! This also applies to input into AI systems, as this data can be accessed by employees of the provider and can also be used for further training of the systems. As a rule, personal, confidential, or other sensitive data must therefore not be fed into AI systems.

You may only use other data, e.g., research data that contains no personal information, if you also hold the corresponding usage rights to that data. You must therefore comply with copyright law. You can find more information on our information pages on copyright in teaching or via the external platform forschungsdaten.info, of which the University of Hohenheim is a partner institution.

First, contact your examiner. If you have very general or basic questions, you are also welcome to contact the Vice President for Academic Affairs’ office at prsl@uni-hohenheim.de.

FAQ for lecturers


The Senate's recommendations apply to exams registered on or after 01 October 2023.

Generative AI refers to artificial intelligence (AI)-based systems that use statistical techniques to generate new data similar to their training data. In other words, generative AI systems can mimic their training data. The output can be, for example, text, images, or even program code.

Well-known examples are ChatGPT, a chatbot for generating text; Midjourney, which generates images from text input; and Perplexity, a chat-based search engine that provides source references.

The Senate recommendations discussed here relate to unsupervised written examinations, e.g., term papers, seminar papers, or theses. The use of generative AI is also conceivable in other types of exams if the use of AI systems is itself part of the task.

The decision is up to you as the examiner or to the module supervisor. You should actively inform your students about your decision and, if necessary, also make your rules for the use of AI transparent.

Yes. The recommendations of the Senate leave the decision to each examiner. However, it should be noted that technical proof of the use of generative AI in creating texts is not currently possible.

Students should submit the declaration along with the usual declaration of originality. In doing so, they make the use of AI systems in the creation of the work transparent. In addition to informing you, this also serves as a tool for the students to reconstruct how they have used AI and reflect on this use. We have created a template for a declaration on the use of generative AI that you can customize according to your needs.

Reliable technical proof is not possible at present. However, if you have the impression that the declaration of originality does not correspond to the truth, for example because parts of the text do not fit together stylistically, you can proceed as in a case of suspected plagiarism. If you find that good scientific practice has been violated (e.g., because fictitious sources have been cited), this can also have a direct impact on the grading.

This depends largely on the qualification objective of the module being assessed. We have created guidelines to help you decide. However, tasks should always be designed so that they cannot be completed by AI alone.

This question is not easy to answer and depends on the context. In principle, however, given the current state of the technology, it can be assumed that preparing a written paper requires significant intellectual effort even when AI systems are used as aids. The Senate's recommendation assumes that such a significant independent contribution exists only if students transparently describe which AI systems they used and how they used them in creating the work.

On the Teaching Service Portal sub-page AI at the University: ChatGPT and Co., you will find a didactic guide and further information. In addition, a self-study course on the use of ChatGPT in teaching and on examining in a world with AI is available. You can also make an individual consultation appointment with the University Didactics Office (ASL 5).

If you allow the use of AI systems in unsupervised written examinations, you should inform students about, or raise their awareness of, some basic aspects of data protection and copyright law. This is because when students use AI systems, data is usually transferred as well. This starts as soon as they register for a service, if they need their own account to use it. Make students aware that they need to familiarize themselves with the provider's terms of use and privacy policy. If it is not clear that the service complies with the General Data Protection Regulation (GDPR), students should refrain from using it. In any case, students should exercise caution and disclose as little private information about themselves as possible.

It is essential that students know that data from third parties may not be passed on without those parties’ consent! This also applies to input into AI systems, as this data can be accessed by employees of the provider and can also be used for further training of the systems. As a rule, personal, confidential, or other sensitive data must therefore not be fed into AI systems.

Students may only use other data, e.g., research data that contains no personal information, if they also hold the corresponding usage rights to that data. They must therefore observe copyright law. You can find more information on our information pages on copyright in teaching or via the external platform forschungsdaten.info, of which the University of Hohenheim is a partner institution.

If you have general or basic questions, you are welcome to contact the Vice President for Academic Affairs’ office at prsl@uni-hohenheim.de.

Declaration on the use of generative AI

Students submit a declaration on the use of generative AI with their written work in addition to the declaration of originality. The examiner determines what exactly must be included in this statement.

Example and guide for completing the form

Declaration on the use of generative AI systems
Example for completing the declaration on the use of generative AI systems


Designing coursework and examinations

Examinations and coursework must be designed in such a way that lecturers can meaningfully assess whether students have achieved the learning objectives. For lecturers, examiners, and module supervisors, the availability of AI systems raises the question of whether existing examination formats need to be changed.

Together with a didactic handout on the use of ChatGPT in teaching, we have summarized information that may help you with this decision:

Suggestions for decision-making processes for examiners
