Provided you stick to the rules explained in the default position, we are happy for you to make use of generative AI tools throughout your studies, including in your assessment. However, there are two key principles you should pay attention to:
- Your work should always authentically represent your capabilities. What universities really care about is that your assessment genuinely reflects your understanding and skill, and represents your voice. This principle underpins all of our rules about plagiarism, misconduct and cheating. Ultimately, if you keep this in mind in how you use AI, you are likely to be on safe ground.
- You should never trust the outputs of generative AI uncritically. Generative AI isn’t designed to understand your subject or context in detail, and it can behave in odd ways, such as making up ideas, theorists or concepts entirely. So, as with any source (but perhaps more so), it is important that you treat the outputs of the tools you use critically, and test them for appropriateness.
So, for these reasons, never, ever, ever get a generative AI tool to simply produce a response to an essay question and then incorporate that as your submission. But you knew that already, didn’t you?
See Also: Guidance on Academic Integrity
This page will give you more information on the university’s broader policies on plagiarism, cheating and misconduct.
Here are some more practical ways you can approach these golden rules:
- Start and finish with your thinking. A good way to keep your work anchored authentically in your voice and capabilities is to use those as a starting point. Before you turn to any AI tools, map and sketch out your first responses. Keep these visible in your work, and return to them at the end: do any enhancements you have incorporated genuinely reflect the fact that you have learnt new things through your use of generative AI?
- Initiate coaching conversations. When you interact with generative AI tools, make sure you are in the driving seat; don’t let the bot lead the thinking. One way to do this is to prompt the tool to act like a coach rather than an expert. For instance, you might ask questions such as “Is anything unclear about my argument?”, “Am I missing or misunderstanding any key ideas?”, or “What feedback would you give against the assignment criteria?”
- It’s fine to use generative AI to prompt you. This kind of interaction is likely to throw up new things that you haven’t thought about. That’s fine, provided you genuinely take these away and use them as a stimulus to learn something new that you can bring back to your assignments. If you skip this middle step, not only are you ‘kind of cheating’, but you’re likely to end up with badly used content!
- Ask for signposts rather than final answers. Prompting the tool to direct you to other theories, concepts or places to find out about things means you can check the accuracy of answers. Use this to help structure your thinking and direct your reading, rather than as an end in itself. This will also help you stay in line with the guidance for the previous golden rule.
- Look for counterpoint and contradiction. Don’t assume that the shape of any response covers every possibility. Follow up by asking ‘Are there any other perspectives on this that I am missing?’ In fact, this is a really good thing to do in general, and one of the benchmarks of criticality.
- Use generative AI for skills that aren’t part of your learning outcomes. Every module assessment has learning outcomes which define the knowledge and skills you are being assessed on directly. These need to be an accurate reflection of your capabilities, but there are often all sorts of other technical and practical skills that are out of scope of the assessment. Using AI tools to support these is usually safer ground: for example, if you find yourself needing a diagram, and visual representation is not explicitly part of the assessment. However, this should always be read in conjunction with the final tip:
- Transparency is always a good idea! When you reflect on the final assessment, you may wish to simply acknowledge the ways in which you have used AI tools to support your assessment. If you feel uncomfortable about acknowledging something, it might be a sign that you should reflect further on its appropriateness…
See Also: Guidance on Referencing AI
Manchester Met’s Library Service has created this documentation to support you in acknowledging the contributions of AI tools to your work.