Enabling AI at Exeter
Your guide to using AI confidently and responsibly
These resources (our Enabling AI Strategy, Policy, Information Classification Scheme and AI Catalogue) are here to help our university community explore AI with confidence, in line with Strategy 2030 and our commitment to innovation, sustainability and ethical practice.
Exeter is also recognised as a leader in AI research and education. We use the power of our AI education and research to create a sustainable, healthy and socially just future. Our shared purpose and vision continue to move us forward, making a difference to our people, our communities, our partners, and the world around us.
Discover how our pioneering education, research, postgraduate research programmes and innovation partnerships are driving impact locally and globally.
In this section
Enabling AI Strategy
The University of Exeter’s Enabling AI Strategy sets out our vision to embed Artificial Intelligence responsibly and sustainably across education, research and professional services.
Our AI Policy
The University of Exeter Artificial Intelligence policy sets out clear principles for using AI at Exeter. It supports innovation while ensuring fairness, transparency and compliance with data protection and ethical standards. All staff, students and researchers should refer to the policy when using, developing or procuring an AI tool for University purposes, from drafting content to analysing data.
For students: Generative AI can support your studies if used responsibly, but in assessments it must be used according to assessment guidelines. For more information read our guide on generative AI use in assessments.
For research staff: Responsible AI use is vital for maintaining research integrity. For more information read our guidance on Responsible Use of AI in research.
For Postgraduate Research students: Further guidance will be published in the autumn.
AI Catalogue
The AI Catalogue is a dynamic list of approved AI tools for staff, students and researchers. It includes University‑licensed tools, carefully reviewed public tools, and development environments for research and innovation.
Why should I use the AI Catalogue?
- Ensure your chosen tool aligns with our Information Classification Scheme.
- Protect University and personal data while exploring AI.
- Make sustainable choices based on vendor insights.
Training and support
At Exeter we are learning together, building our confidence and skills step by step so that everyone can use AI responsibly, creatively and effectively. This is a shared journey, and we understand the importance of both community-wide and targeted support, as we are all at different points in our personal journeys.
New support is being developed all the time to keep pace with this fast-moving technology. If there’s something you’d like to explore, or a need we haven’t yet covered, please get in touch with the relevant Community of Practice or contact OTW-AIEnablement@list.exeter.ac.uk.
Staff
- Access to dedicated University of Exeter training on AI Fundamentals and Responsible AI Use in LearnUpon, alongside training from industry, Microsoft Copilot Training, and external sources through Digital Skills.
- Learn from colleagues, share ideas and get involved in AI pilots via Communities of Practice.
Educators
- Explore the EduExe AI Hub & EduExe Toolkit for useful resources related to generative AI.
- Keep up to date with the latest developments through the EduExe Blog, and see the student guidance on GenAI in Assessments and on GenAI Research Assistants.
- Learn from colleagues, share ideas and get involved in AI pilots via Communities of Practice.
- Attend our Digital Learning Support workshops (Exploring GenAI (LLMs) as an Assistant for Designing Learning, Reimagining Assessment Design in the Light of GenAI), which run regularly and can also be scheduled on request.
- Look out for AI-themed events in Expo Lab 1 hosted by the Experimentation and Innovation team, a hub for creative experimentation with AI.
- Apply for GenAI and Data-powered Learning microgrants via the Exeter Education Incubator or Global Classrooms to support AI innovation in teaching and learning.
Researchers
- Resources to support responsible and safe use of AI in research: Exeter's guidance on Responsible Use of AI in Research, UKRIO’s Embracing AI with Integrity guidance, UKRI's guidance on use of Generative AI in funding application preparation and assessment.
- Learn from colleagues, share ideas and get involved in AI pilots via Communities of Practice.
Students
- Learn more about AI and how it can be used in your studies via the AI: Understanding it guide and the Digital Skills resources and workshops homepage. You can find out how AI relates to your future career through the Career Zone's 'help with using AI' guidance.
Postgraduate Research Students
- Learn more about how to use AI in your research and assessments through our AI: Understanding it guide, which shares guidance on using AI in postgraduate research. You may also find further useful guidance through the links for researchers and students.
Join an AI Community of Practice
Communities of Practice (CoPs) bring together staff from across the University to share ideas, pilot AI tools, and champion responsible adoption in their area. These groups are central to delivering our Enabling AI Strategy and creating a culture of shared learning.
We have active CoPs in AI and Data-Powered Learning, Data, Research Coding and Research Digital Skills (for more information, contact OVR-RSAGroup@list.exeter.ac.uk or Fliss Guest).
Two additional CoPs have also launched: AI & Business Operations and AI & Student Services. Each CoP brings together representatives from relevant directorates and departments to share expertise, collaborate, and drive best practice.
Get in touch
If you have a question about the Enabling AI Strategy, need help selecting tools from the AI Catalogue, or want advice on training and policy, we’re here to help.
- For general advice contact the Digital Transformation Division.
- For tool‑specific queries consult the AI Catalogue.
- To propose an AI application or initiative for future development complete the Demand Management Form.
Frequently Asked Questions (FAQs)
Are there rules about which AI tools I can use?
Yes. Staff and students should use tools listed in the University’s AI Catalogue and follow the Information Classification Scheme to ensure information is not shared above its approved level. These requirements apply to University-related work only and do not cover personal use of AI outside the University.
How does the University ensure AI is used ethically and lawfully?
The University ensures the ethical and lawful use of AI by following relevant legislation, regulatory guidance, and principles of Responsible AI. This involves providing assurances regarding the appropriate use and accuracy of AI systems and applying additional security controls to safeguard both personal data and commercially sensitive information.
All new or configured AI algorithms are assessed to identify and mitigate risks such as bias or unintended harm. Required documentation, such as Data Protection Impact Assessments (DPIAs), is completed where necessary.
The University complies with applicable laws, including the General Data Protection Regulation (GDPR) and emerging AI-specific legislation such as the EU AI Act. Ethical practice is embedded through the principles of fairness, accountability, transparency, and sustainability. Bias audits and equality impact assessments are carried out where appropriate, particularly where AI interacts with students or staff in decision-making contexts.
The University is also a member of the QS Responsible AI Consortium (QS RAIC), reinforcing its commitment to responsible and ethical AI practice.
How does the University manage the environmental impact of AI?
Managing the environmental impact of AI is a core part of the University’s approach to AI enablement. Sustainability is considered throughout processes, from prioritising energy-efficient suppliers to assessing environmental impacts before adopting new tools.
Sustainability expertise is embedded within digital, IT, data and AI governance to ensure decisions about technology adoption, data use and AI deployment consistently reflect our environmental commitments.
Staff, students and researchers are encouraged to use AI thoughtfully, applying it where it adds clear value and favouring lower-energy models where possible.
Ongoing work includes measuring the environmental impact of AI and linking sustainability insights to the University’s AI Catalogue.
How are tools in the AI Catalogue selected and assessed?
Tools are selected using Microsoft’s Global Cloud Applications Catalogue ratings, which assess applications against regulatory certifications, industry standards, and recognised best practice. The University only recommends tools that achieve the highest security rating to ensure strong data protection and compliance.
The catalogue is regularly reviewed and updated. Key vendors have also been reviewed by the University’s Sustainability Team to support informed decision-making. Their assessment considers commitments to climate and biodiversity policies, renewable energy usage, circularity principles, power usage effectiveness (PUE) ratings, Science Based Targets initiative (SBTi) alignment, water consumption, external audits, and wider social value initiatives.
Researchers and colleagues may request the review of new tools by contacting the Research IT team.
All approved tools are listed in the AI Catalogue.
What information can I input into an AI tool?
Please read the AI Catalogue and accompanying Information Classification Scheme to identify what information you can input into each listed AI tool.
Can I use AI tools with University information?
Yes, but it depends on the type of information you plan to input into the AI tool. First, check the University’s Information Classification Scheme to identify the level of the information. Then refer to the AI Catalogue to confirm which tools are approved for that type of data.
How do I report a concern about AI use?
Concerns should be reported to Information Governance or by emailing information-security@exeter.ac.uk.
Will my data be used to train AI models?
This depends on the tool you are using and the settings you have applied.
University-provided and licensed tools
When using University-licensed tools with your Exeter account (such as Microsoft Copilot), your data is handled in line with contractual and privacy agreements, and it is not used to train foundational AI models.
Public tools
When you use public versions of tools such as ChatGPT or Google Gemini, your prompts may be used to improve and train their AI models unless you disable data training in the tool’s settings.
To request an AI tool license, submit a request via the IT Service Desk.
For help understanding Microsoft Copilot, visit the Digital Hub online for a range of digital guides. You can also visit the Digital Hub on campus (Streatham, St Luke's and Penryn) or book an online support session.
If you cannot find an answer to your question, contact the Digital Transformation Division.