
The digital divide
Across age, gender, subject areas and socio-economic groups, a clear digital divide is emerging in AI adoption. Ofcom’s Online Nation 2024 report shows that 50% of UK adult males used AI in 2024, compared with just 33% of females. Among students, a 2025 HEPI survey found that 53% are put off using AI, largely because early narratives framed AI as “cheating.” The gender gap here is sharper still: 59% of female students report being put off, compared with 45% of males. Subject disciplines show similar patterns: while 45% of STEM and Health students feel AI produces good content in their field, only 29% of Arts and Humanities students agree.
What can we do?
Scenario 1: AI is banned or ignored
If we avoid or ban AI in higher education, the divide will deepen and we will miss the opportunity to build critical understanding of AI in all our students. Without exploring and discussing AI in sessions, we cannot develop students’ critical understanding of how AI works, or of its implications for ethics, sustainability, integrity, reliability, equality and privacy. Some students will independently learn to use AI to enhance their work, fairly or unfairly, while others will miss out completely. As educators, we risk spending energy on policing “AI cheating,” an impossible task, while failing to prepare students for workplaces where AI is already embedded. This approach widens inequality, leaving behind those who most need our support.
Scenario 2: AI is co-explored with our students.
Alternatively, we can bring AI into our teaching and assessment. By co-exploring its potential and limitations, we build students’ critical understanding of ethics, sustainability, reliability, equality, privacy and academic integrity. This approach positions AI as a complement to, not a replacement for, human thinking, creativity and judgement.
Inviting AI “to the table” in our activities will allow staff and students to see its frustrations as well as its potential. It also allows us to model best practice in our disciplines and prepare graduates for future workplaces. Crucially, this approach can help reduce inequalities by ensuring that everyone benefits from AI’s opportunities.
There is a further gain: freeing up time. When used wisely, AI can reduce routine workload, giving us more space for teaching, innovation and human connection (Advance HE Education for Mental Health Toolkit). But this requires conscious protection of that time, otherwise we risk simply accelerating existing menial tasks.
In short, inclusive assessment in the age of AI demands engagement, not avoidance. By critically co-exploring AI with our students, we can narrow the digital divide, build equitable learning opportunities, and prepare all students for the future.
Bite-sized task
This task acts as a prompt for you, and where relevant your module/programme team, to reimagine what authentic, engaging and inclusive assessment could be. It makes use of the Assessment in Motion (AIM) cards created by educators in the Faculty of Arts and Humanities.
Step 1 – learn
Look through some of the resources below:
Find out about other HE staff views on AI and how they are embedding it in their teaching.
Read this blog post from Ethan Mollick on seven ways to use AI in class.
Listen to a podcast on the pros and cons of AI in higher education.
Step 2 – do
- Generate a new assessment type
Generate a random new assessment type using the AIM (Assessment in Motion) deck of cards, by either:
- using this link to generate a card: https://www.elanguages.ac.uk/aim/
- OR downloading a PDF of all the cards and generating your own random number (e.g. ask a digital assistant, or roll a die) to locate a card: AIM_AssessmentTypeDeck.pdf
If this assessment type is already used on your module, please generate another card. However, if the new assessment feels unsuitable or makes you uncomfortable, good! Stick with it: the purpose of this exercise is to use a random intervention to push new ideas, approaches and ways of thinking. Don’t worry, you are unlikely to make this your actual assessment, but it may lead to a more creative change.
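If you prefer, the "generate your own random number" step can be scripted. Below is a minimal sketch in Python; the deck size is an assumption (adjust `DECK_SIZE` to the actual number of cards in the AIM PDF), and the `exclude` option covers the case where you re-draw because a card matches your existing assessment.

```python
import random

DECK_SIZE = 30  # assumption: set this to the actual number of cards in the AIM deck


def draw_card(exclude=None):
    """Draw a random card number from the deck, skipping any already-used cards."""
    exclude = set(exclude or [])
    choices = [n for n in range(1, DECK_SIZE + 1) if n not in exclude]
    return random.choice(choices)


# Draw a card, re-drawing implicitly if card 7 is already your assessment type:
print(draw_card(exclude=[7]))
```

The excluded list simply removes cards before drawing, so a single call is enough; there is no need to loop until a non-matching card appears.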
Re-imagine your assessment
Re-imagine your module with this new assessment, consider how AI could be incorporated, and answer the following:
- Identify the top three skills and areas of knowledge you want students to develop, learn and demonstrate in your module. (Equally important is to identify, and explain to students, what you don’t want AI to replace.)
- Draw a diagram of how you would implement this new assessment in your module. Show where and how you might build in critical exploration of AI (e.g. evaluating AI tools, reflecting on their use, or comparing AI outputs with student work).
- List three potential ways students could use AI to support their learning for this module and assessment.
Step 3 – reflect
Review your thinking and ideas around this new assessment type, and reflect on what changes you now want to make to your assessment* or curriculum to close the AI digital divide and begin to critically explore AI with your students.
*Please don’t make any changes to your assessment without following the correct QA processes; speak to your Director of Programmes for further guidance.
Join the conversation
Post your thoughts on the weekly Teams post to join the conversation.
Further Links
Highly recommended books:
Mollick, E. (2024) Co-Intelligence: Living and working with AI. London: Penguin Books
This book makes a compelling case for why society must actively engage with artificial intelligence. The author is a professor at Wharton Business School, University of Pennsylvania.
Murgia, M. (2024) Code Dependent: How AI is changing our lives. London: Picador.
This book presents compelling human stories of individuals who have helped train AI systems or been directly affected by AI-driven decisions.
Written by a leading AI computer scientist, this book offers a clear and accessible overview of the history and development of artificial intelligence.
Contributor biography
Daniel Hobson is a Principal Teaching Fellow and Director of Design Programmes at Winchester School of Art. Daniel currently chairs a strategic working group as part of the Assessment Future Fit initiative within the Faculty of Arts and Humanities, focused on the pedagogical integration of Generative AI and digital tools in assessment and learning. The group draws on insights from across higher education, industry, and the student body to produce guidance and resources. This work has culminated in the development of a new educator resource designed to provoke discussion and support intervention, helping staff and students navigate assessment in an emerging GenAI landscape.
© 2025. This work is openly licensed via CC BY-NC-SA
