Doctoral supervisors: how to talk about GenAI with your students


It can seem like GenAI is a ubiquitous feature of teaching and learning for taught students. However, as a doctoral supervisor, have you had conversations with your doctoral students about their use of GenAI during their research? Do you feel unsure about how to start these conversations, or worried that you may not have sufficient knowledge of GenAI to provide support and guidance? As disclosures about the use of GenAI tools become commonplace across the academic sector – from institutional submission declarations to disclosures when submitting work for academic publication – there is no ignoring the fact that doctoral researchers will likely be acquainted with such tools and may already be using them to support their projects.

As academic developers working with doctoral researchers, we have heard anecdotally from several individuals about how they have used, or planned to use, GenAI in their research, and it is clear that some do not know how to broach the subject of appropriate GenAI use with their supervisors.  

Research has shown that doctoral candidates are using GenAI to support literature searching, coding and programming, and academic writing (English, Nash and Mackenzie, 2025). In fact, some respondents even noted using GenAI to answer questions they would ordinarily have asked their supervisors but decided not to. Some will be confident users, aware of the benefits, shortcomings, and boundaries of acceptable use of GenAI. Others may be less confident about the various tools available, and uncertain as to how they should be used appropriately during their doctorate.

It is imperative that transparent conversations take place about how to appropriately use GenAI during doctoral research. However, it is understandable that if you as the supervisor do not currently use GenAI yourself, or are not sure how some of the tools work, this can seem like an overwhelming prospect. Nevertheless, it is worth taking some time to upskill in preparation for ongoing dialogue with your doctoral students about their use of GenAI.

Bite-sized task 

The activity below scaffolds a reflective approach to GenAI that can then be taken into supervisory sessions, using the additional prompts outlined in Step 2. Using a blank Word document (or a notebook!), have a think about the following questions.

Step 1 – reflect (part 1) 

Current understanding of GenAI tools 

  • How familiar am I with how GenAI tools (e.g., ChatGPT, Copilot, Grammarly) are being used in academic writing and research?  
  • Are there any gaps in my understanding of commonly used GenAI tools, such as those above? 
  • Have I encountered any examples of students (at any level) using AI in ways that raised concerns or seemed helpful?  

What is my personal positioning? 

  • What do I believe is an appropriate vs. inappropriate use of GenAI in thesis work? 
  • Where might I draw the line between assistance and academic misconduct? 
  • How do these views align (or not) with my discipline’s norms or expectations? (i.e., what is the stance of major publishers in my field?) 

Policy awareness 

  • Do I know what guidance my institution currently offers on student use of GenAI?
  • Am I confident explaining these policies to students – or do I need to seek clarification?  

Step 2 – do 

Once you have recorded your reflections (as honestly as possible) on the above questions, assess whether there are any gaps in your knowledge that you need to plug in order to have meaningful conversations with your doctoral supervisee(s). Prompts for discussion in doctoral supervision meetings could include the following: 

  • Have you used GenAI in any of your previous academic writing, research projects, or professional roles? If so, in what ways? 
  • Are you familiar with UoS policies and guidance on the use of GenAI in research and writing? Would it be helpful for us to review those together to clarify expectations? 
  • What kinds of research tasks (e.g., literature review, data analysis, writing support) do you think GenAI could assist with in your doctoral work? 
  • What do you see as the potential benefits and risks of using GenAI for those tasks, especially in terms of research integrity and critical thinking? 
  • How do you plan to keep track of when and how you use GenAI in your research process? Would a shared log or reflective journal be useful? 

Step 3 – joint reflection 

Once an initial conversation based on the above questions has taken place, reflect on the following: 

  • Based on our conversations, where do we align or diverge in our thinking on the use of GenAI in research? 
  • Can we co-create a supervision plan that includes checkpoints or reviews of GenAI use (e.g., monthly reflections, reviewing outputs)? 
  • Are there professional development opportunities (e.g., workshops, training) that could enhance our understanding of GenAI in research, and how to use such tools ethically and with integrity? 

Further links 

English, R., Nash, R., & Mackenzie, H. (2025). ‘A rather stupid but always available brainstorming partner’: Use and understanding of Generative AI by UK postgraduate researchers. Innovations in Education and Teaching International, 1–15.  

Advice for UG, PGT and PGR students at UoS on using GenAI 

LSE – Introduction to Generative AI for researchers 

Porsdam Mann, S., Vazirani, A. A., Aboy, M., Earp, B. D., Minssen, T., Cohen, I. G., & Savulescu, J. (2024). Guidelines for ethical use and acknowledgement of large language models in academic writing. Nature Machine Intelligence, 6(11), 1272–1274.  

UoS – Using generative artificial intelligence during your studies 

Contributor biographies 

Dr Rebecca Nash is a Senior Teaching Fellow in Academic Practice in the Centre for Higher Education Practice. She has worked in academic development since 2016, and currently leads on academic skills provision for doctoral researchers, where GenAI is a constant feature of discussion, concern, enthusiasm, and disdain.  

This post draws on work carried out in collaboration with Dr Heather Mackenzie and Dr Ross English. 

© 2025. This work is openly licensed via CC BY-NC-SA