
Research funders agree joint approach to generative AI in funding applications

by Jacob Smith | News

19 September 2023


Image credit: Shutterstock / Peshkova


A group of leading research funders, including Cancer Research UK, has agreed on a joint approach to manage the use of generative artificial intelligence (AI) tools in assessing funding applications.  

In a statement released today, the group made it clear that generative AI tools must not be used in peer-reviewing grant applications. If generative AI is used in other contexts, such as preparing funding applications, it must be clearly cited and acknowledged.  

“Artificial intelligence brings new opportunities, but also new challenges for cancer research,” said Dan Burkwood, director of research operations and communications at Cancer Research UK. 

“It’s important to ensure we are transparent about the use of generative AI tools, avoiding potential legal and ethical issues which can arise from using them. 

“Our grant applications process relies on peer review from experts in the field, providing robust feedback on scientific merit. Generative AI tools pose several risks to this process.  

“It could compromise confidentiality, jeopardise intellectual property and ultimately undermine trust in the peer review process, which is why we are taking action now.”  

Consistent standards 

The statement was agreed in response to the rise of generative AI tools like ChatGPT, which can produce long passages of human-like text, as well as images, from prompts.  

Generative AI tools can be helpful in some situations, like assisting neurodivergent researchers and reducing language barriers.  

But there are risks that they could compromise confidentiality and research integrity if used to write peer review feedback.  

The statement sets consistent standards on generative AI tools in funding applications and assessment across research funding organisations in the UK.  

Signatories to the statement are members of the Research Funders Policy Group and include the Association of Medical Research Charities, Cancer Research UK, the National Institute for Health and Care Research, the British Heart Foundation, the Royal Academy of Engineering, the Royal Society, UK Research and Innovation and the Wellcome Trust.  

“This collective position sets out our high-level expectations of how we hope to balance the opportunities AI might bring to researchers while ensuring that the research we fund is conducted responsibly,” said Alyson Fox, director of research funding at Wellcome.  

“We will continue to monitor and evaluate this approach as we develop our own, detailed funding policies.” 

Adapting to an ever-changing landscape 

Cancer Research UK has published its full policy on the use of generative AI tools in funding applications – the first medical research charity to do so. The policy stipulates that generative AI should not be used in peer review. It urges researchers to exercise caution if using it to prepare funding applications, with full acknowledgement of the software and prompts used.  

“AI is changing every aspect of how we devise and conduct research,” Burkwood added.  

“We’re putting our policy in place now, and working collaboratively with other funders, so that we can adapt to rapid technological developments in AI.”  

    Comments

  • Liz
    25 October 2023

    Brain tumours that have no cure in uk
