
Setting up an ‘implementation and governance framework’ for Artificial Intelligence (AI) pilot studies taking place in an NHS Trust

This case study explores the implementation and governance process created by a Consultant Radiologist working in an NHS Trust.

Published on 21 October 2024 by Rebecca Evans


This case study explores the implementation and governance process created by a Consultant Radiologist working in breast screening at an NHS Trust. The Consultant Radiologist was interested in the use of AI technology to help clinicians triage and prioritise breast and chest scans. However, before undertaking any AI projects to look into the feasibility of using AI technologies, the NHS Trust where she worked recognised there was a need for a formal ‘AI implementation & governance framework’. This was to make sure that the set-up of any projects using or piloting AI was efficient and effective, and to align all relevant stakeholders.

The problem and possible solution

There is currently a national shortage of radiologists, and record waiting times for cancer treatments. AI technologies can be ‘trained’ to look at scans and ‘assess’ whether abnormalities in the scans matter. These technologies have the potential to help radiologists by highlighting the more urgent cases, so that the radiologist can prioritise the patients most in need of care and get them access to treatment more quickly. It is the responsibility of the adopter to determine whether digital technologies are the right solution for providing safe and effective care.

See our best practice guidance for understanding if digital technologies or AI are the right solution to a problem.

Setting up an implementation & governance framework for AI studies

Discovery and evaluation

The Consultant Radiologist knew of AI imaging companies whose digital healthcare technologies could potentially help, but decided that more thought was needed around how to set up AI technologies in the NHS Trust. The proposed AI implementation and governance framework would prioritise a well-thought-out structure for implementation, to be agreed and formalised before the Trust embarked on setting up any AI projects. As part of the discovery and evaluation phase, the team identified clinical priorities which aligned with the Trust’s existing vision and strategy.

Information:

“The first step was to set the vision of AI use within the hospital. This was presented to the executive team, who approved it, and helped assemble the relevant stakeholders who would come together in the AI Working Group. This was very important to get alignment across the different stakeholders.”

Read our best practice guidance for adopters on thinking about whether a medical device will meet your needs.

Building the team

Stakeholders were brought on board, including the following professionals and teams in the Trust:

  • clinicians
  • information governance
  • digital
  • IT
  • PACS
  • procurement
  • legal
  • contracts
  • research and development
  • Integrated Care System

This multidisciplinary team laid the groundwork for effective collaboration and stakeholder engagement. Recognising the need for ongoing governance and oversight, the Trust established both an AI working group and an AI project management group.

AI working group

This working group initially convened every two to three months, and later met more frequently as the need for regular discussion of AI projects grew. Stakeholders from various departments participated, emphasising the importance of internal collaboration. The group comprised:

  • clinicians
  • research finance
  • digital services
  • contracts
  • information governance
  • research AI lead
  • research and development operations manager
  • clinical AI lead
  • principal investigator
  • clinical safety for IT systems

AI project management group

This group, made up of technical and clinical experts, met monthly to oversee and manage all AI projects within the Trust. The group comprised:

  • AI clinical lead
  • AI research and development lead
  • research and development associate medical director
  • head of information governance
  • chief information officer
  • IT architecture
  • PACS team
  • clinical safety officer

Identifying opportunities and value in the use of AI

AI technologies were identified for potential pilots within the Trust, and it was important that they were strategically aligned to address existing challenges and priorities. This stage concentrated on identifying opportunities for AI implementation and assessing its potential value for the clinical pathways where there were existing challenges. Importantly, this meant asking suppliers to show, with evidence, how their technologies addressed these challenges, including providing their intended use statements as per their regulatory clearance. At this stage it was also possible to think about the potential risks involved with the identified AI technologies, including data privacy risks and biases relating to, for example, gender, patient demographics or ethnicity. Suppliers were asked to share their training data set at the time of procurement, to make sure that AI technologies would be appropriate for the diverse population within the Integrated Care System (ICS) that the Trust serves.

Information:

“We have very good relationships with our ICS - that is actually doing a lot of longitudinal data analysis across primary and secondary care and applying AI to it, so that algorithm building was happening in the background, but not in our actual hospital. So we thought, it would make sense to leverage CE approved products to solve for the bottlenecks we had in cancer imaging. So we scoped out a few areas, then we realised that to put this piece of work together, we needed funding”.

Planning for future AI projects

Acknowledging the need for funding and domain expertise, work began with the National Consortium of Intelligent Medical Imaging (NCIMI) and funding was granted. This meant that the team had access to a large portfolio of national AI research and development studies, and over the next two years the team built their knowledge of how to set up local pilot studies involving AI. The funding also allowed the AI working group to build both the technical infrastructure and the necessary expertise over this period.

Once the expertise was built, the intention to pursue more local AI projects was formalised. From there, project plans were developed, and potential pilot studies were chosen for their alignment with the Trust’s strategy and needs. The Consultant Radiologist told us:

Information:

“Because nobody else had done real world deployment, we thought the lung cancer pathway would be a good starting point. Chest X rays are a simple modality, with large backlogs, and a priority for the Trust. It would be a great pathway to show end to end operational implementation once all the safety and governance structures were in place."

Setting up a project plan and a dashboard for governance of AI studies

A formal ‘AI study set-up pathway’ was established to simplify the process and make sure AI project set-up in the Trust was consistent. The AI-specific set-up closely mirrored how research and development would set up their usual studies for research purposes. There were some important differences which strengthened the governance of the AI projects, such as including ethical approval where appropriate. To help with the AI study set-up, the following was developed:

Programme dashboard

A high-level programme dashboard provided an overview of the status of all AI projects, to help with monitoring and management.

Project plan

A detailed project plan was created for each AI study, documenting every aspect. Documenting each AI study in this way made sure there was good project oversight.

Risk management

An integrated risk management section was included in the pathway to proactively identify and mitigate potential risks associated with AI projects.

Benefits realisation

Benefits realisation made sure the Trust concentrated on high-quality implementation, and on the formal closure and archiving of completed studies.

Information:

The Consultant Radiologist told us:

“The objective for our project plan and dashboard for our research and development Artificial Intelligence study portfolio is to provide a structured, organised and transparent method for managing the life cycle of AI studies from setup to close down. This allowed the Trust to:

  • improve and simplify AI study set up and delivery.
  • make sure they worked on high quality implementation while maintaining clinical excellence.
  • formally close down and archive a completed study.”
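
The case study does not describe the format of the programme dashboard itself. As a purely illustrative sketch, each AI study could be tracked as a record moving through a defined life cycle from set-up to close-down, with a simple summary view across the portfolio. The life-cycle stages, field names and example entries below are assumptions for illustration, not the Trust’s actual tool.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class StudyStage(Enum):
    # Illustrative life-cycle stages, loosely following the set-up-to-close-down
    # pathway described in the case study; the exact stages are assumptions.
    SET_UP = "set-up"
    ACCURACY_TESTING = "accuracy testing"
    PILOT = "pilot"
    BENEFITS_REALISATION = "benefits realisation"
    CLOSED_AND_ARCHIVED = "closed and archived"


@dataclass
class AIStudyRecord:
    """One row of a hypothetical programme dashboard for an AI pilot study."""
    study_name: str
    clinical_pathway: str
    supplier: str
    stage: StudyStage
    open_risks: list[str] = field(default_factory=list)
    last_reviewed: date | None = None


def dashboard_summary(records: list[AIStudyRecord]) -> dict[str, int]:
    """Count studies at each life-cycle stage for a high-level overview."""
    summary: dict[str, int] = {}
    for record in records:
        summary[record.stage.value] = summary.get(record.stage.value, 0) + 1
    return summary


# Example usage with made-up entries.
studies = [
    AIStudyRecord("Chest X-ray triage pilot", "lung cancer pathway",
                  "Example supplier", StudyStage.ACCURACY_TESTING,
                  open_risks=["training data review outstanding"]),
    AIStudyRecord("Breast screening reader support", "breast screening",
                  "Example supplier", StudyStage.SET_UP),
]
print(dashboard_summary(studies))  # {'accuracy testing': 1, 'set-up': 1}
```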

Piloting AI – the implementation phase

Process mapping

The Consultant Radiologist told us their first step was to do a comprehensive process mapping of the entire clinical pathway. This step was crucial for understanding the current state of data collection, identifying existing pain points, and envisioning the desired evolution of the pathway.

Accuracy testing and usability assessment

The next step involved a retrospective accuracy test of the AI technology. This step was indispensable to validate the tool's performance against existing patient demographics and imaging equipment that the Trust was using.
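The case study does not share the Trust’s actual test methodology or data, but a minimal sketch of this kind of retrospective check might compare the AI tool’s outputs on historical scans against the original human decisions, broken down by patient subgroup so that any demographic bias becomes visible. The column names, the `subgroup_agreement` helper and all values below are illustrative assumptions.

```python
import pandas as pd

# Hypothetical retrospective test set: one row per historical scan, with the
# human reader's decision, the AI tool's output, and demographic fields used
# to check performance across patient subgroups. All values are made up.
scans = pd.DataFrame({
    "ai_recall":    [True, False, True, True, False, False],
    "human_recall": [True, False, False, True, False, True],
    "age_band":     ["50-59", "60-69", "50-59", "70+", "60-69", "70+"],
    "ethnicity":    ["A", "B", "A", "C", "B", "C"],
})


def subgroup_agreement(df: pd.DataFrame, group_col: str) -> pd.DataFrame:
    """Per-subgroup sensitivity and 'overcall' rate of the AI, treating the
    historical human decision as the reference standard."""
    rows = []
    for group, g in df.groupby(group_col):
        positives = g[g["human_recall"]]   # scans the human recalled
        negatives = g[~g["human_recall"]]  # scans the human did not recall
        rows.append({
            group_col: group,
            "n": len(g),
            # proportion of human recalls the AI also flagged
            "sensitivity": positives["ai_recall"].mean() if len(positives) else None,
            # proportion of non-recalls the AI flagged anyway (overcalls)
            "overcall_rate": negatives["ai_recall"].mean() if len(negatives) else None,
        })
    return pd.DataFrame(rows)


print(subgroup_agreement(scans, "age_band"))
print(subgroup_agreement(scans, "ethnicity"))
```

In practice, a test of this kind would use the Trust’s own historical imaging data and equipment, and a properly adjudicated reference standard rather than the single historical read assumed here.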

Usability testing involved exposing radiologists to the AI tool in a parallel setting alongside their regular workflow, with the tool running in ‘silent mode’. The goal was to observe interactions before, during, and after AI usage. This phase served as a pilot implementation phase, allowing for adjustments based on real-world experiences and feedback from the radiologists, who would be the primary end-users. The Consultant Radiologist told us:

Information:

“You have to make sure that the AI tool matches your patient demographics, and the equipment that you have. And this is where the whole explainable AI comes in, … anybody who's worked with AI knows the AI overcalls a lot… and in the instances where we had the AI not pick up something we thought it should, we've gone back to the company and said “show us how this AI decided not to call this back when the human did call it back”, and they have to be able to explain that to us and make changes. An accuracy test has to be carried out before you do anything.”

Read our best practice guidance for adopters on piloting digital technologies in a health or care service.

Summary

This case study highlights the importance of setting up a comprehensive implementation and governance framework before embarking on piloting AI technologies in the NHS. The experience of the Consultant Radiologist we interviewed provides valuable insights for other NHS Trusts thinking about similar AI piloting and implementation.

Important: Disclaimer

This case study is a personal account of experiences shared with us by developers or adopters of AI for health and social care. It is intended to provide insights into individual experiences but does not reflect the views or recommendations of the AI and Digital Regulations Service partners (NICE, CQC, MHRA and HRA). AIDRS emphasises that users should continue to seek and adhere to formal statutory guidance and legal requirements applicable to their specific circumstances. It is the responsibility of the legal manufacturer to comply with all applicable statutory regulations.



Get more support

To discover how the regulatory organisations can assist you and for contact details, visit our 'Get Support' page.