1. Is AI imaging being used in the NHS?

The NHS Transformation Directorate explains that most AI technologies currently being tested in the NHS are used for imaging.

In June 2023, the Department of Health and Social Care announced a £21mn AI diagnostic fund for NHS trusts “to accelerate the deployment of the most promising AI tools across hospitals to help treat people more quickly this winter”. The government has already invested £123mn in 86 AI technologies, which is “helping patients by supporting stroke diagnosis, screening, cardiovascular monitoring and managing conditions at home”.

2. What is possible with AI in imaging?

2.1 Planning radiotherapy

The National Institute for Health and Care Excellence (NICE) has for the first time recommended the use of AI in draft guidance. Nine AI technologies can be used to help plan external beam radiotherapy treatment for lung, prostate and colorectal cancers.

Currently, following a CT or MRI scan, a radiologist would mark up, or contour, the image by hand to highlight organs at risk of radiation damage, lymph nodes and the site of the cancer, so that treatment can be targeted accurately. This task could be done by AI, NICE says, and then checked by a human practitioner. Clinical experts estimated a time saving of 10 to 30 minutes per plan, depending on the amount of editing needed.

By providing AI with large datasets, some marked as showing lesions and some as clear, the algorithm learns for itself the features of an image that are important for classification.
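The learning process described above can be sketched in miniature. The example below is purely illustrative: a toy nearest-centroid classifier over made-up four-pixel “scans”, with invented labels and values. The tools NICE assessed train far larger models on thousands of full-resolution images, but the principle — infer class features from labelled examples — is the same.

```python
# Toy supervised classifier: learn from images labelled "lesion" or
# "clear", then classify a new image. Illustrative only; real imaging
# tools use deep neural networks, not nearest-centroid rules.

def centroid(vectors):
    """Mean feature vector of one class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labelled):
    """labelled: list of (features, label) pairs."""
    by_label = {}
    for feats, label in labelled:
        by_label.setdefault(label, []).append(feats)
    return {label: centroid(vs) for label, vs in by_label.items()}

def classify(model, feats):
    """Assign the label whose class centroid is nearest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(model, key=lambda label: dist(model[label], feats))

# Hypothetical training set: brighter pixel clusters stand in for lesions.
data = [
    ([0.9, 0.8, 0.1, 0.1], "lesion"),
    ([0.8, 0.9, 0.2, 0.1], "lesion"),
    ([0.1, 0.2, 0.1, 0.1], "clear"),
    ([0.2, 0.1, 0.2, 0.2], "clear"),
]
model = train(data)
print(classify(model, [0.85, 0.7, 0.1, 0.2]))  # → lesion
```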

2.2 Finding colon cancer

An article in the journal Gut by Pu Wang et al demonstrated the use of an AI real-time detection system during colonoscopies. In the randomised controlled trial, 536 patients had a standard colonoscopy and 522 had a colonoscopy with AI real-time detection. There was no statistical difference between the groups in the detection of large adenomas, meaning the AI system located them at the same rate as human practitioners. The AI system also increased the detection of small adenomas and hyperplastic polyps (typically benign tumours).

2.3 Imaging for surgeries

AI has also been used to create 3D models from existing CT and MRI scans, giving surgeons a fuller picture of a patient’s anatomy when planning operations. King’s College London’s AI centre reported that using ‘Innersight3D’ enabled surgeons to work quickly and safely through the Covid-19 backlog of kidney surgeries at the Royal Free Hospital in London.

2.4 Locating heart scarring

Qiang Zhang et al at the University of Oxford have been exploring the use of AI to produce heart scans without injecting the dye contrast agents usually used to show heart scarring. Dye contrast agents add pain, expense and risk for some patients, particularly those with kidney failure. AI can combine multiple contrast-free MRIs with heart motion information, enhancing the image so it can be interpreted. The authors reported that their method reduced time spent in the MRI scanner from 30–45 minutes to under 15.

2.5 Reading mammograms

For breast cancer, UK company Kheiron Medical Technologies has developed mammography intelligent assessment (known as ‘Mia’), which has been designed to read mammograms. Currently in the NHS, each mammogram is checked by two radiologists, typically with the second opinion blind to the first. Mia has been designed to be the second reader. So, a human will interpret the mammogram first, then Mia will provide its reading. If Mia and the radiologist disagree, a second human reviews the images. This frees up clinical time and should allow people to receive results more quickly. Mia is being tested across 15 hospital sites in the UK.
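The double-reading workflow described above amounts to a simple arbitration rule: report when the two readers agree, escalate when they disagree. The sketch below illustrates that rule only; the reader functions, thresholds and field names are all hypothetical, and this is not Kheiron’s implementation.

```python
# Sketch of double reading with an AI second reader (illustrative only).

def report_mammogram(first_reader, ai_reader, arbitrating_reader, scan):
    """Return 'recall' (further tests needed) or 'clear'."""
    first = first_reader(scan)
    second = ai_reader(scan)
    if first == second:
        return first                 # readers agree: report directly
    return arbitrating_reader(scan)  # disagreement: a second human decides

# Hypothetical readers for demonstration.
human = lambda scan: "recall" if scan["suspicious"] else "clear"
ai = lambda scan: "recall" if scan["score"] > 0.5 else "clear"
arbiter = lambda scan: "recall"      # cautious second human opinion

print(report_mammogram(human, ai, arbiter, {"suspicious": False, "score": 0.2}))
print(report_mammogram(human, ai, arbiter, {"suspicious": False, "score": 0.9}))
```

The clinical time saving comes from the agreement path: only discordant cases need a second human reader.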

3. How does AI compare to human radiologists?

In 2017, Silicon Valley tech investor Vinod Khosla claimed that radiologists would be “obsolete” in five years. He said that AI can rapidly sift through thousands of scans to evaluate possible diagnoses and potential treatments, as well as take in the latest medical research.

Studies and commentary from radiologists point to the complexities of the role, including interpreting images alongside other patient data, creating treatment plans and explaining diagnoses and next steps to patients. For example, Giovanni E Cacciamani et al, writing in European Urology Open Science, contend that a combination of AI and human is the optimal solution. They found that a combined AI and human diagnostic process was more accurate than either just human or just AI in interpreting prostate MRIs. They also argue that AI assistance reduces radiologist burnout and enhances patient care.

A study by Susan Cheng Shelmerdine et al published in the BMJ pitted AI against 26 foundation-year radiologists to assess its ability to pass the Fellowship of the Royal College of Radiologists examination. The AI candidate passed two of 10 mock exams, while the average pass rate of the radiologists was four out of 10. The AI candidate was ‘Smarturgences’ (“smart emergencies”), a tool created by French AI company Milvue, which is in clinical use in Europe (though not in the UK) to assist in the interpretation of chest x-rays.

September’s edition of Radiology included a study from Louis Plesner et al evaluating AI’s ability to assess chest x-rays. Testing four commercially available AI tools across over 2,000 x-rays, they found the tools had moderate to high ability to detect airspace disease, pneumothorax and pleural effusion, though the tools were outperformed by radiologists. AI also produced more false positives than radiologists, which Plesner said “would result in unnecessary imaging, radiation exposure and increased costs”. AI also performed less accurately where multiple diseases were present and where lesions were small, but could, Plesner said, increase radiologists’ confidence in diagnoses by providing a second opinion.
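The trade-off Plesner describes rests on two standard metrics: sensitivity (the share of truly diseased x-rays correctly flagged) and false positive rate (the share of healthy x-rays wrongly flagged). The sketch below computes both from a confusion matrix; all counts are hypothetical, not figures from the study.

```python
# Sensitivity vs false positive rate from a 2x2 confusion matrix.
# tp/fn/fp/tn = true positive, false negative, false positive, true negative.

def rates(tp, fn, fp, tn):
    """Return (sensitivity, false_positive_rate)."""
    sensitivity = tp / (tp + fn)  # detected among truly diseased
    fpr = fp / (fp + tn)          # flagged among truly healthy
    return sensitivity, fpr

# Hypothetical readings of 1,000 x-rays, 200 of which show disease.
ai_sens, ai_fpr = rates(tp=180, fn=20, fp=120, tn=680)
dr_sens, dr_fpr = rates(tp=185, fn=15, fp=40, tn=760)

print(f"AI:          sensitivity {ai_sens:.0%}, false positive rate {ai_fpr:.0%}")
print(f"Radiologist: sensitivity {dr_sens:.0%}, false positive rate {dr_fpr:.0%}")
```

In this invented example the two readers find a similar share of true disease, but the AI flags three times as many healthy patients — each of whom would face the unnecessary follow-up imaging and costs Plesner warns about.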

4. What are the policy and implementation considerations?

The journal Insights into Imaging published a white paper from the European Society of Radiology discussing the ethical and professional impact of AI on radiology. It highlighted challenges including:

  • the resource needed to curate and label large datasets for quality AI training
  • the small number of data points for rare conditions and presentations, making AI harder to train
  • dataset bias and the generalisability between populations
  • over-interpretation of ‘noise’ or small anomalies leading to unnecessary interventions

On dataset bias, the Lancet Digital Health published a study by David Wen et al explaining that some groups are missing or under-represented in the datasets used to train AI. In particular, AI-assisted imaging for skin cancer diagnosis has been trained predominantly on lighter skin tones, contributing to under-diagnosis in people with darker skin tones.

The European Society of Radiology also flagged issues of medico-legal responsibility, recommending that AI always be used as a second or concurrent reader with a human, and never the sole interpreter of images. It also asked questions about where responsibility for diagnosis would lie:

Whether data scientists or manufacturers involved in development, marketing, and installation of AI systems will carry the ultimate legal responsibility for adverse outcomes arising from AI algorithm use is a difficult legal question; if doctors are no longer the primary agents of interpretation of radiological studies, will they still be held accountable?

The NHS Transformation Directorate is piloting an AI deployment platform to help NHS organisations share workflows and centralise deployment. It notes that varying expertise and IT infrastructure contribute to AI medical imaging technologies being adopted unevenly across the NHS. Joe Zhang et al, in their BMJ article ‘Failing IT infrastructure is undermining safe healthcare in the NHS’, argue that “there is a growing disconnect between government messaging promoting a digital future for healthcare (including artificial intelligence) and the lived experience of clinical staff coping daily with ongoing IT problems”.

5. Read more

Cover image by Anna Shvets on Pexels.