
Experts call for action on medical devices prone to biases

The panel found evidence that pulse oximeters can overestimate the amount of oxygen in the blood of people with darker skin tones (Peter Byrne/PA)

Experts are calling for action on medical devices that are prone to unfair biases, including blood oxygen monitors and certain artificial intelligence (AI) enabled tools, to prevent harm to ethnic minorities and women.

A report details the findings of the Independent Review of Equity in Medical Devices which looked at the extent and impact of ethnic and other unfair biases in the performance of equipment commonly used in the NHS.

It focused on optical devices such as pulse oximeters, AI-enabled devices and certain genomics applications, where evidence suggested there was substantial potential for harm.

The panel found evidence that pulse oximeters (blood oxygen monitors) – widely used during the Covid-19 pandemic – can overestimate the amount of oxygen in the blood of people with darker skin tones.

This could lead to a delay in treatment if dangerously low oxygen levels were missed.

The experts say they did not specifically look at the use of these devices during the pandemic, but because there was an overwhelming number of people with very low oxygen levels “the likelihood is that that inaccuracy was large at that time”.

Daniel Martin, professor of perioperative and intensive care medicine at Peninsula Medical School, University of Plymouth, said: “We can only say that there’s association between the harm and the inaccuracy and not causation.

“But I think it’s a reasonably strong signal that there’s a potential of harm there, particularly during Covid when oxygen levels are so very low.”

The review makes a number of recommendations in relation to the devices, including that patients be advised to look out for other symptoms such as shortness of breath, chest pain and a fast heart rate.

It also suggests researchers and manufacturers should work to produce devices that are not biased by skin tone.

On AI-enabled devices, the review found evidence of potential biases against women, ethnic minorities and disadvantaged socioeconomic groups.

It highlights potential underdiagnosis of skin cancers for people with darker skin when using AI-enabled devices.

The report suggests this is a result of the machines being trained mainly on images of lighter skin tones.

There is also a long-standing problem of underdiagnosis of heart conditions in women, which AI algorithms in medical devices could make worse, the panel suggests.

The University of Liverpool’s Professor Dame Margaret Whitehead, chairwoman of the review, said: “The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.

“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.

“Our recommendations, therefore, call for system-wide action by many stakeholders and now need to be implemented as a matter of priority with full Government support.”

Among its recommendations, the report suggests there should be renewed efforts to increase skin tone diversity in medical imaging databanks used for developing and testing optical devices for dermatology, including in clinical trials, and to improve the tools for measuring skin tone incorporated into optical devices.

Enitan Carrol, professor of paediatric infection at the University of Liverpool, said: “The NHS has a responsibility to maintain the highest standards of safety and effectiveness of medical devices in use for patients.

“We found no evidence of actual harm in the NHS, but only the potential for racial and ethnic bias in the performance of some medical devices commonly used in the NHS.”

Panel member Professor Chris Holmes warned that the Government needs to understand how AI, including programmes such as ChatGPT, will disrupt clinical and public health practices.

He said: “We are calling on the Government to appoint an expert panel including clinical, technology and healthcare leaders, patient and public representatives and industry to assess the potential unintended consequences arising from the AI revolution in healthcare.

“Now is the time to seize the opportunity to incorporate action on equity in medical devices into the overarching global strategies on AI safety.”

The review was set up in 2022 by then-secretary of state for health and social care Sir Sajid Javid.

He said: “The colour of someone’s skin or where they are from should not impact health outcomes, yet the pandemic highlighted how too many of these inequalities remain.

“I hope this review and its important recommendations will help deliver much-needed change.”

In response to the report, health minister Andrew Stephenson said: “I am hugely grateful to Professor Dame Margaret Whitehead for carrying out this important review.

“Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”

The Department of Health and Social Care said significant action is already being taken to overcome potential disparities in the performance of medical devices.

This includes the Medicines and Healthcare products Regulatory Agency (MHRA) now requesting that approval applications for new medical devices describe how they will address bias.

NHS guidance has been updated to highlight potential limitations of pulse oximeter devices on patients with darker skin tone.

The Government will also work with the MHRA to ensure regulations for medical devices are safe for patients, regardless of their background, while allowing more innovative products to be placed on the UK market.

Professor Bola Owolabi, NHS England’s director of healthcare inequalities, said: “Ensuring all patients get equitable access to high-quality healthcare remains crucial to reducing health inequalities and a priority for the NHS.

“I welcome the report’s findings and the NHS will work alongside Government and the MHRA to implement them and ensure NHS staff have the resources and training they need to tackle racial bias.”