DHSC outlines plans to tackle ethnic and other biases in medical devices

  • 14 March 2024
Earlier this week, the government announced its plan to tackle ethnic and other biases in medical devices, in a bid to position the UK as a world leader on the topic. The plan follows a UK-first independent review into equity in medical devices, which identified both the extent and the impact of biases in the design and use of medical devices.

In particular, the review, led by Professor Dame Margaret Whitehead, professor of public health at the University of Liverpool, explored issues around pulse oximeters. This followed concerns that the devices were not as accurate for patients with darker skin tones.

The report found that ethnic minorities, women and those from disadvantaged communities were most at risk of bias from medical devices, and therefore of poorer health outcomes.

As a result of the publication of the report, the government will now address issues from the design stage of medical devices and provide extra funding for applications for new devices that operate without bias.

Minister of State Andrew Stephenson said: “Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”

The government has committed to ensuring that pulse oximetry devices used within the NHS can be used accurately across a range of skin tones, as well as to removing racial bias from data sets used in clinical studies.

In addition, the Medicines and Healthcare products Regulatory Agency is now requesting that approval applications for new medical devices describe how they will address issues of bias. Ongoing work with NHS England will also support the upskilling of clinical professionals on issues including health equality.

Professor Dame Whitehead said: “The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.

“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.

“Our recommendations therefore call for system-wide action, requiring full government support. The UK would take the lead internationally if it incorporated equity in AI-enabled medical devices into its global AI safety initiatives.”

The announcement forms part of the government’s ongoing work to tackle disparities in the healthcare system. In recent years it has established the Office for Health Improvement and Disparities, dedicated to reducing health disparities; commissioned Core20Plus5, a national NHS England approach to inform action to reduce healthcare inequalities; and invested £50m in health inequalities research for local authorities in 2022.
