
From oximeters to AI, where bias in medical devices may lurk


The UK health secretary, Sajid Javid, has announced a review into systemic racism and gender bias in medical devices, in response to concerns that such bias could contribute to poorer outcomes for women and people of colour.

Writing in the Sunday Times, Javid said: “It is easy to look at a machine and assume that everyone’s getting the same experience. But technologies are created and developed by people, and so bias, however inadvertent, can be an issue here too.”

We take a look at some of the gadgets used in healthcare where concerns over racial bias have been raised.

Oximeters

Oximeters estimate the amount of oxygen in a person’s blood, and are a crucial tool in determining which Covid patients may need hospital care – not least because some patients can have dangerously low oxygen levels without realising it.

Concerns have been raised, however, that the devices work less well for patients with darker skin. NHS England and the Medicines and Healthcare products Regulatory Agency (MHRA) say pulse oximeters can overestimate the amount of oxygen in the blood of such patients.
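
As a rough sketch of where such a bias can enter (not the workings of any particular product), a pulse oximeter compares how much red and infrared light the blood absorbs and converts that ratio into a saturation figure using a calibration curve fitted to volunteer data. In Python, with purely illustrative coefficients:

```python
def ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    # Pulse oximeters compare the pulsatile (AC) and steady (DC) components
    # of red and infrared light passing through the finger.
    return (red_ac / red_dc) / (ir_ac / ir_dc)

def estimate_spo2(r, a=110.0, b=25.0):
    # The ratio is mapped to a saturation estimate via a calibration curve
    # fitted to volunteer data; a and b here are illustrative placeholders,
    # not the coefficients of any real device. If the calibration population
    # skews light-skinned, the curve can read high for darker skin.
    return a - b * r

r = ratio_of_ratios(red_ac=0.02, red_dc=1.0, ir_ac=0.04, ir_dc=1.0)
print(f"R = {r:.2f}, estimated oxygen saturation = {estimate_spo2(r):.0f}%")
```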

Javid told the Guardian last month that the devices were designed for Caucasians. “As a result, you were less likely to end up on oxygen if you were black or brown, because the reading was just wrong,” he said.

Experts believe the inaccuracies could be one of the reasons why death rates have been higher among minority ethnic people, although other factors may also play a role, such as working in jobs that have greater exposure to others.

Respirator masks

Medical-grade respirators are crucial to help keep healthcare workers safe from Covid because they offer protection to the wearer against both large and small particles that others exhale.

In order to offer the greatest protection, however, filtering facepiece (FFP) masks must fit properly, and research has shown they do not fit as well on people from some ethnic backgrounds.

“Adequate viral protection can only be provided by respirators that properly fit the wearer’s facial characteristics. Initial fit pass rates [the rate at which they pass a test on how well they fit] vary between 40% and 90% and are especially low in female and in Asian healthcare workers,” one review published in 2020 notes.

Another review, published in September, found that studies on the fit of such PPE largely focused on Caucasian or single ethnic populations. “BAME people remain under-represented, limiting comparisons between ethnic groups,” it said.

Spirometers

Spirometers measure lung capacity, but experts have raised concerns that there are racial biases in the interpretation of data gathered from such gadgets.

A woman blows into a spirometer. Photograph: Justin Tallis/AFP/Getty Images

Writing in the journal Science, Dr Achuta Kadambi, an electrical engineer and computer scientist at the University of California, Los Angeles, said Black or Asian people are assumed to have lower lung capacity than white people – a belief he noted may be based on inaccuracies in earlier studies. As a result, “correction” factors are applied to the interpretation of spirometer data – a situation that can affect the order in which patients are treated.

“For example, before ‘correction’ a Black person’s lung capacity might be measured to be lower than the lung capacity of a white person,” Kadambi writes.

“After ‘correction’ to a smaller baseline lung capacity, treatment plans would prioritise the white person, because it is expected that a Black person should have lower lung capacity, and so their capacity must be much lower than that of a white person before their reduction is considered a priority.”
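
A toy example in Python, using made-up numbers and a hypothetical scaling factor rather than any clinical standard, shows how such a ‘correction’ can flip the order of priority:

```python
# Illustrative only: the measured values and the 0.85 scaling factor are
# hypothetical, chosen to show how a race-based "correction" can reorder care.
predicted_capacity_litres = 5.0      # reference value for a white patient
race_correction_factor = 0.85        # hypothetical scaling applied to Black patients

measured = {"white patient": 4.0, "Black patient": 3.8}
expected = {"white patient": predicted_capacity_litres,
            "Black patient": predicted_capacity_litres * race_correction_factor}

# Before "correction": the Black patient's raw reading is lower, so they would
# appear to need attention first.
print(sorted(measured, key=measured.get))

# After "correction": each reading is judged against its own (lowered) baseline,
# so the Black patient's deficit looks smaller and the white patient comes first.
deficit = {person: expected[person] - measured[person] for person in measured}
print(sorted(deficit, key=deficit.get, reverse=True))
```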

Another area Kadambi said may be affected by racial bias is remote plethysmography, a technology in which pulse rates are measured by looking at changes in skin colour captured by video. Kadambi said such visual cues may be biased by subsurface melanin content – in other words, skin colour.
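
The underlying approach can be sketched in a few lines, though this is a bare-bones illustration rather than the method of any particular system: the pulse signal is recovered from frame-to-frame changes in skin colour, typically via the green channel of the video.

```python
import numpy as np

def pulse_rate_from_video(frames, fps):
    # frames: array of shape (n_frames, height, width, 3), RGB, cropped to a
    # face region. Average the green channel per frame, then find the dominant
    # frequency in the heart-rate band. The signal originates in tiny colour
    # changes of the skin, which is where melanin content can bias the result.
    signal = frames[..., 1].mean(axis=(1, 2))   # mean green intensity per frame
    signal = signal - signal.mean()             # remove the steady component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs > 0.7) & (freqs < 4.0)        # roughly 42–240 beats per minute
    return 60.0 * freqs[band][np.argmax(power[band])]
```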

Artificial intelligence systems

AI is increasingly being developed for applications in healthcare, including to aid professionals in diagnosing conditions. There are concerns, however, that biases in the data used to develop such systems mean they risk being less accurate for people of colour.

Such concerns were recently raised in relation to AI systems for diagnosing skin cancers. Researchers revealed that few freely available image databases that could be used to develop such AI are labelled with ethnicity or skin type. Of those that do record such information, only a handful of images are of people recorded as having dark brown or black skin.
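
Auditing a database for representation is conceptually simple; the sketch below is hypothetical (the file and column names are assumptions, not the schema of any real archive) and simply counts how many images carry a skin-type label and how those labels are distributed.

```python
import pandas as pd

# Hypothetical metadata file for a skin-lesion image archive; "image_id" and
# "fitzpatrick_skin_type" are assumed column names used for illustration only.
metadata = pd.read_csv("lesion_metadata.csv")

# How many images carry a skin-type label at all?
labelled = metadata["fitzpatrick_skin_type"].notna()
print(f"{labelled.mean():.0%} of images have a skin-type label")

# Of those, how are the labels distributed (Fitzpatrick V-VI are the darkest)?
print(metadata.loc[labelled, "fitzpatrick_skin_type"].value_counts())
```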

It is an issue Javid has acknowledged. Announcing new funding last month for AI projects to tackle racial inequalities in healthcare, such as the detection of diabetic retinopathy, he noted that one area of focus would be the development of standards to make sure datasets used in developing AI systems are “diverse and inclusive”.

“If we only train our AI using mostly data from white patients it cannot help our population as a whole. We need to make sure the data we collect is representative of our nation,” he said.


