
Healthcare algorithm used across America has dramatic racial biases


An algorithm used to manage the healthcare of millions of Americans shows dramatic biases against black patients, a new study has found.

Hospitals around the United States use the system, sold by Optum, a UnitedHealth Group-owned health services company, to determine which patients have the most intensive healthcare needs over time. But the algorithm, which is applied to more than 200 million people each year, significantly underestimates the amount of care black patients need compared with white patients, according to research published on Friday in the journal Science.

Although the algorithm did not explicitly use race as an input, it still produced racially biased results. That’s because the measure it used as a proxy for health – cost of care – had racial bias baked into it.

Less money is spent on black patients than on white patients with the same level of need, leading the algorithm to conclude that black patients were less sick, the researchers found. The study showed that black patients incurred about $1,800 less in medical costs each year than white patients with the same level of illness.
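To see how that plays out, here is a minimal sketch in Python of a risk model trained on cost rather than health. It is not Optum’s system: the model, the features and every number apart from the roughly $1,800 spending gap are illustrative assumptions.

```python
# Minimal sketch of how a cost-trained risk score absorbs a spending
# disparity. This is NOT Optum's model; everything is simulated, and all
# figures except the ~$1,800 gap reported by the study are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

illness = rng.gamma(2.0, 2.0, size=n)   # latent health need (unobserved)
group = rng.integers(0, 2, size=n)      # 0 = white, 1 = black, equally sick

def yearly_cost(illness, group):
    # Assumption: about $1,800 a year less is spent on black patients at
    # the same level of illness (the disparity the researchers measured).
    return 3_000 * illness - 1_800 * group + rng.normal(0, 800, illness.shape)

past_cost = yearly_cost(illness, group)
future_cost = yearly_cost(illness, group)

# "Risk" here is predicted future cost, fit by least squares on past cost.
slope, intercept = np.polyfit(past_cost, future_cost, deg=1)
risk_score = slope * past_cost + intercept

# Flag the top 10% of risk scores for the extra-care programme.
flagged = risk_score >= np.quantile(risk_score, 0.90)
for g, name in ((0, "white"), (1, "black")):
    in_g = group == g
    print(f"{name}: {flagged[in_g].mean():.1%} flagged; "
          f"mean illness when flagged: {illness[flagged & in_g].mean():.2f}")
```

Because the two groups are constructed to be equally sick, any gap the score produces comes purely from the spending disparity: the cost-trained model flags fewer black patients, and the black patients it does flag are sicker on average than the white patients it flags.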

Correcting this bias in the algorithm would more than double the number of black patients flagged for additional care, the study showed. In fact, when the company replicated the analysis on a different dataset, it found black patients collectively had 48,772 more active chronic conditions than white patients who had been ranked at the same level of risk.

Biases like these are inadvertently built into the technology we use at many different stages, said Ruha Benjamin, author of Race After Technology and associate professor of African American studies at Princeton University.

“Pre-existing social processes shape data collection, algorithm design and even the formulation of problems that need addressing by technology,” she said.

Such algorithms are used to determine which patients could benefit from additional care, such as monitoring of their overall health and management of their prescriptions and doctor visits. The researchers are working with Optum on a fix.

When researchers tweaked the algorithm to make predictions about patients’ future health conditions rather than which patients would incur the highest costs, it reduced biases by 84%. “These results suggest that label biases are fixable,” the study said.
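The effect of that label swap can be seen in an idealised sketch (again illustrative, not the study’s code), which assumes the model predicts its training label perfectly, so ranking patients by predicted risk amounts to ranking them by the label itself:

```python
# Sketch of the label swap. Hypothetical simulation, not the study's code:
# assume the model predicts its label perfectly, so ranking by predicted
# risk is ranking by the label itself.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

illness = rng.gamma(2.0, 2.0, size=n)   # latent health need (equal by design)
group = rng.integers(0, 2, size=n)      # 0 = white, 1 = black

# Cost embeds the ~$1,800 spending disparity; condition counts do not.
cost = 3_000 * illness - 1_800 * group + rng.normal(0, 800, n)
conditions = rng.poisson(5 * illness)   # active chronic conditions

for name, label in (("cost label", cost), ("condition label", conditions)):
    flagged = label >= np.quantile(label, 0.90)   # top decile gets extra care
    print(f"{name}: black flagged {flagged[group == 1].mean():.1%}, "
          f"white flagged {flagged[group == 0].mean():.1%}")
```

Under the cost label the flagging rates diverge by race; under the condition label they come out roughly equal, because the spending disparity never enters the quantity the model is asked to predict.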

Meanwhile, Optum maintains its system helps “clinicians provide more effective patient care every day”.

“Predictive algorithms that power these tools should be continually reviewed and refined, and supplemented by information such as socio-economic data, to help clinicians make the best-informed care decisions for each patient,” an Optum spokesman, Tyler Mason, said. “As we advise our customers, these tools should never be viewed as a substitute for a doctor’s expertise and knowledge of their patients’ individual needs.”

Although this study examined just one healthcare algorithm, the researchers suggested similar biases probably exist across a number of industries. As algorithms are increasingly used for job recruiting, housing loans and policing, Benjamin noted that more legislation is needed to ensure algorithms account for historical biases.

“The design of different kinds of systems, whether we’re talking about legal systems or computer systems, can create and reinforce hierarchies precisely because the people who create them are not thinking about how social norms and structures shape their work,” she said. “Indifference to social reality is, perhaps, more dangerous than outright bigotry.”


