Hospital Algorithms Are Biased Against Black Patients, New Research Shows
Health care software prioritizes white patients, even when they’re not as sick
Algorithms meant to help patients in need of extra medical care are more likely to recommend relatively healthy white patients over sicker black patients, according to new research set to be published in the journal Science.
While the researchers studied one specific algorithm in use at Brigham and Women’s Hospital in Boston, they say their audit found that similar algorithms sold to hospitals function the same way. It’s a problem that affects up to 200 million patients sorted through such systems, their paper claims.
Sendhil Mullainathan, co-author of the paper and professor at the University of Chicago, says the research is intended to empower “customers” — hospitals in this case — to vet the mechanisms behind the software they’re buying.
“We’re going through this phase where customers buying crucial products aren’t being informed in what they are,” Mullainathan says. “It’s like when I buy a car—I don’t literally know what’s happening under the hood.”
Here’s how the algorithm works: When a patient is enrolled in a hospital’s electronic health record system, the risk…