Racial Bias in Healthcare Algorithms
Optum risk-prediction tool perpetuates racial inequities in healthcare
Welcome to the Health Bytes Newsletter!
This week, we are looking into how widely used predictive algorithms in health systems can unintentionally harm health outcomes for minority patients. Here are the main takeaways:
Biased predictive algorithms can hurt Black patients
“Race-blind” measures overlook structural inequalities in healthcare
Need for strategies to develop equitable algorithms
Questions, feedback, or comments? Send us your thoughts here.
Biased predictive algorithms can hurt Black patients
A study of health algorithms conducted by Obermeyer et al., published in Science, uncovered evidence of unintentional racial bias that led to inaccurate assessments of Black patients' health needs. One widely used algorithm, developed by Optum, used health costs to rank which patients would benefit from extra medical care now, so that hospitals would not have to spend more money on them in the future. However, Black patients who were ranked at the same risk level as White patients turned out to be much sicker: they had 26.3% more chronic health conditions than their White counterparts. Optum's algorithm belongs to a class of tools that affect treatment plans for roughly 200 million people. It perpetuates racial disparities in healthcare by reducing spending on Black patients with complex medical needs, a group that has faced historical barriers to healthcare.
“Race-blind” measures overlook structural inequalities in healthcare
Although Optum designed this algorithm to be unbiased by using a “race-blind” metric, health costs themselves are rooted in health inequity. Black patients tend to spend less on healthcare because they have faced barriers to affordable and accessible care: Black Americans are more likely to be uninsured than White Americans and more likely to forgo care due to cost. As a result, health systems spend less money on Black patients, not because their needs are smaller but because these barriers suppress their use of health services. When algorithms are trained on health costs, they inherit this pattern, neglecting the socioeconomic factors that shape how Black Americans access care and producing inaccurate assessments of their health needs. By failing to provide care for Black patients who have greater need, health systems will ultimately spend more on end-of-life care for this group, undercutting the very cost savings these predictive algorithms are supposed to deliver.
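The mechanism above is sometimes called label-choice bias: the model faithfully predicts the label it is given (cost), but the label is a distorted proxy for the quantity that matters (health need). The toy simulation below, with entirely synthetic data and made-up parameters (it is not the study's data or Optum's model), sketches how suppressing one group's observed spending shifts who lands in the high-risk tier:

```python
import random

random.seed(0)

# Synthetic patients: each has a true chronic-condition burden ("need").
# Observed cost tracks need, but is suppressed for group B to model
# access barriers (uninsurance, forgone care). All numbers are illustrative.
patients = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    need = random.gauss(5, 2)                    # true health need
    access = 1.0 if group == "A" else 0.6        # group B spends less per unit of need
    cost = need * access + random.gauss(0, 0.5)  # observed healthcare spending
    patients.append((group, need, cost))

def share_b_in_top(key_index):
    """Fraction of group-B patients in the top 20% when ranked by one column."""
    top = sorted(patients, key=lambda p: p[key_index], reverse=True)[:200]
    return sum(1 for p in top if p[0] == "B") / 200

# Ranking by the cost proxy under-selects group B relative to ranking by need,
# even though both groups were drawn from the same need distribution.
print(f"Ranked by cost (biased proxy): {share_b_in_top(2):.0%} group B")
print(f"Ranked by need (health label): {share_b_in_top(1):.0%} group B")
```

The point of the sketch is that nothing in the model "sees" race; the disparity enters entirely through the training label, which is why swapping the label, as discussed below, can remove most of the bias.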
Need for strategies to develop equitable algorithms
As predictive algorithms become a staple for hospitals in determining treatment plans, health systems and providers need to be wary of how these algorithms are trained. Bias of this kind can be substantially reduced by changing what the algorithm is trained to predict. For the Optum algorithm, replacing health costs with a label that combines health prediction with cost prediction reduced bias by 84%, which would raise the share of Black patients identified for additional care from 17.5% to 46.5%. Optum states that these algorithms are not a replacement for physician judgment, but such tools should still be audited for how they evaluate patients. Although using race as a factor in clinical algorithms has its own drawbacks, companies should use multidimensional labels, such as combining health prediction with health costs, as a strategy for creating equitable predictive tools.