
AI and Medical Algorithms Perpetuate Racial Health Disparities

In recent years, the intersection of technology and healthcare has exposed alarming racial disparities in medical treatment and outcomes. A stark example is the pulse oximeter, a device crucial during the COVID-19 pandemic, which has been found to overestimate blood oxygen levels in Black patients, potentially leading to delayed or denied care.
 
Dr. Noha Aboelata, founding CEO of Roots Community Health Center in East Oakland, California, was shocked to learn about this inaccuracy. “I just saw red,” she said, recalling her reaction to the New England Journal of Medicine article revealing the issue. The discovery prompted her clinic to file a lawsuit against major manufacturers and sellers of pulse oximeters.
 
The problem extends beyond pulse oximeters. Clinical algorithms used to assess kidney function and determine transplant eligibility have been found to overestimate Black patients’ kidney function, potentially lowering their priority on organ transplant waiting lists.
 
Experts argue that these issues stem from biased data used to develop medical technologies. Fay Cobb Payton, a professor at North Carolina State University, states, “The algorithms are only as good as the data that’s input.” Because white Americans are overrepresented in clinical trials, devices and treatments end up validated largely on white patients, while Black patients are left out of the underlying data.
 
While the FDA has acknowledged the need for stronger oversight, progress has been slow. A group of 25 state attorneys general recently urged the agency to take action on pulse oximeters, emphasizing the urgency of preventing “additional severe illness and mortalities.”
 
As lawsuits and public pressure mount, the medical community grapples with a crucial question: who is responsible when racism is embedded in healthcare technology, exacerbating existing racial disparities in America’s health system?

 