United States Senate
WASHINGTON, DC 20510

December 3, 2019

The Honorable Joseph Simons
Chairman
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, DC 20580

Dear Chairman Simons:

We write today to request information regarding any actions the Federal Trade Commission (FTC) is taking or plans to take to address bias in algorithms used throughout the health care system.

Algorithms are increasingly embedded into every aspect of modern society, including the health care system. Organizations use these automated decision systems, driven by technologies ranging from advanced analytics to artificial intelligence, to organize and optimize the complex choices they need to make on a daily basis. In using algorithms, organizations often attempt to remove human flaws and biases from the process. Unfortunately, both the people who design these complex systems and the massive sets of data that are used have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases.

In health care, there is great promise in using algorithms to sort patients and target care to those most in need. However, these systems are not immune to the problem of bias. A study recently published in the journal Science found racial bias in one algorithm widely used in health systems throughout the country.¹ This particular case of algorithmic bias used health care costs as a proxy for health care needs. The creators of the algorithm did not take into account that health care needs are not the only contributing factor to an individual's level of health care costs. Other factors may include barriers to accessing care and low levels of trust in the health care system. These factors disproportionately impact black patients.
As a result, black patients were less likely to be referred for additional services than were white patients due to their lower historical costs, even though black patients were typically sicker than their white counterparts at a given risk score developed by the algorithm. According to the authors of this study, more than 45 percent of black patients captured under this algorithm would be flagged for additional help after this bias is addressed, an increase from nearly 18 percent under the algorithm as it was originally designed.

The findings of this study are deeply troubling, particularly taken in the context of other biases, disparities, and inequities that plague our health care system. For instance, a 2016 study found that most medical students and residents held the false belief that black patients tolerate more pain than white patients, which was related to less accurate treatment recommendations for black patients compared to white patients.² It is well documented that certain illnesses have a significantly higher incidence among marginalized populations, like hypertension, which disproportionately affects black Americans. Black and American Indian/Alaska Native women are significantly more likely to die from complications related to or associated with pregnancy than white women, even when considering education level.³ Technology holds great promise in addressing these issues and improving population health. However, if it is not applied thoughtfully and with acknowledgement of the risk for biases, it can also exacerbate them.

We are pleased to learn that the organization featured in the Science study appears to be taking steps to eliminate racial bias from their product.

¹ Obermeyer, Ziad, et al., "Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations," Science, 25 October 2019, https://science.sciencemag.org/content/366/6464/447
While a fix to this particular algorithm is a step in the right direction, this is just one of the many algorithms currently used in the health care industry. Congress and the Administration must make a concerted effort to find out the degree to which this issue is widespread, and move quickly to make lasting changes. To that end, we request answers to the following questions no later than December 31, 2019:

1. On November 13 and 14 of 2018, the FTC held a hearing regarding "the consumer welfare implications associated with the use of algorithmic decision tools, artificial intelligence, and predictive analytics" as part of the FTC's "Hearings on Competition and Consumer Protection in the 21st Century."⁴ Did any internal policy changes regarding potential algorithmic bias emerge from those proceedings?

2. If no policy changes regarding algorithmic bias within products or businesses overseen by the FTC resulted from the 2018 hearings, are there any currently under serious consideration? If not, why?

3. How well do the FTC's current enforcement tools, including Section 5 of the FTC Act,⁵ address potential bias against race, gender, or other protected designations within emerging algorithmic decision-making or targeting tools?

4. Does the FTC have any ongoing studies or investigations into damages done to consumer welfare by discriminatory and biased algorithms?

5. Would the FTC commit to undertaking an investigation into the ways that targeting and decision-making algorithms currently in use unfairly discriminate against members of protected classes?

As algorithms play an increasingly prevalent role in the health care system, we urge the FTC to consider the risk for algorithmic bias and its potential impact on health disparities and outcomes. Thank you for your attention to this matter.
Should you have any questions, please contact Rashan Colbert with Senator Booker's office and Kristen Lunde with Senator Wyden's staff at (202) 224-3224 and (202) 224-4515, respectively.

Sincerely,

Cory A. Booker          Ron Wyden
United States Senator   United States Senator

² Hoffman, Kelly M., et al., "Racial bias in pain assessment and treatment recommendations, and false beliefs about biological differences between blacks and whites," Proceedings of the National Academy of Sciences of the United States of America, 19 April 2016, https://www.pnas.org/content/113/16/4296.long

³ Petersen, Emily E., et al., "Racial/Ethnic Disparities in Pregnancy-Related Deaths — United States, 2007–2016," Morbidity and Mortality Weekly Report, Centers for Disease Control and Prevention, 6 September 2019, https://www.cdc.gov/mmwr/volumes/68/wr/mm6835a3.htm

⁴ "FTC Hearing #7: The Competition and Consumer Protection Issues of Algorithms, Artificial Intelligence, and Predictive Analytics," Federal Trade Commission, https://www.ftc.gov/news-events/events-calendar/ftc-hearing-7-competition-consumer-protection-21st-century

⁵ 15 U.S.C. § 45, Unfair Methods of Competition Unlawful; Prevention by Commission, https://www.law.cornell.edu/uscode/text/15/45