Wednesday, December 11, 2019

Being Denied Healthcare By A Computer

Last month I wrote about AI being used for hiring and how it can be biased. Well, AI is also being used in health care.
There’s no quick fix to find racial bias in health care algorithms
Legislators are demanding answers on hospital oversight of algorithms
The Verge
By Nicole Wetsman
December 4, 2019

Legislators in Washington, DC are taking a closer look at racial bias in health care algorithms after an October analysis found racial bias in a commonly used health tool. Sens. Cory Booker (D-NJ) and Ron Wyden (D-OR) released letters on Tuesday calling for federal agencies and major health companies to describe how they’re monitoring the medical algorithms they use every day for signs of racial bias.

“Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in,” they wrote to the Centers for Medicare and Medicaid Services (CMS). The senators’ focus for their letters was mostly on gathering information. They wanted to know if the CMS, which administers Medicare and Medicaid, is collecting information from health care organizations on their use of algorithms. They asked the Federal Trade Commission if it was investigating harm caused by discriminatory algorithms on consumers and asked the health companies, including Aetna and Blue Cross Blue Shield, if and how they audit the algorithms they use for bias.
So as I was reading this, what popped into my head was: what about other minorities?

How will the AI handle other minorities, like people who were not born in this country and speak English as a second language, or trans people?
Equity in health care depends on identifying and rooting out bias in algorithms. But because the programs are still relatively new, there still aren’t best practices for how to do so. “These practices bubbling up are more like best efforts,” says Nicol Turner-Lee, a fellow in the Center for Technology Innovation at the Brookings Institution.
To “train” an AI, they feed it tons of data to learn from, and if that data doesn’t contain enough examples from minorities like us, the result shows up as bias against those minorities. And I suspect there is almost zero data being entered about trans people.
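
To make that concrete, here is a little toy sketch (my own made-up example, not any real clinical system) of what happens when a model is trained on 990 records from one group and only 10 from another, and the same condition shows up differently in each group:

```python
# Toy demonstration of training-data imbalance, not a real medical model.
# Group A: the condition raises a (made-up) lab value.
# Group B: the same condition LOWERS it, but B is barely in the training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, sign):
    """Generate n toy records: the condition shifts the lab value by sign * 2."""
    y = rng.integers(0, 2, n)                      # 0 = healthy, 1 = has condition
    x = rng.normal(loc=sign * 2.0 * y, scale=0.5)  # one made-up lab value
    return x.reshape(-1, 1), y

# Training data: 990 records from group A, only 10 from group B.
xa, ya = make_group(990, +1)
xb, yb = make_group(10, -1)
model = LogisticRegression().fit(np.vstack([xa, xb]), np.concatenate([ya, yb]))

# Test on fresh, equal-sized samples from each group.
ta_x, ta_y = make_group(500, +1)
tb_x, tb_y = make_group(500, -1)
print("accuracy on group A:", model.score(ta_x, ta_y))  # near 1.0
print("accuracy on group B:", model.score(tb_x, tb_y))  # near 0.5, a coin flip
```

The model looks very accurate overall because group A dominates the data, while for group B it is no better than guessing. That is what “bias” means here: not malice, just missing data.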

Do you remember last month’s flap over Google analyzing the health care records of millions of people? Well, that was to feed the AI.
Google to Store and Analyze Millions of Health Records
The tech company’s deal with Ascension is part of a push to use artificial intelligence to aid health services.
New York Times
By Natasha Singer and Daisuke Wakabayashi
November 11, 2019

In a sign of Google’s major ambitions in the health care industry, the search giant is working with the country’s second-largest hospital system to store and analyze the data of millions of patients in an effort to improve medical services, the two organizations announced on Monday.

The partnership between Google and the medical system, Ascension, could have huge reach. Ascension operates 150 hospitals in 20 states and the District of Columbia. Under the arrangement, the data of all Ascension patients could eventually be uploaded to Google’s cloud computing platform.

It is legal for health systems to share patients’ medical information with business partners like electronic medical record companies. Even so, many patients may not trust Google, which has paid multiple fines for violating privacy laws, with their personal medical details.
[…]
Already, the two organizations are testing software that allows medical providers to search a patient’s electronic health record by specific data categories and create graphs of the information, like blood test results over time, according to internal documents obtained by The New York Times. The aim is to give medical professionals better access to patient data, to improve patient care and, ultimately, to try to glean insights from the data to help treatment.
[…]
Google’s health efforts include a push to use artificial intelligence to read electronic health records and then try to predict or more quickly identify medical conditions.
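
Just to picture what that search-and-graph feature might look like, here is a rough toy sketch of my own (not Google’s or Ascension’s actual software, and the patient data is made up):

```python
# Hypothetical example: search a patient's record by data category
# and graph the results over time, as the article describes.
import pandas as pd
import matplotlib.pyplot as plt

# Made-up, flattened health-record rows.
records = pd.DataFrame({
    "patient_id": ["p1"] * 4,
    "category":   ["blood_glucose"] * 3 + ["blood_pressure"],
    "date":       pd.to_datetime(["2019-01-05", "2019-04-12",
                                  "2019-09-30", "2019-09-30"]),
    "value":      [98.0, 110.0, 125.0, 120.0],
})

# "Search a patient's electronic health record by specific data categories"...
glucose = records[(records["patient_id"] == "p1") &
                  (records["category"] == "blood_glucose")]

# ..."and create graphs of the information, like blood test results over time."
glucose.plot(x="date", y="value", marker="o",
             title="p1: blood glucose over time")
plt.show()
```
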
Now picture this…

You are a trans man who goes to the emergency room with pain in your abdomen. All your medical records list you as male, and the medical AI spits out a whole list of possible medical problems, but I will bet that an ovarian cyst is not on that list.

This is just one example of AI bias, and probably many of our health problems are not being entered into the AI’s training data as ours. It is not that the conditions themselves are missing from the database: the AI probably has thousands of entries about ovarian cysts, just probably not one about a trans man having an ovarian cyst.
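
Here is a deliberately crude sketch of that failure mode: a made-up “differential diagnosis” lookup keyed on the sex marker in the chart. None of this is a real medical system, and the condition lists are just for illustration:

```python
# Toy illustration only: candidate diagnoses for abdominal pain,
# split by the sex marker the system saw in its training records.
DIFFERENTIALS = {
    "M": ["appendicitis", "kidney stone", "hernia"],
    "F": ["appendicitis", "kidney stone", "hernia", "ovarian cyst"],
}

def suggest(recorded_sex: str) -> list[str]:
    """Return candidate diagnoses for abdominal pain."""
    return DIFFERENTIALS[recorded_sex]

# A trans man whose chart lists him as male:
print(suggest("M"))   # "ovarian cyst" never appears, even though it
                      # may be exactly what is wrong
```
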

Back in the sixties, when I was learning computer programming, I learned a phrase that still holds true today… “Garbage in, garbage out.”

Only today it affects our healthcare.
