Thursday, October 24, 2019

It’s A Brave New World.

I was reading an employment blog post about Artificial Intelligence, or AI, being used for job interviews, and I wrote a comment on their Facebook page…
As a trans woman and a person with a speech impediment, I sure hope that they test out their AI with us. From what I read, AI cannot recognize trans people — its batting average in identifying trans people is zero. I would hate not to be able to get a job because some computer program misgendered me.

Or that I didn’t get a job because of my speech impediment, or because I have an accent.
Before I wrote that, I did some research on AI and trans people.
Facial Recognition Software Regularly Misgenders Trans People
Human computer interfaces are almost never built with transgender people in mind, and continue to reinforce existing biases.

Vice
By Matthew Gault
February 19, 2019

Facial recognition software is a billion dollar industry, with Microsoft, Apple, Amazon, and Facebook developing systems, some of which have been sold to governments and private companies. Those systems are a nightmare for various reasons—some systems have, for example, been shown to misidentify black people in criminal databases while others have been unable to see black faces at all.

The problems can be severe for transgender and nonbinary people because most facial recognition software is programmed to sort people into two groups—male or female. Because these systems aren’t designed with transgender and gender nonconforming people in mind, something as common as catching a flight can become a complicated nightmare. It’s a problem that will only get worse as the TSA moves to a full biometric system at all airports and facial recognition technology spreads.

These biases programmed into facial recognition software mean that transgender and gender nonconforming people may not be able to use facial recognition advancements that are at least nominally intended to make people’s lives easier, and, perhaps more importantly, may be unfairly targeted, discriminated against, misgendered, or otherwise misidentified by the creeping surveillance state's facial recognition software.
My guess is that the people working to develop AI are mainly white and mainly cisgender males. The Next Web writes,
Unfortunately, facial recognition proponents often don’t see this as a problem. Scientists from the University of Colorado Boulder recently conducted a study to demonstrate how poorly AI performs when attempting to recognize the faces of transgender and non-binary people. This is a problem that’s been framed as horrific by people who believe AI should work for everyone, and “not a problem” by those who think only in unnatural, binary terms.

It’s easy for a bigot to dismiss the tribulations of those whose identity falls outside of their world-view, but these people are missing the point entirely. We’re teaching AI to ignore basic human physiology.
We are teaching AI to be bigots.

And it is not just us…
Researcher Morgan Klaus Scheuerman, who worked on the Boulder study, appears to be a cis-male. But because he has long hair, IBM’s facial recognition software labels him “female.”

And then there are beards. About 1 in 14 women have a condition called hirsutism that causes them to grow “excess” facial hair. Almost every human, male or female, grows some facial hair. However, at a rate of about 100 percent, AI concludes that facial hair is a male trait. Not because it is, but because it’s socially unacceptable for a woman to have facial hair.
Meanwhile VOX writes,
Human bias can seep into AI systems. Amazon abandoned a recruiting algorithm after it was shown to favor men’s resumes over women’s; researchers concluded an algorithm used in courtroom sentencing was more lenient to white people than to black people; a study found that mortgage algorithms discriminate against Latino and African American borrowers.
[…]
Facial recognition tech has also caused problems for transgender people. For example, some trans Uber drivers have had their accounts suspended because the company uses a facial recognition system as a built-in security feature, and the system is bad at identifying the faces of people who are transitioning. Getting kicked off the app cost the trans drivers fares and effectively cost them a job.
It is hard enough now for trans people to get jobs; can you imagine trying to get a job when the AI is a bigot?

Will we end up like this ad…


