Well my dear, there is a lot more to worry about than the self-checkout lane… AI, or Artificial Intelligence, is coming to the workplace and it is making inroads in the HR department.
Ten HR Trends In The Age Of Artificial Intelligence
The question is, will it be for the better?
Forbes
By Jeanne Meister
January 8, 2019
The future of HR is both digital and human as HR leaders focus on optimizing the combination of human and automated work. This is driving a new priority for HR: one which requires leaders and teams to develop a fluency in artificial intelligence while they re-imagine HR to be more personal, human and intuitive.
As we enter 2019, it's the combination of AI and human intelligence that will transform work and workers as we know it.
1. AI Plus Human Intelligence Enhances the Candidate Experience
The question begs to be asked… will the AI have built-in prejudices?
For many companies the first pilots of artificial intelligence are in talent acquisition, as this is the area where companies see significant, measurable, and immediate results in reducing time to hire, increasing productivity for recruiters, and delivering an enhanced candidate experience that is seamless, simple, and intuitive.
The National Law Review said this about AI biases,
The Growing Use of AI
Okay, so what does this have to do with trans people?
AI is often used in the workplace to assist employers with recruitment through the use of algorithms to make hiring decisions. Notably, although common sense would suggest that AI would help eliminate unconscious (or conscious) bias in the hiring process, it has quickly become apparent that the risk of bias persists.
The Potential for Implicit Bias and Disparate Treatment
[…]
When companies train computer programs to filter out the best candidates for interviews, the learning is often based on prior resumes or attributes of previously hired successful candidates. Given the disparity between genders or even races in certain professions, using past data will only perpetuate the problem, as algorithms are taught to favor specific characteristics or experience.
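To make that concrete, here is a little toy sketch in Python (my own made-up example, nothing from the National Law Review article) of how a model trained on skewed past hiring decisions learns to prefer one gender:

# A toy example (all data invented) of how training on biased
# historical hires teaches a model the same bias.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Past hiring outcomes that happened to favor men.
past = pd.DataFrame({
    "years_experience": [5, 3, 6, 4, 5, 3, 6, 4],
    "is_male":          [1, 1, 1, 1, 0, 0, 0, 0],
    "hired":            [1, 1, 1, 1, 0, 0, 1, 0],
})

model = LogisticRegression().fit(
    past[["years_experience", "is_male"]], past["hired"]
)

# Two candidates with identical experience, differing only by gender,
# get different scores -- the model learned the historical skew.
candidates = pd.DataFrame({"years_experience": [5, 5], "is_male": [1, 0]})
print(model.predict_proba(candidates)[:, 1])

And simply deleting the gender column doesn't fix it, because the model can pick up proxies for gender (or for being trans): name changes, gaps in work history, which schools or organizations show up on a resume.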
On the website Dazed, a trans woman talks about how trans people are affected by AI,
As a transgender woman who researches machine learning and artificial intelligence, I want people to understand how algorithms and automation can impose a model of the world that negatively affects people like me. When we deploy algorithms in large systems that affect millions of people, their harmful effects cascade and multiply. Regardless of the creator’s good intentions, they can function to benefit the powerful, privileged and well-off, while causing devastating impacts on the most vulnerable among us – immigrants, people of colour, trans and gender-nonconforming folks, and other marginalised groups.
The article goes on to describe how some researchers used YouTube videos to identify trans people.
One worrying example is the field of ‘gender recognition’, a sub-category of face recognition which seeks to create algorithms that determine a person’s gender from photographs of their physical bodies. Of course, gender isn’t something that can be determined from a person’s physical appearance alone. And transgender people like me frequently face discrimination, harassment and violence when our appearance doesn’t conform with mainstream, cisgender expectations of ‘male’ or ‘female’.
The harm to trans people, however, is very real. Not only did the researchers create their dataset from videos without the consent of their trans creators, they did it to train predictive algorithms that effectively ‘out’ trans people using archived photos. If the researchers had consulted any trans people prior to beginning the project, they would know that many of us transition because we don’t want to be linked to our past name or appearance. We want to live our truth in the present and define our own future – not be algorithmically chained to false identities we were forced to wear in the past.
Now think about a company that uses AI in hiring: if the AI can spot trans people, will it have a built-in bias against us?
These faulty experiments ignore social and economic realities, like the fact that poor people, immigrants and people of colour are policed and incarcerated at disproportionately high rates. Most frighteningly, they effectively revive the practice of physiognomy, a long-debunked and infamously racist pseudoscience that used subtle differences in human faces and bone structure to justify discrimination.
The ACLU had this to say about using AI in hiring,
Algorithms that disproportionately weed out job candidates of a particular gender, race, or religion are illegal under Title VII, the federal law prohibiting discrimination in employment. And that’s true regardless of whether employers or toolmakers intended to discriminate — “disparate impact discrimination” is enough to make such practices illegal.
Suppose you go for a job interview: an AI reviews your resume, another AI analyzes the video of your interview with HR, and the system decides you are not the person listed on your resume because it misgenders you. You do not get the job.
But it can be difficult to sue over disparate impact, particularly in “failure-to-hire” cases. Such lawsuits are very rare because it’s so hard for someone who never got an interview to identify the policy or practice that led to her rejection.
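One thing that helps in understanding “disparate impact” is the EEOC’s “four-fifths rule”: if the selection rate for one group is less than 80% of the rate for the most-selected group, that is taken as evidence of adverse impact. Here is a little sketch of the arithmetic (the numbers are made up for the illustration):

# The EEOC "four-fifths" (80%) rule of thumb for adverse impact.
# All of the numbers below are invented for the illustration.

def selection_rate(hired, applicants):
    """Fraction of applicants who got the job."""
    return hired / applicants

rate_men   = selection_rate(hired=50, applicants=100)   # 0.50
rate_women = selection_rate(hired=20, applicants=100)   # 0.20

impact_ratio = rate_women / rate_men                     # 0.40

# A ratio below 0.80 is treated as evidence of adverse impact.
if impact_ratio < 0.80:
    print(f"impact ratio {impact_ratio:.2f}: possible disparate impact")
else:
    print(f"impact ratio {impact_ratio:.2f}: within the guideline")

But of course you can only do that arithmetic if you have the employer’s selection numbers, and a rejected applicant almost never does, which is exactly the problem the ACLU describes.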
How do you prove that it was AI bias?
How do you prove it in court?
Welcome to the “Brave New World” where you are discriminated against by a computer.
This afternoon I am on a panel at the University of Connecticut’s School of Law discussing trans health; the other person on the panel is a lawyer from GLAD (with one “A”).
I got a chuckle out of the bio they wrote up for me; I know exactly where they got it from.