Would companies be more diverse if A.I. did the hiring? | Joanna Bryson

2018-06-06

The best hiring manager might just be the computer sitting on your desk. AI and ethics expert Joanna Bryson posits that artificial intelligence can go through all the resumes in a stack and find what employers are missing. Most humans, on the other hand, rely on biases, whether they are aware of them or not, to get through the selection process. This is sadly why those with European-sounding names get more calls for interviews than others. AI, she says, can change that. Joanna is brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today's companies. When woven into every aspect of the talent life cycle, a commitment to diversity and inclusion best equips companies to innovate, improve brand image, and drive performance.

Read more at BigThink.com: http://bigthink.com/videos/joanna-bryson-would-companies-be-more-diverse-if-ai-did-the-hiring

Follow Big Think here:
YouTube: http://goo.gl/CPTsV5
Facebook: https://www.facebook.com/BigThinkdotcom
Twitter: https://twitter.com/bigthink

Transcript: Can AI remove implicit bias from the hiring process? “Remove”, entirely remove? No.

But as I understand it, multiple people have told me that it's already reducing the impact of implicit bias, so they're already happy with what they're seeing.

So what is implicit bias, first of all? It's important to understand that implicit bias and explicit bias are two different things.

Implicit bias is stuff that you're not conscious of; you're not aware of it, and it's hard, probably impossible, for you to control right now, on demand. You might be able to alter it over time by exposing yourself to different situations and changing what we in machine learning call priors, that is, changing your experiences.

So maybe if you see more women in senior positions you'll become less implicitly sexist, or something like that.
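The "priors" analogy above can be made concrete with a toy Bayesian sketch. This is purely illustrative and not from the talk: a Beta prior over how likely a senior leader is to be a woman shifts as new observations (experiences) accumulate, just as exposure to different situations shifts implicit expectations.

```python
# Illustrative sketch (an assumption for this text, not Bryson's model):
# updating a Beta(a, b) prior with new binary observations shows how
# "changing your experiences" changes the prior expectation.

def update_beta(a, b, observations):
    """Update a Beta(a, b) prior with a list of 1/0 observations."""
    for seen_woman in observations:
        if seen_woman:
            a += 1
        else:
            b += 1
    return a, b

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# A prior shaped by skewed past experience: 2 women per 8 men observed.
a, b = 2, 8
print(beta_mean(a, b))  # 0.2

# Ten new experiences, six of them women in senior roles, shift the prior.
a, b = update_beta(a, b, [1, 1, 0, 1, 1, 0, 1, 0, 1, 0])
print(beta_mean(a, b))  # 0.4
```

The point of the sketch is only that expectations track accumulated data: more counter-stereotypical observations move the estimate, which is the mechanism the transcript gestures at.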

But anyway, explicit bias is like “I’m going to choose to only work with women” or “I’m going to choose only to work with men” and I know that and I'm conscious about it.

So HR departments are reasonably good at finding people who, hopefully honestly, are saying, "Yeah, I'm not going to be racist or sexist or whatever-ist; I'm not going to worry about how long somebody's name is or what the country of origin of their ancestors is." So hopefully HR people can spot the people who sincerely are neutral, at least at the explicit level.

But at the implicit level, there's a lot of evidence that something else might be going on. Again, we don't know for sure whether it's implicit or explicit, but what we do know is that in the paper we published in 2017, one of my co-authors, Aylin Caliskan, had this brilliant idea of looking at the resume data. There's a famous study showing that if you send out identical resumes and the only thing you change is the names, African-American names versus European-American names, the people with European-American names get 50 percent more calls to interview, with nothing else changed.

And so now people are talking about "whitening" their CVs just so they get that chance to interview. Anyway, by the vector-space measures we used, it looks as if the same implicit bias we found in the language data also explains those choices on the resumes.
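The vector-space measure mentioned here can be sketched in miniature. The 2017 paper measured bias as a differential association between word vectors; the real work uses embeddings trained on large corpora, so the tiny hand-made vectors below are purely illustrative stand-ins, not the paper's data.

```python
# A minimal sketch of a differential-association bias measure in a vector
# space, in the spirit of Caliskan, Bryson, and Narayanan (2017). All
# vectors here are toy values assumed for illustration.
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def association(word_vec, pleasant_vecs, unpleasant_vecs):
    """Mean cosine to pleasant attributes minus mean cosine to unpleasant ones."""
    mp = sum(cosine(word_vec, p) for p in pleasant_vecs) / len(pleasant_vecs)
    mu = sum(cosine(word_vec, u) for u in unpleasant_vecs) / len(unpleasant_vecs)
    return mp - mu

# Toy attribute vectors: "pleasant" words cluster in one direction,
# "unpleasant" words in another.
pleasant = [[0.9, 0.1], [0.8, 0.2]]
unpleasant = [[0.1, 0.9], [0.2, 0.8]]

# Toy name vectors: in a real embedding, names from different groups land
# at different distances from the attribute words.
name_a = [0.85, 0.15]  # hypothetical "European-American name" vector
name_b = [0.3, 0.7]    # hypothetical "African-American name" vector

print(association(name_a, pleasant, unpleasant) >
      association(name_b, pleasant, unpleasant))  # True
```

A gap between the two association scores is the kind of signal the paper reports: the geometry of the embedding encodes the same preferences that show up in the resume callback data.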

So does that mean people looked at a resume and explicitly said, "Oh, I think that's an African-American"? Or were they just going through huge stacks of CVs and some didn't jump out at them in the same way others did? Because we're pretty sure that by the time the candidates were all sitting in the room together, at that point things were okay.

And so what the AI is doing for them is helping them pick out the characteristics they're looking for and ignore the other characteristics. It helps them detect the things they wanted to focus on, the way they would when multiple eyes were looking at a candidate in the room, making sure they were starting from the right place, and they're finding people who were falling through the cracks.

A lot of hiring managers have trouble finding enough good applicants, or thought there weren't enough good people applying, but actually they were missing people because they didn't see the qualifications buried in the other stuff when leafing through these stacks.
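The screening idea described above can be sketched very simply. This is an assumption about the general approach, not any vendor's actual system: score each resume only against the qualifications the employer cares about, so a candidate's name or other extraneous details never enter the ranking.

```python
# A minimal sketch (illustrative assumptions, not a real hiring product)
# of qualification-only screening: the score depends solely on whether
# required skills appear in the resume text; names are never consulted.

REQUIRED_SKILLS = {"python", "sql", "statistics"}  # assumed job criteria

def score_resume(resume_text):
    """Count how many required skills appear in the resume text."""
    words = set(resume_text.lower().split())
    return len(REQUIRED_SKILLS & words)

stack = {
    "Candidate A": "Python SQL statistics ten years experience",
    "Candidate B": "Marketing communications leadership",
}
ranked = sorted(stack, key=lambda name: score_resume(stack[name]), reverse=True)
print(ranked)  # ['Candidate A', 'Candidate B']
```

A real system would do far more (parsing, synonyms, weighting), but the design choice the transcript describes is this one: surface buried qualifications mechanically, so a CV that didn't "jump out" at a tired reader still gets counted.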

So a lot of people are reporting that they have great data, or that they're very pleased with the results, but that's privately and off the record, and I can't get anyone to go on the record.
