To Build Less-Biased AI, Hire a More-Diverse Team

To combat bias in AI, companies need more diverse AI talent. After all, if none of the researchers building facial recognition systems are people of color, ensuring that non-white faces are properly distinguished may be a far lower priority. The problem is that human resume screening is itself inherently biased. Discrimination is so prevalent that minorities often actively whiten their resumes (and are subsequently more successful in the job market). Scanning resumes, whether by computer or by human, is an archaic practice best relegated to the dustbin of history. At best, it measures a candidate’s ability to tactfully boast about their accomplishments; at worst, it provides all the right ingredients for intentional or unintentional discrimination. One way companies can overcome this challenge is to embrace more objective interviewing techniques, such as project-based assessments, which ask candidates to demonstrate their abilities rather than merely claim them. Companies that keep relying on resume screening while forgoing more objective assessments need to understand the repercussions for workplace diversity — and that the practice may be perpetuating, not diminishing, the bias in their AI and analytics.

We’ve seen no shortage of scandals when it comes to AI. In 2016, Microsoft’s Tay, an AI bot built to learn in real time from social media content, turned into a misogynistic, racist troll within 24 hours of launch. A ProPublica report claimed that an algorithm — built by a private contractor — was more likely to rate Black parole candidates as higher risk. A landmark U.S. government study reported that more than 200 facial recognition algorithms — a majority of the industry — had a harder time distinguishing non-white faces. The bias in our human-built AI likely owes something to the lack of diversity in the humans who built it. After all, if none of the researchers building facial recognition systems are people of color, ensuring that non-white faces are properly distinguished may be a far lower priority.

Technology has a remarkably non-diverse workforce. A 2019 study found that under 5.7% of Google employees were Latinx and just 3.3% were Black, and similarly low rates exist across the tech industry. Those numbers are hardly better outside of tech, with Latinx and Black employees making up just 7% and 9%, respectively, of STEM workers in the general economy. (They comprise 18.5% and 13.4%, respectively, of the U.S. population.) Data science is a particular standout — by one estimate, it underrepresents women, Hispanics, and Blacks more than any other role in the tech industry. It may come as no surprise, then, that a 2019 study by the non-profit Female Founders Faster Forward (F4) found that 95% of surveyed candidates reported facing discrimination in the workplace. With such a biased workforce, how can we expect our AI to fare any better?

Sources of bias in hiring abound. Some of this comes from AI. Amazon famously had to scrap its AI recruiting bot when the company discovered it was biased against women. And it’s not just tech titans: LinkedIn’s 2018 Global Recruiting Trends survey found that 64% of employers use AI and data in recruiting, including top employers like Target, Hilton, Cisco, PepsiCo, and Ikea. But we cannot entirely blame AI — there is a much deeper and more systemic source of hiring bias. An established field of academic research suggests that human resume screening is inherently biased. Using innovative field experiments, university researchers have shown that resume screeners discriminate on the basis of race, religion, national origin, sex, sexual orientation, and age. Discrimination is so prevalent that minorities often actively whiten their resumes (and are subsequently more successful in the job market). Scanning resumes, whether by computer or by human, is an archaic practice best relegated to the dustbin of history.
