How Tech Algorithms Become Infected With the Biases of Their Creators

Bias in artificial intelligence is everywhere. At one point, Googling “doctor” returned 50 images of white men. But when governments use biased algorithms to dispatch police or surveil the public, the consequences can be a matter of life and death.

On Tuesday, OneZero reported that Banjo CEO Damien Patton had associated with members of the Ku Klux Klan in his youth, and was involved in a shooting at a Tennessee synagogue.

Banjo’s product, which is marketed to law enforcement agencies, analyzes audio, video, and social media in real time, using artificial intelligence algorithms to determine what is worthy of police attention and what is not.

To what extent do the decisions of these types of algorithms reflect the conscious or unconscious biases of their creators?

The most common type of artificial intelligence used by tech companies today is deep learning, a technique that analyzes data by breaking it down into smaller, simpler pieces and finding patterns among them. Generally, the bigger the dataset, the better an algorithm becomes at recognizing those patterns.

Say you’re developing a program to identify pets. If you’re a dog person, you might train the algorithm on a million pictures of dogs but only, say, 1,000 pictures of cats. The algorithm’s idea of what cats look like will ultimately be far less fully formed, increasing the likelihood that it will misidentify them. That, in a nutshell, is A.I. bias: poorly collected or poorly designed datasets that reflect human prejudices, and eventually shape real-world outcomes.
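The dog-and-cat imbalance above can be made concrete with a toy sketch. This is not how Banjo or Google build their systems; it is a minimal, hypothetical example using a one-dimensional “pet photo” feature and a simple nearest-neighbors vote, chosen because it shows how an overrepresented class drowns out an underrepresented one:

```python
# Toy illustration of dataset imbalance. Each training example is a
# (feature, label) pair; the feature is a made-up 1-D score extracted
# from a hypothetical pet photo.

def knn_predict(x, training, k=7):
    """Majority vote among the k training points closest to x."""
    neighbors = sorted(training, key=lambda pt: abs(pt[0] - x))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

# The "dog person" dataset: 100 dog examples but only 3 cats.
dogs = [(i / 10, "dog") for i in range(100)]   # features 0.0 .. 9.9
few_cats = [(10.2, "cat"), (10.4, "cat"), (10.6, "cat")]
imbalanced = dogs + few_cats

# A clear-cut cat (feature 10.4, right on a training cat) is still
# outvoted: its 7 nearest neighbors are 3 cats and 4 dogs.
print(knn_predict(10.4, imbalanced))   # prints "dog"

# With cats represented as richly as dogs, the same point is labeled correctly.
many_cats = [(10.0 + i / 10, "cat") for i in range(100)]
balanced = dogs + many_cats
print(knn_predict(10.4, balanced))     # prints "cat"
```

The model itself is identical in both runs; only the data changes. That is the crux of the article’s argument: the bias lives in the dataset, and the algorithm faithfully amplifies it.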

“Personal prejudices are all present in the room where choices about which systems get built and which don’t are made, which data is used… and how to determine whether the system is working well or not,” says Meredith Whittaker, cofounder of AI Now, a research institution that studies the societal impact of A.I.

Small choices, like which dataset is used to recognize a specific event to display on a crime dashboard, can scale into discriminatory practices.

One of the most infamous examples of bias in artificial intelligence emerged in 2015, when Google added a feature to Google Photos that sorted images based on what an algorithm thought was in them. The feature was mostly innocuous. It recognized dogs and flowers and buildings. But it also classified people with darker skin as gorillas.

“I understand HOW this happens; the problem is moreso [sic] on the WHY,” programmer Jacky Alcine, who first tweeted about the algorithm’s mistake, wrote on Twitter.

Google said it was “appalled” and issued what it called a temporary fix: the app simply stopped applying the “gorilla” tag at all. But the biased outcome proved tough to disentangle. More than two years later, search results for “gorilla” and “monkey” were still censored.

It’s unknown what caused the mistake.
