AI watchdog needed to regulate automated decision-making, say experts

An artificial intelligence watchdog should be set up to make sure people are not discriminated against by the automated computer systems making important decisions about their lives, say experts.

The rise of artificial intelligence (AI) has led to an explosion in the number of algorithms that are used by employers, banks, police forces and others, but the systems can, and do, make bad decisions that seriously impact people’s lives. But because technology companies are so secretive about how their algorithms work – to prevent other firms from copying them – they rarely disclose any detailed information about how AIs have made particular decisions.

In a new report, Sandra Wachter, Brent Mittelstadt, and Luciano Floridi, a research team at the Alan Turing Institute in London and the University of Oxford, call for a trusted third party body that can investigate AI decisions for people who believe they have been discriminated against.

“What we’d like to see is a trusted third party, perhaps a regulatory or supervisory body, that would have the power to scrutinise and audit algorithms, so they could go in and see whether the system is actually transparent and fair,” said Wachter.

It is not a new problem. Back in the 1980s, an algorithm used to sift student applications at St George’s Hospital Medical School in London was found to discriminate against women and people with non-European-looking names. More recently, a veteran American Airlines pilot described how he had been detained at airports on 80 separate occasions after an algorithm repeatedly confused him with an IRA leader. Others to fall foul of AI errors have lost their jobs, had driving licences revoked, been kicked off the electoral register or mistakenly chased for child support bills.

People who find themselves on the wrong end of a flawed AI can challenge the decision under national laws, but the report finds that the current protections are not effective enough.

In Britain, the Data Protection Act allows automated decisions to be challenged. But UK firms, in common with those in other countries, do not have to release any information they consider a trade secret. In practice, this means that instead of releasing a full explanation for a specific AI decision, a company can simply describe how the algorithm works. For example, a person turned down for a credit card might be told that the algorithm took their credit history, age, and postcode into account, but not learn why their application was rejected.
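To make that distinction concrete, here is a minimal, hypothetical sketch in Python. The weights, threshold and field names are invented for illustration and are not drawn from any real lender’s system; the point is only to contrast a generic description of an algorithm (which inputs it considers) with a decision-specific explanation (how those inputs combined to reject one particular applicant).

```python
# Hypothetical toy credit-scoring model: all weights and the threshold are
# invented for illustration, not taken from any real system.
WEIGHTS = {"credit_history_score": 0.6, "age": 0.3, "postcode_risk": -0.4}
APPROVAL_THRESHOLD = 50.0


def describe_algorithm():
    """The generic disclosure the article describes: which inputs the
    algorithm considers, with nothing about a particular applicant."""
    return "Decisions are based on: " + ", ".join(WEIGHTS)


def explain_decision(applicant):
    """A decision-specific explanation: each input's contribution to this
    applicant's score, and the resulting outcome."""
    contributions = {name: WEIGHTS[name] * applicant[name] for name in WEIGHTS}
    score = sum(contributions.values())
    outcome = "approved" if score >= APPROVAL_THRESHOLD else "rejected"
    return outcome, score, contributions


applicant = {"credit_history_score": 40, "age": 30, "postcode_risk": 20}

# Generic description: lists the inputs, explains nothing about this case.
print(describe_algorithm())

# Specific explanation: shows why this applicant's score fell short.
outcome, score, contributions = explain_decision(applicant)
print(outcome, round(score, 1), contributions)
```

Under the current rules as the report describes them, a company could satisfy its obligations by disclosing only something like the first output; the researchers’ “right to explanation” would correspond to the second.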

In 2018, EU member states, including Britain, will adopt new legislation governing how AI decisions can be challenged. Early drafts of the General Data Protection Regulation (GDPR) enshrined what is called a “right to explanation” in law. But the authors argue that the final version approved last year contains no legal guarantee.

“There is an idea that the GDPR will deliver accountability and transparency for AI, but that’s not at all guaranteed. It all depends on how it is interpreted in the future by national and European courts,” Mittelstadt said. The best the new regulation offers is a “right to be informed” compelling companies to reveal the purpose of an algorithm, the kinds of data it draws on to make its decisions, and other basic information. In their paper, the researchers argue for the regulation to be amended to make the “right to explanation” legally binding.

“We are already too dependent on algorithms to give up the right to question their decisions,” said Floridi. “The GDPR should be improved to ensure that such a right is fully and unambiguously supported”.

For the study, the authors reviewed legal cases in Austria and Germany, which have some of the strictest laws governing decision-making algorithms. In many cases, they found that courts required companies to release only the most general information about the decisions algorithms had made.
