Tech-savvy investigators are ready to put algorithms under the microscope — if companies let them | CBC News


Much as companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

In the nascent field of algorithm auditing, researchers evaluate the behaviour of decision-making code


Matthew Braga · CBC News

Decades before you could buy a plane ticket on your phone, there were computerized reservation systems (CRS). These were rudimentary information systems used by travel agents to book customers' flights. And they had one devious flaw.

By the early 1980s, 80 per cent of travel agencies used the systems operated by American and United airlines. And it didn't take long before the two airlines realized they could use that dominance to their advantage — namely, by writing code designed to prioritize their own flights on CRS screens over those of their competitors.
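To see how little code that kind of favouritism requires, here is a minimal, hypothetical sketch in Python. The flight data, the HOST_AIRLINE name and both ranking functions are invented for illustration; they are not drawn from the actual CRS software.

```python
# Hypothetical illustration of "screen bias": ranking results so the
# host carrier's flights float to the top regardless of price or timing.
HOST_AIRLINE = "HostAir"

flights = [
    {"airline": "RivalJet", "depart": "08:00", "price": 199},
    {"airline": "HostAir",  "depart": "09:30", "price": 240},
    {"airline": "RivalJet", "depart": "10:15", "price": 185},
]

def neutral_rank(flight):
    # A fair ordering: cheapest first, then earliest departure.
    return (flight["price"], flight["depart"])

def biased_rank(flight):
    # The biased ordering: host-carrier flights sort ahead of everything else;
    # only then do price and departure time matter.
    return (flight["airline"] != HOST_AIRLINE, flight["price"], flight["depart"])

print([f["airline"] for f in sorted(flights, key=neutral_rank)])  # RivalJet listed first
print([f["airline"] for f in sorted(flights, key=biased_rank)])   # HostAir listed first
```

Sorting on a tuple whose first element asks "is this a rival airline?" is enough to push competitors below the fold, which is why the bias was invisible to anyone who only saw the finished screen.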

Naturally, U.S. aviation regulators weren't pleased, and the companies were ordered to cut it out. But the case — described in a 2014 paper from researcher Christian Sandvig — lives on today as one of the earliest examples of algorithmic bias.

It's a reminder that algorithms aren't always as neutral or well-intentioned as their creators might think — or want us to believe — a reality that's more evident today than it's ever been.


In U.S. courts, reports generated by proprietary algorithms are already being factored into sentencing decisions — and some have cast doubt on the accuracy of the results. Sexist training sets have taught image recognition software to associate photos of kitchens with women more than men.

And perhaps most famously, Facebook has been the target of repeated accusations that its platform, which serves content according to complex algorithms, helped amplify the spread of fake news and disinformation, potentially influencing the outcome of the 2016 U.S. presidential election.

Yet, given the important role algorithms play in so many parts of our lives, we know incredibly little about how these systems work. It's why a growing number of academics have established a nascent field for algorithmic audits. Much like companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

Algorithmic auditors

For now, it's mostly researchers operating on their own, devising ways to poke and prod at popular software and services from the outside — varying the inputs in an effort to find evidence of discrimination, bias or other flaws in what comes out.
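Concretely, such an outside-in audit often takes the form of paired testing: send the system matched inputs that differ only in a sensitive attribute and measure how the outputs differ. The sketch below is a hypothetical Python illustration; the score() function merely stands in for the opaque system an auditor would actually query, and the deliberate penalty it applies to group B is invented so the test has something to find.

```python
import random

def score(applicant):
    # Hypothetical stand-in for the opaque system under audit; a real audit
    # would call a live service or product here instead.
    base = 0.5 + 0.004 * (applicant["income"] - 50)
    if applicant["group"] == "B":
        base -= 0.08  # invented flaw: group B is quietly penalized
    return base

def paired_test(n=10_000, seed=0):
    """Send matched pairs that differ only in the sensitive attribute
    and measure the average gap in the system's output."""
    rng = random.Random(seed)
    gaps = []
    for _ in range(n):
        income = rng.uniform(20, 120)
        a = score({"group": "A", "income": income})
        b = score({"group": "B", "income": income})
        gaps.append(a - b)
    return sum(gaps) / len(gaps)

print(f"Average score gap between matched A/B pairs: {paired_test():.3f}")
```

A real audit would need far more care about sampling, realism of the test inputs and statistical significance, but the shape of the method is the same: hold everything else fixed, vary one attribute, and watch what the black box does.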

Some of the field's experts envision a future where crack teams of researchers are called in by companies — or perhaps on the order of a regulator or judge — to more thoroughly evaluate how a particular algorithm behaves.

There are signs this day is fast approaching.

Last year, the White House called on companies to evaluate their algorithms for bias and fairness through audits and external tests. In Europe, algorithmic decisions believed to have been made in error or unfairly may soon be subject to a "right to explanation" — though how exactly this will work in practice is not yet clear.

A Harvard project called VerifAI is in the early stages of defining "the technical and legal foundations necessary to establish a due process framework for auditing and improving decisions made by artificial intelligence systems as they evolve over time."


Harvard is one of a handful of schools — including Oxford and Northwestern — with researchers studying algorithmic audits, and a new conference devoted to the subject will kick off in New York next year.

Outside academia, consulting giant Deloitte now has a team that advises clients on how they can manage "algorithmic risks." And mathematician Cathy O'Neil launched an independent algorithm consultancy of her own last year, pledging "to set rigorous standards for the new field of algorithmic auditing."

Scrutinizing secret code

All of this is happening amidst rising political backlash against some of the most powerful tech companies in the world, whose opaque algorithms increasingly shape what we read and how we communicate online with little external scrutiny.

One of the challenges, says Solon Barocas, who researches accountability in automated decision-making at Cornell University, will be determining what, exactly, to scrutinize and how. Tech companies aren't regulated the same way as other industries, and the mechanisms that are already used to evaluate discrimination and bias in areas such as hiring or credit may not easily apply to the decisions that, say, a personalization or recommendation engine makes.
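Hiring decisions, for example, can be checked against well-established yardsticks such as the U.S. "four-fifths rule," under which a selection rate below 80 per cent of the most-favoured group's rate is treated as evidence of disparate impact. The check itself is easy to compute, as the hypothetical sketch below shows with made-up numbers; the difficulty Barocas points to is that no comparably settled test exists for, say, a news feed's ranking decisions.

```python
def four_fifths_check(selected, applicants):
    """Compute selection rates per group and flag disparate impact under the
    four-fifths (80%) rule. Inputs are dicts keyed by group name."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    best = max(rates.values())
    flagged = {g: rate / best < 0.8 for g, rate in rates.items()}
    return rates, flagged

# Invented numbers for illustration only.
rates, flagged = four_fifths_check(
    selected={"group_a": 48, "group_b": 27},
    applicants={"group_a": 100, "group_b": 100},
)
print(rates)    # {'group_a': 0.48, 'group_b': 0.27}
print(flagged)  # group_b's rate (0.27) is below 0.8 * 0.48 = 0.384 -> flagged
```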

And in the absence of oversight, there's also the challenge of convincing companies there's value in letting in algorithmic auditors. O'Neil, the mathematician and a well-known figure in the field, says her consulting firm has no signed clients — "yet."

Barocas thinks companies "actually fear putting themselves in greater risk by doing these kinds of tests." He suggests some companies may actually prefer to keep themselves — and their users — in the dark by not auditing their systems, rather than discover a bias they don't know how to fix.

But whether companies choose to embrace external audits or not, greater scrutiny may be inevitable. Secret and unknowable code governs more parts of our lives with each passing day. When Facebook has the power to potentially influence an election, it's not surprising that a growing number of outside observers want to better understand how these systems work, and why they make the decisions they do.

ABOUT THE AUTHOR


Matthew Braga

Senior Technology Reporter

Matthew Braga is the senior technology reporter for CBC News, where he covers stories about how data is collected, used, and shared. You can contact him via email at matthew.braga@cbc.ca. For particularly sensitive messages or documents, consider using Secure Drop, an anonymous, confidential system for sharing encrypted information with CBC News.


