Data workers detail exploitation by tech industry in DAIR report

The essential labor of data work, like moderation and annotation, is systematically hidden from those who benefit from the fruits of that labor. A new project puts the lived experiences of data workers around the world in the spotlight, showing firsthand the costs and opportunities of tech work abroad.

Many tedious, thankless, or psychologically damaging tasks have been outsourced to poorer countries, where workers are happy to take on jobs for a fraction of an American or European wage. This labor market joins other jobs in the "dull, dirty, or dangerous" category, like electronics "recycling" and shipbreaking. The conditions in moderation or annotation work aren't as likely to cost you an arm or give you cancer, but that doesn't make them safe, much less pleasant or rewarding.

The Data Workers’ Inquiry, a collaboration between the AI ethics research group DAIR and TU Berlin, is nominally modeled on Marx’s late-19th-century workers’ inquiries into labor conditions; its reports are “collectively produced and politically actionable.”

All the reports are freely available and were launched today at an online event where those running the project discussed their work.

The ever-expanding scope of AI applications is built by necessity on human expertise, and that expertise is bought to this day for the lowest dollar value companies can offer without incurring a public relations problem. When you report a post, it doesn’t say “great, we’ll send this to a guy in Syria who will be paid 3 cents to take care of it.” But the volume of reports (and of content deserving of a report) is so high that solutions other than mass outsourcing of the work to cheap labor markets don’t really make sense to the companies involved.

The reports are largely anecdotal, and deliberately so: they sit closer to systematic anthropological observation than to quantitative analysis.

Quantifying experiences like these often fails to capture the real costs — the statistics you end up with are the type that companies love to trumpet (and therefore to solicit in studies): higher wages than other companies in the area, job creation, savings passed on to clients. Seldom are things like moderation workers losing sleep to nightmares or rampant chemical dependency mentioned, let alone measured and presented.

Take Fasica Berhane Gebrekidan’s report on Kenyan data workers struggling with mental health and drug issues.

She and her colleagues worked for Sama, which bills itself as a more ethical data work pipeline, but the reality of the job, as the workers themselves describe it, is unrelenting misery and a lack of support from the local office.

A whistleblower’s image of the moderation workspace at Samasource in Kenya. (Image credits: Fasica Berhane Gebrekidan)

Recruited to handle tickets (i.e., flagged content) in local languages and dialects, they are exposed to a never-ending stream of violence, gore, sexual abuse, hate speech and other content that they must view and “action” quickly lest their performance fall below expected levels, leading to docked pay, the report says. For some that’s more than one per minute, meaning they view a minimum of around 500 such items a day. (In case you’re wondering where the AI is here — they are likely providing the training data.)

“It’s absolutely soul-crushing. I’ve watched the worst things one can imagine. I’m afraid that I will be scarred for life for doing this job,” said Rahel Gebrekirkos, one of the contractors interviewed.

Support personnel were “ill-equipped, unprofessional, and under-qualified,” and moderators frequently turned to drugs to cope, complaining of intrusive thoughts, depression, and other problems.

We’ve heard some of this before, but it bears repeating that it is still happening. Several of the reports are of this type; others are more personal stories or take different formats.

For instance, Yasser Yousef Alrayes is a data annotator in Syria, working to pay for his higher education. He and his roommate work together on visual annotation tasks like parsing images of text that, as he points out, are often poorly defined, with frustrating demands from clients.

He chose to document his work in the form of a short film that is well worth eight minutes of your time.

Workers like Yasser are often obscured behind many organizational layers, acting as subcontractors to subcontractors so that lines of responsibility are obfuscated should there ever be a problem or lawsuit.

DAIR and TU Berlin’s Milagros Miceli, one of the leaders of the project, told me they had not seen any comment or changes from the companies named in the reports, but noted that it was still early. The results seem strong enough for the team to go back for more: “We’re planning to continue this work with a second cohort of data workers,” she wrote, “most probably from Brazil, Finland, China, and India.”

No doubt there are some who will discount these reports for the very quality that makes them valuable: their anecdotal nature. But while it’s easy to lie with statistics, anecdotes always carry at least some truth, for these stories are taken directly from the source. Even if these were the only dozen moderators in Kenya, or Syria, or Venezuela with these problems, what they say should concern anyone who relies on them, which is to say just about everyone.

