Hidden Workers Helping to Make AI Systems Work Well, Fit Into Workplace
By John P. Desmond, AI Trends Editor
Behind-the-scenes workers who enable AI rockstar developers and data scientists to shine, by ensuring that data is coded, pictures are flagged, or the system is integrated into the workplace, are often overlooked and undervalued.
That is the message of several speakers in a recent session of the EmTech Digital conference hosted by MIT Technology Review.
AI systems often fail to account for the humans who incorporate AI systems into existing workflow, workers doing behind-the-scenes labor to make the programs run, and the people who are negatively affected by AI outcomes, according to an account in the MIT Sloan Management Review.
“This is a common pattern in the social studies of technology,” stated Madeleine Clare Elish, a senior research scientist at Google, who is on the AI Ethical Team. “A focus on new technology, the latest innovation, comes at the expense of the humans who are working to actually allow that innovation to function in the real world… overlooking the role of humans misses what’s actually going on.”
Experiences in a previous job leading the AI on the Ground Initiative at the Data & Society Research Institute taught Elish some lessons. She studied an initiative of the Duke University Health System called Sepsis Watch, a clinical decision support system that used AI to predict a patient's risk of sepsis, a leading cause of death in hospitals that is difficult to diagnose and treat quickly.
The program had positive results, but for workers in the hospital, Sepsis Watch was disruptive. It did not fit into the routines the rapid response nurses and doctors were practicing. It fell to the nurses to figure out the best way to communicate the results to doctors; the nurses had to fit the Sepsis Watch information into existing emergency room practices.
‘Repair work’ Fits Technology into the Specific Context
“This hadn’t even crossed the minds of the tech development team, but this strategy proved essential,” stated Elish. In this case, “We saw skilled individuals performing essential but overlooked and undervalued work.”
Elish and her fellow researchers coined a term to describe what the nurses did: "repair work"—work required to make a technology effective in a specific context, and to weave that technology into existing work practices, power dynamics, and cultural contexts. The people doing innovative on-the-ground work to get new AI programs functioning effectively get left out of the story of technology development.
In this way, “So much of the actual day-to-day work that is required to make AI function in the world is rendered invisible, and then undervalued,” Elish stated.
She also has suggestions about some of the language used to describe putting developed AI systems to work. "I try to avoid talking about 'deploying systems,'" she stated. "Deploy is a military term. It connotes a kind of contextless dropping in. And what we actually need to do with systems is integrate them into a particular context. And when you use words like 'integrate,' it requires you to say, 'Integrate into what, and with whom?'"
The nurses were respected at Duke Health, so they were given the room to improvise and create ways to communicate about sepsis risk scores. The creators of AI systems, she suggested, need to allocate resources toward supporting the “repair work” required, and make those doing that work part of the project from beginning to end.
Crowd-Sourcing Platforms, Including MTurk, Rely on “Invisible” Workers
Prime examples of invisible workers performing what some call "ghost work" are the crowd-sourcing platforms employing hordes of workers to help make AI programs successful. They perform tasks such as tagging images and classifying and labeling data. Some are beginning to question whether these invisible workers are being exploited, according to a recent account from the BBC.
The most well-established crowdsourcing platform is Amazon Mechanical Turk, known as MTurk, run by Amazon’s Web Services division. Other crowdsourcing platforms include Samasource, CrowdFlower and Microworkers, each enabling businesses to remotely hire workers from anywhere in the world.
The work can include labelling images so that computer vision algorithms improve, providing help for natural language processing, or moderating content for YouTube or Twitter.
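Because any single crowd worker can make mistakes, requesters typically have several workers label the same item and then combine the answers. A minimal sketch of the common majority-vote aggregation pattern (this is an illustration of standard practice, not a method described in the article):

```python
from collections import Counter

def aggregate_labels(worker_labels):
    """Return the majority-vote label and its vote share for one item.

    worker_labels: list of labels assigned by different workers
    to the same image or piece of text.
    """
    counts = Counter(worker_labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(worker_labels)

# Three workers label the same image; two of three agree on "cat".
label, agreement = aggregate_labels(["cat", "cat", "dog"])
```

Low agreement scores are often used to flag items for re-labeling or expert review.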
MTurk is named after an 18th Century chess-playing machine which toured Europe, and was later revealed to be a hoax, with a human inside directing the chess moves. [Ed. Note: Interesting choice of a name by Amazon.]
MTurk is described on its website as a crowdsourcing marketplace and “a great way to minimize the costs and time for each stage of machine-learning development.” Using a marketplace, customers request workers to perform specific tasks, for which they name a price. An AWS spokesperson was quoted as saying, “Most workers see MTurk as part-time work or a paid hobby, and they enjoy the flexibility to choose the tasks they want to work on and work as much or as little as they like,” according to the BBC account.
Sherry Stanley has been an MTurk worker for six years, a job that has helped her financially while raising three children. “Turking is one of the few job opportunities I have in West Virginia, and like many other Turk workers, we pride ourselves on our work,” she stated to the BBC.
“However, we are at the whim of Amazon. As one of the largest companies in the world, Amazon relies on workers like me staying silent about the conditions of our work.”
She told the BBC that she lived “in constant fear of retaliation for speaking out about the ways we’re being treated”.
The hours and the pay vary day by day. Issues she has with the platform include: work being rejected, sometimes with no reason given; accounts being suddenly suspended without notice and with no official avenue for challenging the suspension; and extremely low rates being set by some requesters.
“Turk workers deserve greater transparency around the who, what, why and where of our work,” Stanley stated. The advocacy group Turkopticon is working to make Turk workers feel less invisible. “Turkopticon is the one tool that Turkers have evolved into an organization to engage with each other about the conditions of our work and to make it better,” Stanley stated.
Another voice recently raised on the situation of these high tech ghost workers was that of Alexandrine Royer, an education manager at the Montreal AI Ethics Institute, writer of an account headlined, “The urgent need for regulating global ghost work,” in TechStream from Brookings.
“The decisions made by data workers in Africa and elsewhere, who are responsible for data labelling and content moderation decisions on global platforms, feed back into and shape the algorithms internet users around the world interact with every day,” Royer stated. “Working in the shadows of the digital economy, these so-called ghost workers have immense responsibility as the arbiters of online content.”
Lots of internet content relies on this unseen labor. “It is high time we regulate and properly compensate these workers,” she stated.
Read the source articles and information in the MIT Sloan Management Review, from the BBC and in TechStream.