[iDC] Recent coverage of digital labor work and workers on NPR
Sarah T. Roberts
sarah.roberts at uwo.ca
Tue Nov 26 13:15:51 UTC 2013
Greetings, all -
Of potential interest to members of this list is the recent coverage on NPR’s All Things Considered Weekend of the practice of commercial content moderation (CCM) and the workers who do it. I have included the text version of the story below; the link goes to an audio version.
I was heartened that someone there had taken interest in this workforce and practice, and so was pleased to participate in the interview, although the brief report really only touched on a small aspect of what goes on in that world.
Of particular note: Microsoft and Google “declined to make any workers available” for the story (quelle surprise), and Microsoft’s PR person refers to CCM as “a yucky job.”
You don’t say.
—Sarah
—
On Thursday, authorities in Canada announced the bust of an enormous international child pornography ring. It was the end of a three-year investigation into a website that trafficked in illicit videos of young boys. More than 300 people have been arrested in connection with the videos, 76 of them in the United States.
Although busts like this one end with press conferences and high-profile trials, they begin far away from the public eye, with one of the most difficult jobs in the world: content moderation.
The rise of Internet porn has created a shadow industry of people whose job it is to screen vast numbers of images for child pornography.
Richard Brown knows how difficult it can be to see this kind of content. He used to be in charge of the Internet Crimes Against Children task force for the New Jersey State Police. Part of his job was to look through the hard drives of suspects, image by image. And it was hard to forget what he saw.
"I have 2 boys," he says, "and I remember being ultra-protective of my boys during the time that I was involved in this type of work, and I think that's pretty common."
Now, Brown is a law enforcement liaison for the International Center for Missing and Exploited Children. That organization is developing a program that can help police departments automate the reviewing of images of child sexual abuse.
Internet search providers like Google and Microsoft are also investing in similar programs. Samantha Doerr, the director of public affairs and child protection at Microsoft, explains that automation is important because "unlike any other kind of offensive content online, the image itself is a crime scene, and every new viewing of that image is a re-victimization of that child."
The Microsoft system, known as PhotoDNA, was co-developed by a team at Dartmouth and has since been donated to the National Center for Missing and Exploited Children. Earlier this year, Twitter began using it to scan every photo that's uploaded to its site.
The program works by scanning known images of child pornography and giving them a unique signature that goes into a database. If that image appears on another site, it is instantly flagged and removed. Google has its own proprietary tagging system that works in a similar way.
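For readers curious about the general technique, here is a minimal sketch in Python of the signature-matching idea the story describes. PhotoDNA itself is proprietary and far more robust; the simple average hash below is only a stand-in to illustrate how a compact signature can be computed and compared against a database of already-reviewed images, and the file names, threshold, and function names are purely hypothetical.

```python
# Illustrative sketch of signature matching against a database of
# known, human-reviewed images. This is NOT PhotoDNA; an average hash
# is used only to show the flag-on-match idea.

from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale thumbnail and set one bit per
    pixel brighter than the mean, yielding a 64-bit signature."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two signatures differ."""
    return bin(a ^ b).count("1")


def matches_known_signature(path: str, known: set[int], threshold: int = 5) -> bool:
    """Flag an image whose signature is within `threshold` bits of any
    signature already in the database of reviewed images."""
    sig = average_hash(path)
    return any(hamming_distance(sig, k) <= threshold for k in known)


# Hypothetical usage: the database would be seeded with signatures of
# images a human reviewer has already identified.
known_signatures = {average_hash("already_reviewed.png")}
print(matches_known_signature("newly_uploaded.png", known_signatures))
```

Allowing a small Hamming distance, rather than requiring an exact hash match, is what lets this style of system catch lightly altered copies of a known image, which is the point of using a perceptual signature instead of a cryptographic checksum.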
But in order for an image to be identified as child pornography in the first place, a person has to review it. The people who do that work for tech companies are employed all over the world, and very little is known about them, says Sarah Roberts of Western University in Ontario, Canada.
Roberts studies the workers who are part of the new content moderation industry, and she explains that one reason so little is known about them is that most companies require their employees to sign nondisclosure agreements.
"They're precluded from speaking to the media, and it is difficult to reach out and find them," Roberts says. "I think there's an aspect of trauma that can often go along with this work and many workers would rather go home and tune out not talk about it. So I think the unknown aspect of this is by design. It's no mistake that it's difficult to find workers who will talk to you about this."
Many of the workers Roberts has spoken to anonymously have said they feel stigmatized because of the content they come in contact with through their jobs.
"It's exacting a toll on these workers, and because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life," she says.
But the demand for content moderators is only growing. In March, the eight tech companies that belong to the Technology Coalition against child exploitation online released guidelines for how to support employees who come in contact with child pornography as part of their jobs.
The guidelines suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. The guidelines also say companies should have a counseling hotline for employees.
Roberts says providing resources may not be enough.
"If someone were to access some of these support services," she says, "there may be an implicit suggestion that they're not cut out for the kind of work they're trying to do for a living.”
http://www.npr.org/2013/11/17/245829002/laboring-in-the-shadows-to-keep-the-web-free-of-child-porn?live=1
—
S a r a h T. R o b e r t s
Assistant Professor
Faculty of Information and Media Studies (FIMS)
Western University
http://fims.uwo.ca/index.htm
Blogging periodically at
http://illusionofvolition.com