J. Nathan Matias in the Atlantic: “….Milland and other regular Turkers navigate this precariously free market with Turkopticon, a DIY technology for rating employers created in 2008. To use it, workers install a browser plugin that extends Amazon’s website with special rating features. Before accepting a new task, workers check how others have rated the employer. After finishing, they can also leave their own rating of how well they were treated. Collective rating on Turkopticon is an act of citizenship in the digital world. This digital citizenship acknowledges that online experiences are as much a part of our common life as our schools, sidewalks, and rivers—requiring as much stewardship, vigilance, and improvement as anything else we share.
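To make the plugin mechanism concrete, here is a minimal sketch of how a Turkopticon-style browser extension might look up a requester’s peer ratings and inject them beside each task listing. The `data-requester-id` attribute, the `reviews.example.org` endpoint, and the `RequesterRating` shape are illustrative assumptions, not Turkopticon’s actual code or API.

```typescript
// Content-script sketch: annotate each task listing with the requester's peer rating.
// Selectors, endpoint, and response shape are placeholders for illustration only.

interface RequesterRating {
  requesterId: string;
  averageScore: number; // e.g. a 1-5 aggregate from worker reviews
  reviewCount: number;
}

async function fetchRating(requesterId: string): Promise<RequesterRating | null> {
  const resp = await fetch(`https://reviews.example.org/api/requesters/${requesterId}`);
  if (!resp.ok) return null;
  return (await resp.json()) as RequesterRating;
}

async function annotateTaskListings(): Promise<void> {
  // Assume each task row carries the requester's id in a data attribute.
  const rows = document.querySelectorAll<HTMLElement>("[data-requester-id]");
  for (const row of rows) {
    const requesterId = row.dataset.requesterId;
    if (!requesterId) continue;
    const rating = await fetchRating(requesterId);
    const badge = document.createElement("span");
    badge.textContent = rating
      ? ` ★ ${rating.averageScore.toFixed(1)} (${rating.reviewCount} reviews)`
      : " (no reviews yet)";
    row.appendChild(badge); // Visible before the worker decides whether to accept the task.
  }
}

annotateTaskListings();
```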
“How do you fix a broken system that isn’t yours to repair?” That’s the question that motivated the researchers Lilly Irani and Six Silberman to create Turkopticon, and it’s one that comes up frequently in digital environments dominated by large platforms with hands-off policies. (On social networks like Twitter, for example, harassment is a problem for many users.) Irani and Silberman describe Turkopticon as a “mutual aid for accountability” technology, a system that coordinates peer support to hold others accountable when platforms choose not to step in.
Mutual-aid accountability is a growing response to the complex social problems people face online. On Twitter, systems like The Block Bot and BlockTogether coordinate collective judgments about alleged online harassers. The systems then collectively block tweets from accounts that a group prefers not to hear from. Last month, the advocacy organization Hollaback raised over $20,000 on Kickstarter to create support networks for people experiencing harassment. In November, I worked with the advocacy organization Women, Action, and the Media, which took a role as “authorized reporter” with Twitter. For three weeks WAM! accepted reports, sorted evidence, and forwarded serious cases to Twitter. In response, the company warned, suspended, and deleted the accounts of many alleged harassers.
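As a rough illustration of how a shared block list operates, the sketch below subscribes to a community-curated list of account ids and applies any new blocks to the subscriber’s own account. The `BlockingClient` interface, its methods, and the list format are hypothetical stand-ins, not the actual Block Bot or BlockTogether implementations.

```typescript
// Sketch of a shared block-list subscriber in the spirit of The Block Bot or BlockTogether.
// `BlockingClient` stands in for whatever platform API the subscriber authenticates with;
// the list URL and its JSON format are assumptions for this example.

interface BlockingClient {
  listBlockedIds(): Promise<Set<string>>;
  block(accountId: string): Promise<void>;
}

async function syncSharedBlockList(client: BlockingClient, listUrl: string): Promise<number> {
  // Curators publish a plain array of account ids they have collectively flagged.
  const resp = await fetch(listUrl);
  const sharedIds: string[] = await resp.json();

  const alreadyBlocked = await client.listBlockedIds();
  let applied = 0;
  for (const id of sharedIds) {
    if (!alreadyBlocked.has(id)) {
      await client.block(id); // Apply the group's judgment to this subscriber's account.
      applied += 1;
    }
  }
  return applied; // Number of newly applied blocks.
}
```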
These mutual-aid technologies operate in the shadow of larger systems with gaps in how people are supported—even when platforms do step in, says Stuart Geiger, a Berkeley Ph.D. student. In other words, sometimes a platform’s system-wide solutions to a problem can create their own problems. For several years, Geiger and his colleague Aaron Halfaker, now a researcher at Wikimedia, were concerned that Wikipedia’s semi-automated anti-vandalism systems might be making the site unfriendly. As a graduate student unable to change Wikipedia’s code, Halfaker created Snuggle, a mutual-aid mentorship technology that tracks the site’s spam responders. When Snuggle users think a newcomer’s edits were mistakenly flagged as spam, the software coordinates Wikipedians to help those users recover from the negative experience of getting reverted.
By organizing peer support at scale, the designers of Turkopticon and its cousins draw attention to common problems, hoping to influence longer-term change on a complex issue. In time, the idea goes, requesters on Mechanical Turk might change their treatment of workers, Amazon might change its policies and software, or regulators might set new rules for digital labor. This is an approach with a long history in an area that might seem unlikely: the conservation movement. (Silberman and Irani cite the movement as inspiration for Turkopticon.)
To better understand how this approach might influence digital citizenship, I followed the history of mutual-aid accountability in a precious common resource that the city of Boston enjoys every day: the Charles River. Planned, re-routed, exploited, and contested, it has inspired and supported human life since before written history….(More)”