Essay by Sun-ha Hong: “In a society beset with black-boxed algorithms and vast surveillance systems, transparency is often hailed as liberal democracy’s superhero. It’s a familiar story: inject the public with information to digest, then await their rational deliberation and improved decision making. Whether in discussions of facial recognition software or platform moderation, we run into the argument that transparency will correct the harmful effects of algorithmic systems. The trouble is that in our movies and comic books, superheroes are themselves deus ex machina: black boxes designed to make complex problems disappear so that the good guys can win. Too often, transparency is asked to save the day on its own, under the assumption that disinformation or abuse of power can be shamed away with information.
Transparency without adequate support, however, can quickly become fuel for speculation and misunderstanding….
All this is part of a broader pattern in which the very groups who should be held accountable by the data tend to be its gatekeepers. Facebook is notorious for transparency-washing strategies, in which it dangles data access like a carrot but rarely follows through on actually delivering it. When researchers worked to create more independent means of holding Facebook accountable — as New York University’s Ad Observatory did last year, using volunteer researchers to build a public database of ads on the platform — Facebook threatened to sue them. Despite the lofty rhetoric around Facebook’s Oversight Board (often described as a “Supreme Court” for the platform), it falls into the same trap of transparency without power: the scope is limited to individual cases of content moderation, with no binding authority over the company’s business strategy, algorithmic design, or even similar moderation cases in the future.
Here, too, the real bottleneck is not information or technology, but power: the legal, political and economic pressure necessary to compel companies like Facebook to produce information and to act on it. We see this all too clearly when ordinary people do take up this labour of transparency, and attempt to hold technological systems accountable. In August 2020, Facebook users reported the Kenosha Guard group more than 400 times for incitement of violence. But Facebook declined to take any action until an armed shooter travelled to Kenosha, Wisconsin, and killed two protesters. When transparency is compromised by the concentration of power, it is often the vulnerable who are asked to make up the difference — and then to pay the price.
Transparency cannot solve our problems on its own. In his book The Rise of the Right to Know, journalism scholar Michael Schudson argues that transparency is better understood as a “secondary or procedural morality”: a tool that only becomes effective by other means. We must move beyond the pernicious myth of transparency as a universal solution, and address the distribution of economic and political power that is the root cause of technologically amplified irrationality and injustice….(More)”.