Tech groups cannot be allowed to hide from scrutiny
Marietje Schaake at the Financial Times: “Technology companies have governments over a barrel. Whether they are maximising traffic flow efficiency, matching pupils with their school preferences, or trying to anticipate drought based on satellite and soil data, most governments rely heavily on critical infrastructure and artificial intelligence developed by the private sector. This growing dependence has profound implications for democracy.

An unprecedented information asymmetry is growing between companies and governments. We can see this in the long-running investigation into interference in the 2016 US presidential elections. Companies build voter registries, voting machines and tallying tools, while social media companies sell precisely targeted advertisements using information gleaned by linking data on friends, interests, location, shopping and search.

This has big privacy and competition implications, yet oversight is minimal. Governments, researchers and citizens risk being blindsided by the machine room that powers our lives and vital aspects of our democracies. Governments and companies have fundamentally different incentives on transparency and accountability.

While openness is the default and secrecy the exception for democratic governments, companies resist providing transparency about their algorithms and business models. Many of them actively prevent accountability, citing rules that protect trade secrets.

We must revisit these protections when they shield companies from oversight. There is a place for protecting proprietary information from commercial competitors, but the scope and context need to be clarified and balanced when they have an impact on democracy and the rule of law.

Regulators must act to ensure that those designing and running algorithmic processes do not abuse trade secret protections.

Tech groups also use the EU’s General Data Protection Regulation to deny access to company information. Although the regulation was enacted to protect citizens against the mishandling of personal data, it is now being wielded cynically to deny scientists access to data sets for research. The European Data Protection Supervisor has intervened, but problems could recur.

To mitigate concerns about the power of AI, provider companies routinely promise that the applications will be understandable, explainable, accountable, reliable, contestable, fair and — don’t forget — ethical.

Yet there is no way to test these subjective notions without access to the underlying data and information. Without clear benchmarks and information to match, proper scrutiny of the way vital data is processed and used will be impossible….(More)”.