Digital Technology and the Resurrection of Trust


Report by the Select Committee on Democracy and Digital Technologies (UK Parliament): “Democracy faces a daunting new challenge. The age in which electoral activity was conducted through traditional print media, canvassing and door knocking is rapidly vanishing. Instead, it is dominated by digital and social media. They are now the source from which voters get most of their information and political messaging.

The digital and social media landscape is dominated by two behemoths: Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics. This has become acutely obvious in the current COVID-19 pandemic, where online misinformation poses not only a real and present danger to our democracy but also to our lives. Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.

Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society. If this is done well, in the ways we spell out in this Report, technology can become a servant of democracy rather than its enemy. There is a need for Government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents.

The Government’s Online Harms programme represents a significant first step towards this goal. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.

Well-drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their services, or where it is posted by users with a large audience, they should face sanctions over their output, as other broadcasters do.

Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.

Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind ‘black box’ algorithms that choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits to show how, in creating these algorithms, they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire from platforms the information they need to exercise those powers….(More)”.