The Normative Order of the Internet: A Theory of Rule and Regulation Online


Open access book by Matthias C. Kettemann: “There is order on the internet, but how has this order emerged and what challenges will threaten and shape its future? This study shows how a legitimate order of norms has emerged online, through both national and international legal systems. It establishes the emergence of a normative order of the internet, an order which explains and justifies processes of online rule and regulation. This order integrates norms at three different levels (regional, national, international), of two types (privately and publicly authored), and of different character (from ius cogens to technical standards).

Matthias C. Kettemann assesses their internal coherence, their consonance with other order norms and their consistency with the order’s finality. The normative order of the internet is based on and produces a liquefied system characterized by self-learning normativity. In light of the importance of the socio-communicative online space, this is a book for anyone interested in understanding the contemporary development of the internet….(More)”.

Data Justice and COVID-19: Global Perspectives


Book edited by Linnet Taylor, Aaron Martin, Gargi Sharma and Shazade Jameson: “In early 2020, as the COVID-19 pandemic swept the world and states of emergency were declared by one country after another, the global technology sector—already equipped with unprecedented wealth, power, and influence—mobilised to seize the opportunity. This collection is an account of what happened next and captures the emergent conflicts and responses around the world. The essays provide a global perspective on the implications of these developments for justice: they make it possible to compare how the intersection of state and corporate power—and the way that power is targeted and exercised—confronts, and invites resistance from, civil society in countries worldwide.

This edited volume captures the technological response to the pandemic in 33 countries, accompanied by nine thematic reflections, and reflects the unfolding of the first wave of the pandemic.

This book can be read as a guide to the landscape of technologies deployed during the pandemic and can also be used to compare individual country strategies. It will prove useful as a tool for teaching and learning in various disciplines and as a reference point for activists and analysts interested in issues of data justice.

The essays interrogate these technologies and the political, legal, and regulatory structures that determine how they are applied. In doing so, the book exposes the workings of state technological power to critical assessment and contestation….(More)”.

Governance responses to disinformation: How open government principles can inform policy options


OECD paper by Craig Matasick, Carlotta Alfonsi and Alessandro Bellantoni: “This paper provides a holistic policy approach to the challenge of disinformation by exploring a range of governance responses that rest on the open government principles of transparency, integrity, accountability and stakeholder participation. It offers an analysis of the significant changes that are affecting media and information ecosystems, chief among them the growth of digital platforms. Drawing on the implications of this changing landscape, the paper focuses on four policy areas of intervention: public communication for a better dialogue between government and citizens; direct responses to identify and combat disinformation; legal and regulatory policy; and media and civic responses that support better information ecosystems. The paper concludes with proposed steps the OECD can take to build evidence and support policy in this space…(More)”.

Why Personal Data Is a National Security Issue


Article by Susan Ariel Aaronson: “…Concerns about the national security threat from personal data held by foreigners first emerged in 2013. Several U.S. entities, including Target, J.P. Morgan, and the U.S. Office of Personnel Management, were hacked. Many attributed the hacking to Chinese entities. Administration officials concluded that the Chinese government could cross-reference legally obtained and hacked data sets to reveal information about U.S. objectives and strategy.

Personal data troves can also be cross-referenced to identify individuals, putting both personal security as well as national security at risk. Even U.S. firms pose a direct and indirect security threat to individuals and the nation because of their failure to adequately protect personal data. For example, Facebook has a disturbing history of sharing personal data without consent and allowing its clients to use that data to manipulate users. Some app designers have enabled functionality unnecessary for their software’s operation, while others, like Anomaly 6, embedded their software in mobile apps without the permission of users or firms. Other companies use personal data without user permission to create new products. Clearview AI scraped billions of images from major web services such as Facebook, Google, and YouTube, and sold these images to law enforcement agencies around the world. 

Firms can also inadvertently aggregate personal data and in so doing threaten national security. Strava, an athletes’ social network, released a heat map of its global users’ activities in 2018. Savvy analysts were able to use the heat map to reveal secret military bases and patrol routes. Chinese-owned data firms could be a threat to national security if they share data with the Chinese government. But the problem lies in the U.S.’s failure to adequately protect personal data and police the misuse of data collected without the permission of users….(More)”.

The EU is launching a market for personal data. Here’s what that means for privacy.


Anna Artyushina at MIT Tech Review: “The European Union has long been a trendsetter in privacy regulation. Its General Data Protection Regulation (GDPR) and stringent antitrust laws have inspired new legislation around the world. For decades, the EU has codified protections on personal data and fought against what it viewed as commercial exploitation of private information, proudly positioning its regulations in contrast to the light-touch privacy policies in the United States.

The new European data governance strategy (pdf) takes a fundamentally different approach. With it, the EU will become an active player in facilitating the use and monetization of its citizens’ personal data. Unveiled by the European Commission in February 2020, the strategy outlines policy measures and investments to be rolled out in the next five years.

This new strategy represents a radical shift in the EU’s focus, from protecting individual privacy to promoting data sharing as a civic duty. Specifically, it will create a pan-European market for personal data through a mechanism called a data trust. A data trust is a steward that manages people’s data on their behalf and has fiduciary duties toward its clients.

The EU’s new plan considers personal data to be a key asset for Europe. However, this approach raises some questions. First, the EU’s intent to profit from the personal data it collects puts European governments in a weak position to regulate the industry. Second, the improper use of data trusts can actually deprive citizens of their rights to their own data.

The Trusts Project, the first initiative put forth by the new EU policies, will be implemented by 2022. With a €7 million budget, it will set up a pan-European pool of personal and nonpersonal information that should become a one-stop shop for businesses and governments looking to access citizens’ information.

Global technology companies will not be allowed to store or move Europeans’ data. Instead, they will be required to access it via the trusts. Citizens will collect “data dividends,” which haven’t been clearly defined but could include monetary or nonmonetary payments from companies that use their personal data. With the EU’s roughly 500 million citizens poised to become data sources, the trusts will create the world’s largest data market.

For citizens, this means the data created by them and about them will be held in public servers and managed by data trusts. The European Commission envisions the trusts as a way to help European businesses and governments reuse and extract value from the massive amounts of data produced across the region, and to help European citizens benefit from their information. The project documentation, however, does not specify how individuals will be compensated.

Data trusts were first proposed by internet pioneer Sir Tim Berners-Lee in 2018, and the concept has drawn considerable interest since then. Just like the trusts used to manage one’s property, data trusts may serve different purposes: they can be for-profit enterprises, or they can be set up for data storage and protection, or to work for a charitable cause.

IBM and Mastercard have built a data trust to manage the financial information of their European clients in Ireland; the UK and Canada have employed data trusts to stimulate the growth of the AI industries there; and recently, India announced plans to establish its own public data trust to spur the growth of technology companies.

The new EU project is modeled on Austria’s digital system, which keeps track of information produced by and about its citizens by assigning them unique identifiers and storing the data in public repositories.

Unfortunately, data trusts do not guarantee more transparency. The trust is governed by a charter created by the trust’s settlor, and its rules can be made to prioritize someone’s interests. The trust is run by a board of directors, which means a party that has more seats gains significant control.

The Trusts Project is bound to face some governance issues of its own. Public and private actors often do not see eye to eye when it comes to running critical infrastructure or managing valuable assets. Technology companies tend to favor policies that create opportunity for their own products and services. Caught in a conflict of interest, Europe may overlook the question of privacy….(More)”.

From Desert Battlefields To Coral Reefs, Private Satellites Revolutionize The View


NPR Story: “As the U.S. military and its allies attacked the last Islamic State holdouts last year, it wasn’t clear how many civilians were still in the besieged desert town of Baghouz, Syria.

So Human Rights Watch asked a private satellite company, Planet, for its regular daily photos and also made a special request for video.

“That live video actually was instrumental in convincing us that there were thousands of civilians trapped in this pocket,” said Josh Lyons of Human Rights Watch. “Therefore, the coalition forces absolutely had an obligation to stop and to avoid bombardment of that pocket at that time.”

Which they did until the civilians fled.

Lyons, who’s based in Geneva, has a job title you wouldn’t expect at a human rights group: director of geospatial analysis. He says satellite imagery is increasingly a crucial component of human rights investigations, bolstering traditional eyewitness accounts, especially in areas where it’s too dangerous to send researchers.

“Then we have this magical sort of fusion of data between open-source, eyewitness testimony and data from space. And that becomes essentially a new gold standard for investigations,” he said.

‘A string of pearls’

Satellite photos used to be restricted to the U.S. government and a handful of other nations. Now such imagery is available to everyone, creating a new world of possibilities for human rights groups, environmentalists and researchers who monitor nuclear programs.

They get those images from a handful of private, commercial satellite companies, like Planet and Maxar….(More)”.

Building and maintaining trust in research


Daniel Nunan at the International Journal of Market Research: “One of the many indirect consequences of the COVID pandemic for the research sector may be the impact upon consumers’ willingness to share data. This is reflected in concerns that government mandated “apps” designed to facilitate COVID testing and tracking schemes will undermine trust in the commercial collection of personal data (WARC, 2020). For example, uncertainty over the consequences of handing over data and the ways in which it might be used could reduce consumers’ willingness to share data with organizations, and reverse a trend that has seen growing levels of consumer confidence in Data Protection Regulations (Data & Direct Marketing Association [DMA], 2020). This highlights how central the role of trust has become in contemporary research practice, and how fragile the process of building trust can be due to the ever competing demands of public and private data collectors.

For researchers, there are two sides to trust. One relates to building sufficient trust with research participants to facilitate data collection, and the second is building trust with the users of research. Trust has long been understood as a key factor in effective research relationships, with trust between researchers and users of research the key factor in determining the extent to which research is actually used (Moorman et al., 1993). In other words, a trusted messenger is just as important as the contents of the message. In recent years, there has been growing concern over declining trust in research from research participants and the general public, manifested in declining response rates and challenges in gaining participation. Understanding how to build consumer trust is more important than ever, as the shift of communication and commercial activity to digital platforms alters the mechanisms through which trust is built. Trust is therefore essential both for ensuring that accurate data can be collected, and for ensuring that insights from research have the necessary legitimacy to be acted upon. The two research notes in this issue provide an insight into new areas where the issue of trust needs to be considered within research practice….(More)”.

Journalists’ guide to COVID data


Guide by RTDNA: “Watch a press conference, turn on a newscast, or overhear just about any phone conversation these days and you’ll hear mayors discussing R values, reporters announcing new fatalities and separated families comparing COVID case rolling averages in their counties. As coronavirus resurges across the country, medical data is no longer just the purview of epidemiologists (though a quick glance at any social media comments section shows an unlikely simultaneous surge in the number of virology experts and statisticians).

Journalists reporting on COVID, however, have a particular obligation to understand the data, to add context and to acknowledge uncertainty when reporting the numbers.

“Journalism requires more than merely reporting remarks, claims or comments. Journalism verifies, provides relevant context, tells the rest of the story and acknowledges the absence of important additional information.” – RTDNA Code of Ethics

This guide to common COVID metrics is designed to help journalists know how each data point is calculated, what it means and, importantly, what it doesn’t mean….(More)”.
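One of the metrics the guide covers, the rolling average, is simple enough to illustrate directly. The sketch below is illustrative only — it is not taken from the RTDNA guide, and the daily counts are hypothetical — but it shows the trailing 7-day average typically used to smooth out day-of-week reporting artifacts before comparing trends:

```python
# Illustrative only: a trailing 7-day rolling average of new COVID-19 cases.
# The daily counts below are hypothetical, not real reporting data.

daily_new_cases = [120, 95, 130, 160, 80, 60, 140, 155, 150, 170]

def rolling_average(values, window=7):
    """Trailing moving average; early days use whatever data is available so far."""
    averages = []
    for i in range(len(values)):
        recent = values[max(0, i - window + 1): i + 1]
        averages.append(sum(recent) / len(recent))
    return averages

for day, (raw, avg) in enumerate(zip(daily_new_cases, rolling_average(daily_new_cases)), start=1):
    print(f"Day {day}: {raw} new cases, 7-day average {avg:.1f}")
```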

Strengthening Privacy Protections in COVID-19 Mobile Phone–Enhanced Surveillance Programs


Rand Report: “Dozens of countries, including the United States, have been using mobile phone tools and data sources for COVID-19 surveillance activities, such as tracking infections and community spread, identifying populated areas at risk, and enforcing quarantine orders. These tools can augment traditional epidemiological interventions, such as contact tracing with technology-based data collection (e.g., automated signaling and record-keeping on mobile phone apps). As the response progresses, other beneficial technologies could include tools that authenticate those with low risk of contagion or that build community trust as stay-at-home orders are lifted.

However, the potential benefits that COVID-19 mobile phone–enhanced public health (“mobile”) surveillance program tools could provide are also accompanied by potential for harm. There are significant risks to citizens from the collection of sensitive data, including personal health, location, and contact data. People whose personal information is being collected might worry about who will receive the data, how those recipients might use the data, how the data might be shared with other entities, and what measures will be taken to safeguard the data from theft or abuse.

The risk of privacy violations can also impact government accountability and public trust. The possibility that one’s privacy will be violated by government officials or technology companies might dissuade citizens from getting tested for COVID-19, downloading public health–oriented mobile phone apps, or sharing symptom or location data. More broadly, real or perceived privacy violations might discourage citizens from believing government messaging or complying with government orders regarding COVID-19.

As U.S. public health agencies consider COVID-19-related mobile surveillance programs, they will need to address privacy concerns to encourage broad uptake and protect against privacy harms. Otherwise, COVID-19 mobile surveillance programs likely will be ineffective and the data collected unrepresentative of the situation on the ground….(More)”.
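The report’s concerns map onto a concrete design choice in contact-tracing apps: whether contact data is collected centrally or kept on the device. As a hedged sketch of the decentralized approach — not drawn from the RAND report, with hypothetical names and parameters — a phone can broadcast short-lived random identifiers and keep a local log of identifiers it observes, so nothing identifying or location-based leaves the device unless the user later chooses to share it:

```python
# Hypothetical sketch of privacy-preserving proximity logging (not from the RAND
# report): broadcast rotating random identifiers and store observed identifiers
# only on the device, purging anything older than the exposure window.
import os
import time

ROTATION_SECONDS = 15 * 60   # rotate the broadcast identifier every 15 minutes
RETENTION_DAYS = 14          # keep contacts only as long as they are epidemiologically relevant

def new_ephemeral_id() -> str:
    """A random, unlinkable identifier to broadcast (e.g., over Bluetooth)."""
    return os.urandom(16).hex()

class ProximityLog:
    def __init__(self):
        self.current_id = new_ephemeral_id()
        self.last_rotation = time.time()
        self.observed = []  # (timestamp, identifier) pairs, stored only on-device

    def maybe_rotate(self):
        # Frequent rotation prevents third parties from linking broadcasts over time.
        if time.time() - self.last_rotation >= ROTATION_SECONDS:
            self.current_id = new_ephemeral_id()
            self.last_rotation = time.time()

    def record_contact(self, observed_id: str):
        self.maybe_rotate()
        self.observed.append((time.time(), observed_id))

    def purge_old_entries(self):
        cutoff = time.time() - RETENTION_DAYS * 86400
        self.observed = [(t, i) for (t, i) in self.observed if t >= cutoff]
```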

Policies and Strategies to Promote Grassroots Innovation Workbook


UN-ESCAP: “Grassroots innovation is a modality of inclusive innovation that enables extremely affordable, niche-adapted solutions to local problems, often unaided by public sector or outsiders.

In a context of rising income disparity between the haves and the have-nots, every effort should be made to convert the ideas and innovations of knowledge-rich but economically poor individuals and communities into viable means of raising income, addressing social needs, and conserving the environment. While grassroots innovations are typically bottom-up initiatives, public policies can also support the emergence, recognition and diffusion of grassroots innovations. The journey of developing a grassroots idea or invention into a viable product or service for commercial or social diffusion requires support from many actors at different stages and levels.

The Honey Bee Network has been leading the grassroots innovation movement in India. In the past three decades, it has strengthened the inclusive innovation ecosystem of the country and has become a global benchmark of frugal, friendly and flexible solutions for men and women farmers, pastoral and artisan households, mechanics, forest dwellers, fishermen etc. This workbook draws on the experience of the Honey Bee Network and discusses experiences, issues and strategies that could also be relevant for other countries….(More)”.