Improving Consumer Welfare with Data Portability


Report by Daniel Castro: “Data protection laws and regulations can contain restrictive provisions, which limit data sharing and use, as well as permissive provisions, which increase it. Data portability is an example of a permissive provision that allows consumers to obtain a digital copy of their personal information from an online service and provide this information to other services. By carefully crafting data portability provisions, policymakers can enable consumers to obtain more value from their data, create new opportunities for businesses to innovate with data, and foster competition….(More)”.

Senators unveil bipartisan bill requiring social media giants to open data to researchers


Article by Rebecca Klar: “Meta and other social media companies would be required to share their data with outside researchers under a new bill announced by a bipartisan group of senators on Thursday. 

Sens. Chris Coons (D-Del.), Amy Klobuchar (D-Minn.) and Rob Portman (R-Ohio) underscored the need for their bill based on information leaked about Meta’s platforms in the so-called Facebook Papers, though the proposal would also apply to other social media companies.

The bill, the Platform Accountability and Transparency Act, would allow independent researchers to submit proposals to the National Science Foundation. If the requests are approved, social media companies would be required to provide the necessary data subject to certain privacy protections. 

“It’s increasingly clear that more transparency is needed so that the billions of people who use Facebook, Twitter, and similar platforms can fully understand the impact of those tradeoffs. This bipartisan proposal is an important step that will bring much needed information about the impact of social media companies to light and ought to be a crucial part of any comprehensive strategy that Congress can take to regulate major social media companies,” Coons said in a statement. 

If companies failed to comply with the requirement under the bill, they would be subject to enforcement from the Federal Trade Commission (FTC) and face losing immunity under Section 230 of the Communications Decency Act. Section 230 is a controversial provision that provides immunity for internet companies based on content posted by third parties, and lawmakers on both sides of the aisle have proposed measures to weaken its reach….(More)”.

Data Science and Official Statistics: Toward a New Data Culture


Essay by Stefan Schweinfest and Ronald Jansen: “In the digital age, data are generated continuously by many different devices and are being used by many different actors. National statistical offices (NSOs) should benefit from these opportunities to improve data for decision-making. What could be the expanding role for official statistics in this context and how does this relate to emerging disciplines like data science? This article explores some new ideas. In the avalanche of new data, society may need a data steward, and the NSO could take on that role, while paying close attention to the protection of privacy. Data science will become increasingly important for extracting meaningful information from large amounts of data. NSOs will need to hire data scientists and data engineers and will need to train their staff in these fast-developing fields. NSOs will also need to clearly communicate new and experimental data and foster a good understanding of statistics. Collaboration of official statistics with the private sector, academia, and civil society will be the new way of working and the fundamental principles of official statistics may have to apply to all those actors. This article envisions that we are gradually working toward such a new data culture…(More)”.

Business Data Sharing through Data Marketplaces: A Systematic Literature Review


Paper by Abbas, Antragama E., Wirawan Agahari, Montijn van de Ven, Anneke Zuiderwijk, and Mark de Reuver: “Data marketplaces are expected to play a crucial role in tomorrow’s data economy, but such marketplaces are seldom commercially viable. Currently, there is no clear understanding of the knowledge gaps in data marketplace research, especially not of neglected research topics that may advance such marketplaces toward commercialization. This study provides an overview of the state-of-the-art of data marketplace research. We employ a Systematic Literature Review (SLR) approach to examine 133 academic articles and structure our analysis using the Service-Technology-Organization-Finance (STOF) model. We find that the extant data marketplace literature is primarily dominated by technical research, such as discussions about computational pricing and architecture. To move past the first stage of the platform’s lifecycle (i.e., platform design) to the second stage (i.e., platform adoption), we call for empirical research in non-technological areas, such as customer expected value and market segmentation….(More)”.

‘Anyway, the dashboard is dead’: On trying to build urban informatics


Paper by Jathan Sadowski: “How do the idealised promises and purposes of urban informatics compare to the material politics and practices of their implementation? To answer this question, I ethnographically trace the development of two data dashboards by strategic planners in an Australian city over the course of 2 years. By studying this techno-political process from its origins onward, I uncovered an interesting story of obdurate institutions, bureaucratic momentum, unexpected troubles, and, ultimately, frustration and failure. These kinds of stories, which often go untold in the annals of innovation, contrast starkly with more common framings of technological triumph and transformation. They also, I argue, reveal much more about how techno-political systems are actualised in the world…(More)”.

Data protection in the context of covid-19. A short (hi)story of tracing applications


Book edited by Elise Poillot, Gabriele Lenzini, Giorgio Resta, and Vincenzo Zeno-Zencovich: “The volume presents the results of a research project (named “Legafight”) funded by the Luxembourg Fonds National de la Recherche to verify if and how digital tracing applications could be implemented in the Grand Duchy in order to counter and abate the Covid-19 pandemic. This inevitably led to a deep comparative overview of the various existing models, starting from that of the European Union and those put into practice by Belgium, France, Germany, and Italy, with attention also to some Anglo-Saxon approaches (the UK and Australia). Not surprisingly, the main issue that had to be tackled was the protection of the personal data collected through the tracing applications, their use by public health authorities, and the trust placed in tracing procedures by citizens. Over the last 18 months tracing apps have registered a rise, a fall, and a sudden rebirth as media devoted not so much to collecting data as to distributing real-time information that should allow informed decisions and serve as repositories of health certifications…(More)”.

Helpline data used to monitor population distress in a pandemic


Alexander Tsai in Nature: “An important challenge in addressing mental-health problems is that trends can be difficult to detect because detection relies heavily on self-disclosure. As such, helplines — telephone services that provide crisis intervention to callers seeking help — might serve as a particularly useful source of anonymized data regarding the mental health of a population. This profiling could be especially useful during the COVID-19 pandemic, given the potential emergence or exacerbation of mental-health problems. Together, the threat of disease to oneself and others that is associated with a local epidemic, the restrictiveness of local non-pharmaceutical interventions (such as stay-at-home orders) and the potential associated loss of income could have contributed to a decline in the mental health of a population while at the same time inhibiting or delaying people’s search for help for problems. Writing in Nature, Brülhart et al. present evidence suggesting that helpline-call data can be used to monitor real-time changes in the mental health of a population — including over the course of the COVID-19 pandemic.

More so than in other areas of medicine, the stigma that can be associated with mental illness often prevents people from fully disclosing their experiences and feelings to those in their social networks, or even to licensed mental-health-care professionals. Furthermore, although mental illness contributes immensely to the global disease burden, primary health-care providers are overburdened, mental-health systems are underfunded and access to evidence-based treatment remains poor. For these reasons, helplines have, since their introduction in the United Kingdom by Samaritans in 1953, played a key part in providing low- or no-cost, anonymous support to people with unmet acute and chronic mental-health needs around the world.

Brülhart and colleagues updated and expanded on their previous work looking at helpline calls in one country by assembling data on more than 7 million helpline calls in 19 countries over the course of 2019, 2020 and part of 2021. They found that, within 6 weeks of the start of a country’s initial outbreak (defined as the week in which the cumulative number of reported SARS-CoV-2 infections was higher than 1 in 100,000 inhabitants), call volumes to helplines peaked at 35% higher than pre-pandemic levels (Fig. 1). By examining the changes in the proportion of calls relating to different categories, Brülhart and co-workers attribute these increases to fear, loneliness and concerns about health. The authors also found that suicide-related calls increased in the wake of more-stringent, non-pharmaceutical interventions, but that such calls decreased when income-support policies were introduced. The latter finding is perhaps unsurprising, but is a welcome addition to the evidence base that supports ongoing appeals for financial and other support to mitigate the adverse effects of non-pharmaceutical interventions on uncertainties over employment, income and housing security…(More)”.
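The study's two headline quantities lend themselves to a simple illustration. The sketch below, which is purely illustrative and not drawn from Brülhart and colleagues' actual code, shows how one might operationalize their outbreak definition (the first week in which cumulative reported infections exceed 1 per 100,000 inhabitants) and express peak call volume as a percentage change over a pre-pandemic baseline; the function names and sample figures are assumptions for demonstration.

```python
def outbreak_start_week(weekly_cases, population, threshold_per_100k=1.0):
    """Return the index of the first week in which cumulative reported
    infections exceed `threshold_per_100k` per 100,000 inhabitants,
    or None if the threshold is never crossed."""
    threshold = threshold_per_100k * population / 100_000
    cumulative = 0
    for week, cases in enumerate(weekly_cases):
        cumulative += cases
        if cumulative > threshold:
            return week
    return None

def peak_relative_call_volume(weekly_calls, baseline):
    """Peak weekly helpline call volume as a percentage change
    relative to a pre-pandemic baseline volume."""
    return max(weekly_calls) / baseline * 100 - 100

# Hypothetical numbers: a country of 8.6 million reporting 10, 30, 80,
# then 200 weekly cases crosses the 1-per-100,000 threshold in week 2.
print(outbreak_start_week([10, 30, 80, 200], 8_600_000))   # week index 2
# A peak of 1,350 weekly calls against a 1,000-call baseline is +35%.
print(peak_relative_call_volume([1000, 1200, 1350], 1000))
```

This mirrors the paper's finding that calls peaked at 35% above pre-pandemic levels within six weeks of a country's outbreak start, though the real analysis pools over 7 million calls across 19 countries.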

22 Questions to Assess Responsible Data for Children (RD4C)


An Audit Tool by The GovLab and UNICEF: “Around the world and across domains, institutions are using data to improve service delivery for children. Data for and about children can, however, pose risks of misuse, such as unauthorized access or data breaches, as well as missed use of data that could have improved children’s lives if harnessed effectively. 

The RD4C Principles — Participatory; Professionally Accountable; People-Centric; Prevention of Harms Across the Data Life Cycle; Proportional; Protective of Children’s Rights; and Purpose-Driven — were developed by the GovLab and UNICEF to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence. These principles were developed to act as a north star, guiding practitioners toward more responsible data practices.

Today, The GovLab and UNICEF, as part of the Responsible Data for Children initiative (RD4C), are pleased to launch a new tool that aims to put the principles into practice. 22 Questions to Assess Responsible Data for Children (RD4C) is an audit tool to help stakeholders involved in the administration of data systems that handle data for and about children align their practices with the RD4C Principles. 

The tool encourages users to reflect on their data handling practices and strategy by posing questions regarding: 

  • Why: the purpose and rationale for the data system;
  • What: the data handled through the system; 
  • Who: the stakeholders involved in the system’s use, including data subjects;
  • How: the presence of operations, policies, and procedures; and 
  • When and where: temporal and place-based considerations….(More)”.

Not all data are created equal – Data sharing and privacy


Paper by Michiel Bijlsma, Carin van der Cruijsen and Nicole Jonker: “The COVID-19 pandemic has increased our online presence and unleashed a new discussion on sharing sensitive personal data. Upcoming European legislation will facilitate data sharing in several areas, following the lead of the revised payments directive (PSD2), which enables payments data sharing with third parties. However, little is known about what drives consumers’ willingness to share different types of data, as preferences may differ according to the type of data, the type of usage and the type of firm using the data.

Using a discrete-choice survey approach among a representative group of Dutch consumers, we find that, after health data, people are most hesitant to share their financial data on payments, wealth and pensions, compared to other types of consumer data. Second, consumers are especially cautious about sharing their data when the data are not used anonymously. Third, consumers are more hesitant to share their data with BigTechs, webshops and insurers than with banks. Fourth, a financial reward can trigger data sharing by consumers. Last, we show that attitudes towards data usage depend on personal characteristics, consumers’ digital skills, online behaviour and their trust in the firms using the data…(More)”

Conceptual and normative approaches to AI governance for a global digital ecosystem supportive of the UN Sustainable Development Goals (SDGs)


Paper by Amandeep S. Gill & Stefan Germann: “AI governance is like one of those mythical creatures that everyone speaks of but which no one has seen. Sometimes, it is reduced to a list of shared principles such as transparency, non-discrimination, and sustainability; at other times, it is conflated with specific mechanisms for certification of algorithmic solutions or ways to protect the privacy of personal data. We suggest a conceptual and normative approach to AI governance in the context of a global digital public goods ecosystem to enable progress on the UN Sustainable Development Goals (SDGs). Conceptually, we propose rooting this approach in the human capability concept—what people are able to do and to be, and in a layered governance framework connecting the local to the global. Normatively, we suggest the following six irreducibles: a. human rights first; b. multi-stakeholder smart regulation; c. privacy and protection of personal data; d. a holistic approach to data use captured by the 3Ms—misuse of data, missed use of data and missing data; e. global collaboration (‘digital cooperation’); f. basing governance more in practice, in particular, thinking separately and together about data and algorithms. Throughout the article, we use examples from the health domain particularly in the current context of the Covid-19 pandemic. We conclude by arguing that taking a distributed but coordinated global digital commons approach to the governance of AI is the best guarantee of citizen-centered and societally beneficial use of digital technologies for the SDGs…(More)”.