Business Data Sharing through Data Marketplaces: A Systematic Literature Review


Paper by Antragama E. Abbas, Wirawan Agahari, Montijn van de Ven, Anneke Zuiderwijk, and Mark de Reuver: “Data marketplaces are expected to play a crucial role in tomorrow’s data economy, but such marketplaces are seldom commercially viable. Currently, there is no clear understanding of the knowledge gaps in data marketplace research, especially not of neglected research topics that may advance such marketplaces toward commercialization. This study provides an overview of the state of the art of data marketplace research. We employ a Systematic Literature Review (SLR) approach to examine 133 academic articles and structure our analysis using the Service-Technology-Organization-Finance (STOF) model. We find that the extant data marketplace literature is primarily dominated by technical research, such as discussions about computational pricing and architecture. To move past the first stage of the platform’s lifecycle (i.e., platform design) to the second stage (i.e., platform adoption), we call for empirical research in non-technological areas, such as customer expected value and market segmentation…(More)”.

‘Anyway, the dashboard is dead’: On trying to build urban informatics


Paper by Jathan Sadowski: “How do the idealised promises and purposes of urban informatics compare to the material politics and practices of their implementation? To answer this question, I ethnographically trace the development of two data dashboards by strategic planners in an Australian city over the course of 2 years. By studying this techno-political process from its origins onward, I uncovered an interesting story of obdurate institutions, bureaucratic momentum, unexpected troubles, and, ultimately, frustration and failure. These kinds of stories, which often go untold in the annals of innovation, contrast starkly with more common framings of technological triumph and transformation. They also, I argue, reveal much more about how techno-political systems are actualised in the world…(More)”.

Data protection in the context of covid-19. A short (hi)story of tracing applications


Book edited by Elise Poillot, Gabriele Lenzini, Giorgio Resta, and Vincenzo Zeno-Zencovich: “The volume presents the results of a research project (named “Legafight”) funded by the Luxembourg Fond National de la Recherche in order to verify if and how digital tracing applications could be implemented in the Grand-Duchy in order to counter and abate the Covid-19 pandemic. This inevitably led to a deep comparative overview of the various existing models, starting from that of the European Union and those put into practice by Belgium, France, Germany, and Italy, with attention also to some Anglo-Saxon approaches (the UK and Australia). Not surprisingly, the main issue which had to be tackled was that of the protection of the personal data collected through the tracing applications, their use by public health authorities and the trust placed in tracing procedures by citizens. Over the last 18 months tracing apps have registered a rise, a fall, and a sudden rebirth as media devoted not so much to collecting data, but rather to distributing real-time information which should allow informed decisions and be used as repositories of health certifications…(More)”.

Helpline data used to monitor population distress in a pandemic


Alexander Tsai in Nature: “An important challenge in addressing mental-health problems is that trends can be difficult to detect because detection relies heavily on self-disclosure. As such, helplines — telephone services that provide crisis intervention to callers seeking help — might serve as a particularly useful source of anonymized data regarding the mental health of a population. This profiling could be especially useful during the COVID-19 pandemic, given the potential emergence or exacerbation of mental-health problems. Together, the threat of disease to oneself and others that is associated with a local epidemic, the restrictiveness of local non-pharmaceutical interventions (such as stay-at-home orders) and the potential associated loss of income could have contributed to a decline in the mental health of a population while at the same time inhibiting or delaying people’s search for help for problems. Writing in Nature, Brülhart et al. present evidence suggesting that helpline-call data can be used to monitor real-time changes in the mental health of a population — including over the course of the COVID-19 pandemic.

More so than in other areas of medicine, the stigma that can be associated with mental illness often prevents people from fully disclosing their experiences and feelings to those in their social networks, or even to licensed mental-health-care professionals. Furthermore, although mental illness contributes immensely to the global disease burden, primary health-care providers are overburdened, mental-health systems are underfunded and access to evidence-based treatment remains poor. For these reasons, helplines have, since their introduction in the United Kingdom by Samaritans in 1953, played a key part in providing low- or no-cost, anonymous support to people with unmet acute and chronic mental-health needs around the world.

Brülhart and colleagues updated and expanded on their previous work looking at helpline calls in one country by assembling data on more than 7 million helpline calls in 19 countries over the course of 2019, 2020 and part of 2021. They found that, within 6 weeks of the start of a country’s initial outbreak (defined as the week in which the cumulative number of reported SARS-CoV-2 infections was higher than 1 in 100,000 inhabitants), call volumes to helplines peaked at 35% higher than pre-pandemic levels (Fig. 1). By examining the changes in the proportion of calls relating to different categories, Brülhart and co-workers attribute these increases to fear, loneliness and concerns about health. The authors also found that suicide-related calls increased in the wake of more-stringent, non-pharmaceutical interventions, but that such calls decreased when income-support policies were introduced. The latter finding is perhaps unsurprising, but is a welcome addition to the evidence base that supports ongoing appeals for financial and other support to mitigate the adverse effects of non-pharmaceutical interventions on uncertainties over employment, income and housing security…(More)”.
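The outbreak-start definition the authors use (the week in which cumulative reported infections first exceed 1 per 100,000 inhabitants) is easy to make concrete. The sketch below is illustrative only, not the authors' code, and the weekly case counts are hypothetical:

```python
# Illustrative sketch (not Brülhart et al.'s code): find the week in which a
# country's cumulative reported infections first exceed 1 per 100,000
# inhabitants -- the outbreak-start definition described above.

def outbreak_start_week(weekly_cases, population):
    """Return the 0-based index of the first week where cumulative cases
    exceed 1 per 100,000 inhabitants, or None if the threshold is never met."""
    cumulative = 0
    for week, cases in enumerate(weekly_cases):
        cumulative += cases
        if cumulative / population > 1 / 100_000:
            return week
    return None

# Hypothetical weekly case counts for a country of 5 million people;
# the threshold here is a cumulative count above 50.
weekly_cases = [2, 5, 14, 38, 90, 240]
print(outbreak_start_week(weekly_cases, 5_000_000))  # → 3 (cumulative 59 > 50)
```

The same cumulative series, aligned with weekly helpline call volumes, is what lets the authors measure the post-outbreak peak relative to pre-pandemic levels.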

22 Questions to Assess Responsible Data for Children (RD4C)


An Audit Tool by The GovLab and UNICEF: “Around the world and across domains, institutions are using data to improve service delivery for children. Data for and about children can, however, pose risks of misuse, such as unauthorized access or data breaches, as well as missed use of data that could have improved children’s lives if harnessed effectively. 

The RD4C Principles — Participatory; Professionally Accountable; People-Centric; Prevention of Harms Across the Data Life Cycle; Proportional; Protective of Children’s Rights; and Purpose-Driven — were developed by The GovLab and UNICEF to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence. They are intended to act as a north star, guiding practitioners toward more responsible data practices.

Today, The GovLab and UNICEF, as part of the Responsible Data for Children initiative (RD4C), are pleased to launch a new tool that aims to put the principles into practice. 22 Questions to Assess Responsible Data for Children (RD4C) is an audit tool to help stakeholders involved in the administration of data systems that handle data for and about children align their practices with the RD4C Principles. 

The tool encourages users to reflect on their data handling practices and strategy by posing questions regarding: 

  • Why: the purpose and rationale for the data system;
  • What: the data handled through the system; 
  • Who: the stakeholders involved in the system’s use, including data subjects;
  • How: the presence of operations, policies, and procedures; and 
  • When and where: temporal and place-based considerations…(More)”.

Not all data are created equal – Data sharing and privacy


Paper by Michiel Bijlsma, Carin van der Cruijsen and Nicole Jonker: “The COVID-19 pandemic has increased our online presence and unleashed a new discussion on sharing sensitive personal data. Upcoming European legislation will facilitate data sharing in several areas, following the lead of the revised payments directive (PSD2), which enables payments data sharing with third parties. However, little is known about what drives consumers’ preferences with different types of data, as preferences may differ according to the type of data, type of usage or type of firm using the data.

Using a discrete-choice survey approach among a representative group of Dutch consumers, we find that, in addition to health data, people are hesitant to share their financial data on payments, wealth and pensions, compared with other types of consumer data. Second, consumers are especially cautious about sharing their data when the data are not used anonymously. Third, consumers are more hesitant to share their data with BigTechs, webshops and insurers than they are with banks. Fourth, a financial reward can trigger data sharing by consumers. Last, we show that attitudes towards data usage depend on personal characteristics, consumers’ digital skills, online behaviour and their trust in the firms using the data…(More)”.

Conceptual and normative approaches to AI governance for a global digital ecosystem supportive of the UN Sustainable Development Goals (SDGs)


Paper by Amandeep S. Gill & Stefan Germann: “AI governance is like one of those mythical creatures that everyone speaks of but which no one has seen. Sometimes, it is reduced to a list of shared principles such as transparency, non-discrimination, and sustainability; at other times, it is conflated with specific mechanisms for certification of algorithmic solutions or ways to protect the privacy of personal data. We suggest a conceptual and normative approach to AI governance in the context of a global digital public goods ecosystem to enable progress on the UN Sustainable Development Goals (SDGs). Conceptually, we propose rooting this approach in the human capability concept—what people are able to do and to be, and in a layered governance framework connecting the local to the global. Normatively, we suggest the following six irreducibles: a. human rights first; b. multi-stakeholder smart regulation; c. privacy and protection of personal data; d. a holistic approach to data use captured by the 3Ms—misuse of data, missed use of data and missing data; e. global collaboration (‘digital cooperation’); f. basing governance more in practice, in particular, thinking separately and together about data and algorithms. Throughout the article, we use examples from the health domain particularly in the current context of the Covid-19 pandemic. We conclude by arguing that taking a distributed but coordinated global digital commons approach to the governance of AI is the best guarantee of citizen-centered and societally beneficial use of digital technologies for the SDGs…(More)”.

How the Data Revolution Will Help the World Fight Climate Change


Article by Robert Muggah and Carlo Ratti: “…The rapidly increasing volume and variety of Big Data collected in cities—whose potential has barely been tapped—can help solve the pressing need for actionable insight. For one, it can be used to track the climate crisis as it happens. Collected in real-time and in high resolution, data can serve as an interface between aspirational goals and daily implementation. Take the case of mobility, a key contributor to carbon, nitrogen, and particulate emissions. A wealth of data from fixed sensors, outdoor video footage, navigation devices, and mobile phones could be processed in real time to classify all modes of city transportation. This can be used to generate granular knowledge of which vehicles—from gas-guzzling SUVs to electric bikes—are contributing to traffic and emissions in a given hour, day, week, or month. This kind of just-in-time analytics can inform agile policy adjustments: Data showing too many miles driven by used diesel vehicles might indicate the need for more targeted car buyback programs while better data about bike use can bolster arguments for dedicated lanes and priority at stoplights.
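The mode-level attribution described above (linking counts of trips by vehicle type to their share of emissions in a given hour) can be sketched in a few lines. The emission factors and trip data below are made-up placeholders for illustration, not measured values:

```python
# Illustrative sketch: aggregate hypothetical sensor observations of trips by
# transport mode into an hourly CO2 estimate, showing which modes dominate.
# Emission factors are made-up placeholders, not measured values.

EMISSION_G_CO2_PER_KM = {          # hypothetical grams CO2 per vehicle-km
    "diesel_suv": 220,
    "petrol_car": 150,
    "electric_bike": 0,
}

def hourly_emissions(trips):
    """trips: list of (mode, km) tuples observed in one hour.
    Returns (total grams CO2, per-mode breakdown)."""
    by_mode = {}
    for mode, km in trips:
        by_mode[mode] = by_mode.get(mode, 0) + km * EMISSION_G_CO2_PER_KM[mode]
    return sum(by_mode.values()), by_mode

total, breakdown = hourly_emissions(
    [("diesel_suv", 10), ("petrol_car", 5), ("electric_bike", 8)]
)
print(total, breakdown)  # → 2950 total grams; SUVs contribute 2200 of them
```

A per-hour breakdown like this is the kind of granular evidence that could support the policy adjustments the article mentions, such as targeted buyback programs or dedicated bike lanes.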

Data-driven analytics are already improving energy use efficiency in buildings, where heating, cooling, and electricity use are among the chief culprits of greenhouse gas emissions. It is now possible to track spatial and temporal electricity consumption patterns inside commercial and residential properties with smart meters. City authorities can use them to monitor which buildings are using the most power and when. This kind of data can then be used to set incentives to reduce consumption and optimize energy distribution over a 24-hour period. Utilities can charge higher prices during peak usage hours that put the most carbon-intensive strain on the grid. Although peak pricing strategies have existed for decades, data abundance and advanced computing could now help utilities make use of their full potential. Likewise, thermal cameras in streets can identify buildings with energy leaks, especially during colder periods. Tenants can use this data to replace windows or add insulation, substantially reducing their utility bills while also contributing to local climate action.
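The peak-pricing mechanism described above reduces to simple arithmetic over smart-meter readings. A minimal sketch, with hypothetical tariffs and an assumed evening peak window (no real utility's rates are used):

```python
# Illustrative time-of-use billing: charge a higher rate during peak hours
# to discourage consumption when the grid is most carbon-intensive.
# All tariffs, hours, and readings below are hypothetical.

PEAK_HOURS = range(17, 21)   # assumed evening peak, 17:00-20:59
PEAK_RATE = 0.40             # price per kWh during peak hours
OFF_PEAK_RATE = 0.15         # price per kWh otherwise

def daily_bill(hourly_kwh):
    """Cost of one day's consumption given 24 hourly kWh readings."""
    assert len(hourly_kwh) == 24
    return sum(
        kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
        for hour, kwh in enumerate(hourly_kwh)
    )

# A flat 1 kWh/hour profile: 4 peak hours at 0.40 plus 20 off-peak at 0.15.
flat_day = [1.0] * 24
print(round(daily_bill(flat_day), 2))  # → 4.6
```

Shifting even a fraction of peak-hour load to off-peak hours lowers the bill under this tariff, which is the incentive effect the article describes.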

The data revolution is being harnessed by some cities to hasten the energy transition. A good example of this is the Helsinki Hot Heart proposal that recently won a city-wide energy challenge (and which one of our firms—Carlo Ratti Associati—is involved in). Helsinki currently relies on a district heating system powered by coal power plants that are expected to be phased out by 2030. A key question is whether it is possible to power the city using intermittent renewable energy sources. The project proposes giant water basins, floating off the shore in the Baltic Sea, that act as insulated thermal batteries, accumulating heat during peak renewable production and releasing it through the district heating system. This is only possible through a fine-tuned collection of sensors, algorithms, and actuators. Relying on the flow of water and bytes, Helsinki Hot Heart would offer a path to digital-physical systems that could take cities like Helsinki to a sustainable, data-driven future…(More)”.

The “9Rs Framework”: Establishing the Business Case for Data Collaboration and Re-Using Data in the Public Interest


Article by Stefaan G. Verhulst, Andrew Young, and Andrew J. Zahuranec: “When made accessible and re-used responsibly, privately held data has the potential to generate enormous public value. Whether it’s enabling better science, supporting evidence-based government programs, or helping community groups to identify people who need help, data can be used to make better public interest decisions and improve people’s lives.

Yet, for all the discussion of the societal value of having organizations provide access to their data, there’s been little discussion of the business case on why to make data available for reuse. What motivates an organization to make its datasets accessible for societal purposes? How does doing so support their organizational goals? What’s the return on investment of using organizational resources to make data available to others?

GRAPHIC: The 9Rs Framework: The Business Case for Data Reuse in the Public Interest

The Open Data Policy Lab addresses these questions with its “9Rs Framework,” a method for describing and identifying the business case for reusing data in the public interest. The 9Rs Framework consists of nine motivations identified through several years of studying and establishing data collaboratives, categorized by the type of return on investment they represent: license to operate, brand equity, or knowledge and insights. Considered together, these nine motivations add up to a model to help organizations understand the business value of making their data assets accessible…(More)”.

Towards Efficient Information Sharing in Network Markets


Paper by Bertin Martens, Geoffrey Parker, Georgios Petropoulos and Marshall W. Van Alstyne: “Digital platforms facilitate interactions between consumers and merchants that allow the collection of profiling information which drives innovation and welfare. Private incentives, however, lead to information asymmetries resulting in market failures both on-platform, among merchants, and off-platform, among competing platforms. This paper develops two product differentiation models to study private and social incentives to share information within and between platforms. We show that there is scope for ex-ante regulation of mandatory data sharing that improves social welfare better than competing interventions such as barring entry, break-up, forced divestiture, or limiting recommendation steering. These alternate proposals do not make efficient use of information. We argue that the location of data access matters and develop a regulatory framework that introduces a new data right for platform users, the in-situ data right, which is associated with positive welfare gains. By construction, this right enables effective information sharing, together with its context, without reducing the value created by network effects. It also enables regulatory oversight but limits data privacy leakages. We discuss crucial elements of its implementation in order to achieve innovation-friendly and competitive digital markets…(More)”.