Report by Gillian Diebold: “In the United States, access to many public and private services, including those in the financial, educational, and health-care sectors, is intricately linked to data. But adequate data is not collected equitably from all Americans, creating a new challenge: the data divide, in which not everyone has enough high-quality data collected about them or their communities and therefore cannot benefit from data-driven innovation. This report provides an overview of the data divide in the United States and offers recommendations for how policymakers can address these inequalities…(More)”.
Making Government Data Publicly Available: Guidance for Agencies on Releasing Data Responsibly
Report by Hugh Grant-Chapman and Hannah Quay-de la Vallee: “Government agencies rely on a wide range of data to effectively deliver services to the populations with which they engage. Civic-minded advocates frequently argue that the public benefits of this data can be better harnessed by making it available for public access. Recent years, however, have also seen growing recognition that the public release of government data can carry certain risks. Government agencies hoping to release data publicly should consider those potential risks in deciding which data to make publicly available and how to go about releasing it.
This guidance offers an introduction to making data publicly available while addressing privacy and ethical data use issues. It is intended for administrators at government agencies that deliver services to individuals — especially those at the state and local levels — who are interested in publicly releasing government data. This guidance focuses on challenges that may arise when releasing aggregated data derived from sensitive information, particularly individual-level data.
The report begins by highlighting key benefits and risks of making government data publicly available. Benefits include empowering members of the general public, supporting research on program efficacy, supporting the work of organizations providing adjacent services, reducing agencies’ administrative burden, and holding government agencies accountable. Potential risks include breaches of individual privacy; irresponsible uses of the data by third parties; and the possibility that the data is not used at all, resulting in wasted resources.
In light of these benefits and risks, the report presents four recommended actions for publishing government data responsibly:
- Establish data governance processes and roles;
- Engage external communities;
- Ensure responsible use and privacy protection; and
- Evaluate resource constraints.
These key considerations also take into account federal and state laws as well as emerging computational and analytical techniques for protecting privacy when releasing data, such as differential privacy and synthetic data. Each of these techniques involves unique benefits and trade-offs to be considered in the context of the goals of a given data release…(More)”.
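Neither technique is specified at implementation level in the guidance. As a minimal sketch of how differential privacy works in principle, the Laplace mechanism adds calibrated noise to an aggregate statistic before release; the query, count, and epsilon below are hypothetical:

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so the noise scale is 1 / epsilon.
    """
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: number of program participants in one census tract.
print(laplace_count(true_count=128, epsilon=0.5))
```

Smaller epsilon values give stronger privacy guarantees at the cost of noisier published figures, which is the kind of trade-off the report says should be weighed against the goals of a given data release.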
U.S. Government Effort to Tap Private Weather Data Moves Along Slowly
Article by Isabelle Bousquette: “The U.S. government’s six-year-old effort to improve its weather forecasting ability by purchasing data from private-sector satellite companies has started to show results, although the process is moving more slowly than anticipated.
After a period of testing, the National Oceanic and Atmospheric Administration, a scientific, service and regulatory arm of the Commerce Department, began purchasing data from two satellite companies, Spire Global Inc. of Vienna, Va., and GeoOptics Inc. of Pasadena, Calif.
The weather data from these two companies fills gaps in coverage left by NOAA’s own satellites, the agency said. NOAA also began testing data from a third company this year.
Beyond these companies, new entrants to the field offering weather data based on a broader range of technologies have been slow to emerge, the agency said.
“We’re getting a subset of what we hoped,” said Dan St. Jean, deputy director of the Office of System Architecture and Advanced Planning at NOAA’s Satellite and Information Service.
NOAA’s weather forecasts help the government formulate hurricane evacuation plans and make other important decisions. The agency began seeking out private sources of satellite weather data in 2016. The idea was to find a more cost-effective alternative to funding NOAA’s own satellite constellations, the agency said. It also hoped to seed competition and innovation in the private satellite sector.
It isn’t yet clear whether there is a cost benefit to using private data, in part because the relatively small number of competitors in the market has made it challenging to determine a steady market price, NOAA said.
“All the signs in the nascent ‘new space’ industry indicated that there would be a plethora of venture capitalists wanting to compete for NOAA’s commercial pilot/purchase dollars. But that just never materialized,” said Mr. St. Jean…(More)”.
(Re)making data markets: an exploration of the regulatory challenges
Paper by Linnet Taylor, Hellen Mukiri-Smith, Tjaša Petročnik, Laura Savolainen & Aaron Martin: “Regulating the data market will be one of the major challenges of the twenty-first century. In order to think about regulating this market, however, we first need to make its dimensions and dynamics more accessible to observation and analysis. In this paper we explore what the state of the sociological and legal research on markets can tell us about the market for data: what kind of market it is, the practices and configurations of actors that constitute it, and what kinds of data are traded there. We start from the subjective opacity of this market to researchers interested in regulation and governance, review conflicting positions on its extent, diversity and regulability, and then explore comparisons from food and medicine regulation to understand the possible normative and practical implications and aims inherent in attempting to regulate how data is shared and traded. We conclude that there is a strong argument for a normative shift in the aims of regulation with regard to the data market, away from a prioritisation of the economic value of data and toward a more nuanced approach that aims to align the uses of data with the needs and rights of the communities reflected in it…(More)”
Forest data governance as a reflection of forest governance: Institutional change and endurance in Finland and Canada
Paper by Salla Rantala, Brent Swallow, Anu Lähteenmäki-Uutela and Riikka Paloniemi: “The rapid development of new digital technologies for natural resource management has created a need to design and update governance regimes for effective and transparent generation, sharing and use of digital natural resource data. In this paper, we contribute to this novel area of investigation from the perspective of institutional change. We develop a conceptual framework to analyze how emerging natural resource data governance is shaped by related natural resource governance: complex, multilevel systems of actors, institutions and their interplay. We apply this framework to study forest data governance and its roots in forest governance in Finland and Canada. In Finland, an emphasis on open forest data and the associated legal reform represents the institutionalization of a mixed open data-bioeconomy discourse, pushed by higher-level institutional requirements towards greater openness and shaped by changing actor dynamics in relation to diverse forest values. In Canada, a strong institutional lock-in around public-private partnerships in forest management has engendered an approach that is based on voluntary data-sharing agreements and fragmented data management, conforming with the entrenched interests of autonomous sub-national actors and thus extending the path dependence of forest governance to forest data governance. We conclude by proposing how the framework could be further developed and tested to help explain which factors condition the formation of natural resource data institutions and subsequently the (re-)distribution of benefits they govern. Transparent and efficient data approaches can be enabled only if the analysis of data institutions receives attention equal to that given to the technological development of data solutions…(More)”.
Designing Data Spaces: The Ecosystem Approach to Competitive Advantage
Open access book edited by Boris Otto, Michael ten Hompel, and Stefan Wrobel: “…provides a comprehensive view of data ecosystems and platform economics, from methodological and technological foundations up to reports on practical implementations and applications in various industries.
To this end, the book is structured in four parts: Part I, “Foundations and Contexts,” provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II, “Data Space Technologies,” subsequently details various implementation aspects of IDS and GAIA-X, including, for example, data usage control, the use of blockchain technologies, and semantic data integration and interoperability. Next, Part III describes various “Use Cases and Data Ecosystems” from application areas such as agriculture, healthcare, industry, energy, and mobility. Part IV concludes with an overview of several “Solutions and Applications,” including products and experiences from companies such as Google, SAP, Huawei, T-Systems, Innopay and many more.
Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook on future developments. In doing so, it aims to advance the vision of a social data market economy based on data spaces that embrace trust and data sovereignty…(More)”.
Identifying and addressing data asymmetries so as to enable (better) science
Paper by Stefaan Verhulst and Andrew Young: “As a society, we need to become more sophisticated in assessing and addressing data asymmetries—and their resulting political and economic power inequalities—particularly in the realm of open science, research, and development. This article seeks to start filling the analytical gap regarding data asymmetries globally, with a specific focus on the asymmetrical availability of privately-held data for open science, and a look at current efforts to address these data asymmetries. It provides a taxonomy of asymmetries, as well as both their societal and institutional impacts. Moreover, this contribution outlines a set of solutions that could provide a toolbox for open science practitioners and data demand-side actors that stand to benefit from increased access to data. The concept of data liquidity (and portability) is explored at length in connection with efforts to generate an ecosystem of responsible data exchanges. We also examine how data holders and demand-side actors are experimenting with new and emerging operational models and governance frameworks for purpose-driven, cross-sector data collaboratives that connect previously siloed datasets. Key solutions discussed include professionalizing and re-imagining data steward roles and functions (i.e., individuals or groups who are tasked with managing data and their ethical and responsible reuse within organizations). We present these solutions through case studies on notable efforts to address science data asymmetries. We examine these cases using a repurposable analytical framework that could inform future research. We conclude with recommended actions that could support the creation of an evidence base on work to address data asymmetries and unlock the public value of greater science data liquidity and responsible reuse…(More)”.
IPR and the Use of Open Data and Data Sharing Initiatives by Public and Private Actors
Study commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the Committee on Legal Affairs: “This study analyses recent developments in data related practice, law and policy as well as the current legal framework for data access, sharing, and use in the European Union. The study identifies particular issues of concern and highlights respective need for action. On this basis, the study evaluates the Commission’s proposal for a Data Act…(More)”.
Mobile Big Data for Cities: Urban climate resilience strategies for low- and middle-income countries
GSMA Report: “Cities in low- and middle-income countries (LMICs) are increasingly vulnerable to the impacts of climate change, including rising sea levels and storm surges, heat stress, extreme precipitation, inland and coastal flooding and landslides. The physical effects of climate change have disrupted supply chains, led to lost productivity from health issues and incurred costs associated with rebuilding or repairing physical assets, such as buildings and transport infrastructure.
Faced with the adverse effects of climate change, municipal governments and systems often lack the adaptive capacity or resources to keep up. The adaptive capacity of cities can, however, be enhanced by access to more comprehensive and real-time data. Such data will give municipal agencies the ability to watch events as they unfold, understand how demand patterns are changing and respond with faster and lower-cost solutions. This provides a solid basis for innovative data sources, such as mobile big data (MBD), to help strengthen urban climate resilience.
This study highlights the potential value of using mobile big data (MBD) in preparing for and responding to climate-related disasters in cities. In line with the “3As” of urban climate resilience, a framework adopted by the GSMA Mobile for Development programme, this study examines how MBD could help cities and their populations adapt to multiple long-term challenges brought about by climate change, anticipate climate hazards or events and/or absorb (face, manage and recover from) adverse conditions, emergencies or disasters…(More)”.
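The report stays at the level of use cases rather than implementation. As a purely illustrative sketch (all figures and district names are hypothetical, and only pre-aggregated, anonymised counts are assumed), comparing daily device counts per district against a pre-event baseline is one way such data could flag likely displacement after a flood:

```python
import pandas as pd

# Hypothetical, pre-aggregated and anonymised mobile presence counts:
# daily number of devices observed per city district (no individual-level data).
baseline = pd.DataFrame({
    "district": ["A", "B", "C"],
    "avg_daily_devices": [52000, 31000, 18000],  # pre-event 4-week average
})
post_event = pd.DataFrame({
    "district": ["A", "B", "C"],
    "daily_devices": [23000, 34000, 29000],  # day after a flood event
})

merged = baseline.merge(post_event, on="district")
merged["pct_change"] = (
    (merged["daily_devices"] - merged["avg_daily_devices"])
    / merged["avg_daily_devices"] * 100
)

# Districts with sharp drops may indicate displacement (people leaving);
# sharp rises may indicate areas receiving displaced residents.
print(merged.sort_values("pct_change"))
```

Sharp drops relative to baseline can point to districts that people have left, while rises can indicate areas absorbing displaced residents; this is the kind of near-real-time signal the report argues can support the "anticipate" and "absorb" capacities in the 3As framework.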
Efficient and stable data-sharing in a public transit oligopoly as a coopetitive game
Paper by Qi Liu and Joseph Y.J. Chow: “In this study, various forms of data sharing are axiomatized. A new way of studying coopetition, especially data-sharing coopetition, is proposed. The problem of the Bayesian game with signal dependence on actions is identified, and a method to handle such dependence is proposed. We focus on fixed-route transit service markets. A discrete model is first presented to analyze the data-sharing coopetition of an oligopolistic transit market when an externality effect exists. Given a fixed data-sharing structure, a Bayesian game is used to capture the competition under uncertainty while a coalition formation model is used to determine the stable data-sharing decisions. A new method of composite coalition is proposed to study efficient markets. An alternative continuous model is proposed to handle large networks using simulation. We apply these models to various types of networks. Test results show that perfect information may lead to perfect selfishness. Sharing more data does not necessarily improve transit service for all groups, at least if transit operators remain non-cooperative. Service complementarity does not necessarily guarantee a grand data-sharing coalition. These results can provide insights for policymaking, such as whether city authorities should enforce compulsory data sharing along with cooperation between operators or set up a voluntary data-sharing platform…(More)”.
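The excerpt does not reproduce the paper's formal model. As a loose illustration of what a coalition-stability check involves, the sketch below tests whether a hypothetical payoff split of a grand data-sharing coalition among three transit operators can be blocked by any subcoalition (a core-style condition); the coalition values and the split are invented for illustration and are not taken from the paper:

```python
from itertools import chain, combinations

operators = ["Op1", "Op2", "Op3"]

# Hypothetical value (e.g., ridership or revenue gain) of each data-sharing coalition.
# The paper derives such values from a Bayesian game; these numbers are made up.
value = {
    frozenset({"Op1"}): 4, frozenset({"Op2"}): 3, frozenset({"Op3"}): 2,
    frozenset({"Op1", "Op2"}): 9, frozenset({"Op1", "Op3"}): 7,
    frozenset({"Op2", "Op3"}): 6,
    frozenset({"Op1", "Op2", "Op3"}): 13,
}

def subcoalitions(members):
    """All non-empty proper subsets of a coalition."""
    members = list(members)
    return chain.from_iterable(combinations(members, r) for r in range(1, len(members)))

def is_blocked(allocation):
    """An allocation is blocked if some subcoalition could earn more on its own."""
    for sub in subcoalitions(operators):
        if sum(allocation[op] for op in sub) < value[frozenset(sub)]:
            return True
    return False

# Hypothetical split of the grand-coalition value (sums to 13).
allocation = {"Op1": 5.5, "Op2": 4.5, "Op3": 3.0}
print("Grand data-sharing coalition stable:", not is_blocked(allocation))
```

With the illustrative values above, no subcoalition can improve on the proposed split, so the grand coalition is stable; the paper's finding that service complementarity does not guarantee a grand data-sharing coalition corresponds to cases where no such unblocked split exists.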