Private sector access to public sector personal data: exploring data value and benefit sharing


Literature review for the Scottish Government: “The aim of this review is to enable the Scottish Government to explore the issues relevant to the access of public sector personal data (as defined by the European Union General Data Protection Regulation, GDPR) with or by the private sector in publicly trusted ways, to unlock the public benefit of this data. This literature review will specifically enable the Scottish Government to establish whether there are

(I) models/approaches of costs/benefits/data value/benefit-sharing, and

(II) intellectual property rights or royalties schemes regarding the use of public sector personal data with or by the private sector both in the UK and internationally.

In conducting this literature review, we used an adapted systematic review, and undertook thematic analysis of the included literature to answer several questions central to the aim of this research. Such questions included:

  • Are there any models of costs and/or benefits regarding the use of public sector personal data with or by the private sector?
  • Are there any models of valuing data regarding the use of public sector personal data with or by the private sector?
  • Are there any models for benefit-sharing in respect of the use of public sector personal data with or by the private sector?
  • Are there any models in respect of the use of intellectual property rights or royalties regarding the use of public sector personal data with or by the private sector?…(More)”.

Experts: 90% of Online Content Will Be AI-Generated by 2026


Article by Maggie Harrison: “Don’t believe everything you see on the Internet” has been pretty standard advice for quite some time now. And according to a new report from European law enforcement group Europol, we have all the reason in the world to step up that vigilance.

“Experts estimate that as much as 90 percent of online content may be synthetically generated by 2026,” the report warned, adding that synthetic media “refers to media generated or manipulated using artificial intelligence.”

“In most cases, synthetic media is generated for gaming, to improve services or to improve the quality of life,” the report continued, “but the increase in synthetic media and improved technology has given rise to disinformation possibilities.”…

The report focused pretty heavily on disinformation, notably that driven by deepfake technology. But that 90 percent figure raises other questions, too — what do AI systems like DALL-E and GPT-3 mean for artists, writers, and other content-generating creators? And circling back to disinformation once more, what will the dissemination of information, not to mention the consumption of it, actually look like in an era driven by that degree of AI-generated digital stuff?…(More)”.

Toward a 21st Century National Data Infrastructure: Enhancing Survey Programs by Using Multiple Data Sources


Report by National Academies of Sciences, Engineering, and Medicine: “Much of the statistical information currently produced by federal statistical agencies – information about economic, social, and physical well-being that is essential for the functioning of modern society – comes from sample surveys. In recent years, there has been a proliferation of data from other sources, including data collected by government agencies while administering programs, satellite and sensor data, private-sector data such as electronic health records and credit card transaction data, and massive amounts of data available on the internet. How can these data sources be used to enhance the information currently collected on surveys, and to provide new frontiers for producing information and statistics to benefit American society?…(More)”.

Promoting Sustainable Data Use in State Programs


Toolkit by Chapin Hall: “…helps public sector agencies build the culture and infrastructure to apply data analysis routinely, effectively, and accurately—what we call “sustainable data use.” It is meant to serve as a hands-on resource, containing strategies and tools for agencies seeking to grow their analytic capacity.

Administrative data can be a rich source of information for human services agencies seeking to improve programs. But too often, data use in state agencies is temporary, dependent on funds and training from short-term resources such as pilot projects and grants. How can agencies instead move from data to knowledge to action routinely, creating a reinforcing cycle of evidence-building and program improvement?

Chapin Hall experts and experts at partner organizations set out to determine who achieves sustainable data use and how they go about doing so. Building on previous work and the results of a literature review, we identified domains that can significantly influence an agency’s ability to establish sustainable data practices. We then focused on eight state TANF agencies and three partner organizations with demonstrated successes in one or more of these domains, and we interviewed staff who work directly with data to learn more about what strategies they used to achieve success. We focused on what worked rather than what didn’t. From those interviews, we identified common themes, developed case studies, and generated tools to help agencies develop sustainable data practices…(More)”.

The adoption of innovation in international development organisations


OECD Report: “Addressing 21st century development challenges requires investments in innovation, including the use of new approaches and technologies. Currently, many development organisations prioritise investments in isolated innovation pilots that leverage a specific approach or technology rather than pursuing a strategic approach to expand the organisation’s toolbox with innovations that have proven their comparative advantage over what is currently used. This Working Paper addresses the challenge of adopting innovations. How can development organisations institutionalise a new way of working, bringing what was once novel to the core of how business is done? Analysing successful adoption efforts across five DAC agencies, the paper lays out a proposed process for the adoption of innovations. The paper features five case studies and concludes with a set of lessons and recommendations for policy makers on innovation management generally, and adoption of innovation in particular…(More)”.

Health Data Sharing to Support Better Outcomes: Building a Foundation of Stakeholder Trust


A Special Publication from the National Academy of Medicine: “The effective use of data is foundational to the concept of a learning health system—one that leverages and shares data to learn from every patient experience, and feeds the results back to clinicians, patients and families, and health care executives to transform health, health care, and health equity. More than ever, the American health care system is in a position to harness new technologies and new data sources to improve individual and population health.

Learning health systems are driven by multiple stakeholders—patients, clinicians and clinical teams, health care organizations, academic institutions, government, industry, and payers. Each stakeholder group has its own sources of data, its own priorities, and its own goals and needs with respect to sharing that data. However, in America’s current health system, these stakeholders operate in silos without a clear understanding of the motivations and priorities of other groups. The three stakeholder working groups that served as the authors of this Special Publication identified many cultural, ethical, regulatory, and financial barriers to greater data sharing, linkage, and use. What emerged was the foundational role of trust in achieving the full vision of a learning health system.

This Special Publication outlines a number of potentially valuable policy changes and actions that will help drive toward effective, efficient, and ethical data sharing, including more compelling and widespread communication efforts to improve awareness, understanding, and participation in data sharing. Achieving the vision of a learning health system will require eliminating the artificial boundaries that exist today among patient care, health system improvement, and research. Breaking down these barriers will require an unrelenting commitment across multiple stakeholders toward a shared goal of better, more equitable health.

We can improve together by sharing and using data in ways that produce trust and respect. Patients and families deserve nothing less…(More)”.

Experimentation spaces for regulatory learning


Staff Working Document by the European Commission: “…one of the actions of the New European Innovation Agenda sets out available experimentation tools (especially regulatory sandboxes, but also testbeds and living labs) and showcases existing examples from Europe and beyond on how the European Union and national governments can support and engage innovators in the regulatory process.

Experimentation is a key component of innovation. European innovators are facing new challenges, including different or limited experimentation spaces and the regulations that apply to them.

The Staff Working Document presents a general overview of these experimentation spaces and includes a special focus on the energy sector, in line with the REPowerEU Communication.

The New European Innovation Agenda, adopted on 5 July 2022, aims to position Europe at the forefront of the new wave of deep tech innovation and start-ups. It will help Europe to develop new technologies to address the most pressing societal challenges, and to bring them to market. Europe wants to be the place where the best talent works hand in hand with the best companies and where deep tech innovation thrives and creates breakthrough innovative solutions across the continent.

One of the five flagships of the New European Innovation Agenda refers to “enabling deep tech innovation through experimentation spaces and public procurement”. It includes this guidance document on experimentation spaces as one of the main deliverables, together with a revised state aid framework for Research and Development, experimentation facilities for AI innovation and the setting-up of an “Innovation Friendly Regulations Advisory Group” working on virtual worlds.

Regulatory sandboxes are schemes that enable the testing of innovations in a controlled real-world environment, and may include a temporary loosening of applicable rules while safeguarding regulatory objectives such as safety and consumer protection.

Test beds are experimentation spaces with a technological focus that do not necessarily have a regulatory component.

Living labs are based on co-creation and on the experience and involvement of users and citizens…(More)”.

Artificial Intelligence in Health Care: The Hope, the Hype, the Promise, the Peril


Special Publication by the National Academy of Medicine (NAM): “The emergence of artificial intelligence (AI) in health care offers unprecedented opportunities to improve patient and clinical team outcomes, reduce costs, and impact population health. While there have been a number of promising examples of AI applications in health care, it is imperative to proceed with caution or risk the potential of user disillusionment, another AI winter, or further exacerbation of existing health- and technology-driven disparities.

This Special Publication synthesizes current knowledge to offer a reference document for relevant health care stakeholders. It outlines the current and near-term AI solutions; highlights the challenges, limitations, and best practices for AI development, adoption, and maintenance; offers an overview of the legal and regulatory landscape for AI tools designed for health care application; prioritizes the need for equity, inclusion, and a human rights lens for this work; and outlines key considerations for moving forward.

AI is poised to make transformative and disruptive advances in health care, but it is prudent to balance the need for thoughtful, inclusive health care AI that plans for and actively manages and reduces potential unintended consequences, while not yielding to marketing hype and profit motives…(More)”.

Primer on Data Sharing


Primer by John Ure: “…encapsulates insights gleaned from the Inter-Modal Transport Data Sharing Programme, a collaborative effort known as Data Trust 1.0 (DT1), conducted in Hong Kong between 2020 and 2021. This initiative was a pioneering project that explored the feasibility of sharing operational data between public transport entities through a Trusted Third Party. The objective was to overcome traditional data silos and promote evidence-based public transport planning.

DT1, led by the ‘HK Team’ in conjunction with Dr. Jiangping Zhou and colleagues from the University of Hong Kong, successfully demonstrated that data sharing between public transport companies, both privately-owned and government-owned, was viable. Operational data, anonymised and encrypted, were shared with a Trusted Third Party and aggregated for analysis, supported by a Transport Data Analytics Service Provider. The data was used solely for analysis purposes, and confidentiality was maintained throughout.
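The data-handling pattern described above (pseudonymise operator records before they leave the data owner, then share only aggregated outputs with the Trusted Third Party) can be sketched roughly as follows. This is a minimal illustrative sketch, not the actual DT1 pipeline or its Data Sharing Framework: the column names, the salted-hash pseudonymisation, and the small-cell suppression threshold are assumptions made for illustration.

```python
# Illustrative sketch only (not the DT1 implementation): a transport operator
# pseudonymises rider identifiers with a salted hash, aggregates trips to
# origin-destination-hour counts, and suppresses small cells before releasing
# the table to a trusted third party for analysis.
import hashlib
import pandas as pd

SALT = "operator-secret-salt"  # hypothetical; held by the data owner, never shared


def pseudonymise(rider_id: str) -> str:
    """Replace a raw rider ID with a salted SHA-256 digest."""
    return hashlib.sha256((SALT + rider_id).encode("utf-8")).hexdigest()


def prepare_for_sharing(trips: pd.DataFrame, min_cell_size: int = 10) -> pd.DataFrame:
    """Pseudonymise IDs, aggregate, and suppress small cells before release."""
    trips = trips.assign(rider=trips["rider_id"].map(pseudonymise)).drop(columns=["rider_id"])
    agg = (
        trips.groupby(["origin", "destination", "hour"])
        .agg(trip_count=("rider", "size"), unique_riders=("rider", "nunique"))
        .reset_index()
    )
    # Drop cells with too few distinct riders to reduce re-identification risk.
    return agg[agg["unique_riders"] >= min_cell_size]


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "rider_id": ["A1", "A1", "B2", "C3"],
            "origin": ["Central", "Central", "Kowloon", "Central"],
            "destination": ["Kowloon", "Kowloon", "Central", "Kowloon"],
            "hour": [8, 18, 8, 8],
        }
    )
    print(prepare_for_sharing(sample, min_cell_size=1))
```

The key design point the sketch tries to capture is that the salt stays with the data owner and only aggregated, suppressed outputs ever reach the third party, so the shared table supports planning analysis without exposing individual travel histories.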

The establishment of the Data Trust was underpinned by the creation of a comprehensive Data Sharing Framework (DSF). This framework, developed collaboratively, laid the groundwork for future data sharing endeavours. The DSF has been shared internationally, fostering the exchange of knowledge and best practices across diverse organisations and agencies. The Guide serves as a repository of lessons learned, accessible studies, and references, aimed at facilitating a comprehensive understanding of data sharing methodologies.

The central aim of the Guide is twofold: to promote self-learning and to offer clarity on intricate approaches related to data sharing. Its intention is to encourage researchers, governmental bodies, commercial enterprises, and civil society entities, including NGOs, to actively engage in data sharing endeavours. By combining data sets, these stakeholders can glean enhanced insights and contribute to the common good…(More)”.

Creating public sector value through the use of open data


Summary paper prepared as part of data.europa.eu: “This summary paper provides an overview of the different stakeholder activities undertaken, ranging from surveys to a focus group, and presents the key insights from this campaign regarding data reuse practices, barriers to data reuse in the public sector and suggestions to overcome these barriers. The following recommendations are made to help data.europa.eu support public administrations to boost open data value creation.

  • When it comes to raising awareness and communication, any action should also contain examples of data reuse by the public sector. Gathering and communicating such examples and use cases greatly helps in understanding the importance of the role of the public sector as a data reuser.
  • When it comes to policy and regulation, it would be beneficial to align the ‘better regulation’ activities and roadmaps of the European Commission with the open data publication activities, in order to better explore the internal data needs. Furthermore, it would be helpful to facilitate a similar alignment and data needs analysis for all European public administrations. For example, this could be done by providing examples, best practices and methodologies on how to map data needs for policy and regulatory purposes.
  • Existing monitoring activities, such as surveys, should be revised to ensure that data reuse by the public sector is included. It would be useful to create a panel of users, drawn from the existing wide community, that could be called on for further surveys.
  • The role of data stewards remains central to favouring reuse. Therefore, examples, best practices and methodologies on the role of data stewards should be included in the support activities – not specifically for public sector reusers, but in general…(More)”.