Blockchain for the public good


Blog by Camille Crittenden: “Over the last year, I have had the privilege to lead the California Blockchain Working Group, which delivered its report to the Legislature in early July. Established by AB 2658, the 20-member Working Group comprised experts with backgrounds in computer science, cybersecurity, information technology, law, and policy. We were charged with drafting a working definition of blockchain, providing advice to State offices and agencies considering implementation of blockchain platforms, and offering guidance to policymakers to foster an open and equitable regulatory environment for the technology in California.

What did we learn? Enough to make a few outright recommendations as well as identify areas where further research is warranted.

A few guiding principles: Refine the application of blockchain systems first on things, not people. This could mean implementations of blockchain for tracing food from farms to stores to reduce the economic and human harm of food-borne illnesses; reducing paperwork and increasing reliability of tracing vehicles and parts from manufacturing floor to consumer to future owners or dismantlers; improving workflows for digitizing, cataloging and storing the reams of documents held in the State Archives.

Similarly, blockchain solutions could be implemented for public vital records, such as birth, death and marriage certificates or real estate titles without risk of compromising private information. Greater caution should be taken in applications that affect public service delivery to populations in precarious circumstances, such as the homeless or unemployed. Overarching problems to address, especially for sensitive records, include the need for reliable, persistent digital identification and the evolving requirements for cybersecurity….

The Working Group’s final report, Blockchain in California: A Roadmap, avoids the magical thinking or technological solutionism that sometimes attends shiny new tech ideas. Blockchain won’t cure Covid-19, fix systemic racism, or reverse alarming unemployment trends. But if implemented conscientiously on a case-by-case basis, it could make a dent in improving health outcomes, increasing autonomy for property owners and consumers, and alleviating some bureaucratic practices that may be a drag on the economy. And those are contributions we can all welcome….(More)”.

Medical data has a silo problem. These models could help fix it.


Scott Khan at the WEF: “Every day, more and more data about our health is generated. Data that, if analyzed, could hold the key to unlocking cures for rare diseases, help us manage our health risk factors and provide evidence for public policy decisions. However, due to the highly sensitive nature of health data, much of it is out of reach to researchers, halting discovery and innovation. The problem is amplified further in the international context when governments naturally want to protect their citizens’ privacy and therefore restrict the movement of health data across international borders. To address this challenge, governments will need to pursue a special approach to policymaking that acknowledges new technology capabilities.

Understanding data silos

Data becomes siloed for a range of well-considered reasons, including restrictions on terms of use (e.g., commercial, non-commercial, disease-specific), regulations imposed by governments (e.g., Safe Harbor, privacy), and an inability to obtain informed consent from historically marginalized populations.

Siloed data, however, also creates a range of problems for researchers looking to make that data useful to the general population. Silos, for example, block researchers from accessing the most up-to-date information or the most diverse, comprehensive datasets. They can slow the development of new treatments and therefore curtail key findings that could lead to much-needed therapies or cures.

Even when these challenges are overcome, incidents of data misuse – where health data is used to explore non-health-related topics or is used without an individual’s consent – continue to erode public trust in the same research institutions that depend on such data to advance medical knowledge.

Solving this problem through technology

Technology designed to better protect and decentralize data is being developed to address many of these challenges. Techniques such as homomorphic encryption (a cryptosystem that allows computations to be performed on encrypted data without decrypting it) and differential privacy (a system for leveraging information about a group without revealing details about individuals) both provide means to protect and centralize data while distributing the control of its use to the parties that steward the respective data sets.
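
To make the differential privacy idea above concrete, the sketch below adds calibrated Laplace noise to an aggregate count so that a group-level statistic can be shared without revealing whether any particular individual is in the data. It is a minimal illustration rather than code from the article: the function, dataset and parameter names are invented, and a real deployment would also need careful privacy-budget accounting.

```python
import numpy as np

def dp_count(records, predicate, epsilon=0.5):
    """Differentially private count of records satisfying `predicate`.

    Adding or removing one individual changes the true count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon yields
    epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy example: publish roughly how many patients in a cohort carry a risk factor
# without revealing whether any particular patient does.
cohort = [{"id": i, "risk_factor": i % 3 == 0} for i in range(1000)]
print(dp_count(cohort, lambda r: r["risk_factor"], epsilon=0.5))
```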

Federated data analysis leverages a special type of distributed database management system that offers an alternative to centralization: encoded data can be analyzed in place, without moving the data sets across jurisdictions or between institutions. Such an approach can help connect data sources while accounting for privacy. To further build trust in the system, a federated model can be implemented to return only encoded data, preventing unauthorized distribution of the data and of learnings resulting from the research activity.
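
The federated pattern described above can be pictured as bringing the computation to the data: each institution computes an aggregated or encoded summary locally, and only that summary crosses the institutional boundary. The sketch below is a deliberately simplified, hypothetical illustration of that flow (no particular federated framework or encoding scheme is implied); it computes a pooled mean without any raw record leaving its site.

```python
from typing import Dict, List

def local_summary(site_records: List[dict]) -> Dict[str, float]:
    """Runs inside each institution; raw records never leave the site.

    Only aggregate sufficient statistics (a count and a sum) are returned.
    """
    values = [r["biomarker"] for r in site_records]
    return {"n": float(len(values)), "sum": float(sum(values))}

def federated_mean(site_summaries: List[Dict[str, float]]) -> float:
    """The coordinator combines per-site summaries without seeing raw data."""
    total_n = sum(s["n"] for s in site_summaries)
    total_sum = sum(s["sum"] for s in site_summaries)
    return total_sum / total_n

# Each hospital runs local_summary() on its own records; only the summaries move.
site_a = [{"biomarker": 1.2}, {"biomarker": 0.9}]
site_b = [{"biomarker": 1.5}, {"biomarker": 1.1}, {"biomarker": 0.8}]
print(federated_mean([local_summary(site_a), local_summary(site_b)]))
```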

To be sure, within every discussion of the analysis of aggregated data lie challenges with data fusion – between data sets, between different studies, between data silos, between institutions. Despite there being several data standards that could be used, most data exist within bespoke data models built for a single purpose rather than for the facilitation of data sharing and data fusion. Furthermore, even when data has been captured in a standardized data model (e.g., the Global Alliance for Genomics and Health offers some models for standardizing sensitive health data), many data sets are still narrowly defined. They often lack any shared identifiers that would allow data from different sources to be combined into a coherent aggregate data source useful for research. Within a model of data centralization, data fusion can be addressed through data curation of each data set, whereas within a federated model, data fusion is much more vexing….(More)”.
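
As a small illustration of why shared identifiers matter for data fusion, the hypothetical snippet below joins a genomic extract with a clinical extract: fusion is straightforward only because both carry the same participant_id, and records without a match simply fall out. The field names and values are invented for illustration, not drawn from any real data model.

```python
# Two extracts captured under different bespoke data models.
genomic = [
    {"participant_id": "P001", "variant": "BRCA1"},
    {"participant_id": "P002", "variant": "TP53"},
]
clinical = [
    {"participant_id": "P001", "diagnosis": "breast cancer"},
    {"participant_id": "P003", "diagnosis": "melanoma"},
]

# Fusion works here only because both extracts share `participant_id`.
clinical_by_id = {row["participant_id"]: row for row in clinical}
fused = [
    {**g, **clinical_by_id[g["participant_id"]]}
    for g in genomic
    if g["participant_id"] in clinical_by_id  # unmatched records drop out
]
print(fused)
# [{'participant_id': 'P001', 'variant': 'BRCA1', 'diagnosis': 'breast cancer'}]
```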

The European data market


European Commission: “The first European Data Market study (SMART 2013/0063), contracted by the European Commission in 2013, made an initial attempt to provide facts and figures on the size and trends of the EU data economy by developing a European data market monitoring tool.

The final report of the updated European Data Market (EDM) study (SMART 2016/0063) now presents in detail the results of the final round of measurement of the updated European Data Market Monitoring Tool contracted for the 2017-2020 period.

Designed along a modular structure, as a first pillar of the study, the European Data Market Monitoring Tool is built around a core set of quantitative indicators that provide a series of assessments of the emerging data market at present, i.e. for the years 2018 through 2020, with projections to 2025.

The key areas covered by the indicators measured in this report are:

  • The data professionals and the balance between demand and supply of data skills;
  • The data companies and their revenues;
  • The data user companies and their spending for data technologies;
  • The market of digital products and services (“Data market”);
  • The data economy and its impacts on the European economy;
  • Forecast scenarios of all the indicators, based on alternative market trajectories.

Additionally, as a second major work stream, the study also presents a series of descriptive stories providing a complementary view to the one offered by the Monitoring Tool (for example, “How Big Data is driving AI” or “The Secondary Use of Health Data and Data-driven Innovation in the European Healthcare Industry”), adding fresh, real-life information around the quantitative indicators. By focusing on specific issues and aspects of the data market, the stories offer an initial, indicative “catalogue” of good practices of what is happening in the data economy today in Europe and what is likely to affect the development of the EU data economy in the medium term.

Finally, as a third work stream of the study, a landscaping exercise on the EU data ecosystem was carried out, together with community-building activities to bring together stakeholders from all segments of the data value chain. The map containing the results of the landscaping of the EU data economy and the reports from the webinars organised by the study are available on the www.datalandscape.eu website….(More)”.

The National Cancer Institute Cancer Moonshot Public Access and Data Sharing Policy—Initial assessment and implications


Paper by Tammy M. Frisby and Jorge L. Contreras: “Since 2013, federal research-funding agencies have been required to develop and implement broad data sharing policies. Yet agencies today continue to grapple with the mechanisms necessary to enable the sharing of a wide range of data types, from genomic and other -omics data to clinical and pharmacological data to survey and qualitative data. In 2016, the National Cancer Institute (NCI) launched the ambitious $1.8 billion Cancer Moonshot Program, which included a new Public Access and Data Sharing (PADS) Policy applicable to funding applications submitted on or after October 1, 2017. The PADS Policy encourages the immediate public release of published research results and data and requires all Cancer Moonshot grant applicants to submit a PADS plan describing how they will meet these goals. We reviewed the PADS plans submitted with approximately half of all funded Cancer Moonshot grant applications in fiscal year 2018, and found that a majority did not address one or more elements required by the PADS Policy. Many such plans made no reference to the PADS Policy at all, and several referenced obsolete or outdated National Institutes of Health (NIH) policies instead. We believe that these omissions arose from a combination of insufficient education and outreach by NCI concerning its PADS Policy, both to potential grant applicants and among NCI’s program staff and external grant reviewers. We recommend that other research funding agencies heed these findings as they develop and roll out new data sharing policies….(More)”.

The Computermen


Podcast Episode by Jill Lepore: “In 1966, just as the foundations of the Internet were being imagined, the federal government considered building a National Data Center. It would be a centralized federal facility to hold computer records from each federal agency, in the same way that the Library of Congress holds books and the National Archives holds manuscripts. Proponents argued that it would help regulate and compile the vast quantities of data the government was collecting. Quickly, though, fears about privacy, government conspiracies, and government ineptitude buried the idea. But now, that National Data Center looks like a missed opportunity to create rules about data and privacy before the Internet took off. And in the absence of government action, corporations have made those rules themselves….(More)”.

Social-Change Games Can Help Us Understand the Public Health Choices We Face


Blog by the Hastings Center: “Before there was the Covid-19 pandemic, there was Pandemic. This tabletop game, in which players collaborate to fight disease outbreaks, debuted in 2007. Expansions feature weaponized pathogens, historic pandemics, zoonotic diseases, and vaccine development races. Game mechanics modelled on pandemic vectors provide multiple narratives: battle, quest, detection, discovery. There is satisfaction in playing “against” disease – and winning.

Societies globally are responding to Covid-19 under differing political and economic conditions. In the United States, these conditions include mass unemployment and entrenched social inequalities that drive health disparities by race, class, and neighborhood. A real pandemic is not as tidy as a game. But can games, and the immense appetite for them, support understanding of the societal challenges we now face? Yes.

A well-designed game is structured as a flow chart or a decision tree. Games simulate challenges, require choices, and allow players to see the consequences of their decisions. Visual and narrative elements enhance these vicarious experiences. Game narratives can engage human capacities such as empathy, helping us to imagine the perspectives of people unlike ourselves. In The Waiting Game (2018), an award-winning digital single-player game designed by news outlets ProPublica and WNYC and game design firm Playmatics, the player starts by choosing one of five characters representing asylum seekers. The player is immersed in a day-by-day depiction of their character’s journey and experiences. Each “day,” the player must make a choice: give up or keep going?

Games can also engage the moral imagination by prompting players to reflect on competing values and implicit biases. In the single-player game Parable of the Polygons (2014), a player moves emoji-like symbols into groups. This quick game visualizes how decisions aimed at making members of a community happier can undermine a shared commitment to diversity when happiness relies on living near people “like me.” It is free to play on the website of Games for Change (G4C), a nonprofit organization that promotes the development and use of games to imagine and respond to real-world problems.
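
The dynamic Parable of the Polygons visualizes is essentially a playable version of Thomas Schelling’s well-known segregation model: even a mild individual preference for neighbours “like me” can tip a mixed community into a segregated one. The sketch below is a bare-bones, one-dimensional version of that idea, written purely for illustration; it is our own simplification, not the game’s actual code or rules.

```python
import random

random.seed(0)
THRESHOLD = 0.34   # an agent wants at least roughly a third of its neighbours to match it
STEPS = 2000

# A 1-D "neighbourhood": two kinds of agents ('▲', '■') plus empty cells (None).
cells = ['▲'] * 24 + ['■'] * 24 + [None] * 12
random.shuffle(cells)

def neighbours(i):
    """Non-empty immediate neighbours of cell i."""
    return [cells[j] for j in (i - 1, i + 1)
            if 0 <= j < len(cells) and cells[j] is not None]

def unhappy(i):
    """True if fewer than THRESHOLD of agent i's neighbours are its own type."""
    ns = neighbours(i)
    return bool(ns) and sum(n == cells[i] for n in ns) / len(ns) < THRESHOLD

def segregation_index():
    """Average fraction of same-type neighbours across all agents (crude index)."""
    scores = [sum(n == c for n in neighbours(i)) / len(neighbours(i))
              for i, c in enumerate(cells) if c is not None and neighbours(i)]
    return sum(scores) / len(scores)

print("segregation index before:", round(segregation_index(), 2))
for _ in range(STEPS):
    movers = [i for i, c in enumerate(cells) if c is not None and unhappy(i)]
    if not movers:                       # everyone is content; movement stops
        break
    empties = [i for i, c in enumerate(cells) if c is None]
    i, j = random.choice(movers), random.choice(empties)
    cells[j], cells[i] = cells[i], None  # the unhappy agent relocates to an empty cell
print("segregation index after: ", round(segregation_index(), 2))
```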

Also in the G4C arcade is Cards Against Calamity (2018), which focuses on local governance in a coastal town. This game, developed by 1st Playable Productions and the Environmental Law Institute, aims to help local policymakers foresee community planning challenges in balancing environmental protections and economic interests. Plague Inc. (2012) flips the Pandemic script by having players assume the pathogen role, winning by spreading. This game has been used as a teaching tool and has surged in popularity during disease outbreaks: in January 2020, its designers issued a statement reminding players that Plague Inc. should not be used for pandemic modeling….(More)”.

The Sisyphean Cycle of Technology Panics


Paper by Amy Orben: “Widespread concerns about new technologies – whether they be novels, radios or smartphones – are repeatedly found throughout history. While past panics are often met with amusement today, current concerns routinely engender large research investments and policy debate. What we learn from studying past technological panics, however, is that these investments are often inefficient and ineffective. What causes technological panics to repeatedly reincarnate? And why does research routinely fail to address them?

To answer such questions, this article examines the network of political, population and academic factors driving the Sisyphean Cycle of Technology Panics. In this cycle, psychologists are encouraged to spend time investigating new technologies, and how they affect children and young people, to calm a worried population. Their endeavour is, however, rendered ineffective by the lack of a theoretical baseline; researchers cannot build on what has been learnt researching past technologies of concern. Thus academic study seemingly restarts for each new technology of interest, slowing down the policy interventions necessary to ensure technologies are benefitting society. This article highlights how the Sisyphean Cycle of Technology Panics stymies psychology’s positive role in steering technological change, and underscores the pervasive need for improved research and policy approaches to new technologies….(More)”.

Improving Governance with Policy Evaluation


OECD Report: “Policy evaluation is a critical element of good governance, as it promotes public accountability and contributes to citizens’ trust in government. Evaluation helps ensure that decisions are rooted in trustworthy evidence and deliver desired outcomes. Drawing on the first significant cross-country survey of policy evaluation practices covering 42 countries, this report offers a systemic analysis of the institutionalisation, quality and use of evaluation across countries and looks at how these three dimensions interrelate.

The report also covers cross-cutting aspects related to regulatory assessment and performance budgeting. The analysis illustrates the role and functions of key institutions within the executive, such as centres of government and ministries of finance. It also underlines the role of supreme audit institutions….(More)”.

Public understanding and perceptions of data practices: a review of existing research


Report by Helen Kennedy, Susan Oman, Mark Taylor, Jo Bates & Robin Steedman: “The ubiquitous collection and use of digital data is said to have wide-ranging effects. As these practices expand, interest in how the public perceives them has begun to grow. Understanding public views of data practices is considered to be important, to ensure that data works ‘for people and society’ (the mission of the Ada Lovelace Institute) and is ‘a force for good’ (an aim of the government Centre for Data Ethics and Innovation).

To improve understanding of public views of data practices, we conducted a review of original empirical research into public perceptions of, attitudes toward and feelings about data practices. We use the term ‘data practices’ to refer to the systematic collection, analysis and sharing of data and the outcomes of these processes. The data at the centre of such practices is often personal data, and related research often focuses on this data. Our review also covered related phenomena such as AI and facial recognition.

We carried out a systematic search of online academic research databases and a manual search that began with literature with which we were already familiar and then snowballed out. Our review covered a broad range of academic disciplines and grey literature – that is, literature produced by independent, civil society, third sector, governmental or commercial organisations, or by academics for non-academic audiences. It focused on the past five years. We excluded a) literature about children’s understandings and perceptions of data practices, because this is a specialist area beyond our remit, and b) literature focused on the health domain, because high-quality syntheses of literature focusing on this domain already exist. The grey literature we reviewed focused on the UK, whereas the academic literature was international….(More)”.

The Ages of Globalization: Geography, Technology, and Institutions


Book by Jeffrey D. Sachs: “Today’s most urgent problems are fundamentally global. They require nothing less than concerted, planetwide action if we are to secure a long-term future. But humanity’s story has always been on a global scale. In this book, Jeffrey D. Sachs, renowned economist and expert on sustainable development, turns to world history to shed light on how we can meet the challenges and opportunities of the twenty-first century.

Sachs takes readers through a series of seven distinct waves of technological and institutional change, starting with the original settling of the planet by early modern humans through long-distance migration and ending with reflections on today’s globalization. Along the way, he considers how the interplay of geography, technology, and institutions influenced the Neolithic revolution; the role of the horse in the emergence of empires; the spread of large land-based empires in the classical age; the rise of global empires after the opening of sea routes from Europe to Asia and the Americas; and the industrial age. The dynamics of these past waves, Sachs demonstrates, offer fresh perspective on the ongoing processes taking place in our own time—a globalization based on digital technologies. Sachs emphasizes the need for new methods of international governance and cooperation to prevent conflicts and to achieve economic, social, and environmental objectives aligned with sustainable development. The Ages of Globalization is a vital book for all readers aiming to make sense of our rapidly changing world….(More)”.