Citizen participation in food systems policy making: A case study of a citizens’ assembly


Paper by Bob Doherty et al: “In this article, we offer a contribution to the emerging debate on the role of citizen participation in food system policy making. A key driver is a recognition that solutions to complex challenges in the food system need the active participation of citizens to drive positive change. To achieve this, it is crucial to give citizens agency in the process of designing policy interventions. This requires authentic and reflective engagement with the citizens who are affected by collective decisions. One such participatory approach is the citizens’ assembly, which has been used to deliberate a number of key issues, including climate change, by the UK Parliament’s House of Commons (House of Commons, 2019). Here, we have undertaken an analysis of a citizens’ food assembly organized in the City of York (United Kingdom). This assembly was a way of hearing about a range of local food initiatives in Yorkshire whose aim is both to relocalise food supply and production and to tackle food waste.

These innovative community-based business models, known as ‘food hubs’, are increasing the diversity of food supply, particularly in disadvantaged communities. Among other things, the assembly found that the design of the assembly and the sortition of its members are aided by involving local stakeholders in its planning. It also identified the potential for public procurement at the city level to drive more sustainable sourcing of food provision in the region. Furthermore, the citizens’ assembly galvanized individual agency, with participants proactively seeking opportunities to create prosocial and environmental change in the food system….(More)”.

The Coronavirus Is Rewriting Our Imaginations


Kim Stanley Robinson at the New Yorker: “…We are individuals first, yes, just as bees are, but we exist in a larger social body. Society is not only real; it’s fundamental. We can’t live without it. And now we’re beginning to understand that this “we” includes many other creatures and societies in our biosphere and even in ourselves. Even as an individual, you are a biome, an ecosystem, much like a forest or a swamp or a coral reef. Your skin holds inside it all kinds of unlikely coöperations, and to survive you depend on any number of interspecies operations going on within you all at once. We are societies made of societies; there are nothing but societies. This is shocking news—it demands a whole new world view. And now, when those of us who are sheltering in place venture out and see everyone in masks, sharing looks with strangers is a different thing. It’s eye to eye, this knowledge that, although we are practicing social distancing as we need to, we want to be social—we not only want to be social, we’ve got to be social, if we are to survive. It’s a new feeling, this alienation and solidarity at once. It’s the reality of the social; it’s seeing the tangible existence of a society of strangers, all of whom depend on one another to survive. It’s as if the reality of citizenship has smacked us in the face.

As for government: it’s government that listens to science and responds by taking action to save us. Stop to ponder what is now obstructing the performance of that government. Who opposes it?…

There will be enormous pressure to forget this spring and go back to the old ways of experiencing life. And yet forgetting something this big never works. We’ll remember this even if we pretend not to. History is happening now, and it will have happened. So what will we do with that?

A structure of feeling is not a free-floating thing. It’s tightly coupled with its corresponding political economy. How we feel is shaped by what we value, and vice versa. Food, water, shelter, clothing, education, health care: maybe now we value these things more, along with the people whose work creates them. To survive the next century, we need to start valuing the planet more, too, since it’s our only home.

It will be hard to make these values durable. Valuing the right things and wanting to keep on valuing them—maybe that’s also part of our new structure of feeling. As is knowing how much work there is to be done. But the spring of 2020 is suggestive of how much, and how quickly, we can change. It’s like a bell ringing to start a race. Off we go—into a new time….(More)”.

Viruses Cross Borders. To Fight Them, Countries Must Let Medical Data Flow, Too


Nigel Cory at ITIF: “If nations could regulate viruses the way many regulate data, there would be no global pandemics. But the sad reality is that, in the midst of the worst global pandemic in living memory, many nations make it unnecessarily complicated and costly, if not illegal, for health data to cross their borders. In so doing, they are hindering critically needed medical progress.

In the COVID-19 crisis, data analytics powered by artificial intelligence (AI) is critical to identifying the exact nature of the pandemic and developing effective treatments. The technology can produce powerful insights and innovations, but only if researchers can aggregate and analyze data from populations around the globe. And that requires data to move across borders as part of international research efforts by private firms, universities, and other research institutions. Yet, some countries, most notably China, are stopping health and genomic data at their borders.

Indeed, despite the significant benefits to companies, citizens, and economies that arise from the ability to easily share data across borders, dozens of countries—across every stage of development—have erected barriers to cross-border data flows. These data-residency requirements strictly confine data within a country’s borders, a concept known as “data localization,” and many countries have especially strict requirements for health data.

China is a noteworthy offender, having created a new digital iron curtain that requires data localization for a range of data types, including health data, as part of its so-called “cyber sovereignty” strategy. A May 2019 State Council regulation requires genomic data to be stored and processed locally by Chinese firms and prohibits foreign organizations from doing so. This is in service of China’s mercantilist strategy to advance its domestic life sciences industry. While there has been collaboration between U.S. and Chinese medical researchers on COVID-19, including on clinical trials for potential treatments, these restrictions mean that it won’t involve the transfer, aggregation, and analysis of Chinese personal data, which otherwise might help find a treatment or vaccine. If China truly wanted to make amends for blocking critical information during the early stages of the outbreak in Wuhan, it should abolish this restriction and allow genomic and other health data to cross its borders.

But China is not alone in limiting data flows. Russia requires all personal data, health-related or not, to be stored locally. India’s draft data protection bill permits the government to classify any sensitive personal data as critical personal data and mandate that it be stored and processed only within the country. This would be consistent with recent debates and decisions to require localization for payments data and other types of data. And despite its leading role in pushing for the free flow of data as part of new digital trade agreements, Australia requires genomic and other data attached to personal electronic health records to be stored and processed only within its borders.

Countries also enact de facto barriers to health and genomic data transfers by making it harder and more expensive, if not impractical, for firms to transfer it overseas than to store it locally. For example, South Korea and Turkey require firms to get explicit consent from people to transfer sensitive data like genomic data overseas. Doing this for hundreds or thousands of people adds considerable costs and complexity.

And the European Union’s General Data Protection Regulation encourages data localization as firms feel pressured to store and process personal data within the EU given the restrictions it places on data transfers to many countries. This is in addition to the renewed push for local data storage and processing under the EU’s new data strategy.

Countries rationalize these steps on the basis that health data, particularly genomic data, is sensitive. But requiring health data to be stored locally does little to increase privacy or data security. The confidentiality of data does not depend on which country the information is stored in, only on the measures used to store it securely, such as via encryption, and the policies and procedures the firms follow in storing or analyzing the data. For example, if a nation has limits on the use of genomic data, then domestic organizations using that data face the same restrictions, whether they store the data in the country or outside of it. And if they share the data with other organizations, they must require those organizations, regardless of where they are located, to abide by the home government’s rules.

As such, policymakers need to stop treating health data differently when it comes to cross-border movement, and instead build technical, legal, and ethical protections into both domestic and international data-governance mechanisms, which together allow the responsible sharing and transfer of health and genomic data.

This is clearly possible—and needed. In February 2020, leading health researchers called for an international code of conduct for genomic data following the end of their first-of-its-kind international data-driven research project. The project used a purpose-built cloud service that stored 800 terabytes of genomic data on 2,658 cancer genomes across 13 data centers on three continents. The collaboration and use of cloud computing were transformational in enabling large-scale genomic analysis….(More)”.

The Analog City and the Digital City


L. M. Sacasas at The New Atlantis: “…The challenges we are facing are not merely the bad actors, whether they be foreign agents, big tech companies, or political extremists. We are in the middle of a deep transformation of our political culture, as digital technology is reshaping the human experience at both an individual and a social level. The Internet is not simply a tool with which we do politics well or badly; it has created a new environment that yields a different set of assumptions, principles, and habits from those that ordered American politics in the pre-digital age.

We are caught between two ages, as it were, and we are experiencing all of the attendant confusion, frustration, and exhaustion that such a liminal state involves. To borrow a line from the Marxist thinker Antonio Gramsci, “The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.”

Although it’s not hard to see how the Internet, given its scope, ubiquity, and closeness to human life, radically reshapes human consciousness and social structures, that does not mean that the nature of that reshaping is altogether preordained or that it will unfold predictably and neatly. We must then avoid crassly deterministic just-so stories, and this essay is not an account of how digital media will necessarily change American politics irrespective of competing ideologies, economic forces, or already existing political and cultural realities. Rather, it is an account of how the ground on which these realities play out is shifting. Communication technologies are the material infrastructure on which so much of the work of human society is built. One cannot radically transform that infrastructure without radically altering the character of the culture built upon it. As Neil Postman once put it, “In the year 1500, fifty years after the printing press was invented, we did not have old Europe plus the printing press. We had a different Europe.” So, likewise, we may say that in the year 2020, fifty years after the Internet was invented, we do not have old America plus the Internet. We have a different America….(More)”.

Polycentric governance and policy advice: lessons from Whitehall policy advisory systems


Paper by Patrick Diamond: “In countries worldwide, the provision of policy advice to central governments has been transformed by the deinstitutionalisation of policymaking, which has engaged a diverse range of actors in the policy process. Scholarship should therefore address the impact of deinstitutionalisation in terms of the scope and scale of policy advisory systems, as well as in terms of the influence of policy advisors. This article addresses this gap, presenting a programme of research on policy advice in Whitehall. Building on Craft and Halligan’s conceptualisation of a ‘policy advisory system’, it argues that in an era of polycentric governance, policy advice is shaped by ‘interlocking actors’ beyond government bureaucracy, and that the pluralisation of advisory bodies marginalises the civil service. The implications of such alterations are considered against the backdrop of governance changes, particularly the hybridisation of institutions, which has made policymaking processes complex, prone to unpredictability and at risk of policy blunders….(More)”.

Models v. Evidence


Jonathan Fuller at the Boston Review: “COVID-19 has revealed a contest between two competing philosophies of scientific knowledge. To manage the crisis, we must draw on both….The lasting icon of the COVID-19 pandemic will likely be the graphic associated with “flattening the curve.” The image is now familiar: a skewed bell curve measuring coronavirus cases that towers above a horizontal line—the health system’s capacity—only to be flattened by an invisible force representing “non-pharmaceutical interventions” such as school closures, social distancing, and full-on lockdowns.

How do the coronavirus models generating these hypothetical curves square with the evidence? What roles do models and evidence play in a pandemic? Answering these questions requires reconciling two competing philosophies in the science of COVID-19.

To some extent, public health epidemiology and clinical epidemiology are distinct traditions in health care, competing philosophies of scientific knowledge.

In one camp are infectious disease epidemiologists, who work very closely with institutions of public health. They have used a multitude of models to create virtual worlds in which sim viruses wash over sim populations—sometimes unabated, sometimes held back by a virtual dam of social interventions. This deluge of simulated outcomes played a significant role in leading government actors to shut borders as well as doors to schools and businesses. But the hypothetical curves are smooth, while real-world data are rough. Some detractors have questioned whether we have good evidence for the assumptions the models rely on, and even the necessity of the dramatic steps taken to curb the pandemic. Among this camp are several clinical epidemiologists, who typically provide guidance for clinical practice—regarding, for example, the effectiveness of medical interventions—rather than public health.

The latter camp has won significant media attention in recent weeks. Bill Gates—whose foundation funds the research behind the most visible outbreak model in the United States, developed by the Institute for Health Metrics and Evaluation (IHME) at the University of Washington—worries that COVID-19 might be a “once-in-a-century pandemic.” A notable detractor from this view is Stanford’s John Ioannidis, a clinical epidemiologist, meta-researcher, and reliable skeptic who has openly wondered whether the coronavirus pandemic might rather be a “once-in-a-century evidence fiasco.” He argues that better data are needed to justify the drastic measures undertaken to contain the pandemic in the United States and elsewhere.

Ioannidis claims, in particular, that our data about the pandemic are unreliable, leading to exaggerated estimates of risk. He also points to a systematic review published in 2011 of the evidence regarding physical interventions that aim to reduce the spread of respiratory viruses, worrying that the available evidence is nonrandomized and prone to bias. (A systematic review specific to COVID-19 has now been published; it concurs that the quality of evidence is “low” to “very low” but nonetheless supports the use of quarantine and other public health measures.) According to Ioannidis, the current steps we are taking are “non-evidence-based.”…(More)”.

An Artificial Revolution: On Power, Politics and AI


Book by Ivana Bartoletti: “AI has unparalleled transformative potential to reshape society but without legal scrutiny, international oversight and public debate, we are sleepwalking into a future written by algorithms which encode regressive biases into our daily lives. As governments and corporations worldwide embrace AI technologies in pursuit of efficiency and profit, we are at risk of losing our common humanity: an attack that is as insidious as it is pervasive.

Leading privacy expert Ivana Bartoletti exposes the reality behind the AI revolution, from the low-paid workers who train algorithms to recognise cancerous polyps, to the rise of data violence and the symbiotic relationship between AI and right-wing populism.

Impassioned and timely, An Artificial Revolution is an essential primer to understand the intersection of technology and geopolitical forces shaping the future of civilisation, and the political response that will be required to ensure the protection of democracy and human rights….(More)”.

10 transformative data questions related to gender


Press Release: “As part of efforts to identify priorities across sectors in which data and data science could make a difference, The Governance Lab (The GovLab) at the New York University Tandon School of Engineering has partnered with Data2X, the gender data alliance housed at the United Nations Foundation, to release ten pressing questions on gender that experts have determined can be answered using data. Members of the public are invited to share their views and vote to help develop a data agenda on gender.

The questions are part of the 100 Questions Initiative, an effort to identify the most important societal questions that can be answered by data. The project relies on an innovative process of sourcing “bilinguals,” individuals with both subject-matter and data expertise, who in this instance provided questions related to gender they considered to be urgent and answerable. The results span issues of labor, health, climate change, and gender-based violence.

Through the initiative’s new online platform, anyone can now vote on what they consider to be the most pressing, data-related questions about gender that researchers and institutions should prioritize. Through voting, the public can steer the conversation and determine which topics should be the subject of data collaboratives, an emerging form of collaboration that allows organizations from different sectors to exchange data to create public value.

The GovLab has conducted significant research on the value and practice of data collaboratives, and its research shows that inter-sectoral collaboration can both increase access to data as well as unleash the potential of that data to serve the public good.

Data2X supported the 100 Questions Initiative by providing expertise and connecting The GovLab with relevant communities, events, and resources. The initiative helped inform Data2X’s “Big Data, Big Impact? Towards Gender-Sensitive Data Systems” report, which identifies gaps of information on gender equality across key policy domains.

“Asking the right questions is a critical first step in fostering data production and encouraging data use to truly meet the unique experiences and needs of women and girls,” said Emily Courey Pryor, executive director of Data2X. “Obtaining public feedback is a crucial way to identify the most urgent questions — and to ultimately incentivize investment in gender data collection and use to find the answers.”

“Sourcing and prioritizing questions related to gender can inform resource and funding allocation to address gender data gaps and support projects with the greatest potential impact,” said Stefaan Verhulst, co-founder and chief research and development officer at The GovLab. “This way, we can be confident about solutions that address the challenges facing women and girls.”…(More)”.

10 Tips for Making Sense of COVID-19 Models for Decision-Making


Elizabeth Stuart et al at Johns Hopkins School of Public Health: “Models can be enormously useful in the context of an epidemic if they synthesize evidence and scientific knowledge. The COVID-19 pandemic is a complex phenomenon and in such a dynamic setting it is nearly impossible to make informed decisions without the assistance models can provide. However, models don’t perfectly capture reality: They simplify reality to help answer specific questions.

Below are 10 tips for making sense of COVID-19 models for decision-making such as directing health care resources to certain areas or identifying how long social distancing policies may need to be in effect.

Figure: Flattening the Curve for COVID-19
  1. Make sure the model fits the question you are trying to answer.
    There are many different types of models and a wide variety of questions they can be used to address. Three types can be helpful for COVID-19:
    1. Models that simplify how complex systems work, such as disease transmission. This is often done by putting people into compartments related to how a disease spreads, like “susceptible,” “infected,” and “recovered.” These models can be overly simplistic, with few data inputs, and don’t allow for the uncertainty that exists in a pandemic, but they can be useful in the short term for understanding basic structures. They generally cannot, however, be implemented in ways that account for complex systems or for ongoing system-level or individual behavioral change.
    2. Forecasting models try to predict what will actually happen. They work by using existing data to project out conclusions over a relatively short time horizon. But these models are challenging to use for mid-term assessment—like a few months out—because of the evolving nature of pandemics.
    3. Strategic models show multiple scenarios to consider the potential implications of different interventions and contexts. These models try to capture some of the uncertainty about the underlying disease processes and behaviors. They might take a few values of key parameters, such as the case fatality ratio or the effectiveness of social distancing measures, and play out different scenarios for disease spread over time. These kinds of models can be particularly useful for decision-making.
  2. Be mindful that forecast models are often built with the goal of change, which affects their shelf life.
    The irony of much COVID-19 modeling is that in some cases, especially for forecasting, a key purpose in building and disseminating a model is to invoke behavior change at the individual or system level—e.g., to reinforce the need for physical distancing.

    This makes it difficult to assess the performance of forecasting models since the results of the model itself (and reactions to it) become part of the system. In these cases, a forecasting model may look like it was inaccurate, but it may have been accurate for an unmitigated scenario with no behavior change. In fact, a public health success may be when the forecasts do not come to be!
  3. Look for models (and underlying collaborations) that include diverse aspects and expertise.
    One of the challenges in modeling COVID-19 is the multitude of factors involved: infectious disease dynamics, social and behavioral factors such as how frequently individuals interact, economic factors such as employment and safety net policies, and more.

    One benefit is that we do know that COVID-19 is an infectious disease and we have a good understanding about how related diseases spread. Likewise, health economists and public health experts have years of experience understanding complex social systems. Look for models, and their underlying collaborations, that take advantage of that breadth of existing knowledge….(More)”.
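The compartmental models described in the first tip can be illustrated with a minimal discrete-time SIR simulation. This is only a toy sketch of the idea, not any of the models discussed above: the population size, transmission rate (`beta`), and recovery rate (`gamma`) are illustrative assumptions, not fitted estimates.

```python
# Minimal discrete-time SIR model: people move from "susceptible" (S)
# to "infected" (I) to "recovered" (R). All parameters are illustrative.

def simulate_sir(beta, gamma=0.1, days=300, n=1_000_000, i0=100):
    """Return the daily number of currently infected people."""
    s, i, r = n - i0, float(i0), 0.0
    infected = []
    for _ in range(days):
        new_infections = beta * s * i / n   # contacts between S and I
        new_recoveries = gamma * i          # share of I recovering each day
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        infected.append(i)
    return infected

baseline = simulate_sir(beta=0.30)   # unmitigated spread
mitigated = simulate_sir(beta=0.15)  # contact rate halved by interventions

print(f"peak infections, unmitigated: {max(baseline):,.0f}")
print(f"peak infections, mitigated:   {max(mitigated):,.0f}")
```

Halving `beta` lowers and delays the epidemic peak (the "flattened curve"), but, as the tips note, a compartmental toy like this ignores behavioral change and much of the real-world uncertainty.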

A call for a new generation of COVID-19 models


Blog post by Alex Engler: “Existing models have been valuable, but they were not designed to support these types of critical decisions. A new generation of models that estimate the risk of COVID-19 spread for precise geographies—at the county or even more localized level—would be much more informative for these questions. Rather than produce long-term predictions of deaths or hospital utilization, these models could estimate near-term relative risk to inform local policymaking. Going forward, governors and mayors need local, current, and actionable numbers.

Broadly speaking, better models would substantially aid in the “adaptive response” approach to re-opening the economy. In this strategy, policymakers cyclically loosen and re-tighten restrictions, attempting to work back towards a healthy economy without moving so fast as to allow infections to take off again. In an ideal process, restrictions would be eased at a pace that balances a swift return to normalcy with reducing total COVID-19 infections. Of course, this is impossible in practice, and thus some continued adjustments—the flipping of various controls off and on again—will be necessary. More precise models can help improve this process, providing another lens into when it will be safe to relax restrictions, thus making it easier to do without a disruptive back-and-forth. A more-or-less continuous easing of restrictions is especially valuable, since it is unlikely that second or third rounds of interventions (such as social distancing) would achieve the same high rates of compliance as the first round.
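The cyclical loosen-and-tighten dynamic described above can be sketched as a simple threshold controller layered on a toy epidemic model. Everything here is a hypothetical illustration: the thresholds, transmission rates, and recovery rate are assumptions chosen only to show the on/off behavior, not values from any real model.

```python
# Toy "adaptive response": restrictions switch on when active infections
# exceed an upper threshold and off when they fall below a lower one.
# All parameter values are illustrative assumptions.

def adaptive_response(days=400, n=1_000_000, i0=100,
                      beta_open=0.30, beta_closed=0.08, gamma=0.1,
                      upper=20_000, lower=5_000):
    s, i = n - i0, float(i0)
    restricted = False
    history = []                      # (infected, restrictions_on) per day
    for _ in range(days):
        if i > upper:                 # hysteresis: tighten above `upper`,
            restricted = True         # loosen only once below `lower`
        elif i < lower:
            restricted = False
        beta = beta_closed if restricted else beta_open
        new_infections = beta * s * i / n
        s -= new_infections
        i += new_infections - gamma * i
        history.append((i, restricted))
    return history

history = adaptive_response()
peak = max(i for i, _ in history)
toggles = sum(1 for a, b in zip(history, history[1:]) if a[1] != b[1])
print(f"peak infections: {peak:,.0f}; restriction toggles: {toggles}")
```

The hysteresis band (loosening only well below the tightening threshold) is what limits the disruptive day-to-day flip-flopping the post warns about; wider bands produce fewer, longer cycles, which is where localized risk models could help tune the thresholds.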

The proliferation of COVID-19 data

These models can incorporate cases, test-positive rates, hospitalization information, deaths, excess deaths, and other known COVID-19 data. While all these data sources are incomplete, an expanding body of research on COVID-19 is making the data more interpretable. This research will become progressively more valuable with more data on the spread of COVID-19 in the U.S. rather than data from other countries or past pandemics.

Further, a broad range of non-COVID-19 data can also inform risk estimates: Population density, age distributions, poverty and uninsured rates, the number of essential frontline workers, and co-morbidity factors can also be included. Community mobility reports from Google and Unacast’s social distancing scorecard can identify how easing restrictions is changing behavior. Small area estimates also allow the models to account for the risk of spread from other nearby geographies. Geospatial statistics cannot account for infectious spread between two large neighboring states, but they would add value for adjacent zip codes. Lastly, many more data sources are in the works, like open patient data registries, the National Institutes of Health’s (NIH) study of asymptomatic persons, self-reported symptoms data from Facebook, and (potentially) new randomized surveys. In fact, there are so many diverse and relevant data streams that models can add value simply by consolidating daily information into just a few top-line numbers that are comparable across the nation.

FiveThirtyEight has effectively explained that making these models is tremendously difficult due to incomplete data, especially since the U.S. is not testing enough or in statistically valuable ways. These challenges are real, but decision-makers are currently using this same highly flawed data to make inferences and policy choices. Despite the many known problems, elected officials and public health services have no choice. Frequently, they are evaluating the data without the time and expertise to make reasoned statistical interpretations based on epidemiological research, leaving significant opportunity for modeling to help….(More)”.