The Food Systems Dashboard


About: “The Food Systems Dashboard combines data from multiple sources to give users a complete view of food systems. Users can compare components of food systems across countries and regions. They can also identify and prioritize ways to sustainably improve diets and nutrition in their food systems.

Dashboards are useful tools that help users visualize and understand key information for complex systems. Users can track progress to see if policies or other interventions are working at a country or regional level.

In recent years, the public health and nutrition communities have used dashboards to track the progress of health goals and interventions, including the Sustainable Development Goals. To our knowledge, this is the first dashboard that collects country-level data across all components of the food system.

The Dashboard contains over 150 indicators that measure components, drivers, and outcomes of food systems at the country level. As new indicators and data become available, the Dashboard will be updated. Most data used for the Dashboard is open source and available to download directly from the website. Data is pooled from FAO, Euromonitor International, World Bank, and other global and regional data sources….(More)”.

Saving Our Oceans: Scaling the Impact of Robust Action Through Crowdsourcing


Paper by Amanda J. Porter, Philipp Tuertscher, and Marleen Huysman: “One approach for tackling grand challenges that is gaining traction in recent management literature is robust action: by allowing diverse stakeholders to engage with novel ideas, initiatives can cultivate successful ideas that yield greater impact. However, a potential pitfall of robust action is the length of time it takes to generate momentum. Crowdsourcing, we argue, is a valuable tool that can scale the generation of impact from robust action.

We studied an award‐winning environmental sustainability crowdsourcing initiative and found that robust action principles were indeed successful in attracting a diverse stakeholder network to generate novel ideas and develop them into sustainable solutions. Yet we also observed that the momentum and novelty generated were at risk of being lost as the actors and their roles changed frequently throughout the process. We show the vital importance of robust action principles for connecting ideas and actors across crowdsourcing phases. These observations allow us to contribute to extant theory by explaining the micro‐dynamics of scaling robust action’s impact over time…(More)”.

Narrative Change: How Changing the Story Can Transform Society, Business, and Ourselves


Book by Hans Hansen: “Texas prosecutors are powerful: in cases where they seek capital punishment, the defendant is sentenced to death over ninety percent of the time. When management professor Hans Hansen joined Texas’s newly formed death penalty defense team to rethink their approach, they faced almost insurmountable odds. Yet while Hansen was working with the office, they won seventy of seventy-one cases by changing the narrative for death penalty defense. To date, they have succeeded in preventing well over one hundred executions—demonstrating the importance of changing the narrative to change our world.

In this book, Hansen offers readers a powerful model for creating significant organizational, social, and institutional change. He unpacks the lessons of the fight to change capital punishment in Texas—juxtaposing life-and-death decisions with the efforts to achieve a cultural shift at Uber. Hansen reveals how narratives shape our everyday lives and how we can construct new narratives to enact positive change. This narrative change model can be used to transform corporate cultures, improve public services, encourage innovation, craft a brand, or even develop your own leadership.

Narrative Change provides an unparalleled window into an innovative model of change while telling powerful stories of a fight against injustice. It reminds us that what matters most for any organization, community, or person is the story we tell about ourselves—and the most effective way to shake things up is by changing the story….(More)”.

Policy Priority Inference


Turing Institute: “…Policy Priority Inference builds on a behavioural computational model, taking into account the learning process of public officials, coordination problems, incomplete information, and imperfect governmental monitoring mechanisms. The approach is a unique mix of economic theory, behavioural economics, network science and agent-based modelling. The data that feeds the model for a specific country (or a sub-national unit, such as a state) includes measures of the country’s development indicators (DIs) and how they have moved over the years, specified government policy goals in relation to DIs, the quality of government monitoring of expenditure, and the quality of the country’s rule of law.

From these data alone – and, crucially, with no specific information on government expenditure, which is rarely made available – the model can infer the transformative resources a country has historically allocated toward its Sustainable Development Goals (SDGs), and assess the importance of the interlinkages between DIs. Importantly, it can also reveal where previously hidden inefficiencies lie.

How does it work? The researchers modelled the socioeconomic mechanisms of the policy-making process using agent-computing simulation. They created a simulator featuring an agent called “Government”, which makes decisions about how to allocate public expenditure, and agents called “Bureaucrats”, each of which is essentially a policy-maker linked to a single DI. If a Bureaucrat is allocated some resource, they will use a portion of it to improve their DI, with the rest lost to some degree of inefficiency (in reality, inefficiencies range from simple corruption to poor quality policies and inefficient government departments).

How much resource a Bureaucrat puts towards moving their DI depends on that agent’s experience: if becoming inefficient pays off, they’ll keep doing it. During the process, Government monitors the Bureaucrats, occasionally punishing inefficient ones, who may then improve their behaviour. In the model, a Bureaucrat’s chances of getting caught are linked to the quality of a government’s real-world monitoring of expenditure, and the extent to which they are punished reflects the strength of that country’s rule of law.

Diagram of the Policy Priority Inference model
Using data on a country or state’s development indicators and its governance, Policy Priority Inference techniques can model how a government and its policy-makers allocate “transformational resources” to reach their sustainable development goals.
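
The Government-and-Bureaucrats mechanism described above lends itself to a simple agent-based sketch. The Python code below is a deliberately minimal illustration of the general idea, not the researchers’ actual model: the parameter values, the learning rule, and the uniform budget allocation are all assumptions made for the example.

```python
import random

class Bureaucrat:
    """A policy-maker linked to a single development indicator (DI).
    The numbers and learning rule here are illustrative assumptions."""
    def __init__(self):
        self.indicator = 0.0    # current level of this agent's DI
        self.efficiency = 0.5   # share of allocated resources spent on the DI
        self.benefit = 0.0      # cumulative gain from diverted resources

    def receive(self, resource):
        # A portion improves the DI; the rest is lost to inefficiency.
        self.indicator += self.efficiency * resource
        self.benefit += (1 - self.efficiency) * resource

    def learn(self, caught, rule_of_law):
        if caught:
            # Punishment, scaled by the rule of law, improves behaviour.
            self.efficiency = min(1.0, self.efficiency + 0.1 * rule_of_law)
        elif self.benefit > 0:
            # If diversion keeps paying off, the agent grows less efficient.
            self.efficiency = max(0.0, self.efficiency - 0.02)

class Government:
    """Allocates a budget and occasionally punishes inefficient Bureaucrats."""
    def __init__(self, n_agents, monitoring, rule_of_law, seed=0):
        self.rng = random.Random(seed)
        self.monitoring = monitoring    # probability of catching an inefficient agent
        self.rule_of_law = rule_of_law  # strength of punishment when caught
        self.agents = [Bureaucrat() for _ in range(n_agents)]

    def step(self, budget):
        share = budget / len(self.agents)  # naive uniform allocation
        for agent in self.agents:
            agent.receive(share)
            caught = agent.efficiency < 0.5 and self.rng.random() < self.monitoring
            agent.learn(caught, self.rule_of_law)

    def run(self, steps, budget):
        for _ in range(steps):
            self.step(budget)
        return [agent.indicator for agent in self.agents]
```

Running two otherwise identical governments, one with strong monitoring and rule of law and one with weak monitoring, shows the indicators advancing faster under the former. That qualitative link between governance quality, inefficiency, and DI movement is what the real model exploits when inferring historical allocations from observed indicator dynamics.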

When the historical movements of a country’s DIs are reproduced through the internal workings of the model, the researchers have a powerful proxy for the real-world relationships between government activity, the movement of DIs, and the effects of the interlinkages between DIs, all of which are unique to that country. “Once we can match outcomes, we can discern something that’s going on in reality. But the fact that the method is matching the dynamics of real-world development indicators is just one of multiple ways that we validate our results,” Guerrero notes. This proxy can then be used to project which policy areas should be prioritised in future to best achieve the government’s specified development goals, including predictions of likely timescales.

What’s more, in combination with techniques from evolutionary computation, the model can identify DIs that are linked to large positive spillover effects. These DIs are dubbed “accelerators”. Targeting government resources at such development accelerators fosters not only more rapid results, but also more generalised development…(More)”.

Using Data for COVID-19 Requires New and Innovative Governance Approaches


Stefaan G. Verhulst and Andrew Zahuranec at Data & Policy blog: “There has been a rapid increase in the number of data-driven projects and tools released to contain the spread of COVID-19. Over the last three months, governments, tech companies, civic groups, and international agencies have launched hundreds of initiatives. These efforts range from simple visualizations of public health data to complex analyses of travel patterns.

When designed responsibly, data-driven initiatives could provide the public and their leaders the ability to be more effective in addressing the virus. The Atlantic and the New York Times have both published work that relies on innovative data use. These and other examples, detailed in our #Data4COVID19 repository, can fill vital gaps in our understanding and allow us to better respond to and recover from the crisis.

But data is not without risk. Collecting, processing, analyzing, and using any type of data, no matter how good the intentions of its users, can lead to harmful ends. Vulnerable groups can be excluded. Analysis can be biased. Data use can reveal sensitive information about people and locations. To address these hazards, organizations need to be intentional in how they work throughout the data lifecycle.

Decision Provenance: Documenting decisions and decision makers across the Data Life Cycle

Unfortunately, the individuals and teams responsible for making these design decisions at each critical point of the data lifecycle are rarely identified or recognized by those interacting with these data systems.

This lack of visibility into the origins of these decisions can undermine professional accountability. It can also limit actors’ ability to identify the optimal intervention points for mitigating data risks, and to avoid missing out on potentially impactful uses of data. Tracking decision provenance is essential.

As Jatinder Singh, Jennifer Cobbe, and Chris Norval of the University of Cambridge explain, decision provenance refers to tracking and recording decisions about the collection, processing, sharing, analyzing, and use of data. It involves instituting mechanisms to force individuals to explain how and why they acted. It is about using documentation to provide transparency and oversight in the decision-making process for everyone inside and outside an organization.
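
As a purely hypothetical illustration of what such a decision record might look like in code (this is not The GovLab’s tool, nor an API from the Cambridge authors; all field names are assumptions), a minimal sketch could be:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    """One documented decision at a point in the data lifecycle."""
    stage: str       # e.g. "collection", "processing", "sharing", "analysis", "use"
    decision: str    # what was decided
    rationale: str   # why it was decided: the "how and why" explanation
    decided_by: str  # the accountable individual or team
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class DecisionLog:
    """An append-only log providing transparency across the data lifecycle."""
    def __init__(self):
        self._records = []

    def record(self, stage, decision, rationale, decided_by):
        rec = ProvenanceRecord(stage, decision, rationale, decided_by)
        self._records.append(rec)
        return rec

    def by_stage(self, stage):
        """Retrieve every documented decision for one lifecycle stage."""
        return [r for r in self._records if r.stage == stage]
```

The point of the sketch is the shape of the record: every decision carries its stage, its rationale, and a named accountable party, so that anyone inside or outside the organization can reconstruct how and why the data was handled as it was.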

Toward that end, The GovLab at NYU Tandon developed the Decision Provenance Mapping. We designed this tool for designated data stewards tasked with coordinating the responsible use of data across organizational priorities and departments….(More)”

Canadian smart cities: Are we wiring new citizen‐local government interactions?


Paper by Peter A. Johnson, Albert Acedo and Pamela J. Robinson: “Governments around the world are developing smart city projects, with the aim to realize diverse goals of increased efficiency, sustainability, citizen engagement, and improved delivery of services. The processes through which these projects are conceptualized vary dramatically, with potential implications for how citizens are involved or engaged.

This research examines the 20 finalists in the Canadian Smart Cities Challenge, a Canadian federal government contest held from 2017 to 2019 to disburse funding in support of smart city projects. We analyzed each of the finalist proposals, coding all instances of citizen engagement used to develop the proposal. A significant majority of the proposals used traditional types of citizen engagement, notably citizen meetings, round tables, and workshops, to develop their smart city plans. We also noted the use of transactional forms of citizen engagement, such as apps, and the use of social media. Despite the general rhetoric of innovation in the development of smart cities, this research finds that citizens are most commonly engaged in traditional ways. This research provides cues for governments that are developing smart city projects, placing an emphasis on the importance of the process of smart city development, and not simply the product….(More)”.

Standards and Innovations in Information Technology and Communications


Book by Dina Šimunić and Ivica Pavić: “This book gives a thorough explanation of standardization, its processes, its life cycle, and its related organizations at the national, regional, and global levels. It provides readers with insight into the interaction cycle between standardization organizations, government, industry, and consumers. Readers gain a clear view of the standardization and innovation processes, the life cycles of standards and innovations, and the related organizations in the field of information and communications technologies. The book introduces the reader to the perpetual interplay of standards and innovation that underpins the modern world.

  • Provides a thorough explanation of standardization and innovation in relation to communications engineering and information technology
  • Discusses the standardization and innovation processes and organizations on global, regional, and national levels
  • Interconnects standardization and innovation, showing the perpetual life-cycle that is the basis of technology progress…(More)”.

Why open science is critical to combatting COVID-19


Article by the OECD: “…In January 2020, 117 organisations – including journals, funding bodies, and centres for disease prevention – signed a statement titled “Sharing research data and findings relevant to the novel coronavirus outbreak”, committing to provide immediate open access for peer-reviewed publications at least for the duration of the outbreak, to make research findings available via preprint servers, and to share results immediately with the World Health Organization (WHO). This was followed in March by the Public Health Emergency COVID-19 Initiative, launched by 12 countries at the level of chief science advisors or equivalent, calling for open access to publications and machine-readable access to data related to COVID-19, which resulted in an even stronger commitment by publishers.

The Open COVID Pledge was launched in April 2020 by an international coalition of scientists, lawyers, and technology companies, and calls on authors to make all intellectual property (IP) under their control available, free of charge, and without encumbrances to help end the COVID-19 pandemic, and reduce the impact of the disease….

Remaining challenges

While clinical, epidemiological and laboratory data about COVID-19 is widely available, including genomic sequencing of the pathogen, a number of challenges remain:

  • Not all data is sufficiently findable, accessible, interoperable and reusable (FAIR).
  • Sources of data tend to be dispersed; even though many pooling initiatives are under way, curation needs to happen “on the fly”.
  • Personal health records need to be readily accessible for sharing, subject to the patient’s consent. Legislation aimed at fostering interoperability and avoiding information blocking is yet to be passed in many OECD countries. Access across borders is even more difficult under the current data protection frameworks of most OECD countries.
  • In order to achieve the dual objectives of respecting privacy while ensuring access to machine readable, interoperable and reusable clinical data, the Virus Outbreak Data Network (VODAN) proposes to create FAIR data repositories which could be used by incoming algorithms (virtual machines) to ask specific research questions.
  • In addition, many issues arise around the interpretation of data – this can be illustrated by the widely followed epidemiological statistics. Typically, the statistics concern “confirmed cases”, “deaths” and “recoveries”. Each of these items seems to be treated differently in different countries, and is sometimes subject to methodological changes within the same country.
  • Specific standards for COVID-19 data therefore need to be established, and this is one of the priorities of the UK COVID-19 Strategy. A working group within Research Data Alliance has been set up to propose such standards at an international level.
  • In some cases, it could be inferred that the transparency of the statistics may have led governments to restrict testing in order to limit the number of “confirmed cases” and avoid a rapid rise in the numbers. Lower testing rates can in turn reduce the efficiency of quarantine measures, lowering the overall effectiveness of combating the disease….(More)”.

How Humanitarian Blockchain Can Deliver Fair Labor to Global Supply Chains


Paper by Ashley Mehra and John G. Dale: “Blockchain technology in global supply chains has proven most useful as a tool for storing and keeping records of information or facilitating payments with increased efficiency. The use of blockchain to improve supply chains for humanitarian projects has mushroomed over the last five years; this increased popularity is in large part due to the potential for transparency and security that the design of the technology proposes to offer. Yet, we want to ask an important but largely unexplored question in the academic literature about the human rights of the workers who produce these “humanitarian blockchain” solutions: “How can blockchain help eliminate extensive labor exploitation issues embedded within our global supply chains?”

To begin to answer this question, we suggest that proposed humanitarian blockchain solutions must (1) re-purpose the technical affordances of blockchain to address relations of power that, sometimes unwittingly, exploit and prevent workers from collectively exercising their voice; (2) include legally or socially enforceable mechanisms that enable workers to meaningfully voice their knowledge of working conditions without fear of retaliation; and (3) re-frame our current understanding of human rights issues in the context of supply chains to include the labor exploitation within supply chains that produce and sustain the blockchain itself….(More)”.

System-wide Roadmap for Innovating UN Data and Statistics


Roadmap by the United Nations System: “Since 2018, the Secretary-General has pursued an ambitious agenda to prepare the UN System for the challenges of the 21st century. In lockstep with other structural UN reforms, he has launched a portfolio of initiatives through the CEB to help transform system-wide approaches to new technologies, innovation and data. Driven by the urgency and ambition of the “Decade of Action”, these initiatives are designed to nurture cross-cutting capabilities the UN System will need to deliver better “for people and planet”. Unlocking data and harnessing the potential of statistics will be critical to the success of UN reform.

Recognizing that data are a strategic asset for the UN System, the UN Secretary-General’s overarching Data Strategy sets out a vision for a “data ecosystem that maximizes the value of our data assets for our organizations and the stakeholders we serve”, including high-level objectives, principles, core workstreams and concrete system-wide data initiatives. The strategy signals that improving how we collect, manage, use and share data should be a cross-cutting strategic concern: across all pillars of the UN System, across programmes and operations, and across all levels of our organizations.

The System-wide Roadmap for Innovating UN Data and Statistics contributes to the overall objectives of the Secretary-General’s Data Strategy, which constitutes a framework supporting the Roadmap as a priority initiative. The two strategic plans converge around a vision that recognizes the power of data and stimulates the United Nations to embrace a more coherent and modern approach to data…(More)”.