Measuring the predictability of life outcomes with a scientific mass collaboration


Paper by Matthew J. Salganik et al.: “Hundreds of researchers attempted to predict six life outcomes, such as a child’s grade point average and whether a family would be evicted from their home. These researchers used machine-learning methods optimized for prediction, and they drew on a vast dataset that was painstakingly collected by social scientists over 15 years. However, no one made very accurate predictions. For policymakers considering using predictive models in settings such as criminal justice and child-protective services, these results raise a number of concerns. Additionally, researchers must reconcile the idea that they understand life trajectories with the fact that none of the predictions were very accurate….(More)”.
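
The challenge's core exercise, training prediction-optimized models on rich background data and scoring them on held-out cases, can be illustrated with a hedged sketch. The synthetic data, the off-the-shelf random forest and the R² scoring below are illustrative assumptions, not the challenge's actual data or submissions:

```python
# Minimal sketch (assumed setup, not the challenge's actual data or models):
# fit a prediction-optimized model on background variables and score it with
# R^2 on held-out cases, the usual way such predictive accuracy is summarized.
import numpy as np
from sklearn.ensemble import RandomForestRegressor   # pip install scikit-learn
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))            # stand-in for thousands of survey variables
y = X[:, 0] * 0.3 + rng.normal(size=1000)  # outcome (e.g. GPA) that is mostly noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("holdout R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
# Even a flexible model explains little variance when the outcome is mostly
# noise, which is the qualitative pattern the mass collaboration reported.
```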

How Humanitarian Blockchain Can Deliver Fair Labor to Global Supply Chains


Paper by Ashley Mehra and John G. Dale: “Blockchain technology in global supply chains has proven most useful as a tool for storing and keeping records of information or facilitating payments with increased efficiency. The use of blockchain to improve supply chains for humanitarian projects has mushroomed over the last five years; this increased popularity is in large part due to the potential for transparency and security that the design of the technology proposes to offer. Yet, we want to ask an important but largely unexplored question in the academic literature about the human rights of the workers who produce these “humanitarian blockchain” solutions: “How can blockchain help eliminate extensive labor exploitation issues embedded within our global supply chains?”

To begin to answer this question, we suggest that proposed humanitarian blockchain solutions must (1) re-purpose the technical affordances of blockchain to address relations of power that, sometimes unwittingly, exploit and prevent workers from collectively exercising their voice; (2) include legally or socially enforceable mechanisms that enable workers to meaningfully voice their knowledge of working conditions without fear of retaliation; and (3) re-frame our current understanding of human rights issues in the context of supply chains to include the labor exploitation within supply chains that produce and sustain the blockchain itself….(More)”.

Rapid Multi-Dimensional Impact Assessment of Floods


Paper by David Pastor-Escuredo et al.: “Natural disasters affect hundreds of millions of people worldwide every year. The impact assessment of a disaster is key to improving the response and mitigating how a natural hazard turns into a social disaster. An actionable quantification of impact must be integratively multi-dimensional. We propose a rapid impact assessment framework that comprises detailed geographical and temporal landmarks as well as the potential socio-economic magnitude of the disaster based on heterogeneous data sources: environmental sensor data, social media, remote sensing, digital topography, and mobile phone data.

As dynamics of floods greatly vary depending on their causes, the framework may support different phases of decision-making during the disaster management cycle. To evaluate its usability and scope, we explored four flooding cases with variable conditions. The results show that social media proxies provide a robust identification with daily granularity even when rainfall detectors fail. The detection also provides information on the magnitude of the flood, which is potentially useful for planning. Network analysis was applied to the social media data to extract patterns of social effects after the flood. This analysis showed significant variability in the obtained proxies, which encourages the scaling of schemes to comparatively characterize patterns across many floods with different contexts and cultural factors.

This framework is presented as a module of a larger data-driven system designed to be the basis for responsive and more resilient systems in urban and rural areas. The impact-driven approach presented may facilitate public–private collaboration and data sharing by providing real-time evidence with aggregated data to support the requests of private data with higher granularity, which is the current most important limitation in implementing fully data-driven systems for disaster response from both local and international actors…(More)”.
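
As a rough illustration of the social-media proxy and network analysis described above, the following sketch flags days with unusually many flood-related posts and builds a small mention network. The toy posts, keyword list and spike threshold are assumptions for illustration, not the paper's pipeline:

```python
# Minimal sketch (not the authors' pipeline): detecting a flood "signal" from
# daily counts of flood-related social media posts, then building a simple
# mention network to examine post-flood social structure.
from collections import Counter
from statistics import mean, stdev

import networkx as nx  # pip install networkx

posts = [
    # (date, author, mentioned_user_or_None, text) -- toy records
    ("2020-09-01", "ana", None, "sunny day in town"),
    ("2020-09-02", "ben", "ana", "river rising fast, flood warning issued"),
    ("2020-09-02", "cris", "ben", "streets flooded near the market"),
    ("2020-09-02", "dana", "cris", "flood damage to shops downtown"),
    ("2020-09-03", "ana", "dana", "cleanup after the flood begins"),
]
KEYWORDS = ("flood", "flooded", "inundation")

# 1) Daily counts of keyword-matching posts act as a proxy with daily granularity.
daily = Counter(d for d, _, _, text in posts
                if any(k in text.lower() for k in KEYWORDS))

# 2) Flag days whose count exceeds the baseline mean plus one standard deviation.
counts = [daily.get(d, 0) for d in sorted({p[0] for p in posts})]
baseline = mean(counts)
spread = stdev(counts) if len(counts) > 1 else 0.0
flood_days = [d for d, c in daily.items() if c > baseline + spread]

# 3) Mention network among users posting about the flood (proxy for social effects).
g = nx.DiGraph()
for _, author, mentioned, text in posts:
    if mentioned and any(k in text.lower() for k in KEYWORDS):
        g.add_edge(author, mentioned)

print("flood-flagged days:", flood_days)
print("most-mentioned users:", sorted(g.in_degree, key=lambda x: -x[1])[:3])
```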

How Covid-19 Is Accelerating the Rise of Digital Democracy


Blog post by Rosie Beacon: “Covid-19 has created an unprecedented challenge for parliaments and legislatures. Social distancing and restrictions on movement have forced parliaments to consider new methods of scrutiny, debate, and voting. The immediate challenge was simply to replicate existing procedures remotely, but the crisis has presented a unique window of opportunity to innovate.

As policymakers slowly transition back to “normal”, they should not easily dismiss the potential of this new relationship between democracy and technology. Parliamentarians should use what they’ve learned and the expertise of the democracy tech and deliberative democracy community to build greater trust in public institutions and open up traditional processes to wider deliberation, bringing people closer to the source of democratic power.

This note sets out some of the most interesting examples of crisis-led parliamentary innovation from around the world and combines it with some of the lessons we already know from democracy and deliberative tech to chart a way forward.

There are five core principles political leaders should embrace from this great experiment in digital parliamentary democracy:

  1. Discover and adopt: The world’s parliaments and legislatures have been through the same challenge. This is an opportunity to learn and improve democratic engagement in the long term.
  2. Experiment with multiple tools: There is no one holistic approach to applying digital tools in any democracy. Some will work, others will fail – technology does not promise infallibility.
  3. Embrace openness: Where things can be open, experiment with using this to encourage open dialogue and diversify ideas in the democratic and representative process.
  4. Don’t start from scratch: Learn how the deliberative democracy community is already using technology to help remake representative systems and better connect to communities.
  5. Use multi-disciplinary approaches: Create diverse teams, with diverse skill sets. Build flexible tools that meet today’s needs of democracies, citizens and representatives.

Approaches From Around the World

The approaches globally to Covid-19 continuity have been varied depending on the geographical, political and social context, but they generally follow one of these scenarios:

  1. Replicating everything using digital tools – Welsh Assembly, Crown dependencies (Jersey, Isle of Man), Brazil
    • Using technology in every way possible to continue the current parliamentary agenda online.
  2. Moving priority processes online, deprioritising the rest – French National Assembly, New Zealand, Canada
    • No physical presence in parliaments and prioritising the most important elements of the current parliamentary agenda, usually Covid-19-related legislation, to adapt for online continuation.
  3. Shifting what you can online while maintaining a minimal physical parliament – Denmark, Germany, UK
    • Hybrid parliaments appear to be a popular choice for larger parliaments. This generally allows for the parliamentary agenda to continue with amendments to how certain procedures are conducted.
  4. Reducing need for physical attendance and moving nothing online – Ireland, Sweden
    • Houses can continue to sit in quorum (an agreed proportion of MPs representative of overall party representation), but certain parts of the legislative agenda have been suspended for the time being….(More)”.

System-wide Roadmap for Innovating UN Data and Statistics


Roadmap by the United Nations System: “Since 2018, the Secretary-General has pursued an ambitious agenda to prepare the UN System for the challenges of the 21st century. In lockstep with other structural UN reforms, he has launched a portfolio of initiatives through the CEB to help transform system-wide approaches to new technologies, innovation and data. Driven by the urgency and ambition of the “Decade of Action”, these initiatives are designed to nurture cross-cutting capabilities the UN System will need to deliver better “for people and planet”. Unlocking data and harnessing the potential of statistics will be critical to the success of UN reform.

Recognizing that data are a strategic asset for the UN System, the UN Secretary-General’s overarching Data Strategy sets out a vision for a “data ecosystem that maximizes the value of our data assets for our organizations and the stakeholders we serve”, including high-level objectives, principles, core workstreams and concrete system-wide data initiatives. The strategy signals that improving how we collect, manage, use and share data should be a cross-cutting strategic concern: across all pillars of the UN System, across programmes and operations, and across all levels of our organizations.

The System-wide Roadmap for Innovating UN Data and Statistics contributes to the overall objectives of the Secretary-General’s Data Strategy, which constitutes a framework supporting the Roadmap as a priority initiative. The two strategic plans converge around a vision that recognizes the power of data and stimulates the United Nations to embrace a more coherent and modern approach to data…(More)”.

Removing the pump handle: Stewarding data at times of public health emergency


Reema Patel at Significance: “There is a saying, incorrectly attributed to Mark Twain, that states: “History never repeats itself, but it rhymes”. Seeking to understand the implications of the current crisis for the effective use of data, I’ve drawn on the nineteenth-century cholera outbreak in London’s Soho to identify some “rhyming patterns” that might inform our approaches to data use and governance at this time of public health crisis.

Where better to begin than with the work of Victorian pioneer John Snow? In 1854, Snow’s use of a dot map to illustrate clusters of cholera cases around public water pumps, and of statistics to establish the connection between the quality of water sources and cholera outbreaks, led to a breakthrough in public health interventions – and, famously, the removal of the handle of a water pump in Broad Street.

Data is vital

We owe a lot to Snow, especially now. His example teaches us that data has a central role to play in saving lives, and that the effective use of (and access to) data is critical for enabling timely responses to public health emergencies.

Take, for instance, transport app CityMapper’s rapid redeployment of its aggregated transport data. In the early days of the Covid-19 pandemic, this formed part of an analysis of compliance with social distancing restrictions across a range of European cities. There is also the US-based health weather map, which uses anonymised and aggregated data to visualise fever, specifically influenza-like illnesses. This data helped model early indications of where, and how quickly, Covid-19 was spreading….
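
As a rough illustration of how such aggregated, privacy-preserving mobility indicators can be derived, the sketch below turns raw location pings into a daily mobility index relative to a baseline. The field names, toy records and baseline window are assumptions, not CityMapper's actual method:

```python
# Minimal sketch (illustrative only): turning raw location pings into an
# aggregated daily "mobility index" relative to a pre-pandemic baseline,
# so individual trips are never reported.
from collections import defaultdict
from datetime import date

pings = [
    # (user_id, day) -- one record per observed trip; toy data
    ("u1", date(2020, 2, 3)), ("u2", date(2020, 2, 3)), ("u3", date(2020, 2, 3)),
    ("u1", date(2020, 3, 23)), ("u2", date(2020, 3, 23)),
]

# Aggregate: count trips per day, never exposing per-user trajectories.
trips_per_day = defaultdict(int)
for _, day in pings:
    trips_per_day[day] += 1

# Baseline = average daily trips over a pre-lockdown reference window (assumed here).
baseline_days = [d for d in trips_per_day if d < date(2020, 3, 1)]
baseline = sum(trips_per_day[d] for d in baseline_days) / len(baseline_days)

# Mobility index: a day's trips as a share of the baseline (1.0 = normal).
index = {d: trips_per_day[d] / baseline for d in trips_per_day if d >= date(2020, 3, 1)}
print({d.isoformat(): round(v, 2) for d, v in index.items()})  # e.g. {'2020-03-23': 0.67}
```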

Ethics and human rights still matter

As the current crisis evolves, many have expressed concern that the pandemic will be used to justify the rapid roll out of surveillance technologies that do not meet ethical and human rights standards, and that this will be done in the name of the “public good”. Examples of these technologies include symptom- and contact-tracing applications. Privacy experts are also increasingly concerned that governments will be trading off more personal data than is necessary or proportionate to respond to the public health crisis.

Many ethical and human rights considerations (including those listed at the bottom of this piece) are at risk of being overlooked at this time of emergency, and governments would be wise not to press ahead regardless, ignoring legitimate concerns about rights and standards. Instead, policymakers should begin to address these concerns by asking how we can prepare (now and in future) to establish clear and trusted boundaries for the use of data (personal and non-personal) in such crises.

Democratic states in Europe and the US have not, in recent memory, prioritised infrastructures and systems for a crisis of this scale – and this has contributed to our current predicament. Contrast this with Singapore, which suffered outbreaks of SARS and H1N1, and channelled this experience into implementing pandemic preparedness measures.

We cannot undo the past, but we can begin planning and preparing constructively for the future, and that means strengthening global coordination and finding mechanisms to share learning internationally. Getting the right data infrastructure in place has a central role to play in addressing ethical and human rights concerns around the use of data….(More)”.

Open science: after the COVID-19 pandemic there can be no return to closed working


Article by Virginia Barbour and Martin Borchert: “In the few months since the first case of COVID-19 was identified, the underlying cause has been isolated, its symptoms agreed on, its genome sequenced, diagnostic tests developed, and potential treatments and vaccines are on the horizon. The astonishingly short time frame of these discoveries has only happened through a global open science effort.

The principles and practices underpinning open science are what underpin good research—research that is reliable, reproducible, and has the broadest impact possible. It specifically requires the application of principles and practices that make research FAIR (Findable, Accessible, Interoperable, Reusable); researchers are making their data and preliminary publications openly accessible, and then publishers are making the peer-reviewed research immediately and freely available to all. The rapid dissemination of research—through preprints in particular as well as journal articles—stands in contrast to what happened in the 2003 SARS outbreak when the majority of research on the disease was published well after the outbreak had ended.

Many outside observers might reasonably assume, given the digital world we all now inhabit, that science usually works like this. Yet this is very far from the norm for most research. Science is not something that just happens in response to emergencies or specific events—it is an ongoing, largely publicly funded, national and international enterprise….

Sharing of the underlying data that journal articles are based on is not yet a universal requirement for publication, nor are researchers usually recognised for data sharing.

[Image: There are many benefits associated with an open science model. Adapted from Gaelen Pinnock/UCT; CC-BY-SA 4.0.]

Once published, even access to research is not seamless. The majority of academic journals still require a subscription to access. Subscriptions are expensive; Australian universities alone currently spend more than $300 million per year on subscriptions to academic journals. Access to academic journals also varies between universities with varying library budgets. The main markets for subscriptions to the commercial journal literature are higher education and health, with some access to government and commercial….(More)”.

The Big Failure of Small Government


Mariana Mazzucato and Giulio Quaggiotto at Project Syndicate: “Decades of privatization, outsourcing, and budget cuts in the name of “efficiency” have significantly hampered many governments’ responses to the COVID-19 crisis. At the same time, successful responses by other governments have shown that investments in core public-sector capabilities make all the difference in times of emergency. The countries that have handled the crisis well are those where the state maintains a productive relationship with value creators in society, by investing in critical capacities and designing private-sector contracts to serve the public interest.

From the United States and the United Kingdom to Europe, Japan, and South Africa, governments are investing billions – and, in some cases, trillions – of dollars to shore up national economies. Yet, if there is one thing we learned from the 2008 financial crisis, it is that quality matters at least as much as quantity. If the money falls on empty, weak, or poorly managed structures, it will have little effect, and may simply be sucked into the financial sector. Too many lives are at stake to repeat past errors.

Unfortunately, for the last half-century, the prevailing political message in many countries has been that governments cannot – and therefore should not – actually govern. Politicians, business leaders, and pundits have long relied on a management creed that focuses obsessively on static measures of efficiency to justify spending cuts, privatization, and outsourcing.

As a result, governments now have fewer options for responding to the crisis, which may be why some are now desperately clinging to the unrealistic hope of technological panaceas such as artificial intelligence or contact-tracing apps. With less investment in public capacity has come a loss of institutional memory (as the UK’s government has discovered) and increased dependence on private consulting firms, which have raked in billions. Not surprisingly, morale among public-sector employees has plunged in recent years.

Consider two core government responsibilities during the COVID-19 crisis: public health and the digital realm. In 2018 alone, the UK government outsourced health contracts worth £9.2 billion ($11.2 billion), putting 84% of beds in care homes in the hands of private-sector operators (including private equity firms). Making matters worse, since 2015, the UK’s National Health Service has endured £1 billion in budget cuts.

Outsourcing by itself is not the problem. But the outsourcing of critical state capacities clearly is, especially when the resulting public-private “partnerships” are not designed to serve the public interest. Ironically, some governments have outsourced so eagerly that they have undermined their own ability to structure outsourcing contracts. After a 12-year effort to spur the private sector to develop low-cost ventilators, the US government is now learning that outsourcing is not a reliable way to ensure emergency access to medical equipment….(More)”.

Big data, privacy and COVID-19 – learning from humanitarian expertise in data protection


Andrej Zwitter & Oskar J. Gstrein at the Journal of International Humanitarian Action: “The use of location data to control the coronavirus pandemic can be fruitful and might improve the ability of governments and research institutions to combat the threat more quickly. It is important to note that location data is not the only useful data that can be used to curb the current crisis. Genetic data can be relevant for AI enhanced searches for vaccines and monitoring online communication on social media might be helpful to keep an eye on peace and security (Taulli n.d.). However, the use of such large amounts of data comes at a price for individual freedom and collective autonomy. The risks of the use of such data should ideally be mitigated through dedicated legal frameworks which describe the purpose and objectives of data use, its collection, analysis, storage and sharing, as well as the erasure of ‘raw’ data once insights have been extracted. In the absence of such clear and democratically legitimized norms, one can only resort to fundamental rights provisions such as Article 8 paragraph 2 of the ECHR that reminds us that any infringement of rights such as privacy need to be in accordance with law, necessary in a democratic society, pursuing a legitimate objective and proportionate in their application.

However, as shown above, legal frameworks including human rights standards are currently not capable of effectively ensuring data protection, since they focus too much on the individual as the point of departure. Hence, we submit that currently applicable guidelines and standards for responsible data use in the humanitarian sector should also be fully applicable to corporate, academic and state efforts which are currently enacted to curb the COVID-19 crisis globally. Instead of ‘re-calibrating’ the expectations of individuals on their own privacy and collective autonomy, the requirements for the use of data should be broader and more comprehensive. Applicable principles and standards as developed by OCHA, the 510 project of the Dutch Red Cross, or by academic initiatives such as the Signal Code are valid minimum standards during a humanitarian crisis. Hence, they are also applicable minimum standards during the current pandemic.

Core findings that can be extracted from these guidelines and standards for practical implementation in data-driven responses to COVID-19 are:

  • data sensitivity is highly contextual; one and the same data can be sensitive in different contexts. Location data during the current pandemic might be very useful for epidemiological analysis. However, if (ab-)used to re-calibrate political power relations, data can be open for misuse. Hence, any party supplying data and data analysis needs to check whether data and insights can be misused in the context they are presented.
  • privacy and data protection are important values; they do not disappear during a crisis. Nevertheless, they have to be weighed against respective benefits and risks.
  • data breaches are inevitable; with time (t) approaching infinity, the chance of any system being hacked or becoming insecure approaches 100%. Hence, it is not a question of whether, but when. Therefore, organisations have to prepare sound data retention and deletion policies (a minimal sketch follows this list).
  • data ethics is an obligation to provide high quality analysis; using machine learning and big data might be appealing for the moment, but the quality of source data might be low, and results might be unreliable, or even harmful. Biases in incomplete datasets, algorithms and human users are abundant and widely discussed. We must not forget that in times of crisis, the risk of bias is more pronounced, and more problematic due to the vulnerability of data subjects and groups. Therefore, working to the highest standards of data processing and analysis is an ethical obligation.
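
A minimal sketch of such a retention-and-deletion policy, assuming an illustrative 30-day window and toy location records; the window, field names and helper function are assumptions, not a prescribed standard:

```python
# Minimal sketch (an assumption, not a prescribed standard): a retention policy
# that deletes 'raw' location records once they exceed a retention window,
# keeping only the aggregated insights already extracted from them.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # illustrative window set by policy, not by law

raw_records = [
    {"user": "u1", "cell_id": "A17", "seen_at": datetime(2020, 3, 1, tzinfo=timezone.utc)},
    {"user": "u2", "cell_id": "B02", "seen_at": datetime(2020, 4, 20, tzinfo=timezone.utc)},
]

def purge_expired(records, now=None):
    """Drop raw records older than the retention window; return what survives."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["seen_at"] <= RETENTION]

raw_records = purge_expired(raw_records, now=datetime(2020, 5, 1, tzinfo=timezone.utc))
print(len(raw_records))  # 1 -- only the record still inside the window remains
```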

The adherence to these principles is particularly relevant in times of crisis such as now, where they mark the difference between societies that focus on control and repression on the one hand, and those who believe in freedom and autonomy on the other. Eventually, we will need to think about including data policies in legal frameworks for state of emergency regulations, and coordinate with corporate stakeholders as well as private organisations on how to best deal with such crises. Data-driven practices have to be used in a responsible manner. Furthermore, it will be important to observe whether data practices and surveillance assemblages introduced under current circumstances will be rolled back to status quo ante when returning to normalcy. If not, our rights will become hollowed out, just waiting for the next crisis to eventually become irrelevant….(More)”.

Testing Transparency


Paper by Brigham Daniels, Mark Buntaine and Tanner Bangerter: “In modern democracies, governmental transparency is thought to have great value. When it comes to addressing administrative corruption and mismanagement, many would agree with Justice Brandeis’s observation that sunlight is the best disinfectant. Beyond this, many credit transparency with enabling meaningful citizen participation.

But even though transparency appears highly correlated with successful governance in developed democracies, assumptions about administrative transparency have remained empirically untested. Testing effects of transparency would prove particularly helpful in developing democracies where transparency norms have not taken hold or only have done so slowly. In these contexts, does administrative transparency really create the sorts of benefits attributed to it? Transparency might grease the gears of developed democracies, but what good is grease when many of the gears seem to be broken or missing entirely?

This Article presents empirical results from a first-of-its-kind field study that tested two major promises of administrative transparency in a developing democracy: that transparency increases public participation in government affairs and that it increases government accountability. To test these hypotheses, we used two randomized controlled trials.

Surprisingly, we found transparency had no significant effect in almost any of our quantitative measurements, although our qualitative results suggested that when transparency interventions exposed corruption, some limited oversight could result. Our findings are particularly significant for developing democracies and show, at least in this context, that Justice Brandeis may have oversold the cleansing effects of transparency. A few rays of transparency shining light on government action do not disinfect the system and cure government corruption and mismanagement. Once corruption and mismanagement are identified, it takes effective government institutions and action from civil society to successfully act as a disinfectant….(More)”.