Stefaan Verhulst

Report by the Future of Privacy Forum: “The transparency goals of the open data movement serve important social, economic, and democratic functions in cities like Seattle. At the same time, some municipal datasets about the city and its citizens’ activities carry inherent risks to individual privacy when shared publicly. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when doing so may affect individual privacy. To ensure its Open Data Program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked the Future of Privacy Forum (FPF) with creating and deploying an initial privacy risk assessment methodology for open data.

This Report provides tools and guidance to the City of Seattle and other municipalities navigating the complex policy, operational, technical, organizational, and ethical standards that support privacy-protective open data programs. Although there is a growing body of research regarding open data privacy, open data managers and departmental data owners need to be able to employ a standardized methodology for assessing the privacy risks and benefits of particular datasets internally, without access to a bevy of expert statisticians, privacy lawyers, or philosophers. By optimizing its internal processes and procedures, developing and investing in advanced statistical disclosure control strategies, and following a flexible, risk-based assessment process, the City of Seattle – and other municipalities – can build mature open data programs that maximize the utility and openness of civic data while minimizing privacy risks to individuals and addressing community concerns about ethical challenges, fairness, and equity.

This Report first describes inherent privacy risks in an open data landscape, with an emphasis on potential harms related to re-identification, data quality, and fairness. To address these risks, the Report includes a Model Open Data Benefit-Risk Analysis (“Model Analysis”). The Model Analysis evaluates the types of data contained in a proposed open dataset, the potential benefits – and concomitant risks – of releasing the dataset publicly, and strategies for effective de-identification and risk mitigation. This holistic assessment guides city officials to determine whether to release the dataset openly, in a limited access environment, or to withhold it from publication (absent countervailing public policy considerations). …(More)”.
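The Model Analysis turns on re-identification risk. As a rough illustration of one basic disclosure-control check of the kind such assessments rely on (a minimal sketch, not the Report's own methodology; the dataset and column names are invented):

```python
from collections import Counter

# Hypothetical rows from a municipal service-request dataset;
# the columns and values are purely illustrative.
records = [
    {"zip": "98101", "age_band": "30-39", "request": "pothole"},
    {"zip": "98101", "age_band": "30-39", "request": "noise"},
    {"zip": "98122", "age_band": "60-69", "request": "pothole"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size over the quasi-identifier columns.

    A dataset is k-anonymous when every combination of quasi-identifier
    values is shared by at least k records; a small k flags
    re-identification risk before public release.
    """
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

# k = 1 here: the single 98122 record is unique on (zip, age_band).
print(k_anonymity(records, ["zip", "age_band"]))
```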

Open Data Risk Assessment

After 200 years of expansion, democracy’s growth in the world has stalled. A handful of democracies like Venezuela and Hungary are backsliding into authoritarianism. And even in established Western democracies, voters are losing faith in democratic institutions and norms.

That has left us and scholars who study democracy obsessed with a set of questions. Is this all just a blip, or is democracy in real trouble? Are the oldest and sturdiest democracies, like those of Europe and the United States, really as safe as they seem? And why would people voluntarily dismantle their own democracy from within?

No one knows the answers for sure. But we’re starting to figure them out and it’s not all good news. Here, in the first of what will become a regular series of videos exploring big questions and ideas about the world, we explain what we know about democracy’s troubles, what’s causing them and where it leads….(See VIDEO)”.

Is There Something Wrong with Democracy?

Andrew Zolli at the Stanford Social Innovation Review: “Consider, for a moment, some of the most pernicious challenges facing humanity today: the increasing prevalence of natural disasters; the systemic overfishing of the world’s oceans; the clear-cutting of primeval forests; the maddening persistence of poverty; and above all, the accelerating effects of global climate change.

Each item in this dark litany inflicts suffering on the world in its own, awful way. Yet as a group, they share some common characteristics. Each problem is messy, with lots of moving parts. Each is riddled with perverse incentives, which can lead local actors to behave in a way that is not in the common interest. Each is opaque, with dynamics that are only partially understood, even by experts; each can, as a result, often be made worse by seemingly rational and well-intentioned interventions. When things do go wrong, each has consequences that diverge dramatically from our day-to-day experiences, making their full effects hard to imagine, predict, and rehearse. And each is global in scale, raising questions about who has the legal obligation to act—and creating incentives for leaders to disavow responsibility (and sometimes even question the legitimacy of the problem itself).

With dynamics like these, it’s little wonder systems theorists label these kinds of problems “wicked” or even “super wicked.” It’s even less surprising that these challenges remain, by and large, externalities to the global system—inadequately measured, perennially underinvested in, and poorly accounted for—until their consequences spill disastrously and expensively into view.

For real progress to occur, we’ve got to move these externalities into the global system, so that we can fully assess their costs, and so that we can sufficiently incentivize and reward stakeholders for addressing them and penalize them if they don’t. And that’s going to require a revolution in measurement, reporting, and financial instrumentation—the mechanisms by which we connect global problems with the resources required to address them at scale.

Thankfully, just such a revolution is under way.

It’s a complex story with several moving parts, but it begins with important new technical developments in three critical areas of technology: remote sensing and big data, artificial intelligence, and cloud computing.

Remote sensing and big data allow us to collect unprecedented streams of observations about our planet and our impacts upon it, and dramatic advances in AI enable us to extract the deeper meaning and patterns contained in those vast data streams. The rise of the cloud empowers anyone with an Internet connection to access and interact with these insights, at a fraction of the traditional cost.

In the years to come, these technologies will shift much of the current conversation focused on big data to one focused on “big indicators”—highly detailed, continuously produced, global indicators that track change in the health of the Earth’s most important systems, in real time. Big indicators will form an important mechanism for guiding human action, allow us to track the impact of our collective actions and interventions as never before, enable better and more timely decisions, transform reporting, and empower new kinds of policy and financing instruments. In short, they will reshape how we tackle a number of global problems, and everyone—especially nonprofits, NGOs, and actors within the social and environmental sectors—will play a role in shaping and using them….(More)”.

After Big Data: The Coming Age of “Big Indicators”

Paper by Marc D. Joffe and Frank Partnoy: “In the aftermath of the 2007-08 global financial crisis, regulators and policy makers recognized the importance of making bond ratings publicly available. Although rating agencies have made some data available, obtaining this information in bulk can be difficult or impossible. At some times, the data is costly; at other times, it is simply unavailable. Some rating agencies have provided data only on a subscription basis for tens or even hundreds of thousands of dollars annually.

The cost and lack of availability of ratings data are particularly striking given the regulatory requirement that rating agencies publish such data. We describe the relevant Securities and Exchange Commission publication rules and requirements. Unfortunately, the ways in which the major credit rating agencies have responded to these rules have not made data available in an easily accessed or comprehensive way and have instead hindered academic and think-tank research into credit ratings. Financial researchers who lack the funds required to purchase bulk ratings must use a variety of ad hoc methods to obtain rating data or limit their studies of credit ratings.

This brief paper describes our recent initiative to make credit ratings data publicly available. We provide links to a software tool written in Python that crawls credit rating agency websites, downloads the XBRL files, and converts them to Comma Separated Value (CSV) format. We also provide a link to the most recently processed ratings data, separated by agency and asset category, as well as the entire universe of ratings actions, including more than eight million assignments, upgrades, downgrades, and withdrawals…(More)”.
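For a sense of what the conversion step involves, here is a minimal Python sketch (not the authors' actual tool; the XBRL element names below are placeholders, since the SEC-mandated taxonomy defines its own tags):

```python
import csv
import xml.etree.ElementTree as ET

def xbrl_actions_to_csv(xbrl_path, csv_path):
    """Flatten the rating actions in one XBRL filing into CSV rows."""
    tree = ET.parse(xbrl_path)
    fields = ["obligor", "rating", "action", "date"]
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        # "ratingAction" and the child tags are hypothetical names.
        for action in tree.getroot().iter("ratingAction"):
            writer.writerow({
                "obligor": action.findtext("obligorName", default=""),
                "rating": action.findtext("ratingSymbol", default=""),
                "action": action.findtext("actionType", default=""),
                "date": action.findtext("actionDate", default=""),
            })
```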

Making Credit Ratings Data Publicly Available

Paper by Eyal Benvenisti in The European Journal of International Law: “The law on global governance that emerged after the Second World War was grounded in irrefutable trust in international organizations and an assumption that their subjection to legal discipline and judicial review would be unnecessary and, in fact, detrimental to their success. The law that evolved systematically insulated international organizations from internal and external scrutiny and absolved them of any inherent legal obligations – and, to a degree, continues to do so.

Indeed, it was only well after the end of the Cold War that mistrust in global governance began to trickle through into the legal discourse and the realization gradually took hold that the operation of international organizations needed to be subject to the disciplining power of the law. Since the mid-1990s, scholars have sought to identify the conditions under which trust in global bodies can be regained, mainly by borrowing and adapting domestic public law precepts that emphasize accountability through communications with those affected.

Today, although a ‘culture of accountability’ may have taken root, its legal tools are still shaping up and are often contested. More importantly, these communicative tools are ill-equipped to address the new modalities of governance that are based on decision-making by machines using raw data (rather than two-way exchange with stakeholders) as their input.

The new information and communication technologies challenge the foundational premise of the accountability school – that ‘the more communication, the better’ – as voters-turned-users obtain their information from increasingly fragmented and privatized marketplaces of ideas that are manipulated for economic and political gain.

In this article, I describe and analyse how the law has evolved to acknowledge the need for accountability, how it has designed norms for this purpose and continues in this endeavour – yet how the challenges it faces today are leaving its most fundamental assumptions open to question. I argue that, given the growing influence of public and private global governance bodies on our daily lives and the shape of our political communities, the task of the law of global governance is no longer limited to ensuring the accountability of global bodies, but is also to protect human dignity and the very viability of the democratic state….(More)”.

Upholding Democracy Amid the Challenges of New Technology

Report from the Commission on the Future of Localism (UK): “…When we think about power we tend to look upwards – towards Westminster-based institutions and elected politicians. Those who wish to see greater localism often ask politicians to give it away and push power downwards. But this is looking at things the wrong way round. Instead, we need to start with the power of community. The task of our political system should be to support this, harness it, and reflect it in our national debate.

Our Commission has heard evidence about what makes a powerful community. While different communities build and experience power in different ways, there are common sources. We heard how the power of any community lies with its people, their collective ideas, innovation, creativity and local knowledge, as well as their sense of belonging, connectedness and shared identity. We need to bring this into political life much more effectively via a renewed effort to foster localism in future.

However, our Commission has also heard about a fundamental imbalance of power that is preventing this power of community from coming to life and restricting collective agency: top-down decisions leaving community groups and local councils unable to make the change they know their neighbourhood needs; a lack of trust and risk aversion from public bodies, dampening community energy; a lack of control and access to local resources, limiting the scope of local action….(More)”.

People Power

Kirk Bansak et al. in Science Magazine: “Developed democracies are settling an increased number of refugees, many of whom face challenges integrating into host societies. We developed a flexible data-driven algorithm that assigns refugees across resettlement locations to improve integration outcomes. The algorithm uses a combination of supervised machine learning and optimal matching to discover and leverage synergies between refugee characteristics and resettlement sites.

The algorithm was tested on historical registry data from two countries with different assignment regimes and refugee populations, the United States and Switzerland. Our approach led to gains of roughly 40 to 70%, on average, in refugees’ employment outcomes relative to current assignment practices. This approach can provide governments with a practical and cost-efficient policy tool that can be immediately implemented within existing institutional structures….(More)”.
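As a toy illustration of the two-stage idea (score each refugee-location pair with a supervised model, then compute an optimal matching), consider the sketch below; the probabilities are invented stand-ins for the fitted models the paper describes:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# predicted[i, j]: modeled probability that refugee i finds employment
# if placed at location j (hypothetical values; the paper fits such
# scores on historical registry data).
predicted = np.array([
    [0.62, 0.35, 0.48],
    [0.21, 0.55, 0.30],
    [0.40, 0.44, 0.71],
])

# Maximizing total predicted employment is a min-cost assignment on
# the negated matrix (Hungarian algorithm; one slot per location here,
# whereas real resettlement sites have capacities).
rows, cols = linear_sum_assignment(-predicted)
for i, j in zip(rows, cols):
    print(f"refugee {i} -> location {j} (p = {predicted[i, j]:.2f})")
```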

Improving refugee integration through data-driven algorithmic assignment

Kyle Duggan at iPolitics: “The national statistics agency is launching a crowdsourcing project to find out how much weed Canadians are consuming and how much it costs them.

Statistics Canada is searching for the best picture of consumption it can find ahead of legalization, and is turning to average Canadians to improve its rough estimates about a product that’s largely been accessed illegally by the population.

Thursday it released a suite of “experimental” data that make up its current best guesses on Canadian consumption habits, along with a crowdsourcing website and app to get its own estimates – a project officials said is an experiment itself.

Statscan is also rolling out a quarterly cannabis survey this year.

The agency has been combing through historical research on legal and illegal cannabis prices, scraping price data from illegal vendors online and, for some data, is relying largely on the self-reporting website priceofweed.com to assemble as much pot information as possible, even if it’s not perfect data.
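To give a flavor of the cleaning such self-reported figures need, here is a hedged sketch of a trimmed-mean price estimate (the numbers are invented, and the article does not describe Statscan's actual estimation method):

```python
from statistics import mean

# Hypothetical crowdsourced price reports, in dollars per gram.
reports = [7.50, 9.00, 8.25, 45.00, 8.10, 7.80, 0.50]

def trimmed_mean(values, trim_fraction=0.2):
    """Drop the cheapest and priciest reports before averaging,
    so obvious misreports do not dominate the estimate."""
    values = sorted(values)
    k = int(len(values) * trim_fraction)
    return mean(values[k:len(values) - k]) if k else mean(values)

print(f"${trimmed_mean(reports):.2f}/gram")  # 8.13 with these numbers
```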

The agency has been quietly preparing for the July legalization deadline by compiling health, justice and economic datasets and scouring for data to fill in the blanks where it can. Come July, legal cannabis will suddenly also need to be rolled into other important data products, like the GDP accounts….(More)”.

StatCan now crowdsourcing cannabis data

Cass R. Sunstein in the Special Issue on Evaluating Nudging of the Missouri Law Review: “It can be paternalistic to force people to choose. Often people do not wish to choose, but both private and public institutions ask or force them to do so, thus overriding their wishes. As a result, people’s autonomy may be badly compromised and their welfare may be greatly reduced. These points have implications for a range of issues in law and policy, suggesting that those who favor active choosing, and insist on it, may well be overriding people’s preferences and values, and thus running afoul of John Stuart Mill’s Harm Principle (for better or for worse). People have limited mental bandwidth, and forcing choices can impose a hedonic or cognitive tax. Sometimes that tax is high….(More)”.

Forcing People to Choose is Paternalistic

Paper by Vanessa Herringshaw: “Narratives in the field of information and communications technology (ICT) for governance are full of claims, of either enormous success or almost none. But understanding ‘success’ and ‘failure’ depends on how these are framed. Research supported by Making All Voices Count suggests that different actors can seek very different goals from the same ICT-enabled interventions – some stated, some not.

This programme learning report proposes two important dimensions for framing variations in visions of success for ICT-enabled governance interventions: (1) the kind of change in governance systems sought (‘functional’, ‘instrumental’, ‘transformative’ and ‘no change’); and (2) the vision of the ideal citizen–state relationship. It applies this framing to three areas where ICTs are being used, at least on paper, to encourage and channel citizen voice into governance processes, and to improve government responsiveness in return: participatory policy- and strategy-making; participatory budgeting; and citizen feedback to improve service delivery.

In terms of the kind of change in governance systems sought, much of the rhetoric touts the use of ICTs as inherently ‘transformative’. However, findings suggest that it has mostly been deployed in ‘functional’, ‘instrumental’ and ‘no change’ ways. That said, the possibility of ICT-enabled ‘transformative’ change appears somewhat higher when citizens have more direct control over outcomes, and more online and offline processes are mixed and used in ways that foster collective, rather than individualised, inputs, deliberation and answerability.

In terms of the vision of the state–citizen relationship, the findings show great variation in outcomes sought regarding the kinds and levels of participatory democracy, who this should benefit, the ideal size of the state, and the desired stability of actor groups and decision-making structures.

The evidence suggests that the use of ICTs may have the potential to support change, including transformative change, but only when the political goals of key actors are pre-structured to support this. The choice of ICTs does matter to the effectiveness of this support, as does the way in which they are used. But overall, ICTs do not appear to be inherently ‘generative’ of change. They are, rather, ‘reflective’, ‘enabling’ or ‘amplifying’ of existing political agendas and levels of commitment.

The recommendations of this report focus on the need to understand deeply and face the realities of these varying agendas and visions of success at the start of intervention planning, and throughout implementation as they evolve over time. This imperative should remain undiminished, regardless of any rhetoric of the inherently transformative or ‘democratising’ nature of ICTs, and of interventions to strengthen citizen voice and government responsiveness more broadly….(More)”.

Increasing citizen voice and government responsiveness: what does success really look like, and who decides?
