Measuring human rights: facing a necessary challenge


Essay by Eduardo Burkle: “Given the abundance of data available today, many assume the world already has enough accurate metrics on human rights performance. However, the political sensitivity of human rights has proven a significant barrier to access. Governments often avoid producing and sharing this type of information.

States’ compliance with their human rights obligations often receives a lot of attention. But there is still much discussion about how to measure it. At the same time, statistics and data increasingly drive political and bureaucratic decisions. This, in turn, brings some urgency to the task of ensuring the best possible data are available.

Establishing cross-national human rights measures is vital for research, advocacy, and policymaking. It can also have a direct effect on people’s enjoyment of human rights. Good data allow states and actors to evaluate how well their country is performing. They also let them make comparisons that highlight which policies and institutions are truly effective in promoting human rights.

Good human rights data does more than simply evaluate how well a country is performing – it also identifies which policies and institutions are truly effective in promoting human rights

Such context makes it crucial to arm researchers, journalists, advocates, practitioners, investors, and companies with reliable information when raising human rights issues in their countries, and around the world…(More)”.

To Fix Tech, Democracy Needs to Grow Up


Article by Divya Siddarth: “There isn’t much we can agree on these days. But two sweeping statements that might garner broad support are “We need to fix technology” and “We need to fix democracy.”

There is growing recognition that rapid technology development is producing society-scale risks: state and private surveillance, widespread labor automation, ascending monopoly and oligopoly power, stagnant productivity growth, algorithmic discrimination, and the catastrophic risks posed by advances in fields like AI and biotechnology. Less often discussed, but in my view no less important, is the loss of potential advances that lack short-term or market-legible benefits. These include vaccine development for emerging diseases and open source platforms for basic digital affordances like identity and communication.

At the same time, as democracies falter in the face of complex global challenges, citizens (and increasingly, elected leaders) around the world are losing trust in democratic processes and are being swayed by autocratic alternatives. Nation-state democracies are, to varying degrees, beset by gridlock and hyper-partisanship, little accountability to the popular will, inefficiency, flagging state capacity, inability to keep up with emerging technologies, and corporate capture. While smaller-scale democratic experiments are growing, locally and globally, they remain far too fractured to handle consequential governance decisions at scale.

This puts us in a bind. Clearly, we could be doing a better job directing the development of technology towards collective human flourishing—this may be one of the greatest challenges of our time. If actually existing democracy is so riddled with flaws, it doesn’t seem up to the task. This is what rings hollow in many calls to “democratize technology”: Given the litany of complaints, why subject one seemingly broken system to governance by another?…(More)”.

(Re)making data markets: an exploration of the regulatory challenges


Paper by Linnet Taylor, Hellen Mukiri-Smith, Tjaša Petročnik, Laura Savolainen & Aaron Martin: “Regulating the data market will be one of the major challenges of the twenty-first century. In order to think about regulating this market, however, we first need to make its dimensions and dynamics more accessible to observation and analysis. In this paper we explore what the state of the sociological and legal research on markets can tell us about the market for data: what kind of market it is, the practices and configurations of actors that constitute it, and what kinds of data are traded there. We start from the subjective opacity of this market to researchers interested in regulation and governance, review conflicting positions on its extent, diversity and regulability, and then explore comparisons from food and medicine regulation to understand the possible normative and practical implications and aims inherent in attempting to regulate how data is shared and traded. We conclude that there is a strong argument for a normative shift in the aims of regulation with regard to the data market, away from a prioritisation of the economic value of data and toward a more nuanced approach that aims to align the uses of data with the needs and rights of the communities reflected in it…(More)”

The fear of technology-driven unemployment and its empirical base


Article by Kerstin Hötte, Melline Somers and Angelos Theodorakopoulos: “New technologies may replace human labour, but can simultaneously create jobs if workers are needed to use these technologies or if new economic activities emerge. At the same time, technology-driven productivity growth may increase disposable income, stimulating a demand-induced employment expansion. Based on a systematic review of the empirical literature on technological change and its impact on employment published in the past four decades, this column suggests that the empirical support for the labour-creating effects of technological change dominates that for labour replacement…(More)”.

Toward a Demand-Driven, Collaborative Data Agenda for Adolescent Mental Health


Paper by Stefaan Verhulst et al: “Existing datasets and research in the field of adolescent mental health do not always meet the needs of practitioners, policymakers, and program implementers, particularly in the context of vulnerable populations. Here, we introduce a collaborative, demand-driven methodology for the development of a strategic adolescent mental health research agenda. Ultimately, this agenda aims to guide future data sharing and collection efforts that meet the most pressing data needs of key stakeholders…

We conducted a rapid literature search to summarize common themes in adolescent mental health research into a “topic map”. We then hosted two virtual workshops with a range of international experts to discuss the topic map and identify shared priorities for future collaboration and research…

Our topic map identifies 10 major themes in adolescent mental health, organized into system-level, community-level, and individual-level categories. The engagement of cross-sectoral experts resulted in the validation of the mapping exercise, critical insights for refining the topic map, and a collaborative list of priorities for future research…

This innovative agile methodology enables a focused deliberation with diverse stakeholders and can serve as the starting point for data generation and collaboration practices, both in the field of adolescent mental health and other topics…(More)”.

Forest data governance as a reflection of forest governance: Institutional change and endurance in Finland and Canada


Paper by Salla Rantala, Brent Swallow, Anu Lähteenmäki-Uutela and Riikka Paloniemi: “The rapid development of new digital technologies for natural resource management has created a need to design and update governance regimes for effective and transparent generation, sharing and use of digital natural resource data. In this paper, we contribute to this novel area of investigation from the perspective of institutional change. We develop a conceptual framework to analyze how emerging natural resource data governance is shaped by related natural resource governance: complex, multilevel systems of actors, institutions and their interplay. We apply this framework to study forest data governance and its roots in forest governance in Finland and Canada. In Finland, an emphasis on open forest data and the associated legal reform represents the institutionalization of a mixed open data-bioeconomy discourse, pushed by higher-level institutional requirements towards greater openness and shaped by changing actor dynamics in relation to diverse forest values. In Canada, a strong institutional lock-in around public-private partnerships in forest management has engendered an approach that is based on voluntary data sharing agreements and fragmented data management, conforming with the entrenched interests of autonomous sub-national actors and thus extending the path-dependence of forest governance to forest data governance. We conclude by proposing how the framework could be further developed and tested to help explain which factors condition the formation of natural resource data institutions and subsequently the (re-)distribution of benefits they govern. Transparent and efficient data approaches can be enabled only if the analysis of data institutions is given attention equal to that given to the technological development of data solutions…(More)”.

Who Should Represent Future Generations in Climate Planning?


Paper by Morten Fibieger Byskov and Keith Hyams: “Extreme impacts from climate change are already being felt around the world. The policy choices that we make now will affect not only how high global temperatures rise but also how well-equipped future economies and infrastructures are to cope with these changes. The interests of future generations must therefore be central to climate policy and practice. This raises the questions: Who should represent the interests of future generations with respect to climate change? And according to which criteria should we judge whether a particular candidate would make an appropriate representative for future generations? In this essay, we argue that potential representatives of future generations should satisfy what we call a “hypothetical acceptance criterion,” which requires that the representative could reasonably be expected to be accepted by future generations. This overarching criterion in turn gives rise to two derivative criteria. These are, first, the representative’s epistemic and experiential similarity to future generations, and second, his or her motivation to act on behalf of future generations. We conclude that communities already adversely affected by climate change best satisfy these criteria and are therefore able to command the hypothetical acceptance of future generations…(More)”.

EU Court Expands Definition of Sensitive Data, Prompting Legal Concerns for Companies


Article by Catherine Stupp: “Companies will be under increased pressure after Europe’s top court ruled they must apply special protections to data that firms previously didn’t consider sensitive.

Under the European Union’s General Data Protection Regulation, information about health, religion, political views and sexual orientation is considered sensitive. Companies generally aren’t allowed to process it unless they apply special safeguards.

The European Court of Justice on Aug. 1 determined that public officials in Lithuania had their sensitive data revealed because their spouses’ names were published online, which could indicate their sexual orientation. Experts say the implications will extend to other types of potentially sensitive information.

Data that might be used to infer a sensitive piece of information about a person is also sensitive, the court said. That could include unstructured data—which isn’t organized in databases and is therefore more difficult to search through and analyze—such as surveillance camera footage in a hospital that indicates a person was treated there, legal experts say. Records of a special airplane meal might reveal religious views.

The court ruling “raises a lot of practical complexities and a lot of difficulty in understanding if the data [organizations] have is sensitive or not,” said Dr. Gabriela Zanfir-Fortuna, vice president for global privacy at the Future of Privacy Forum, a think tank based in Washington, D.C.

Many companies with large data sets may not know they hold details that indirectly relate to sensitive information, privacy experts say. Identifying where that data is and deciding whether it could reveal personal details about an individual would be a huge undertaking, said Tobias Judin, head of the international section at the Norwegian data protection regulator.

“You can’t really comply with the law if your data set becomes so big that you don’t really know what’s in it,” Mr. Judin said.

The GDPR says companies can only process sensitive data in a few circumstances, such as if a person gives explicit consent for it to be used for a specified purpose.

Regulators have been grappling with the question of how to determine what is sensitive data. The Norwegian regulator last year fined gay-dating app Grindr LLC 65 million kroner, equivalent to roughly $6.7 million. The regulator said the user data was sensitive because use of the app indicated users’ sexual orientation.

Grindr said it doesn’t require users to share that data. The company appealed in February. Mr. Judin said his office is reviewing material submitted by the company as part of its appeal. Spain’s regulator came to a different conclusion in January, and found that data Grindr shared for advertising purposes wasn’t sensitive….(More)”.

Can open-source technologies support open societies?


Report by Victoria Welborn and George Ingram: “In the 2020 “Roadmap for Digital Cooperation,” U.N. Secretary General António Guterres highlighted digital public goods (DPGs) as a key lever in maximizing the full potential of digital technology to accelerate progress toward the Sustainable Development Goals (SDGs) while also helping overcome some of its persistent challenges. 

The Roadmap rightly pointed to the fact that, as with any new technology, there are risks around digital technologies that might be counterproductive to fostering prosperous, inclusive, and resilient societies. In fact, without intentional action by the global community, digital technologies may more naturally exacerbate exclusion and inequality by undermining trust in critical institutions, allowing consolidation of control and economic value by the powerful, and eroding social norms through breaches of privacy and disinformation campaigns. 

Just as the pandemic has served to highlight the opportunity for digital technologies to reimagine and expand the reach of government service delivery, so too has it surfaced specific risks that are hallmarks of closed societies and authoritarian states—creating new pathways to government surveillance, reinforcing existing socioeconomic inequalities, and enabling the rapid proliferation of disinformation. Why then—in the face of these real risks—focus on the role of digital public goods in development?

As the Roadmap noted, DPGs are “open source software, open data, open AI models, open standards and open content that adhere to privacy and other applicable laws and best practices, do no harm, and help attain the SDGs.”[1] There are a number of reasons why such products have unique potential to accelerate development efforts, including widely recognized benefits related to more efficient and cost-effective implementation of technology-enabled development programming. 

Historically, the use of digital solutions for development in low- and middle-income countries (LMICs) has been supported by donor investments in sector-specific technology systems, reinforcing existing silos and leaving countries with costly, proprietary software solutions with duplicative functionality and little interoperability across government agencies, much less capacity to underpin private sector innovation. These silos are further codified through the development of sector-specific maturity models and metrics. An effective DPG ecosystem has the potential to enable the reuse and improvement of existing tools, thereby lowering the overall cost of deploying technology solutions and increasing the efficiency of implementation.

Beyond this proven reusability of DPGs and the associated cost and deployment efficiencies, do DPGs have even more transformational potential? Increasingly, there is interest in DPGs as drivers of inclusion and products through which to standardize and safeguard rights; these opportunities are less understood and remain unproven. To begin to fill that gap, this paper first examines the unique value proposition of DPGs in supporting open societies by advancing more equitable systems and by codifying rights. The paper then considers the persistent challenges to more fully realizing this opportunity and offers some recommendations for how to address these challenges…(More)”.

The New Moral Mathematics


Book Review by Kieran Setiya: “Space is big,” wrote Douglas Adams in The Hitchhiker’s Guide to the Galaxy (1979). “You just won’t believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it’s a long way down the road to the chemist’s, but that’s just peanuts to space.”

What we do now affects future people in dramatic ways—above all, whether they will exist at all.

Time is big, too—even if we just think on the timescale of a species. We’ve been around for approximately 300,000 years. There are now about 8 billion of us, roughly 7 percent of all humans who have ever lived. You may think that’s a lot, but it’s just peanuts to the future. If we survive for another million years—the longevity of a typical mammalian species—at even a tenth of our current population, there will be 8 trillion more of us. We’ll be outnumbered by future people on the scale of a thousand to one.
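The arithmetic behind these figures can be checked directly. A quick sketch follows; the 100-year average-lifespan figure is my assumption for illustration, not a number from the review:

```python
# Rough check of the review's population arithmetic.
current_population = 8e9                       # people alive today
future_population = current_population / 10    # "a tenth of our current population"
species_longevity_years = 1e6                  # typical mammalian species lifespan
assumed_lifespan_years = 100                   # assumed average lifespan (illustrative)

# Number of complete population turnovers over the species' remaining lifespan.
generations = species_longevity_years / assumed_lifespan_years  # 10,000

future_people = future_population * generations   # 8 trillion
ratio = future_people / current_population        # future people per person alive now

print(f"{future_people:.0e} future people, {ratio:.0f}:1 ratio")
```

Under these assumptions the sketch reproduces both of the review’s numbers: 8 trillion future people, outnumbering the present population a thousand to one.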

What we do now affects those future people in dramatic ways: whether they will exist at all and in what numbers; what values they embrace; what sort of planet they inherit; what sorts of lives they lead. It’s as if we’re trapped on a tiny island while our actions determine the habitability of a vast continent and the life prospects of the many who may, or may not, inhabit it. What an awful responsibility.

This is the perspective of the “longtermist,” for whom the history of human life so far stands to the future of humanity as a trip to the chemist’s stands to a mission to Mars.

Oxford philosophers William MacAskill and Toby Ord, both affiliated with the university’s Future of Humanity Institute, coined the word “longtermism” five years ago. Their outlook draws on utilitarian thinking about morality. According to utilitarianism—a moral theory developed by Jeremy Bentham and John Stuart Mill in the late eighteenth and nineteenth centuries—we are morally required to maximize expected aggregate well-being, adding points for every moment of happiness, subtracting points for suffering, and discounting for probability. When you do this, you find that tiny chances of extinction swamp the moral mathematics. If you could save a million lives today or shave 0.0001 percent off the probability of premature human extinction—a one in a million chance of saving at least 8 trillion lives—you should do the latter, allowing a million people to die.
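The expected-value comparison in that example can be made explicit. A minimal sketch of the calculation the review describes:

```python
# Expected lives saved under the utilitarian calculus described above.
lives_saved_now = 1_000_000                # option A: save a million lives today
extinction_risk_reduction = 0.0001 / 100   # option B: shave 0.0001 percent -> 1e-6
future_lives_at_stake = 8e12               # "at least 8 trillion lives"

# Expected value of option B: probability shift times lives at stake.
expected_future_lives = extinction_risk_reduction * future_lives_at_stake

# The tiny probability shift swamps the certain benefit: 8,000,000 > 1,000,000.
print(expected_future_lives > lives_saved_now)
```

This is why, on the expected-value accounting, the one-in-a-million chance dominates: it is worth eight times as many expected lives as the certain rescue.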

Now, as many have noted since its origin, utilitarianism is a radically counterintuitive moral view. It tells us that we cannot give more weight to our own interests or the interests of those we love than the interests of perfect strangers. We must sacrifice everything for the greater good. Worse, it tells us that we should do so by any effective means: if we can shave 0.0001 percent off the probability of human extinction by killing a million people, we should—so long as there are no other adverse effects.

But even if you think we are allowed to prioritize ourselves and those we love, and not allowed to violate the rights of some in order to help others, shouldn’t you still care about the fate of strangers, even those who do not yet exist? The moral mathematics of aggregate well-being may not be the whole of ethics, but isn’t it a vital part? It belongs to the domain of morality we call “altruism” or “charity.” When we ask what we should do to benefit others, we can’t ignore the disquieting fact that the others who occupy the future may vastly outnumber those who occupy the present, and that their very existence depends on us.

From this point of view, it’s an urgent question how what we do today will affect the further future—urgent especially when it comes to what Nick Bostrom, the philosopher who directs the Future of Humanity Institute, calls the “existential risk” of human extinction. This is the question MacAskill takes up in his new book, What We Owe the Future, a densely researched but surprisingly light read that ranges from omnicidal pandemics to our new AI overlords without ever becoming bleak…(More)”.