Measuring Partial Democracies: Rules and their Implementation


Paper by Debarati Basu, Shabana Mitra & Archana Purohit: “This paper proposes a new index that focuses on capturing the extent of democracy in a country using not only the existence of rules but also the extent of their implementation. The measure, based on the axiomatically robust framework of Alkire and Foster (J Public Econ 95:476–487, 2011), is able to moderate the existence of democratic rules by their actual implementation. By doing this we have a meaningful way of capturing the notion of a partial democracy within a continuum between non-democratic and democratic, separating out situations in which the rules exist but are not implemented well. We construct our index using V-Dem data from 1900 to 2010 for over 100 countries to measure the process of democratization across the world. Our results show that we can track progress in democratization even when the regime remains either a democracy or an autocracy. This is the notion of partial democracy that our implementation-based index measures through a broad-based index that is consistent, replicable, extendable, easy to interpret, and more nuanced in its ability to capture the essence of democracy…(More)”.
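The Alkire–Foster framework the paper builds on uses a dual-cutoff counting approach: a within-dimension cutoff identifies shortfalls, and a cross-dimension cutoff determines which units count toward the aggregate. A minimal sketch of how rule existence might be moderated by implementation under that approach — dimension values, cutoffs, and weights here are illustrative assumptions, not the authors' actual specification:

```python
# Hypothetical Alkire-Foster-style (2011) dual-cutoff sketch, adapted to the
# paper's idea of moderating a rule's existence by its implementation.
# Cutoffs and equal dimension weights are illustrative assumptions.

def af_democracy_index(countries, cutoff=0.5, k=0.5):
    """countries: {name: [(rule_exists 0/1, implementation in [0,1]), ...]}.
    Returns an adjusted headcount-style score M0 = H * A."""
    n_dims = len(next(iter(countries.values())))
    shortfall_shares = []
    for scores in countries.values():
        # Moderate each rule's existence by how well it is implemented.
        achievements = [exists * impl for exists, impl in scores]
        # First cutoff: a country is "deprived" in a dimension below it.
        deprived = [a < cutoff for a in achievements]
        share = sum(deprived) / n_dims
        # Second (cross-dimension) cutoff: only countries deprived in at
        # least a share k of dimensions count toward the aggregate.
        shortfall_shares.append(share if share >= k else 0.0)
    poor = [s for s in shortfall_shares if s > 0]
    H = len(poor) / len(countries)             # headcount ratio
    A = sum(poor) / len(poor) if poor else 0.0 # avg deprivation share
    return H * A

example = {
    "A": [(1, 0.9), (1, 0.8), (1, 0.7)],  # rules exist, well implemented
    "B": [(1, 0.2), (1, 0.1), (0, 0.0)],  # rules exist largely on paper
}
print(af_democracy_index(example))  # country B alone drives the score
```

The point of the sketch is the moderation step: a country whose rules exist but score low on implementation is treated like one lacking the rules, which is how a "partial democracy" becomes visible on the continuum.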

Federated machine learning in data-protection-compliant research


Paper by Alissa Brauneck et al.: “In recent years, interest in machine learning (ML), as well as in multi-institutional collaborations, has grown, especially in the medical field. However, strict application of data-protection laws reduces the size of training datasets, hurts the performance of ML systems and, in the worst case, can prevent the implementation of research insights in clinical practice. Federated learning can help overcome this bottleneck through decentralised training of ML models within the local data environment, while maintaining the predictive performance of ‘classical’ ML. Thus, federated learning provides immense benefits for cross-institutional collaboration by avoiding the sharing of sensitive personal data. Because existing regulations (especially the General Data Protection Regulation 2016/679 of the European Union, or GDPR) set stringent requirements for medical data and rather vague rules for ML systems, researchers are faced with uncertainty. In this comment, we provide recommendations for researchers who intend to use federated learning, a privacy-preserving ML technique, in their research. We also point to areas where regulations are lacking, discussing some fundamental conceptual problems with ML regulation through the GDPR, related especially to notions of transparency, fairness and error-free data. We then provide an outlook on how implications from data-protection laws can be directly incorporated into federated learning tools…(More)”.
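The core mechanism the abstract describes — each institution trains on its own data and only model parameters ever leave the site — can be illustrated with a minimal federated-averaging sketch. The toy linear model, learning rates, and two-site setup are assumptions for illustration, not the paper's method:

```python
# Minimal FedAvg-style sketch: each site trains locally on its own records
# and only model parameters (never raw data) are shared and averaged.
# Toy linear model y = w*x + b, trained library-free for clarity.

def local_update(weights, data, lr=0.1, epochs=5):
    """One site's local training on private (x, y) pairs; returns new weights."""
    w, b = weights
    for _ in range(epochs):
        for x, y in data:
            grad = (w * x + b) - y   # squared-error gradient at this point
            w -= lr * grad * x
            b -= lr * grad
    return (w, b)

def federated_average(updates, sizes):
    """Server step: average site weights, weighted by local dataset size."""
    total = sum(sizes)
    w = sum(u[0] * n for u, n in zip(updates, sizes)) / total
    b = sum(u[1] * n for u, n in zip(updates, sizes)) / total
    return (w, b)

# Two "hospitals" whose private data both follow y = 2x + 1.
site_a = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
site_b = [(3.0, 7.0), (4.0, 9.0)]
global_model = (0.0, 0.0)
for _ in range(50):  # communication rounds
    updates = [local_update(global_model, d) for d in (site_a, site_b)]
    global_model = federated_average(updates, [len(site_a), len(site_b)])
print(global_model)  # approaches (2.0, 1.0) without pooling raw records
```

Only the `(w, b)` pairs cross institutional boundaries, which is what lets the approach sidestep the data-sharing bottleneck the authors describe while approximating centrally trained performance.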

Work and meaning in the age of AI


Report by Daniel Susskind: “It is often said that work is not only a source of income but also of meaning. In this paper, I explore the theoretical and empirical literature that addresses this relationship between work and meaning. I show that the relationship is far less clear than is commonly supposed: There is great heterogeneity in its nature, both among today’s workers and across workers over time. I explain why this relationship matters for policymakers and economists concerned about the impact of technology on work. In the short term, it is important for predicting labour market outcomes of interest. It also matters for understanding how artificial intelligence (AI) affects not only the quantity of work but its quality as well: These new technologies may erode the meaning that people get from their work. In the medium term, if jobs are lost, this relationship also matters for designing bold policy interventions like the ‘Universal Basic Income’ and ‘Job Guarantee Schemes’: Their design, and any choice between them, is heavily dependent on policymakers’—often tacit—assumptions about the nature of this underlying relationship between work and meaning. For instance, policymakers must decide whether to simply focus on replacing lost income alone (as with a Universal Basic Income) or, if they believe that work is an important and non-substitutable source of meaning, on protecting jobs for that additional role as well (as with a Job Guarantee Scheme). In closing, I explore the challenge that the age of AI presents for an important feature of liberal political theory: the idea of ‘neutrality’…(More)”

Ready, set, share: Researchers brace for new data-sharing rules


Jocelyn Kaiser and Jeffrey Brainard in Science: “…By 2025, new U.S. requirements for data sharing will extend beyond biomedical research to encompass researchers across all scientific disciplines who receive federal research funding. Some funders in the European Union and China have also enacted data-sharing requirements. The new U.S. moves are feeding hopes that a worldwide movement toward increased sharing is in the offing. Supporters think it could speed the pace and reliability of science.

Some scientists may only need to make a few adjustments to comply with the policies. That’s because data sharing is already common in fields such as protein crystallography and astronomy. But in other fields the task could be weighty, because sharing is often an afterthought. For example, a study involving 7750 medical research papers found that just 9% of those published from 2015 to 2020 promised to make their data publicly available, and the authors of just 3% actually shared them, says lead author Daniel Hamilton of the University of Melbourne, who described the finding at the International Congress on Peer Review and Scientific Publication in September 2022. Even when authors promise to share their data, they often fail to follow through. A study published in PLOS ONE in 2020 examined 21,000 journal articles that included data-sharing plans and found that fewer than 21% provided links to the repository storing the data.

Journals and funders, too, have a mixed record when it comes to supporting data sharing. Research presented at the September 2022 peer-review congress found only about half of the 110 largest public, corporate, and philanthropic funders of health research around the world recommend or require grantees to share data…

“Health research is the field where the ethical obligation to share data is the highest,” says Aidan Tan, a clinician-researcher at the University of Sydney who led the study. “People volunteer in clinical trials and put themselves at risk to advance medical research and ultimately improve human health.”

Across many fields of science, researchers’ support for sharing data has increased during the past decade, surveys show. But given the potential cost and complexity, many are apprehensive about the NIH policy and the other requirements to come. “How we get there is pretty messy right now,” says Parker Antin, a developmental biologist and associate vice president for research at the University of Arizona. “I’m really not sure whether the total return will justify the cost. But I don’t know of any other way to find out than trying to do it.”

Science offers this guide as researchers prepare to plunge in….(More)”.

The State of Open Data Policy Repository


The State of Open Data Policy Repository is a collection of recent policy developments surrounding open data, data reuse, and data collaboration around the world. 

A refinement of the compilation of policies launched at the Open Data Policy Summit last year, the State of Open Data Policy Online Repository is an interactive resource that tracks recent legislation, directives, and proposals affecting open data and data collaboration. It captures which data collaboration issues policymakers are currently focused on and where the momentum for data innovation is heading in countries around the world.

Users can filter policies by region, country, focus, and type of data sharing. The review has so far surfaced approximately 60 examples of recent legislative acts, proposals, directives, and other policy documents, from which the Open Data Policy Lab draws findings about the need to promote more innovative policy frameworks.

This collection shows that, despite increased interest in the third-wave conception of open data, policy development remains nascent. It is primarily concerned with open data repositories at the expense of alternative forms of collaboration. Most policies listed focus on releasing government data, and most nations still lack open data rules or a method for putting such policies in place.

This work reveals a pressing need for institutions to create frameworks that can direct data professionals since there are worries that inaction may both allow for misuse of data and lead to missed chances to use data…(More)”.

Computational Social Science for the Public Good: Towards a Taxonomy of Governance and Policy Challenges


Chapter by Stefaan G. Verhulst: “Computational Social Science (CSS) has grown exponentially as the process of datafication and computation has increased. This expansion, however, is yet to translate into effective actions to strengthen public good in the form of policy insights and interventions. This chapter presents 20 limiting factors in how data is accessed and analysed in the field of CSS. The challenges are grouped into the following six categories based on their area of direct impact: Data Ecosystem, Data Governance, Research Design, Computational Structures and Processes, the Scientific Ecosystem, and Societal Impact. Through this chapter, we seek to construct a taxonomy of CSS governance and policy challenges. By first identifying the problems, we can then move to effectively address them through research, funding, and governance agendas that drive stronger outcomes…(More)”. Full Book: Handbook of Computational Social Science for Policy

Automating Immigration and Asylum: The Uses of New Technologies in Migration and Asylum Governance in Europe


Report by Derya Ozkul: “The EU’s Artificial Intelligence Act proposal categorises AI uses for immigration, asylum and border control as high risk, but new technologies are already used in many aspects of migration and asylum ‘management’ beyond imagination. To be able to reflect on the AI Act proposal, we first need to understand what the current uses are, but this information is not always publicly available.

The new report by the Algorithmic Fairness for Asylum Seekers and Refugees (AFAR) project shows the multitude of uses of new technologies across Europe at the national and the EU levels. In particular, the report explores in detail the use of forecasting tools, risk assessment and triaging systems, processing of short- and long-term residency and citizenship applications, document verification, speech and dialect recognition, distribution of welfare benefits, matching tools, mobile phone data extraction and electronic monitoring, across Europe. It highlights the need for transparency and thorough training of decision-makers, as well as the inclusion of migrants’ interests in the design, decision, and implementation stages…(More)”.

The Health of Democracies During the Pandemic: Results from a Randomized Survey Experiment


Paper by Marcella Alsan et al.: “Concerns have been raised about the “demise of democracy”, possibly accelerated by pandemic-related restrictions. Using a survey experiment involving 8,206 respondents from five Western democracies, we find that subjects randomly exposed to information regarding civil liberties infringements undertaken by China and South Korea to contain COVID-19 became less willing to sacrifice rights and more worried about their long-term erosion. However, our treatment did not increase support for democratic procedures more generally, despite our prior evidence that pandemic-related health risks diminished such support. These results suggest that the start of the COVID-19 crisis was a particularly vulnerable time for democracies…(More)”.

Global Renewables Watch


About: “The Global Renewables Watch is a first-of-its-kind living atlas intended to map and measure all utility-scale solar and wind installations on Earth using artificial intelligence (AI) and satellite imagery, allowing users to evaluate progress in the clean energy transition and track trends over time. It also provides unique spatial data on land-use trends to help achieve the dual aims of environmental protection and increasing renewable energy capacity…(More)”

The Smartness Mandate


Book by Orit Halpern and Robert Mitchell: “Smart phones. Smart cars. Smart homes. Smart cities. The imperative to make our world ever smarter in the face of increasingly complex challenges raises several questions: What is this “smartness mandate”? How has it emerged, and what does it say about our evolving way of understanding—and managing—reality? How have we come to see the planet and its denizens first and foremost as data-collecting instruments?

In The Smartness Mandate, Orit Halpern and Robert Mitchell radically suggest that “smartness” is not primarily a technology, but rather an epistemology. Through this lens, they offer a critical exploration of the practices, technologies, and subjects that such an understanding relies upon—above all, artificial intelligence and machine learning. The authors approach these not simply as techniques for solving problems of calculation, but rather as modes of managing life (human and other) in terms of neo-Darwinian evolution, distributed intelligences, and “resilience,” all of which have serious implications for society, politics, and the environment.

The smartness mandate constitutes a new form of planetary governance, and Halpern and Mitchell aim to map the logic of this seemingly inexorable and now naturalized demand to compute, illuminate the genealogy of how we arrived here, and point to alternative imaginaries of the possibilities and potentials of smart technologies and infrastructures…(More)”.