Constitutional democracy and technology in the age of artificial intelligence


Paper by Paul Nemitz: “Given the foreseeable pervasiveness of artificial intelligence (AI) in modern societies, it is legitimate and necessary to ask the question how this new technology must be shaped to support the maintenance and strengthening of constitutional democracy.

This paper first describes the four core elements of today’s digital power concentration, which must be seen cumulatively: taken together, they are a threat both to democracy and to functioning markets. It then recalls the experience of the lawless Internet, the relationship between technology and law as it has developed in the Internet economy, and the experience with the GDPR, before turning to the key question for AI in democracy: which of the challenges of AI can safely and in good conscience be left to ethics, and which need to be addressed by rules that are enforceable and carry the legitimacy of the democratic process, that is, by law.

The paper closes with a call for a new culture of incorporating the principles of democracy, the rule of law and human rights by design into AI, and for a three-level technological impact assessment of new technologies such as AI as a practical way forward….(More)”.

Information and Technology in Open Justice


Introduction by Mila Gasco-Hernandez and Carlos Jimenez-Gomez to the Special Issue of Social Science Computer Review: “The topic of open justice has been little explored, perhaps because the judiciary has traditionally been considered a “closed” field. There is still a need to know what open justice really means, to explore the use of information and technology in enabling open justice, and to understand what openness in the judiciary can do to improve government, society, and democracy. This special issue aims to shed light on the concept of openness in the judiciary by identifying and analyzing initiatives across the world….(More)”.

Regulating the Regulators: Tracing the Emergence of the Political Transparency Laws in Chile


Conference Paper by Bettina Schorr: “Due to high social inequality and weak public institutions, political corruption and the influence of business elites on policy-makers are widespread in the Andean region. The consequences for sustainable development are serious: regulations limiting harmful business activities and (re-)distributive reforms are difficult to achieve, and public resources often end up as private gains instead of serving development purposes.

Given international and domestic pressures, political corruption has reached the top of the political agenda in many countries. Yet transparency goals frequently fail to materialize into new binding policies, or, when reforms are enacted, they suffer from severe implementation gaps.

The paper analyses transparency politics in Chile, where a series of political transparency reforms has been implemented since 2014; Chile thus counts among the few successful cases in the region. By tracing the process that led to the emergence of the new transparency policies, the paper elaborates an analytical framework for explaining institutional innovation in the area of political transparency. In particular, the study emphasizes the importance of civil society actors’ involvement throughout the policy cycle, particularly in the stages of formulation, implementation and evaluation….(More)”.

When Ostrom Meets Blockchain: Exploring the Potentials of Blockchain for Commons Governance


Paper by David Rozas et al.: “Blockchain technologies have generated excitement, yet their potential to enable new forms of governance remains largely unexplored. Two confronting standpoints dominate the emergent debate around blockchain-based governance: discourses characterised by techno-determinist and market-driven values, which tend to ignore the complexity of social organisation; and critical accounts of such discourses which, whilst contributing to identifying limitations, consider the role of traditional centralised institutions as inherently necessary to enable democratic forms of governance. The question therefore arises: can we build perspectives of blockchain-based governance that go beyond markets and states? In this article we draw on the Nobel laureate economist Elinor Ostrom’s principles for the self-governance of communities to explore the transformative potential of blockchain.

We approach blockchain through the identification and conceptualisation of affordances that this technology may provide to communities. For each affordance, we carry out a detailed analysis situating each in the context of Ostrom’s principles, considering both the potentials of algorithmic governance and the importance of incorporating communities’ social practices. The relationships found between these affordances and Ostrom’s principles allow us to provide a perspective focussed on blockchain-based commons governance. By carrying out this analysis, we aim to expand the debate from one dominated by a culture of competition to one that promotes a culture of cooperation….(More)”.

The Janus Face of the Liberal Information Order


Paper by Henry Farrell and Abraham L. Newman: “…Domestically, policy-makers and scholars argued that information openness, like economic openness, would go hand-in-glove with political liberalization and the spread of democratic values. This was perhaps, in part, an accident of timing: the Internet – which seemed to many to be inherently resistant to censorship – burgeoned shortly after the collapse of Communism in the Soviet Union and Eastern Europe. Politicians celebrated the dawn of a new era of open communication, while scholars began to argue that the spread of the Internet would lead to the spread of democracy (Diamond 2010; Shirky 2008).

A second wave of literature suggested that Internet-based social media had played a crucial role in spreading freedom in the Arab Spring (Howard 2010; Hussain and Howard 2013). There were some skeptics who highlighted the vexed relationship between open networks and the closed national politics of autocracies (Goldsmith and Wu 2006), or who pointed out that the Internet was nowhere near as censorship-resistant as early optimists had supposed (Deibert et al. 2008). Even these pessimists seemed to believe that the Internet could bolster liberalism in healthy democracies, although it would by no means necessarily prevail over tyranny.

The international liberal order for information, however, finds itself increasingly on shaky ground. Non-democratic regimes ranging from China to Saudi Arabia have created domestic technological infrastructures, which undermine and provide an alternative to the core principles of the regime (Boas 2006; Deibert 2008).

The European Union, while still generally supportive of open communication and free speech, has grown skeptical of the regime’s focus on unfettered economic access and has used privacy and anti-trust policy to challenge its most neo-liberal elements (Newman 2008). Non-state actors like Wikileaks have relied on information openness as a channel of disruption and perhaps manipulation. 

More troubling are the arguments of a new literature – that open information flows are less a harbinger of democracy than a vector of attack…

How can IR scholars make sense of this Janus-faced quality of information? In this brief memo, we argue that much of the existing work on information technology and information flows suffers from two key deficiencies.

First – there has been an unhelpful separation between two important debates about information flows and liberalism. One – primarily focused on the international level – concerned global governance of information networks, examining how states (especially the US) arrived at and justified their policy stances, and how power dynamics shaped the battles between liberal and illiberal states over what the relevant governance arrangements should be (Klein 2002; Singh 2008; Mueller 2009). …

This leads to the second problem – that research has failed to appreciate the dynamics of contestation over time…(More)”

The Private Impact of Public Information: Landsat Satellite Maps and Gold Exploration


Paper by Abhishek Nagaraj: “The public sector provides many types of information, such as geographic and census maps, that firms use when making decisions. However, the economic implications of such information infrastructure remain unexamined.

This study estimates the impact of information from Landsat, a NASA satellite mapping program, on the discovery of new deposits by large and small firms in the gold exploration industry. Using a simple theoretical framework, I argue that public sector information guides firms on the viability of risky projects and increases the likelihood of project success.

This effect is especially relevant for smaller firms, who face higher project costs and are particularly deterred from engaging in risky projects. I test the predictions of this framework by exploiting idiosyncratic timing variation in Landsat coverage across regions. Landsat maps nearly doubled the rate of significant gold discoveries after a region was mapped and increased the market share of smaller, junior firms from about 10% to 25%.

Public information infrastructure, including mapping efforts, seems to be an important yet overlooked driver of private-sector productivity and small business performance…(More)”
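
The identification strategy Nagaraj describes, exploiting the staggered timing of Landsat coverage across regions, is in essence a staggered difference-in-differences design with region and year fixed effects. The sketch below is not the paper's replication code; the variable names, magnitudes, and simulated data are assumptions made purely to illustrate the shape of such a specification.

```python
# Illustrative sketch only: a two-way fixed-effects Poisson regression of
# discoveries on a post-mapping indicator, mimicking the staggered-timing
# design described above. All data and names here are simulated assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for region in range(60):
    mapped_year = int(rng.integers(1974, 1990))   # idiosyncratic Landsat coverage timing
    for year in range(1972, 1996):
        post = int(year >= mapped_year)
        # discovery rate roughly doubles after mapping, echoing the paper's headline result
        rows.append({"region": region, "year": year, "post_mapped": post,
                     "discoveries": rng.poisson(0.5 * (1 + post))})
df = pd.DataFrame(rows)

# discoveries ~ post_mapped + region fixed effects + year fixed effects
fit = smf.poisson("discoveries ~ post_mapped + C(region) + C(year)", data=df).fit(disp=0)
print(fit.params["post_mapped"])   # close to log(2) ≈ 0.69 in this simulation
```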

The Global Commons of Data


Paper by Jennifer Shkabatur: “Data platform companies (such as Facebook, Google, or Twitter) amass and process immense amounts of data generated by their users. These companies primarily use the data to advance their commercial interests, but there is growing public dismay regarding the adverse and discriminatory impacts of their algorithms on society at large. The regulation of data platform companies and their algorithms has been hotly debated in the literature, but current approaches often neglect the value of data collection, defy the logic of algorithmic decision-making, and exceed the platform companies’ operational capacities.

This Article suggests a different approach — an open, collaborative, and incentives-based stance toward data platforms that takes full advantage of the tremendous societal value of user-generated data. It contends that this data shall be recognized as a “global commons,” and access to it shall be made available to a wide range of independent stakeholders — research institutions, journalists, public authorities, and international organizations. These external actors would be able to utilize the data to address a variety of public challenges, as well as observe from within the operation and impacts of the platforms’ algorithms.

After making the theoretical case for the “global commons of data,” the Article explores the practical implementation of this model. First, it argues that a data commons regime should operate through a spectrum of data sharing and usage modalities that would protect the commercial interests of data platforms and the privacy of data users. Second, it discusses regulatory measures and incentives that can solicit the collaboration of platform companies with the commons model. Lastly, it explores the challenges embedded in this approach….(More)”.

Beyond Open vs. Closed: Balancing Individual Privacy and Public Accountability in Data Sharing


Paper by Bill Howe et al: “Data too sensitive to be “open” for analysis and re-purposing typically remains “closed” as proprietary information. This dichotomy undermines efforts to make algorithmic systems more fair, transparent, and accountable. Access to proprietary data in particular is needed by government agencies to enforce policy, by researchers to evaluate methods, and by the public to hold agencies accountable; all of these needs must be met while preserving individual privacy and firm competitiveness. In this paper, we describe an integrated legal-technical approach provided by a third-party public-private data trust designed to balance these competing interests.

Basic membership allows firms and agencies to enable low-risk access to data for compliance reporting and core methods research, while modular data sharing agreements support a wide array of projects and use cases. Unless specifically stated otherwise in an agreement, all data access is initially provided to end users through customized synthetic datasets that offer a) strong privacy guarantees, b) removal of signals that could expose competitive advantage for the data providers, and c) removal of biases that could reinforce discriminatory policies, all while maintaining empirically good fidelity to the original data. We find that the liberal use of synthetic data, in conjunction with strong legal protections over raw data, strikes a tunable balance between transparency, proprietorship, privacy, and research objectives; and that the legal-technical framework we describe can form the basis for organizational data trusts in a variety of contexts….(More)”.
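
The abstract's claim that synthetic data can keep "empirically good fidelity" while offering strong privacy guarantees can be made concrete with a toy example. The sketch below shows one common approach, resampling from Laplace-noised marginals (a basic differential-privacy mechanism); it is an assumption-laden illustration, not the data trust's actual pipeline, and the column name and privacy budget are invented.

```python
# Minimal sketch of a privacy-preserving synthetic column: add Laplace noise to a
# histogram of the sensitive variable, then resample synthetic records from the
# noisy marginal. Real systems model joint distributions and tune the budget.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
raw = pd.DataFrame({"trip_minutes": rng.gamma(shape=2.0, scale=10.0, size=10_000)})

bins = np.linspace(0, 120, 25)
counts, edges = np.histogram(raw["trip_minutes"], bins=bins)

epsilon = 1.0                                        # privacy budget (assumed)
noisy = np.clip(counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape), 0, None)
probs = noisy / noisy.sum()

# Draw a synthetic column of the same size from the noisy marginal.
idx = rng.choice(len(probs), size=len(raw), p=probs)
synthetic = pd.DataFrame({"trip_minutes": rng.uniform(edges[idx], edges[idx + 1])})
print(synthetic.describe())                          # fidelity check against raw.describe()
```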

The Nail Finds a Hammer: Self-Sovereign Identity, Design Principles, and Property Rights in the Developing World


Report by Michael Graglia, Christopher Mellon and Tim Robustelli: “Our interest in identity systems was an inevitable outgrowth of our earlier work on blockchain-based land registries. Property registries, which at the simplest level are ledgers of who has which rights to which asset, require a very secure and reliable means of identifying both people and properties. In the course of investigating solutions to that problem, we began to appreciate the broader challenges of digital identity and its role in international development. And the more we learned about digital identity, the more convinced we became of the need for self-sovereign identity, or SSI. This model, and the underlying principles of identity which it incorporates, will be described in detail in this paper.

We believe that the great potential of SSI is that it can make identity in the digital world function more like identity in the physical world, in which every person has a unique and persistent identity which is represented to others by means of both their physical attributes and a collection of credentials attested to by various external sources of authority. These credentials are stored and controlled by the identity holder—typically in a wallet—and presented to different people for different reasons at the identity holder’s discretion. Crucially, the identity holder controls what information to present based on the environment, trust level, and type of interaction. Moreover, their fundamental identity persists even though the credentials by which it is represented may change over time.

The digital incarnation of this model has many benefits, including both greatly improved privacy and security, and the ability to create more trustworthy online spaces. Social media and news sites, for example, might limit participation to users with verified identities, excluding bots and impersonators.

The need for identification in the physical world varies based on location and social context. We expect to walk in relative anonymity down a busy city street, but will show a driver’s license to enter a bar, and both a driver’s license and a birth certificate to apply for a passport. There are different levels of ID and supporting documents required for each activity. But in each case, access to personal information is controlled by the user who may choose whether or not to share it.

Self-sovereign identity gives users complete control of their own identities and related personal data, which sits encrypted in distributed storage instead of being stored by a third party in a central database. In older, “federated identity” models, a single account—a Google account, for example—might be used to log in to a number of third-party sites, like news sites or social media platforms. But in this model a third party brokers all of these ID transactions, meaning that in exchange for the convenience of having to remember fewer passwords, the user must sacrifice a degree of privacy.

A real-world equivalent would be having to ask the state to share a copy of your driver’s license with the bar every time you wanted to prove that you were over the age of 21. SSI, in contrast, gives the user a portable, digital credential (like a driver’s license or some other document that proves your age), the authenticity of which can be securely validated via cryptography without the recipient having to check with the authority that issued it. This means that while the credential can be used to access many different sites and services, there is no third-party broker to track the services to which the user is authenticating. Furthermore, cryptographic techniques called “zero-knowledge proofs” (ZKPs) can be used to prove possession of a credential without revealing the credential itself. This makes it possible, for example, for users to prove that they are over the age of 21 without having to share their actual birth dates, which are both sensitive information and irrelevant to a binary, yes-or-no ID transaction….(More)”.
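
The verification flow described above, in which a relying party checks a credential's authenticity without contacting the issuer, reduces in its simplest form to an offline digital-signature check. The sketch below uses the Python `cryptography` library to show an issuer signing a derived claim ("over 21") that the holder can present instead of a birth date; the names are assumptions, and this is a toy illustration rather than a real SSI or zero-knowledge-proof implementation (production systems use verifiable-credential formats and ZKP schemes such as BBS+).

```python
# Toy sketch: an issuer (e.g. a DMV) signs a derived claim ("age_over_21") instead
# of the birth date itself; any verifier can check the signature offline with the
# issuer's public key, with no callback to the issuer and no birth date disclosed.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer generates a signing key pair once; the public key is published.
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

# The credential carries only the predicate the holder chooses to reveal.
credential = json.dumps({"subject": "did:example:alice", "age_over_21": True}).encode()
signature = issuer_key.sign(credential)

# The holder presents (credential, signature); the bar verifies offline.
try:
    issuer_pub.verify(signature, credential)
    print("Credential accepted: subject is over 21.")
except InvalidSignature:
    print("Credential rejected.")
```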

Why Doctors Hate Their Computers


Atul Gawande at the New Yorker: “….More than ninety per cent of American hospitals have been computerized during the past decade, and more than half of Americans have their health information in the Epic system. Seventy thousand employees of Partners HealthCare—spread across twelve hospitals and hundreds of clinics in New England—were going to have to adopt the new software. I was in the first wave of implementation, along with eighteen thousand other doctors, nurses, pharmacists, lab techs, administrators, and the like.

The surgeons at the training session ranged in age from thirty to seventy, I estimated—about sixty per cent male, and one hundred per cent irritated at having to be there instead of seeing patients. Our trainer looked younger than any of us, maybe a few years out of college, with an early-Justin Bieber wave cut, a blue button-down shirt, and chinos. Gazing out at his sullen audience, he seemed unperturbed. I learned during the next few sessions that each instructor had developed his or her own way of dealing with the hostile rabble. One was encouraging and parental, another unsmiling and efficient. Justin Bieber took the driver’s-ed approach: You don’t want to be here; I don’t want to be here; let’s just make the best of it.

I did fine with the initial exercises, like looking up patients’ names and emergency contacts. When it came to viewing test results, though, things got complicated. There was a column of thirteen tabs on the left side of my screen, crowded with nearly identical terms: “chart review,” “results review,” “review flowsheet.” We hadn’t even started learning how to enter information, and the fields revealed by each tab came with their own tools and nuances.

But I wasn’t worried. I’d spent my life absorbing changes in computer technology, and I knew that if I pushed through the learning curve I’d eventually be doing some pretty cool things. In 1978, when I was an eighth grader in Ohio, I built my own one-kilobyte computer from a mail-order kit, learned to program in BASIC, and was soon playing the arcade game Pong on our black-and-white television set. The next year, I got a Commodore 64 from RadioShack and became the first kid in my school to turn in a computer-printed essay (and, shortly thereafter, the first to ask for an extension “because the computer ate my homework”). As my Epic training began, I expected my patience to be rewarded in the same way.

My hospital had, over the years, computerized many records and processes, but the new system would give us one platform for doing almost everything health professionals needed—recording and communicating our medical observations, sending prescriptions to a patient’s pharmacy, ordering tests and scans, viewing results, scheduling surgery, sending insurance bills. With Epic, paper lab-order slips, vital-signs charts, and hospital-ward records would disappear. We’d be greener, faster, better.

But three years later I’ve come to feel that a system that promised to increase my mastery over my work has, instead, increased my work’s mastery over me. I’m not the only one. A 2016 study found that physicians spent about two hours doing computer work for every hour spent face to face with a patient—whatever the brand of medical software. In the examination room, physicians devoted half of their patient time facing the screen to do electronic tasks. And these tasks were spilling over after hours. The University of Wisconsin found that the average workday for its family physicians had grown to eleven and a half hours. The result has been epidemic levels of burnout among clinicians. Forty per cent screen positive for depression, and seven per cent report suicidal thinking—almost double the rate of the general working population.

Something’s gone terribly wrong. Doctors are among the most technology-avid people in society; computerization has simplified tasks in many industries. Yet somehow we’ve reached a point where people in the medical profession actively, viscerally, volubly hate their computers….(More)”.