Artificial Intelligence: Risks to Privacy and Democracy


Karl Manheim and Lyric Kaplan at Yale Journal of Law and Technology: “A “Democracy Index” is published annually by The Economist. For 2017, it reported that half of the world’s countries scored lower than the previous year. This included the United States, which was demoted from “full democracy” to “flawed democracy.” The principal factor was “erosion of confidence in government and public institutions.” Interference by Russia and voter manipulation by Cambridge Analytica in the 2016 presidential election played a large part in that public disaffection.

Threats of these kinds will continue, fueled by growing deployment of artificial intelligence (AI) tools to manipulate the preconditions and levers of democracy. Equally destructive is AI’s threat to decisional and informational privacy. AI is the engine behind Big Data Analytics and the Internet of Things. While conferring some consumer benefit, their principal function at present is to capture personal information, create detailed behavioral profiles and sell us goods and agendas. Privacy, anonymity and autonomy are the main casualties of AI’s ability to manipulate choices in economic and political decisions.

The way forward requires greater attention to these risks at the national level, and attendant regulation. In its absence, technology giants, all of whom are heavily investing in and profiting from AI, will dominate not only the public discourse, but also the future of our core values and democratic institutions….(More)”.

Lost and Saved . . . Again: The Moral Panic about the Loss of Community Takes Hold of Social Media


Keith N. Hampton and Barry Wellman in Contemporary Sociology: “Why does every generation believe that relationships were stronger and community better in the recent past? Lamenting the loss of community, based on a selective perception of the present and an idealization of “traditional community,” dims awareness of powerful inequalities and cleavages that have always pervaded human society and favors deterministic models over a nuanced understanding of how network affordances contribute to different outcomes. The bêtes noires have varied according to the moral panic of the times: industrialization, bureaucratization, urbanization, capitalism, socialism, and technological developments have all been tabbed by such diverse commentators as Thomas Jefferson (1784), Karl Marx (1852), Louis Wirth (1938), Maurice Stein (1960), Robert Bellah et al. (1996), and Tom Brokaw (1998). Each time, observers look back nostalgically to what they supposed were the supportive, solidary communities of the previous generation. Since the advent of the internet, the moral panickers have seized on this technology as the latest cause of lost community, pointing with alarm to what digital technologies are doing to relationships. As the focus shifts to social media and mobile devices, the panic seems particularly acute….

Taylor Dotson’s (2017) recent book Technically Together has a broader timeline for the demise of community. He sees it as happening around the time the internet was popularized, with community even worse off as a result of Facebook and mobile devices. Dotson blames not only new technologies for the decline of community, but also social theory, specifically the theory and practice of “networked individualism”: the relational turn from bounded, densely knit local groups to multiple, partial, often far-flung social networks (Rainie and Wellman 2012). Dotson takes the admirable position that social science should do more to imagine different outcomes, new technological possibilities that can be created by tossing aside the trends of today and engineering social change through design….

Some alarm at the recognition that the nature of community is changing as technologies change is sensible, and we have no quarrel with the collective desire to have better, more supportive friends, families, and communities. As Dotson implies, the maneuverability of having one’s own individually networked community can come at the cost of local group solidarity. Indeed, we have done more than pontificate about promoting local community: we have built community both on- and offline (Hampton 2011).

Yet part of the contemporary unease comes from a selective perception of the present and an idealization of other forms of community. There is nostalgia for a perfect form of community that never was. Longing for a time when the grass was ever greener dims awareness of the powerful stresses and cleavages that have always pervaded human society. And advocates, such as Dotson (2017), who suggest the need to save a particular type of community at the expense of another, often do so blind to the potential tradeoffs….(More)”

Constitutional democracy and technology in the age of artificial intelligence


Paper by Paul Nemitz: “Given the foreseeable pervasiveness of artificial intelligence (AI) in modern societies, it is legitimate and necessary to ask how this new technology must be shaped to support the maintenance and strengthening of constitutional democracy.

This paper first describes the four core elements of today’s digital power concentration, which need to be seen cumulatively and which, taken together, threaten both democracy and functioning markets. It then recalls the experience of the lawless Internet, the relationship between technology and law as it has developed in the Internet economy, and the experience with the GDPR, before moving on to the key question for AI in democracy: which of AI’s challenges can safely and in good conscience be left to ethics, and which need to be addressed by enforceable rules that carry the legitimacy of the democratic process, that is, by law.

The paper closes with a call for a new culture of incorporating the principles of democracy, rule of law and human rights by design in AI, and for a three-level technological impact assessment for new technologies like AI, as a practical way forward for this purpose….(More)”.

Information and Technology in Open Justice


Introduction by Mila Gasco-Hernandez and Carlos Jimenez-Gomez to the Special Issue of Social Science Computer Review: “The topic of open justice has been little explored, perhaps because it has traditionally been considered a “closed” field. There is still a need to know what open justice really means, to explore the use of information and technology in enabling open justice, and to understand what openness in the judiciary can do to improve government, society, and democracy. This special issue aims to shed light on the concept of openness in the judiciary by identifying and analyzing initiatives across the world….(More)”.

Regulating the Regulators: Tracing the Emergence of the Political Transparency Laws in Chile


Conference Paper by Bettina Schorr: “Due to high social inequalities and weak public institutions, political corruption and the influence of business elites on policy-makers are widespread in the Andean region. The consequences for the opportunities of sustainable development are serious: regulation limiting harmful business activities or (re-)distributive reforms are difficult to achieve and public resources often end up as private gains instead of serving development purposes.

Given international and domestic pressures, political corruption has reached the top of the political agendas in many countries. However, frequently transparency goals do not materialize into new binding policies or, when reforms are enacted, they suffer from severe implementation gaps.

The paper analyses transparency politics in Chile, where a series of reforms regarding political transparency have been implemented since 2014. Hence, Chile counts among the few successful cases in the region. By tracing the process that led to the emergence of new transparency policies in Chile, the paper elaborates an analytical framework for explaining institutional innovation in the case of political transparency. In particular, the study emphasizes the importance of civil society actors’ involvement throughout the policy cycle, particularly in the stages of formulation, implementation and evaluation….(More)”.

When Ostrom Meets Blockchain: Exploring the Potentials of Blockchain for Commons Governance


Paper by David Rozas et al.: “Blockchain technologies have generated excitement, yet their potential to enable new forms of governance remains largely unexplored. Two confronting standpoints dominate the emergent debate around blockchain-based governance: discourses characterised by the presence of techno-determinist and market-driven values, which tend to ignore the complexity of social organisation; and critical accounts of such discourses which, whilst contributing to identifying limitations, consider the role of traditional centralised institutions as inherently necessary to enable democratic forms of governance. The question therefore arises: can we build perspectives of blockchain-based governance that go beyond markets and states? In this article we draw on the Nobel laureate economist Elinor Ostrom’s principles for self-governance of communities to explore the transformative potential of blockchain.

We approach blockchain through the identification and conceptualisation of affordances that this technology may provide to communities. For each affordance, we carry out a detailed analysis situating each in the context of Ostrom’s principles, considering both the potentials of algorithmic governance and the importance of incorporating communities’ social practices. The relationships found between these affordances and Ostrom’s principles allow us to provide a perspective focussed on blockchain-based commons governance. By carrying out this analysis, we aim to expand the debate from one dominated by a culture of competition to one that promotes a culture of cooperation….(More)”.

The Janus Face of the Liberal Information Order


Paper by Henry Farrell and Abraham L. Newman: “…Domestically, policy-makers and scholars argued that information openness, like economic openness, would go hand-in-glove with political liberalization and the spread of democratic values. This was perhaps, in part, an accident of timing: the Internet – which seemed to many to be inherently resistant to censorship – burgeoned shortly after the collapse of Communism in the Soviet Union and Eastern Europe. Politicians celebrated the dawn of a new era of open communication, while scholars began to argue that the spread of the Internet would lead to the spread of democracy (Diamond 2010; Shirky 2008).

A second wave of literature suggested that Internet-based social media had played a crucial role in spreading freedom in the Arab Spring (Howard 2010; Hussain and Howard 2013). There were some skeptics who highlighted the vexed relationship between open networks and the closed national politics of autocracies (Goldsmith and Wu 2006), or who pointed out that the Internet was nowhere near as censorship-resistant as early optimists had supposed (Deibert et al. 2008). Even these pessimists seemed to believe that the Internet could bolster liberalism in healthy democracies, although it would by no means necessarily prevail over tyranny.

The international liberal order for information, however, finds itself increasingly on shaky ground. Non-democratic regimes ranging from China to Saudi Arabia have created domestic technological infrastructures, which undermine and provide an alternative to the core principles of the regime (Boas 2006; Deibert 2008).

The European Union, while still generally supportive of open communication and free speech, has grown skeptical of the regime’s focus on unfettered economic access and has used privacy and anti-trust policy to challenge its most neo-liberal elements (Newman 2008). Non-state actors like Wikileaks have relied on information openness as a channel of disruption and perhaps manipulation. 

More troubling are the arguments of a new literature – that open information flows are less a harbinger of democracy than a vector of attack…

How can IR scholars make sense of this Janus-faced quality of information? In this brief memo, we argue that much of the existing work on information technology and information flows suffers from two key deficiencies.

First – there has been an unhelpful separation between two important debates about information flows and liberalism. One – primarily focused on the international level – concerned global governance of information networks, examining how states (especially the US) arrived at and justified their policy stances, and how power dynamics shaped the battles between liberal and illiberal states over what the relevant governance arrangements should be (Klein 2002; Singh 2008; Mueller 2009). …

This leads to the second problem – that research has failed to appreciate the dynamics of contestation over time…(More)”

The Private Impact of Public Information: Landsat Satellite Maps and Gold Exploration


Paper by Abhishek Nagaraj: “The public sector provides many types of information, such as geographic and census maps, that firms use when making decisions. However, the economic implications of such information infrastructure remain unexamined.

This study estimates the impact of information from Landsat, a NASA satellite mapping program, on the discovery of new deposits by large and small firms in the gold exploration industry. Using a simple theoretical framework, I argue that public sector information guides firms on the viability of risky projects and increases the likelihood of project success.

This effect is especially relevant for smaller firms, who face higher project costs and are particularly deterred from engaging in risky projects. I test the predictions of this framework by exploiting idiosyncratic timing variation in Landsat coverage across regions. Landsat maps nearly doubled the rate of significant gold discoveries after a region was mapped and increased the market share of smaller, junior firms from about 10% to 25%.
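
The identification strategy is, in effect, a staggered difference-in-differences design. Below is a minimal sketch of the kind of two-way fixed-effects regression such timing variation supports; it runs on simulated data, and every variable name, date range, and magnitude is an illustrative assumption rather than the paper's actual specification or estimates.

```python
# A hedged sketch (not the paper's code): staggered difference-in-differences
# with region and year fixed effects, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical panel: regions observed over years, mapped in different years.
regions, years = range(50), range(1970, 1990)
panel = pd.DataFrame(
    [(r, y) for r in regions for y in years], columns=["region", "year"]
)
map_year = {r: rng.integers(1972, 1988) for r in regions}
panel["mapped"] = (panel["year"] >= panel["region"].map(map_year)).astype(int)

# Simulated outcome: discovery counts roughly double once a region is mapped.
base_rate = 0.05
panel["discoveries"] = rng.poisson(base_rate * (1 + panel["mapped"]))

# Two-way fixed effects: region and year dummies absorb level differences;
# the coefficient on `mapped` captures the post-mapping change in discoveries.
model = smf.ols("discoveries ~ mapped + C(region) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["region"]}
)
print(model.params["mapped"], model.bse["mapped"])
```

On real discovery data with region-specific mapping dates, the coefficient on `mapped` would play the role of the post-mapping effect the paper reports; the sketch only shows the mechanics of exploiting staggered coverage timing.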

Public information infrastructure, including mapping efforts, seems to be an important, yet overlooked, driver of private-sector productivity and small business performance…(More)”

The Global Commons of Data


Paper by Jennifer Shkabatur: “Data platform companies (such as Facebook, Google, or Twitter) amass and process immense amounts of data that is generated by their users. These companies primarily use the data to advance their commercial interests, but there is growing public dismay regarding the adverse and discriminatory impacts of their algorithms on society at large. The regulation of data platform companies and their algorithms has been hotly debated in the literature, but current approaches often neglect the value of data collection, defy the logic of algorithmic decision-making, and exceed the platform companies’ operational capacities.

This Article suggests a different approach — an open, collaborative, and incentives-based stance toward data platforms that takes full advantage of the tremendous societal value of user-generated data. It contends that this data shall be recognized as a “global commons,” and access to it shall be made available to a wide range of independent stakeholders — research institutions, journalists, public authorities, and international organizations. These external actors would be able to utilize the data to address a variety of public challenges, as well as observe from within the operation and impacts of the platforms’ algorithms.

After making the theoretical case for the “global commons of data,” the Article explores the practical implementation of this model. First, it argues that a data commons regime should operate through a spectrum of data sharing and usage modalities that would protect the commercial interests of data platforms and the privacy of data users. Second, it discusses regulatory measures and incentives that can solicit the collaboration of platform companies with the commons model. Lastly, it explores the challenges embedded in this approach….(More)”.

Beyond Open vs. Closed: Balancing Individual Privacy and Public Accountability in Data Sharing


Paper by Bill Howe et al: “Data too sensitive to be “open” for analysis and re-purposing typically remains “closed” as proprietary information. This dichotomy undermines efforts to make algorithmic systems more fair, transparent, and accountable. Access to proprietary data in particular is needed by government agencies to enforce policy, researchers to evaluate methods, and the public to hold agencies accountable; all of these needs must be met while preserving individual privacy and firm competitiveness. In this paper, we describe an integrated legal-technical approach provided by a third-party public-private data trust designed to balance these competing interests.

Basic membership allows firms and agencies to enable low-risk access to data for compliance reporting and core methods research, while modular data sharing agreements support a wide array of projects and use cases. Unless specifically stated otherwise in an agreement, all data access is initially provided to end users through customized synthetic datasets that offer a) strong privacy guarantees, b) removal of signals that could expose competitive advantage for the data providers, and c) removal of biases that could reinforce discriminatory policies, all while maintaining empirically good fidelity to the original data. We find that the liberal use of synthetic data, in conjunction with strong legal protections over raw data, strikes a tunable balance between transparency, proprietorship, privacy, and research objectives; and that the legal-technical framework we describe can form the basis for organizational data trusts in a variety of contexts….(More)”.
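
As a rough illustration of the synthetic-data idea (not the authors' pipeline), the sketch below releases a dataset sampled from Laplace-noised one-way marginals, a deliberately simple differential-privacy mechanism; the column names, the epsilon value, and the independence assumption across columns are illustrative simplifications, and a production data trust would use a far more sophisticated synthesizer.

```python
# A minimal sketch (assumed, not from the paper): release a synthetic dataset
# sampled from Laplace-noised one-way marginals instead of the raw records.
import numpy as np
import pandas as pd


def dp_synthetic(df: pd.DataFrame, epsilon: float, n_rows: int,
                 seed: int = 0) -> pd.DataFrame:
    """Sample each categorical column from a noisy histogram, splitting the
    privacy budget evenly across columns; sampling from the noisy histograms
    is post-processing, so the release stays within the overall epsilon."""
    rng = np.random.default_rng(seed)
    eps_per_col = epsilon / len(df.columns)
    synthetic = {}
    for col in df.columns:
        counts = df[col].value_counts()
        # A count histogram has sensitivity 1, so Laplace(1/eps) noise suffices.
        noisy = counts.to_numpy() + rng.laplace(scale=1.0 / eps_per_col,
                                                size=len(counts))
        probs = np.clip(noisy, 0, None)
        probs = probs / probs.sum()
        synthetic[col] = rng.choice(counts.index.to_numpy(),
                                    size=n_rows, p=probs)
    return pd.DataFrame(synthetic)


# Hypothetical "closed" data: trip records a ride-hail firm might report.
rng = np.random.default_rng(1)
raw = pd.DataFrame({
    "pickup_zone": rng.choice(["A", "B", "C"], 10_000, p=[0.5, 0.3, 0.2]),
    "hour": rng.choice(np.arange(24), 10_000),
})
release = dp_synthetic(raw, epsilon=1.0, n_rows=len(raw))
print(release["pickup_zone"].value_counts(normalize=True))
```

Because each column is sampled independently, cross-column correlations are lost; the point is only to show how a trust can hand end users something analyzable while keeping raw records behind the legal protections described above.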