Essay by Peter Strauss: “This essay has been written to set the context for a future issue of Daedalus, the quarterly of the American Academy of Arts and Sciences, addressing the prospects of American administrative law in the Twenty-first Century. It recounts the growth of American government over the centuries since its founding, in response to the profound changes in the technology, economy, and scientific understandings it must deal with, under a Constitution written for the governance of a dispersed agrarian population operating with hand tools in a localized economy. It then suggests profound challenges of the present day facing administrative law’s development: the transition from processes of the paper age to those of the digital age; the steadily growing centralization of decision in an opaque, political presidency, displacing the focused knowledge and expertise of agencies Congress created to pursue particular governmental ends; the thickening, as well, of the political layer within agencies themselves, threatening similar displacements; and the revival in the courts of highly formalized analytic techniques inviting a return to the forms of government those who wrote the Constitution might themselves have imagined. The essay will not be published until months after the November election. While President Trump’s first term in office has sharply illustrated an imbalance in American governance between politics and law, reason and unreason, that imbalance is hardly new; it has been growing for decades. There lie the challenges….(More)”
Essay by Nathan J. Robinson: “…This means that a lot of the most vital information will end up locked behind the paywall. And while I am not much of a New Yorker fan either, it’s concerning that the Hoover Institution will freely give you Richard Epstein’s infamous article downplaying the threat of coronavirus, but Isaac Chotiner’s interview demolishing Epstein requires a monthly subscription, meaning that the lie is more accessible than its refutation. Eric Levitz of New York is one of the best and most prolific left political commentators we have. But unless you’re a subscriber to New York, you won’t get to hear much of what he has to say each month.
Possibly even worse is the fact that so much academic writing is kept behind vastly more costly paywalls. A white supremacist on YouTube will tell you all about race and IQ but if you want to read a careful scholarly refutation, obtaining a legal PDF from the journal publisher would cost you $14.95, a price nobody in their right mind would pay for one article if they can’t get institutional access. (I recently gave up on trying to access a scholarly article because I could not find a way to get it for less than $39.95, though in that case the article was garbage rather than gold.) Academic publishing is a nightmarish patchwork, with lots of articles advertised at exorbitant fees on one site, and then for free on another, or accessible only through certain databases, which your university or public library may or may not have access to. (Libraries have to budget carefully because subscription prices are often nuts. A library subscription to the Journal of Coordination Chemistry, for instance, costs $11,367 annually.)
Of course, people can find their ways around paywalls. Sci-Hub is a completely illegal but extremely convenient means of obtaining academic research for free. (I am purely describing it, not advocating it.) You can find a free version of the article debunking race and IQ myths on ResearchGate, a site that has engaged in mass copyright infringement in order to make research accessible. Often, because journal publishers tightly control access to their copyrighted work in order to charge those exorbitant fees for PDFs, the versions of articles that you can get for free are drafts that have not yet gone through peer review, and have thus been subjected to less scrutiny. This means that the more reliable an article is, the less accessible it is. On the other hand, pseudo-scholarship is easy to find. Right-wing think tanks like the Cato Institute, the Foundation for Economic Education, the Hoover Institution, the Mackinac Center, the American Enterprise Institute, and the Heritage Foundation pump out slickly-produced policy documents on every subject under the sun. They are utterly untrustworthy—the conclusion is always going to be “let the free market handle the problem,” no matter what the problem or what the facts of the case. But it is often dressed up to look sober-minded and non-ideological.
It’s not easy or cheap to be an “independent researcher.” When I was writing my first book, Superpredator, I wanted to look through newspaper, magazine, and journal archives to find everything I could about Bill Clinton’s record on race. I was lucky I had a university affiliation, because this gave me access to databases like LexisNexis. If I hadn’t, the cost of finding out what I wanted to find out would likely have run into the thousands of dollars.
A problem beyond cost, though, is convenience. I find that even when I am doing research through databases and my university library, it is often an absolute mess: the sites are clunky and constantly demanding login credentials. The amount of time wasted in figuring out how to obtain a piece of research material is a massive cost on top of the actual pricing. The federal court document database, PACER, for instance, charges 10 cents a page for access to records, which adds up quickly since legal research often involves looking through thousands of pages. They offer an exemption if you are a researcher or can’t afford it, but to get the exemption you have to fill out a three-page form and provide an explanation of both why you need each document and why you deserve the exemption. This is a waste of time that inhibits people’s productivity and limits their access to knowledge.
In fact, to see just how much human potential is being squandered by having knowledge dispensed by the “free market,” let us briefly picture what “totally democratic and accessible knowledge” would look like…(More)”.
Paper by Michael Schudson: “Transparency” has become a widely recognized, even taken for granted, value in contemporary democracies, but this has been true only since the 1970s. For all of the obvious virtues of transparency for democracy, they have not always been recognized or they have been recognized, as in the U.S. Freedom of Information Act of 1966, with significant qualifications. This essay catalogs important shortcomings of transparency for democracy, as when it clashes with national security, personal privacy, and the importance of maintaining the capacity of government officials to talk frankly with one another without fear that half-formulated ideas, thoughts, and proposals will become public. And when government information becomes public, that does not make it equally available to all—publicity is not in itself democratic, as public information (as in open legislative committee hearings) is more readily accessed by empowered groups with lobbyists able to attend and monitor the provision of the information. Transparency is an element in democratic government, but it is by no means a perfect emblem of democracy….(More)”.
On-Line Exhibition by the Glass Room: “…In this exhibition – aimed at young people as well as adults – we explore how social media and the web have changed the way we read information and react to it. Learn why finding “fake news” is not as easy as it sounds, and how the term “fake news” is as much a problem as the news it describes. Dive into the world of deep fakes, which are now so realistic that they are virtually impossible to detect. And find out why social media platforms are designed to keep us hooked, and how they can be used to change our minds. You can also read our free Data Detox Kit, which reveals how to tell facts from fiction and why it benefits everyone around us when we take a little more care about what we share…(More)”.
MIT Open Learning: “Can you recognize a digitally manipulated video when you see one? It’s harder than most people realize. As the technology to produce realistic “deepfakes” becomes more easily available, distinguishing fact from fiction will only get more challenging. A new digital storytelling project from MIT’s Center for Advanced Virtuality aims to educate the public about the world of deepfakes with “In Event of Moon Disaster.”
This provocative website showcases a “complete” deepfake (manipulated audio and video) of U.S. President Richard M. Nixon delivering the real contingency speech written in 1969 for a scenario in which the Apollo 11 crew were unable to return from the moon. The team worked with a voice actor and a company called Respeecher to produce the synthetic speech using deep learning techniques. They also worked with the company Canny AI to use video dialogue replacement techniques to study and replicate the movement of Nixon’s mouth and lips. Through these sophisticated AI and machine learning technologies, the seven-minute film shows how thoroughly convincing deepfakes can be….
Alongside the film, moondisaster.org features an array of interactive and educational resources on deepfakes. Led by Panetta and Halsey Burgund, a fellow at MIT Open Documentary Lab, an interdisciplinary team of artists, journalists, filmmakers, designers, and computer scientists has created a robust, interactive resource site where educators and media consumers can deepen their understanding of deepfakes: how they are made and how they work; their potential use and misuse; what is being done to combat deepfakes; and teaching and learning resources….(More)”.
Paper by Nardine Alnemr: “Challenges in attaining deliberative democratic ideals – such as inclusion, authenticity and consequentiality – in wider political systems have driven the development of artificially-designed citizen deliberation. These designed deliberations, however, are expert-driven. Whereas they may achieve ‘deliberativeness’, their design and implementation are undemocratic and limit deliberative democracy’s emancipatory goals. This is particularly relevant to the role of facilitation. In online deliberation, algorithms and artificial actors replace the central role of human facilitators. The detachment of such designed settings from wider contexts is particularly troubling from a democratic perspective. Digital technologies in online deliberation are not developed in a manner consistent with democratic ideals and are not amenable to scrutiny by citizens. I discuss the theoretical and the practical blind spots of algorithmic facilitation. Based on these, I present recommendations to democratise the design and implementation of online deliberation with a focus on chatbots as facilitators….(More)”.
Book by Volker Boehme-Neßler: “This book argues that in the digital era, a reinvention of democracy is urgently necessary. It discusses the mounting evidence showing that digitalisation is pushing classical parliamentary democracy to its limits, offering examples such as how living in a filter bubble and debating with political bots is profoundly changing democratic communication, making it more emotional, hysterical even, and less rational. It also explores how classical democracy involves long, slow thinking and decision processes, which don’t fit to the ever-increasing speed of the digital world, and examines the technical developments some fear will lead to governance by algorithms. In the digitalised world, democracy no longer functions as it has in the past. This does not mean waving goodbye to democracy – instead we need to reinvent it. How this could work is the central theme of this book….(More)”.
Paper by Tina Eliassi-Rad et al: “Political scientists have conventionally assumed that achieving democracy is a one-way ratchet. Only very recently has the question of “democratic backsliding” attracted any research attention. We argue that democratic instability is best understood with tools from complexity science. The explanatory power of complexity science arises from several features of complex systems. Their relevance in the context of democracy is discussed. Several policy recommendations are offered to help (re)stabilize current systems of representative democracy…(More)”.
Book by David Stasavage: “Historical accounts of democracy’s rise tend to focus on ancient Greece and pre-Renaissance Europe. The Decline and Rise of Democracy draws from global evidence to show that the story is much richer—democratic practices were present in many places, at many other times, from the Americas before European conquest, to ancient Mesopotamia, to precolonial Africa. Delving into the prevalence of early democracy throughout the world, David Stasavage makes the case that understanding how and where these democracies flourished—and when and why they declined—can provide crucial information not just about the history of governance, but also about the ways modern democracies work and where they could manifest in the future.
Drawing from examples spanning several millennia, Stasavage first considers why states developed either democratic or autocratic styles of governance and argues that early democracy tended to develop in small places with a weak state and, counterintuitively, simple technologies. When central state institutions (such as a tax bureaucracy) were absent—as in medieval Europe—rulers needed consent from their populace to govern. When central institutions were strong—as in China or the Middle East—consent was less necessary and autocracy more likely. He then explores the transition from early to modern democracy, which first took shape in England and then the United States, illustrating that modern democracy arose as an effort to combine popular control with a strong state over a large territory. Democracy has been an experiment that has unfolded over time and across the world—and its transformation is ongoing.
Amidst rising democratic anxieties, The Decline and Rise of Democracy widens the historical lens on the growth of political institutions and offers surprising lessons for all who care about governance….(More)”.
Report by the Select Committee on Democracy and Digital Technologies (UK Parliament): “Democracy faces a daunting new challenge. The age in which electoral activity was conducted through traditional print media, canvassing and door knocking is rapidly vanishing. Instead it is dominated by digital and social media. They are now the source from which voters get most of their information and political messaging.
The digital and social media landscape is dominated by two behemoths–Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics. This has become acutely obvious in the current COVID-19 pandemic where online misinformation poses not only a real and present danger to our democracy but also to our lives. Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.
Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society. If this is done well, in the ways we spell out in this Report, technology can become a servant of democracy rather than its enemy. There is a need for Government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents.
The Government’s Online Harms programme presents a significant first step towards this goal. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.
Well-drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their service or where it is posted by users with a large audience, they should face sanctions over their output as other broadcasters do.
Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.
Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind ‘black box’ algorithms which choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits to show how in creating these algorithms they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire the information from platforms they need to exercise those powers….(More)”.