John Timmer at Ars Technica: “…a small international team of researchers… decided to take a look at how a group of US residents decided on which news to share. Their results suggest that some of the standard factors that people point to when explaining the tsunami of misinformation—inability to evaluate information and partisan biases—aren’t having as much influence as most of us think. Instead, a lot of the blame gets directed at people just not paying careful attention.
The researchers ran a number of fairly similar experiments to get at the details of misinformation sharing. This involved panels of US-based participants recruited either through Mechanical Turk or via a survey population that provided a more representative sample of the US. Each panel had several hundred to over 1,000 individuals, and the results were consistent across different experiments, so there was a degree of reproducibility to the data.
To do the experiments, the researchers gathered a set of headlines and lead sentences from news stories that had been shared on social media. The set was evenly mixed between headlines that were clearly true and clearly false, and each of these categories was split again between those headlines that favored Democrats and those that favored Republicans.
One thing that was clear is that people are generally capable of judging the accuracy of the headlines. There was a 56 percentage point gap between how often an accurate headline was rated as true and how often a false headline was. People aren’t perfect—they still got things wrong fairly often—but they’re clearly quite a bit better at this than they’re given credit for.
The second thing is that ideology doesn’t really seem to be a major factor in driving judgements on whether a headline was accurate. People were more likely to rate headlines that agreed with their politics as accurate, but the difference here was only 10 percentage points. That’s significant (both societally and statistically), but it’s certainly not a large enough gap to explain the flood of misinformation.
But when the same people were asked whether they’d share these same stories, politics played a big role, and the truth receded. The difference in intention to share between true and false headlines was only 6 percentage points. Meanwhile, whether or not a headline agreed with a person’s politics produced a 20 percentage point gap. Putting it in concrete terms, the authors look at the false headline “Over 500 ‘Migrant Caravaners’ Arrested With Suicide Vests.” Only 16 percent of conservatives in the survey population rated it as true. But over half of them were amenable to sharing it on social media….(More)”.
Book edited by Michael P. Lynch, Jeremy Wyatt, Junyeol Kim and Nathan Kellen: “The question “What is truth?” is so philosophical that it can seem rhetorical. Yet truth matters, especially in a “post-truth” society in which lies are tolerated and facts are ignored. If we want to understand why truth matters, we first need to understand what it is. The Nature of Truth offers the definitive collection of classic and contemporary essays on analytic theories of truth. This second edition has been extensively revised and updated, incorporating both historically central readings on truth’s nature and up-to-the-moment contemporary essays. Seventeen new chapters reflect the current trajectory of research on truth.
Highlights include new essays by Ruth Millikan and Gila Sher on correspondence theories; a new essay on Peirce’s theory by Cheryl Misak; seven new essays on deflationism, laying out both theories and critiques; a new essay by Jamin Asay on primitivist theories; and a new defense by Kevin Scharp of his replacement theory, coupled with a probing critique of replacement theories by Alexis Burgess. Classic essays include selections by J. L. Austin, Donald Davidson, William James, W. V. O. Quine, and Alfred Tarski….(More)”.
Book by Clare Birchall: “When total data surveillance delimits agency and revelations of political wrongdoing fail to have consequences, is transparency the social panacea liberal democracies purport it to be? This book sets forth the provocative argument that progressive social goals would be better served by a radical form of secrecy, at least while state and corporate forces hold an asymmetrical advantage over the less powerful in data control. Clare Birchall asks: How might transparency actually serve agendas that are far from transparent? Can we imagine a secrecy that could act in the service of, rather than against, a progressive politics?
To move beyond atomizing calls for privacy and to interrupt the perennial tension between state security and the public’s right to know, Birchall adapts Édouard Glissant’s thinking to propose a digital “right to opacity.” As a crucial element of radical secrecy, she argues, this would eventually give rise to a “postsecret” society, offering an understanding and experience of the political that is free from the false choice between secrecy and transparency. She grounds her arresting story in case studies including the varied presidential styles of George W. Bush, Barack Obama, and Donald Trump; the Snowden revelations; conspiracy theories espoused or endorsed by Trump; WikiLeaks and guerrilla transparency; and the opening of the state through data portals.
Postsecrecy is the necessary condition for imagining, finally, an alternative vision of “the good,” of equality, as neither shaped by neoliberal incarnations of transparency nor undermined by secret state surveillance. Not least, postsecrecy reimagines collective resistance in the era of digital data….(More)”.
Study by Laura Edelson, Minh-Kha Nguyen, Ian Goldstein, Oana Goga, Tobias Lauinger, and Damon McCoy: “Facebook has become a major way people find news and information in an increasingly politically polarized nation. We analyzed how users interacted with different types of posts promoted as news in the lead-up to and aftermath of the U.S. 2020 elections. We found that politically extreme sources tend to generate more interactions from users. In particular, content from sources rated as far-right by independent news rating services consistently received the highest engagement per follower of any partisan group. Additionally, frequent purveyors of far-right misinformation had on average 65% more engagement per follower than other far-right pages. We found:
- Sources of news and information rated as far-right generate the highest average number of interactions per follower with their posts, followed by sources from the far-left, and then news sources closer to the center of the political spectrum.
- Within the far-right, misinformation sources far outperform non-misinformation sources. Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week, while non-misinformation sources had an average of 259 weekly interactions per thousand followers.
- Engagement with posts from far-right and far-left news sources peaked around Election Day and again on January 6, the day of the certification of the electoral count and the U.S. Capitol riot. For posts from all other political leanings of news sources, the increase in engagement was much less intense.
- Center and left partisan categories incur a misinformation penalty, while right-leaning sources do not. Center sources of misinformation, for example, performed about 70% worse than their non-misinformation counterparts. (Note: center sources of misinformation tend to be sites presenting as health news that have no obvious ideological orientation.)…(More)”.
Anne Applebaum and Peter Pomerantsev in The Atlantic: “…With the wholesale transfer of so much entertainment, social interaction, education, commerce, and politics from the real world to the virtual world—a process recently accelerated by the coronavirus pandemic—many Americans have come to live in a nightmarish inversion of the Tocquevillian dream, a new sort of wilderness. Many modern Americans now seek camaraderie online, in a world defined not by friendship but by anomie and alienation. Instead of participating in civic organizations that give them a sense of community as well as practical experience in tolerance and consensus-building, Americans join internet mobs, in which they are submerged in the logic of the crowd, clicking Like or Share and then moving on. Instead of entering a real-life public square, they drift anonymously into digital spaces where they rarely meet opponents; when they do, it is only to vilify them.
Conversation in this new American public sphere is governed not by established customs and traditions in service of democracy but by rules set by a few for-profit companies in service of their needs and revenues. Instead of the procedural regulations that guide a real-life town meeting, conversation is ruled by algorithms that are designed to capture attention, harvest data, and sell advertising. The voices of the angriest, most emotional, most divisive—and often the most duplicitous—participants are amplified. Reasonable, rational, and nuanced voices are much harder to hear; radicalization spreads quickly. Americans feel powerless because they are.
In this new wilderness, democracy is becoming impossible. If one half of the country can’t hear the other, then Americans can no longer have shared institutions, apolitical courts, a professional civil service, or a bipartisan foreign policy. We can’t compromise. We can’t make collective decisions—we can’t even agree on what we’re deciding. No wonder millions of Americans refuse to accept the results of the most recent presidential election, despite the verdicts of state electoral committees, elected Republican officials, courts, and Congress. We no longer are the America Tocqueville admired, but have become the enfeebled democracy he feared, a place where each person,…(More)”.
Paper by Daniel Innerarity: “Democracy is possible because of an increase in the complexity of society, but that same complexity seems to threaten democracy. There is a clear imbalance between people’s actual competence and the expectation that citizens in a democratic society will be politically competent. It is not only that society has become more complex but that democratization itself increases the degree of social complexity. This unintelligibility can be overcome through the acquisition of some political competence—such as improving individual knowledge, adopting diverse strategies for simplification, or having recourse to experts—that partially reduces this imbalance. My hypothesis is that despite the attraction of de-democratizing procedures, the best solutions are those that are most democratic: strengthening cooperation and the institutional organization of collective intelligence. The purpose of this article is not to solve all the problems I touch on, but rather to examine how they are related and to provide a general framework for the problem of de-democratization through misunderstanding….(More)”.
Book edited by Daniel Moeckli, Anna Forgács, and Henri Ibi: “With the rise of direct-democratic instruments, the relationship between popular sovereignty and the rule of law is set to become one of the defining political issues of our time. This important and timely book provides an in-depth analysis of the limits imposed on referendums and citizens’ initiatives, as well as of systems of reviewing compliance with these limits, in 11 European states.
Chapters explore and lay the scientific basis for answering crucial questions such as ‘Where should the legal limits of direct democracy be drawn?’ and ‘Who should review compliance with these limits?’ Providing a comparative analysis of the different issues in the selected countries, the book draws out key similarities and differences, as well as an assessment of the law and the practice at national levels when judged against the international standards contained in the Venice Commission’s Guidelines on the Holding of Referendums.
Presenting an up-to-date analysis of the relationship between popular sovereignty and the rule of law, The Legal Limits of Direct Democracy will be a key resource for scholars and students in comparative and constitutional law and political science. It will also be beneficial to policy-makers and practitioners in parliaments, governments and election commissions, and experts working for international organisations….(More)”.
Paper by Paolo Cavaliere: “This article claims that the practices undertaken by digital platforms to counter disinformation, under the EU Action Plan against Disinformation and the Code of Practice, mark a shift in the governance of news media content. While professional journalism standards have long been used, both within and outside the industry, to assess the accuracy of news content and adjudicate on media conduct, the platforms are now resorting to different fact-checking routines to moderate and curate their content.
The article will demonstrate how fact-checking organisations have different working methods from news operators and ultimately understand and assess ‘accuracy’ in different ways. As a result, this new and enhanced role for platforms and fact-checkers as curators of content affects how content is distributed to the audience and, thus, media freedom. Depending on how fact-checking standards and working routines consolidate in the near future, however, this trend offers a real opportunity to improve the quality of news and the right to receive information…(More)”.
Paper by David Andrew Logan: “New York Times v. Sullivan (1964) is an iconic decision, foundational to modern First Amendment theory, and in a string of follow-on decisions the Court firmly grounded free speech theory and practice in the need to protect democratic discourse. To do this the Court provided broad and deep protections to the publishers of falsehoods. This article recognizes that New York Times and its progeny made sense in the “public square” of an earlier era, but the justices could never have foreseen the dramatic changes in technology and the media environment in the years since, nor predicted that by making defamation cases virtually impossible to win they were harming, rather than helping, self-government. In part because of New York Times, the First Amendment has been weaponized, frustrating a basic requirement of a healthy democracy: the development of a set of broadly agreed-upon facts. Instead, we are subject to waves of falsehoods that swamp the ability of citizens to effectively self-govern. As a result, and despite its iconic status, New York Times needs to be reexamined and retooled to better serve our democracy….(More)”
Book edited by Kevin Macnish and Jai Galliott: “
- Considers the morality of using big data in the political sphere, covering cases from the Snowden leaks to the Brexit referendum
- Investigates theories and recommendations for how to align the modern political process with the exponential rise in the availability of digital information
- Opens new avenues for thinking about the philosophy and morality of social media, such as Facebook and Twitter, in the context of political decision-making
- Sets out and objectively assesses the ‘opacity’ framework as an appropriate means of dealing with the challenges associated with big data and democracy
What’s wrong with targeted advertising in political campaigns? Should we be worried about echo chambers? How does data collection impact trust in society? As decision-making becomes increasingly automated, how can decision-makers be held to account? This collection considers potential solutions to these challenges. It brings together original research on the philosophy of big data and democracy from leading international authors, with recent examples – including the 2016 Brexit referendum, the Leveson Inquiry and the Edward Snowden leaks. And it asks whether an ethical compass is available, or even feasible, in an ever more digitised and monitored world….(More)”.