Evaluating the fake news problem at the scale of the information ecosystem


Paper by Jennifer Allen, Baird Howland, Markus Mobius, David Rothschild and Duncan J. Watts: “Fake news,” broadly defined as false or misleading information masquerading as legitimate news, is frequently asserted to be pervasive online with serious consequences for democracy. Using a unique multimode dataset that comprises a nationally representative sample of mobile, desktop, and television consumption, we refute this conventional wisdom on three levels. First, news consumption of any sort is heavily outweighed by other forms of media consumption, comprising at most 14.2% of Americans’ daily media diets. Second, to the extent that Americans do consume news, it is overwhelmingly from television, which accounts for roughly five times as much news consumption as online. Third, fake news comprises only 0.15% of Americans’ daily media diet. Our results suggest that the origins of public misinformedness and polarization are more likely to lie in the content of ordinary news or the avoidance of news altogether than in overt fakery….(More)”.

RESIST Counter Disinformation Toolkit


UK Government: “This toolkit will help support the dissemination of reliable, truthful information that underpins our democracy. RESIST stands for Recognise disinformation, Early warning, Situational Insight, Impact analysis, Strategic communication, Track outcomes.

This toolkit will:

  • build your resilience to the threat of disinformation
  • give you guidance on how to identify a range of different types of disinformation consistently and effectively
  • help you prevent and tackle the spread of disinformation
  • enable you to develop a response when disinformation affects your organisation’s ability to do its job or represents a threat to the general public.

The toolkit promotes a consistent approach to the threat and provides 6 steps to follow.

RESIST Disinformation: a toolkit

The purpose of this toolkit is to help you prevent the spread of disinformation. It will enable you to develop a response when disinformation affects your organisation’s ability to do its job, harms the people who depend on your services, or represents a threat to the general public.

What is disinformation?

Disinformation is the deliberate creation and/or sharing of false information with the intention to deceive and mislead audiences. The inadvertent sharing of false information is referred to as misinformation.

Who is this toolkit for?

Government and public sector communications professionals, as well as policy officers, senior managers and special advisers….(More)”

The Wisdom of the Crowd: Promoting Media Development through Deliberative Initiatives


Report by Craig Matasick: “…innovative new set of citizen engagement practices—collectively known as deliberative democracy—offers important lessons that, when applied to media development efforts, can help improve media assistance and strengthen independent media environments around the world. At a time when disinformation runs rampant, it is more important than ever to strengthen public demand for credible information, reduce political polarization, and prevent media capture. Deliberative democracy approaches can help tackle these issues by expanding the number and diversity of voices that participate in policymaking, thereby fostering greater collective action and enhancing public support for media reform efforts.

Through a series of five illustrative case studies, the report demonstrates how deliberative democracy practices can be employed in both media development and democracy assistance efforts, particularly in the Global South. Such initiatives produce recommendations that take into account a plurality of voices while building trust between citizens and decision-makers by demonstrating to participants that their issues will be heard and addressed. Ultimately, this process can enable media development funders and practitioners to identify priorities and design locally relevant projects that have a higher likelihood for long-term impact.

– Deliberative democracy approaches, which are characterized by representative participation and moderated deliberation, provide a framework to generate demand-driven media development interventions while at the same time building greater public support for media reform efforts.

– Deliberative democracy initiatives foster collaboration across different segments of society, building trust in democratic institutions, combatting polarization, and avoiding elite capture.

– When employed by news organizations, deliberative approaches provide a better understanding of the issues their audiences care most about and uncover new problems affecting citizens that might not otherwise have come to light….(More)”.

Metrics at Work: Journalism and the Contested Meaning of Algorithms


Book by Angèle Christin: “When the news moved online, journalists suddenly learned what their audiences actually liked, through algorithmic technologies that scrutinize web traffic and activity. Has this advent of audience metrics changed journalists’ work practices and professional identities? In Metrics at Work, Angèle Christin documents the ways that journalists grapple with audience data in the form of clicks, and analyzes how new forms of clickbait journalism travel across national borders.

Drawing on four years of fieldwork in web newsrooms in the United States and France, including more than one hundred interviews with journalists, Christin reveals many similarities among the media groups examined—their editorial goals, technological tools, and even office furniture. Yet she uncovers crucial and paradoxical differences in how American and French journalists understand audience analytics and how these affect the news produced in each country. American journalists routinely disregard traffic numbers and primarily rely on the opinion of their peers to define journalistic quality. Meanwhile, French journalists fixate on internet traffic and view these numbers as a sign of their resonance in the public sphere. Christin offers cultural and historical explanations for these disparities, arguing that distinct journalistic traditions structure how journalists make sense of digital measurements in the two countries.

Contrary to the popular belief that analytics and algorithms are globally homogenizing forces, Metrics at Work shows that computational technologies can have surprisingly divergent ramifications for work and organizations worldwide….(More)”.

Can fake news really change behaviour? Evidence from a study of COVID-19 misinformation


Paper by Ciara Greene and Gillian Murphy: “Previous research has argued that fake news may have grave consequences for health behaviour, but surprisingly, no empirical data have been provided to support this assumption. This issue takes on new urgency in the context of the coronavirus pandemic. In this large preregistered study (N = 3746) we investigated the effect of exposure to fabricated news stories about COVID-19 on related behavioural intentions. We observed small but measurable effects on some related behavioural intentions but not others – for example, participants who read a story about problems with a forthcoming contact-tracing app reported reduced willingness to download the app. We found no effects of providing a general warning about the dangers of online misinformation on response to the fake stories, regardless of the framing of the warning in positive or negative terms. We conclude with a call for more empirical research on the real-world consequences of fake news….(More)”

Regulating Social Media: The Fight Over Section 230 and Beyond


Report by Paul M. Barrett: “Recently, Section 230 of the Communications Decency Act of 1996 has come under sharp attack from members of both political parties, including presidential candidates Donald Trump and Joe Biden. The foundational law of the commercial internet, Section 230 does two things: It protects platforms and websites from most lawsuits related to content posted by third parties. And it guarantees this shield from liability even if the platforms and sites actively police the content they host. This protection has encouraged internet companies to innovate and grow, even as it has raised serious questions about whether social media platforms adequately self-regulate harmful content. In addition to the assaults by Trump and Biden, members of Congress have introduced a number of bills designed to limit the reach of Section 230. Some critics have asserted unrealistically that repealing or curbing Section 230 would solve a wide range of problems relating to internet governance. These critics also have played down the potentially dire consequences that repeal would have for smaller internet companies. Academics, think tank researchers, and others outside of government have made a variety of more nuanced proposals for revising the law. We assess these ideas with an eye toward recommending and integrating the most promising ones. Our conclusion is that Section 230 ought to be preserved—but that it can be improved…(More)”

How Competition Impacts Data Privacy


Paper by Aline Blankertz: “A small number of large digital platforms increasingly shape the space for most online interactions around the globe and they often act with hardly any constraint from competing services. The lack of competition puts those platforms in a powerful position that may allow them to exploit consumers and offer them limited choice. Privacy is increasingly considered one area in which the lack of competition may create harm. Because of these concerns, governments and other institutions are developing proposals to expand the scope for competition authorities to intervene to limit the power of the large platforms and to revive competition.  


The first case that has explicitly addressed anticompetitive harm to privacy is the German Bundeskartellamt’s case against Facebook, in which the authority argues that imposing bad privacy terms can amount to an abuse of dominance. Since that case began in 2016, more cases have dealt with the link between competition and privacy. For example, the proposed Google/Fitbit merger has raised concerns about sensitive health data being merged with existing Google profiles, and Apple is under scrutiny for not sharing certain personal data while using it for its own services.

However, addressing bad privacy outcomes through competition policy is effective only if those outcomes are caused, at least partly, by a lack of competition. Six distinct mechanisms can be distinguished through which competition may affect privacy, as summarized in Table 1. These mechanisms constitute different hypotheses through which less competition may influence privacy outcomes and lead either to worse privacy in different ways (mechanisms 1-5) or even better privacy (mechanism 6). The table also summarizes the available evidence on whether and to what extent the hypothesized effects are present in actual markets….(More)”.

Governance responses to disinformation: How open government principles can inform policy options


OECD paper by Craig Matasick, Carlotta Alfonsi and Alessandro Bellantoni: “This paper provides a holistic policy approach to the challenge of disinformation by exploring a range of governance responses that rest on the open government principles of transparency, integrity, accountability and stakeholder participation. It offers an analysis of the significant changes that are affecting media and information ecosystems, chief among them the growth of digital platforms. Drawing on the implications of this changing landscape, the paper focuses on four policy areas of intervention: public communication for a better dialogue between government and citizens; direct responses to identify and combat disinformation; legal and regulatory policy; and media and civic responses that support better information ecosystems. The paper concludes with proposed steps the OECD can take to build evidence and support policy in this space…(More)”.

The Truth Is Paywalled But The Lies Are Free


Essay by Nathan J. Robinson: “…This means that a lot of the most vital information will end up locked behind the paywall. And while I am not much of a New Yorker fan either, it’s concerning that the Hoover Institution will freely give you Richard Epstein’s infamous article downplaying the threat of coronavirus, but Isaac Chotiner’s interview demolishing Epstein requires a monthly subscription, meaning that the lie is more accessible than its refutation. Eric Levitz of New York is one of the best and most prolific left political commentators we have. But unless you’re a subscriber of New York, you won’t get to hear much of what he has to say each month. 

Possibly even worse is the fact that so much academic writing is kept behind vastly more costly paywalls. A white supremacist on YouTube will tell you all about race and IQ, but if you want to read a careful scholarly refutation, obtaining a legal PDF from the journal publisher would cost you $14.95, a price nobody in their right mind would pay for one article if they can’t get institutional access. (I recently gave up on trying to access a scholarly article because I could not find a way to get it for less than $39.95, though in that case the article was garbage rather than gold.) Academic publishing is a nightmarish patchwork, with lots of articles advertised at exorbitant fees on one site, and then for free on another, or accessible only through certain databases, which your university or public library may or may not have access to. (Libraries have to budget carefully because subscription prices are often nuts. A library subscription to the Journal of Coordination Chemistry, for instance, costs $11,367 annually.) 

Of course, people can find their ways around paywalls. Sci-Hub is a completely illegal but extremely convenient means of obtaining academic research for free. (I am purely describing it, not advocating it.) You can find a free version of the article debunking race and IQ myths on ResearchGate, a site that has engaged in mass copyright infringement in order to make research accessible. Often, because journal publishers tightly control access to their copyrighted work in order to charge those exorbitant fees for PDFs, the versions of articles that you can get for free are drafts that have not yet gone through peer review, and have thus been subjected to less scrutiny. This means that the more reliable an article is, the less accessible it is. On the other hand, pseudo-scholarship is easy to find. Right-wing think tanks like the Cato Institute, the Foundation for Economic Education, the Hoover Institution, the Mackinac Center, the American Enterprise Institute, and the Heritage Foundation pump out slickly produced policy documents on every subject under the sun. They are utterly untrustworthy—the conclusion is always going to be “let the free market handle the problem,” no matter what the problem or what the facts of the case. But it is often dressed up to look sober-minded and non-ideological. 

It’s not easy or cheap to be an “independent researcher.” When I was writing my first book, Superpredator, I wanted to look through newspaper, magazine, and journal archives to find everything I could about Bill Clinton’s record on race. I was lucky I had a university affiliation, because this gave me access to databases like LexisNexis. If I hadn’t, the cost of finding out what I wanted to find out would likely have run into the thousands of dollars.  

A problem beyond cost, though, is convenience. I find that even when I am doing research through databases and my university library, it is often an absolute mess: the sites are clunky and constantly demanding login credentials. The amount of time wasted in figuring out how to obtain a piece of research material is a massive cost on top of the actual pricing. The federal court document database, PACER, for instance, charges 10 cents a page for access to records, which adds up quickly since legal research often involves looking through thousands of pages. They offer an exemption if you are a researcher or can’t afford it, but to get the exemption you have to fill out a three-page form and provide an explanation of both why you need each document and why you deserve the exemption. This is a waste of time that inhibits people’s productivity and limits their access to knowledge.

In fact, to see just how much human potential is being squandered by having knowledge dispensed by the “free market,” let us briefly picture what “totally democratic and accessible knowledge” would look like…(More)”.

How behavioural sciences can promote truth, autonomy and democratic discourse online


Philipp Lorenz-Spreen, Stephan Lewandowsky, Cass R. Sunstein & Ralph Hertwig in Nature: “Public opinion is shaped in significant part by online content, spread via social media and curated algorithmically. The current online ecosystem has been designed predominantly to capture user attention rather than to promote deliberate cognition and autonomous choice; information overload, finely tuned personalization and distorted social cues, in turn, pave the way for manipulation and the spread of false information. How can transparency and autonomy be promoted instead, thus fostering the positive potential of the web? Effective web governance informed by behavioural research is critically needed to empower individuals online. We identify technologically available yet largely untapped cues that can be harnessed to indicate the epistemic quality of online content, the factors underlying algorithmic decisions and the degree of consensus in online debates. We then map out two classes of behavioural interventions—nudging and boosting—that enlist these cues to redesign online environments for informed and autonomous choice….(More)”.