Who Is Falling for Fake News?


Article by Angie Basiouny: “People who read fake news online aren’t doomed to fall into a deep echo chamber where the only sound they hear is their own ideology, according to a revealing new study from Wharton.

Surprisingly, readers who regularly browse fake news stories served up by social media algorithms are more likely to diversify their news diet by seeking out mainstream sources. These well-rounded news junkies make up more than 97% of online readers, compared with the scant 2.8% who consume online fake news exclusively.

“We find that these echo chambers that people worry about are very shallow. This idea that the internet is creating an echo chamber is just not holding out to be true,” said Senthil Veeraraghavan, a Wharton professor of operations, information and decisions.

Veeraraghavan is co-author of the paper, “Does Fake News Create Echo Chambers?” It was also written by Ken Moon, Wharton professor of operations, information and decisions, and Jiding Zhang, an assistant operations management professor at New York University Shanghai who earned her doctorate at Wharton.

The study, which examined the browsing activity of nearly 31,000 households during 2017, offers empirical evidence that goes against popular beliefs about echo chambers. While echo chambers certainly are dark and dangerous places, they aren’t metaphorical black holes that suck in every person who reads an article about, say, Obama birtherism theory or conspiracies about COVID-19 vaccines. The study found that households exposed to fake news actually increase their exposure to mainstream news by 9.1%.

“We were surprised, although we were very aware going in that there was much that we did not know,” Moon said. “One thing we wanted to see is how much fake news is out there. How do we figure out what’s fake and what’s not, and who is producing the fake news and why? The economic structure of that matters from a business perspective.”…(More)”

Rules: A Short History of What We Live By


Book by Lorraine Daston: “Rules order almost every aspect of our lives. They set our work hours, dictate how we drive and set the table, tell us whether to offer an extended hand or cheek in greeting, and organize the rites of life, from birth through death. We may chafe under the rules we have, and yearn for ones we don’t, yet no culture could do without them. In Rules, historian Lorraine Daston traces their development in the Western tradition and shows how rules have evolved from ancient to modern times. Drawing on a rich trove of examples, including legal treatises, cookbooks, military manuals, traffic regulations, and game handbooks, Daston demonstrates that while the content of rules is dazzlingly diverse, the forms that they take are surprisingly few and long-lived.

Daston uncovers three enduring kinds of rules: the algorithms that calculate and measure, the laws that govern, and the models that teach. She vividly illustrates how rules can change—how supple rules stiffen, or vice versa, and how once bothersome regulations become everyday norms. Rules have been devised for almost every imaginable activity and range from meticulous regulations to the laws of nature. Daston probes beneath this variety to investigate when rules work and when they don’t, and why some philosophical problems about rules are as ancient as philosophy itself while others are as modern as calculating machines….(More)”.

Social Noise: What Is It, and Why Should We Care?


Article by Tara Zimmerman: “As social media, online relationships, and perceived social expectations on platforms such as Facebook play a greater role in people’s lives, a new phenomenon has emerged: social noise. Social noise is the influence of personal and relational factors on information received, which can confuse, distort, or even change the intended message. Influenced by social noise, people are likely to moderate their response to information based on cues regarding what behavior is acceptable or desirable within their social network. This may be done consciously or unconsciously as individuals strive to present themselves in ways that increase their social capital. For example, this might be seen as liking or sharing information posted by a friend or family member as a show of support despite having no strong feelings toward the information itself. Similarly, someone might refrain from liking, sharing, or commenting on information they strongly agree with because they believe others in their social network would disapprove.

This study reveals that social media users’ awareness of observation by others does impact their information behavior. Efforts to craft a personal reputation, build or maintain relationships, pursue important commitments, and manage conflict all influence the observable information behavior of social media users. As a result, observable social media information behavior may not be an accurate reflection of an individual’s true thoughts and beliefs. This is particularly interesting in light of the role social media plays in the spread of mis- and disinformation…(More)”.

Corruption Risk Forecast


About: “Starting in 2015, and building on the work of Alina Mungiu-Pippidi, the European Research Centre for Anti-Corruption and State-Building (ERCAS) engaged in developing a new generation of corruption indicators to fill the gap. This led to the creation of the Index for Public Integrity (IPI) in 2017, of the Corruption Risk Forecast in 2020, and of the T-index (de jure and de facto computer-mediated government transparency) in 2021. Since 2021, a component of the T-index (administrative transparency) has also been included in the IPI, whose components likewise provide the basis for the Corruption Risk Forecast.

This generation is different from perception indicators in a few fundamental aspects:

  1. Theory-grounded. Our indicators are unique because they are based on a clear theory: why corruption happens, how countries that control corruption differ from those that don’t, and what specifically is broken and should be fixed. We tested a large variety of indicators before settling on these.
  2. Specific. Each component is a fact-based measurement of a particular aspect of control of corruption or transparency. Read the methodology to follow in detail where the data comes from and how these indicators were selected.
  3. Change sensitive. Except for the T-index components, whose monitoring started in 2021, all other components go back at least 12 years and can be compared across years in the Trends menu on the Corruption Risk Forecast page. No statistical process blurs the differences across years, as happens with perception indicators. For long-term trends, we flag which changes are significant and which are not. The T-index components will also become comparable over the years to come. Furthermore, our indicators are selected to be actionable, so any significant policy intervention that has an impact is captured and reported when we update the data.
  4. Comparative. You can compare every country we cover with the rest of the world to see exactly where it stands, as well as against its peers in the same region and income group.
  5. Transparent. Our T-index data allows you to review and contribute to our work. Use the feedback form on the T-index page to send input; after our team has checked it, we will update the codes to include your contribution. Use the feedback form on the Corruption Risk Forecast page to contribute to the forecast…(More)”.
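The “Change sensitive” and “Comparative” points above describe an analytical workflow that is easy to picture in code: track a country’s indicator components over time, flag year-on-year movements that clear a significance threshold, and benchmark the country against its regional peers. The sketch below is purely illustrative; the column names, scores, and threshold are hypothetical and do not correspond to the ERCAS datasets or methodology.

```python
# Illustrative only: hypothetical columns, scores, and threshold, not the ERCAS data.
import pandas as pd

# Hypothetical long-format table: one row per country, year, and indicator component.
data = pd.DataFrame([
    {"country": "A", "region": "EU", "income": "high", "year": 2019,
     "component": "judicial_independence", "score": 7.1},
    {"country": "A", "region": "EU", "income": "high", "year": 2020,
     "component": "judicial_independence", "score": 7.6},
    {"country": "B", "region": "EU", "income": "high", "year": 2019,
     "component": "judicial_independence", "score": 6.0},
    {"country": "B", "region": "EU", "income": "high", "year": 2020,
     "component": "judicial_independence", "score": 6.1},
])

MIN_CHANGE = 0.3  # made-up threshold for flagging a year-on-year change as significant

def year_on_year_flags(df, country, component):
    """Flag year-on-year changes larger than MIN_CHANGE for one country and component."""
    series = (df[(df.country == country) & (df.component == component)]
              .sort_values("year").set_index("year")["score"])
    change = series.diff()
    return pd.DataFrame({"score": series, "change": change,
                         "significant": change.abs() >= MIN_CHANGE})

def peer_benchmark(df, country, component, year):
    """Compare a country's score with the mean score of its regional peers in a given year."""
    row = df[(df.country == country) & (df.component == component) & (df.year == year)].iloc[0]
    peers = df[(df.region == row.region) & (df.component == component)
               & (df.year == year) & (df.country != country)]
    return row.score, peers.score.mean()

print(year_on_year_flags(data, "A", "judicial_independence"))
print(peer_benchmark(data, "A", "judicial_independence", 2020))
```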

Democracy Disrupted: Governance in an Increasingly Virtual and Massively Distributed World


Essay by Eric B. Schnurer: “It is hard not to think that the world has come to a critical juncture, a point of possibly catastrophic collapse. Multiple simultaneous crises—many of epic proportions—raise doubts that liberal democracies can govern their way through them. In fact, it is vanishingly rare to hear anyone say otherwise.

While thirty years ago, scholars, pundits, and political leaders were confidently proclaiming the end of history, few now deny that it has returned—if it ever ended. And it has done so at a time of not just geopolitical and economic dislocations but also historic technological dislocations. To say that this poses a challenge to liberal democratic governance is an understatement. As history shows, the threat of chaos, uncertainty, weakness, and indeed ungovernability always favors the authoritarian, the man on horseback who promises stability, order, clarity—and through them, strength and greatness.

How, then, did we come to this disruptive return? Explanations abound, from the collapse of industrial economies and the post–Cold War order to the racist, nativist, and ultranationalist backlash these have produced; from the accompanying widespread revolt against institutions, elites, and other sources of authority to the social media business models and algorithms that exploit and exacerbate anger and division; from sophisticated methods of information warfare intended specifically to undercut confidence in truth or facts to the rise of authoritarian personalities in virtually every major country, all skilled in exploiting these developments. These are all perfectly good explanations. Indeed, they are interconnected and collectively help to explain our current state. But as Occam’s razor tells us, the simplest explanation is often the best. And there is a far simpler explanation for why we find ourselves in this precarious state: The widespread breakdowns and failures of governance and authority we are experiencing are driven by, and largely explicable by, underlying changes in technology.

We are in fact living through technological change on the scale of the Agricultural or Industrial Revolution, but it is occurring in only a fraction of the time. What we are experiencing today—the breakdown of all existing authority, primarily but not exclusively governmental—is if not a predictable result, at least an unsurprising one. All of these other features are just the localized spikes on the longer sine wave of history…(More)”.

Democracy Disrupted: Governance in an Increasingly Virtual and Massively Distributed World


Essay by Eric B. Schnurer: “…In short, it is often difficult to see where new technologies actually will lead. The same technological development can, in different settings, have different effects: The use of horses in warfare, which led seemingly inexorably in China and Europe to more centralized and autocratic states, had the effect on the other side of the world of enabling Hernán Cortés, with an army of roughly five hundred Spaniards, to defeat the massed infantries of the highly centralized, autocratic Aztec regime. Cortés’s example demonstrates that a particular technology generally employed by a concentrated power to centralize and dominate can also be used by a small insurgent force to disperse and disrupt (although in Cortés’s case this was on behalf of the eventual imposition of an even more despotic rule).

Regardless of the lack of inherent ideological content in any given technology, however, our technological realities consistently give metaphorical shape to our ideological constructs. In ancient Egypt, the regularity of the Nile’s flood cycle, which formed the society’s economic basis, gave rise to a belief in recurrent cycles of life and death; in contrast, the comparatively harsh and static agricultural patterns of the more-or-less contemporaneous Mesopotamian world produced a society that conceived of gods who simply tormented humans and then relegated them after death to sit forever in a place of dust and silence; meanwhile, the pastoral societies of the Fertile Crescent have handed down to us the vision of God as shepherd of his flock. (The Bible also gives us, in the story of Cain and Abel, a parable of the deadly conflict that technologically driven economic changes wreak: Abel was a traditional pastoralist—he tended sheep—while Cain, who planted seeds in the ground, represented the disruptive “New Economy” of settled agriculture. Tellingly, after killing off the pastoralist, the sedentarian Cain exits to found the first city [Genesis 4:17].)

As humans developed more advanced technologies, these in turn reshaped our conceptions of the world around us, including the proper social order. Those who possessed superior technological knowledge were invested with supernatural authority: The key to early Rome’s defense was the ability quickly to assemble and disassemble the bridges across the Tiber, so much so that the pontifex maximus—literally the “greatest bridge-builder”—became the high priest, from whose Latin title we derive the term pontiff. The most sophisticated—and arguably most crucial—technology in any town in medieval Europe was its public clock. The clock, in turn, became a metaphor for the mechanical working of the universe—God, in fact, was often conceived of as a clockmaker (a metaphor still frequently invoked to argue against evolution and for the necessity of an intelligent creator)—and for the proper form of social organization: All should know their place and move through time and space as predictably as the figurines making their regular appearances and performing their routinized interactions on the more elaborate and entertaining of these town-square timepieces.

In our own time, the leading technologies continue to provide the organizing concepts for our economic, political, and theological constructs. The factory became such a ubiquitous reflection of economic and social realities that, from the early nineteenth century onward, virtually every social and cultural institution—welfare (the poorhouse, or, as it was often called, the “workhouse”), public safety (the penitentiary), health care (the hospital), mental health (the insane asylum), “workforce” or public housing, even (as teachers often suggest to me) the education system—was consciously remodeled around it. Even when government finally tried to get ahead of the challenges posed by the Industrial Revolution by building the twentieth-century welfare state, it wound up constructing essentially a new capital of the Industrial Age in Washington, DC, with countless New Deal ministries along the Mall—resembling, as much as anything, the rows of factory buildings one can see in the steel and mill towns of the same era.

By the middle of the twentieth century, the atom and the computer came to dominate most intellectual constructs. First, the uncertainty of quantum mechanics upended mechanistic conceptions of social and economic relations, helping to foster conceptions of relativism in everything from moral philosophy to literary criticism. More recently, many scientists have come to the conclusion that the universe amounts to a massive information processor, and popular culture to the conviction that we all simply live inside a giant video game.

In sum, while technological developments are not deterministic—their outcomes being shaped, rather, by the uses we conceive to employ them—our conceptions are largely molded by these dominant technologies and the transformations they effect. (I should note that while this argument is not deterministic, like those of most current thinkers about political and economic development such as Francis Fukuyama, Jared Diamond, and Yuval Noah Harari, neither is it materialistic, like that of Karl Marx. Marx thoroughly rejected human ideas and thinking as movers of history, which he saw as simply shaped and dictated by the technology. I am suggesting instead a dialectic between the ideal and the material.) To repeat the metaphor, technological change constitutes the plate tectonics on which human contingencies are then built. To understand, then, the deeper movements of thought, economic arrangements, and political developments, both historical and contemporary, one must understand the nature of the technologies underlying and driving their unfolding…(More)”.

Social Media, Freedom of Speech, and the Future of our Democracy


Book edited by Lee C. Bollinger and Geoffrey R. Stone: “One of the most fiercely debated issues of this era is what to do about “bad” speech (hate speech, disinformation and propaganda campaigns, and incitement of violence) on the internet, and in particular speech on social media platforms such as Facebook and Twitter. In Social Media, Freedom of Speech, and the Future of our Democracy, Lee C. Bollinger and Geoffrey R. Stone have gathered an eminent cast of contributors—including Hillary Clinton, Amy Klobuchar, Sheldon Whitehouse, Mark Warner, Newt Minow, Tim Wu, Cass Sunstein, Jack Balkin, Emily Bazelon, and others—to explore the various dimensions of this problem in the American context. They stress how difficult it is to develop remedies, given that some of these forms of “bad” speech are ordinarily protected by the First Amendment. Bollinger and Stone argue that it is important to remember that the last time we encountered a major new communications technology (television and radio), we established a federal agency to provide oversight and to issue regulations to protect and promote “the public interest.” Featuring a variety of perspectives from some of America’s leading experts on this hotly contested issue, this volume offers new insights for the future of free speech in the social media era…(More)”.

How China uses search engines to spread propaganda


Blog by Jessica Brandt and Valerie Wirtschafter: “Users come to search engines seeking honest answers to their queries. On a wide range of issues—from personal health, to finance, to news—search engines are often the first stop for those looking to get information online. But as authoritarian states like China increasingly use online platforms to disseminate narratives aimed at weakening their democratic competitors, these search engines represent a crucial battleground in their information war with rivals. For Beijing, search engines represent a key, and underappreciated, vector for spreading propaganda to audiences around the world.

On a range of topics of geopolitical importance, Beijing has exploited search engine results to disseminate state-backed media that amplify the Chinese Communist Party’s propaganda. As we demonstrate in our recent report, published by the Brookings Institution in collaboration with the German Marshall Fund’s Alliance for Securing Democracy, users turning to search engines for information on Xinjiang, the site of the CCP’s egregious human rights abuses of the region’s Uyghur minority, or the origins of the coronavirus pandemic are surprisingly likely to encounter articles on these topics published by Chinese state-media outlets. By prominently surfacing this type of content, search engines may play a key role in Beijing’s effort to shape external perceptions, which makes it crucial that platforms—along with authoritative outlets that syndicate state-backed content without clear labeling—do more to address their role in spreading these narratives…(More)“.
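The measurement at the heart of the report, how often state-backed outlets surface in results for sensitive queries, can be illustrated with a toy domain-tagging check like the one below. The URL list, the domain set, and the function names are hypothetical; this is a minimal sketch of the kind of check such an analysis implies, not the authors’ actual collection pipeline.

```python
# Toy illustration: hypothetical URLs and an incomplete, made-up domain list.
from urllib.parse import urlparse

# Hypothetical (incomplete) set of Chinese state-media domains, for illustration only.
STATE_MEDIA_DOMAINS = {"globaltimes.cn", "chinadaily.com.cn", "xinhuanet.com", "cgtn.com"}

def is_state_media(url: str) -> bool:
    """Return True if the URL's host matches or falls under a listed state-media domain."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in STATE_MEDIA_DOMAINS)

def state_media_share(result_urls):
    """Fraction of search results pointing to state-media domains."""
    if not result_urls:
        return 0.0
    return sum(is_state_media(u) for u in result_urls) / len(result_urls)

# Hypothetical top results for a sensitive query.
results = [
    "https://www.example-news.com/xinjiang-report",
    "https://www.globaltimes.cn/page/202112/1234567.shtml",
    "https://news.cgtn.com/news/xinjiang-story.html",
]
print(f"State-media share of results: {state_media_share(results):.0%}")
```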

The Truth in Fake News: How Disinformation Laws Are Reframing the Concepts of Truth and Accuracy on Digital Platforms


Paper by Paolo Cavaliere: “The European Union’s (EU) strategy to address the spread of disinformation, and most notably the Code of Practice on Disinformation and the forthcoming Digital Services Act, tasks digital platforms with a range of actions to minimise the distribution of issue-based and political adverts that are verifiably false or misleading. This article discusses the implications of the EU’s categorical approach: specifically, what it means to conceptualise disinformation as a form of advertisement, and by what standards digital platforms are expected to assess the truthful or misleading nature of the content they distribute as a result of this categorisation. The analysis will show how the emerging EU anti-disinformation framework marks a departure from the European Court of Human Rights’ consolidated standards of review for public interest and commercial speech and the tests utilised to assess their accuracy….(More)”.

How to get to the core of democracy


Blog by Toralf Stark, Norma Osterberg-Kaufmann and Christoph Mohamad-Klotzbach: “…Many criticisms of conceptions of democracy are directed more at the institutional design than at its normative underpinnings. These include such things as the concept of representativeness. We propose focussing more on the normative foundations assessed by the different institutional frameworks than on discussing the institutional frameworks themselves. We develop a new concept, which we call the ‘core principle of democracy’. By doing so, we address the conceptual and methodological puzzles theoretically and empirically. Thus, we embrace a paradigm shift.

Collecting data is ultimately meaningless if we do not find ways to assess, summarise and theorise it. Kei Nishiyama argued we must ‘shift our attention away from the concept of democracy and towards concepts of democracy’. By the term concept we, in line with Nishiyama, are following Rawls. Rawls claimed that ‘the concept of democracy refers to a single, common principle that transcends differences and on which everyone agrees’. In contrast with this, ‘ideas of democracy (…) refer to different, sometimes contested ideas based on a common concept’. This is what Laurence Whitehead calls the ‘timeless essence of democracy’….

Democracy is a latent construct and, by nature, not directly observable. Nevertheless, we are searching for indicators and empirically observable characteristics we can assign to democratic conceptions. However, by focusing only on specific patterns of institutions, only sometimes derived from theoretical considerations, we block our view of its multiple meanings. Thus, we’ve no choice but to search behind the scenes for the underlying ‘core’ principle the institutions serve.

The singular core principle that all concepts of democracy seek to realise is political self-efficacy…(More)”.

[Figure: Political self-efficacy. Source: authors’ own compilation.]