Media Is Us: Understanding Communication and Moving beyond Blame


Book by Elizaveta Friesem: “Media is usually seen as a feature of the modern world enabled by the latest technologies. Scholars, educators, parents, and politicians often talk about media as something people should be wary of due to its potential negative impact on their lives. But do we really understand what media is?

Elizaveta Friesem argues that instead of being worried about media or blaming it for what’s going wrong in society, we should become curious about uniquely human ways we communicate with each other. Media Is Us proposes five key principles of communication that are relevant both for the modern media and for people’s age-old ways of making sense of the world.

In order to understand the problems of contemporary society revealed and amplified by the latest technologies, we will have to ask difficult questions about ourselves. Where do our truths and facts come from? How can we know who is to blame for the flaws of the social system? What can we change about our own everyday actions to make the world a better place? To answer these questions, we will need to rethink not only the term “media” but also the concept of power. The change of perspective proposed by the book is intended to help readers become more self-aware and also more empathetic towards those who choose different truths.

Concluding with practical steps to build media literacy through the ACE model—from Awareness to Collaboration through Empathy—this timely book is essential for students and scholars, as well as anyone who would use the new understanding of media to decrease the current levels of cultural polarization….(More)”.

Do journalists “hide behind” sources when they use numbers in the news?


Article by Mark Coddington and Seth Lewis: “Numerical information is a central piece of journalism. Just look at how often stories rely on quantitative data — from Covid case numbers to public opinion polling to economic statistics — as their evidentiary backbone. The rise of data journalism, with its slick visualizations and interactives, has reinforced the role and influence of numbers in the news.

But, as B.T. Lawson reminds us in a new article in Journalism Practice, though we have plenty of research on this decade-long boom in data journalism, much of the research “overstates the significance of the data journalist within the news media. Yes, data journalists are now a mainstay of most news organizations, but they are not the only journalists using numbers. Far from it.”

Indeed, in contrast to the 1960s and 70s era of computer-assisted reporting, when a small minority of specialized reporters worked with data but most reporters did not, nowadays virtually all journalists are expected to engage with numbers as part of their work. Which brings up a potential problem: Some research suggests that journalists rarely challenge the numbers they receive, leading them to accept and reproduce the discourse around those numbers from their sources.

To get a clearer picture of how journalists draw on numbers and narratives about them, Lawson examined reporters’ use of numbers in their coverage of seven humanitarian crises in 2017. The author did this in two ways: first through a content analysis of 978 news articles from U.K. news media (looking for direct or indirect challenges to statistics, cross-verification of one claim against another, etc.), and then through interviews with 16 journalists involved in at least one of those stories, to gain additional insight into the process of receiving and reporting on numbers.

The title of the resulting article — “Hiding Behind Databases, Institutions and Actors: How Journalists Use Statistics in Reporting Humanitarian Crises” — indicates something about one of its findings: namely, that journalists covering humanitarian crises rely heavily on numbers, often provided by NGOs or the UN, but they seldom verify the numbers they use, mainly because they see it as outside their role to do such work and because they “hide behind” the perceived credibility of their sources….(More)”

Diverse Sources Database


About: “The Diverse Sources Database is NPR’s resource for journalists who believe in the value of diversity and share our goal to make public radio look and sound like America.

Originally called Source of the Week, the database launched in 2013 as a way to help journalists at NPR and member stations expand the racial/ethnic diversity of the experts they tap for stories…(More)”.

Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis


CDT Research report, entitled “Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis”.

Report by Dhanaraj Thakur and Emma Llansó: “The ever-increasing amount of user-generated content online has led, in recent years, to an expansion in research and investment in automated content analysis tools. Scrutiny of automated content analysis has accelerated during the COVID-19 pandemic, as social networking services have placed a greater reliance on these tools due to concerns about health risks to their moderation staff from in-person work. At the same time, there are important policy debates around the world about how to improve content moderation while protecting free expression and privacy. In order to advance these debates, we need to understand the potential role of automated content analysis tools.

This paper explains the capabilities and limitations of tools for analyzing online multimedia content and highlights the potential risks of using these tools at scale without accounting for their limitations. It focuses on two main categories of tools: matching models and computer prediction models. Matching models include cryptographic and perceptual hashing, which compare user-generated content with existing and known content. Predictive models (including computer vision and computer audition) are machine learning techniques that aim to identify characteristics of new or previously unknown content….(More)”.
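The difference between the report’s two categories of matching models can be sketched in a few lines of Python. A cryptographic hash matches only byte-identical content, while a perceptual hash tolerates small edits; the toy average-hash below is a stand-in for production systems and is purely illustrative — its names and threshold are not from the report.

```python
import hashlib

def crypto_match(content: bytes, known_hashes: set) -> bool:
    """Exact matching: any change to the content changes the digest entirely."""
    return hashlib.sha256(content).hexdigest() in known_hashes

def average_hash(pixels: list) -> int:
    """Toy perceptual hash: one bit per pixel, set when brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def perceptual_match(pixels: list, known_phashes: set, threshold: int = 4) -> bool:
    """Fuzzy matching: small edits move the hash only a few bits."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_phashes)
```

Flipping one byte defeats `crypto_match` entirely, while `perceptual_match` still fires on a lightly edited image — which is why perceptual hashing is used for re-uploads of known content, at the cost of occasional false matches.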

Side-Stepping Safeguards, Data Journalists Are Doing Science Now


Article by Irineo Cabreros: “News stories are increasingly told through data. Witness the Covid-19 time series that decorate the homepages of every major news outlet; the red and blue heat maps of polling predictions that dominate the runup to elections; the splashy, interactive plots that dance across the screen.

As a statistician who handles data for a living, I welcome this change. News now speaks my favorite language, and the general public is developing a healthy appetite for data, too.

But many major news outlets are no longer just visualizing data, they are analyzing it in ever more sophisticated ways. For example, at the height of the second wave of Covid-19 cases in the United States, The New York Times ran a piece declaring that surging case numbers were not attributable to increased testing rates, despite President Trump’s claims to the contrary. The thrust of The Times’ argument was summarized by a series of plots that showed the actual rise in Covid-19 cases far outpacing what would be expected from increased testing alone. These weren’t simple visualizations; they involved assumptions and mathematical computations, and they provided the cornerstone for the article’s conclusion. The plots themselves weren’t sourced from an academic study (although the author on the byline of the piece is a computer science Ph.D. student); they were produced through “an analysis by The New York Times.”
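The arithmetic behind that kind of comparison is easy to sketch: if rising case counts were purely an artifact of expanded testing, cases would grow roughly in proportion to tests at a constant positivity rate. The numbers below are made up for illustration and are not the Times’ data or method.

```python
def expected_cases(baseline_cases: float, baseline_tests: float,
                   current_tests: float) -> float:
    """Cases expected if positivity stayed at the baseline rate,
    i.e. if all growth in cases came from growth in testing alone."""
    return baseline_cases * (current_tests / baseline_tests)

# Hypothetical illustrative figures:
baseline_cases, baseline_tests = 20_000, 500_000   # daily, at the start
current_tests = 750_000                            # testing up 50%
actual_cases = 50_000                              # cases up 150%

# Testing growth alone would predict 30,000 cases; the remaining
# 20,000 are the excess that testing cannot explain.
excess = actual_cases - expected_cases(baseline_cases, baseline_tests,
                                       current_tests)
```

Even this two-line model embeds an assumption (constant positivity under expanded testing) — exactly the kind of analytic choice the article argues now routinely lives inside newsrooms rather than academic studies.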

The Times article was by no means an anomaly. News outlets have asserted, on the basis of in-house data analyses, that Covid-19 has killed nearly half a million more people than official records report; that Black and minority populations are overrepresented in the Covid-19 death toll; and that social distancing will usually outperform attempted quarantine. That last item, produced by The Washington Post and buoyed by in-house computer simulations, was the most read article in the history of the publication’s website, according to Washington Post media reporter Paul Farhi.

In my mind, a fine line has been crossed. Gone are the days when science journalism was like sports journalism, where the action was watched from the press box and simply conveyed. News outlets have stepped onto the field. They are doing the science themselves….(More)”.

‘Belonging Is Stronger Than Facts’: The Age of Misinformation


Max Fisher at the New York Times: “There’s a decent chance you’ve had at least one of these rumors, all false, relayed to you as fact recently: that President Biden plans to force Americans to eat less meat; that Virginia is eliminating advanced math in schools to advance racial equality; and that border officials are mass-purchasing copies of Vice President Kamala Harris’s book to hand out to refugee children.

All were amplified by partisan actors. But you’re just as likely, if not more so, to have heard them relayed by someone you know. And you may have noticed that these cycles of falsehood-fueled outrage keep recurring.

We are in an era of endemic misinformation — and outright disinformation. Plenty of bad actors are helping the trend along. But the real drivers, some experts believe, are social and psychological forces that make people prone to sharing and believing misinformation in the first place. And those forces are on the rise.

“Why are misperceptions about contentious issues in politics and science seemingly so persistent and difficult to correct?” Brendan Nyhan, a Dartmouth College political scientist, posed in a new paper in Proceedings of the National Academy of Sciences.

It’s not for want of good information, which is ubiquitous. Exposure to good information does not reliably instill accurate beliefs anyway. Rather, Dr. Nyhan writes, a growing body of evidence suggests that the ultimate culprits are “cognitive and memory limitations, directional motivations to defend or support some group identity or existing belief, and messages from other people and political elites.”

Put more simply, people become more prone to misinformation when three things happen. First, and perhaps most important, is when conditions in society make people feel a greater need for what social scientists call ingrouping — a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems.

As much as we like to think of ourselves as rational beings who put truth-seeking above all else, we are social animals wired for survival. In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict putting our righteous ingroup against a nefarious outgroup….(More)”.

Would you notice if fake news changed your behavior? An experiment on the unconscious effects of disinformation


Paper by Zach Bastick: “A growing literature is emerging on the believability and spread of disinformation, such as fake news, over social networks. However, little is known about the degree to which malicious actors can use social media to covertly affect behavior with disinformation. A lab-based randomized controlled experiment was conducted with 233 undergraduate students to investigate the behavioral effects of fake news. It was found that even short (under 5-min) exposure to fake news was able to significantly modify the unconscious behavior of individuals. This paper provides initial evidence that fake news can be used to covertly modify behavior; it argues that current approaches to mitigating fake news, and disinformation in general, are insufficient to protect social media users from this threat; and it highlights the implications of this for democracy. It raises the need for an urgent cross-sectoral effort to investigate, protect against, and mitigate the risks of covert, widespread and decentralized behavior modification over online social networks….(More)”

You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape


Book by Whitney Phillips and Ryan M. Milner: “Our media environment is in crisis. Polarization is rampant. Polluted information floods social media. Even our best efforts to help clean up can backfire, sending toxins roaring across the landscape. In You Are Here, Whitney Phillips and Ryan Milner offer strategies for navigating increasingly treacherous information flows. Using ecological metaphors, they emphasize how our individual me is entwined within a much larger we, and how everyone fits within an ever-shifting network map.

Phillips and Milner describe how our poisoned media landscape came into being, beginning with the Satanic Panics of the 1980s and 1990s—which, they say, exemplify “network climate change”—and proceeding through the emergence of trolling culture and the rise of the reactionary far right (as well as its amplification by journalists) during and after the 2016 election. They explore the history of conspiracy theories in the United States, focusing on those concerning the Deep State; explain why old media literacy solutions fail to solve new media literacy problems; and suggest how we can navigate the network crisis more thoughtfully, effectively, and ethically. We need a network ethics that looks beyond the messages and the messengers to investigate toxic information’s downstream effects….(More)”.

How We Built a Facebook Feed Viewer


Citizen Browser at The Markup: “Our interactive dashboard, Split Screen, gives readers a peek into the content Facebook delivered to people of different demographic backgrounds and voting preferences who participated in our Citizen Browser project.

Using Citizen Browser, our custom Facebook inspector, we perform daily captures of Facebook data from paid panelists. These captures collect the content that was displayed on their Facebook feeds at the moment the app performed its automated capture. From Dec. 1, 2020, to March 2, 2021, 2,601 paid participants contributed their data to the project.

To measure what Facebook’s recommendation algorithm displays to different groupings of people, we compare data captured from each over a two-week period. We look at three different pairings:

  • Women vs. Men
  • Biden Voters vs. Trump Voters
  • Millennials vs. Boomers 

We labeled our panelists based on their self-disclosed political leanings, gender, and age. We describe each pairing in more detail in the Pairings section of this article. 

For each pair, we examine four types of content served by Facebook: news sources, posts with news links, hashtags, and group recommendations. We compare the percentage of each grouping that was served each piece of content to that of the other grouping in the pair.  
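The comparison step described above can be sketched as follows. The data structures and names (`percent_served`, per-panelist sets of content items) are hypothetical illustrations, not The Markup’s actual code or schema.

```python
def percent_served(group: dict, item: str) -> float:
    """Share (in %) of panelists in a grouping whose captured feed
    contained the given content item (a news source, link, hashtag,
    or group recommendation)."""
    if not group:
        return 0.0
    return 100.0 * sum(item in feed for feed in group.values()) / len(group)

def compare_pair(group_a: dict, group_b: dict) -> list:
    """For every content item served to either grouping, compare the
    percentage of each grouping that received it, with the largest
    gaps between the two groupings listed first."""
    items = set().union(*group_a.values(), *group_b.values())
    rows = [(item, percent_served(group_a, item), percent_served(group_b, item))
            for item in items]
    return sorted(rows, key=lambda r: abs(r[1] - r[2]), reverse=True)

# Hypothetical panel data: panelist ID -> set of content items captured
biden = {"p1": {"nytimes.com", "cnn.com"}, "p2": {"nytimes.com"}}
trump = {"p3": {"foxnews.com"}, "p4": {"foxnews.com", "nytimes.com"}}
```

Comparing percentages of each grouping, rather than raw counts, keeps the comparison meaningful when the two groupings contain different numbers of panelists.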

For more information on the data we collect, the panel’s demographic makeup, and the extensive redaction process we undertake to preserve privacy, see our methodology, “How We Built a Facebook Inspector.”

Our observations should not be taken as proof that Facebook chose to target specific content at specific demographic groups. There are many factors that influence any given person’s feed that we do not account for, including users’ friends and social networks….(More)”.