To Map Billions of Cicadas, It Takes Thousands of Citizen Scientists


Article by Linda Poon and Marie Patino: “At the end of May, Dan Mozgai will spend his vacation from his day job chasing cicadas. The bugs won’t be hard to find; in about a week, billions of the beady-eyed crawlers from Brood X will start coming up from their 17-year stint underground, blanketing parts of 15 states in the Northeast, Mid-Atlantic and Midwest with their cacophony of shrill mating calls.

Mozgai isn’t an entomologist — he does online marketing for DirecTV. But since 2007, he’s worked closely with academic researchers to track various broods of periodical cicadas, as part of one of the oldest citizen science efforts in the U.S.

He’ll be joined by tens of thousands of other volunteers across the Brood X territory who will use the mobile app Cicada Safari, where users can add geotagged photos and videos to a live map, as dozens of student researchers behind the scenes verify each submission. Videos will be especially helpful this year, as they provide audio data for the researchers, says Gene Kritsky, an entomologist at Mount St. Joseph University in Cincinnati and the creator of Cicada Safari. He’s been testing the new app with smaller broods for two years in anticipation of this moment.

Brood X is one of the largest, and most broadly distributed geographically, of the periodical cicada broods, which emerge every 13 or 17 years. They’ll stick around for just a few weeks, through June, to mate and lay eggs.

“With the smartphone technology and the GPS location services, it was just a perfect way to do citizen science,” Kritsky says. Some 87,000 people have signed up as of the beginning of May, and they’ve already documented several early risers, especially around Cincinnati and Washington, D.C. — two of the expected hotspot…(More)”.

The Filing Cabinet


Essay by Craig Robertson: “The filing cabinet was critical to the information infrastructure of the 20th century. Like most infrastructure, it was usually overlooked….The subject of this essay emerged by chance. I was researching the history of the U.S. passport, and had spent weeks at the National Archives, struggling through thousands of reels of unindexed microfilm records of 19th-century diplomatic correspondence; then I arrived at the records for 1906. That year, the State Department adopted a numerical filing system. Suddenly, every American diplomatic office began using the same number for passport correspondence, with decimal numbers subdividing issues and cases. Rather than scrolling through microfilm images of bound pages organized chronologically, I could go straight to passport-relevant information that had been gathered in one place.

I soon discovered that I had Elihu Root to thank for making my research easier. A lawyer whose clients included Andrew Carnegie, Root became secretary of state in 1905. But not long after he arrived, the prominent corporate lawyer described himself as “a man trying to conduct the business of a large metropolitan law firm in the office of a village squire.” The department’s record-keeping practices contributed to his frustration. As was then common in American offices, clerks used press books or copybooks to store incoming and outgoing correspondence in chronologically ordered bound volumes with limited indexing. For Root, the breaking point came when a request for a handful of letters resulted in several bulky volumes appearing on his desk. His response was swift: he demanded that a vertical filing system be adopted; soon the department was using a numerical subject-based filing system housed in filing cabinets. 

The shift from bound volumes to filing systems is a milestone in the history of classification; the contemporaneous shift to vertical filing cabinets is a milestone in the history of storage….(More)”.

How to get people to talk to one another again? Citizens’ assemblies


Interview with Jane Mansbridge, Adams Professor of Political Leadership and Democratic Values Emerita at the Harvard Kennedy School and author of “Beyond Adversary Democracy.” Her current work revolves around representation, democratic deliberation, and everyday activism:

GAZETTE: How might we get citizens who are so polarized to listen to one another?

MANSBRIDGE: One proven practice is the technique of citizens’ assemblies or deliberative polls. These are groups of citizens drawn randomly, through a democratic lottery, from a particular population. It could be an entire country, a state, a city, or even a neighborhood, from which you bring together a group of citizens to talk about an issue that is of concern to their community. For this technique to be successful, the group has to be random, meaning that you have to have good representation from everyone, not just the white retirees who don’t have much to do and would love to come to this sort of thing. To get a random group, you ought to be able to pay the participants because you want to be able to get the poor, the less educated, and people who, for one reason or another, would not give up a weekend otherwise to come together with other citizens to deliberate about some major issue.

GAZETTE: How do we know these assemblies foster civil dialogue?

MANSBRIDGE: Let’s take the deliberative polling organized by the Center for Deliberative Democracy at Stanford that I’ve worked with, in informal ways, for a couple of decades. If you look at those gatherings, one important way to get citizens to listen to one another comes from their design, in which they alternate small groups of 12 or so people, randomly drawn from the random sample, with larger assemblies, in which the citizens put questions to experts. One of the tasks they have in their small group is not only to deliberate about the issues, but to design questions they want to ask the experts. As it happens, the project of asking a common question becomes a task that binds citizens together across the lines of difference….(More)”

The Conference on the Future of Europe—an Experiment in Citizens’ Participation


Stefan Lehne at Carnegie Europe: “If the future of Europe is to be decided at the Conference on the Future of Europe, we should be extremely worried.

Clearly, this has been one of the least fortunate EU projects of recent years. Conceived by French President Emmanuel Macron in 2019 as a response to the rise of populism, the conference fell victim, first to the pandemic and then to institutional squabbling over who should lead it, resulting in a delay of an entire year.

The setup of the conference emerging from months of institutional infighting is strangely schizophrenic.

On the one hand, it offers a forum for interinstitutional negotiations, where representatives of the European Parliament demanding deeper integration will confront a number of governments staunchly opposed to transferring further powers to the EU. 

On the other, the conference provides for an innovative experiment in citizens’ participation. A multilingual interactive website—futureu.europa.eu—offers citizens the opportunity to share and discuss ideas and to organize events. Citizens’ panels made up of randomly selected people from across the EU will discuss various policy areas and feed their findings into the debate of the conference’s plenary….

In the first three weeks 11,000 people participated in the digital platform, sharing more than 2,500 ideas on various aspects of the EU’s work.

A closer look reveals that many of the participants are engaged citizens and activists who use the website as just another format to propagate their demands. The platform thus offers a diverse and colorful discussion forum, but is unlikely to render a representative picture of the views of the average citizen.

This is precisely the objective of the citizens’ panels: an attempt to introduce an element of deliberative democracy into EU politics.

Deliberative assemblies have in recent decades become a prominent feature of democratic life in many countries. They work best at the local level, where participants understand each other well and are already roughly familiar with the issue at stake.

But they have also been employed at the national level, such as the citizens’ assembly preparing the referendum on abortion in Ireland or the citizens’ convention on climate in France.

The European Commission has rich experience, having held more than 1,800 citizens’ consultations, but apart from a single rather symbolic experiment in 2018, a genuine citizens’ panel based on sortition has never been attempted at the European level.

Deliberative democracy is all about initiating an open discussion, carefully weighing the evidence, and thus allowing convergence toward a broadly shared agreement. Given the language barriers and the highly diverse cultural background of European citizens, this is difficult to accomplish at the EU level.

Also, many of the subject areas of the conference, ranging from climate to economic and social policy, are technically complex. It is clear that a great deal of expert advice and time will be necessary to enable citizens to engage in meaningful deliberation on these topics.

Unfortunately, the limited timeframe and the insufficient resources of the conference—financing depends on contributions from the institutions—make it doubtful that the citizens’ panels will be conducted in a sufficiently serious manner.

There is also—as is so often the case with citizens’ assemblies—the crucial question of the follow-up. In the case of the conference, the recommendations of the panels, together with the content of the digital platform and the outcome of events in the member states, will feed into the discussions of the plenary….(More)”

Why governing data is key for the future of cities


Article by Carlos Santiso and Marcelo Facchina: “Technology is changing city dwellers’ lives, as well as how urban centres evolve to meet their needs. The pandemic has accelerated this transformation, and the digital transition has generated an explosion of data, especially in cities. In this context, the ability of local governments to manage urban problems will be paramount for the recovery, and the pandemic has helped us better understand the missing elements we need to govern cities effectively. For instance, the World Bank’s World Development Report of 2021 underscored that a data infrastructure policy is one of the building blocks of a good data governance framework, both to foster the local data economy and promote digital inclusion.  

It is inconceivable not to consider cities as an integral part of the solution to challenges like tackling social exclusion, improving public services and reducing insecurity, among others. A key issue that has become increasingly prominent in city agendas is the good governance of data; that is how data is handled and for what purpose, its quality and integrity, as well as the privacy and security concerns related to its collection and use. In other words, city governments need to preserve people’s trust in the way they handle data to improve lives.

A modern local government cannot be sustained without good data governance, secure data infrastructure, and digital talent to extract public value from data. Data policy must therefore act as an enabler of transformation strategies, defining the scope, direction, responsibilities and procedures for the effective and responsible use of data for more responsive and resilient cities.

At the national level, “delivery units” have gained relevance as instruments for managing change in governments and driving the effective implementation of strategic priorities. These management models led by central government have proven to be effective instruments for achieving priority goals and delivering major projects.

The model is even being expanded to subnational governments, as in the case of Colombia. Municipalities interact directly with citizens in providing public services, and innovations like the “delivery units” can help improve citizen satisfaction with government services. In a recent study, we show how Latin American cities, for example Recife and Rio de Janeiro in Brazil, have leveraged these innovations in public management as a strategic planning tool, building on the pioneering experience of New York. Another interesting case is Buenos Aires, in Argentina, where systematic monitoring of government commitments by the Compliance Management Unit achieved a significant decrease in murder rates (43%) and road accidents (33%) between 2015 and 2019.

The pivotal role of new technologies and the strategic use of data by municipal governments can also improve delivery of services, making them more accessible, agile, efficient and less costly. In another recent study, we look at the case of 12 cities around the world and in the region, including Boston, Seoul, London, Buenos Aires, Medellin, Mexico and Recife that are seeking to strengthen their strategic management with more intensive use of data to better meet the growing expectations of their citizens….(More)”.

‘Belonging Is Stronger Than Facts’: The Age of Misinformation


Max Fisher at the New York Times: “There’s a decent chance you’ve had at least one of these rumors, all false, relayed to you as fact recently: that President Biden plans to force Americans to eat less meat; that Virginia is eliminating advanced math in schools to advance racial equality; and that border officials are mass-purchasing copies of Vice President Kamala Harris’s book to hand out to refugee children.

All were amplified by partisan actors. But you’re just as likely, if not more so, to have heard them relayed from someone you know. And you may have noticed that these cycles of falsehood-fueled outrage keep recurring.

We are in an era of endemic misinformation — and outright disinformation. Plenty of bad actors are helping the trend along. But the real drivers, some experts believe, are social and psychological forces that make people prone to sharing and believing misinformation in the first place. And those forces are on the rise.

“Why are misperceptions about contentious issues in politics and science seemingly so persistent and difficult to correct?” Brendan Nyhan, a Dartmouth College political scientist, posed in a new paper in Proceedings of the National Academy of Sciences.

It’s not for want of good information, which is ubiquitous. Exposure to good information does not reliably instill accurate beliefs anyway. Rather, Dr. Nyhan writes, a growing body of evidence suggests that the ultimate culprits are “cognitive and memory limitations, directional motivations to defend or support some group identity or existing belief, and messages from other people and political elites.”

Put more simply, people become more prone to misinformation when three things happen. First, and perhaps most important, is when conditions in society make people feel a greater need for what social scientists call ingrouping — a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems.

As much as we like to think of ourselves as rational beings who put truth-seeking above all else, we are social animals wired for survival. In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict putting our righteous ingroup against a nefarious outgroup….(More)”.

Who is “Public” Data Really For?


Jer Thorp at Literary Hub: “Public” is a word that has, in the last decade, become bound tightly to data. Loosely defined, any data that is available in the public domain falls into this category, but the term is most often used to describe data that might serve some kind of civic purpose: census data or environmental data or health data, along with transparency-focused data like government budgets and reports. Often sidled up to “public” is the word “open.” Although the Venn diagram between the two words has ample overlap (public data is often open, and vice versa), the word “open” typically refers to if and how the data is accessible, rather than the ends to which it might be put.

Both words—“public” and “open”—invite a question: For whom? Despite the efforts of Mae and Gareth, and Tom Grundner and many others, the internet as it exists is hardly a public space. Many people still find themselves excluded from full participation. Access to anything posted on a city web page or on a .gov domain is restricted by barriers of cost and technical ability. Getting this data can be particularly hard for communities that are already marginalized, and both barriers—financial and technical—can be nearly impassable in places with limited resources and literacies.

Data.gov, the United States’ “open data portal,” lists nearly 250,000 data sets, an apparent bounty of free information. Spend some time on data.gov and other portals, though, and you’ll find out that public data as it exists is messy and often confusing. Many hosted “data sets” are links to URLs that are no longer active. Trying to access data about Native American communities from the American Community Survey on data.gov brought me first to a census site with an unlabeled list of file folders. Downloading a zip file and unpacking it resulted in 64,086 cryptically named text files each containing zero kilobytes of data. As someone who has spent much of the last decade working with these kinds of data, I can tell you that this is not an uncommon experience. All too often, working with public data feels like assembling particularly complicated Ikea furniture with no tools, no instructions, and an unknown number of missing pieces.

Today’s public data serves a particular type of person and a specific type of purpose. Mostly, it supports technically adept entrepreneurs. Civic data initiatives haven’t been shy about this; on data.gov’s impact page you’ll find a kind of hall-of-fame list of companies that are “public data success stories”: Kayak, Trulia, Foursquare, LinkedIn, Realtor.com, Zillow, Zocdoc, AccuWeather, Carfax. All of these corporations have, in some fashion, built profit models around public data, often charging for access to the very information that the state touts as “accessible, discoverable, and usable.”…(More)”.

Why Aren’t Text Message Interventions Designed to Boost College Success Working at Scale?


Article by Ben Castleman: “I like to think of it as my Mark Zuckerberg moment: I was a graduate student and it was a sweltering summer evening in Cambridge. Text messages were slated to go out to recent high school graduates in Massachusetts and Texas. Knowing that thousands of phones would soon start chirping and vibrating with information about college, I refreshed my screen every 30 seconds, waiting to see engagement statistics on how students would respond. Within a few minutes there were dozens of new responses from students wanting to connect with an advisor to discuss their college plans.

We’re approaching the tenth anniversary of that first text-based advising campaign to reduce summer melt—when students have been accepted to and plan to attend college upon graduating high school, but do not start college in the fall. Texting by businesses is now so ubiquitous that it’s hard to remember how innovative the channel was; back in the early 2010s, text was primarily used for social and conversational communication. Maybe the occasional doctor’s office or airline would send a text reminder, but SMS was not broadly used as a channel by schools or colleges.

Those novel text nudges appeared successful. Results from a randomized controlled trial (RCT) that I conducted with Lindsay Page showed that students who received the texts reminding them of pre-enrollment tasks and connecting them with advisors enrolled in college at higher rates. We had the opportunity to replicate our summer melt work two summers later in additional cities and with engagement from the White House Social and Behavioral Sciences team and found similar impacts.

This evidence emerged as the Obama administration made higher ed policy a greater focus in the second term, with a particular emphasis on expanding college opportunity for underrepresented students. Similar text campaigns expanded rapidly and broadly—most notably former First Lady Michelle Obama’s Up Next campaign—in part because they check numerous boxes for policymakers and funders: Texts are inexpensive to send; text campaigns are relatively easy to implement; and there was evidence of their effectiveness at expanding college access….(More)”.

The lapses in India’s Covid-19 data are a result of decades of callousness towards statistics


Prathamesh Mulye at Quartz: “India is paying a huge price for decades of callous attitude towards data and statistics. For several weeks now, experts have been calling out the Indian government and state heads for suppressing Covid-19 infection and death figures. None of the political leaders have addressed these concerns even as official data reflects a small fraction of what’s playing out at hospitals and cremation grounds.

A major reason why administrations are getting away without an answer is that data lapses are nothing new to India.

Successive regimes in the country have tinkered with and twisted figures to suit their convenience, without much consequence. For years, the country has been criticised for insufficient and poor quality data relating to a range of topics, including GDP, farmer suicide, and even unemployment…

Before the pandemic started, the most prominent data controversy in India was around the GDP numbers, which the Modi government continuously changed and chopped to cover up the slowdown in economic growth. In 2019, the Modi government also chose not to publish an unemployment data report that showed that joblessness in the country was at a nine-year high in 2017-18. And last year, in the middle of the pandemic, the government said it had no data on the number of frontline workers who had lost their lives to Covid-19 or a list of police personnel fatalities due to the disease.

Experts say that India’s statistical machinery has been deliberately weakened over the past few years to protect various governments’ false claims and image.

“The weakened statistical machinery manifests itself in different ways such as delays and questions about data quality. Also, when the results of a survey don’t suit the government in power, it tries to suppress data. This happened, for instance, with nutrition data in previous governments too,” said Reetika Khera, associate professor at the Indian Institute of Technology (IIT), Delhi.

“Think of the economy as a patient: data captures its pulse rate. If you don’t listen to the pulse, you won’t be able to diagnose correctly, let alone cure it,” she added….(More)”

Alphonse Bertillon and the Troubling Pursuit of Human Metrics


Article by Jessica Helfand: “…We are groomed, from an early age, to crave measurement. Notches on walls verify our height. Notes from doctors record our weight. We buy scales and diaries, save report cards and log achievements. As babies become toddlers become adolescents and adults, we take pictures — lots of pictures. Memories registered and milestones passed, we willingly share our data by way of a host of forms that cumulatively present, over a lifetime, as a kind of gold standard. On paper or online, they’re our material witnesses, holding the temporal at bay.

Dora’s material witness is typical of the sorts of records to which all of us are attached, official documents that connect faces to places, snapshots to statistics. Bureaucratic and perfunctory as they are, we seldom stop to question the silent power of these documents, even as they transport our collective selves across time and space. Lacking nuance, devoid of emotion, they nevertheless confer a kind of keen graphic authority, begetting permission, enabling access, presupposing legitimacy, and anticipating a host of needs. Framed by the records that circumscribe that legitimacy — the records and diplomas, ID cards and passports and licenses — the playing field of difference is homogenized by numerical necessity, making all of us, in a sense, prisoners of the indexical.
The pursuit of human metrics has a rich and fascinating history, dating back to the ancient Greeks, who viewed proportion itself as a physical projection of the harmony of the universe. Idealized proportion was synonymous with beauty, a physical expression of divine benevolence. (“The good, of course, is always beautiful,” wrote Plato, “and the beautiful never lacks proportion.”) From Dürer to da Vinci, the notion that humans might aspire to a pure and balanced ideal would find expression in everything from the writings of Vitruvius to the gardens of Le Nôtre to the evolution of the humanist alphabet. To the degree that proportion itself was deemed closer to the divine when realized as an expression of balance and geometry, proportion had everything to do with mathematics in general (and the golden section in particular) and found its most profound expression in the realization of the human form.

While there is ample evidence to suggest that the urge to measure had its origins in ancient civilizations, the science of bodily measurement was not recognized as a proper professional pursuit until the 19th century. With the advent of industry and the pragmatic concerns with which it was associated — growth projections, profit motives, numerical evidence as approved metrics for evaluation — certain public institutions were perhaps uniquely sensitized to appreciate the value of quantitative data. Statistics as a field of mathematical inquiry gained traction as a discipline thanks in no small part to the scholarship of Sir Francis Galton, whose obsession with counting and measuring everything imaginable (but especially human beings) warrants mention here. His “Anthropometric Laboratory” — which was included in the International Health Exhibition held in London in 1884 — was an attempt to show the public how human characteristics could be both measured and recorded. Add to this the rise in photography as a promising new technology and the idea of capturing evidence via methodical efforts in data mining was an idea whose time had clearly come….(More)”