Whose Streets? Our Streets!


Report by Rebecca Williams: “The extent to which “smart city” technology is altering our sense of freedom in public spaces deserves more attention if we want a democratic future. Democracy, the rule of the people, constitutes our collective self-determination and protects us against domination and abuse. Democracy requires safe spaces, or commons, for people to organically and spontaneously convene regardless of their background or position to campaign for their causes, discuss politics, and protest. It is in these commons, where anyone can take a stand and be noticed, that a notion of the collective good can be developed and communicated. Public spaces, like our streets, parks, and squares, have historically played a significant role in the development of democracy. We should fight to preserve the freedoms intrinsic to our public spaces because they make democracy possible.

Last summer, approximately 15 to 26 million people participated in Black Lives Matter protests after the murder of George Floyd, making it the largest mass movement in U.S. history. In June, the San Diego Police Department obtained footage of Black Lives Matter protesters from “smart streetlight” cameras, sparking shock and outrage from San Diego community members. These “smart streetlights” were promoted as part of citywide efforts to become a “smart city” to help with traffic control and air quality monitoring. Despite discoverable documentation about the streetlights’ capabilities and data policies on the city’s website, including a data-sharing agreement about how data would be shared with the police, the community had no expectation that the streetlights would be surveilling protesters. After media coverage and ongoing advocacy from the Transparent and Responsible Use of Surveillance Technology San Diego (TRUSTSD) coalition, the City Council set aside the funding for the streetlights until a surveillance technology ordinance was considered, and the Mayor ordered the 3,000+ streetlight cameras off. Due to the way power was supplied to the cameras, they remained on, but the city reported it no longer had access to the data they collected. In November, the City Council voted unanimously in favor of a surveillance ordinance and to establish a Privacy Advisory Board. In May, it was revealed that the San Diego Police Department had previously (in 2017) withheld materials from Congress’s House Committee on Oversight and Reform about its use of facial recognition technology. This story, with its mission creep and mishaps, is representative of a broader set of “smart city” cautionary trends that took place in the last year. These cautionary trends compel us to ask: if our public spaces become places where one fears punishment, how will that affect collective action and political movements?

This report is an urgent warning of where we are headed if we maintain our current trajectory of augmenting our public space with trackers of all kinds. In this report, I outline how current “smart city” technologies can watch you. I argue that all “smart city” technology trends toward corporate and state surveillance, and that if we don’t stop and blunt these trends now, totalitarianism, panopticonism, discrimination, privatization, and solutionism will challenge our democratic possibilities. This report examines these harms through cautionary trends supported by examples from this last year and provides 10 calls to action for advocates, legislatures, and technology companies to prevent these harms. If we act now, we can ensure that the technology in our public spaces protects and promotes democracy and that we do not continue down this path of an elite few tracking the many….(More)”

Off-Label: How tech platforms decide what counts as journalism


Essay by Emily Bell: “…But putting a stop to militarized fascist movements—and preventing another attack on a government building—will ultimately require more than content removal. Technology companies need to fundamentally recalibrate how they categorize, promote, and circulate everything under their banner, particularly news. They have to acknowledge their editorial responsibility.

The extraordinary power of tech platforms to decide what material is worth seeing—under the loosest possible definition of who counts as a “journalist”—has always been a source of tension with news publishers. These companies have now been put in the position of being held accountable for developing an information ecosystem based in fact. It’s unclear how much they are prepared to do, or whether they will ever really invest in pro-truth mechanisms on a global scale. But it is clear that, after the Capitol riot, there’s no going back to the way things used to be.

Between 2016 and 2020, Facebook, Twitter, and Google made dozens of announcements promising to increase the exposure of high-quality news and get rid of harmful misinformation. They claimed to be investing in content moderation and fact-checking; they assured us that they were creating helpful products like the Facebook News Tab. Yet the result of all these changes has been hard to examine, since the data is both scarce and incomplete. Gordon Crovitz—a former publisher of the Wall Street Journal and a cofounder of NewsGuard, which applies ratings to news sources based on their credibility—has been frustrated by the lack of transparency: “In Google, YouTube, Facebook, and Twitter we have institutions that we know all give quality ratings to news sources in different ways,” he told me. “But if you are a news organization and you want to know how you are rated, you can ask them how these systems are constructed, and they won’t tell you.” Consider the mystery behind blue-check certification on Twitter, or the absurdly wide scope of the “Media/News” category on Facebook. “The issue comes down to a fundamental failure to understand the core concepts of journalism,” Crovitz said.

Still, researchers have managed to put together a general picture of how technology companies handle various news sources. According to Jennifer Grygiel, an assistant professor of communications at Syracuse University, “we know that there is a taxonomy within these companies, because we have seen them dial up and dial down the exposure of quality news outlets.” Internally, platforms rank journalists and outlets and make certain designations, which are then used to develop algorithms for personalized news recommendations and news products….(More)”

Power to the Public: The Promise of Public Interest Technology


Book by Tara Dawson McGuinness and Hana Schank: “As the speed and complexity of the world increases, governments and nonprofit organizations need new ways to effectively tackle the critical challenges of our time—from pandemics and global warming to social media warfare. In Power to the Public, Tara Dawson McGuinness and Hana Schank describe a revolutionary new approach—public interest technology—that has the potential to transform the way governments and nonprofits around the world solve problems. Through inspiring stories about successful projects ranging from a texting service for teenagers in crisis to a streamlined foster care system, the authors show how public interest technology can make the delivery of services to the public more effective and efficient.

At its heart, public interest technology means putting users at the center of the policymaking process, using data and metrics in a smart way, and running small experiments and pilot programs before scaling up. And while this approach may well involve the innovative use of digital technology, technology alone is no panacea—and some of the best solutions may even be decidedly low-tech.

Clear-eyed yet profoundly optimistic, Power to the Public presents a powerful blueprint for how government and nonprofits can help solve society’s most serious problems….(More)”

More Than Nudges Are Needed to End the Pandemic


Richard Thaler in the New York Times: “…In the case of Covid vaccinations, society cannot afford to wait decades. Although vaccines are readily available and free for everyone over age 12 in the United States, there are many holdouts. About 40 percent of the adult population has not been fully vaccinated, and about a third has not yet gotten even one dose. It is time to get serious.

Of course, information campaigns must continue to stress the safety and efficacy of the vaccines, but it is important to target the messages at the most hesitant groups. It would help if the F.D.A. gave the vaccines its full approval rather than the current emergency use designation. Full approval for the Pfizer drug may come as soon as Labor Day, but the process for the other vaccines is much further behind.

One way to increase vaccine take-up would be to offer monetary incentives. For example, President Biden has recently advocated paying people $100 to get their shots.

Although this policy is well intended, I believe it is a mistake for a state or a country to offer to pay individuals to get vaccinated. First of all, the amount might be taken to be an indicator of how much — or little — the government thinks getting a jab is worth. Surely the value to society of increased vaccinations is well beyond $100 per person.

Second, it seems increasingly likely that one or more booster shots may be necessary for some populations in the United States to deal with the Delta variant of the coronavirus — and, perhaps, other variants as well. If that happens, we don’t want some people to procrastinate, hoping to get paid. Government-sponsored booster shots are already beginning in Israel and are at various stages of planning in several European countries.

An alternative model is being offered by the National Football League, which has stopped short of requiring players to be vaccinated but is offering plenty of incentives. Unvaccinated players have to be tested every day, must be masked and at a distance from teammates on flights, and must stay in their room until game day. Vaccinated players who test positive and are asymptomatic can return to duty after two negative tests 24 hours apart. But unvaccinated players must undergo a 10-day isolation period.

These incentives followed a long effort to educate the players about the benefits to themselves, their families and fellow players. It is hard to say which aspect of the N.F.L. plan is doing the work, but over 90 percent of the league’s players have received at least one jab. The fact that a team could lose a game because an unvaccinated player can’t play creates a powerful group dynamic…(More)”.

The A, B and C of Democracy: Or Cats in the Sack


Book by Luca Belgiorno-Nettis and Kyle Redman: “This is a learner’s guide to a better democracy. Sounds ambitious? It is. The catalyst for publishing this book is obvious. There’s no need to regurgitate the public’s disaffection with politics. Mired in the tawdry mechanics of political campaigning, and incapable of climbing out of cyclical electioneering contests, representative democracies are stuck in a rut.

As Dawn Nakagawa, Vice President of the Berggruen Institute, writes, ‘Democratic reform is hard. We are very attached to our constitutions and institutions, even to the point of romanticising it all.’

This handbook is an introduction to minipublics – otherwise known as citizens’ juries or assemblies – interspersed with a few travel anecdotes to share the momentum behind the basic methodology of deliberative democracy.

As the world accelerates into its digital future, with new modes of working, connecting and living, our parliaments remain relics from a primordial, ideological and adversarial age. Meanwhile, urgent challenges are stumbling to half-solutions in slow motion. Collaboration amongst us humans in the Anthropocene is no longer just a nice-to-have….(More)”.

Crowdsourced Sensor Data Powers Smoke Map


OpenGov: “The Environmental Protection Agency and the U.S. Forest Service (USFS) have released updates to the AirNow Fire and Smoke Map to help protect communities from the effects of wildfire smoke. Started as a pilot project last year, the map pulls data from three sources: temporary monitors such as those the Forest Service and other agencies have deployed near fires; crowdsourced data from nearly 10,000 low-cost sensors nationwide that measure fine particle pollution, the major harmful pollutant in smoke; and monitors that regularly report to AirNow, EPA’s one-stop source for air quality data.

The agencies announced improvements to the map, including a dashboard that gives users quick access to information that can help them plan their outdoor activities, the current Air Quality Index (AQI) category at the monitor or sensor location, data showing whether air quality is improving or worsening, and information about actions to consider taking based on the AQI.
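To make the AQI piece of that dashboard concrete, here is a minimal sketch of how a raw PM2.5 reading from a low-cost sensor could be mapped to an AQI value and category. It uses the EPA’s published PM2.5 breakpoints in effect at the time; the live map applies additional logic (sensor correction, real-time weighting) not shown here, so treat this as an illustration rather than AirNow’s actual code.

```python
import math

# EPA 24-hour PM2.5 breakpoints (ug/m3) and the AQI band each maps to.
# (conc_lo, conc_hi, aqi_lo, aqi_hi, category)
PM25_BREAKPOINTS = [
    (0.0,   12.0,    0,  50, "Good"),
    (12.1,  35.4,   51, 100, "Moderate"),
    (35.5,  55.4,  101, 150, "Unhealthy for Sensitive Groups"),
    (55.5,  150.4, 151, 200, "Unhealthy"),
    (150.5, 250.4, 201, 300, "Very Unhealthy"),
    (250.5, 500.4, 301, 500, "Hazardous"),
]

def pm25_to_aqi(conc_ugm3: float) -> tuple[int, str]:
    """Map a PM2.5 concentration (ug/m3) to an (AQI value, category) pair."""
    c = math.floor(conc_ugm3 * 10) / 10  # EPA truncates PM2.5 to 0.1 ug/m3
    for c_lo, c_hi, i_lo, i_hi, category in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            # Linear interpolation within the matching breakpoint band.
            aqi = (i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo
            return round(aqi), category
    return 500, "Hazardous"  # beyond the top of the scale

print(pm25_to_aqi(8.5))    # -> (35, 'Good')
print(pm25_to_aqi(140.0))  # -> (195, 'Unhealthy')
```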

EPA and USFS developed the map pilot to provide information on fire locations, smoke plumes and air quality in one place. It had more than 7.4 million views in its first three months. The map imports data from almost 10,000 sensors in an air quality sensor network that crowdsources data on particle pollution, providing real-time measurement of air quality on a public map. This was a logical addition to two other projects already underway.

The extra data points the sensors provided proved useful in characterising air quality during the 2020 fire season, and we had positive reception from state, local and tribal air agency partners, and from the public. The map is intended for individuals to use in making decisions about outdoor activities based on air quality, but the unique fire, smoke and concentration data can help increase awareness of the significant impacts of wildfires across all levels of government — federal, state, local and tribal — and provide a valuable tool to assist agencies as they work to protect public health from wildfire smoke during these emergencies….(More)”.

The Time Tax


Article by Annie Lowrey: “…In my decade-plus of social-policy reporting, I have mostly understood these stories as facts of life. Government programs exist. People have to navigate those programs. That is how it goes. But at some point, I started thinking about these kinds of administrative burdens as the “time tax”—a levy of paperwork, aggravation, and mental effort imposed on citizens in exchange for benefits that putatively exist to help them. This time tax is a public-policy cancer, mediating every American’s relationship with the government and wasting countless precious hours of people’s time.

The issue is not that modern life comes with paperwork hassles. The issue is that American benefit programs are, as a whole, difficult and sometimes impossible for everyday citizens to use. Our public policy is crafted from red tape, entangling millions of people who are struggling to find a job, failing to feed their kids, sliding into poverty, or managing a disabling health condition.

… the government needs to simplify. For safety-net programs, this means eliminating asset tests, work requirements, interviews, and other hassles. It means federalizing programs like unemployment insurance and Medicaid. It means cross-coordinating, so that applicants are automatically approved for everything for which they qualify.

Finally, it needs to take responsibility for the time tax. Congress needs to pump money into the civil service and into user-friendly, citizen-centered programmatic design. And the federal government needs to reward states and the executive agencies for increasing uptake and participation rates, while punishing them for long wait times and other bureaucratic snafus.

Such changes would eliminate poverty and encourage trust in government. They would make American lives easier and simpler. Yes, Washington should give Americans more money and more security. But most of all, it should give them back their time….(More)”.

….

A comprehensive study of technological change


Article by Scott Murray: The societal impacts of technological change can be seen in many domains, from messenger RNA vaccines and automation to drones and climate change. The pace of that technological change can affect its impact, and how quickly a technology improves in performance can be an indicator of its future importance. For decision-makers like investors, entrepreneurs, and policymakers, predicting which technologies are fast improving (and which are overhyped) can mean the difference between success and failure.

New research from MIT aims to assist in the prediction of technology performance improvement using U.S. patents as a dataset. The study describes 97 percent of the U.S. patent system as a set of 1,757 discrete technology domains, and quantitatively assesses each domain for its improvement potential.

“The rate of improvement can only be empirically estimated when substantial performance measurements are made over long time periods,” says Anuraag Singh SM ’20, lead author of the paper. “In some large technological fields, including software and clinical medicine, such measures have rarely, if ever, been made.”

A previous MIT study provided empirical measures for 30 technological domains, but the patent sets identified for those technologies cover less than 15 percent of the patents in the U.S. patent system. The major purpose of this new study is to provide predictions of the performance improvement rates for the thousands of domains not assessed by empirical measurement. To accomplish this, the researchers developed a method using a new probability-based algorithm, machine learning, natural language processing, and patent network analytics….(More)”.
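The article does not spell out the method, but prior work from this group found that a domain’s improvement rate correlates with the network position of its patents. Purely as a toy illustration of that idea, and emphatically not the authors’ actual pipeline, the sketch below scores two domains by the mean PageRank of their patents in a citation graph; the patent IDs, domain assignments, and calibration constants are all invented.

```python
# Toy sketch only: scoring technology domains by the network position of
# their patents. The MIT study's real pipeline (probability-based patent
# classification, NLP, network analytics) is far more elaborate.
import networkx as nx

# Citation edges: (citing_patent, cited_patent). All IDs are hypothetical.
citations = [("P5", "P1"), ("P5", "P2"), ("P6", "P2"), ("P6", "P3"),
             ("P7", "P5"), ("P8", "P5"), ("P8", "P6")]
G = nx.DiGraph(citations)
pagerank = nx.pagerank(G)  # heavily cited, central patents score higher

# Hypothetical assignment of patents to technology domains.
domain_patents = {"domain_A": ["P1", "P2", "P5"], "domain_B": ["P3", "P6"]}

for domain, patents in domain_patents.items():
    score = sum(pagerank.get(p, 0.0) for p in patents) / len(patents)
    # Invented linear calibration from centrality to an annual improvement
    # rate; real coefficients would be fit on empirically measured domains.
    k = 0.02 + 2.5 * score
    print(f"{domain}: mean centrality {score:.3f} -> est. rate ~{k:.1%}/yr")
```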

Census Data Change to Protect Privacy Rattles Researchers, Minority Groups


Paul Overberg at the Wall Street Journal: A plan to protect the confidentiality of Americans’ responses to the 2020 census by injecting small, calculated distortions into the results is raising concerns that it will erode their usability for research and distribution of state and federal funds.

The Census Bureau is due to release the first major results of the decennial count in mid-August. They will offer the first detailed look at the population and racial makeup of thousands of counties and cities, as well as tribal areas, neighborhoods, school districts and smaller areas that will be used to redraw congressional, legislative and local districts to balance their populations.

The bureau will adjust most of those statistics to prevent someone from recombining them in a way that would disclose information about an individual respondent. Testing by the bureau shows that improvements in data science, computing power and commercial databases make that feasible.

Last week the bureau’s acting director said the plan was a necessary update of older methods to protect confidentiality. Ron Jarmin said the agency searched for alternatives before settling on differential privacy, a systematic approach to adding statistical noise to data, something it has done in some fashion for years.
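The bureau’s actual TopDown algorithm is considerably more elaborate, but the core idea of differential privacy fits in a few lines: perturb each released statistic with random noise whose scale is set by a privacy parameter epsilon. A minimal sketch of the textbook Laplace mechanism, with invented block counts:

```python
import numpy as np

rng = np.random.default_rng(42)

def dp_count(true_count: int, epsilon: float) -> int:
    """Release a count via the Laplace mechanism.

    Adding or removing one respondent changes a count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon yields
    epsilon-differential privacy for that count.
    """
    noisy = true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return max(0, round(noisy))  # post-process: counts cannot be negative

# Invented block-level populations; smaller epsilon means more noise.
for block, n in {"block_A": 4128, "block_B": 312, "block_C": 9}.items():
    print(block, n, "->", dp_count(n, epsilon=0.5))
```

Because the noise scale is fixed while true counts vary, the smallest counts, such as a sparsely populated block or a small racial group within a tract, are distorted proportionally the most, which is exactly what worries researchers and minority-group advocates about the fine-grained data.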

“I’m pretty confident that it’s going to meet users’ expectations,” Mr. Jarmin said at a panel during an online conference of government data users. “We have to deal with the technology as it is and as it evolves.”…(More)”.

An Obsolete Paradigm


Blogpost by Paul Wormelli: “…Our national system of describing the extent of crime in the U.S. is broken beyond repair and deserves to be replaced by a totally new paradigm (system). 

Since 1930, we have relied on the metrics generated by the Uniform Crime Reporting (UCR) Program to describe crime in the U.S., but it simply does not do so, even with its evolution into the National Incident-Based Reporting System (NIBRS). Criminologists have long recognized the limited scope of the UCR summary crime data, leading to the creation of the National Crime Victimization Survey (NCVS) and other supplementary crime data measurement vehicles. However, despite these measures, the United States still has no comprehensive national data on the amount of crime that has occurred. Even after decades of collecting data, the 1967 Presidential Crime Commission report, The Challenge of Crime in a Free Society, lamented the absence of sound and complete data on crime in the U.S. and called for the creation of a National Crime Survey (NCS), the forerunner of the NCVS. Since then, we have slowly attempted to make improvements that will lead to more robust data. Only in 2021 did the FBI end UCR summary-based crime data collection and move to NIBRS crime data collection on a national scale.

Admittedly, the shift to NIBRS will unleash a sea change in how we analyze crime data and use it for decision making. However, it still falls short of complete national crime reporting. In the landmark study of the National Academy of Sciences Committee on National Statistics (funded by the FBI and the Bureau of Justice Statistics to make recommendations on modernizing crime statistics), the panel members grappled with this reality and called out the absence of national statistics on crime that would fully inform policymaking on this critical subject….(More)”