Governments Empower Citizens by Promoting Digital Rights


Article by Julia Edinger: “The rapid rise of digital services and smart city technology has elevated concerns about privacy in the digital age and government’s role, even as cities from California to Texas take steps to make constituents aware of their digital rights.

Earlier this month, Long Beach, Calif., launched an improved version of its Digital Rights Platform, which shows constituents their data privacy and digital rights, along with information about how the city uses technology while protecting those rights.

“People’s digital rights are no different from their human or civil rights, except that they’re applied to how they interact with digital technologies — when you’re online, you’re still entitled to every right you enjoy offline,” said Will Greenberg, staff technologist at the Electronic Frontier Foundation (EFF), in a written statement. The nonprofit organization defends civil liberties in the digital world.


Long Beach’s platform initially launched several years ago to mitigate privacy concerns arising from the 2020 launch of a smart city initiative, according to Long Beach CIO Lea Eriksen. When that initiative debuted, the Department of Innovation and Technology requested the City Council approve a set of data privacy guidelines to ensure digital rights would be protected, setting the stage for the initial platform launch. Its 2021 beta version now offers information on 22 city technology uses, up from two, plus an improved feedback module enabling continued engagement and platform improvements…(More)”.

Digitally Invisible: How the Internet is Creating the New Underclass


Book by Nicol Turner Lee: “President Joe Biden has repeatedly said that the United States would close the digital divide under his leadership. However, the divide still affects people and communities across the country. The complex and persistent reality is that millions of residents live in digital deserts, and many more face disproportionate difficulties when it comes to getting and staying online, especially people of color, seniors, rural residents, and farmers in remote areas.

Economic and health disparities are worsening in rural communities without available internet access. Students living in urban digital deserts with little technology exposure are ill-prepared to compete for emerging occupations. Even seniors struggle to navigate the aging process without access to online information and remote care.

In this book, Nicol Turner Lee, a leading expert on the American digital divide, uses personal stories from individuals around the country to show how the emerging digital underclass is navigating the spiraling online economy, while sharing their joys and hopes for an equitable and just future.

Turner Lee argues that achieving digital equity is crucial for the future of America’s global competitiveness and requires radical responses to offset the unintended consequences of increasing digitization. In the end, “Digitally Invisible” proposes a pathway to more equitable access to existing and emerging technologies, while encouraging readers to weigh in on this shared goal…(More)”.

Reliability of U.S. Economic Data Is in Jeopardy, Study Finds


Article by Ben Casselman: “A report says new approaches and increased spending are needed to ensure that government statistics remain dependable and free of political influence.

Federal Reserve officials use government data to help determine when to raise or lower interest rates. Congress and the White House use it to decide when to extend jobless benefits or send out stimulus payments. Investors place billions of dollars’ worth of bets that are tied to monthly reports on job growth, inflation and retail sales.

But a new study says the integrity of that data is in increasing jeopardy.

The report, issued on Tuesday by the American Statistical Association, concludes that government statistics are reliable right now. But that could soon change, the study warns, citing factors including shrinking budgets, falling survey response rates and the potential for political interference.

The authors — statisticians from George Mason University, the Urban Institute and other institutions — likened the statistical system to physical infrastructure like highways and bridges: vital, but often ignored until something goes wrong.

“We do identify this sort of downward spiral as a threat, and that’s what we’re trying to counter,” said Nancy Potok, who served as chief statistician of the United States from 2017 to 2019 and was one of the report’s authors. “We’re not there yet, but if we don’t do something, that threat could become a reality, and in the not-too-distant future.”

The report, “The Nation’s Data at Risk,” highlights the threats facing statistics produced across the federal government, including data on education, health, crime and demographic trends.

But the risks to economic data are particularly notable because of the attention it receives from policymakers and investors. Most of that data is based on surveys of households or businesses. And response rates to government surveys have plummeted in recent years, as they have for private polls. The response rate to the Current Population Survey — the monthly survey of about 60,000 households that is the basis for the unemployment rate and other labor force statistics — has fallen to about 70 percent in recent months, from nearly 90 percent a decade ago…(More)”.

Everyone Has A Price — And Corporations Know Yours


Article by David Dayen: “Six years ago, I was at a conference at the University of Chicago, the intellectual heart of corporate-friendly capitalism, when my eyes found the cover of the Chicago Booth Review, the business school’s flagship publication. “Are You Ready for Personalized Pricing?” the headline asked. I wasn’t, so I started reading.

The story looked at how online shopping, persistent data collection, and machine-learning algorithms could combine to generate the stuff of economists’ dreams: individual prices for each customer. It even recounted an experiment in 2015, in which online employment website ZipRecruiter essentially outsourced its pricing strategy to two University of Chicago economists, Sanjog Misra and Jean-Pierre Dubé…(More)”.
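
The mechanism the article describes can be sketched simply: estimate each customer's probability of buying at a given price from their data profile, then charge the price that maximizes expected revenue. The Python below is a toy illustration under a hypothetical logistic demand curve, not the model used in the ZipRecruiter experiment; every parameter is invented for illustration.

```python
import math

def purchase_prob(price: float, willingness: float, sensitivity: float) -> float:
    """Toy logistic demand curve: probability a customer buys at this price."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price - willingness)))

def personalized_price(willingness: float, sensitivity: float) -> int:
    """Search candidate prices for the one maximizing expected revenue."""
    candidates = range(10, 501, 5)
    return max(candidates, key=lambda p: p * purchase_prob(p, willingness, sensitivity))

# Two customers whose (hypothetical) profiles imply different willingness
# to pay are quoted different revenue-maximizing prices.
print(personalized_price(willingness=120.0, sensitivity=0.05))  # lower quote
print(personalized_price(willingness=300.0, sensitivity=0.02))  # higher quote
```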

Increasing The “Policy Readiness” Of Ideas


Article by Tom Kalil: “NASA and the Defense Department have developed an analytical framework called the “technology readiness level” for assessing the maturity of a technology – from basic research to a system ready for deployment.

A policy entrepreneur (anyone with an idea for a policy solution that will drive positive change) needs to realize that it is also possible to increase the “policy readiness” level of an idea by taking steps to improve the chances that the idea succeeds if adopted and implemented. Given that policymakers are often time-constrained, they are more likely to consider ideas where more thought has been given to the core questions they may need to answer as part of the policy process.

A good first step is to ask questions about the policy landscape surrounding a particular idea:

1. What is a clear description of the problem or opportunity?  What is the case for policymakers to devote time, energy, and political capital to the problem?

2. Is there a credible rationale for government involvement or policy change?  

Economists have developed frameworks for both market failure (such as public goods, positive and negative externalities, information asymmetries, and monopolies) and government failure (such as regulatory capture, the role of interest groups in supporting policies that have concentrated benefits and diffuse costs, limited state capacity, and the inherent difficulty of aggregating timely, relevant information to make and implement policy decisions).

3. Is there a root cause analysis of the problem? …(More)”.

The MAGA Plan to End Free Weather Reports


Article by Zoë Schlanger: “In the United States, as in most other countries, weather forecasts are a freely accessible government amenity. The National Weather Service issues alerts and predictions, warning of hurricanes and excessive heat and rainfall, all at the total cost to American taxpayers of roughly $4 per person per year. Anyone with a TV, smartphone, radio, or newspaper can know what tomorrow’s weather will look like, whether a hurricane is heading toward their town, or if a drought has been forecast for the next season. Even if they get that news from a privately owned app or TV station, much of the underlying weather data are courtesy of meteorologists working for the federal government.

Charging for popular services that were previously free isn’t generally a winning political strategy. But hard-right policymakers appear poised to try to do just that should Republicans gain power in the next term. Project 2025—a nearly 900-page book of policy proposals published by the conservative think tank the Heritage Foundation—states that an incoming administration should all but dissolve the National Oceanic and Atmospheric Administration, under which the National Weather Service operates… NOAA “should be dismantled and many of its functions eliminated, sent to other agencies, privatized, or placed under the control of states and territories,” Project 2025 reads. … “The preponderance of its climate-change research should be disbanded,” the document says. It further notes that scientific agencies such as NOAA are “vulnerable to obstructionism of an Administration’s aims,” so appointees should be screened to ensure that their views are “wholly in sync” with the president’s…(More)”.

Bringing Communities In, Achieving AI for All


Article by Shobita Parthasarathy and Jared Katzman: “…To this end, public and philanthropic research funders, universities, and the tech industry should be seeking out partnerships with struggling communities, to learn what they need from AI and build it. Regulators, too, should have their ears to the ground, not just the C-suite. Typical members of a marginalized community—or, indeed, any nonexpert community—may not know the technical details of AI, but they understand better than anyone else the power imbalances at the root of concerns surrounding AI bias and discrimination. And so it is from communities marginalized by AI, and from scholars and organizations focused on understanding and ameliorating social disadvantage, that AI designers and regulators most need to hear.

Progress toward AI equity begins at the agenda-setting stage, when funders, engineers, and corporate leaders make decisions about research and development priorities. This is usually seen as a technical or management task, to be carried out by experts who understand the state of scientific play and the unmet needs of the market… A heartening example comes from Carnegie Mellon University, where computer scientists worked with residents in the institution’s home city of Pittsburgh to build a technology that monitored and visualized local air quality. The collaboration began when researchers attended community meetings where they heard from residents who were suffering the effects of air pollution from a nearby factory. The residents had struggled to get the attention of local and national officials because they were unable to provide the sort of data that would motivate interest in their case. The researchers got to work on prototype systems that could produce the needed data and refined their technology in response to community input. Eventually their system brought together heterogeneous information, including crowdsourced smell reports, video footage of factory smokestacks, and air-quality and wind data, which the residents then submitted to government entities. After reviewing the data, administrators at the Environmental Protection Agency agreed to review the factory’s compliance, and within a year the factory’s parent company announced that the facility would close…(More)”.

Collaborating with Journalists and AI: Leveraging Social Media Images for Enhanced Disaster Resilience and Recovery


Paper by Dhiraj Murthy et al: “Methods to meaningfully integrate journalists into crisis informatics remain lacking. We explored the feasibility of generating a real-time, priority-driven map of infrastructure damage during a natural disaster by strategically selecting journalist networks to identify sources of image-based infrastructure-damage data. Using Twitter’s REST API, we collected 1,000,522 tweets from September 13-18, 2018, during and after Hurricane Florence made landfall in the United States. Tweets were classified by source (e.g., news organizations or citizen journalists), and 11,638 images were extracted. We utilized Google’s AutoML Vision software to develop a machine learning image classification model to interpret this sample of images. Of our labeled data, 80% was used for training, 10% for validation, and 10% for testing. The model achieved an average precision of 90.6%, an average recall of 77.2%, and an F1 score of 0.834. In the future, establishing strategic networks of journalists ahead of disasters will reduce the time needed to identify disaster-response targets, thereby focusing relief and recovery efforts in real time. This approach ultimately aims to save lives and mitigate harm…(More)”.
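
The reported metrics are internally consistent: F1 is the harmonic mean of precision and recall. A minimal Python sketch, with an illustrative 80/10/10 splitter standing in for AutoML Vision's own data handling (not the authors' code), checks the arithmetic:

```python
import random

def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def split_80_10_10(items, seed=0):
    """Shuffle labeled items and partition them into train/val/test sets."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * 0.8)
    n_val = int(len(shuffled) * 0.1)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

# Precision 90.6% and recall 77.2%, as reported, give the paper's F1.
print(round(f1_score(0.906, 0.772), 3))  # 0.834
```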

A new index is using AI tools to measure U.S. economic growth in a broader way


Article by Jeff Cox: “Measuring the strength of the sprawling U.S. economy is no easy task, so one firm is sending artificial intelligence in to do the job.

The Zeta Economic Index, launched Monday, uses generative AI to analyze what its developers call “trillions of behavioral signals,” largely focused on consumer activity, to produce two scores: a broad measure of economic health and a separate measure of stability.

At its core, the index will gauge online and offline activity across eight categories, aiming to give a comprehensive look that combines standard economic data points, such as unemployment and retail sales, with high-frequency information for the AI age.

“The algorithm is looking at traditional economic indicators that you would normally look at. But then inside of our proprietary algorithm, we’re ingesting the behavioral data and transaction data of 240 million Americans, which nobody else has,” said David Steinberg, co-founder, chairman and CEO of Zeta Global.

“So instead of looking at the data in the rearview mirror like everybody else, we’re trying to put it out in advance to give a 30-day advanced snapshot of where the economy is going,” he added…(More)”.
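
Zeta has not disclosed its algorithm, but the structure described, eight category-level activity scores rolled into one headline number, can be sketched as a weighted composite. The category names, weights, and scores below are entirely hypothetical placeholders:

```python
# Hypothetical categories and weights; Zeta's actual eight categories,
# weighting, and behavioral inputs are proprietary and not public.
CATEGORY_WEIGHTS = {
    "retail_spend": 0.20, "labor_activity": 0.20, "housing": 0.10,
    "travel": 0.10, "dining": 0.10, "health": 0.10,
    "digital_engagement": 0.10, "personal_finance": 0.10,
}

def composite_index(scores: dict) -> float:
    """Weighted average of category scores (each on a 0-100 scale)."""
    assert set(scores) == set(CATEGORY_WEIGHTS), "need all eight categories"
    return sum(CATEGORY_WEIGHTS[c] * scores[c] for c in scores)

# Example reading: strong consumer activity, weaker housing and travel.
print(composite_index({
    "retail_spend": 72, "labor_activity": 68, "housing": 55,
    "travel": 58, "dining": 70, "health": 66,
    "digital_engagement": 75, "personal_finance": 62,
}))  # one headline number on the same 0-100 scale
```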

How the Rise of the Camera Launched a Fight to Protect Gilded Age Americans’ Privacy


Article by Sohini Desai: “In 1904, a widow named Elizabeth Peck had her portrait taken at a studio in a small Iowa town. The photographer sold the negatives to Duffy’s Pure Malt Whiskey, a company that avoided liquor taxes for years by falsely advertising its product as medicinal. Duffy’s ads claimed the fantastical: that it cured everything from influenza to consumption, that it was endorsed by clergymen, that it could help you live until the age of 106. The portrait of Peck ended up in one of these dubious ads, published in newspapers across the country alongside what appeared to be her unqualified praise: “After years of constant use of your Pure Malt Whiskey, both by myself and as given to patients in my capacity as nurse, I have no hesitation in recommending it.”

Duffy’s lies were numerous. Peck (misleadingly identified as “Mrs. A. Schuman”) was not a nurse, and she had not spent years constantly slinging back malt beverages. In fact, she fully abstained from alcohol. Peck never consented to the ad.

The camera’s first great age—which began in 1888 when George Eastman debuted the Kodak—is full of stories like this one. Beyond the wonders of a quickly developing art form and technology lay widespread lack of control over one’s own image, perverse incentives to make a quick buck, and generalized fear at the prospect of humiliation and the invasion of privacy…(More)”.