Can data die?


Article by Jennifer Ding: “…To me, the crux of the Lenna story is how little power we have over our data and how it is used and abused. This threat seems disproportionately higher for women who are often overrepresented in internet content, but underrepresented in internet company leadership and decision making. Given this reality, engineering and product decisions will continue to consciously (and unconsciously) exclude our needs and concerns.

While social norms around non-consensual data collection and data exploitation are changing, digital norms seem to be moving in the opposite direction. Advancements in machine learning algorithms and data storage capabilities are only making data misuse easier. Whether the outcome is revenge porn or targeted ads, surveillance or discriminatory AI, if we want a world where our data can retire when it’s outlived its time, or when it’s directly harming our lives, we must create the tools and policies that empower data subjects to have a say in what happens to their data… including allowing their data to die…(More)”

Fairer Democracy: Designing a Better Citizens’ Assembly


Press release by The Fannie and John Hertz Foundation: “Last winter, 80 residents of Washington State convened virtually to discuss the best ways for their state to tackle climate change. Their final recommendations were shared with state legislators, who are now considering some of the ideas in their policymaking. But the participants of the Washington Climate Assembly were neither climate experts nor politicians. Instead, they were randomly selected citizens from all walks of life, chosen carefully to reflect a range of demographics and views on climate change.

Such citizens’ assemblies are an increasingly popular way, around the world, of engaging average people in their democracies. But ensuring that participants are truly representative of society at large is a daunting analytical challenge. 

That’s where Bailey Flanigan, a Hertz Fellow and a graduate student at Carnegie Mellon University, comes in. Flanigan and colleagues at Carnegie Mellon and Harvard University have developed a new algorithm for selecting the participants in citizens’ assemblies, a process called sortition. The goal of their approach, she says, is to improve the fairness of sortition—and it’s already been published in Nature and used to select participants for dozens of assemblies, including the Washington Climate Assembly….

The researchers have made their algorithm, which they dubbed Panelot, available for public use, and Procaccia said it’s already been used in selecting more than 40 citizens’ assemblies. 

“It’s testament to the potential impact of work in this area that our algorithm has been enthusiastically adopted by so many organizations,” Flanigan said. “A lot of practitioners were using their own algorithms, and the idea that computer scientists can help centralize efforts to make sortition fairer and more transparent has started some exciting conversations.”…(More)”
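The quota-balancing at the core of sortition can be illustrated with a deliberately simplified sketch. This is not the published Panelot algorithm (which optimizes fairness across volunteers); it is a toy rejection-sampling approach over a hypothetical pool, with invented features and quota bounds:

```python
import random

random.seed(7)  # reproducible toy run

# Hypothetical volunteer pool: (id, {feature: value}) pairs.
pool = [
    (i, {"gender": random.choice(["f", "m"]),
         "age": random.choice(["18-34", "35-64", "65+"])})
    for i in range(1000)
]

# Quota bounds (min, max) the panel must satisfy — illustrative numbers only.
quotas = {("gender", "f"): (4, 6), ("gender", "m"): (4, 6),
          ("age", "18-34"): (2, 4), ("age", "35-64"): (3, 5), ("age", "65+"): (2, 4)}
PANEL_SIZE = 10

def satisfies_quotas(panel):
    for (feature, value), (lo, hi) in quotas.items():
        count = sum(1 for _, feats in panel if feats[feature] == value)
        if not lo <= count <= hi:
            return False
    return True

def sample_panel(pool, trials=100_000):
    # Rejection sampling: draw uniformly random panels until one meets every quota.
    # (Panelot instead optimizes selection probabilities so individual volunteers
    # are treated as fairly as possible — a much harder problem than this sketch.)
    for _ in range(trials):
        panel = random.sample(pool, PANEL_SIZE)
        if satisfies_quotas(panel):
            return panel
    raise RuntimeError("no quota-satisfying panel found")

panel = sample_panel(pool)
```

Uniform rejection sampling shows why fairness is subtle: volunteers from scarce demographic groups end up with very different selection odds, which is exactly the distortion the researchers' algorithm is designed to correct.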

The Age of A.I. And Our Human Future


Book by Henry A. Kissinger, Eric Schmidt, and Daniel Huttenlocher: “Artificial Intelligence (AI) is transforming human society fundamentally and profoundly. Not since the Enlightenment and the Age of Reason has there been such a shift in how we approach knowledge, politics, economics, even warfare. Three of our most accomplished and deep thinkers come together to explore what it means for us all.

An A.I. that learned to play chess discovered moves that no human champion would have conceived of. Driverless cars edge forward at red lights, just like impatient humans, and so far, nobody can explain why it happens. Artificial intelligence is being put to use in sports, medicine, education, and even (frighteningly) how we wage war.

In this book, three of our most accomplished and deep thinkers come together to explore how A.I. could affect our relationship with knowledge, impact our worldviews, and change society and politics as profoundly as the ideas of the Enlightenment…(More)”.

Nonprofit Websites Are Riddled With Ad Trackers


Article by Alfred Ng and Maddy Varner: “Last year, nearly 200 million people visited the website of Planned Parenthood, a nonprofit that many people turn to for very private matters like sex education, access to contraceptives, and access to abortions. What those visitors may not have known is that as soon as they opened plannedparenthood.org, some two dozen ad trackers embedded in the site alerted a slew of companies whose business is not reproductive freedom but gathering, selling, and using browsing data.

The Markup ran Planned Parenthood’s website through our Blacklight tool and found 28 ad trackers and 40 third-party cookies tracking visitors, in addition to so-called “session recorders” that could be capturing the mouse movements and keystrokes of people visiting the homepage in search of things like information on contraceptives and abortions. The site also contained trackers that tell Facebook and Google if users visited the site.

The Markup’s scan found Planned Parenthood’s site communicating with companies like Oracle, Verizon, LiveRamp, TowerData, and Quantcast—some of which have made a business of assembling and selling access to masses of digital data about people’s habits.

Katie Skibinski, vice president for digital products at Planned Parenthood, said the data collected on its website is “used only for internal purposes by Planned Parenthood and our affiliates,” and the company doesn’t “sell” data to third parties.

“While we aim to use data to learn how we can be most impactful, at Planned Parenthood, data-driven learning is always thoughtfully executed with respect for patient and user privacy,” Skibinski said. “This means using analytics platforms to collect aggregate data to gather insights and identify trends that help us improve our digital programs.”

Skibinski did not dispute that the organization shares data with third parties, including data brokers.

A Blacklight scan of Planned Parenthood Gulf Coast—a localized website specifically for people in the Gulf region, including Texas, where abortion has been essentially outlawed—churned up similar results.

Planned Parenthood is not alone: other nonprofits, some operating in sensitive areas like mental health and addiction, also gather and share data on website visitors.

Using our Blacklight tool, The Markup scanned more than 23,000 websites of nonprofit organizations, including those belonging to abortion providers and nonprofit addiction treatment centers. The Markup used the IRS’s nonprofit master file to identify nonprofits that have filed a tax return since 2019 and that the agency categorizes as focusing on areas like mental health and crisis intervention, civil rights, and medical research. We then examined each nonprofit’s website as publicly listed in GuideStar. We found that about 86 percent of them had third-party cookies or tracking network requests. By comparison, when The Markup did a survey of the top 80,000 websites in 2020, we found 87 percent used some type of third-party tracking.

About 11 percent of the 23,856 nonprofit websites we scanned had a Facebook pixel embedded, while 18 percent used the Google Analytics “Remarketing Audiences” feature.

The Markup found that 439 of the nonprofit websites loaded scripts called session recorders, which can monitor visitors’ clicks and keystrokes. Eighty-nine of those were for websites that belonged to nonprofits that the IRS categorizes as primarily focusing on mental health and crisis intervention issues…(More)”.
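A scan of the kind Blacklight performs ultimately asks, for every network request a page triggers, whether it goes to the site itself or to a third party. A minimal sketch with an invented first-party domain and a toy request log (real scanners drive a headless browser, record live traffic, and match hosts against curated tracker lists):

```python
from urllib.parse import urlparse

# Toy network log for a hypothetical page load (domain and URLs are illustrative).
FIRST_PARTY = "example-nonprofit.org"
requests = [
    "https://example-nonprofit.org/index.html",
    "https://example-nonprofit.org/static/app.js",
    "https://www.facebook.com/tr?id=123",        # Facebook-pixel-style beacon
    "https://www.google-analytics.com/collect",  # analytics beacon
    "https://cdn.quantserve.com/quant.js",       # Quantcast-style tag
]

def registrable(host):
    # Crude heuristic: keep the last two labels ("www.facebook.com" -> "facebook.com").
    # Real tools use the Public Suffix List to get this right for all TLDs.
    return ".".join(host.split(".")[-2:])

def third_party_hosts(requests, first_party):
    base = registrable(first_party)
    return sorted({registrable(urlparse(u).hostname)
                   for u in requests
                   if registrable(urlparse(u).hostname) != base})

print(third_party_hosts(requests, FIRST_PARTY))
# → ['facebook.com', 'google-analytics.com', 'quantserve.com']
```

A scanner would then check each flagged host against a known-tracker list to distinguish, say, a benign CDN from an ad-tech beacon.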

What Do Teachers Know About Student Privacy? Not Enough, Researchers Say


Nadia Tamez-Robledo at EdTech: “What should teachers be expected to know about student data privacy and ethics?

Considering that so much of their jobs now revolves around student data, it’s a simple enough question—and one that researcher Ellen B. Mandinach and a colleague were tasked with answering. More specifically, they wanted to know what state guidelines had to say on the matter. Was that information included in codes of education ethics? Or perhaps in curriculum requirements for teacher training programs?

“The answer is, ‘Not really,’” says Mandinach, a senior research scientist at the nonprofit WestEd. “Very few state standards have anything about protecting privacy, or even much about data,” she says, aside from policies touching on FERPA or disposing of data properly.

While it seems to Mandinach that institutions have historically played hot potato over who is responsible for teaching educators about data privacy, the pandemic and its supercharged push to digital learning have brought new awareness to the issue.

The application of data ethics has real consequences for students, says Mandinach, like an Atlanta sixth grader who was accused of “Zoombombing” based on his computer’s IP address or the Dartmouth students who were exonerated of cheating accusations.

“There are many examples coming up as we’re in this uncharted territory, particularly as we’re virtual,” Mandinach says. “Our goal is to provide resources and awareness building to the education community and professional organization…so [these tools] can be broadly used to help better prepare educators, both current and future.”

This week, Mandinach and her partners at the Future of Privacy Forum released two training resources for K-12 teachers: the Student Privacy Primer and a guide to working through data ethics scenarios. The curriculum is based on their report examining how much data privacy and ethics preparation teachers receive while in college….(More)”.

Towards Better Governance of Urban Data: Concrete Examples of Success


Blogpost by Naysan Saran: “Since the Sumerians of the fourth millennium BCE, governments have kept records. These records have, of course, evolved from a few hundred cuneiform symbols engraved on clay tablets to terabytes of data hosted on cloud servers. However, their primary goal remains the same: to improve land management.

That being said, the six thousand years of civilization separating us from the Sumerians have seen the birth of democracy, and with that birth, a second goal has been grafted onto the first: cities must now earn the trust of their citizens with respect to how they manage those citizens’ data. This goal cannot be achieved without good data governance, which defines strategies for the efficient and transparent use and distribution of information.

To learn more about the state of the art in municipal data management, both internally and externally, we went to meet with two experts who agreed to share their experiences and best practices: François Robitaille, business architect for the city of Laval; and Adrienne Schmoeker, former Deputy Chief Analytics Officer for the City of New York, and Director at the New York City Mayor’s Office of Data Analytics where she managed the Open Data Program for three years….(More)”

What Universities Owe Democracy


Book by Ronald J. Daniels with Grant Shreve and Phillip Spector: “Universities play an indispensable role within modern democracies. But this role is often overlooked or too narrowly conceived, even by universities themselves. In What Universities Owe Democracy, Ronald J. Daniels, the president of Johns Hopkins University, argues that—at a moment when liberal democracy is endangered and more countries are heading toward autocracy than at any time in generations—it is critical for today’s colleges and universities to reestablish their place in democracy.

Drawing upon fields as varied as political science, economics, history, and sociology, Daniels identifies four distinct functions of American higher education that are key to liberal democracy: social mobility, citizenship education, the stewardship of facts, and the cultivation of pluralistic, diverse communities. By examining these roles over time, Daniels explains where colleges and universities have faltered in their execution of these functions—and what they can do going forward.

Looking back on his decades of experience leading universities, Daniels offers bold prescriptions for how universities can act now to strengthen democracy. For those committed to democracy’s future prospects, this book is a vital resource…(More)”.

Democratizing and technocratizing the notice-and-comment process


Essay by Reeve T. Bull: “…When enacting the Administrative Procedure Act, Congress was not entirely clear on the extent to which it intended the agency to take into account public opinion as reflected in comments or merely to sift the comments for relevant information. This tension has simmered for years, but it never posed a major problem since the vast majority of rules garnered virtually no public interest.

Even now, most rules still generate a very anemic response. Internet submission has vastly simplified the process of filing a comment, however, and a handful of rules generate “mass comment” responses of hundreds of thousands or even millions of submissions. In these cases, as the net neutrality incident showed, individual commenters and even private firms have begun to manipulate the process by using computer algorithms to generate comments and, in some instances, affix false identities. As a result, agencies can no longer ignore the problem.

Nevertheless, technological progress is not necessarily a net negative for agencies. It also presents extraordinary opportunities to refine the notice-and-comment process and generate more valuable feedback. Moreover, if properly channeled, technological improvements can actually provide the remedies to many of the new problems that agencies have encountered. And other, non-technological reforms can address most, if not all of, the other newly emerging challenges. Indeed, if agencies are open-minded and astute, they can both “democratize” the public participation process, creating new and better tools for ascertaining public opinion (to the extent it is relevant in any given rule), and “technocratize” it at the same time, expanding and perfecting avenues for obtaining expert feedback….

As with many aspects of modern life, technological change that once was greeted with naive enthusiasm has now created enormous challenges. As a recent study for the Administrative Conference of the United States (for which I served as a co-consultant) has found, agencies can deploy technological tools to address at least some of these problems. For instance, so-called “deduplication software” can identify and group comments that come from different sources but that contain large blocks of identical text and therefore were likely copied from a common source. Bundling these comments can greatly reduce processing time. Agencies can also undertake various steps to combat unwanted computer-generated or falsely attributed comments, including quarantining such comments and issuing commenting policies discouraging their submission. A recently adopted set of ACUS recommendations partly based on the report offers helpful guidance to agencies on this front.
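The “deduplication software” described above can be approximated with text shingling: comments that share many identical multi-word sequences likely came from a common template and can be bundled. A minimal sketch with invented comments (production systems tune the shingle size and use scalable near-duplicate techniques such as MinHash rather than all-pairs comparison):

```python
import hashlib
import re

# Toy docket: c1 and c2 copy a large block from a hypothetical campaign template.
comments = {
    "c1": "I oppose this rule. Net neutrality protects consumers and innovation alike. Please withdraw it.",
    "c2": "As a small business owner, I oppose this rule. Net neutrality protects consumers and innovation alike.",
    "c3": "Speaking only for myself, this proposal seems reasonable and well scoped.",
}

SHINGLE = 8  # words per shingle; real systems tune this

def shingles(text):
    # Hash every run of SHINGLE consecutive words into a set of fingerprints.
    words = re.findall(r"[a-z']+", text.lower())
    return {hashlib.md5(" ".join(words[i:i + SHINGLE]).encode()).hexdigest()
            for i in range(max(1, len(words) - SHINGLE + 1))}

def near_duplicates(comments, threshold=0.3):
    # Pairs whose fingerprint sets overlap (Jaccard similarity) above the
    # threshold likely copied text from a common source.
    sigs = {cid: shingles(text) for cid, text in comments.items()}
    ids = sorted(sigs)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            inter = sigs[a] & sigs[b]
            union = sigs[a] | sigs[b]
            if union and len(inter) / len(union) >= threshold:
                pairs.append((a, b))
    return pairs

print(near_duplicates(comments))
# → [('c1', 'c2')]
```

Bundling c1 and c2 for a single substantive review while leaving c3 separate is exactly the processing-time saving the paragraph describes.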

Unfortunately, as technology evolves, new challenges will emerge. As noted in the ACUS report, agencies are relatively unconcerned with duplicate comments since they possess the technological tools to process them. Yet artificial intelligence has evolved to the point that computer algorithms can produce comments that are both indistinguishable from human comments and at least facially appear to contain unique and relevant information. In one recent study, an algorithm generated and submitted…(More)”

Facial Recognition Technology: Responsible Use Principles and the Legislative Landscape


Report by James Lewis: “…Criticism of FRT is too often based on a misunderstanding about the technology. A good starting point to change this is to clarify the distinction between FRT and facial characterization. FRT compares two images and asks how likely it is that one image is the same as the other. The best FRT is more accurate than humans at matching images. In contrast, “facial analysis” or “facial characterization” examines an image and then tries to characterize it by gender, age, or race. Much of the critique of FRT is actually about facial characterization. Claims about FRT inaccuracy are either out of date or actually describe facial characterization. Of course, accuracy depends on how FRT is used. When picture quality is poor, accuracy is lower but often still better than the average human. A 2021 report by the National Institute of Standards and Technology (NIST) found that accuracy had improved dramatically and that more accurate systems were less likely to make errors based on race or gender. This confusion hampers the development of effective rules.
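The 1:1 comparison the report describes typically works by reducing each face image to an embedding vector and measuring how close two vectors are. A minimal sketch, with synthetic vectors standing in for the output of a trained face-embedding model and an illustrative threshold (real systems calibrate thresholds against measured error rates):

```python
import numpy as np

# Hypothetical 128-dimensional face embeddings. A real pipeline would obtain
# these from a trained network; here they are synthetic stand-ins.
rng = np.random.default_rng(0)
anchor = rng.normal(size=128)                            # "enrolled" face
same_person = anchor + rng.normal(scale=0.1, size=128)   # same face, new photo
different_person = rng.normal(size=128)                  # unrelated face

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_face(a, b, threshold=0.8):
    # 1:1 verification: "how likely is it that these two images match?"
    return cosine_similarity(a, b) >= threshold

print(same_face(anchor, same_person))        # True
print(same_face(anchor, different_person))   # False
```

Facial characterization, by contrast, would feed a single embedding into a classifier that predicts attributes like age or gender — a different task with a different error profile, which is why conflating the two muddles the policy debate.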

Some want to ban FRT, but it will continue to be developed and deployed because of the convenience for consumers and the benefits to public safety. Continued progress in sensors and artificial intelligence (AI) will increase availability and performance of the technologies used for facial recognition. Stopping the development of FRT would require stopping the development of AI, and that is neither possible nor in the national interest. This report provides a list of guardrails to guide the development of law and regulation for civilian use….(More)”.

Bring American cities into the 21st century by funding urban innovation


Article by Dan Doctoroff and Richard Florida: “The U.S. is on the verge of the fourth revolution in urban technology. Where railroads, the electric grid, and the automobile defined previous eras, today, new strategies that integrate new technologies in our cities can unlock striking possibilities.

Our buildings can be dramatically more sustainable, adaptable, and affordable. Energy systems and physical infrastructure can fulfill the promise of “climate-positive” development. Secure digital infrastructure can connect people and improve services while safeguarding privacy. We can deploy mobility solutions that regulate the flow of people and vehicles in real time, ease traffic, and cut carbon emissions. Innovative social infrastructure can enable new service models to build truly inclusive communities. 

Congress and the administration are currently negotiating a reconciliation package that is intended to put the U.S. on a path to a sustainable and equitable future. However, this mission will not succeed without meaningful investments in technical solutions that recognize the frontline role of cities and urban counties in so many national priorities.

U.S. cities are still built, connected, powered, heated, and run much as they have been for the past 75 years. Cities continue to generally rely on “dumb” infrastructure, such as the classic traffic light, which can direct traffic and do little else. 

When Detroit deployed the first red-yellow-green automatic traffic light in the 1920s, it pioneered state-of-the-art traffic management. Soon, there was a traffic light at every major intersection in America, and it has remained an icon of urban technology ever since. Relying on 100-year-old technology isn’t all that unusual in our cities. If you look closely at any American city, you will see it’s rather the rule. While our policy needs and technical capabilities have changed dramatically, the urban systems U.S. cities rely on have remained essentially frozen in time since the Second World War.  

We must leverage today’s technology and use artificial intelligence, machine learning, data analytics, connected infrastructure, cloud computing, and automation to run our cities. That is why we have come together to help forge a new initiative, the Coalition for Urban Innovation, to reimagine urban infrastructure for the future. Consisting of leading urban thinkers, businesses, and nonprofits, the coalition is calling on Congress and the administration to seize this generational opportunity to finally unlock the potential of cities as powerful levers for tackling climate change, promoting inclusion, and otherwise addressing our thorniest challenges…(More)”.