Privacy Principles for Mobility Data


About: “The Principles are a set of values and priorities intended to guide the mobility ecosystem in the responsible use of data and the protection of individual privacy. They are intended to serve as a guiding “North Star” to assess technical and policy decisions that have implications for privacy when handling mobility data. The principles are designed to apply to all sectors, including public, private, research and non-profit….”

Increasingly, organizations in the public, private and nonprofit sectors are faced with decisions that have data privacy implications. For organizations utilizing mobility data, these principles provide a baseline framework to both identify and address these situations. Individuals whose data is being collected, utilized and shared must be afforded proper protections and opportunities for agency in how information about them is used and handled. These principles offer guidance for how to engage in this process.

Human movement generates data in many ways: directly through the usage of GPS-enabled mobility services or devices, indirectly through phones or other devices with geolocation and even through cameras and other sensors that observe the public realm. While these principles were written with shared mobility services in mind, many of them will be applicable in other contexts in which data arising out of individual movement is collected and analyzed. We encourage any organization working with this type of data to adapt and apply these principles in their specific context.

While not all mobility data presents a privacy risk to individuals, all stakeholders managing mobility data should treat it as sensitive personal information unless it can be demonstrated that it poses no such risk.

These principles were developed through a collaboration organized by the New Urban Mobility (NUMO) alliance, the North American Bikeshare & Scootershare Association (NABSA) and the Open Mobility Foundation (OMF) in 2020. These groups convened a diverse set of stakeholders representing cities, mobility service providers, technology companies, privacy advocates and academia. Over the course of many months, this group heard from privacy experts, discussed key topics related to data privacy and identified core ideas and common themes to serve as a basis for these Principles….(More)”.

A Climate Equity Agenda Informed by Community Brilliance


Essay by Jalonne L. White-Newsome: “Even with decades of data, state-of-the-art tools and prediction technologies, and clear signals that the impacts of climate change pose a threat to public health, there is still a major disconnect that is allowing extreme weather events to disrupt the health and well-being of low-income communities and people of color across the United States. Centering the health and well-being of these communities within cross-sector partnerships between residents, scientists, government, industry, and philanthropy can drive climate adaptation and resilience…(More)”

Falling in love with the problem, not the solution


Blog by Kyle Novak: “Fall in love with the problem, not your solution.” It’s a maxim that I first heard spoken a few years ago by USAID’s former Chief Innovation Officer Ann Mei Chang. I’ve found myself frequently reflecting on those words as I’ve been thinking about the challenges of implementing public policy. I spent the past year on Capitol Hill in Washington, D.C., working as a legislative fellow, funded through a grant that places scientists in the federal government to improve evidence-based policymaking. I spent much of the year trying to better understand how legislation and oversight work together in the context of policy and politics. To learn what makes good public policy, I wanted to understand how to implement it better. So I took a course in Problem Driven Iterative Adaptation (PDIA), a framework for managing risk in complex policy challenges by embracing experimentation and “learning through doing.”

Congress primarily uses legislation and the budget to control and implement policy initiatives through the federal agencies. Legislation is drafted and introduced by lawmakers with input from constituents, interest groups, and agencies; the congressional budget is explicitly planned out each year based on input from the agencies; and accountability is built into the process through oversight mechanisms. Congress largely provides the planning and lock-in of “plan and control” management, grounded in majority party control and congruence with the policy priorities of the Administration. But it is difficult to successfully implement a plan-and-control approach when political, social, or economic conditions are changing.

Take the problem of data privacy and protection. A person’s identity is becoming largely digital. Every day each of us produces almost a gigabyte of information—our location is shared by our mobile phones, our preferences and interpersonal connections are tagged on social media, our purchases are analyzed, and our actions are recorded on increasingly ubiquitous surveillance cameras. The monetization of this information, which is often bought and sold through data brokers, enables an invasive and oppressive system that affects all aspects of our lives. Algorithms mine our data to make decisions about our employment, healthcare, education, credit, and policing. Machine learning and digital redlining skirt protections that prohibit discrimination on the basis of race, gender, and religion. Targeted and automated disinformation campaigns suppress fundamental rights of speech and expression. And digital technologies magnify existing inequities. While misuse of personal data has the potential to do incredible harm, responsible use of that data has the power to do incredible good. The challenge of data privacy and protection is one that impacts all of us, our civil liberties, and the foundations of a democratic society.

The success of members of Congress is often measured by the solutions they propose, not the problems they identify….(More)”

How to Budget for Equity and Drive Lasting Change


Article by Andrew Kleine and Josh Inaba: “After George Floyd’s tragic death last year sparked calls to “defund the police,” government leaders across the country looked at all of their operations through a new lens of equity. Most importantly, state and local leaders examined ways to invest in equitable services. While it is often said that government budgets are value statements, the past year has revealed that many budgets need to be revisited so that they better reflect the values of the people they serve.

To address misalignments between government spending and community values, leaders should focus on budgeting for equity, which has four fundamental facets: prioritizing equity, using data and evidence, budgeting for outcomes and engaging the community in new ways…

Data and evidence are important components of any effort to address racial equity because they allow governments to pinpoint disparities, establish goals to remedy them and find solutions that work. This means that government leaders should use data to ask not just “How well did we do it?” and “Is anyone better off?” but also “Is everyone better off?”

Asking “Is everyone better off?” is what led Boston officials to take a deep dive into the city’s sidewalk repair data. Analysts found that because repairs were driven by 311 complaints instead of an objective assessment of need, the sidewalks in poorer, minority neighborhoods were in worse shape than those in wealthier parts of the city. Boston now uses a sidewalk condition index and other need-based factors to prioritize its sidewalk capital program.
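
The article does not spell out Boston’s formula, so the following is only a hypothetical sketch of what need-based prioritization of this kind can look like: the segment fields, weights, and equity factor are invented for illustration, not drawn from Boston’s actual program.

```python
# Hypothetical sketch of need-based repair prioritization (NOT Boston's
# actual model): rank segments by a blend of measured condition and
# equity/need factors rather than by 311 complaint volume.
from dataclasses import dataclass

@dataclass
class Segment:
    segment_id: str
    condition_index: float    # 0 (failed) to 100 (new), from a field survey
    pedestrian_volume: float  # normalized 0-1
    equity_score: float       # normalized 0-1 need factor (illustrative)

def priority(seg: Segment) -> float:
    """Higher score = repair sooner. Weights are invented for illustration."""
    deterioration = (100 - seg.condition_index) / 100  # worst condition first
    return 0.5 * deterioration + 0.25 * seg.pedestrian_volume + 0.25 * seg.equity_score

segments = [
    Segment("A-101", condition_index=35, pedestrian_volume=0.9, equity_score=0.8),
    Segment("B-202", condition_index=70, pedestrian_volume=0.4, equity_score=0.2),
]
for seg in sorted(segments, key=priority, reverse=True):
    print(seg.segment_id, round(priority(seg), 3))
```

The point of such a score is simply that a complaint queue measures who calls 311, while a weighted index measures the sidewalk itself alongside who depends on it.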

Similarly, evidence can help governments address more long-standing inequities such as kindergarten readiness. In Maryland, for example, 60% of white students were ready for kindergarten in 2019 compared with 42% of Black students and 26% of Hispanic students, a readiness gap that has widened in recent years. Although Maryland has acted to expand early childhood education, the root cause of the disparity starts before childbirth, when the health and preparedness of mothers can make or break early childhood outcomes.

Evidence-based upstream interventions, such as Nurse-Family Partnership programs, help improve early childhood educational outcomes by supporting low-income, first-time mothers from pregnancy through the child’s second birthday. Initiatives like these can help to address long-standing inequities, and governments can use clearinghouses, such as Results for America’s Economic Mobility Catalog, to identify evidence-based strategies to address a wide variety of these equity-related gaps…(More)”.

A Proposal for Researcher Access to Platform Data: The Platform Transparency and Accountability Act


Paper by Nathaniel Persily: “We should not need to wait for whistleblowers to blow their whistles, however, before we can understand what is actually happening on these extremely powerful digital platforms. Congress needs to act immediately to ensure that a steady stream of rigorous research reaches the public on the most pressing issues concerning digital technology. No one trusts the representations made by the platforms themselves, though, given their conflict of interest and understandable caution in releasing information that might spook shareholders. We need to develop an unprecedented system of corporate data sharing, mandated by government for independent research in the public interest.

This is easier said than done. Not only do the details matter, they are the only thing that matters. It is all well and good to call for “transparency” or “data sharing,” as an uncountable number of academics have, but the way government might set up this unprecedented regime will determine whether it can serve the grandiose purposes tech critics hope it will….(More)”.

Can data die?


Article by Jennifer Ding: “…To me, the crux of the Lenna story is how little power we have over our data and how it is used and abused. This threat seems disproportionately high for women, who are often overrepresented in internet content but underrepresented in internet company leadership and decision making. Given this reality, engineering and product decisions will continue to consciously (and unconsciously) exclude our needs and concerns.

While social norms are turning against non-consensual data collection and data exploitation, digital norms seem to be moving in the opposite direction. Advances in machine learning algorithms and data storage capabilities are only making data misuse easier. Whether the outcome is revenge porn or targeted ads, surveillance or discriminatory AI, if we want a world where our data can retire when it has outlived its time, or when it is directly harming our lives, we must create the tools and policies that empower data subjects to have a say in what happens to their data… including allowing their data to die…(More)”

Fairer Democracy: Designing a Better Citizens’ Assembly


Press release by The Fannie and John Hertz Foundation: “Last winter, 80 residents of Washington State convened virtually to discuss the best ways for their state to tackle climate change. Their final recommendations were shared with state legislators, who are now considering some of the ideas in their policymaking. But the participants of the Washington Climate Assembly were neither climate experts nor politicians. Instead, they were randomly selected citizens from all walks of life, chosen carefully to reflect a range of demographics and views on climate change.

Such citizens’ assemblies are an increasingly popular way, around the world, of engaging average people in their democracies. But ensuring that participants are truly representative of society at large is a daunting analytical challenge. 

That’s where Bailey Flanigan, a Hertz Fellow and a graduate student at Carnegie Mellon University, comes in. Flanigan and colleagues at Carnegie Mellon and Harvard University have developed a new algorithm for selecting the participants in citizens’ assemblies, a process called sortition. The goal of their approach, she says, is to improve the fairness of sortition—and it’s already been published in Nature and used to select participants for dozens of assemblies, including the Washington Climate Assembly….

The researchers have made their algorithm, which they dubbed Panelot, available for public use, and Ariel Procaccia, Flanigan’s collaborator at Harvard, said it’s already been used in selecting more than 40 citizens’ assemblies. 

“It’s testament to the potential impact of work in this area that our algorithm has been enthusiastically adopted by so many organizations,” Flanigan said. “A lot of practitioners were using their own algorithms, and the idea that computer scientists can help centralize efforts to make sortition fairer and more transparent has started some exciting conversations.”…(More)”
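
What makes sortition analytically hard is that a panel must hit demographic quotas while still giving every volunteer a fair chance of selection. The sketch below is a deliberately naive rejection sampler, not the published Panelot algorithm (which, per the Nature paper, additionally works to equalize volunteers’ individual selection probabilities); the pool, features, and quotas are invented for illustration.

```python
# Naive quota-based sortition sketch -- NOT the Panelot algorithm.
# Illustrates only the constraint-satisfaction core of the problem.
import random

# Each volunteer has demographic features; quotas bound each feature value.
pool = [
    {"id": i, "gender": random.choice(["F", "M"]), "urban": random.choice([True, False])}
    for i in range(200)
]
K = 10  # panel size
quotas = {("gender", "F"): (4, 6), ("gender", "M"): (4, 6),
          ("urban", True): (3, 7), ("urban", False): (3, 7)}

def satisfies_quotas(panel):
    for (feature, value), (lo, hi) in quotas.items():
        count = sum(1 for person in panel if person[feature] == value)
        if not lo <= count <= hi:
            return False
    return True

# Rejection sampling: draw uniformly random panels until the quotas hold.
while True:
    panel = random.sample(pool, K)
    if satisfies_quotas(panel):
        break
print(sorted(person["id"] for person in panel))
```

A sampler like this can leave volunteers from scarce demographic groups with far higher (or lower) chances of selection than everyone else; correcting exactly that imbalance is the fairness contribution the researchers describe.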

The Age of A.I. And Our Human Future


Book by Henry A. Kissinger, Eric Schmidt, and Daniel Huttenlocher: “Artificial Intelligence (AI) is transforming human society fundamentally and profoundly. Not since the Enlightenment and the Age of Reason have we changed how we approach knowledge, politics, economics, even warfare.

An A.I. that learned to play chess discovered moves that no human champion would have conceived of. Driverless cars edge forward at red lights, just like impatient humans, and so far, nobody can explain why it happens. Artificial intelligence is being put to use in sports, medicine, education, and even (frighteningly) how we wage war.

In this book, three of our most accomplished and deep thinkers come together to explore how A.I. could affect our relationship with knowledge, impact our worldviews, and change society and politics as profoundly as the ideas of the Enlightenment…(More)”.

Nonprofit Websites Are Riddled With Ad Trackers


Article by Alfred Ng and Maddy Varner: “Last year, nearly 200 million people visited the website of Planned Parenthood, a nonprofit that many people turn to for very private matters like sex education, access to contraceptives, and access to abortions. What those visitors may not have known is that as soon as they opened plannedparenthood.org, some two dozen ad trackers embedded in the site alerted a slew of companies whose business is not reproductive freedom but gathering, selling, and using browsing data.

The Markup ran Planned Parenthood’s website through our Blacklight tool and found 28 ad trackers and 40 third-party cookies tracking visitors, in addition to so-called “session recorders” that could be capturing the mouse movements and keystrokes of people visiting the homepage in search of things like information on contraceptives and abortions. The site also contained trackers that tell Facebook and Google if users visited the site.

The Markup’s scan found Planned Parenthood’s site communicating with companies like Oracle, Verizon, LiveRamp, TowerData, and Quantcast—some of which have made a business of assembling and selling access to masses of digital data about people’s habits.

Katie Skibinski, vice president for digital products at Planned Parenthood, said the data collected on its website is “used only for internal purposes by Planned Parenthood and our affiliates,” and the organization doesn’t “sell” data to third parties.

“While we aim to use data to learn how we can be most impactful, at Planned Parenthood, data-driven learning is always thoughtfully executed with respect for patient and user privacy,” Skibinski said. “This means using analytics platforms to collect aggregate data to gather insights and identify trends that help us improve our digital programs.”

Skibinski did not dispute that the organization shares data with third parties, including data brokers.

A Blacklight scan of Planned Parenthood Gulf Coast—a localized website specifically for people in the Gulf region, including Texas, where abortion has been essentially outlawed—turned up similar results.

Planned Parenthood is not alone: other nonprofits, some operating in sensitive areas like mental health and addiction, also gather and share data on website visitors.

Using our Blacklight tool, The Markup scanned more than 23,000 websites of nonprofit organizations, including those belonging to abortion providers and nonprofit addiction treatment centers. The Markup used the IRS’s nonprofit master file to identify nonprofits that have filed a tax return since 2019 and that the agency categorizes as focusing on areas like mental health and crisis intervention, civil rights, and medical research. We then examined each nonprofit’s website as publicly listed in GuideStar. We found that about 86 percent of them had third-party cookies or tracking network requests. By comparison, when The Markup did a survey of the top 80,000 websites in 2020, we found 87 percent used some type of third-party tracking.
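
Blacklight itself drives a real browser and observes the live network requests a page makes; the sketch below, using only Python’s standard library, shows the simplest static version of that idea by listing the third-party hosts a page’s script tags load from. It will undercount trackers injected dynamically at runtime, and the URL is a placeholder.

```python
# Minimal static sketch of a third-party-script check. A real scanner
# like Blacklight instruments a browser and records actual network
# traffic; parsing HTML alone misses dynamically injected trackers.
from urllib.parse import urlparse
from urllib.request import urlopen
from html.parser import HTMLParser

class ScriptSrcParser(HTMLParser):
    """Collects the src attribute of every <script> tag."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def third_party_script_hosts(url: str) -> set:
    first_party = urlparse(url).hostname
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = ScriptSrcParser()
    parser.feed(html)
    hosts = {urlparse(s).hostname for s in parser.srcs if urlparse(s).hostname}
    # Crude suffix check: treat subdomains of the site as first-party.
    return {h for h in hosts if not h.endswith(first_party)}

print(third_party_script_hosts("https://example.org"))  # placeholder URL
```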

About 11 percent of the 23,856 nonprofit websites we scanned had a Facebook pixel embedded, while 18 percent used the Google Analytics “Remarketing Audiences” feature.

The Markup found that 439 of the nonprofit websites loaded scripts called session recorders, which can monitor visitors’ clicks and keystrokes. Eighty-nine of those were for websites that belonged to nonprofits that the IRS categorizes as primarily focusing on mental health and crisis intervention issues…(More)”.

What Do Teachers Know About Student Privacy? Not Enough, Researchers Say


Nadia Tamez-Robledo at EdTech: “What should teachers be expected to know about student data privacy and ethics?

Considering that so much of their jobs now revolves around student data, it’s a simple enough question—and one that researcher Ellen B. Mandinach and a colleague were tasked with answering. More specifically, they wanted to know what state guidelines had to say on the matter. Was that information included in codes of education ethics? Or perhaps in curriculum requirements for teacher training programs?

“The answer is, ‘Not really,’” says Mandinach, a senior research scientist at the nonprofit WestEd. “Very few state standards have anything about protecting privacy, or even much about data,” she says, aside from policies touching on FERPA or disposing of data properly.

While it seems to Mandinach that institutions have historically played hot potato over who is responsible for teaching educators about data privacy, the pandemic and its supercharged push to digital learning have brought new awareness to the issue.

The application of data ethics has real consequences for students, says Mandinach, like an Atlanta sixth grader who was accused of “Zoombombing” based on his computer’s IP address or the Dartmouth students who were cleared of cheating accusations.

“There are many examples coming up as we’re in this uncharted territory, particularly as we’re virtual,” Mandinach says. “Our goal is to provide resources and awareness building to the education community and professional organization…so [these tools] can be broadly used to help better prepare educators, both current and future.”

This week, Mandinach and her partners at the Future of Privacy Forum released two training resources for K-12 teachers: the Student Privacy Primer and a guide to working through data ethics scenarios. The curriculum is based on their report examining how much data privacy and ethics preparation teachers receive while in college….(More)”.