How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned


Article by Nicole Nguyen and Cordilia James: “You might not talk to your friends about your monthly cycle, but there’s a good chance you talk to an app about it. And why not? Period-tracking apps are more convenient than using a diary, and the insights are more interesting, too. 

But how much do you know about the ways apps and trackers collect, store—and sometimes share—your fertility and menstrual-cycle data?

The question has taken on new importance following the leak of a draft Supreme Court opinion that would overturn Roe v. Wade. Roe established a constitutional right to abortion, and should the court reverse its 1973 decision, about half the states in the U.S. are likely to restrict or outright ban the procedure.

Phone and app data have long been shared and sold without prominent disclosure, often for advertising purposes. HIPAA, aka the Health Insurance Portability and Accountability Act, might protect information shared between you and your healthcare provider, but it doesn’t typically apply to data you put into an app, even a health-related one. Flo Health Inc., maker of a popular period and ovulation tracker, settled with the Federal Trade Commission in 2021 for sharing sensitive health data with Facebook without making the practice clear to users.

The company completed an independent privacy audit earlier this year. “We remain committed to ensuring the utmost privacy for our users and want to make it clear that Flo does not share health data with any company,” a spokeswoman said.

In a scenario where Roe is overturned, your digital breadcrumbs—including the kind that come from period trackers—could be used against you in states where laws criminalize aiding in or undergoing abortion, say legal experts.

“The importance of menstrual data is not merely speculative. It has been relevant to the government before, in investigations and restrictions,” said Leah Fowler, research director at the University of Houston’s Health Law and Policy Institute. She cited a 2019 hearing where Missouri’s state health department admitted to keeping a spreadsheet of Planned Parenthood abortion patients, which included the dates of their last menstrual period.

Prosecutors have also obtained other types of digital information, including text messages and search histories, as evidence for abortion-related cases…(More)”.

China’s Expanding Surveillance State


Article by Isabelle Qian, Muyi Xiao, Paul Mozur and Alexander Cardia in The New York Times: “China’s ambition to collect a staggering amount of personal data from everyday citizens is more expansive than previously known, a Times investigation has found. Phone-tracking devices are now everywhere. The police are creating some of the largest DNA databases in the world. And the authorities are building upon facial recognition technology to collect voice prints from the general public.

The Times’s Visual Investigations team and reporters in Asia spent over a year analyzing more than a hundred thousand government bidding documents. They call for companies to bid on the contracts to provide surveillance technology, and include product requirements and budget size, and sometimes describe at length the strategic thinking behind the purchases. Chinese laws stipulate that agencies must keep records of bids and make them public, but in reality the documents are scattered across hard-to-search web pages that are often taken down quickly without notice. ChinaFile, a digital magazine published by the Asia Society, collected the bids and shared them exclusively with The Times.

This unprecedented access allowed The Times to study China’s surveillance capabilities. The Chinese government’s goal is clear: designing a system to maximize what the state can find out about a person’s identity, activities and social connections, which could ultimately help the government maintain its authoritarian rule.

Here are the investigation’s major revelations.

Analysts estimate that more than half of the world’s nearly one billion surveillance cameras are in China, but it had been difficult to gauge how they were being used, what they captured and how much data they generated. The Times analysis found that the police strategically chose locations to maximize the amount of data their facial recognition cameras could collect…

The Chinese authorities are realistic about their technological limitations. According to one bidding document, the Ministry of Public Security, China’s top police agency, believed the country’s video surveillance systems still lacked analytical capabilities. One of the biggest problems they identified was that the data had not been centralized….(More)”.

How the Federal Government Buys Our Cell Phone Location Data


Article by Bennett Cyphers: “…Weather apps, navigation apps, coupon apps, and “family safety” apps often request location access in order to enable key features. But once an app has location access, it typically has free rein to share that access with just about anyone.

That’s where the location data broker industry comes in. Data brokers entice app developers with cash-for-data deals, often paying per user for direct access to their device. Developers can add bits of code called “software development kits,” or SDKs, from location brokers into their apps. Once installed, a broker’s SDK is able to gather data whenever the app itself has access to it: sometimes, that means access to location data whenever the app is open. In other cases, it means “background” access to data whenever the phone is on, even if the app is closed.
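
As a rough illustration of the data flow described above, here is a hypothetical sketch in Python. Every name in it (the BrokerSDK class, the on_location_fix callback, the endpoint URL) is invented for illustration; it does not reproduce any real vendor's SDK.

```python
# Hypothetical sketch: a bundled third-party SDK piggybacking on the host
# app's own location permission. All names here are invented for illustration.
import time
from dataclasses import dataclass


@dataclass
class LocationFix:
    lat: float
    lon: float
    timestamp: float
    ad_id: str  # device advertising identifier, if the app can read it


class BrokerSDK:
    """Stand-in for a location-monetization SDK embedded by the developer."""

    def __init__(self, upload_endpoint: str):
        self.upload_endpoint = upload_endpoint
        self.buffer: list[LocationFix] = []

    def on_location_fix(self, fix: LocationFix) -> None:
        # Invoked every time the host app receives a location update.
        self.buffer.append(fix)
        if len(self.buffer) >= 50:
            self.flush()

    def flush(self) -> None:
        # A real SDK would POST the batch to the broker's servers here.
        print(f"uploading {len(self.buffer)} fixes to {self.upload_endpoint}")
        self.buffer.clear()


class WeatherApp:
    """The 'legitimate' app: it asked for location access to show local weather."""

    def __init__(self):
        self.sdk = BrokerSDK("https://collector.example-broker.invalid/v1/locations")

    def handle_location_update(self, lat: float, lon: float) -> None:
        self.show_forecast(lat, lon)  # the feature the user asked for
        # ...and the side channel they did not: the same fix goes to the broker.
        self.sdk.on_location_fix(LocationFix(lat, lon, time.time(), ad_id="0000-DUMMY"))

    def show_forecast(self, lat: float, lon: float) -> None:
        print(f"forecast for ({lat:.3f}, {lon:.3f})")
```

In the "background" access case the article mentions, the same callback would simply be wired to the operating system's location updates rather than to the app's foreground activity.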

One app developer received the following marketing email from data broker SafeGraph:

SafeGraph can monetize between $1-$4 per user per year on exhaust data (across location, matches, segments, and other strategies) for US mobile users who have strong data records. We already partner with several GPS apps with great success, so I would definitely like to explore if a data partnership indeed makes sense.

But brokers are not limited to data from apps they partner with directly. The ad tech ecosystem provides ample opportunities for interested parties to skim from the torrents of personal information that are broadcast during advertising auctions. In a nutshell, advertising monetization companies (like Google) partner with apps to serve ads. As part of the process, they collect data about users—including location, if available—and share that data with hundreds of different companies representing digital advertisers. Each of these companies uses that data to decide what ad space to bid on, which is a nasty enough practice on its own. But since these “bidstream” data flows are largely unregulated, the companies are also free to collect the data as it rushes past and store it for later use. 
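
To make the "bidstream" concrete: each auction broadcasts a request describing the user, device and location to every participating bidder. The payload below is a simplified, hypothetical example loosely patterned on OpenRTB-style bid requests (the field names are illustrative, not an exact schema), together with a bidder that quietly retains the location it was only asked to price.

```python
# Simplified, hypothetical bid-request payload, loosely patterned on
# OpenRTB-style requests. Field names and values are illustrative only.
bid_request = {
    "id": "auction-7f3a",
    "app": {"bundle": "com.example.weather", "name": "Example Weather"},
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-000000000000",  # advertising ID (dummy)
        "geo": {"lat": 38.8977, "lon": -77.0365},        # GPS-derived location
        "os": "Android",
    },
}

retained_locations: list[dict] = []  # the bidder's side database


def handle_bid_request(request: dict) -> float:
    """A bidder is only supposed to price the ad slot, but nothing in the
    auction stops it from logging the request for later use or resale."""
    device = request["device"]
    retained_locations.append(
        {"ifa": device["ifa"], "lat": device["geo"]["lat"], "lon": device["geo"]["lon"]}
    )
    return 0.25  # bid in dollars; for a data broker, the payload is the real prize


handle_bid_request(bid_request)
print(f"retained {len(retained_locations)} location record(s)")
```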

The data brokers covered in this post add another layer of misdirection to the mix. Some of them may gather data from apps or advertising exchanges directly, but others acquire data exclusively from other data brokers. For example, Babel Street reportedly purchases all of its data from Venntel. Venntel, in turn, acquires much of its data from its parent company, the marketing-oriented data broker Gravy Analytics. And Gravy Analytics has purchased access to data from the brokers Complementics, Predicio, and Mobilewalla. We have little information about where those companies get their data—but some of it may be coming from any of the dozens of other companies in the business of buying and selling location data.

If you’re looking for an answer to “which apps are sharing data?”, the answer is: “It’s almost impossible to know.” Reporting, technical analysis, and right-to-know requests through laws like GDPR have revealed relationships between a handful of apps and location data brokers. For example, we know that the apps Muslim Pro and Muslim Mingle sold data to X-Mode, and that navigation app developer Sygic sent data to Predicio (which sold it to Gravy Analytics and Venntel). However, this is just the tip of the iceberg. Each of the location brokers discussed in this post obtains data from hundreds or thousands of different sources. Venntel alone has claimed to gather data from “over 80,000” different apps. Because much of its data comes from other brokers, most of these apps likely have no direct relationship with Venntel. As a result, the developers of the apps fueling this industry likely have no idea where their users’ data ends up. Users, in turn, have little hope of understanding whether and how their data arrives in these data brokers’ hands…(More)”.

I tried to read all my app privacy policies. It was 1 million words.


Article by Geoffrey A. Fowler: “…So here’s an idea: Let’s abolish the notion that we’re supposed to read privacy policies.

I’m not suggesting companies shouldn’t have to explain what they’re up to. Maybe we call them “data disclosures” for the regulators, lawyers, investigative journalists and curious consumers to pore over.

But to protect our privacy, the best place to start is for companies to simply collect less data. “Maybe don’t do things that need a million words of explanation? Do it differently,” said Slaughter. “You can’t abuse, misuse, leverage data that you haven’t collected in the first place.”

Apps and services should only collect the information they really need to provide that service — unless we opt in to let them collect more, and it’s truly an option.

I’m not holding my breath that companies will do that voluntarily, but a federal privacy law would help. While we wait for one, Slaughter said the FTC (where Democratic commissioners recently gained a majority) is thinking about how to use its existing authority “to pursue practices — including data collection, use and misuse — that are unfair to users.”

Second, we need to replace the theater of pressing “agree” with real choices about our privacy.

Today, when we do have choices to make, companies often present them in ways that pressure us into making the worst decisions for ourselves.

Apps and websites should give us the relevant information and our choices in the moment when it matters. Twitter actually does this just-in-time notice better than many other apps and websites: By default, it doesn’t collect your exact location, and only prompts you to do so when you ask to tag your location in a tweet.

Even better, technology could help us manage our choices. Cranor suggests that data disclosures could be coded to be read by machines. Companies already do this for financial information, and the TLDR Act would require consistent tags on privacy information, too. Then your computer could act kind of like a butler, interacting with apps and websites on your behalf.
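
What might that look like in practice? The sketch below is purely illustrative: the disclosure tags and preference format are invented here, since neither the TLDR Act nor existing labeling schemes prescribe this exact schema.

```python
# Hypothetical machine-readable privacy disclosure plus a "butler" that checks
# it against a user's stated preferences. All tag names are invented.
SITE_DISCLOSURE = {
    "collects": ["email", "precise_location", "contacts"],
    "shares_with_third_parties": True,
    "retention_days": 730,
}

USER_PREFERENCES = {
    "never_share": ["precise_location", "contacts"],
    "allow_third_party_sharing": False,
    "max_retention_days": 90,
}


def butler_review(disclosure: dict, prefs: dict) -> list[str]:
    """Return every conflict between a site's disclosure and the user's preferences."""
    conflicts = []
    for item in disclosure["collects"]:
        if item in prefs["never_share"]:
            conflicts.append(f"collects {item}, which you never share")
    if disclosure["shares_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        conflicts.append("shares data with third parties")
    if disclosure["retention_days"] > prefs["max_retention_days"]:
        conflicts.append(
            f"keeps data for {disclosure['retention_days']} days "
            f"(your limit is {prefs['max_retention_days']})"
        )
    return conflicts


for problem in butler_review(SITE_DISCLOSURE, USER_PREFERENCES):
    print("conflict:", problem)
```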

Picture Siri as a butler who quizzes you briefly about your preferences and then does your bidding. The privacy settings on an iPhone already let you tell all the different apps on your phone not to collect your location. For the past year, they’ve also allowed you to ask apps not to track you.

Web browsers could serve as privacy butlers, too. Mozilla’s Firefox already lets you block certain kinds of privacy invasions. Now a new technology called the Global Privacy Control is emerging that would interact with websites and instruct them not to “sell” our data. It’s grounded in California’s privacy law, which is among the toughest in the nation, though it remains to be seen how the state will enforce GPC…(More)”.
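
For readers curious about the mechanics: a GPC-enabled browser sends a "Sec-GPC: 1" request header (and exposes navigator.globalPrivacyControl to page scripts). The minimal server-side sketch below shows how a site might honor that signal; the handler shape and the opt_out_of_sale helper are invented for illustration.

```python
# Minimal sketch of honoring a Global Privacy Control (GPC) signal server-side.
# GPC-enabled browsers send the request header "Sec-GPC: 1"; everything else
# here (the handler shape, the opt_out_of_sale helper) is illustrative.

def gpc_enabled(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out."""
    return headers.get("Sec-GPC", "").strip() == "1"


def opt_out_of_sale(user_id: str) -> None:
    # Placeholder for whatever "do not sell or share" means in a real backend:
    # suppressing bidstream sharing, flagging the user in downstream systems, etc.
    print(f"user {user_id}: marked do-not-sell/share")


def handle_request(user_id: str, headers: dict[str, str]) -> None:
    if gpc_enabled(headers):
        opt_out_of_sale(user_id)


# Example: a request from a browser with GPC turned on.
handle_request("user-123", {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"})
```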

We Need to Take Back Our Privacy


Zeynep Tufekci in The New York Times: “…Congress, and states, should restrict or ban the collection of many types of data, especially those used solely for tracking, and limit how long data can be retained for necessary functions — like getting directions on a phone.

Selling, trading and merging personal data should be restricted or outlawed. Law enforcement could obtain it subject to specific judicial oversight.

Researchers have been inventing privacy-preserving methods for analyzing data sets when merging them is in the public interest but the underlying data is sensitive — as when health officials are tracking a disease outbreak and want to merge data from multiple hospitals. These techniques allow computation but make it hard, if not impossible, to identify individual records. Companies are unlikely to invest in such methods, or use end-to-end encryption as appropriate to protect user data, if they could continue doing whatever they want. Regulation could make these advancements good business opportunities, and spur innovation.
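
One concrete example of such a technique is differential privacy. The toy sketch below is only an assumption about how a pooled count might be released (the counts and the epsilon value are made up): calibrated noise is added so that no single patient's presence meaningfully changes the published figure.

```python
# Toy sketch of a differentially private release of a pooled count.
# Counts and epsilon are made up; real deployments need careful calibration.
import random


def dp_count(true_count: int, epsilon: float = 0.5, sensitivity: int = 1) -> float:
    """Add Laplace noise with scale sensitivity/epsilon to a count query."""
    scale = sensitivity / epsilon
    # The difference of two exponentials with the same rate is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise


# Hospitals contribute only their case counts; the aggregator publishes a noisy total.
hospital_counts = [112, 87, 45]  # hypothetical counts from three hospitals
noisy_total = dp_count(sum(hospital_counts))
print(f"released case count: {noisy_total:.0f}")
```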

I don’t think people like things the way they are. When Apple changed a default option from “track me” to “do not track me” on its phones, few people chose to be tracked. And many who accept tracking probably don’t realize how much privacy they’re giving up, and what this kind of data can reveal. Many location collectors get their data from ordinary apps — could be weather, games, or anything else — that often bury the fact that they will share the data with others in vague terms deep in their fine print.

Under these conditions, requiring people to click “I accept” to lengthy legalese for access to functions that have become integral to modern life is a masquerade, not informed consent.

Many politicians have been reluctant to act. The tech industry is generous, cozy with power, and politicians themselves use data analysis for their campaigns. This is all the more reason to press them to move forward…(More)”.

GDPR and the Lost Generation of Innovative Apps


Paper by Rebecca Janßen, Reinhold Kesler, Michael E. Kummer & Joel Waldfogel: “Using data on 4.1 million apps at the Google Play Store from 2016 to 2019, we document that GDPR induced the exit of about a third of available apps; and in the quarters following implementation, entry of new apps fell by half. We estimate a structural model of demand and entry in the app market. Comparing long-run equilibria with and without GDPR, we find that GDPR reduces consumer surplus and aggregate app usage by about a third. Whatever the privacy benefits of GDPR, they come at substantial costs in foregone innovation…(More)”.

Roe draft raises concerns data could be used to identify abortion seekers, providers


Article by Chris Mills Rodrigo: “Concerns that data gathered from people’s interactions with their digital devices could potentially be used to identify individuals seeking or performing abortions have come into the spotlight with the news that pregnancy termination services could soon be severely restricted or banned in much of the United States.

Following the leak of a draft majority opinion indicating that the Supreme Court is poised to overturn Roe v. Wade, the landmark 1973 decision that established the federal right to abortion, privacy advocates are raising alarms about the ways law enforcement officials or anti-abortion activists could make such identifications using data available on the open market, obtained from companies or extracted from devices.

“The dangers of unfettered access to Americans’ personal information have never been more obvious. Researching birth control online, updating a period-tracking app or bringing a phone to the doctor’s office could be used to track and prosecute women across the U.S.,” Sen. Ron Wyden (D-Ore.) said in a statement to The Hill. 

Data from web searches, smartphone location pings and online purchases can all be easily obtained with little to no safeguards.

“Almost everything that you do … data can be captured about it and can be fed into a larger model that can help somebody or some entity infer whether or not you may be pregnant and whether or not you may be someone who’s planning to have an abortion or has had one,” Nathalie Maréchal, senior policy manager at Ranking Digital Rights, explained. 

There are three primary ways that data could travel from individuals’ devices to law enforcement or other groups, according to experts who spoke with The Hill.

The first is via third-party data brokers, which make up a shadowy multibillion-dollar industry dedicated to collecting, aggregating and selling location data harvested from individuals’ mobile phones, an industry that has given advertisers, or virtually anyone willing to pay, unprecedented access to the daily movements of Americans…(More)”.

‘Agile governance’ could redesign policy on data protection. Here’s why that matters


Article by Nicholas Davis: “Although technology regulation is evolving rapidly in today’s world, such regulation remains greatly fragmented across national and regional divides. Agile governance can potentially solve this fragmentation by promoting nimbler, more fluid, and more adaptive approaches to regulation.

Whether it is privacy, cyber security, cyber warfare, national security, or prohibited content, every hot-button issue in technology governance today seems to be of global concern, yet resides in the hands of nationally-focused lawmakers relying on outdated policies that continue to reinforce the fragmentation of technology regulation.

Take data protection, for example. The EU’s General Data Protection Regulation (GDPR), which was first proposed in 2012 and came into effect in 2018, is essentially an international privacy law for data protection. Any organization that processes the personal data of people in the EU is covered.

Beyond its extraterritorial impact, it has inspired similar efforts to update and improve data protection in other jurisdictions, such as in Japan, Chile, Egypt, and the state of California in the United States…(More)”.

The European Data Protection Supervisor (EDPS) launches pilot phase of two social media platforms


Press Release: “The European Data Protection Supervisor (EDPS) launches today the public pilot phase of two social media platforms: EU Voice and EU Video.

EU institutions, bodies, offices and agencies (EUIs) participating in the pilot phase of these platforms will be able to interact with the public by sharing short texts, images and videos on EU Voice; and by sharing, uploading and commenting on videos and podcasts on EU Video.

The two platforms are part of decentralised, free and open-source social media networks that connect users in a privacy-oriented environment, based on Mastodon and PeerTube software. By launching the pilot phase of EU Voice and EU Video, the EDPS aims to contribute to the European Union’s strategy for data and digital sovereignty to foster Europe’s independence in the digital world.

Wojciech Wiewiórowski, EDPS, said: “With the pilot launch of EU Voice and EU Video, we aim to offer alternative social media platforms that prioritise individuals and their rights to privacy and data protection. In concrete terms this means, for example, that EU Voice and EU Video do not rely on transfers of personal data to countries outside the European Union and the European Economic Area; there are no advertisements on the platforms; and there is no profiling of individuals that may use the platforms. These measures, amongst others, give individuals the choice on and control over how their personal data is used.”

The EDPS and the European Commission’s Directorate General for Informatics (DIGIT) have collaborated closely throughout the development of EU Voice and EU Video. In line with the goals of the Commission’s Open Source Software Strategy 2020 – 2023, DIGIT’s technical assistance to the EDPS proves the importance of inter-institutional cooperation on open source as an enabler of privacy rights and data protection, therefore contributing to the EU’s technological sovereignty.

The launch of the pilot phase of EU Voice and EU Video will help the EDPS to test the platforms in practice by collecting feedback from participating EUIs. The EDPS hopes that this first step will mark a continuity in the use of privacy-compliant social media platforms…(More)”.

Guns, Privacy, and Crime


Paper by Alessandro Acquisti & Catherine Tucker: “Open government holds the promise of not just a more efficient but also a more accountable and transparent government. It is not clear, however, how transparent information about citizens and their interaction with government affects the welfare of those citizens, and if so in what direction. We investigate this by using as a natural experiment the effect of the online publication of the names and addresses of holders of handgun carry permits on criminals’ propensity to commit burglaries. In December 2008, a Memphis, TN newspaper published a searchable online database of names, zip codes, and ages of Tennessee handgun carry permit holders. We use detailed crime and handgun carry permit data for the city of Memphis to estimate the impact of publicity about the database on burglaries. We find that burglaries increased in zip codes with fewer gun permits, and decreased in those with more gun permits, after the database was publicized….(More)”
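
The estimation the abstract describes (comparing burglary trends in zip codes with many versus few permits, before and after the database was publicized) has the shape of a difference-in-differences design. The sketch below is a hedged illustration with made-up numbers and invented column names, not the authors' actual specification.

```python
# Hedged sketch of a difference-in-differences comparison like the one the
# abstract describes; the data and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per (zip code, month), with invented values:
#   burglaries   monthly burglary count
#   post         1 for months after the database was publicized (Dec 2008)
#   low_permits  1 if the zip code has relatively few handgun carry permits
df = pd.DataFrame({
    "burglaries":  [30, 28, 41, 44, 22, 21, 18, 17],
    "post":        [0, 0, 1, 1, 0, 0, 1, 1],
    "low_permits": [1, 1, 1, 1, 0, 0, 0, 0],
})

# The post:low_permits coefficient is the diff-in-diff estimate: the additional
# change in burglaries, after publication, in zip codes with fewer permits.
model = smf.ols("burglaries ~ post * low_permits", data=df).fit()
print(model.params)
```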