The Inevitable Weaponization of App Data Is Here


Joseph Cox at VICE: “…After years of warning from researchers, journalists, and even governments, someone used highly sensitive location data from a smartphone app to track and publicly harass a specific person. In this case, Catholic Substack publication The Pillar said it used location data ultimately tied to Grindr to trace the movements of a priest, and then outed him publicly as potentially gay without his consent. The Washington Post reported on Tuesday that the outing led to his resignation….

The data itself didn’t contain each mobile phone user’s real name, but The Pillar and its partner were able to pinpoint which device belonged to Burrill by observing one that appeared at the USCCB staff residence and headquarters, at locations of meetings he attended, as well as at his family lake house and an apartment that lists him as a resident. In other words, they managed to do what experts have long said is easy to do: unmask this specific person and trace their movements across time in a supposedly anonymous dataset.
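
The correlation technique described here is simple enough to sketch. The Python fragment below illustrates the general approach of joining a commercial location-ping dataset against a few places already known to be tied to a target; it is not The Pillar’s actual analysis, which the source does not detail, and every identifier, coordinate, and threshold is invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_devices(pings, known_places, radius_km=0.1, min_places=3):
    """Return advertising IDs observed near at least `min_places` of the
    target's known locations -- the core of a re-identification join."""
    seen = {}  # ad_id -> set of known places the device was observed near
    for ad_id, lat, lon in pings:
        for place, (plat, plon) in known_places.items():
            if haversine_km(lat, lon, plat, plon) <= radius_km:
                seen.setdefault(ad_id, set()).add(place)
    return {ad_id: places for ad_id, places in seen.items() if len(places) >= min_places}

# Hypothetical inputs: every identifier and coordinate here is made up.
known_places = {
    "staff_residence": (38.9336, -77.0564),
    "meeting_venue": (38.9050, -77.0365),
    "lake_house": (43.0100, -88.5400),
}
pings = [
    ("id-7f3a", 38.9337, -77.0563),
    ("id-7f3a", 38.9051, -77.0364),
    ("id-7f3a", 43.0101, -88.5399),
    ("id-02bc", 38.9400, -77.0600),
]
print(candidate_devices(pings, known_places))
# -> {'id-7f3a': {'staff_residence', 'meeting_venue', 'lake_house'}}
```

Even a handful of known places is typically enough to isolate a single advertising ID among millions of devices, which is why researchers have long described such “anonymous” identifiers as trivially re-identifiable.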

A Grindr spokesperson told Motherboard in an emailed statement that “Grindr’s response is aligned with the editorial story published by the Washington Post which describes the original blog post from The Pillar as homophobic and full of unsubstantiated innuendo. The alleged activities listed in that unattributed blog post are infeasible from a technical standpoint and incredibly unlikely to occur. There is absolutely no evidence supporting the allegations of improper data collection or usage related to the Grindr app as purported.”…

“The research from The Pillar aligns to the reality that Grindr has historically treated user data with almost no care or concern, and dozens of potential ad tech vendors could have ingested the data that led to the doxxing,” Zach Edwards, a researcher who has closely followed the supply chain of various sources of data, told Motherboard in an online chat. “No one should be doxxed and outed for adult consenting relationships, but Grindr never treated their own users with the respect they deserve, and the Grindr app has shared user data to dozens of ad tech and analytics vendors for years.”…(More)”.

We need to regulate mind-reading tech before it exists


Abel Wajnerman Paz at Rest of the World: “Neurotechnology” is an umbrella term for any technology that can read and transcribe mental states by decoding and modulating neural activity. This includes technologies like closed-loop deep brain stimulation, which can both detect neural activity related to people’s moods and suppress undesirable symptoms, like depression, through electrical stimulation.

Despite their evident usefulness in education, entertainment, work, and the military, neurotechnologies are largely unregulated. Now, as Chile redrafts its constitution — disassociating it from the Pinochet surveillance regime — legislators are using the opportunity to address the need for closer protection of people’s rights from the unknown threats posed by neurotechnology. 

Although the technology is new, the challenge isn’t. Decades ago, similar international legislation was passed following the development of genetic technologies that made possible the collection and application of genetic data and the manipulation of the human genome. These included the Universal Declaration on the Human Genome and Human Rights in 1997 and the International Declaration on Human Genetic Data in 2003. The difference is that, this time, Chile is a leading light in the drafting of neuro-rights legislation.

In Chile, two bills — a constitutional reform bill, which is awaiting approval by the Chamber of Deputies, and a bill on neuro-protection — will establish neuro-rights for Chileans. These include the rights to personal identity, free will, mental privacy, equal access to cognitive enhancement technologies, and protection against algorithmic bias….(More)”.

Concern trolls and power grabs: Inside Big Tech’s angry, geeky, often petty war for your privacy


Article by Issie Lapowsky: “Inside the World Wide Web Consortium, where the world’s top engineers battle over the future of your data….

The W3C’s members do it all by consensus in public GitHub forums and open Zoom meetings with meticulously documented meeting minutes, creating a rare archive on the internet of conversations between some of the world’s most secretive companies as they collaborate on new rules for the web in plain sight.

But lately, that spirit of collaboration has been under intense strain as the W3C has become a key battleground in the war over web privacy. Over the last year, far from the notice of the average consumer or lawmaker, the people who actually make the web run have converged on this niche community of engineers to wrangle over what privacy really means, how the web can be more private in practice and how much power tech giants should have to unilaterally enact this change.

On one side are engineers who build browsers at Apple, Google, Mozilla, Brave and Microsoft. These companies are frequent competitors that have come to embrace web privacy on drastically different timelines. But they’ve all heard the call of both global regulators and their own users, and are turning to the W3C to develop new privacy-protective standards to replace the tracking techniques businesses have long relied on.

On the other side are companies that use cross-site tracking for things like website optimization and advertising, and are fighting for their industry’s very survival. That includes small firms like 51Degrees, founded by James Rosewell, but also giants of the industry, like Facebook.

Rosewell has become one of this side’s most committed foot soldiers since he joined the W3C last April. Where Facebook’s developers can only offer cautious edits to Apple and Google’s privacy proposals, knowing full well that every exchange within the W3C is part of the public record, Rosewell is decidedly less constrained. On any given day, you can find him in groups dedicated to privacy or web advertising, diving into conversations about new standards browsers are considering.

Rather than asking technical questions about how to make browsers’ privacy specifications work better, he often asks philosophical ones, like whether anyone really wants their browser making certain privacy decisions for them at all. He’s filled the W3C’s forums with concerns about its underlying procedures, sometimes a dozen at a time, and has called upon the W3C’s leadership to more clearly articulate the values for which the organization stands….(More)”.

Luxury Surveillance


Essay by Chris Gilliard and David Golumbia: One of the most troubling features of the digital revolution is that some people pay to subject themselves to surveillance that others are forced to endure and would, if anything, pay to be free of.

Consider a GPS tracker you can wear around one of your arms or legs. Make it sleek and cool — think the Apple Watch or Fitbit — and some will pay hundreds or even thousands of dollars for the privilege of wearing it. Make it bulky and obtrusive, and others, as a condition of release from jail or prison, being on probation, or awaiting an immigration hearing, will be forced to wear one — and forced to pay for it too.

In each case, the device collects intimate and detailed biometric information about its wearer and uploads that data to servers, communities, and repositories. To the providers of the devices, this data and the subsequent processing of it are the main reasons the devices exist. They are means of extraction: That data enables further study, prediction, and control of human beings and populations. While some providers certainly profit from the sale of devices, this secondary market for behavioral control and prediction is where the real money is — the heart of what Shoshana Zuboff rightly calls surveillance capitalism.

The formerly incarcerated person knows that their ankle monitor exists for that purpose: to predict and control their behavior. But the Apple Watch wearer likely thinks about it little, if at all — despite the fact that the watch has the potential to collect and analyze much more data about its user (e.g., health metrics like blood pressure, blood glucose levels, and ECG data) than parole or probation officers are even allowed to gather about their “clients” without a specific warrant. Fitness-tracker wearers are effectively putting themselves on parole and paying for the privilege.

Both the Apple Watch and the Fitbit can be understood as examples of luxury surveillance: surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits they are likely to celebrate. Google, which recently acquired Fitbit, is seemingly leaning into the category, launching a more expensive version of the device named the “Luxe.” Only certain people can afford luxury surveillance, but that is not necessarily a matter of money: In general terms, consumers of luxury surveillance see themselves as powerful and sovereign, and perhaps even immune from unwelcome monitoring and control. They see self-quantification and tracking not as disciplinary or coercive, but as a kind of care or empowerment. They understand it as something extra, something “smart.”…(More)”.

Why We Should End the Data Economy


Essay by Carissa Véliz: “…The data economy undermines equality and fairness. You and your neighbor are no longer treated as equal citizens. You aren’t given an equal opportunity because you are treated differently on the basis of your data. The ads and content you have access to, the prices you pay for the same services, and even how long you wait when you call customer service depend on your data.

We are much better at collecting personal data than we are at keeping it safe. But personal data is a serious threat, and we shouldn’t be collecting it in the first place if we are incapable of keeping it safe. Using smartphone location data acquired from a data broker, reporters from The New York Times were able to track military officials with security clearances, powerful lawyers and their guests, and even the president of the United States (through the phone of someone believed to be a Secret Service agent).

Our current data economy is based on collecting as much personal data as possible, storing it indefinitely, and selling it to the highest bidder. Having so much sensitive data circulating freely is reckless. By designing our economy around surveillance, we are building a dangerous structure for social control that is at odds with freedom. In the surveillance society we are constructing, there is no such thing as under the radar. It shouldn’t be up to us to constantly opt out of data collection. The default matters, and the default should be no data collection…(More)”.

Is there a role for consent in privacy?


Article by Robert Gellman: “After decades, we still talk about the role of notice and choice in privacy. Yet there seems to be broad recognition that notice and choice do nothing for the privacy of consumers. Some American businesses cling to notice and choice because they hate all the alternatives. Some legislators draft laws with elements of notice and choice, either because it’s easier to draft a law that way, because they don’t know any better, or because they carry water for business.

For present purposes, I will talk about notice and choice generically as consent. Consent is a broader concept than choice, but the difference doesn’t matter for the point I want to make. How you frame consent is complex. There are many alternatives and many approaches. It’s not just a matter of opt-in or opt-out. While I’m discarding issues, I also want to acknowledge and set aside the eight basic Fair Information Practices. There is no notice and choice principle in FIPs, and FIPs are not specifically important here.

Until recently, my view was that consent in almost any form is pretty much death for consumer privacy. No matter how you structure it, websites and others will find a way to wheedle consent from consumers. Those who want to exploit consumer data will cajole, pressure, threaten, mystify, obscure, entice or otherwise coax consumers to agree.

Suddenly, I’m not as sure of my conclusion about consent. What changed my mind? There is a new data point from Apple’s App Tracking Transparency framework. Apple requires mobile application developers to obtain opt-in consent before serving targeted advertising via Apple’s Identifier for Advertisers. Early reports suggest consumers are saying “NO” in overwhelming numbers — overwhelming as in more than 90%.

It isn’t this strong consumer reaction that makes me think consent might possibly have a place. I want to highlight a different aspect of the Apple framework….(More)”.

ASEAN Data Management Framework


ASEAN Framework: “Due to the growing interactions between data, connected things and people, trust in data has become the pre-condition for fully realising the gains of digital transformation. SMEs are treading a fine line, balancing digital initiatives with data protection and customer privacy safeguards to ensure that these do not impede innovation. Therefore, there is a motivation to focus on digital data governance, as it is critical to boosting economic integration and technology adoption across all sectors in the ten ASEAN Member States (AMS).
To ensure that their data is appropriately managed and protected, organisations need to know what levels of technical, procedural and physical controls they need to put in place. The categorisation of datasets helps organisations manage their data assets and put in place the right level of controls. This applies both to data at rest and to data in transit. The establishment of an ASEAN Data Management Framework will promote sound data governance practices by helping organisations to discover the datasets they have, assign them the appropriate categories, manage and protect the data accordingly, all while continuing to comply with relevant regulations. Improved governance and protection will instil trust in data sharing both between organisations and between countries, which will then promote the growth of trade and the flow of data among AMS and their partners in the digital economy….(More)”
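
As a rough illustration of the framework’s core workflow (discover datasets, assign each a category, attach baseline controls for data at rest and in transit), here is a minimal Python sketch. The category names, tiers, and control baselines below are invented placeholders, not the framework’s actual scheme.

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    """Hypothetical sensitivity tiers; the framework defines its own
    category names and count."""
    OPEN = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

# Illustrative minimum controls per category, for data at rest and in
# transit. Real baselines would come from the framework itself.
BASELINE_CONTROLS = {
    Category.OPEN: {"at_rest": "none", "in_transit": "none"},
    Category.INTERNAL: {"at_rest": "access control", "in_transit": "TLS"},
    Category.CONFIDENTIAL: {"at_rest": "encryption + access control", "in_transit": "TLS + client auth"},
    Category.RESTRICTED: {"at_rest": "encryption + audit logging", "in_transit": "mutual TLS"},
}

@dataclass
class Dataset:
    name: str
    category: Category

    def required_controls(self) -> dict:
        """Look up the baseline controls this dataset's category demands."""
        return BASELINE_CONTROLS[self.category]

# Discover datasets, categorise them, then read off the required controls.
inventory = [
    Dataset("public_marketing_pages", Category.OPEN),
    Dataset("customer_contact_records", Category.CONFIDENTIAL),
]
for ds in inventory:
    print(f"{ds.name}: {ds.required_controls()}")
```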

Unity in Privacy Diversity: A Kaleidoscopic View of Privacy Definitions


Paper by Bert-Jaap Koops and Maša Galič: “Contrary to the common claim that privacy is a concept in disarray, this Article argues that there is considerable coherence in the way privacy has been conceptualized over many decades of privacy scholarship. Seemingly disparate approaches and widely differing definitions actually share close family resemblances that, viewed together from a bird’s-eye perspective, suggest that the concept of privacy is more akin to a kaleidoscope than to a swamp. As a heuristic device to look at this kaleidoscope, we use a working definition of privacy as having spaces in which you can be yourselves, building on two major strands in privacy theory: identity-building and boundary-management. With this heuristic, we analyze how six authoritative privacy accounts can be understood in the terms and rationale of other definitions. We show how the notions of Cohen (room for boundary management), Johnson (freedom from others’ judgement), Nissenbaum (contextual integrity), Reiman (personhood), Warren and Brandeis (being let alone), and Westin (control over information) have significant overlap with—or may even be equivalent to—an understanding of privacy in terms of identity-fostering spaces. Our kaleidoscopic perspective not only highlights that there is coherence in privacy, but also helps us understand the function and value of having many different privacy definitions around: each time and context brings its own privacy-related challenges, which might best be addressed through a certain conceptualization of privacy that works in that particular context. As the world turns its kaleidoscope of emerging privacy issues, privacy scholarship turns its kaleidoscope of privacy definitions along. The result of this kaleidoscopic perspective on privacy is an illuminating picture of unity in diversity….(More)”.

Governance mechanisms for sharing of health data: An approach towards selecting attributes for complex discrete choice experiment studies


Paper by Jennifer Viberg Johansson: “Discrete Choice Experiment (DCE) is a well-established technique to elicit individual preferences, but it has rarely been used to elicit governance preferences for health data sharing.

The aim of this article was to describe the process of identifying attributes for a DCE study aiming to elicit preferences of citizens in Sweden, Iceland and the UK for governance mechanisms for digitally sharing different kinds of health data in different contexts.

A three-step approach was utilised to inform the attribute and level selection: 1) Attribute identification, 2) Attribute development and 3) Attribute refinement. First, we developed an initial set of potential attributes from a literature review and a workshop with experts. To further develop attributes, focus group discussions with citizens (n = 13), ranking exercises among focus group participants (n = 48) and expert interviews (n = 18) were performed. Thereafter, attributes were refined using group discussion (n = 3) with experts as well as cognitive interviews with citizens (n = 11).

The results led to the selection of seven attributes for further development: 1) level of identification, 2) the purpose of data use, 3) type of information, 4) consent, 5) new data user, 6) collector and 7) the oversight of data sharing. Differences were found between countries in the ordering of the top three attributes. The process surfaced participants’ conceptualisations of the chosen attributes and informed our attribute development phase.
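
To make the output of this selection process concrete, the sketch below shows how attributes like these feed into a DCE: one level per attribute is combined into candidate profiles, and respondents are asked to choose between sets of profiles. The seven attribute names come from the paper; every level, and the random pairing itself, is an invented placeholder (real studies use statistically efficient experimental designs rather than random draws).

```python
import itertools
import random

# Attribute names are from the paper; all levels are invented placeholders.
ATTRIBUTES = {
    "level_of_identification": ["anonymised", "pseudonymised", "identifiable"],
    "purpose_of_data_use": ["public research", "commercial research"],
    "type_of_information": ["lifestyle data", "clinical records", "genetic data"],
    "consent": ["broad consent", "specific consent"],
    "new_data_user": ["university", "private company"],
    "collector": ["hospital", "app provider"],
    "oversight_of_data_sharing": ["ethics board review", "no independent review"],
}

def full_factorial(attributes):
    """Enumerate every profile: one level chosen per attribute."""
    names, levels = zip(*attributes.items())
    return [dict(zip(names, combo)) for combo in itertools.product(*levels)]

def sample_choice_task(profiles, n_alternatives=2, rng=random):
    """Draw one choice task a respondent would answer."""
    return rng.sample(profiles, n_alternatives)

profiles = full_factorial(ATTRIBUTES)
print(len(profiles), "possible profiles")  # 3*2*3*2*2*2*2 = 288
for i, alt in enumerate(sample_choice_task(profiles, rng=random.Random(0)), 1):
    print(f"Alternative {i}: {alt}")
```

Even seven modest attributes yield 288 distinct profiles, which is one reason careful attribute selection, the subject of this paper, matters so much before any experimental design work begins.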

This study demonstrates a process for selecting attributes for a (multi-country) DCE involving three stages: Attribute identification, Attribute development and Attribute refinement. It can contribute to improving the ethical aspects and good practice of this phase in DCE studies. Specifically, it can contribute to the development of governance mechanisms in the digital world, where people’s health data are shared for multiple purposes….(More)”.

Privacy Tech’s Third Generation


“A Review of the Emerging Privacy Tech Sector” by Privacy Tech Alliance and Future of Privacy Forum: “As we enter the third phase of development of the privacy tech market, purchasers are demanding more integrated solutions, product offerings are more comprehensive, and startup valuations are higher than ever, according to a new report from the Future of Privacy Forum and Privacy Tech Alliance. These factors are leading to companies providing a wider range of services, acting as risk management platforms, and focusing on support of business outcomes.

According to the report, “Privacy Tech’s Third Generation: A Review of the Emerging Privacy Tech Sector,” regulations are often the biggest driver for buyers’ initial privacy tech purchases. Organizations also are deploying tools to mitigate potential harms from the use of data. However, buyers serving global markets increasingly need privacy tech that offers data availability and control and supports its utility, in addition to regulatory compliance. 

The report finds the COVID-19 pandemic has accelerated global marketplace adoption of privacy tech as dependence on digital technologies grows. Privacy is becoming a competitive differentiator in some sectors, and TechCrunch reports that 200+ privacy startups have together raised more than $3.5 billion over hundreds of individual rounds of funding….(More)”.