The Inevitable Weaponization of App Data Is Here

Joseph Cox at VICE: “…After years of warning from researchers, journalists, and even governments, someone used highly sensitive location data from a smartphone app to track and publicly harass a specific person. In this case, Catholic Substack publication The Pillar said it used location data ultimately tied to Grindr to trace the movements of a priest, and then outed him publicly as potentially gay without his consent. The Washington Post reported on Tuesday that the outing led to his resignation….

The data itself didn’t contain each mobile phone user’s real name, but The Pillar and its partner were able to pinpoint which device belonged to Burrill by observing one that appeared at the USCCB staff residence and headquarters, the locations of meetings he attended, as well as his family lake house and an apartment that has him listed as a resident. In other words, they managed to, as experts have long said is easy to do, unmask this specific person and their movements across time from a supposedly anonymous dataset.
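The intersection technique described above — finding the one device ID that shows up near every location tied to a known person — can be sketched in a few lines. This is a minimal, hypothetical illustration, not The Pillar's actual method: the `unmask` helper, the device IDs, and the coordinates are all invented for the example.

```python
# Hypothetical sketch of re-identification from an "anonymized" location
# dataset: intersect the sets of devices seen near each location known to
# be tied to the target person. All IDs and coordinates are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def unmask(pings, known_places, radius_m=100):
    """Return device IDs observed within radius_m of EVERY known place.

    pings: list of (device_id, lat, lon) tuples from the ad-tech dataset.
    known_places: list of (lat, lon) locations tied to the target person.
    """
    candidates = None
    for plat, plon in known_places:
        seen_here = {d for d, lat, lon in pings
                     if haversine_m(lat, lon, plat, plon) <= radius_m}
        # Keep only devices that have appeared at every place so far.
        candidates = seen_here if candidates is None else candidates & seen_here
    return candidates or set()

# A device seen near both known locations is singled out; a device seen
# at only one of them is not.
pings = [
    ("ad-id-17", 38.9300, -77.0710),  # near the "residence"
    ("ad-id-17", 38.9055, -76.9990),  # near the "office"
    ("ad-id-42", 38.9300, -77.0710),  # residence only
]
places = [(38.9300, -77.0710), (38.9055, -76.9990)]
assert unmask(pings, places) == {"ad-id-17"}
```

The point of the sketch is that no name is needed anywhere: a handful of publicly knowable locations is enough of a quasi-identifier to collapse an "anonymous" dataset down to one device.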

A Grindr spokesperson told Motherboard in an emailed statement that “Grindr’s response is aligned with the editorial story published by the Washington Post which describes the original blog post from The Pillar as homophobic and full of unsubstantiated innuendo. The alleged activities listed in that unattributed blog post are infeasible from a technical standpoint and incredibly unlikely to occur. There is absolutely no evidence supporting the allegations of improper data collection or usage related to the Grindr app as purported.”…

“The research from The Pillar aligns to the reality that Grindr has historically treated user data with almost no care or concern, and dozens of potential ad tech vendors could have ingested the data that led to the doxxing,” Zach Edwards, a researcher who has closely followed the supply chain of various sources of data, told Motherboard in an online chat. “No one should be doxxed and outed for adult consenting relationships, but Grindr never treated their own users with the respect they deserve, and the Grindr app has shared user data to dozens of ad tech and analytics vendors for years.”…(More)”.

Government algorithms are out of control and ruin lives

Nani Jansen Reventlow at Open Democracy: “Government services are increasingly being automated and technology is relied on more and more to make crucial decisions about our lives and livelihoods. This includes decisions about what type of support we can access in times of need: welfare benefits and other government services.

Technology has the potential to not only reproduce but amplify structural inequalities in our societies. If you combine this drive for automation with a broader context of criminalising poverty and systemic racism, this can have disastrous effects.

A recent example is the ‘child benefits scandal’ that brought down the Dutch government at the start of 2021. In the Netherlands, working parents are eligible for a government contribution toward the costs of daycare. This can run up to 90% of the actual costs for those with a low income. While contributions are often directly paid to childcare providers, parents are responsible for them. This means that, if the tax authorities determine that any allowance was wrongfully paid out, parents are liable for repaying them.

To detect cases of fraud, the Dutch tax authorities used a system that was outright discriminatory. An investigation by the Dutch Data Protection Authority last year showed that parents were singled out for special scrutiny because of their ethnic origin or dual nationality. “The whole system was organised in a discriminatory manner and was also used as such,” it stated.

The fallout of these ‘fraud detection’ efforts was enormous. It is currently estimated that 46,000 parents were wrongly accused of having fraudulently claimed child care allowances. Families were forced to repay tens of thousands of euros, leading to financial hardship, loss of livelihood, homes, and in one case, even loss of life – one parent died by suicide. While we can still hope that justice for these families won’t be denied, it will certainly be delayed: this weekend, it became clear that it could take up to ten years to handle all claims. An unacceptable timeline, given how precarious the situation will be for many of those affected….(More)”.

Luxury Surveillance

Essay by Chris Gilliard and David Golumbia: “One of the most troubling features of the digital revolution is that some people pay to subject themselves to surveillance that others are forced to endure and would, if anything, pay to be free of.

Consider a GPS tracker you can wear around one of your arms or legs. Make it sleek and cool — think the Apple Watch or Fitbit — and some will pay hundreds or even thousands of dollars for the privilege of wearing it. Make it bulky and obtrusive, and others, as a condition of release from jail or prison, being on probation, or awaiting an immigration hearing, will be forced to wear one — and forced to pay for it too.

In each case, the device collects intimate and detailed biometric information about its wearer and uploads that data to servers, communities, and repositories. To the providers of the devices, this data and the subsequent processing of it are the main reasons the devices exist. They are means of extraction: That data enables further study, prediction, and control of human beings and populations. While some providers certainly profit from the sale of devices, this secondary market for behavioral control and prediction is where the real money is — the heart of what Shoshana Zuboff rightly calls surveillance capitalism.

The formerly incarcerated person knows that their ankle monitor exists for that purpose: to predict and control their behavior. But the Apple Watch wearer likely thinks about it little, if at all — despite the fact that the watch has the potential to collect and analyze much more data about its user (e.g. health metrics like blood pressure, blood glucose levels, ECG data) than parole or probation officers are even allowed to gather about their “clients” without a specific warrant. Fitness-tracker wearers are effectively putting themselves on parole and paying for the privilege.

Both the Apple Watch and the Fitbit can be understood as examples of luxury surveillance: surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits they are likely to celebrate. Google, which has recently acquired Fitbit, is seemingly leaning into the category, launching a more expensive version of the device named the “Luxe.” Only certain people can afford luxury surveillance, but that is not necessarily a matter of money: In general terms, consumers of luxury surveillance see themselves as powerful and sovereign, and perhaps even immune from unwelcome monitoring and control. They see self-quantification and tracking not as disciplinary or coercive, but as a kind of care or empowerment. They understand it as something extra, something “smart.”…(More)”.

Google launches new search tool to help combat food insecurity

Article by Andrew J. Hawkins: “Google announced a new website designed to be a “one-stop shop” for people with food insecurity. The “Find Food Support” site includes a food locator tool powered by Google Maps which people can use to search for their nearest food bank, food pantry, or school lunch program pickup site in their community.

Google is working with non-profit groups like No Kid Hungry and FoodFinder, as well as the US Department of Agriculture, to aggregate 90,000 locations with free food support across all 50 states — with more locations to come.

The new site is a product of Google’s newly formed Food for Good team, formerly known as Project Delta when it was headquartered at Alphabet’s X moonshot division. Project Delta’s mission is to “create a smarter food system,” which includes standardizing data to improve communication between food distributors to curb food waste….(More)”.

Unity in Privacy Diversity: A Kaleidoscopic View of Privacy Definitions

Paper by Bert-Jaap Koops and Maša Galič: “Contrary to the common claim that privacy is a concept in disarray, this Article argues that there is considerable coherence in the way privacy has been conceptualized over many decades of privacy scholarship. Seemingly disparate approaches and widely differing definitions actually share close family resemblances that, viewed together from a bird’s-eye perspective, suggest that the concept of privacy is more akin to a kaleidoscope than to a swamp. As a heuristic device to look at this kaleidoscope, we use a working definition of privacy as having spaces in which you can be yourselves, building on two major strands in privacy theory: identity-building and boundary-management. With this heuristic, we analyze how six authoritative privacy accounts can be understood in the terms and rationale of other definitions. We show how the notions of Cohen (room for boundary management), Johnson (freedom from others’ judgement), Nissenbaum (contextual integrity), Reiman (personhood), Warren and Brandeis (being let alone), and Westin (control over information) have significant overlap with—or may even be equivalent to—an understanding of privacy in terms of identity-fostering spaces. Our kaleidoscopic perspective highlights not only that there is coherence in privacy, but also helps to understand the function and value of having many different privacy definitions around: each time and context bring their own privacy-related challenges, which might best be addressed through a certain conceptualization of privacy that works in that particular context. As the world turns its kaleidoscope of emerging privacy issues, privacy scholarship turns its kaleidoscope of privacy definitions along. The result of this kaleidoscopic perspective on privacy is an illuminating picture of unity in diversity….(More)”.

We know what you did during lockdown

An FT Film written by James Graham: “The Covid-19 pandemic has so scrambled our lives that we have barely blinked when the state has told us how many people can attend a wedding, where we can travel or even whether we should hug each other. This normalisation of the abnormal, during the moral panic of a national healthcare emergency, is the subject of People You May Know, a short film written by the playwright James Graham and commissioned by the Financial Times.

One of Britain’s most inquisitive and versatile playwrights, Graham says he has long been worried about the expansion of the “creeping data state” and has an almost “existential anxiety about privacy on all levels, emotional, philosophical, political, social”. Those concerns were first explored in his play Privacy (2014) in response to the revelations of Edward Snowden, the US security contractor turned whistleblower, who described how “the architecture of oppression” of the surveillance state had been built, if not yet fully utilised. 

In his new FT film, Graham investigates how the response to the pandemic has enabled the further intrusion of the data state and what it might mean for us all. “The power of drama is that it allows you to take a few more stepping stones into the imagined future,” he says in a Google Meet interview. …(More) (Film)”

The Case for Better Governance of Children’s Data: A Manifesto

Report by Jasmina Byrne, Emma Day and Linda Raftree: “Every child is different, with a unique identity, and their capacities and circumstances evolve over their lifecycle. Children are more vulnerable than adults and are less able to understand the long-term implications of consenting to their data collection. For these reasons, children’s data deserve to be treated differently.

While responsible data use can underpin many benefits for children, ensuring that children are protected, empowered and granted control of their data is still a challenge.

Maximising the benefits of data use for children and protecting them from harm requires a new model of data governance that is fitting for the 21st century.

UNICEF has worked with 17 global experts to develop a Manifesto that articulates a vision for a better approach to children’s data.

This Manifesto includes key action points and a call for a governance model purposefully designed to deliver on the needs and rights of children. It is the first step in ensuring that children’s rights are given due weight in data governance legal frameworks and processes as they evolve around the world….(More)”

Tech for disabled people is booming around the world. So where’s the funding?

Article by Devi Lockwood: “Erick Ponce works in a government communications department in northern Ecuador. The 26-year-old happens to be deaf — a disability he has had since childhood. Communicating fluidly with his non-signing colleagues at work, and in public spaces like the supermarket, has been a lifelong challenge. 

In 2017, Ponce became one of the first users of an experimental app called SpeakLiz, developed by an Ecuadorian startup called Talov. It transforms written text to sound, transcribes spoken words, and can alert a deaf or hard-of-hearing person to sounds such as an ambulance siren, a motorcycle, music, or a crying baby.

Once he began using SpeakLiz, Ponce’s coworkers — and his family — were able to understand him more easily. “You cannot imagine what it feels like to speak with your son after 20 years,” his father told the app’s engineers. Now a part of the Talov team, Ponce demos new products to make them better before they hit the market. 

The startup has launched two subscription apps on iOS and Android: SpeakLiz, in 2017, for the hearing impaired, and Vision, in 2019, for the visually impaired. Talov’s founders, Hugo Jácome and Carlos Obando, have been working on the apps for over five years. 

SpeakLiz and Vision are, by many measures, successful. Their software is used by more than 7,000 people in 81 countries and is available in 35 languages. The founders won an award from MIT Technology Review and a contest organized by the History Channel. Talov was named among the top 100 most innovative startups in Latin America in 2019. 

But the startup is still struggling. Venture capitalists aren’t knocking on its door. Jácome and Obando sold some of their possessions to raise enough money to launch, and the team has next to no funding to continue expanding.

Although the last few years have seen significant advances in technology and innovation for disabled people, critics say the market is undervalued….(More)”.

The Paths to Digital Self-Determination – A Foundational Theoretical Framework

Paper by Nydia Remolina and Mark Findlay: “A deluge of data is giving rise to new understandings and experiences of society and economy as our digital footprint grows steadily. Are data subjects able to determine themselves in this data-driven society? The emerging debates about autonomy and communal responsibility in the context of data access or protection highlight a pressing imperative to re-imagine the ‘self’ in the digital space. Empowerment, autonomy, sovereignty, human centricity are all terms often associated with the notion of digital self-determination in current policy language. More and more academics, industry experts, policymakers, and regulators are now advocating self-determination in a data-driven world. The attitudes to self-determination range from alienating data as property through to broad considerations of communal access and enrichment. Digital self-determination is a complex notion to be viewed from different perspectives and in unique spaces, re-shaping what we understand as self-determination in the non-digital world. This paper explores the notion of digital self-determination by presenting a foundational theoretical framework based on pre-existent self-determination theories and exploring the implications of the digital society in the determination of the self. Only by better appreciating and critically framing the discussion of digital self-determination is it possible to engage in trustworthy data spaces and ensure ethical human-centric approaches when living in a data-driven society….(More)”.

The Case for Open Land-Data Systems

Tim Hanstad at Project Syndicate: “Last month, a former Zimbabwean cabinet minister was arrested for illegally selling parcels of state land. A few days earlier, a Malaysian court convicted the ex-chairman of a state-owned land development agency of corruption. And in January, the Estonian government collapsed amid allegations of corrupt property dealings. These recent events all turned the spotlight on the growing but neglected threat of land-related corruption.

Such corruption can flourish in countries that are unprepared to manage the heightened demand for land that accompanies economic and population growth. Land governance in these countries – institutions, policies, rules, and records for managing land rights and use – is underdeveloped, which undermines the security of citizens’ land rights and enables covert land grabs by the well connected.

In Ghana, for example, the government keeps land records for only about 2% of currently operating farms; the ownership of the remainder is largely undocumented. In India, these records were, until recently, often kept in disorganized stacks in government offices.

Under such circumstances, corruption becomes relatively easy and lucrative. After all, when recordkeeping is nonexistent or chaotic, who can confidently identify the rightful owner of a parcel of land? As the United Nations Food and Agriculture Organization and Transparency International put it in a report a decade ago, “where land governance is deficient, high levels of corruption often flourish.” This corruption “is pervasive and without effective means of control.”

Globally, one in five people report having paid a bribe to access land services. In Africa, two out of three people believe the rich are likely to pay bribes or use their connections to grab land. Uncertainty about land rights can also affect housing security – around a billion people worldwide say they expect to be forced from their homes over the next five years.

Inevitably, the marginalized and vulnerable are the worst affected, whether they are widows driven from their homes by speculators or entire communities subjected to forced eviction by developers. Weak land rights and corruption also fuel conflict within communities, such as in Kenya, where political parties promise already-occupied land to supporters in an attempt to win votes.

But there is reason for hope. The ongoing revolution in information and communications technology provides unprecedented opportunities to digitize and open land records. Doing so would clarify the land rights of hundreds of millions of people globally and limit the scope for corrupt practices….(More)”.