A data ‘black hole’: Europol ordered to delete vast store of personal data


Article by Apostolis Fotiadis, Ludek Stavinoha, Giacomo Zandonini, Daniel Howden: “…The EU’s police agency, Europol, will be forced to delete much of a vast store of personal data that the bloc’s data protection watchdog has found it amassed unlawfully. The unprecedented finding from the European Data Protection Supervisor (EDPS) targets what privacy experts are calling a “big data ark” containing billions of points of information. Sensitive data in the ark has been drawn from crime reports, hacked from encrypted phone services and sampled from asylum seekers never involved in any crime.

According to internal documents seen by the Guardian, Europol’s cache contains at least 4 petabytes – equivalent to 3m CD-Roms or a fifth of the entire contents of the US Library of Congress. Data protection advocates say the volume of information held on Europol’s systems amounts to mass surveillance and is a step on its road to becoming a European counterpart to the US National Security Agency (NSA), the organisation whose clandestine online spying was revealed by whistleblower Edward Snowden….(More)”.

The Crowdsourced Panopticon


Book by Jeremy Weissman: “Behind the omnipresent screens of our laptops and smartphones, a digitally networked public has quickly grown larger than the population of any nation on Earth. On the flipside, in front of the ubiquitous recording devices that saturate our lives, individuals are hyper-exposed through a worldwide online broadcast that encourages the public to watch, judge, rate, and rank people’s lives. The interplay of these two forces – the invisibility of the anonymous crowd and the exposure of the individual before that crowd – is a central focus of this book. Informed by critiques of conformity and mass media by some of the greatest philosophers of the past two centuries, as well as by a wide range of historical and empirical studies, Weissman helps shed light on what may happen when our lives are increasingly broadcast online for everyone all the time, to be judged by the global community…(More)”.

Data for Common Purpose: Leveraging Consent to Build Trust


Report by the World Economic Forum: “Over the past few decades, the digital world has been a breeding ground for bad actors, data breaches, and dark patterns of data collection and use. Shifting individuals’ perceptions away from skepticism to a position of trust is no easy task with no easy answers. This report provides a pragmatic approach to strengthen the engagement of individuals and positively affect the experiences of those who contribute data for the common good…(More)”.

Amsterdam introduces mandatory register for sensors


Sarah Wray at Cities Today: “Private companies, research institutions and government organisations in Amsterdam are now obliged to report sensors deployed in public spaces.

The information is being displayed via an online map to give residents more insight into how, where and what data is collected from sources such as cameras, air quality and traffic sensors, Wi-Fi counters and smart billboards. The map shows the type of sensor, the owner and whether personal data is processed.
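
To make the register's contents concrete, here is a minimal, hypothetical sketch of what a single entry might look like; the field names and values are assumptions for illustration only and do not reflect the City of Amsterdam's actual schema.

```python
# Hypothetical sensor-register entry; field names and values are illustrative,
# not the City of Amsterdam's actual schema.
sensor_entry = {
    "sensor_type": "traffic camera",              # e.g. camera, air-quality sensor, Wi-Fi counter, smart billboard
    "owner": "Example Media B.V.",                # organisation responsible for the sensor
    "processes_personal_data": True,              # whether personal data is processed
    "mobile": False,                              # mobile sensors must also be reported
    "location": {"lat": 52.3728, "lon": 4.8936},  # position shown on the public map
}

print(f"{sensor_entry['sensor_type']} owned by {sensor_entry['owner']}")
```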

A statement from the city said: “Amsterdam believes that residents have the right to know where and when data is collected. The sensor register and the reporting obligation help to create awareness. It is one of the 18 actions from the Amsterdam Data Strategy.”

The requirement applies to new sensors and those that are already installed in the city, including mobile sensors.

So far, only sensors from the City of Amsterdam have been included in the register. Other owners are now urged to report their sensors and have until 1 June 2022 to do so before enforcement action is taken.

If there is no response even after warnings, the municipality can remove the sensor at the owner’s expense, the city said.

The obligation to report sensors is part of a regulation update recently passed by the City Council…(More)”.

Pandemic Privacy


A Preliminary Analysis of Collection Technologies, Data Collection Laws, and Legislative Reform during COVID-19 by Benjamin Ballard, Amanda Cutinha, and Christopher Parsons: “…a preliminary comparative analysis of how different information technologies were mobilized in response to COVID-19 to collect data, the extent to which Canadian health, privacy, or emergency laws impeded the response to COVID-19, and ultimately, the potential consequences of reforming data protection or privacy laws to enable more expansive data collection, use, or disclosure of personal information in future health emergencies. In analyzing how data has been collected in the United States, United Kingdom, and Canada, we found that while many of the data collection methods could be mapped onto a trajectory of past collection practices, the breadth and extent of data collection in tandem with how communications networks were repurposed constituted novel technological responses to a health crisis. Similarly, while the intersection of public and private interests in providing healthcare and government services is not new, the ability of private companies such as Google and Apple to forcefully shape some of the technology-enabled pandemic responses speaks to the significant capacity of private companies to guide or direct public health measures that rely on contemporary smartphone technologies. While we found that the uses of technologies were linked to historical efforts to combat the spread of disease, the nature and extent of private surveillance to enable public action was arguably unprecedented….(More)”.

22 Questions to Assess Responsible Data for Children (RD4C)


An Audit Tool by The GovLab and UNICEF: “Around the world and across domains, institutions are using data to improve service delivery for children. Data for and about children can, however, pose risks of misuse, such as unauthorized access or data breaches, as well as missed use of data that could have improved children’s lives if harnessed effectively. 

The RD4C Principles — Participatory; Professionally Accountable; People-Centric; Prevention of Harms Across the Data Life Cycle; Proportional; Protective of Children’s Rights; and Purpose-Driven — were developed by the GovLab and UNICEF to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence. These principles were developed to act as a north star, guiding practitioners toward more responsible data practices.

Today, The GovLab and UNICEF, as part of the Responsible Data for Children initiative (RD4C), are pleased to launch a new tool that aims to put the principles into practice. 22 Questions to Assess Responsible Data for Children (RD4C) is an audit tool to help stakeholders involved in the administration of data systems that handle data for and about children align their practices with the RD4C Principles. 

The tool encourages users to reflect on their data handling practices and strategy by posing questions regarding: 

  • Why: the purpose and rationale for the data system;
  • What: the data handled through the system; 
  • Who: the stakeholders involved in the system’s use, including data subjects;
  • How: the presence of operations, policies, and procedures; and 
  • When and where: temporal and place-based considerations….(More)”.

Privacy Principles for Mobility Data


About: “The Principles are a set of values and priorities intended to guide the mobility ecosystem in the responsible use of data and the protection of individual privacy. They are intended to serve as a guiding “North Star” to assess technical and policy decisions that have implications for privacy when handling mobility data. The principles are designed to apply to all sectors, including public, private, research and non-profit….

Increasingly, organizations in the public, private and nonprofit sectors are faced with decisions that have data privacy implications. For organizations utilizing mobility data, these principles provide a baseline framework to both identify and address these situations. Individuals whose data is being collected, utilized and shared must be afforded proper protections and opportunities for agency in how information about them is used and handled. These principles offer guidance for how to engage in this process.

Human movement generates data in many ways: directly through the usage of GPS-enabled mobility services or devices, indirectly through phones or other devices with geolocation and even through cameras and other sensors that observe the public realm. While these principles were written with shared mobility services in mind, many of them will be applicable in other contexts in which data arising out of individual movement is collected and analyzed. We encourage any organization working with this type of data to adapt and apply these principles in their specific context.

While not all mobility data may present a privacy risk to individuals, all stakeholders managing mobility data should treat it as personal information that is sensitive, unless it can be demonstrated that it doesn’t present a privacy risk to individuals.

These principles were developed through a collaboration organized by the New Urban Mobility (NUMO) alliance, the North American Bikeshare & Scootershare Association (NABSA) and the Open Mobility Foundation (OMF) in 2020. These groups convened a diverse set of stakeholders representing cities, mobility service providers, technology companies, privacy advocates and academia. Over the course of many months, this group heard from privacy experts, discussed key topics related to data privacy and identified core ideas and common themes to serve as a basis for these Principles….(More)”.

Evaluating the trade-off between privacy, public health safety, and digital security in a pandemic


Paper by Titi Akinsanmi and Aishat Salami: “COVID-19 has impacted all aspects of everyday normalcy globally. During the height of the pandemic, people shared their personal information (PI) with one goal—to protect themselves from contracting an “unknown and rapidly mutating” virus. The technologies (from applications based on mobile devices to online platforms) collect (with or without informed consent) large amounts of PI, including location, travel, and personal health information. These were deployed to monitor, track, and control the spread of the virus. However, many of these measures encouraged trading off privacy for safety. In this paper, we reexamine the nature of privacy through the lens of safety, focusing on the health sector, digital security, and what constitutes an infraction or otherwise of the privacy rights of individuals in a pandemic as experienced in the past 18 months. This paper makes a case for maintaining a balance between the benefits that contact-tracing apps offer in containing COVID-19 and the need to ensure end-user privacy and data security. Specifically, it strengthens the case for designing with transparency and accountability measures and safeguards in place, as these are critical to protecting the privacy and digital security of users in the use, collection, and retention of their data. We recommend oversight measures to ensure compliance with the principles of lawful processing, knowing that these, among others, would ensure the integration of privacy-by-design principles even in unforeseen crises like an ongoing pandemic; entrench public trust and acceptance; and protect the digital security of people…(More)”.

Can data die?


Article by Jennifer Ding: “…To me, the crux of the Lenna story is how little power we have over our data and how it is used and abused. This threat seems disproportionately higher for women who are often overrepresented in internet content, but underrepresented in internet company leadership and decision making. Given this reality, engineering and product decisions will continue to consciously (and unconsciously) exclude our needs and concerns.

While social norms are shifting against non-consensual data collection and data exploitation, digital norms seem to be moving in the opposite direction. Advancements in machine learning algorithms and data storage capabilities are only making data misuse easier. Whether the outcome is revenge porn or targeted ads, surveillance or discriminatory AI, if we want a world where our data can retire when it’s outlived its time, or when it’s directly harming our lives, we must create the tools and policies that empower data subjects to have a say in what happens to their data… including allowing their data to die…(More)”

Nonprofit Websites Are Riddled With Ad Trackers


Article by Alfred Ng and Maddy Varner: “Last year, nearly 200 million people visited the website of Planned Parenthood, a nonprofit that many people turn to for very private matters like sex education, access to contraceptives, and access to abortions. What those visitors may not have known is that as soon as they opened plannedparenthood.org, some two dozen ad trackers embedded in the site alerted a slew of companies whose business is not reproductive freedom but gathering, selling, and using browsing data.

The Markup ran Planned Parenthood’s website through our Blacklight tool and found 28 ad trackers and 40 third-party cookies tracking visitors, in addition to so-called “session recorders” that could be capturing the mouse movements and keystrokes of people visiting the homepage in search of things like information on contraceptives and abortions. The site also contained trackers that tell Facebook and Google if users visited the site.

The Markup’s scan found Planned Parenthood’s site communicating with companies like Oracle, Verizon, LiveRamp, TowerData, and Quantcast—some of which have made a business of assembling and selling access to masses of digital data about people’s habits.

Katie Skibinski, vice president for digital products at Planned Parenthood, said the data collected on its website is “used only for internal purposes by Planned Parenthood and our affiliates,” and the organization doesn’t “sell” data to third parties.

“While we aim to use data to learn how we can be most impactful, at Planned Parenthood, data-driven learning is always thoughtfully executed with respect for patient and user privacy,” Skibinski said. “This means using analytics platforms to collect aggregate data to gather insights and identify trends that help us improve our digital programs.”

Skibinski did not dispute that the organization shares data with third parties, including data brokers.

A Blacklight scan of Planned Parenthood Gulf Coast—a localized website specifically for people in the Gulf region, including Texas, where abortion has been essentially outlawed—churned up similar results.

Planned Parenthood is not alone: other nonprofits, some operating in sensitive areas like mental health and addiction, also gather and share data on website visitors.

Using our Blacklight tool, The Markup scanned more than 23,000 websites of nonprofit organizations, including those belonging to abortion providers and nonprofit addiction treatment centers. The Markup used the IRS’s nonprofit master file to identify nonprofits that have filed a tax return since 2019 and that the agency categorizes as focusing on areas like mental health and crisis intervention, civil rights, and medical research. We then examined each nonprofit’s website as publicly listed in GuideStar. We found that about 86 percent of them had third-party cookies or tracking network requests. By comparison, when The Markup did a survey of the top 80,000 websites in 2020, we found 87 percent used some type of third-party tracking.
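
As a rough, hypothetical illustration of one step in this kind of scan, the sketch below fetches a page and flags script tags served from hosts other than the site itself. It is not The Markup's Blacklight code, which drives a headless browser and also inspects cookies, tracking pixels, and session recorders; the function name and example URL are assumptions for illustration only.

```python
# Rough, hypothetical sketch of flagging third-party script hosts on a page.
# Not The Markup's Blacklight implementation; this only approximates one step:
# spotting scripts loaded from domains other than the page's own.
import re
from urllib.parse import urlparse

import requests

def third_party_script_hosts(url: str) -> set:
    """Return the external hostnames serving <script src="..."> tags on a page."""
    page_host = urlparse(url).hostname or ""
    html = requests.get(url, timeout=10).text
    hosts = set()
    # Assumes double-quoted src attributes, which is enough for a sketch.
    for src in re.findall(r'<script[^>]+src="([^"]+)"', html, flags=re.IGNORECASE):
        host = urlparse(src).hostname
        if host and not host.endswith(page_host):  # crude first-party check
            hosts.add(host)
    return hosts

if __name__ == "__main__":
    # Example only; any publicly reachable URL could be checked the same way.
    for host in sorted(third_party_script_hosts("https://example.org")):
        print(host)
```

Because a static fetch like this cannot execute JavaScript or observe cookies being set, it would understate the tracking that a full Blacklight-style crawl reports.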

About 11 percent of the 23,856 nonprofit websites we scanned had a Facebook pixel embedded, while 18 percent used the Google Analytics “Remarketing Audiences” feature.

The Markup found that 439 of the nonprofit websites loaded scripts called session recorders, which can monitor visitors’ clicks and keystrokes. Eighty-nine of those were for websites that belonged to nonprofits that the IRS categorizes as primarily focusing on mental health and crisis intervention issues…(More)”.