How do we ensure anonymisation is effective?


Chapter by the Information Commissioner’s Office (UK): “Effective anonymisation reduces identifiability risk to a sufficiently remote level.
• Identifiability is about whether someone is “identified or identifiable”. This doesn’t just concern someone’s name, but also other information and factors that can distinguish them from someone else.
• Identifiability exists on a spectrum, where the status of information can change depending on the circumstances of its processing.
• When assessing whether someone is identifiable, you need to take account of the “means reasonably likely to be used”. You should base this on objective factors such as the costs and time required to identify, the available technologies, and the state of technological development over time.
• However, you do not need to take into account any purely hypothetical or theoretical chance of identifiability. The key is what is reasonably likely relative to the circumstances, not what is conceivably possible in absolute terms.
• You also need to consider both the information itself and the environment in which it is processed. This will be affected by the type of data release (to the public, to a defined group, etc.) and the status of the information in the other party’s hands.
• When considering releasing anonymous information to the world at large, you may have to implement more robust techniques to achieve effective anonymisation than when releasing to particular groups or individual organisations.
• There are likely to be many borderline cases where you need to use careful judgement based on the specific circumstances of the case.
• Applying a “motivated intruder” test is a good starting point to consider identifiability risk.
• You should review your risk assessments and decision-making processes at appropriate intervals. The appropriate time for, and frequency of, any reviews depends on the circumstances…(More)”.
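To make the identifiability assessment concrete, here is a minimal sketch, in Python, of one common proxy for identifiability risk: a k-anonymity check, where k is the size of the smallest group of records sharing the same combination of quasi-identifiers. A record with k = 1 is unique on those attributes, which is exactly what a motivated intruder with matching auxiliary knowledge would look for. This is an illustration only, not part of the ICO’s guidance; the dataset and column names are invented for the example.

```python
# Minimal k-anonymity check (illustrative sketch; not ICO guidance).
# Quasi-identifiers are attributes that do not name anyone directly but,
# in combination, can distinguish one person from another.
from collections import Counter

def smallest_group_size(records, quasi_identifiers):
    """Return k: the size of the smallest group of records that share
    the same values on the given quasi-identifier columns."""
    groups = Counter(
        tuple(record[col] for col in quasi_identifiers) for record in records
    )
    return min(groups.values())

# Invented example data.
records = [
    {"age_band": "30-39", "postcode_area": "SW1", "condition": "asthma"},
    {"age_band": "30-39", "postcode_area": "SW1", "condition": "diabetes"},
    {"age_band": "40-49", "postcode_area": "N1", "condition": "asthma"},
]

k = smallest_group_size(records, ["age_band", "postcode_area"])
print(f"k = {k}")  # k = 1: the 40-49/N1 record is unique on these attributes,
                   # so anyone who knows that person's age band and postcode
                   # area could re-identify their record.
```

A real assessment would go further (stronger notions such as l-diversity or differential privacy, plus the environmental factors the ICO lists above), but a low k on plausible quasi-identifiers is a quick signal that anonymisation is not yet effective.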

Using location data responsibly in cities and local government


Article by Ben Hawes: “City and local governments increasingly recognise the power of location data to help them deliver more and better services, exactly where and when they are needed. The use of this data is going to grow, with more pressure to manage resources and with emerging challenges, including responding to extreme weather events and other climate impacts.

But using location data to target and manage local services comes with risks to the equitable delivery of services, privacy and accountability. To make the best use of these growing data resources, city leaders and their teams need to understand those risks and address them, and to be able to explain their uses of data to citizens.

The Locus Charter, launched earlier this year, is a set of common principles to promote responsible practice when using location data. The Charter could be very valuable to local governments, to help them navigate the challenges and realise the rewards offered by data about the places they manage….

Compared to private companies, local public bodies already have special responsibilities to ensure transparency and fairness. New sources of data can help, but can also generate new questions. Local governments have generally been able to improve services as they learned more about the people they served. Now they must manage the risks of knowing too much about people, and acting intrusively. They can also risk distorting service provision because their data about people in places is uneven or unrepresentative.

Many city and local governments fully recognise that data-driven delivery comes with risks, and are developing specific local data ethics frameworks to guide their work. Some of these, like Kansas City’s, are specifically aimed at managing data privacy. Others cover broader uses of data, like Greater Manchester’s Declaration for Intelligent and Responsible Data Practice. A separate initiative, DTPR (Digital Transparency in the Public Realm), is an open-source communication standard that helps people understand how data is being used in public places.

London is engaging citizens on an Emerging Technology Charter to explore new and ethically charged questions around data. The GovLab supports an AI Localism repository of actions taken by local decision-makers to address the use of AI within a city or community. The EU SHERPA programme (Shaping the Ethical Dimensions of Smart Information Systems) includes a smart cities strand, and has published a case study on the Ethics of Using Smart City AI and Big Data.

Smart city applications make it possible to collect data in many ways and for many purposes, but the technologies themselves cannot answer questions about what is appropriate. In The Smart Enough City: Putting Technology in its Place to Reclaim Our Urban Future (2019), author Ben Green describes examples of cities that have failed, and others that have succeeded, in judging which smart applications should be used.

Attention to what constitutes ethical practice with location data can give additional help to leaders making that kind of judgement….(More)”

Building Consumer Confidence Through Transparency and Control


Cisco 2021 Consumer Privacy Survey: “Protecting privacy continues to be a critical issue for individuals, organizations, and governments around the world. Eighteen months into the COVID-19 pandemic, our health information and vaccination status are needed more than ever to understand the virus, control the spread, and enable safer environments for work, learning, recreation, and other activities. Nonetheless, people want privacy protections to be maintained, and they expect organizations and governments to keep their data safe and to use it only for pandemic response. Individuals are also increasingly taking action to protect themselves and their data. This report, our third annual review of consumer privacy, explores current trends, challenges, and opportunities in privacy for consumers.

The report draws upon data gathered from a June 2021 survey in which respondents were not informed of who was conducting the study and were anonymous to the researchers. Respondents included 2,600 adults (over the age of 18) in 12 countries (5 in Europe, 4 in Asia Pacific, and 3 in the Americas). Participants were asked about their attitudes and activities regarding companies’ use of their personal data, their level of support for COVID-19-related information sharing, their awareness of and reaction to privacy legislation, and their attitudes regarding artificial intelligence (AI) and automated decision making.

The findings from this research demonstrate the growing importance of privacy to the individual and its implications for the businesses and governments that serve them. Key highlights of this report include:

  1. Consumers want transparency and control with respect to business data practices – an increasing number will act to protect their data
  2. Privacy laws are viewed very positively around the world, but awareness of these laws remains low
  3. Despite the ongoing pandemic, most consumers want little or no reduction in privacy protections, while still supporting public health and safety efforts
  4. Consumers are very concerned about the use of their personal information in AI, and its abuse has eroded trust…(More)”.

The Downside to State and Local Privacy Regulations


GovTech: “To fight back against cyber threats, state and local governments have started to implement tighter privacy regulations. But is this trend a good thing? Or do stricter rules present more challenges than they do solutions?

According to Daniel Castro, vice president of the Information Technology and Innovation Foundation, one of the main issues with stricter privacy regulations is the lack of centralized rules for states to follow.

“Probably the biggest problem is states setting up a set of contradictory overlapping rules across the country,” Castro said. “This creates a serious cost on organizations and businesses. They can abide by 50 state privacy laws, but there could be different regulations across local jurisdictions.”

One example of a hurdle for organizations and businesses is local jurisdictions creating specific rules for facial recognition and biometric technology.

“Let’s say a company starts selling a smart doorbell service; because of these rules, this service might not be able to be legally sold in one jurisdiction,” Castro said.

Another concern relates to the distinction between government data collection and commercial data collection, said Washington state Chief Privacy Officer Katy Ruckle. Sometimes there is a notion that one law can apply to everything, but different data types involve different types of rights for individuals.

“An example I like to use is somebody that’s been committed to a mental health institution for mental health needs,” Ruckle said. “Their data collection is very different from somebody buying a vacuum cleaner off Amazon.”

On the topic of governments collecting data, Castro emphasized the importance of knowing how data will be utilized in order to set appropriate privacy regulations….(More)”

The Battle for Digital Privacy Is Reshaping the Internet


Brian X. Chen at The New York Times: “Apple introduced a pop-up window for iPhones in April that asks people for their permission to be tracked by different apps.

Google recently outlined plans to disable a tracking technology in its Chrome web browser.

And Facebook said last month that hundreds of its engineers were working on a new method of showing ads without relying on people’s personal data.

The developments may seem like technical tinkering, but they were connected to something bigger: an intensifying battle over the future of the internet. The struggle has entangled tech titans, upended Madison Avenue and disrupted small businesses. And it heralds a profound shift in how people’s personal information may be used online, with sweeping implications for the ways that businesses make money digitally.

At the center of the tussle is what has been the internet’s lifeblood: advertising.

More than 20 years ago, the internet drove an upheaval in the advertising industry. It eviscerated newspapers and magazines that had relied on selling classified and print ads, and threatened to dethrone television advertising as the prime way for marketers to reach large audiences….

If personal information is no longer the currency that people give for online content and services, something else must take its place. Media publishers, app makers and e-commerce shops are now exploring different paths to surviving a privacy-conscious internet, in some cases overturning their business models. Many are choosing to make people pay for what they get online by levying subscription fees and other charges instead of using their personal data.

Jeff Green, the chief executive of the Trade Desk, an ad-technology company in Ventura, Calif., that works with major ad agencies, said the behind-the-scenes fight was fundamental to the nature of the web…(More)”

The State of Consumer Data Privacy Laws in the US (And Why It Matters)


Article by Thorin Klosowski at the New York Times: “With more of the things people buy being internet-connected, more of our reviews and recommendations at Wirecutter are including lengthy sections detailing the privacy and security features of such products, everything from smart thermostats to fitness trackers. As the data these devices collect is sold and shared—and hacked—deciding what risks you’re comfortable with is a necessary part of making an informed choice. And those risks vary widely, in part because there’s no single, comprehensive federal law regulating how most companies collect, store, or share customer data.

Most of the data economy underpinning common products and services is invisible to shoppers. As your data gets passed around between countless third parties, there aren’t just more companies profiting from your data, but also more possibilities for your data to be leaked or breached in a way that causes real harm. In just the past year, we’ve seen a news outlet use pseudonymous app data, allegedly leaked from an advertiser associated with the dating app Grindr, to out a priest. We’ve read about the US government buying location data from a prayer app. Researchers have found opioid-addiction treatment apps sharing sensitive data. And T-Mobile recently suffered a data breach that affected at least 40 million people, some of whom had never even had a T-Mobile account.

“We have these companies that are amassing just gigantic amounts of data about each and every one of us, all day, every day,” said Kate Ruane, senior legislative counsel for the First Amendment and consumer privacy at the American Civil Liberties Union. Ruane also pointed out how data ends up being used in surprising ways—intentionally or not—such as in targeting ads or adjusting interest rates based on race. “Your data is being taken and it is being used in ways that are harmful.”

Consumer data privacy laws can give individuals rights to control their data, but if poorly implemented such laws could also maintain the status quo. “We can stop it,” Ruane continued. “We can create a better internet, a better world, that is more privacy protective.”…(More)”

Unpacking China’s game-changing data law


Article by Shen Lu: “China’s National People’s Congress passed the highly anticipated Personal Information Protection Law on Friday, a significant piece of legislation that will provide Chinese citizens with substantial privacy protections while also bolstering Beijing’s ambitions to set international norms in data protection.

China’s PIPL is not only key to Beijing’s vision for a next-generation digital economy; it is also likely to influence other countries currently adopting their own data protection laws.

The new law clearly draws inspiration from the European Union’s General Data Protection Regulation and, like its precursor, is an effort to respond to genuine grassroots demand for greater consumer privacy rights. But what distinguishes China’s PIPL from the GDPR and other laws on the books is its emphasis on national security, a broadly defined trump card that triggers data localization requirements and cross-border data flow restrictions….

The PIPL reinforces Beijing’s ambition to defend its digital sovereignty. If foreign entities “engage in personal information handling activities that violate the personal information rights and interests of citizens of the People’s Republic of China, or harm the national security or public interest of the People’s Republic of China,” China’s enforcement agencies may blacklist them, “limiting or prohibiting the provision of personal information to them.” And China may reciprocate against countries or regions that adopt “discriminatory prohibitions, limitations or other similar measures against the People’s Republic of China in the area of personal information protection.”…

Many Asian governments are in the process of writing or rewriting data protection laws. Vietnam, India, Pakistan and Sri Lanka have all inserted localization provisions in their respective data protection laws. “[The PIPL framework] can provide encouragement to countries that would be tempted to use the data protection law that includes data transfer provisions to add this national security component,” Girot said.

This new breed of data protection law could lead to a fragmented global privacy landscape. Localization requirements can be a headache for transnational tech companies, particularly cloud service providers. And the Cyberspace Administration of China (CAC), one of the data regulators in charge of implementing and enforcing the PIPL, is also tasked with implementing a national security policy, which could present a challenge to international cooperation…(More)”.

AI, big data, and the future of consent


Paper by Adam J. Andreotta, Nin Kirkham & Marco Rizzi: “In this paper, we discuss several problems with current Big data practices which, we claim, seriously erode the role of informed consent as it pertains to the use of personal information. To illustrate these problems, we consider how the notion of informed consent has been understood and operationalised in the ethical regulation of biomedical research (and medical practices, more broadly) and compare this with current Big data practices. We do so by first discussing three types of problems that can impede informed consent with respect to Big data use. First, we discuss the transparency (or explanation) problem. Second, we discuss the re-purposed data problem. Third, we discuss the meaningful alternatives problem. In the final section of the paper, we suggest some solutions to these problems. In particular, we propose that the use of personal data for commercial and administrative objectives could be subject to a ‘soft governance’ ethical regulation, akin to the way that all projects involving human participants (e.g., social science projects, human medical data and tissue use) are regulated in Australia through Human Research Ethics Committees (HRECs). We also consider alternatives to standard consent forms and privacy policies that could make use of some of the latest research focussed on the usability of pictorial legal contracts…(More)”

The Future of Digital Surveillance


Book by Yong Jin Park: “Are humans hard-wired to make good decisions about managing their privacy in an increasingly public world? Or are we helpless victims of surveillance through our use of invasive digital media? Exploring the chasm between the tyranny of surveillance and the ideal of privacy, this book traces the origins of personal data collection in digital technologies including artificial intelligence (AI) embedded in social network sites, search engines, mobile apps, the web, and email. The Future of Digital Surveillance argues against a technologically deterministic view—digital technologies by nature do not cause surveillance. Instead, the shaping of surveillance technologies is embedded in a complex set of individual psychology, institutional behaviors, and policy principles….(More)”

Privacy Tradeoffs: Who Should Make Them, and How?


Paper by Jane R. Bambauer: “Privacy debates are contentious in part because we have not reached a broadly recognized cultural consensus about whether interests in privacy are like most other interests that can be traded off in utilitarian, cost-benefit terms, or if instead privacy is different—fundamental to conceptions of dignity and personal liberty. Thus, at the heart of privacy debates is an unresolved question: is privacy just another interest that can and should be bartered, mined, and used in the economy, or is it different?

This question identifies and isolates a wedge between those who hold essentially utilitarian views of ethics (and who would see many data practices as acceptable) and those who hold views of natural and fundamental rights (for whom common data mining practices are either never acceptable or, at the very least, never acceptable without significant participation and consent of the subject).

This essay provides an intervention of a purely descriptive sort. First, I lay out several candidates for ethical guidelines that might legitimately undergird privacy law and policy. Only one of the ethical models (the natural right to sanctuary) can track the full scope and implications of fundamental rights-based privacy laws like the GDPR.

Second, the project contributes to the field of descriptive ethics by using a vignette experiment to discover which of the various ethical models people actually do seem to hold and abide by. The vignette study uses a factorial design to help isolate the roles of various factors that may contribute to the respondents’ gauge of what an ethical firm should or should not do in the context of personal data use as well as two other non-privacy-related contexts. The results can shed light on whether privacy-related ethics are different and distinct from business ethics more generally. They also illuminate which version(s) of “good” and “bad” share broad support and deserve to be reflected in privacy law or business practice.
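To make “factorial design” concrete, here is a minimal sketch, in Python, of how such a vignette experiment could be randomized. The factors, levels, and wording are illustrative assumptions, not those of Bambauer’s actual instrument. Crossing every level of every factor, and assigning each respondent one cell at random, is what lets the analysis isolate each factor’s contribution to the ethics ratings.

```python
# Illustrative factorial vignette design (invented factors and wording,
# not the paper's). Every combination of factor levels is one "cell";
# each respondent is randomly assigned a single cell.
import itertools
import random

factors = {
    "context": ["personal data use", "product safety", "labor practices"],
    "beneficiary": ["mostly benefits the firm", "mostly benefits its customers"],
    "consent": ["with the subjects' consent", "without the subjects' consent"],
}

# Cartesian product of the levels: 3 * 2 * 2 = 12 cells.
cells = list(itertools.product(*factors.values()))

def draw_vignette(rng=random):
    """Randomly assign a respondent to one cell and render its vignette text."""
    context, beneficiary, consent = rng.choice(cells)
    return (f"A firm takes an action involving {context} that {beneficiary}, "
            f"{consent}. How ethically did the firm behave?")

print(len(cells))       # 12
print(draw_vignette())  # one randomly drawn vignette
```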

The results of the vignette experiment show that on balance, Americans subscribe to some form of utilitarianism, although a substantial minority subscribe to a natural right to sanctuary approach. Thus, consent and prohibitions of data practices are appropriate where the likely risks to some groups (most importantly, data subjects, but also firms and third parties) outweigh the benefits….(More)”