The Downside to State and Local Privacy Regulations


GovTech: “To fight back against cyber threats, state and local governments have started to implement tighter privacy regulations. But is this trend a good thing? Or do stricter rules present more challenges than they do solutions?

According to Daniel Castro, vice president of the Information Technology and Innovation Foundation, one of the main issues with stricter privacy regulations is having no centralized rules for states to follow.

“Probably the biggest problem is states setting up a set of contradictory overlapping rules across the country,” Castro said. “This creates a serious cost on organizations and businesses. They can abide by 50 state privacy laws, but there could be different regulations across local jurisdictions.”

One example of a hurdle for organizations and businesses is local jurisdictions creating specific rules for facial recognition and biometric technology.

“Let’s say a company starts selling a smart doorbell service; because of these rules, this service might not be able to be legally sold in one jurisdiction,” Castro said.

Another concern relates to the distinction between government data collection and commercial data collection, said Washington state Chief Privacy Officer Katy Ruckle. Sometimes there is a notion that one law can apply to everything, but different data types involve different types of rights for individuals.

“An example I like to use is somebody that’s been committed to a mental health institution for mental health needs,” Ruckle said. “Their data collection is very different from somebody buying a vacuum cleaner off Amazon.”

On the topic of governments collecting data, Castro emphasized the importance of knowing how data will be utilized in order to set appropriate privacy regulations….(More)”

The Battle for Digital Privacy Is Reshaping the Internet


Brian X. Chen at The New York Times: “Apple introduced a pop-up window for iPhones in April that asks people for their permission to be tracked by different apps.

Google recently outlined plans to disable a tracking technology in its Chrome web browser.

And Facebook said last month that hundreds of its engineers were working on a new method of showing ads without relying on people’s personal data.

The developments may seem like technical tinkering, but they were connected to something bigger: an intensifying battle over the future of the internet. The struggle has entangled tech titans, upended Madison Avenue and disrupted small businesses. And it heralds a profound shift in how people’s personal information may be used online, with sweeping implications for the ways that businesses make money digitally.

At the center of the tussle is what has been the internet’s lifeblood: advertising.

More than 20 years ago, the internet drove an upheaval in the advertising industry. It eviscerated newspapers and magazines that had relied on selling classified and print ads, and threatened to dethrone television advertising as the prime way for marketers to reach large audiences….

If personal information is no longer the currency that people give for online content and services, something else must take its place. Media publishers, app makers and e-commerce shops are now exploring different paths to surviving a privacy-conscious internet, in some cases overturning their business models. Many are choosing to make people pay for what they get online by levying subscription fees and other charges instead of using their personal data.

Jeff Green, the chief executive of the Trade Desk, an ad-technology company in Ventura, Calif., that works with major ad agencies, said the behind-the-scenes fight was fundamental to the nature of the web…(More)”

The State of Consumer Data Privacy Laws in the US (And Why It Matters)


Article by Thorin Klosowski at the New York Times: “With more of the things people buy being internet-connected, more of our reviews and recommendations at Wirecutter are including lengthy sections detailing the privacy and security features of such products, everything from smart thermostats to fitness trackers. As the data these devices collect is sold and shared—and hacked—deciding what risks you’re comfortable with is a necessary part of making an informed choice. And those risks vary widely, in part because there’s no single, comprehensive federal law regulating how most companies collect, store, or share customer data.

Most of the data economy underpinning common products and services is invisible to shoppers. As your data gets passed around between countless third parties, there aren’t just more companies profiting from your data, but also more possibilities for your data to be leaked or breached in a way that causes real harm. In just the past year, we’ve seen a news outlet use pseudonymous app data, allegedly leaked from an advertiser associated with the dating app Grindr, to out a priest. We’ve read about the US government buying location data from a prayer app. Researchers have found opioid-addiction treatment apps sharing sensitive data. And T-Mobile recently suffered a data breach that affected at least 40 million people, some of whom had never even had a T-Mobile account.

“We have these companies that are amassing just gigantic amounts of data about each and every one of us, all day, every day,” said Kate Ruane, senior legislative counsel for the First Amendment and consumer privacy at the American Civil Liberties Union. Ruane also pointed out how data ends up being used in surprising ways—intentionally or not—such as in targeting ads or adjusting interest rates based on race. “Your data is being taken and it is being used in ways that are harmful.”

Consumer data privacy laws can give individuals rights to control their data, but if poorly implemented such laws could also maintain the status quo. “We can stop it,” Ruane continued. “We can create a better internet, a better world, that is more privacy protective.”…(More)”

Unpacking China’s game-changing data law


Article by Shen Lu: “China’s National People’s Congress passed the highly anticipated Personal Information Protection Law on Friday, a landmark piece of legislation that will provide Chinese citizens significant privacy protections while also bolstering Beijing’s ambitions to set international norms in data protection.

China’s PIPL is not only key to Beijing’s vision for a next-generation digital economy; it is also likely to influence other countries currently adopting their own data protection laws.

The new law clearly draws inspiration from the European Union’s General Data Protection Regulation, and, like its precursor, is an effort to respond to genuine grassroots demand for a greater right to consumer privacy. But what distinguishes China’s PIPL from the GDPR and other laws on the books is its emphasis on national security, a broadly defined trump card that triggers data localization requirements and cross-border data flow restrictions….

The PIPL reinforces Beijing’s ambition to defend its digital sovereignty. If foreign entities “engage in personal information handling activities that violate the personal information rights and interests of citizens of the People’s Republic of China, or harm the national security or public interest of the People’s Republic of China,” China’s enforcement agencies may blacklist them, “limiting or prohibiting the provision of personal information to them.” And China may reciprocate against countries or regions that adopt “discriminatory prohibitions, limitations or other similar measures against the People’s Republic of China in the area of personal information protection.”…

Many Asian governments are in the process of writing or rewriting data protection laws. Vietnam, India, Pakistan and Sri Lanka have all inserted localization provisions in their respective data protection laws. “[The PIPL framework] can provide encouragement to countries that would be tempted to use the data protection law that includes data transfer provisions to add this national security component,” Girot said.

This new breed of data protection law could lead to a fragmented global privacy landscape. Localization requirements can be a headache for transnational tech companies, particularly cloud service providers. And the CAC, one of the data regulators in charge of implementing and enforcing the PIPL, is also tasked with implementing a national security policy, which could present a challenge to international cooperation….(More)”

AI, big data, and the future of consent


Paper by Adam J. Andreotta, Nin Kirkham & Marco Rizzi: “In this paper, we discuss several problems with current Big data practices which, we claim, seriously erode the role of informed consent as it pertains to the use of personal information. To illustrate these problems, we consider how the notion of informed consent has been understood and operationalised in the ethical regulation of biomedical research (and medical practices, more broadly) and compare this with current Big data practices. We do so by first discussing three types of problems that can impede informed consent with respect to Big data use. First, we discuss the transparency (or explanation) problem. Second, we discuss the repurposed data problem. Third, we discuss the meaningful alternatives problem. In the final section of the paper, we suggest some solutions to these problems. In particular, we propose that the use of personal data for commercial and administrative objectives could be subject to a ‘soft governance’ ethical regulation, akin to the way that all projects involving human participants (e.g., social science projects, human medical data and tissue use) are regulated in Australia through the Human Research Ethics Committees (HRECs). We also consider alternatives to the standard consent forms, and privacy policies, that could make use of some of the latest research focussed on the usability of pictorial legal contracts…(More)”

The Future of Digital Surveillance


Book by Yong Jin Park: “Are humans hard-wired to make good decisions about managing their privacy in an increasingly public world? Or are we helpless victims of surveillance through our use of invasive digital media? Exploring the chasm between the tyranny of surveillance and the ideal of privacy, this book traces the origins of personal data collection in digital technologies including artificial intelligence (AI) embedded in social network sites, search engines, mobile apps, the web, and email. The Future of Digital Surveillance argues against a technologically deterministic view—digital technologies by nature do not cause surveillance. Instead, the shaping of surveillance technologies is embedded in a complex set of individual psychology, institutional behaviors, and policy principles….(More)”

Privacy Tradeoffs: Who Should Make Them, and How?


Paper by Jane R. Bambauer: “Privacy debates are contentious in part because we have not reached a broadly recognized cultural consensus about whether interests in privacy are like most other interests that can be traded off in utilitarian, cost-benefit terms, or if instead privacy is different—fundamental to conceptions of dignity and personal liberty. Thus, at the heart of privacy debates is an unresolved question: is privacy just another interest that can and should be bartered, mined, and used in the economy, or is it different?

This question identifies and isolates a wedge between those who hold essentially utilitarian views of ethics (and who would see many data practices as acceptable) and those who hold views of natural and fundamental rights (for whom common data mining practices are either never acceptable or, at the very least, never acceptable without significant participation and consent of the subject).

This essay provides an intervention of a purely descriptive sort. First, I lay out several candidates for ethical guidelines that might legitimately undergird privacy law and policy. Only one of the ethical models (the natural right to sanctuary) can track the full scope and implications of fundamental rights-based privacy laws like the GDPR.

Second, the project contributes to the field of descriptive ethics by using a vignette experiment to discover which of the various ethical models people actually do seem to hold and abide by. The vignette study uses a factorial design to help isolate the roles of various factors that may contribute to the respondents’ gauge of what an ethical firm should or should not do in the context of personal data use as well as two other non-privacy-related contexts. The results can shed light on whether privacy-related ethics are different and distinct from business ethics more generally. They also illuminate which version(s) of “good” and “bad” share broad support and deserve to be reflected in privacy law or business practice.

The results of the vignette experiment show that on balance, Americans subscribe to some form of utilitarianism, although a substantial minority subscribe to a natural right to sanctuary approach. Thus, consent and prohibitions of data practices are appropriate where the likely risks to some groups (most importantly, data subjects, but also firms and third parties) outweigh the benefits….(More)”

Whose Streets? Our Streets!


Report by Rebecca Williams: “The extent to which “smart city” technology is altering our sense of freedom in public spaces deserves more attention if we want a democratic future. Democracy–the rule of the people–constitutes our collective self-determination and protects us against domination and abuse. Democracy requires safe spaces, or commons, for people to organically and spontaneously convene regardless of their background or position to campaign for their causes, discuss politics, and protest. These commons, where anyone can take a stand and be noticed, are where a notion of the collective good can be developed and communicated. Public spaces, like our streets, parks, and squares, have historically played a significant role in the development of democracy. We should fight to preserve the freedoms intrinsic to our public spaces because they make democracy possible.

Last summer, approximately 15 to 26 million people participated in Black Lives Matter protests after the murder of George Floyd, making it the largest mass movement in U.S. history. In June, the San Diego Police Department obtained footage of Black Lives Matter protesters from “smart streetlight” cameras, sparking shock and outrage from San Diego community members. These “smart streetlights” were promoted as part of citywide efforts to become a “smart city” to help with traffic control and air quality monitoring. Despite discoverable documentation about the streetlights’ capabilities and data policies on the city’s website, including a data-sharing agreement about how data would be shared with the police, the community had no expectation that the streetlights would be surveilling protesters. After media coverage and ongoing advocacy from the Transparent and Responsible Use of Surveillance Technology San Diego (TRUSTSD) coalition, the City Council set aside the funding for the streetlights until a surveillance technology ordinance was considered, and the Mayor ordered the 3,000+ streetlight cameras off. Due to the way power was supplied to the cameras, they remained on, but the city reported it no longer had access to the data they collected. In November, the City Council voted unanimously in favor of a surveillance ordinance and to establish a Privacy Advisory Board. In May, it was revealed that the San Diego Police Department had previously (in 2017) withheld materials from Congress’ House Committee on Oversight and Reform about its use of facial recognition technology. This story, with its mission creep and mishaps, is representative of a broader set of “smart city” cautionary trends that took place in the last year. These cautionary trends call us to question: if our public spaces become places where one fears punishment, how will that affect collective action and political movements?

This report is an urgent warning of where we are headed if we maintain our current trajectory of augmenting our public space with trackers of all kinds. In this report, I outline how current “smart city” technologies can watch you. I argue that all “smart city” technology trends toward corporate and state surveillance, and that if we don’t stop and blunt these trends now, totalitarianism, panopticonism, discrimination, privatization, and solutionism will challenge our democratic possibilities. This report examines these harms through cautionary trends supported by examples from this last year and provides 10 calls to action for advocates, legislatures, and technology companies to prevent these harms. If we act now, we can ensure that the technology in our public spaces protects and promotes democracy and that we do not continue down this path of an elite few tracking the many….(More)”

The Patient, Data Protection and Changing Healthcare Models


Book by Griet Verhenneman on The Impact of e-Health on Informed Consent, Anonymisation and Purpose Limitation: “Healthcare is changing. It is moving to a paperless environment and becoming a team-based, interdisciplinary and patient-centred profession. Modern healthcare models reflect our data-driven economy, and adopt value-driven strategies, evidence-based medicine, new technology, decision support and automated decision-making. Amidst these changes are the patients, and their right to data protection, privacy and autonomy. The question arises of how to match phenomena that characterise the predominant ethos in modern healthcare systems, such as e-health and personalised medicine, to patient autonomy and data protection laws. That matching exercise is essential. The successful adoption of ICT in healthcare depends, at least partly, on how the public’s concerns about data protection and confidentiality are addressed.

Three backbone principles of European data protection law are considered to be bottlenecks for the implementation of modern healthcare systems: informed consent, anonymisation and purpose limitation. This book assesses the adequacy of these principles and considers them in the context of technological and societal evolutions. A must-read for every professional active in the field of data protection law, health law, policy development or IT-driven innovation…(More)”.

Census Data Change to Protect Privacy Rattles Researchers, Minority Groups


Paul Overberg at the Wall Street Journal: “A plan to protect the confidentiality of Americans’ responses to the 2020 census by injecting small, calculated distortions into the results is raising concerns that it will erode their usability for research and distribution of state and federal funds.

The Census Bureau is due to release the first major results of the decennial count in mid-August. They will offer the first detailed look at the population and racial makeup of thousands of counties and cities, as well as tribal areas, neighborhoods, school districts and smaller areas that will be used to redraw congressional, legislative and local districts to balance their populations.

The bureau will adjust most of those statistics to prevent someone from recombining them in a way that would disclose information about an individual respondent. Testing by the bureau shows that improvements in data science, computing power and commercial databases make that feasible.

Last week the bureau’s acting director said the plan was a necessary update of older methods to protect confidentiality. Ron Jarmin said the agency searched for alternatives before settling on differential privacy, a systematic approach to adding statistical noise to data, something it has done in some fashion for years.
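The noise-adding idea behind differential privacy can be sketched in a few lines. The following is a hypothetical illustration of the classic Laplace mechanism with made-up counts and parameters; the bureau's actual disclosure-avoidance system is far more elaborate:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(true_count: int, epsilon: float) -> int:
    # A count query has sensitivity 1: adding or removing one person
    # changes the result by at most 1, so the noise scale is 1/epsilon.
    return round(true_count + laplace_noise(scale=1.0 / epsilon))

# Publishing a block's population of 100 might yield 99, 100, 102, ...
noisy = private_count(100, epsilon=1.0)
```

A smaller epsilon means stronger privacy but noisier published statistics, which is exactly the accuracy-versus-confidentiality trade-off that researchers and minority groups are worried about.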

“I’m pretty confident that it’s going to meet users’ expectations,” Mr. Jarmin said at a panel during an online conference of government data users. “We have to deal with the technology as it is and as it evolves.”…(More)”.