Denmark is appointing an ambassador to big tech


Matthew Hughes in The Next Web: “Question: Is Facebook a country? It sounds silly, but when you think about it, it does have many attributes in common with nation states. For starters, it’s got a population that’s bigger than that of India, and its 2016 revenue wasn’t too far from Estonia’s GDP. It also has a ‘national ethos’. If America’s philosophy is capitalism, Cuba’s is communism, and Sweden’s is social democracy, Facebook’s is ‘togetherness’, as corny as that may sound.

 Given all of the above, is it really any surprise that Denmark is considering appointing a ‘big tech ambassador’ whose job is to establish and manage the country’s relationship with the world’s most powerful tech companies?

Denmark’s “digital ambassador” is a first. No country has ever created such a role. The ambassador’s job will be to liaise with the likes of Google, Twitter, and Facebook.

Given the fraught relationship many European countries have with American big-tech – especially on issues of taxation, privacy, and national security – Denmark’s decision to extend an olive branch seems sensible.

Speaking with the Washington Post, Danish Foreign Minister Anders Samuelsen said, “just as we engage in a diplomatic dialogue with countries, we also need to establish and prioritize comprehensive relations with tech actors, such as Google, Facebook, Apple and so on. The idea is, we see a lot of companies and new technologies that will in many ways involve and be part of everyday life of citizens in Denmark.”….(More)”

Data Disrupts Corruption


Carlos Santiso & Ben Roseth at Stanford Social Innovation Review: “…The Panama Papers scandal demonstrates the power of data analytics to uncover corruption in a world flooded with terabytes needing only the computing capacity to make sense of it all. The Rousseff impeachment illustrates how open data can be used to bring leaders to account. Together, these stories show how data, both “big” and “open,” is driving the fight against corruption with fast-paced, evidence-driven, crowd-sourced efforts. Open data can put vast quantities of information into the hands of countless watchdogs and whistleblowers. Big data can turn that information into insight, making corruption easier to identify, trace, and predict. To realize the movement’s full potential, technologists, activists, officials, and citizens must redouble their efforts to integrate data analytics into policy making and government institutions….

Making big data open cannot, in itself, drive anticorruption efforts. “Without analytics,” a 2014 White House report on big data and individual privacy underscored, “big datasets could be stored, and they could be retrieved, wholly or selectively. But what comes out would be exactly what went in.”

In this context, it is useful to distinguish the four main stages of data analytics to illustrate its potential in the global fight against corruption: Descriptive analytics uses data to describe what has happened in analyzing complex policy issues; diagnostic analytics goes a step further by mining and triangulating data to explain why a specific policy problem has happened, identify its root causes, and decipher underlying structural trends; predictive analytics uses data and algorithms to predict what is most likely to occur, by utilizing machine learning; and prescriptive analytics proposes what should be done to cause or prevent something from happening….

Despite the big data movement’s promise for fighting corruption, many challenges remain. The smart use of open and big data should focus not only on uncovering corruption, but also on better understanding its underlying causes and preventing its recurrence. Anticorruption analytics cannot exist in a vacuum; it must fit in a strategic institutional framework that starts with quality information and leads to reform. Even the most sophisticated technologies and data innovations cannot prevent what French novelist Théophile Gautier described as the “inexplicable attraction of corruption, even amongst the most honest souls.” Unless it is harnessed for improvements in governance and institutions, data analytics will not have the impact that it could, nor be sustainable in the long run…(More)”.
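The four analytics stages described in the excerpt above (descriptive, diagnostic, predictive, prescriptive) can be made concrete with a minimal sketch. Everything below is hypothetical: the procurement records, field names, and the single-bid risk rule are invented for illustration and stand in for the far richer models real anticorruption analytics would use.

```python
# Toy illustration of the four stages of data analytics applied to
# hypothetical procurement records. All data, field names, and
# thresholds are invented for illustration only.

records = [
    {"vendor": "A", "agency": "roads",  "amount": 100, "single_bid": False},
    {"vendor": "B", "agency": "roads",  "amount": 480, "single_bid": True},
    {"vendor": "B", "agency": "health", "amount": 460, "single_bid": True},
    {"vendor": "C", "agency": "health", "amount": 120, "single_bid": False},
]

# 1. Descriptive: what happened? Total and average contract value.
total = sum(r["amount"] for r in records)
average = total / len(records)

# 2. Diagnostic: why? Compare single-bid awards against competitive ones;
# a large gap hints at a structural problem worth investigating.
single = [r["amount"] for r in records if r["single_bid"]]
competitive = [r["amount"] for r in records if not r["single_bid"]]
gap = (sum(single) / len(single)) / (sum(competitive) / len(competitive))

# 3. Predictive: which awards look risky? A naive rule stands in here
# for a trained machine-learning model.
def risk_score(record):
    return 1.0 if record["single_bid"] and record["amount"] > average else 0.0

# 4. Prescriptive: what should be done? Route risky awards to audit.
to_audit = [r["vendor"] for r in records if risk_score(r) == 1.0]

print(total, round(gap, 1), to_audit)
```

The point of the sketch is the progression, not the arithmetic: each stage consumes the output of the previous one, which is why the excerpt stresses that analytics must sit inside an institutional framework that starts with quality data and ends in action.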

Social Media for Government


Book by Gohar Feroz Khan: “This book provides practical know-how on understanding, implementing, and managing mainstream social media tools (e.g., blogs and micro-blogs, social network sites, and content communities) from a public sector perspective. Through social media, government organizations can inform citizens, promote their services, seek public views and feedback, and monitor satisfaction with the services they offer so as to improve their quality. Given the exponential growth of social media in contemporary society, it has become an essential tool for communication, content sharing, and collaboration. This growth and these tools also present an unparalleled opportunity to implement a transparent, open, and collaborative government. However, many government organizations, particularly those in the developing world, are still somewhat reluctant to leverage social media, as it requires significant policy and governance changes, as well as specific know-how, skills and resources to plan, implement and manage social media tools. As a result, governments around the world ignore or mishandle the opportunities and threats presented by social media. To help policy makers and governments implement a social media-driven government, this book provides guidance in developing an effective social media policy and strategy. It also addresses issues such as those related to security and privacy….(More)”

Protecting One’s Own Privacy in a Big Data Economy


Anita L. Allen in the Harvard Law Review Forum: “Big Data is the vast quantities of information amenable to large-scale collection, storage, and analysis. Using such data, companies and researchers can deploy complex algorithms and artificial intelligence technologies to reveal otherwise unascertained patterns, links, behaviors, trends, identities, and practical knowledge. The information that comprises Big Data arises from government and business practices, consumer transactions, and the digital applications sometimes referred to as the “Internet of Things.” Individuals invisibly contribute to Big Data whenever they live digital lifestyles or otherwise participate in the digital economy, such as when they shop with a credit card, get treated at a hospital, apply for a job online, research a topic on Google, or post on Facebook.

Privacy advocates and civil libertarians say Big Data amounts to digital surveillance that potentially results in unwanted personal disclosures, identity theft, and discrimination in contexts such as employment, housing, and financial services. These advocates and activists say typical consumers and internet users do not understand the extent to which their activities generate data that is being collected, analyzed, and put to use for varied governmental and business purposes.

I have argued elsewhere that individuals have a moral obligation to respect not only other people’s privacy but also their own. Here, I wish to comment first on whether the notion that individuals have a moral obligation to protect their own information privacy is rendered utterly implausible by current and likely future Big Data practices; and on whether a conception of an ethical duty to self-help in the Big Data context may be more pragmatically framed as a duty to be part of collective actions encouraging business and government to adopt more robust privacy protections and data security measures….(More)”

Empirical data on the privacy paradox


Benjamin Wittes and Emma Kohse at Brookings: “The contemporary debate about the effects of new technology on individual privacy centers on the idea that privacy is an eroding value. The erosion is ongoing, driven by governments and big corporations that collect data on us all: In the consumer space, technology and the companies that create it erode privacy, as consumers trade away their solitude either unknowingly or in exchange for convenience and efficiency.

On January 13, we released a Brookings paper that challenges this idea. In the paper, entitled “The Privacy Paradox II: Measuring the Privacy Benefits of Privacy Threats,” we try to measure the extent to which this focus ignores the significant privacy benefits of the technologies that concern privacy advocates. And we conclude that quantifiable effects in consumer behavior strongly support the reality of these benefits.

In 2015, one of us, writing with Jodie Liu, laid out the basic idea in a paper published by Brookings called “The Privacy Paradox: the Privacy Benefits of Privacy Threats.” (The title, incidentally, became the name of Lawfare’s privacy-oriented subsidiary page.) Individuals, we argued, might be more concerned with keeping private information from specific people—friends, neighbors, parents, or even store clerks—than from large, remote corporations, and they might actively prefer to give information to remote corporations as a way of shielding it from those immediately around them. By failing to associate this concern with the concept of privacy, academic and public debates tend to ignore countervailing privacy benefits associated with privacy threats, and thereby keep score in a way biased toward the threats side of the ledger.

To cite a few examples, an individual may choose to use a Kindle e-reader to read Fifty Shades of Grey precisely because she values the privacy benefit of hiding her book choice from the eyes of people on the bus or the store clerk at the book store, rather than for reasons of mere convenience. This privacy benefit, for many consumers, can outweigh the privacy concern presented by Amazon’s data mining. At the very least, the privacy benefits of the Kindle should enter into the discussion.

In this paper, we tried to begin the task of measuring that effect and testing the reasoning that supported the thesis of “The Privacy Paradox,” using Google Surveys, an online survey tool….(More)”.

The Signal Code


The Signal Code: “Humanitarian action adheres to the core humanitarian principles of impartiality, neutrality, independence, and humanity, as well as respect for international humanitarian and human rights law. These foundational principles are enshrined within core humanitarian doctrine, particularly the Red Cross/NGO Code of Conduct and the Humanitarian Charter. Together, these principles establish a duty of care for populations affected by the actions of humanitarian actors and impose adherence to a standard of reasonable care for those engaged in humanitarian action.

Engagement in HIAs, including the use of data and ICTs, must be consistent with these foundational principles and respect the human rights of crisis-affected people to be considered “humanitarian.” In addition to offering potential benefits to those affected by crisis, HIAs, including the use of ICTs, can cause harm to the safety, wellbeing, and the realization of the human rights of crisis-affected people. Absent a clear understanding of which rights apply to this context, the utilization of new technologies, and in particular experimental applications of these technologies, may be more likely to harm communities and violate the fundamental human rights of individuals.

The Signal Code is based on the application of the UDHR, the Nuremberg Code, the Geneva Convention, and other instruments of customary international law related to HIAs and the use of ICTs by crisis affected-populations and by humanitarians on their behalf. The fundamental human rights undergirding this Code are the rights to life, liberty, and security; the protection of privacy; freedom of expression; and the right to share in scientific advancement and its benefits as expressed in Articles 3, 12, 19, and 27 of the UDHR.

The Signal Code asserts that all people have fundamental rights to access, transmit, and benefit from information as a basic humanitarian need; to be protected from harms that may result from the provision of information during crisis; to have a reasonable expectation of privacy and data security; to have agency over how their data is collected and used; and to seek redress and rectification when data pertaining to them causes harm or is inaccurate.

These rights are found to apply specifically to the access, collection, generation, processing, use, treatment, and transmission of information, including data, during humanitarian crises. These rights are also found herein to be interrelated and interdependent. To realize any of these rights individually requires realization of all of these rights in concert.

These rights are found to apply to all phases of the data lifecycle—before, during, and after the collection, processing, transmission, storage, or release of data. These rights are also found to be elastic, meaning that they apply to new technologies and scenarios that have not yet been identified or encountered by current practice and theory.

Data is, formally, a collection of symbols that functions as a representation of information or knowledge. The term raw data is often used with two different meanings: the first is uncleaned data, that is, data that has been collected in an uncontrolled environment; the second is unprocessed data, that is, collected data that has not been processed in such a way as to make it suitable for decision making. Colloquially, and in the humanitarian context, data is usually thought of solely in the machine-readable or digital sense. For the purposes of the Signal Code, we use the term data to encompass information in both its analog and digital representations. Where it is necessary to address data solely in its digital representation, we refer to it as digital data.

No right herein may be used to abridge any other right. Nothing in this code may be interpreted as giving any state, group, or person the right to engage in any activity or perform any act that destroys the rights described herein.

The five human rights that exist specific to information and HIAs during humanitarian crises are the following:

The Right to Information
The Right to Protection
The Right to Data Security and Privacy
The Right to Data Agency
The Right to Redress and Rectification…(More)”

Data capitalism is cashing in on our privacy . . . for now


John Thornhill in the Financial Times: “The buzz at last week’s Consumer Electronics Show in Las Vegas was all about connectivity and machine learning. …The primary effect of these consumer tech products seems limited — but we will need to pay increasing attention to the secondary consequences of these connected devices. They are just the most visible manifestation of a fundamental transformation that is likely to shape our societies far more than Brexit, Donald Trump or squabbles over the South China Sea. It concerns who collects, owns and uses data. The subject of data is so antiseptic that it seldom generates excitement. To make it sound sexy, some have described data as the “new oil”, fuelling our digital economies. In reality, it is likely to prove far more significant than that. Data are increasingly determining economic value, reshaping the practice of power and intruding into the innermost areas of our lives. Some commentators have suggested that this transformation is so profound that we are moving from an era of financial capitalism into one of data capitalism. The Israeli historian Yuval Noah Harari even argues that Dataism, as he calls it, can be compared with the birth of a religion, given the claims of its most fervent disciples to provide universal solutions. …

Sir Nigel Shadbolt, co-founder of the Open Data Institute, argues in a recent FT TechTonic podcast that it is too early to give up on privacy…The next impending revolution, he argues, will be about giving consumers control over their data. Considering the increasing processing power and memory capacity of smartphones, he believes new models of data collection and more localised use may soon gain traction. One example is the Blue Button service used by US veterans, which allows individuals to maintain and update their medical records. “That has turned out to be a really revolutionary step,” he says. “I think we are going to see a lot more of that kind of re-empowering.” According to this view, we can use data to create a far smarter world without sacrificing precious rights. If we truly believe in such a benign future, we had better hurry up and invent it….(More)”

Chasing Shadows: Visions of Our Coming Transparent World


A science fiction and tech-vision anthology “about the coming era of transparency in the information age” edited by David Brin & Stephen W. Potts: “Young people log their lives with hourly True Confessions. Cops wear lapel-cams and spy agencies peer at us — and face defections and whistleblowers. Bank records leak and “uncrackable” firewalls topple. As we debate internet privacy, revenge porn, the NSA, and Edward Snowden, cameras get smaller, faster, and more numerous.

Has Orwell’s Big Brother finally come to pass? Or have we become a global society of thousands of Little Brothers — watching, judging, and reporting on one another?

Partnering with the Arthur C. Clarke Center for Human Imagination, and inspired by Brin’s nonfiction book, The Transparent Society: Will Technology Make Us Choose Between Privacy and Freedom?, noted author and futurist David Brin and scholar Stephen W. Potts have compiled essays and short stories from writers such as Robert J. Sawyer, James Morrow, William Gibson, Damon Knight, Jack McDevitt, and many others to examine the benefits and pitfalls of technological transparency in all its permutations.

Among the many questions…

  • Do we answer surveillance with sousveillance, shining accountability upward?
  • Will we spiral into busybody judgmentalism? Or might people choose to leave each other alone?
  • Will empathy increase or decrease in a more transparent world?
  • What if we could own our information, and buy and sell it in a web bazaar?…(More)”

Notable Privacy and Security Books from 2016


Daniel J. Solove at Technology, Academics, Policy: “Here are some notable books on privacy and security from 2016….

Chris Jay Hoofnagle, Federal Trade Commission Privacy Law and Policy

From my blurb: “Chris Hoofnagle has written the definitive book about the FTC’s involvement in privacy and security. This is a deep, thorough, erudite, clear, and insightful work – one of the very best books on privacy and security.”

My interview with Hoofnagle about his book: The 5 Things Every Privacy Lawyer Needs to Know about the FTC: An Interview with Chris Hoofnagle

My further thoughts on the book in my interview post above: “This is a book that all privacy and cybersecurity lawyers should have on their shelves. The book is the most comprehensive scholarly discussion of the FTC’s activities in these areas, and it also delves deep into the FTC’s history and activities in other areas to provide much-needed context to understand how it functions and reasons in privacy and security cases. There is simply no better resource on the FTC and privacy. This is a great book and a must-read. It is filled with countless fascinating things that will surprise you about the FTC, which has quite a rich and storied history. And it is an accessible and lively read too – Chris really makes the issues come alive.”

Gary T. Marx, Windows into the Soul: Surveillance and Society in an Age of High Technology

From Peter Grabosky: “The first word that came to mind while reading this book was cornucopia. After decades of research on surveillance, Gary Marx has delivered an abundant harvest indeed. The book is much more than a straightforward treatise. It borders on the encyclopedic, and is literally overflowing with ideas, observations, and analyses. Windows into the Soul commands the attention of anyone interested in surveillance, past, present, and future. The book’s website contains a rich abundance of complementary material. An additional chapter consists of an intellectual autobiography discussing the author’s interest in, and personal experience with, surveillance over the course of his career. Because of its extraordinary breadth, the book should appeal to a wide readership…. it will be of interest to scholars of deviance and social control, cultural studies, criminal justice and criminology. But the book should be read well beyond the towers of academe. The security industry, broadly defined to include private security and intelligence companies as well as state law enforcement and intelligence agencies, would benefit from the book’s insights. So too should it be read by those in the information technology industries, including the manufacturers of the devices and applications which are central to contemporary surveillance, and which are shaping our future.”

Susan C. Lawrence, Privacy and the Past: Research, Law, Archives, Ethics

From the book blurb: “When the new HIPAA privacy rules regarding the release of health information took effect, medical historians suddenly faced a raft of new ethical and legal challenges—even in cases where their subjects had died years, or even a century, earlier. In Privacy and the Past, medical historian Susan C. Lawrence explores the impact of these new privacy rules, offering insight into what historians should do when they research, write about, and name real people in their work.”

Ronald J. Krotoszynski, Privacy Revisited: A Global Perspective on the Right to Be Left Alone

From Mark Tushnet: “Professor Krotoszynski provides a valuable overview of how several constitutional systems accommodate competing interests in privacy, speech, and democracy. He shows how scholarship in comparative law can help one think about one’s own legal system while remaining sensitive to the different cultural and institutional settings of each nation’s law. A very useful contribution.”

Laura K. Donohue, The Future of Foreign Intelligence: Privacy and Surveillance in a Digital Age

Gordon Corera, Cyberspies: The Secret History of Surveillance, Hacking, and Digital Espionage

J. Macgregor Wise, Surveillance and Film…(More; See also Nonfiction Privacy + Security Books).

Beyond IRBs: Designing Ethical Review Processes for Big Data Research


Conference Proceedings by Future of Privacy Forum: “The ethical framework applying to human subject research in the biomedical and behavioral research fields dates back to the Belmont Report. Drafted in 1976 and adopted by the United States government in 1991 as the Common Rule, the Belmont principles were geared towards a paradigmatic controlled scientific experiment with a limited population of human subjects interacting directly with researchers and manifesting their informed consent. These days, researchers in academic institutions, as well as private sector businesses not subject to the Common Rule, conduct analysis of a wide array of data sources, from massive commercial or government databases to individual tweets or Facebook postings publicly available online, with little or no opportunity to directly engage human subjects to obtain their consent or even inform them of research activities.

Data analysis is now used in multiple contexts, such as combatting fraud in the payment card industry, reducing the time commuters spend on the road, detecting harmful drug interactions, improving marketing mechanisms, personalizing the delivery of education in K-12 schools, encouraging exercise and weight loss, and much more. And companies deploy data research not only to maximize economic gain but also to test new products and services to ensure they are safe and effective. These data uses promise tremendous societal benefits but at the same time create new risks to privacy, fairness, due process and other civil liberties.

Increasingly, corporate officers find themselves struggling to navigate unsettled social norms and make ethical choices that are more befitting of philosophers than business managers or even lawyers. The ethical dilemmas arising from data analysis transcend privacy and trigger concerns about stigmatization, discrimination, human subject research, algorithmic decision making and filter bubbles.

The challenge of fitting the round peg of data-focused research into the square hole of existing ethical and legal frameworks will determine whether society can reap the tremendous opportunities hidden in the data exhaust of governments and cities, health care institutions and schools, social networks and search engines, while at the same time protecting privacy, fairness, equality and the integrity of the scientific process. One commentator called this “the biggest civil rights issue of our time.”…(More)”