How We Became Our Data


Book by Colin Koopman: “We are now acutely aware, as if all of a sudden, that data matters enormously to how we live. How did information come to be so integral to what we can do? How did we become people who effortlessly present our lives in social media profiles and who are meticulously recorded in state surveillance dossiers and online marketing databases? What is the story behind data coming to matter so much to who we are?


In How We Became Our Data, Colin Koopman excavates early moments of our rapidly accelerating data-tracking technologies and their consequences for how we think of and express our selfhood today. Koopman explores the emergence of mass-scale record-keeping systems like birth certificates and social security numbers, as well as new data techniques for categorizing personality traits, measuring intelligence, and even racializing subjects. This all culminates in what Koopman calls the “informational person” and the “informational power” we are now subject to. The recent explosion of digital technologies that are turning us into a series of algorithmic data points is shown to have a deeper and more turbulent past than we commonly think. Blending philosophy, history, political theory, and media theory in conversation with thinkers like Michel Foucault, Jürgen Habermas, and Friedrich Kittler, Koopman presents an illuminating perspective on how we have come to think of our personhood—and how we can resist its erosion….(More)”.

The Digital Roadmap


Report by the Pathways for Prosperity Commission: “The Digital Roadmap presents an overarching vision for a globally connected world that both delivers on the opportunities presented by technology and limits downside risks. Importantly, it also sets out how this vision can be achieved.

Craft a digital compact for inclusive development

Embracing country-wide digital change will be disruptive. Navigating it requires coordinated action. Reconfiguring an economy will result in some resistance. The best way to achieve buy-in, and to balance trade-offs, is through dialogue between government, the private sector, and civil society in its broadest sense (including community leaders, academia, trade unions, NGOs, and faith groups). The political economy of upheaval is difficult, but change can be managed with discussions that are inclusive of multiple groups. These dialogues should result in a national digital compact: a shared vision of the future to which everyone commits. The Pathways Commission has supported three countries – Ethiopia, Mongolia and South Africa – as they each developed country-wide digital strategies, using the Digital Economy Kit.

Put people at the centre of the digital future

Rapid technological change affects people’s lives. Failure to put people at the centre of social and economic change can lead to social unrest. The pace and intensity of change makes it all the more important that people are at the centre of the digital future – not the technology. This requires equipping people to benefit from opportunities, while also protecting them from the potential harms of the digital age. Governments should take responsibility for ensuring that vocational education is truly useful for workers and for business in the digital age. The private sector needs to be involved in keeping curricula up to date.

Build the digital essentials

Digital products and services cannot be created in a vacuum – essential components need to be in place: physical infrastructure, foundational digital systems (such as digital identification and mobile money), and capital to invest in innovation. These are the basic ingredients needed for existing firms to adopt more productive technologies, and for digital entrepreneurs to build and innovate. Having reliable infrastructure and interoperable systems means that firms and service providers can focus on their core business, without having to build an enabling environment from scratch.

Reach everyone with digital technologies

If technology is to be a force for development for everyone, it must reach everyone. Just over half of the world’s population is connected to a digital life; for the rest, digital opportunities don’t mean much. Without digital connections, people can’t participate in digital work platforms, benefit from new technologies in education, or engage with government services online. Women, people with lower levels of education, and people in poverty are usually those who lack digital access. Reaching everyone requires looking beyond current business models. The private sector needs to design for inclusion, ensuring that the poorest and most marginalised consumers are not left even further behind.

Govern technology for the future

The unprecedented pace of change and the emergence of new risks in the digital era (such as algorithmic bias, cybersecurity, and threats to privacy) are creating headaches for even the most well-resourced countries. For developing countries, the challenges are even bigger. Digital technologies fundamentally shape what people do and how they do it: freelancers may face algorithms that determine their chances of getting hired. Banks might face a financial system with heightened risk from new, non-bank deposit holders. These issues, and many others, require new and adaptive approaches to decision-making. Emerging global norms will need to consider the needs of developing countries….(More)”.

An Open Letter to Law School Deans about Privacy Law Education in Law Schools


Daniel Solove: “Recently a group of legal academics and practitioners in the field of privacy law sent a letter to the deans of all U.S. law schools about privacy law education in law schools. My own brief intro about this endeavor is here in italics, followed by the letter. The signatories signed onto the letter itself, not this italicized intro.

Although the field of privacy law has grown dramatically in the past two decades, education in law schools about privacy law has significantly lagged behind. Most U.S. law schools lack a course on privacy law. Of those that have courses, many are small seminars, often taught by adjuncts. Of the law schools that do have a privacy course, most have just one. Most schools lack a full-time faculty member who focuses substantially on privacy law.

This state of affairs is a great detriment to students. I am constantly approached by students and graduates from law schools across the country who are wondering how they can learn about privacy law and enter the field. Many express great disappointment at the lack of any courses, faculty, or activities at their schools.

After years of hoping that the legal academy would wake up and respond, I came to the realization that this wasn’t going to happen on its own. The following letter [click here for the PDF version] aims to make deans aware of the privacy law field. I hope that the letter is met with action….(More)”.

Mayor de Blasio Signs Executive Order to Establish Algorithms Management and Policy Officer


Press release: “Mayor Bill de Blasio today signed an Executive Order to establish an Algorithms Management and Policy Officer within the Mayor’s Office of Operations. The Officer will serve as a centralized resource on algorithm policy and develop guidelines and best practices to assist City agencies in their use of algorithms to make decisions. The new Officer will ensure relevant algorithms used by the City to deliver services promote equity, fairness and accountability. The creation of the position follows review of the recommendations from the Automated Decision Systems (ADS) Task Force Report required by Local Law 49 of 2018, published here.

“Fairness and equity are central to improving the lives of New Yorkers,” said Mayor Bill de Blasio. “With every new technology comes added responsibility, and I look forward to welcoming an Algorithms Management and Policy Officer to my team to ensure the tools we use to make decisions are fair and transparent.”…

The Algorithms Management and Policy Officer will develop guidelines and best practices to assist City agencies in their use of tools or systems that rely on algorithms and related technologies to support decision-making. As part of that effort, the Officer and their personnel support will develop processes for agency reporting and provide resources that will help the public learn more about how New York City government uses algorithms to make decisions and deliver services….(More)”.

AI For Good Is Often Bad


Mark Latonero at Wired: “….Within the last few years, a number of tech companies, from Google to Huawei, have launched their own programs under the AI for Good banner. They deploy technologies like machine-learning algorithms to address critical issues like crime, poverty, hunger, and disease. In May, French president Emmanuel Macron invited about 60 leaders of AI-driven companies, like Facebook’s Mark Zuckerberg, to a Tech for Good Summit in Paris. The same month, the United Nations in Geneva hosted its third annual AI for Global Good Summit sponsored by XPrize. (Disclosure: I have spoken at it twice.) A recent McKinsey report on AI for Social Good provides an analysis of 160 current cases claiming to use AI to address the world’s most pressing and intractable problems.

While AI for Good programs often warrant genuine excitement, they should also invite increased scrutiny. Good intentions are not enough when it comes to deploying AI for those in greatest need. In fact, the fanfare around these projects smacks of tech solutionism, which can mask root causes and the risks of experimenting with AI on vulnerable people without appropriate safeguards.

Tech companies that set out to develop a tool for the common good, not only their self-interest, soon face a dilemma: They lack the expertise in the intractable social and humanitarian issues facing much of the world. That’s why companies like Intel have partnered with National Geographic and the Leonardo DiCaprio Foundation on wildlife trafficking. And why Facebook partnered with the Red Cross to find missing people after disasters. IBM’s social-good program alone boasts 19 partnerships with NGOs and government agencies. Partnerships are smart. The last thing society needs is for engineers in enclaves like Silicon Valley to deploy AI tools for global problems they know little about….(More)”.

Meaningfully Engaging Youth in the Governance of the Global Refugee System


Bushra Ebadi at the World Refugee Council Research Paper Series: “Young people aged 15 to 35 comprise one-third of the world’s population, yet they are largely absent from decision-making fora and, as such, unaccounted for in policy making, programming and laws. The disenfranchisement of displaced youth is a particular problem, because it further marginalizes young people who have already experienced persecution and been forcibly displaced.

This paper aims to demonstrate the importance of including displaced youth in governance and decision making, to identify key barriers to engagement that displaced youth face, and to highlight effective strategies for engaging youth. Comprehensive financial, legal, social and governance reforms are needed in order to facilitate and support the meaningful engagement of youth in the refugee and IDP systems. Without these reforms and partnerships between youth and other diverse stakeholders, it will be difficult to achieve sustainable solutions for forcibly displaced populations and the communities that host them….(More)”.

The Trace


About: “The Trace is an independent, nonpartisan, nonprofit newsroom dedicated to shining a light on America’s gun violence crisis….

Every year in our country, a firearm is used in nearly 500,000 crimes, resulting in the deaths and injuries of more than 110,000 people. Shootings devastate families and communities and drain billions of dollars from local, state, and federal governments. Meanwhile, the problem of gun violence has been compounded by another: the shortage of knowledge about the issue…

Data and records are shielded from public view—or don’t exist. Gun-lobby backed restrictions on federal gun violence research deprive policymakers and public health experts of potentially life-saving facts. Other laws limit the information that law enforcement agencies can share on illegal guns and curb litigation that could allow scrutiny of industry practices….

We make the problem clear. In partnership with Slate, we built an eye-opening, interactive map plotting the locations of nearly 40,000 incidents of gun violence nationwide. The feature received millions of pageviews and generated extensive local coverage and social media conversation. “So many shootings and deaths, so close to my home,” wrote one reader. “And I hadn’t even heard about most of them.”…(More)”.

Thinking About the Commons


Carol M. Rose at the International Journal of the Commons: “This article, originally a speech at the conference Leçons de Droit Comparé sur les Communs, Sciences-Po, Paris, explores current developments in theoretical thinking about the commons. It keys off contemporary reconsiderations of Garrett Hardin’s “Tragedy of the Commons” and Elinor Ostrom’s response to Hardin in Governing the Commons and later work.

Ostrom was among the best-known critics of Hardin’s idea of a “tragedy,” but Ostrom’s own work has also raised some questions in more recent commons literature. One key question is the very uncertain relationship between community-based resource control and democratic rights. A second key question revolves around the understanding of commons, on the one hand, as limited common regimes, central to Ostrom’s work, or, on the other, as open access, as espoused by more recent advocates of widespread access to information and communications networks….(More)”.

Americans’ views about privacy, surveillance and data-sharing


Pew Research Center: “In key ways, today’s digitally networked society runs on quid pro quos: People exchange details about themselves and their activities for services and products on the web or apps. Many are willing to accept the deals they are offered in return for sharing insight about their purchases, behaviors and social lives. At times, their personal information is collected by government on the grounds that there are benefits to public safety and security.

A majority of Americans are concerned about this collection and use of their data, according to a new report from Pew Research Center….

Americans vary in their attitudes toward data-sharing in the pursuit of public good. Though many Americans don’t think they benefit much from the collection of their data, and they find that the potential risks of this practice outweigh the benefits, there are some scenarios in which the public is more likely to accept the idea of data-sharing. In line with findings in a 2015 Center survey showing that some Americans are comfortable with trade-offs in sharing data, about half of U.S. adults (49%) say it is acceptable for the government to collect data about all Americans in order to assess potential terrorist threats. That compares with 31% who feel it is unacceptable to collect data about all Americans for that purpose. By contrast, just one-quarter say it is acceptable for smart speaker makers to share users’ audio recordings with law enforcement to help with criminal investigations, versus 49% who find that unacceptable….(More)”.

Decision-making in the Age of the Algorithm


Paper by Thea Snow: “Frontline practitioners in the public sector – from social workers to police to custody officers – make important decisions every day about people’s lives. Operating in the context of a sector grappling with how to manage rising demand, coupled with diminishing resources, frontline practitioners are being asked to make very important decisions quickly and with limited information. To do this, public sector organisations are turning to new technologies to support decision-making, in particular, predictive analytics tools, which use machine learning algorithms to discover patterns in data and make predictions.

While many guides exist around ethical AI design, there is little guidance on how to support a productive human-machine interaction in relation to AI. This report aims to fill this gap by focusing on the issue of human-machine interaction. How people are working with tools is significant because, simply put, for predictive analytics tools to be effective, frontline practitioners need to use them well. It encourages public sector organisations to think about how people feel about predictive analytics tools – what they’re fearful of, what they’re excited about, what they don’t understand.

Based on insights drawn from an extensive literature review, interviews with frontline practitioners, and discussions with experts across a range of fields, the guide also identifies three key principles that play a significant role in supporting a constructive human-machine relationship: context, understanding, and agency….(More)”.