Shareveillance: Subjectivity between open and closed data


Clare Birchall in Big Data and Society: “This article attempts to question modes of sharing and watching to rethink political subjectivity beyond that which is enabled and enforced by the current data regime. It identifies and examines a ‘shareveillant’ subjectivity: a form configured by the sharing and watching that subjects have to withstand and enact in the contemporary data assemblage. Looking at government open and closed data as case studies, this article demonstrates how ‘shareveillance’ produces an anti-political role for the public. In describing shareveillance as, after Jacques Rancière, a distribution of the (digital) sensible, this article posits a politico-ethical injunction to cut into the share and flow of data in order to arrange a more enabling assemblage of data and its affects. In order to interrupt shareveillance, this article borrows a concept from Édouard Glissant and his concern with raced otherness to imagine what a ‘right to opacity’ might mean in the digital context. To assert this right is not to endorse the individual subject in her sovereignty and solitude, but rather to imagine a collective political subjectivity and relationality according to the important question of what it means to ‘share well’ beyond the veillant expectations of the state.

Two questions dominate current debates at the intersection of privacy, governance, security, and transparency: How much, and what kind of data should citizens have to share with surveillant states? And: How much data from government departments should states share with citizens? Yet, these issues are rarely expressed in terms of ‘sharing’ in the way that I will be doing in this article. More often, when thought in tandem with the digital, ‘sharing’ is used in reference to either free trials of software (‘shareware’); the practice of peer-to-peer file sharing; platforms that facilitate the pooling, borrowing, swapping, renting, or selling of resources, skills, and assets that have come to be known as the ‘sharing economy’; or the business of linking and liking on social media, which invites us to share our feelings, preferences, thoughts, interests, photographs, articles, and web links. Sharing in the digital context has been framed as a form of exchange, then, but also communication and distribution (see John, 2013; Wittel, 2011).

In order to understand the politics of open and opaque government data practices, which either share with citizens or ask citizens to share, I will extend existing commentaries on the distributive qualities of sharing by drawing on Jacques Rancière’s notion of the ‘distribution of the sensible’ (2004a) – a settlement that determines what is visible, audible, sayable, knowable and what share or role we each have within it. In the process, I articulate ‘sharing’ with ‘veillance’ (veiller ‘to watch’ is from the Latin vigilare, from vigil, ‘watchful’) to turn the focus from prevalent ways of understanding digital sharing towards a form of contemporary subjectivity. What I call ‘shareveillance’ – a state in which we are always already sharing; indeed, in which any relationship with data is only made possible through a conditional idea of sharing – produces an anti-politicised public caught between different data practices.

I will argue that both open and opaque government data initiatives involve, albeit differently pitched, forms of sharing and veillance. Government practices that share data with citizens involve veillance because they call on citizens to monitor and act upon that data – we are envisioned (‘veiled’ and hailed) as auditing and entrepreneurial subjects. Citizens have to monitor the state’s data, that is, or they are expected to innovate with it and make it profitable. Data sharing therefore apportions responsibility without power. It watches citizens watching the state, delimiting the ways in which citizens can engage with that data and, therefore, the scope of the political per se….(More)”.

Information Isn’t Just Power


Review by Lucy Bernholz in the Stanford Social Innovation Review: “Information is power.” This truism pervades Missed Information, an effort by two scientists to examine the role that information now plays as the raw material of modern scholarship, public policy, and institutional behavior. The scholars—David Sarokin, an environmental scientist for the US government, and Jay Schulkin, a research professor of neuroscience at Georgetown University—make this basic case convincingly. In its ever-present, digital, and networked form, data doesn’t just shape government policies and actions—it also creates its own host of controversies. Government policies about collecting, storing, and analyzing information fuel protests and political lobbying, opposing movements for openness and surveillance, and individual acts seen as both treason and heroism. The very fact that two scholars from such different fields are collaborating on this subject is evidence that digitized information has become the lingua franca of present-day affairs.

To Sarokin and Schulkin, the main downside to all this newly available information is that it creates an imbalance of power in who can access and control it. Governments and businesses have visibility into the lives of citizens and customers that is not reciprocated. The US government knows our every move, but we know what our government is doing only when a whistleblower tells us. Businesses have ever more data and ever-finer ways to sort and sift it, yet customers know next to nothing about what is being done with it.

The authors argue, however, that new digital networks also provide opportunities to recalibrate the balance of information and return some power to ordinary citizens. These negotiations are under way all around us. Our current political debates about security versus privacy, and the nature and scope of government transparency, show how the lines of control between governments and the governed are being redrawn. In health care, consumers, advocates, and public policymakers are starting to create online ratings of hospitals, doctors, and the costs of medical procedures. The traditional one-way street of corporate annual reporting is being supplemented by consumer ratings, customer feedback loops, and new information about supply chains and environmental and social factors. Sarokin and Schulkin go to great lengths to show the potential of tools such as comparison guides for patients or sustainability indices for shoppers to enable more informed user decisions.

This argument is important, but it is incomplete. The book’s title, Missed Information, refers to “information that is unintentionally (for the most part) overlooked in the decision-making process—overlooked both by those who provide information and by those who use it.” What is missing from the book, ironically, is a compelling discussion of why this “missed information” is missing. ….

Grouping the book with others of the “Big Data Will Save Us” genre isn’t entirely fair. Sarokin and Schulkin go to great lengths to point out how much of the information we collect is never used for anything, good or bad….(More)”

Is Open Data the Death of FOIA?


Beth Noveck at the Yale Law Journal: “For fifty years, the Freedom of Information Act (FOIA) has been the platinum standard for open government in the United States. The statute is considered the legal bedrock of the public’s right to know about the workings of our government. More than one hundred countries and all fifty states have enacted their own freedom of information laws. At the same time, FOIA’s many limitations have also become evident: a cumbersome process, delays in responses, and redactions that frustrate journalists and other information seekers. Politically motivated nuisance requests bedevil government agencies. With over 700,000 FOIA requests filed every year, the federal government faces the costs of a mounting backlog.

In recent years, however, an entirely different approach to government transparency in line with the era of big data has emerged: open government data. Open government data – generally shortened to open data – has many definitions but is generally considered to be publicly available information that can be universally and readily accessed, used, and redistributed free of charge in digital form. Open data is not limited to statistics, but also includes text such as the United States Federal Register, the daily newspaper of government, which was released as open data in bulk form in 2010.
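To make bulk, machine-readable government text concrete: the short sketch below (not from Noveck's essay) queries the Federal Register's public JSON API for recent documents. The endpoint, parameters, and field names follow the API's public documentation, but treat them as assumptions to verify.

```python
# Sketch (not from Noveck's essay): querying the US Federal Register
# as open data. Endpoint and parameters are assumptions based on the
# public API documentation at https://www.federalregister.gov/developers
import requests

def recent_documents(term, per_page=5):
    """Fetch recent Federal Register documents matching a search term."""
    resp = requests.get(
        "https://www.federalregister.gov/api/v1/documents.json",
        params={
            "conditions[term]": term,  # full-text search term
            "per_page": per_page,
            "order": "newest",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

for doc in recent_documents("freedom of information"):
    print(doc.get("publication_date"), doc.get("title"))
```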

To understand how significant the open data movement is for FOIA, this Essay discusses the impact of open data on the institutions and functions of government and the ways open data contrasts markedly with FOIA. Open data emphasizes the proactive publication of whole classes of information. Open data includes data about the workings of government but also data collected by the government about the economy and society posted online in a centralized repository for use by the wider public, including academic users seeking information as the basis for original research and commercial users looking to create new products and services. For example, Pixar used open data from the United States Geological Survey to create more realistic detail in scenes from its movie The Good Dinosaur.

By contrast, FOIA promotes ex post publication of information created by the government especially about its own workings in response to specific demands by individual requestors. I argue that open data’s more systematic and collaborative approach represents a radical and welcome departure from FOIA because open data concentrates on information as a means to solve problems to the end of improving government effectiveness. Open data is legitimated by the improved outcomes it yields and grounded in a theory of government effectiveness and, as a result, eschews the adversarial and ad hoc FOIA approach. Ultimately, however, each tactic offers important complementary benefits. The proactive information disclosure regime of open data is strengthened by FOIA’s rights of legal enforcement. Together, they stand to become the hallmark of government transparency in the fifty years ahead….(More)”.

Comparing resistance to open data performance measurement


Paper by Gregory Michener and Otavio Ritter in Public Administration: “Much is known about governmental resistance to disclosure laws, less so about multi-stakeholder resistance to open data. This study compares open data initiatives within the primary and secondary school systems of Brazil and the UK, focusing on stakeholder resistance and corresponding policy solutions. The analytical framework is based on the ‘Three-Ps’ of open data resistance to performance metrics, corresponding to professional, political, and privacy-related concerns. Evidence shows that resistance is highly nuanced, as stakeholders alternately serve as both principals and agents. School administrators, for example, are simultaneously principals to service providers and teachers, and at once agents to parents and politicians. Relying on a most different systems comparison, in-depth interviews, and newspaper content analyses, we find that similar stakeholders across countries demonstrate strikingly divergent levels of resistance. In overcoming stakeholder resistance – across socioeconomic divides – context-conscientious ‘data-informed’ evaluations may promote greater acceptance than narrowly ‘data-driven’ performance measurements…(More)”

Towards a DataPlace: mapping data in a game to encourage participatory design in smart cities


Paper by Matthew Barker, Annika Wolff and Janet van der Linden: “The smart city has been envisioned as a place where citizens can participate in city decision making and in the design of city services. As a key part of this vision, pervasive digital technology and open data legislation are being framed as vehicles for citizens to access rich data about their city. It has become apparent, though, that simply providing access to these resources does not automatically lead to the development of data-driven applications. If we are going to engage more of the citizenry in smart city design and raise productivity, we are going to need to make the data itself more accessible, engaging and intelligible for non-experts. This ongoing study is exploring one method for doing so. As part of the MK:Smart City project team, we are developing a tangible data look-up interface that acts as an alternative to the conventional DataBase. This interface, or DataPlace as we are calling it, takes the form of a map, on which the user places sensors to physically capture real-time data. This is a simulation of the physical act of capturing data in the real world. We discuss the design of the DataPlace prototype under development and the planned user trials to test our hypothesis: that a DataPlace can make handling data more accessible, intelligible and engaging for non-experts than conventional interface types….(More)”
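The excerpt describes the DataPlace at a conceptual level only. Purely to make the token-on-map interaction concrete, the hypothetical sketch below maps a placed sensor token to the nearest real-time feed; the feed registry, coordinates, and function names are invented for illustration and are not drawn from the MK:Smart prototype.

```python
# Hypothetical sketch of a DataPlace-style lookup: placing a sensor
# token on the map triggers a query for the nearest real-time feed.
# The feed registry and coordinates below are invented for
# illustration; the prototype's actual implementation is not
# described in this excerpt.
import math

# Toy registry of real-time feeds, keyed by (latitude, longitude).
FEEDS = {
    (52.0406, -0.7594): "air_quality/central-milton-keynes",
    (52.0629, -0.7723): "traffic_flow/a5-roundabout",
}

def nearest_feed(lat, lon):
    """Return the feed whose location is closest to the placed token."""
    return FEEDS[min(FEEDS, key=lambda k: math.hypot(k[0] - lat, k[1] - lon))]

def on_token_placed(lat, lon):
    """Simulate the tangible interaction: token down, feed subscribed."""
    feed = nearest_feed(lat, lon)
    print(f"Token at ({lat:.4f}, {lon:.4f}) -> subscribing to {feed}")

on_token_placed(52.041, -0.760)
```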

New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the UN General Assembly’s Third Committee adoption on 21 November of a new resolution on the right to privacy in the digital age comes as timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

New Institute Pushes the Boundaries of Big Data


Press Release: “Each year thousands of genomes are sequenced, millions of neuronal activity traces are recorded, and light from hundreds of millions of galaxies is captured by our newest telescopes, all creating datasets of staggering size. These complex datasets are then stored for analysis.

Ongoing analysis of these information streams has illuminated a problem, however: Scientists’ standard methodologies are inadequate to the task of analyzing massive quantities of data. The development of new methods and software to learn from data and to model — at sufficient resolution — the complex processes they reflect is now a pressing concern in the scientific community.

To address these challenges, the Simons Foundation has launched a substantial new internal research group called the Flatiron Institute (FI). The FI is the first multidisciplinary institute focused entirely on computation. It is also the first center of its kind to be wholly supported by private philanthropy, providing a permanent home for up to 250 scientists and collaborating expert programmers all working together to create, deploy and support new state-of-the-art computational methods. Few existing institutions support the combination of scientists and programmers, instead leaving programming to relatively impermanent graduate students and postdoctoral fellows, and none have done so at the scale of the Flatiron Institute or with such a broad scope, at a single location.

The institute will hold conferences and meetings and serve as a focal point for computational science around the world….(More)”.

Digital Kenya: An Entrepreneurial Revolution in the Making


(Open Access) book edited by Bitange Ndemo and Tim Weiss: “Presenting rigorous and original research, this volume offers key insights into the historical, cultural, social, economic and political forces at play in the creation of world-class ICT innovations in Kenya. Following the arrival of fiber-optic cables in 2009, Digital Kenya examines why the initial entrepreneurial spirit and digital revolution have begun to falter despite support from motivated entrepreneurs, international investors, policy experts and others. Written by engaged scholars and professionals in the field, the book offers 15 eye-opening chapters and 14 one-on-one conversations with entrepreneurs and investors to ask why establishing ICT start-ups on a continental and global scale remains a challenge on the “Silicon Savannah”. The authors present evidence-based recommendations to help Kenya to continue producing globally impactful ICT innovations that improve the lives of those still waiting on the side-lines, and to inspire other nations to do the same….(More)”

Talent Gap Is a Main Roadblock as Agencies Eye Emerging Tech


Theo Douglas in GovTech: “U.S. public service agencies are closely eyeing emerging technologies, chiefly advanced analytics and predictive modeling, according to a new report from Accenture, but like their counterparts globally they must address talent and complexity issues before adoption rates will rise.

The report, Emerging Technologies in Public Service, draws on a nine-nation survey of IT officials across all levels of government in policing and justice, health and social services, revenue, border services, pension/Social Security and administration, and was released earlier this week.

It revealed a deep interest in emerging tech from the public sector, finding 70 percent of agencies are evaluating their potential — but a much lower adoption level, with just 25 percent going beyond piloting to implementation….

The revenue and tax industries have been early adopters of advanced analytics and predictive modeling, he said, while biometrics and video analytics are resonating with police agencies.

In Australia, the tax office found that using voiceprint technology could save 75,000 work hours annually.

Closer to home, Utah Chief Technology Officer Dave Fletcher told Accenture that consolidating data centers into a virtualized infrastructure improved speed and flexibility, so some processes that once took weeks or months can now happen in minutes or hours.

Nationally, 70 percent of agencies have either piloted or implemented an advanced analytics or predictive modeling program. Biometrics and identity analytics were the next most popular technologies, with 29 percent piloting or implementing, followed by machine learning at 22 percent.

Those numbers contrast globally with Australia, where 68 percent of government agencies have charged into piloting and implementing biometric and identity analytics programs; and Germany and Singapore, where 27 percent and 57 percent of agencies respectively have piloted or adopted video analytic programs.

Overall, 78 percent of respondents said some machine-learning technologies were either under way or already implemented at their agencies.

The benefits identified for embracing emerging tech ranged from finding better ways of working through automation to innovating, developing new services, and reducing costs.

Agencies told Accenture their No. 1 objective was increasing customer satisfaction. But 89 percent said they’d expect a return on implementing intelligent technology within two years. Four-fifths, or 80 percent, agreed intelligent tech would improve employees’ job satisfaction….(More).

Open Data Workspace for Analyzing Hate Crime Trends


Press Release: “The Anti-Defamation League (ADL) and data.world today announced the launch of a public, open data workspace to help understand and combat the rise of hate crimes. The new workspace offers instant access to ADL data alongside relevant data from the FBI and other authoritative sources, and provides citizens, journalists and lawmakers with tools to more effectively analyze, visualize and discuss hate crimes across the United States.

The new workspace was unveiled at ADL’s inaugural “Never Is Now” Summit on Anti-Semitism, a daylong event bringing together nearly 1,000 people in New York City to hear from an array of experts on developing innovative new ways to combat anti-Semitism and bigotry….

Hate Crime Reporting Gaps (map): The color scale depicts total reported hate crime incidents per 100,000 people in each state. States with darker shading have more reported incidents of hate crimes, while states with lighter shading have fewer reported incidents. The green circles proportionally represent cities that either Did Not Report hate crime data or affirmatively reported 0 hate crimes for the year 2015. Note the lightly shaded states in which many cities either Do Not Report or affirmatively report 0 hate crimes….(More)”
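The normalization behind such a map is straightforward. As a minimal sketch (not ADL or data.world code; the file and column names are hypothetical), per-100,000 rates like those shaded above could be computed as follows:

```python
# Sketch: reported hate crime incidents per 100,000 residents by state,
# the normalization used in the map described above. The file and
# column names ("state", "incidents", "population") are hypothetical,
# not data.world's actual schema.
import csv

def rates_per_100k(path):
    """Compute incidents per 100,000 residents for each state."""
    rates = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rates[row["state"]] = (
                int(row["incidents"]) / int(row["population"]) * 100_000
            )
    return rates

for state, rate in sorted(rates_per_100k("hate_crimes_2015.csv").items()):
    print(f"{state}: {rate:.2f} reported incidents per 100k")
```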