Shareveillance: Subjectivity between open and closed data


Clare Birchall in Big Data and Society: “This article attempts to question modes of sharing and watching to rethink political subjectivity beyond that which is enabled and enforced by the current data regime. It identifies and examines a ‘shareveillant’ subjectivity: a form configured by the sharing and watching that subjects have to withstand and enact in the contemporary data assemblage. Looking at government open and closed data as case studies, this article demonstrates how ‘shareveillance’ produces an anti-political role for the public. In describing shareveillance as, after Jacques Rancière, a distribution of the (digital) sensible, this article posits a politico-ethical injunction to cut into the share and flow of data in order to arrange a more enabling assemblage of data and its affects. In order to interrupt shareveillance, this article borrows a concept from Édouard Glissant and his concern with raced otherness to imagine what a ‘right to opacity’ might mean in the digital context. To assert this right is not to endorse the individual subject in her sovereignty and solitude, but rather to imagine a collective political subjectivity and relationality according to the important question of what it means to ‘share well’ beyond the veillant expectations of the state.

Two questions dominate current debates at the intersection of privacy, governance, security, and transparency: How much, and what kind of data should citizens have to share with surveillant states? And: How much data from government departments should states share with citizens? Yet, these issues are rarely expressed in terms of ‘sharing’ in the way that I will be doing in this article. More often, when thought in tandem with the digital, ‘sharing’ is used in reference to either free trials of software (‘shareware’); the practice of peer-to-peer file sharing; platforms that facilitate the pooling, borrowing, swapping, renting, or selling of resources, skills, and assets that have come to be known as the ‘sharing economy’; or the business of linking and liking on social media, which invites us to share our feelings, preferences, thoughts, interests, photographs, articles, and web links. Sharing in the digital context has been framed as a form of exchange, then, but also communication and distribution (see John, 2013; Wittel, 2011).

In order to understand the politics of open and opaque government data practices, which either share with citizens or ask citizens to share, I will extend existing commentaries on the distributive qualities of sharing by drawing on Jacques Rancière’s notion of the ‘distribution of the sensible’ (2004a) – a settlement that determines what is visible, audible, sayable, knowable and what share or role we each have within it. In the process, I articulate ‘sharing’ with ‘veillance’ (veiller ‘to watch’ is from the Latin vigilare, from vigil, ‘watchful’) to turn the focus from prevalent ways of understanding digital sharing towards a form of contemporary subjectivity. What I call ‘shareveillance’ – a state in which we are always already sharing; indeed, in which any relationship with data is only made possible through a conditional idea of sharing – produces an anti-politicised public caught between different data practices.

I will argue that both open and opaque government data initiatives involve, albeit differently pitched, forms of sharing and veillance. Government practices that share data with citizens involve veillance because they call on citizens to monitor and act upon that data – we are envisioned (‘veiled’ and hailed) as auditing and entrepreneurial subjects. Citizens have to monitor the state’s data, that is, or they are expected to innovate with it and make it profitable. Data sharing therefore apportions responsibility without power. It watches citizens watching the state, delimiting the ways in which citizens can engage with that data and, therefore, the scope of the political per se….(More)”.

The internet is crowdsourcing ways to drain the fake news swamp


At CNET: “Fighting the scourge of fake news online is one of the unexpected new crusades emerging from the fallout of Donald Trump’s upset presidential election win last week. Not surprisingly, the internet has no shortage of ideas for how to get its own house in order.

Eli Pariser, author of the seminal book “The Filter Bubble” that presaged some of the consequences of online platforms that tend to sequester users into non-overlapping ideological silos, is leading an inspired brainstorming effort via this open Google Doc.

Pariser put out the public call to collaborate via Twitter on Thursday, and within 24 hours, 21 pages’ worth of bullet-pointed suggestions had already piled up in the doc….

Suggestions ranged from the common call for news aggregators and social media platforms to hire more human editors, to launching more media literacy programs, to creating “credibility scores” for shared content and/or users who share or report fake news.

Many of the suggestions are aimed at Facebook, which has taken a heavy heaping of criticism since the election and a recent report that found the top fake election news stories saw more engagement on Facebook than the top real election stories….

In addition to the crowdsourced brainstorming approach, plenty of others are chiming in with possible solutions. Author, blogger and journalism professor Jeff Jarvis teamed up with entrepreneur and investor John Borthwick of Betaworks to lay out 15 concrete ideas for addressing fake news on Medium…. The Trust Project at Santa Clara University is working to develop solutions to attack fake news that include systems for author verification and citations….(More)”

Between Governance of the Past and Technology of the Future


Think Piece by Heather Grabbe for the ESPAS 2016 conference: “In many parts of everyday life, voters are used to a consumer experience where they get instant feedback and personal participation; but party membership, ballot boxes and stump speeches do not offer the same speed, control or personal engagement. The institutions of representative democracy at national and EU level — political parties, elected members, law-making — do not offer the same quality of experience for their ultimate consumers.

This matters because it is causing voters to switch off. Broad participation by most of the population in the practice of democracy is vital for societies to remain open because it ensures pluralism and prevents takeover of power by narrow interests. But in some countries and some elections, turnout is regularly below a third of registered voters, especially in European Parliament elections.

The internet is driving the major trends that create this disconnection and disruption. Here are four vital areas in which politics should adapt, including at EU level:

  • Expectation. Voters have a growing sense that political parties and law-making are out of touch, but not that politics is irrelevant. …
  • Affiliation. … people are interested in new forms of affiliation, especially through social media and alternative networks. …
  • Location. Digital technology allows people to find myriad new ways to express their political views publicly, outside of formal political spaces. …
  • Information. The internet has made vast amounts of data and a huge range of information sources across an enormous spectrum of issues available to every human with an internet connection. How is this information overload affecting engagement with politics? ….(More)”

Is Social Media Killing Democracy?


Phil Howard at Culture Digitally: “This is the big year for computational propaganda—using immense data sets to manipulate public opinion over social media.  Both the Brexit referendum and US election have revealed the limits of modern democracy, and social media platforms are currently setting those limits. 

Platforms like Twitter and Facebook now provide a structure for our political lives.  We’ve always relied on many kinds of sources for our political news and information.  Family, friends, news organizations, charismatic politicians certainly predate the internet.  But whereas those are sources of information, social media now provides the structure for political conversation.  And the problem is that these technologies permit too much fake news, encourage our herding instincts, and aren’t expected to provide public goods.

First, social algorithms allow fake news stories from untrustworthy sources to spread like wildfire over networks of family and friends.  …

Second, social media algorithms provide very real structure to what political scientists often call “elective affinity” or “selective exposure”…

The third problem is that technology companies, including Facebook and Twitter, have been given a “moral pass” on the obligations we hold journalists and civil society groups to….

Facebook has run several experiments now, published in scholarly journals, demonstrating that they have the ability to accurately anticipate and measure social trends.  Whereas journalists and social scientists feel an obligation to openly analyze and discuss public preferences, we do not expect this of Facebook.  The network effects that clearly were unmeasured by pollsters were almost certainly observable to Facebook.  When it comes to news and information about politics, or public preferences on important social questions, Facebook has a moral obligation to share data and prevent computational propaganda.  The Brexit referendum and US election have taught us that Twitter and Facebook are now media companies.  Their engineering decisions are effectively editorial decisions, and we need to expect more openness about how their algorithms work.  And we should expect them to deliberate about their editorial decisions.

There are some ways to fix these problems.  Opaque software algorithms shape what people find in their news feeds.  We’ve all noticed fake news stories, often called clickbait, and while these can be an entertaining part of using the internet, it is bad when they are used to manipulate public opinion.  These algorithms work as “bots” on social media platforms like Twitter, where they were used in both the Brexit and US Presidential campaign to aggressively advance the case for leaving Europe and the case for electing Trump.  Similar algorithms work behind the scenes on Facebook, where they govern what content from your social networks actually gets your attention. 

So the first way to strengthen democratic practices is for academics, journalists, policy makers and the interested public to audit social media algorithms….(More)”.

The Participatory Condition in the Digital Age


Book edited by Darin Barney, Gabriella Coleman, Christine Ross, Jonathan Sterne, and Tamar Tembeck:


“Just what is the “participatory condition”? It is the situation in which taking part in something with others has become both environmental and normative. The fact that we have always participated does not mean we have always lived under the participatory condition. What is distinctive about the present is the extent to which the everyday social, economic, cultural, and political activities that comprise simply being in the world have been thematized and organized around the priority of participation.

Structured along four axes investigating the relations between participation and politics, surveillance, openness, and aesthetics, The Participatory Condition in the Digital Age comprises fifteen essays that explore the promises, possibilities, and failures of contemporary participatory media practices as related to power, Occupy Wall Street, the Arab Spring uprisings, worker-owned cooperatives for the post-Internet age; paradoxes of participation, media activism, open source projects; participatory civic life; commercial surveillance; contemporary art and design; and education.

This book represents the most comprehensive and transdisciplinary endeavor to date to examine the nature, place, and value of participation in the digital age. Just as in 1979, when Jean-François Lyotard proposed that “the postmodern condition” was characterized by the questioning of historical grand narratives, The Participatory Condition in the Digital Age investigates how participation has become a central preoccupation of our time….(More)”

Big Data Is Not a Monolith


Book edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli: “Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies.

The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data’s ramifications. The contributors look at big data’s effect on individuals as it exerts social control through monitoring, mining, and manipulation; big data and society, examining both its empowering and its constraining effects; big data and science, considering issues of data governance, provenance, reuse, and trust; and big data and organizations, discussing data responsibility, “data harm,” and decision making….(More)”

Improving Services—At What Cost? Examining the Ethics of Twitter Research


Case study by Sara Mannheimer, Scott W. H. Young and Doralyn Rossmann: “As social media use has become widespread, academic and corporate researchers have identified social networking services as sources of detailed information about people’s viewpoints and behaviors. Social media users share thoughts, have conversations, and build communities in open, online spaces, and researchers analyze social media data for a variety of purposes—from tracking the spread of disease (Lampos & Cristianini, 2010) to conducting market research (Patino, Pitta, & Quinones, 2012; Hornikx & Hendriks, 2015) to forecasting elections (Tumasjan et al., 2010). Twitter in particular has emerged as a leading platform for social media research, partly because user data from non-private Twitter accounts is openly accessible via an application programming interface (API). This case study describes research conducted by Montana State University (MSU) librarians to analyze the MSU Library’s Twitter community, and the ethical questions that we encountered over the course of the research. The case study will walk through our Twitter research at the MSU Library, and then suggest discussion questions to frame an ethical conversation surrounding social media research. We offer a number of areas of ethical inquiry that we recommend be engaged with as a cohesive whole….(More)”.
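
Because the Twitter API makes data from non-private accounts so easy to collect, assembling a dataset like the one described above takes only a few lines of code, which is part of what gives these ethical questions their urgency. The sketch below is purely illustrative and is not the MSU team’s code: it assumes the Tweepy library (3.x), registered application credentials, and a placeholder handle (“examplelibrary”) standing in for the library’s account.

```python
# Illustrative sketch only, not the MSU study's code. Assumes Tweepy 3.x,
# registered Twitter application credentials, and a placeholder handle
# ("examplelibrary") standing in for the library's real account.
import tweepy

CONSUMER_KEY = "..."            # placeholder credentials
CONSUMER_SECRET = "..."
ACCESS_TOKEN = "..."
ACCESS_TOKEN_SECRET = "..."

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth, wait_on_rate_limit=True)

# Followers of a public account; profiles marked "protected" are filtered out,
# keeping only accounts whose tweets are openly accessible via the API.
followers = [
    user
    for user in tweepy.Cursor(api.followers, screen_name="examplelibrary").items(500)
    if not user.protected
]

# Recent public tweets mentioning the account, for analyzing the surrounding
# conversation.
mentions = [
    status.full_text
    for status in tweepy.Cursor(api.search, q="@examplelibrary",
                                tweet_mode="extended").items(200)
]

print(len(followers), "followers and", len(mentions), "mentions collected")
```

Even a toy collection like this gathers bios, follower relationships, and conversational content that the people involved may never have expected to be studied, which is precisely the tension the discussion questions are meant to surface.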

Maker City: A Practical Guide for Reinventing Our Cities


Book by Peter Hirshberg, Dale Dougherty, and Marcia Kadanoff: “Maker City, or the Maker City Playbook, is a comprehensive set of case studies and how-to information useful for city leaders, civic innovators, nonprofits, and others engaged in urban economic development. The Maker City Playbook is committed to going beyond stories to find patterns and discern promising practices to help city leaders make even more informed decisions.

  • Chapter 1: Introduction and a Call to Action
  • Chapter 2: The Maker Movement and Cities
  • Chapter 3: The Maker City as Open Ecosystem
  • Chapter 4: Education and Learning in the Maker City
  • Chapter 5: Workforce Development in the Maker City
  • Chapter 6: Advanced Manufacturing and Supply Chain inside the Maker City
  • Chapter 7: Real Estate Matters in the Maker City
  • Chapter 8: Civic Engagement in the Maker City
  • Chapter 9: The Future of the Maker City

Maker City Project is a collaboration between the Kauffman Foundation, the Gray Area for the Arts, and Maker Media. Read for free here: https://makercitybook.com/

Crowdsourcing and cellphone data could help guide urban revitalization


Science Magazine: “For years, researchers at the MIT Media Lab have been developing a database of images captured at regular distances around several major cities. The images are scored according to different visual characteristics — how safe the depicted areas look, how affluent, how lively, and the like…. Adjusted for factors such as population density and distance from city centers, the correlation between perceived safety and visitation rates was strong, but it was particularly strong for women and people over 50. The correlation was negative for people under 30, which means that males in their 20s were actually more likely to visit neighborhoods generally perceived to be unsafe than to visit neighborhoods perceived to be safe.
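
The adjustment described here amounts to a partial correlation: relate perceived safety to visitation rates after removing the influence of covariates such as population density and distance from the city center. The sketch below is a minimal illustration of that calculation, with a hypothetical input file and column names rather than the Media Lab’s actual data or pipeline.

```python
# Minimal sketch of a covariate-adjusted (partial) correlation; the file and
# column names are hypothetical, not the MIT Media Lab's actual pipeline.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("neighborhoods.csv")   # hypothetical per-neighborhood table

# Covariates to adjust for, plus an intercept term.
controls = sm.add_constant(df[["population_density", "distance_to_center"]])

# Residualize both variables on the controls, then correlate the residuals.
safety_resid = sm.OLS(df["perceived_safety"], controls).fit().resid
visits_resid = sm.OLS(df["visitation_rate"], controls).fit().resid

partial_r = safety_resid.corr(visits_resid)
print(f"Adjusted (partial) correlation: {partial_r:.2f}")
```

Residualizing both variables on the controls and correlating the residuals is one standard way to estimate an adjusted association; the researchers’ own models may well differ, but the logic of “adjusting for” density and distance is the same.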

In the same paper, the researchers also identified several visual features that are highly correlated with judgments that a particular area is safe or unsafe. Consequently, the work could help guide city planners in decisions about how to revitalize declining neighborhoods….

Jacobs’ theory, Hidalgo says, is that neighborhoods in which residents can continuously keep track of street activity tend to be safer; a corollary is that buildings with street-facing windows tend to create a sense of safety, since they imply the possibility of surveillance. Newman’s theory is an elaboration on Jacobs’, suggesting that architectural features that demarcate public and private spaces, such as flights of stairs leading up to apartment entryways or archways separating plazas from the surrounding streets, foster the sense that crossing a threshold will bring on closer scrutiny….(More)”

The Potential and Reality of Data Journalism in Developing Media Markets


Internews Report: “Data has the potential to help communities understand their biggest challenges – why people become sick or well, why development initiatives succeed or fail, how government actions align with citizens’ priorities. However, most people do not have the skills or inclination to engage with data directly. That’s where data journalists and the open data community come in.

This report explains the role of data journalists and open data, and lays out the key considerations that can help predict the success or failure of new data journalism initiatives….

Read the report