Stefaan Verhulst
Paper by Isabelle Bruno, Emmanuel Didier and Tommaso Vitale: “This article introduces the special issue on statactivism, a particular form of action within the repertoire used by contemporary social movements: the mobilization of statistics. Traditionally, statistics have been used by the worker movement within class conflicts. But in the current configuration of state restructuring, new accumulation regimes, and changes in work organization in capitalist societies, the activist use of statistics is shifting. This first article seeks to show the use of statistics and quantification in contentious performances connected with state restructuring, major transformations in the varieties of capitalism, and changes in work organization regimes. The double role of statistics in representing as well as criticizing reality is considered. After showing how important statistical tools are in producing a shared reading of reality, we discuss the two main dimensions of statactivism – disclosure and affirmation. In other words, we will see the role of stat-activists in denouncing a certain state of reality, and then their efforts to use statistics to create equivalency among disparate conditions and to cement emerging social categories. Finally, we present the main contributions of the various research papers in this special issue regarding the use of statistics as a form of action within a larger repertoire of contentious action. Six empirical papers focus on statactivism against the penal machinery in the early 1970s (Grégory Salle), on the mobilisation over the price index in Guadeloupe in 2009 (Boris Samuel) and in Argentina in 2007 (Celia Lury and Ana Gross), on the mobilisation of experts to consolidate a link between working conditions and health issues (Marion Gilles), on the production of activity data for disability policy in France (Pierre-Yves Baudot), and on the use of statistics in social mobilizations for gender equality (Eugenia De Rosa). Alain Desrosières wrote the last paper, addressing mobilizations that propose innovations in the way inflation, unemployment, poverty, GDP, and climate change are measured. This special issue is dedicated to him, in order to honor his everlasting intellectual legacy….(More)”
Mary O’Hara in The Guardian: “With warnings coming thick and fast about the stark ramifications of the government’s sweeping cuts to legal aid, it was probably inevitable that someone would come up with a new way to plug some gaps in access to justice. Enter the legal crowdfunder, CrowdJustice, an online platform where people who might not otherwise get their case heard can raise cash to pay for legal representation and court costs.
The brainchild of 33-year-old lawyer Julia Salasky, and the first of its kind in the UK, CrowdJustice provides people who have a public interest case but lack adequate financial resources with a forum where they can publicise their case and, if all goes to plan, generate funding for legal action by attracting public support and donations.
“We are trying to increase access to justice – that’s the baseline,” says Salasky. “I think it’s a social good.”
The platform was launched just a few months ago, but has already attracted a range of cases both large and small, including some that could set important legal precedents.
CrowdJustice has helped the campaign Jengba (Joint Enterprise: Not Guilty by Association) raise funds to intervene in a supreme court case considering reform of the law of joint enterprise, under which people can be found guilty of a crime, including murder, committed by someone else. The group amassed £10,000 in donations for legal assistance as part of its ongoing challenge to the legal doctrine of “joint enterprise”, which disproportionately prosecutes people from black and minority ethnic backgrounds for violent crimes where it is alleged they have acted together for a common purpose.
In another case, a Northern Irish woman who discovered she wasn’t entitled to her partner’s occupational pension after he died because of a bureaucratic requirement that did not apply to married couples, used CrowdJustice to help raise money to take her case all the way to the supreme court. “If she wins, it will have an enormous precedent-setting value for the legal rights of all couples who cohabit,” Salasky says….(The Guardian)”
IVIR and MIT: “The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the First, Fourth and Fifth Amendments as well as federal and state statutes, consumer protection law and common law. The ultimate goal of effective privacy protection is shared. However, current friction between the two legal systems poses challenges to realizing privacy and the free flow of information across the Atlantic. The recent expansion of online surveillance practices underlines these challenges.
Over nine months, the group prepared a consensus report outlining a menu of privacy “bridges” that can be built to bring the European Union and the United States closer together. The effort aims to provide a framework of practical options that advance strong, globally accepted privacy values in a manner that respects the substantive and procedural differences between the two jurisdictions….
- Full Report (276 KB)
- Executive summary (106 KB)
(More)”
Involve: “Democratic reform comes in waves, propelled by technological, economic, political and social developments. There are periods of rapid change, followed by relative quiet.
We are currently in a period of significant political pressure for change to our institutions of democracy and government. With so many changes under discussion it is critically important that those proposing and carrying out reforms understand the impact that different reforms might have.
Most discussions of democratic reform focus on electoral democracy. However, for all their importance in the democratic system, elections rarely reveal voters’ views clearly enough for elected representatives to act on them. Changing the electoral system alone will not significantly increase the level of democratic control held by citizens.
Room for a View, by Involve’s director Simon Burall, looks at democratic reform from a broader perspective than that of elections. Drawing on the work of democratic theorists, it uses a deliberative systems approach to examine the state of UK democracy. Rather than focusing exclusively on the extent to which individuals and communities are represented within institutions, it is equally concerned with the range of views present and how they interact.
Adapting the work of the democratic theorist John Dryzek, the report identifies seven components of the UK’s democratic system, describing and analysing the condition of each in turn. Assessing the UK’s democracy through this lens reveals it to be in fragile health. The representation of alternative views and narratives in all seven components of the UK system is poor, the components are weakly connected and, despite some positive signs, deliberative capacity is decreasing.
Room for a View suggests that a focus on the key institutions isn’t enough. If the health of UK democracy is to be improved, we need to move away from thinking about the representation of individual voters to thinking about the representation of views, perspectives and narratives. Doing this will fundamentally change the way we approach democratic reform.
Mark Samuels at ZDNet: “If information really is the lifeblood of modern organisations, then CIOs could create huge benefits from opening their data to new, creative pairs of eyes. Research from consultant McKinsey suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data: that is, taking previously proprietary data (often starting with public sector data) and opening up access.
So, should your business consider giving outsiders access to insider information? ZDNet speaks to three experts.
More viewpoints can mean better results
Former Tullow Oil CIO Andrew Marks says debates about the potential openness of data in a private sector context are likely to be dominated by one major concern: information security.
“It’s a perfectly reasonable debate until people start thinking about privacy,” he says. “Putting information at risk, both in terms of customer data and competitive advantage, will be a risk too far for many senior executives.”
But what if CIOs could allay c-suite peers’ concerns and create a new opportunity? Marks points to the Goldcorp Challenge, which saw the mining specialist share its proprietary geological data to allow outside experts to pick likely spots for mining. The challenge, which included prize money of $575,000, helped identify more than 110 sites, 50 per cent of which were previously unknown to the company. The value of gold found through the competition exceeded $6bn. Marks wonders whether other firms could take similarly brave steps.
“There is a period of time when information is very sensitive,” he says. “Once the value of data starts to become finite, then it might be beneficial for businesses to open the doors and to let outsiders play with the information. That approach, in terms of gamification, might lead to the creation of new ideas and innovations.”…
Marks says these projects help prove that, when it comes to data, more is likely to mean different – and possibly better – results. “Whether using big data algorithms or the human touch, the more viewpoints you bring together, the more you increase the chances of success and reduce risk,” he says.
“There is, therefore, always likely to be value in seeking an alternative perspective. Opening access to data means your firm is going to get more ideas, but CIOs and other senior executives need to think very carefully about what such openness means for the business, and the potential benefits.”….Some leading firms are already taking steps towards openness. Take Christina Scott, chief product and information officer at the Financial Times, who says the media organisation has used data analysts to help push the benefits of information-led insight across the business.
Her team has democratised data in order to make sure that all parts of the organisation can get the information they need to complete their day-to-day jobs. Scott says the approach is best viewed as an open data strategy, albeit within the safe confines of the existing enterprise firewall. While the tactic is currently focused internally, Scott says the FT is keen to find ways to make the most of external talent in the future.
“We’re starting to consider how we might open data beyond the organisation, too,” she says. “Our data holds a lot of value and insight, including across the metadata we’ve created. So it would be great to think about how we could use that information in a more open way.” Part of the FT’s business includes trade-focused magazines. Scott says opening the data could provide new insight to its B2B customers across a range of sectors. In fact, the firm has already dabbled at a smaller scale.
“We’ve run hackathons, where we’ve exposed our APIs and given people the chance to come up with some new ideas,” she says. “But I don’t think we’ve done as much work on open data as we could. And I think that’s the direction in which better organisations are moving. They recognise that not all innovation is going to happen within the company.”…
CIO Omid Shiraji is another IT expert who recognises that there is a general move towards a more open society. Any executive who expects to work within a tightly defined enterprise firewall is living in cloud cuckoo land, he argues. More to the point, they will miss out on big advantages.
“If you can expose your sources to a range of developers, you can start to benefit from massive innovation,” he says. “You can get really big benefits from opening your data to external experts who can focus on areas that you don’t have the capability to develop internally.”
Many IT leaders would like to open data to outside experts, suggests Shiraji. For CIOs who are keen to expose their sources, he suggests letting small-scale developers take a close look at in-house data silos in an attempt to discover what relationships might exist and what advantages could accrue….(More)”
Peter Williams, Jan Gravesen and Trinette Brownhill in Government Executive: “Governments around the world are facing competitive pressures and expectations from their constituents that are prompting them to innovate and dissolve age-old structures. Many governments have introduced a digital strategy in which at least one of the goals is aimed at bringing their organizations closer to citizens and businesses.
To achieve this, ideally IT and data in government would not be constrained by the different functional towers that make up the organization, as is often the case. They would not be constrained by complex, monolithic application design philosophies and lengthy implementation cycles, nor would development be constrained by the assumption that all activity has to be executed by the government itself.
Instead, applications would be created rapidly and cheaply, and modules would be shared as reusable blocks of code and integrated data. It would be relatively straightforward to integrate data from multiple departments to enable a focus on the complex needs of, say, a single parent who is diabetic and a student. Delivery would be facilitated in the manner best required, or preferred, by the citizen. Third parties would also be able to access these modules of code and data to build higher value government services that multiple agencies would then buy into. The code would run on a cloud infrastructure that maximizes the efficiency in which processing resources are used.
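To make this concrete, here is a minimal sketch of that kind of cross-department integration, assuming three invented departmental record stores keyed by a shared citizen ID (all department names, fields and data below are hypothetical):

```python
# Hypothetical sketch: three departmental silos keyed by a shared citizen ID.
# All department names, fields and data are invented for illustration.
health_records = {"c-1001": {"condition": "diabetes"}}
education_records = {"c-1001": {"status": "enrolled student"}}
social_records = {"c-1001": {"household": "single parent"}}

def integrated_view(citizen_id):
    """Merge what each department knows about one citizen into a single view."""
    sources = {
        "health": health_records,
        "education": education_records,
        "social_services": social_records,
    }
    view = {"citizen_id": citizen_id}
    for name, records in sources.items():
        view[name] = records.get(citizen_id, {})
    return view

print(integrated_view("c-1001"))
# One record now captures the diabetic, single-parent student from the example.
```

In a real platform the lookups would be API calls into each department's system of record rather than in-memory dictionaries, but the composition step is the same.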
GaaP is an organized set of ideas and principles that allows organizations to approach these ideals. It allows governments to institute more efficient sharing of IT resources as well as unlock data and functionality via application programming interfaces, allowing third parties to build higher value citizen services. In doing so, security plays a crucial role in protecting the privacy of constituents and enterprise assets.
We see increasingly well-established examples of GaaP services in many parts of the world. The notion has significantly influenced strategic thinking in the UK, Australia, Denmark, Canada and Singapore. In particular, it has evolved in a deliberate way in the UK’s Government Digital Service, building on the Blairite notion of “joined up government”; in Australia’s e-government strategy and its myGov program; and as a significant influence on Singapore’s entire approach to building its “smarter nation” infrastructure.
Collaborative Government
GaaP assumes a transformational shift in efficiency, effectiveness and transparency, in which agencies move toward a collaborative government and away from today’s siloed approach. That collaboration may be among agencies, but also with other entities (nongovernmental organizations, the private sector, citizens, etc.).
GaaP’s focus on collaboration enables public agencies to move away from their traditional towered approach to IT and increasingly make use of shared and composable services offered by a common platform, usually virtualized and cloud-enabled. This leads to more efficient use of development resources, platforms and IT support. We are seeing examples of this already with a group of townships in New York state and with two large Spanish cities that are embarking on this approach.
While efficient resource and service sharing is central to the idea of GaaP, it is not sufficient. The idea is that GaaP must allow app developers, irrespective of whether they are citizens, private organizations or other public agencies, to develop new value-added services using published government data and APIs. In this sense, the platform becomes a connecting layer between public agencies’ systems and data on the one hand, and private citizens, organizations and other public agencies on the other.
In its most fundamental form, GaaP is able to:
- Consume data and government services from existing departmental systems.
- Consume syndicated services from platform-as-a-service or software-as-a-service providers in the public marketplace.
- Securely unlock these data and services and allow third parties – citizens, private organizations or other agencies – to combine services and data into higher-order, more citizen-centric or business-centric services.
It is the openness, the secure interoperability, and the ability to compose new services on the basis of existing services and data that define the nature of the platform.
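As a minimal sketch of this pattern, consuming a departmental source and securely unlocking it for third parties, consider the following hypothetical Flask service (the endpoint path, the API-key check and the data are all invented; a production platform would sit behind full identity and access management):

```python
# Hypothetical GaaP-style sketch: a thin platform layer that consumes data
# from a departmental system and unlocks it to third parties via an API.
# Endpoint paths, the key check and the data are invented for illustration.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Stand-in for existing departmental systems of record.
DEPARTMENTAL_DATA = {
    "transport": [{"route": "42", "on_time_rate": 0.93}],
    "housing": [{"district": "north", "vacancies": 120}],
}

# Third-party developers would register for keys; one hard-coded key here.
REGISTERED_KEYS = {"demo-key-123"}

@app.route("/api/v1/<department>/records")
def records(department):
    # Secure unlock: only registered third parties get through.
    if request.headers.get("X-API-Key") not in REGISTERED_KEYS:
        abort(401)
    if department not in DEPARTMENTAL_DATA:
        abort(404)
    return jsonify(DEPARTMENTAL_DATA[department])

if __name__ == "__main__":
    app.run(port=5000)
```

A third party could then compose a higher-order service on top of calls like GET /api/v1/transport/records, without ever touching the departmental system directly.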
The Challenges
At one time, the challenge of creating a GaaP structure would have been technology; today, it is governance….(More)”
Hamish Robertson and Joanne Travaglia in LSE’s The Impact Blog: “This is not the first ‘big data’ era but the second. The first was the explosion in data collection that occurred from the early 19th century – Hacking’s ‘avalanche of numbers’, precisely situated between 1820 and 1840. This was an analogue big data era, different to our current digital one but characterized by some very similar problems and concerns. Contemporary problems of data analysis and control include a variety of accepted factors that make them ‘big’, generally including size, complexity and technology issues. We also suggest that digitisation is a central process in this second big data era, one that seems obvious but which also appears to have reached a new threshold. Until a decade or so ago, ‘big data’ looked just like a digital version of conventional analogue records and systems, ones whose management had become normalised through statistical and mathematical analysis. Now, however, we see a level of concern and anxiety similar to the concerns that were faced in the first big data era.
This situation brings with it a socio-political dimension of interest to us, one in which our understanding of people and our actions on individuals, groups and populations are deeply implicated. The collection of social data had a purpose – understanding and controlling the population in a time of significant social change. To achieve this, new kinds of information and new methods for generating knowledge were required. Many ideas, concepts and categories developed during that first data revolution remain intact today, some accepted more uncritically now than when they were first developed. In this piece we draw out some connections between these two data ‘revolutions’ and the implications for the politics of information in contemporary society. It is clear that many of the problems of that first big data age, and more specifically their solutions, persist down to the present big data era….Our question then is how do we go about re-writing the ideological inheritance of that first data revolution? Can we or will we unpack the ideological sequelae of that past revolution during this present one? The initial indicators are not good, in that there is a pervasive assumption in this broad interdisciplinary field that reductive categories are both necessary and natural. Our social ordering practices have influenced our social epistemology. We run the risk in the social sciences of perpetuating the ideological victories of the first data revolution as we progress through the second. The need for critical analysis grows apace, not just with the production of each new technique or technology but with the uncritical acceptance of the concepts, categories and assumptions that emerged from that first data revolution. That first data revolution proved to be a successful anti-revolutionary response to the numerous threats to social order posed by the incredible changes of the nineteenth century, rather than the Enlightenment emancipation that was promised. (More)”
This is part of a wider series on the Politics of Data. For more on this topic, also see Mark Carrigan’s Philosophy of Data Science interview series and the Discover Society special issue on the Politics of Data (Science).
Jennifer Hunter Childs et al. in Survey Practice: “Periodically, the US Federal Government suffers from negative publicity, decreasing the confidence and trust people have in the government. Consequently, the leaders of several federal statistical agencies were interested in knowing if their public image would suffer from negative publicity. The researchers used data gathered in the Gallup Daily Poll to analyze whether negative perceptions of government would also depress perceptions of federal statistical agencies. The results indicate that as levels of knowledge about and use of federal statistics increase, respondents’ differentiation among government entities also increases. For example, people’s confidence in federal statistical agencies increased even as confidence in Congress and the military decreased. When confidence in Congress is particularly poor, the results support the notion that increasing knowledge about the statistical system and increasing the public’s use of statistical data (through programs like the Census Bureau’s “Statistics in Schools”) could help people differentiate between sectors of the government, consequently increasing confidence in federal statistical agencies….(More)”
Michael McDonald, Peter Licari and Lia Merivaki in the Washington Post: “In modern campaigns, buzzwords like “microtargeting” and “big data” are often bandied about as essential to victory. These terms refer to the practice of analyzing (or “microtargeting”) millions of voter registration records (“big data”) to predict who will vote and for whom.
If you’ve ever gotten a message from a campaign, there’s a good chance you’ve been microtargeted. Serious campaigns use microtargeting to persuade voters through mailings, phone calls, knocking on doors, and — in our increasingly connected world — social media.
But the big data that fuels such efforts comes at a big price, which can create a serious barrier to entry for candidates and groups seeking to participate in elections — that is, if they are allowed to buy the data at all.
When we asked state election officials about prices and restrictions on who can use their voter registration files, we learned that the rules are unsettlingly arbitrary.
Contrast Arizona and Washington. Arizona sells its statewide voter file for an estimated $32,500, while Washington gives its file away for free. Before jumping to the conclusion that this is a red-state/blue-state thing, consider that Oklahoma gives its file away, too.
A number of states base their prices on a per-record formula, which can massively drive up the price despite the fact that files are often delivered electronically. Alabama sells its records for 1 cent per voter, which yields an approximately $30,000 charge for the lot. Seriously, in this day and age, who prices an electronic database by the record?
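(For scale, those figures are internally consistent: a bill of roughly $30,000 at 1 cent per record implies a file of about $30,000 ÷ $0.01 = 3,000,000 voter records.)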
Some states will give more data to candidates than to outside groups. Delaware will provide phone numbers to candidates but not to nonprofit organizations doing nonpartisan voter mobilization.
In some states, the voter file is not even available to the general public. States such as South Carolina and Maryland permit access only to residents who are registered voters. States including Kentucky and North Dakota grant access only to campaigns, parties and other political organizations.
We estimate that it would cost roughly $140,000 for an independent presidential campaign or national nonprofit organization to compile a national voter file, and this would not be a one-time cost. Voter lists frequently change as voters are added and deleted.
Guess who most benefits from all the administrative chaos? Political parties and their candidates. Not only are they capable of raising the vast amounts of money needed to purchase the data, but, adding insult to injury, they sometimes don’t even have to. Some states literally bequeath the data to parties at no cost. Alabama goes so far as to give parties a free statewide copy for every election.
Who is hurt by this? Independent candidates and nonprofit organizations that want to run national campaigns but don’t have deep pockets. If someone like Donald Trump launched an independent presidential run, he could buy the necessary data without much difficulty. But a nonprofit focused on mobilizing low-income voters could be stretched thin….(More)”