Twitter releasing trove of user data to scientists for research


Joe Silver at Ars Technica: “Twitter has a 200-million-strong and ever-growing user base that broadcasts 500 million updates daily. It has been lauded for its ability to unsettle repressive political regimes, bring much-needed accountability to corporations that mistreat their customers, and combat other societal ills (whether or not such characterizations are, in fact, accurate). Now, the company has taken aim at disrupting another important sphere of human society: the scientific research community.
Back in February, the site announced its plan—in collaboration with Gnip—to provide a handful of research institutions with free access to its data sets from 2006 to the present. It’s a pilot program called “Twitter Data Grants,” with the hashtag #DataGrants. At the time, Twitter’s engineering blog explained the plan to solicit grant applications for access to its treasure trove of user data:

Twitter has an expansive set of data from which we can glean insights and learn about a variety of topics, from health-related information such as when and where the flu may hit to global events like ringing in the new year. To date, it has been challenging for researchers outside the company who are tackling big questions to collaborate with us to access our public, historical data. Our Data Grants program aims to change that by connecting research institutions and academics with the data they need.

In April, Twitter announced that, after reviewing the more than 1,300 proposals submitted from more than 60 different countries, it had selected six institutions to receive data access. Projects approved included a study of foodborne gastrointestinal illnesses, a study measuring happiness levels in cities based on images shared on Twitter, and a study using geosocial intelligence to model urban flooding in Jakarta, Indonesia. There’s even a project exploring the relationship between tweets and sports team performance.
Twitter did not directly respond to our questions on Tuesday afternoon regarding the specific amount and types of data the company is providing to the six institutions. But in its privacy policy, Twitter explains that most user information is intended to be broadcast widely. As a result, the company likely believes that sharing such information with scientific researchers is well within its rights, as its services “are primarily designed to help you share information with the world,” Twitter says. “Most of the information you provide us is information you are asking us to make public.”
While mining such data sets will undoubtedly aid scientists in conducting experiments for which similar data was previously either unavailable or quite limited, these applications raise some legal and ethical questions. For example, Scientific American has asked whether Twitter will be able to retain any legal rights to scientific findings, and whether it is ethically sound to mine tweets (many of which are not publicly accessible) for scientific research when Twitter users have not agreed to such uses.
In response, computational epidemiologists Caitlin Rivers and Bryan Lewis have proposed guidelines for ethical research practices when using social media data, such as avoiding personally identifiable information and making all the results publicly available….”
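The guidelines Rivers and Lewis propose are procedural rather than technical, but the first of them lends itself to a brief illustration. The sketch below shows one way a researcher might drop personally identifiable fields from tweet records before analysis; the field names and sample record are hypothetical, not Twitter’s actual API schema.

```python
# Minimal sketch: removing personally identifiable fields from tweet records
# before analysis. Field names and the sample record are hypothetical.

PII_FIELDS = {"user_id", "screen_name", "real_name", "geo_coordinates"}

def strip_pii(record: dict) -> dict:
    """Return a copy of a tweet record with PII fields removed."""
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

raw_tweets = [
    {"user_id": 42, "screen_name": "example_user",
     "text": "Feeling feverish today", "created_at": "2014-04-01",
     "geo_coordinates": (40.7, -74.0)},
]

anonymised = [strip_pii(t) for t in raw_tweets]
print(anonymised)  # [{'text': 'Feeling feverish today', 'created_at': '2014-04-01'}]
```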

How open data can help shape the way we analyse electoral behaviour


Harvey Lewis (Deloitte), Ulrich Atz, Gianfranco Cecconi, Tom Heath (ODI) in The Guardian: “Even after the local council elections in England and Northern Ireland on 22 May, which coincided with polling for the European Parliament, the next 12 months remain a busy time for the democratic process in the UK.
In September, the people of Scotland make their choice in a referendum on the future of the Union. Finally, the first fixed-term parliament in Westminster comes to an end with a general election in all areas of Great Britain and Northern Ireland in May 2015.
To ensure that as many people as possible are eligible and able to vote, the government is launching an ambitious programme of Individual Electoral Registration (IER) this summer. This will mean that the traditional, paper-based approach to household registration will shift to a tailored and largely digital process more in keeping with the data-driven demands of the twenty-first century.
Under IER, citizens will need to provide ‘identifying information’, such as date of birth or national insurance number, when applying to register.

Ballots: stuck in the past?

However, despite the government’s attempts through IER to improve the veracity of information captured prior to ballots being posted, little has changed in terms of the vision for capturing, distributing and analysing digital data from election day itself.

Indeed, paper is still the chosen medium for data collection.
Digitising elections is fraught with difficulty, though. In the US, for example, the introduction of new voting machines created much controversy even though they are capable of providing ‘near-perfect’ ballot data.
The UK’s democratic process is not completely blind, though. Numerous opinion surveys are conducted both before and after polling, including the long-running British Election Study, to understand the shifting attitudes of a representative cross-section of the electorate.
But if the government does not retain in sufficient geographic detail digital information on the number of people who vote, then how can it learn what is necessary to reverse the long-running decline in turnout?

The effects of lack of data

To add to the debate around democratic engagement, a joint research team, with data scientists from Deloitte and the Open Data Institute (ODI), has been attempting to understand what makes voters tick.
Our research has been hampered by a significant lack of relevant data describing voter behaviour at electoral ward level, as well as difficulties in matching what little data is available to other open data sources, such as demographic data from the 2011 Census.
Even though individual ballot papers are collected and verified for counting the number of votes per candidate – the primary aim of elections, after all – the only recent elections for which aggregate turnout statistics have been published at ward level are the 2012 local council elections in England and Wales. In these elections, approximately 3,000 of the more than 8,000 wards voted.
Data published by the Electoral Commission for the 2013 local council elections in England and Wales purports to be at ward level but is, in fact, for ‘county electoral divisions’, as explained by the Office for National Statistics.
Moreover, important factors related to the accessibility of polling stations – such as the distance from main population centres – could not be assessed because the location of polling stations remains the responsibility of individual local authorities – and only eight of these have so far published their data as open data.
Given these fundamental limitations, drawing any robust conclusions is difficult. Nevertheless, our research shows the potential for forecasting electoral turnout with relatively few census variables, the most significant of which are age and the size of the electorate in each ward.
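The article does not publish the underlying model, but a rough sketch of the kind of ward-level forecast it describes might look like the following. All the figures are invented purely for illustration, and an ordinary least-squares regression stands in for whatever method the Deloitte/ODI team actually used.

```python
# Rough sketch of a ward-level turnout forecast from a few census variables
# (median age, electorate size). All figures are invented for illustration;
# they are not data from the Deloitte/ODI study.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: median age, registered electorate per ward
X = np.array([
    [34.0, 5200],
    [41.5, 7800],
    [47.2, 6100],
    [52.8, 4300],
    [38.9, 9000],
])
y = np.array([0.28, 0.35, 0.41, 0.47, 0.31])  # observed turnout (fraction)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("predicted turnout (median age 45, electorate 6,500):",
      model.predict([[45.0, 6500]])[0])
```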

What role can open data play?

The limited results described above provide a tantalising glimpse into a possible future scenario: where open data provides a deeper and more granular understanding of electoral behaviour.
On the back of more sophisticated analyses, policies for improving democratic engagement – particularly among young people – have the potential to become focused and evidence-driven.
And, although the data captured on election day will always remain primarily for the purpose of electing the public’s preferred candidate, an important secondary consideration is aggregating and publishing data that can be used more widely.
This may have been prohibitively expensive or too complex in the past but as storage and processing costs continue to fall, and the appetite for such knowledge grows, there is a compelling business case.
The benefits of this future scenario potentially include:

  • tailoring awareness and marketing campaigns to wards and other segments of the electorate most likely to respond positively and subsequently turn out to vote
  • increasing the efficiency with which European, general and local elections are held in the UK
  • improving transparency around the electoral process and stimulating increased democratic engagement
  • enhancing links to the Government’s other significant data collection activities, including the Census.

Achieving these benefits requires a commitment to collecting and publishing electoral data systematically, at least at ward level. This would link work currently undertaken by the Electoral Commission, the ONS, Plymouth University’s Election Centre, the British Election Study and the more than 400 local authorities across the UK.”

How The Right People Analyzing The Best Data Are Transforming Government


NextGov: “Analytics is often touted as a new weapon in the technology arsenal of bleeding-edge organizations willing to spend lots of money to combat problems.
In reality, that’s not the case at all. Certainly, there are complex big data analytics tools that will analyze massive data sets to look for the proverbial needle in a haystack, but analytics 101 also includes smarter ways to look at existing data sets.
In this arena, government is making serious strides, according to Kathryn Stack, advisor for evidence-based innovation at the Office of Management and Budget. Speaking in Washington on Thursday at an analytics conference hosted by IBM, Stack provided an outline for agencies to spur innovation and improve mission by making smarter use of the data they already produce.
Interestingly, the first step has nothing to do with technology and everything to do with people. Get “the right people in the room,” Stack said, and make sure they value learning.
“One thing I have learned in my career is that if you really want transformative change, it’s important to bring the right players together across organizations – from your own department and different parts of government,” Stack said. “Too often, we lose a lot of money when siloed organizations lose sight of what the problem really is and spend a bunch of money, and at the end of the day we have invested in the wrong thing that doesn’t address the problem.”
The Department of Labor provides a great example of how to change a static organizational culture into one that integrates performance management, evaluation- and innovation-based processes. The department, she said, created a chief evaluation office and set up evaluation offices for each of its bureaus. These offices were tasked with focusing on important questions to improve performance, going inside programs to learn what is and isn’t working and identifying barriers that impeded experimentation and learning. At the same time, they helped develop partnerships across the agency – of major importance for any organization looking to make drastic changes.
Don’t overlook experimentation either, Stack said. Citing innovation leaders in the private sector such as Google, which runs 12,000 randomized experiments per year, Stack said agencies should not be afraid to get out and run with ideas. Not all of them will be good – only about 10 percent of Google’s experiments usher in new business changes – but even failures can bring meaningful value to the mission.
Stack used an experiment conducted by the United Kingdom’s Behavioral Insights Team as evidence.
The team continually tweaked the language of tax compliance letters sent to individuals delinquent on their taxes. Significant experimentation ushered in lots of data, and the team analyzed it to find that one phrase, “Nine out of ten Britons pay their taxes on time,” improved collected revenue by five percent. That case shows how failures can bring about important successes.
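As a rough illustration of the kind of analysis such a randomized trial invites, the sketch below compares payment rates under a control letter and the social-norm wording with a two-proportion z-test; the counts are invented, not the Behavioural Insights Team’s actual figures.

```python
# Sketch of analysing a randomised letter trial: did the "nine out of ten"
# wording raise the payment rate? Counts below are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

paid = [1150, 1010]   # taxpayers who paid after each letter [social norm, control]
sent = [5000, 5000]   # letters sent in each arm

z_stat, p_value = proportions_ztest(count=paid, nobs=sent, alternative="larger")
print(f"payment rates: {paid[0]/sent[0]:.1%} vs {paid[1]/sent[1]:.1%}")
print(f"z = {z_stat:.2f}, one-sided p = {p_value:.4f}")
```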
“If you want to succeed, you’ve got to be willing to fail and test things out,” Stack said.
Any successful analytics effort in government is going to employ the right people, the best data – Stack said it’s not a secret that the government collects both useful and not-so-useful, “crappy” data – as well as the right technology and processes, too. For instance, there are numerous ways to measure return on investment, including dollars per customer served or costs per successful outcome.
“What is the total investment you have to make in a certain strategy in order to get a successful outcome?” Stack said. “Think about cost per outcome and how you do those calculations.”…”

Linking Social, Open, and Enterprise Data


Paper by T Omitola, J Davies, A Duke, H Glaser, N Shadbolt in WIMS ’14 (Proceedings of the 4th International Conference on Web Intelligence, Mining and Semantics): “The new world of big data, of the LOD cloud, of the app economy, and of social media means that organisations no longer own, much less control, all the data they need to make the best informed business decisions. In this paper, we describe how we built a system using Linked Data principles to bring in data from Web 2.0 sites (LinkedIn, Salesforce), and other external business sites such as OpenCorporates, linking these together with pertinent internal British Telecommunications enterprise data into that enterprise data space. We describe the challenges faced during the implementation, which include sourcing the datasets, finding the appropriate “join points” from the individual datasets, as well as developing the client application used for data publication. We describe our solutions to these challenges and discuss the design decisions made. We conclude by drawing some general principles from this work.”
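The abstract describes the approach at an architectural level; the fragment below is only a guess at what one of the “join points” might look like in practice, using rdflib to assert owl:sameAs links between records from two sources that share a company registration number. The namespaces, identifiers and data are invented, not taken from the paper.

```python
# Hypothetical sketch of a Linked Data "join point": two datasets that both
# carry a company registration number are linked with owl:sameAs so that
# queries can traverse from one to the other. All URIs and data are invented.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL

CRM = Namespace("http://example.org/crm/")        # stand-in for internal enterprise data
OC = Namespace("http://example.org/opencorp/")    # stand-in for OpenCorporates data
REG = URIRef("http://example.org/vocab/companyNumber")

g = Graph()
# One internal record and one external record sharing a registration number
g.add((CRM.account_17, REG, Literal("01234567")))
g.add((OC.gb_01234567, REG, Literal("01234567")))

# The join point: link any two resources that share a registration number
records = list(g.triples((None, REG, None)))
for s1, _, n1 in records:
    for s2, _, n2 in records:
        if s1 != s2 and n1 == n2:
            g.add((s1, OWL.sameAs, s2))

print(g.serialize(format="turtle"))
```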

Politics or technology – which will save the world?


David Runciman in the Guardian: (Politics by David Runciman is due from Profile; it is the first in a series of “Ideas in Profile”.) “The most significant revolution of the 21st century so far is not political. It is the information technology revolution. Its transformative effects are everywhere. In many places, rapid technological change stands in stark contrast to the lack of political change. Take the United States. Its political system has hardly changed at all in the past 25 years. Even the moments of apparent transformation – such as the election of Obama in 2008 – have only reinforced how entrenched the established order is: once the excitement died away, Obama was left facing the same constrained political choices. American politics is stuck in a rut. But the lives of American citizens have been revolutionised over the same period. The birth of the web and the development of cheap and efficient devices through which to access it have completely altered the way people connect with each other. Networks of people with shared interests, tastes, concerns, fetishes, prejudices and fears have sprung up in limitless varieties. The information technology revolution has changed the way human beings befriend each other, how they meet, date, communicate, medicate, investigate, negotiate and decide who they want to be and what they want to do. Many aspects of our online world would be unrecognisable to someone who was transplanted here from any point in the 20th century. But the infighting and gridlock in Washington would be all too familiar.
This isn’t just an American story. China hasn’t changed much politically since 4 June 1989, when the massacre in Tiananmen Square snuffed out a would-be revolution and secured the current regime’s hold on power. But China itself has been totally altered since then. Economic growth is a large part of the difference. But so is the revolution in technology. A country of more than a billion people, nearly half of whom still live in the countryside, has been transformed by the mobile phone. There are currently over a billion phones in use in China. Ten years ago, fewer than one in 10 Chinese had access to one; today there is nearly one per person. Individuals whose horizons were until very recently constrained by physical geography – to live and die within a radius of a few miles from your birthplace was not unusual for Chinese peasants even into this century – now have access to the wider world. For the present, though maybe not for much longer, the spread of new technology has helped to stifle the call for greater political change. Who needs a political revolution when you’ve got a technological one?

Technology has the power to make politics seem obsolete. The speed of change leaves government looking slow, cumbersome, unwieldy and often irrelevant. It can also make political thinking look tame by comparison with the big ideas coming out of the tech industry. This doesn’t just apply to far‑out ideas about what will soon be technologically possible: intelligent robots, computer implants in the human brain, virtual reality that is indistinguishable from “real” reality (all things that Ray Kurzweil, co-founder of the Google-sponsored Singularity University, thinks are coming by 2030). In this post-ideological age some of the most exotic political visions are the ones that emerge from discussions about tech. You’ll find more radical libertarians and outright communists among computer scientists than among political scientists. Advances in computing have thrown up fresh ways to think about what it means to own something, what it means to share something and what it means to have a private life at all. These are among the basic questions of modern politics. However, the new answers rarely get expressed in political terms (with the exception of occasional debates about civil rights for robots). More often they are expressions of frustration with politics and sometimes of outright contempt for it. Technology isn’t seen as a way of doing politics better. It’s seen as a way of bypassing politics altogether.
In some circumstances, technology can and should bypass politics. The advent of widespread mobile phone ownership has allowed some of the world’s poorest citizens to wriggle free from the trap of failed government. In countries that lack basic infrastructure – an accessible transport network, a reliable legal system, a usable banking sector – phones enable people to create their own networks of ownership and exchange. In Africa, a grassroots, phone-based banking system has sprung up that for the first time permits money transfers without the physical exchange of cash. This makes it possible for the inhabitants of desperately poor and isolated rural areas to do business outside of their local communities. Technology caused this to happen; government didn’t. For many Africans, phones are an escape route from the constrained existence that bad politics has for so long mired them in.
But it would be a mistake to overstate what phones can do. They won’t rescue anyone from civil war. Africans can use their phones to tell the wider world of the horrors that are still taking place in some parts of the continent – in South Sudan, in Eritrea, in the Niger Delta, in the Central African Republic, in Somalia. Unfortunately the world does not often listen, and nor do the soldiers who are doing the killing. Phones have not changed the basic equation of political security: the people with the guns need a compelling reason not to use them. Technology by itself doesn’t give them that reason. Equally, technology by itself won’t provide the basic infrastructure whose lack it has provided a way around. If there are no functioning roads to get you to market, a phone is a godsend when you have something to sell. But in the long run, you still need the roads. In the end, only politics can rescue you from bad politics…”

A New Approach to Research


Clayton M. Christensen and Derek van Bever in Harvard Business Review: “In writing “The Capitalist’s Dilemma,” we asked students and alumni of our Harvard Business School course “Building and Sustaining a Successful Enterprise” to collaborate with us. Presented here is a map of that collaboration—how hundreds of contributors helped shape the seven core ideas in the article. The crowdsourcing of this article took place on the OI Engine platform (as used on OpenIDEO), which alumnus Tom Hulme helped develop, and was made possible through the leadership of the HBS Digital Initiative under the direction of Karim Lakhani and Matt Tucker. This effort represents the first attempt at creating a community of lifelong collaboration with HBS alumni.
The map illustrates how ideas build and flow, merge, and then diverge again over time. Diverse paths are taken to arrive at the final ideas in the article. It also shows how metrics we might presume are meaningful—comments on a post, for example—don’t always correlate with actual influence. We felt the approach we were taking to writing the article was different and disruptive. This visualization confirmed that for us, and helped us learn about crowdsourcing ideas, too.”

How Big Data Could Undo Our Civil-Rights Laws


Virginia Eubanks in the American Prospect: “From “reverse redlining” to selling out a pregnant teenager to her parents, the advance of technology could render obsolete our landmark civil-rights and anti-discrimination laws.
Big Data will eradicate extreme world poverty by 2028, according to Bono, front man for the band U2. But it also allows unscrupulous marketers and financial institutions to prey on the poor. Big Data, collected from the neonatal monitors of premature babies, can detect subtle warning signs of infection, allowing doctors to intervene earlier and save lives. But it can also help a big-box store identify a pregnant teenager—and carelessly inform her parents by sending coupons for baby items to her home. News-mining algorithms might have been able to predict the Arab Spring. But Big Data was certainly used to spy on American Muslims when the New York City Police Department collected license plate numbers of cars parked near mosques, and aimed surveillance cameras at Arab-American community and religious institutions.
Until recently, debate about the role of metadata and algorithms in American politics focused narrowly on consumer privacy protections and Edward Snowden’s revelations about the National Security Agency (NSA). That Big Data might have disproportionate impacts on the poor, women, or racial and religious minorities was rarely raised. But, as Wade Henderson, president and CEO of the Leadership Conference on Civil and Human Rights, and Rashad Robinson, executive director of ColorOfChange, a civil rights organization that seeks to empower black Americans and their allies, point out in a commentary at TPM Cafe, while big data can change business and government for the better, “it is also supercharging the potential for discrimination.”
In his January 17 speech on signals intelligence, President Barack Obama acknowledged as much, seeking to strike a balance between defending “legitimate” intelligence gathering on American citizens and admitting that our country has a history of spying on dissidents and activists, including, famously, Dr. Martin Luther King, Jr. If this balance seems precarious, it’s because the links between historical surveillance of social movements and today’s uses of Big Data are not lost on the new generation of activists.
“Surveillance, big data and privacy have a historical legacy,” says Amalia Deloney, policy director at the Center for Media Justice, an Oakland-based organization dedicated to strengthening the communication effectiveness of grassroots racial justice groups. “In the early 1960s, in-depth, comprehensive, orchestrated, purposeful spying was used to disrupt political movements in communities of color—the Yellow Peril, the American Indian Movement, the Brown Berets, or the Black Panthers—to create fear and chaos, and to spread bias and stereotypes.”
In the era of Big Data, the danger of reviving that legacy is real, especially as metadata collection renders legal protection of civil rights and liberties less enforceable….
Big Data and surveillance are unevenly distributed. In response, a coalition of 14 progressive organizations, including the ACLU, ColorOfChange, the Leadership Conference on Civil and Human Rights, the NAACP, National Council of La Raza, and the NOW Foundation, recently released five “Civil Rights Principles for the Era of Big Data.” In their statement, they demand:

  • An end to high-tech profiling;
  • Fairness in automated decisions;
  • The preservation of constitutional principles;
  • Individual control of personal information; and
  • Protection of people from inaccurate data.

This historic coalition aims to start a national conversation about the role of big data in social and political inequality. “We’re beginning to ask the right questions,” says O’Neill. “It’s not just about what can we do with this data. How are communities of color impacted? How are women within those communities impacted? We need to fold these concerns into the national conversation.”

Three projects meet the European Job Challenge and receive the Social Innovation Prize


EU Press Release: “Social innovation can be a tool to create new or better jobs, while giving an answer to pressing challenges faced by Europe. Today, Michel Barnier, European Commissioner, has awarded three European Social Innovation prizes to ground-breaking ideas to create new types of work and address social needs. The winning projects aim to help disadvantaged women by employing them to create affordable and limited fashion collections, create jobs in the sector of urban farming, and convert abandoned social housing into learning spaces and entrepreneurship labs.

After the success of the first edition in 2013, the European Commission launched a second round of the Social Innovation Competition in memory of Diogo Vasconcelos. Its main goal was to invite Europeans to propose new solutions to answer The Job Challenge. The Commission received 1,254 ideas, of which three were awarded a prize of €30,000 each.

Commissioner Michel Barnier said: “We believe that the winning projects can take advantage of unmet social needs and create sustainable jobs. I want these projects to be scaled up and replicated and inspire more social innovations in Europe. We need to tap into this potential to bring innovative solutions to the needs of our citizens and create new types of work.”

More information on the Competition page

More jobs for Europe – three outstanding ideas

The following new and exceptional ideas are the winners of the second edition of the European Social Innovation Competition:

  • ‘From waste to wow! QUID project’ (Italy): the fashion business demands perfection, and slightly damaged textiles cannot be used for top brands. The project intends to recycle this first-quality waste into limited collections and thereby provide jobs to disadvantaged women. This is about creating highly marketable products and social value through recycling.

  • ‘Urban Farm Lease’ (Belgium): urban agriculture could provide 6,000 direct jobs in Brussels, and an additional 1,500 jobs considering indirect employment (distribution, waste management, training or events). The project aims to provide training, connections and consultancy so that unemployed people can take advantage of the large surfaces available for agriculture in the city (e.g. 908 hectares of land or 394 hectares of suitable flat roofs).

  • ‘Voidstarter’ (Ireland): all major cities in Europe have “voids”, units of social housing which are empty because city councils have insufficient budgets to make them into viable homes. At the same time, these cities also face pressure on social housing provision and homelessness. Voidstarter will provide unemployed people with learning opportunities alongside skilled tradespersons in the refurbishing of the voids.”

The rise of open data driven businesses in emerging markets


Alla Morrison at the World Bank blog:

Key findings —

  • Many new data companies have emerged around the world in the last few years. Of these companies, the majority use some form of government data.
  • There are a large number of data companies in sectors with high social impact and tremendous development opportunities.
  • An actionable pipeline of data-driven companies exists in Latin America and in Asia. The most desired type of financing is equity, followed by quasi-equity, in amounts ranging from $100,000 to $5 million, with averages of between $2 million and $3 million depending on the region. The total estimated need for financing may exceed $400 million.

“The economic value of open data is no longer a hypothesis
How can one make money with open data, which is akin to air – free and open to everyone? Should the World Bank Group act as a catalyst for a sector that is just emerging? And if so, what set of interventions would be the most effective? Can promoting open data-driven businesses contribute to the World Bank Group’s twin goals of fighting poverty and boosting shared prosperity?
These questions have been top of mind since the World Bank Open Finances team convened a group of open data entrepreneurs from across Latin America to share their business models, success stories and challenges at the Open Data Business Models workshop in Uruguay in June 2013. We were in Uruguay to find out whether open data could lead to the creation of sustainable new businesses and jobs. To do so, we tested a couple of hypotheses: open data has economic value, beyond the benefits of increased transparency and accountability; and open data companies with sustainable business models already exist in emerging economies.
Encouraged by our findings in Uruguay we set out to further explore the economic development potential of open data, with a focus on:

  • Contribution of open data to countries’ GDP;
  • Innovative solutions to tackle social problems in key sectors like agriculture, health, education, transportation, climate change and financial services, especially those benefiting low-income populations;
  • Economic benefits of governments’ buy-in to the commercial value of open data and the resulting release of new datasets, which in turn would lead to increased transparency in public resource management (reductions in misallocations, a more level playing field in procurement) and better service delivery; and
  • Creation of data-related private sector jobs, especially suited for the tech savvy young generation.

We proposed a joint IFC/World Bank approach (From open data to development impact – the crucial role of private sector) that envisages providing financing to data-driven companies through a dedicated investment fund, as well as loans and grants to governments to create a favorable enabling environment. The concept was received enthusiastically for the most part by a wide group of peers at the Bank, the IFC, as well as NGOs, foundations, DFIs and private sector investors.
Thanks also in part to a McKinsey report last fall stating that open data could help unlock more than $3 trillion in value every year, the potential value of open data is now better understood. The acquisition of Climate Corporation (whose business model holds enormous potential for agriculture and food security, if governments open up the right data) for close to a billion dollars last November and the findings of the Open Data 500 project led by the GovLab at NYU further substantiated the hypothesis. These days no one asks whether open data has economic value; the focus has shifted to finding ways for companies, both startups and large corporations, and governments to unlock it. The first question, though, is – is it still too early to plan a significant intervention to spur open data driven economic growth in emerging markets?”

Learning from The Wealth of the Commons


Paper by Mae Shaw in a special issue of the Community Development Journal on “Commons Sense: New thinking about an old idea”: “‘We are poised between an old world that no longer works and a new one struggling to be born. Surrounded by centralized hierarchies on the one hand and predatory markets on the other, people around the world are searching for alternatives’.

This is the starting point for what David Bollier and Silke Helfrich, the editors of The Wealth of the Commons: A World Beyond Market and State (2012), describe as ‘an extended global exercise in commoning’ – Peter Linebaugh’s term for ‘the self-determination of commoners in managing their shared resources’ (p. 396). In other words, the book itself is offered as an active process of ‘making the path’ by presenting ‘some of the most promising new paths now being developed’. It is intended to be ‘rigorous enough for academic readers yet accessible enough for the layperson’. In this, it more than achieves its ambitions. The Wealth of the Commons is an edited collection of seventy-three short papers from thirty countries: ‘a collective venture of sharing, collaboration, negotiation and creative production among some of the most diverse commons scholars, activists and projects leaders imaginable’. This rich and diverse source of knowledge and inspiration could be described as ‘polyvocal’ in the sense that it presents a multiplicity of voices improvising around a single theme – sometimes in harmony, sometimes discordant, but always interesting.

The book brings together an impressive collection of contributors from different places, backgrounds and interests to explore the meaning of the commons and to advocate for it ‘as a new paradigm’ for the organization of public and private life. In this sense, it represents a project rather than an analysis: essentially espousing a cause with imperative urgency. This is not necessarily a weakness, but it does raise specific questions about what is included and what is absent or marginalized in this particular selection of accounts, and what might be lost along the way. What counts as ‘commons’ or ‘the commons’ or ‘the common’ (all used in the text) is a subject of discussion and contestation here, as elsewhere. The effort to ‘name and claim’ is an integral aspect of the project. As Jeffrey et al. (2012, p. 10) comment, ‘the struggle for the commons has never been without its own politics of separation and division’, raising valid questions about the prospects for a coherent paradigm at this stage. At the very least, however, this rich resource may prove seminal in countering those dominant paradigms of growth and development in which structural and cultural adjustments ‘serve as a justifying rhetoric for continuity in plunder’ of common resources (Mattei, p. 41).

The contributions fall into three general categories: those offering a critique of existing ‘increasingly dysfunctional’ market/state relations; those that ‘enlarge theoretical understandings of the commons as a way to change the world’; and those that ‘describe innovative working projects which demonstrate the feasibility’ of the commons.

What counts as the commons?

As acknowledged in many of the chapters, defining the commons in any consistent and convincing way can be deeply problematic. Like ‘community’ itself, it can be regarded to some degree as an ideological portmanteau which contains a variety of meanings. Nonetheless, there is a general commitment to confront such difficulties in an open way, and to be as clear as possible about what the commons might represent, what it might replace, and what it should not be confused with. Put most simply, the commons refers to what human beings share in nature and society that should be cherished for all now and for the future: ‘the term … provides the binding element between the natural and the social or cultural worlds’ (Weber, p. 11). Its profound challenge to the logic of competitive capitalist relations, therefore, is to ‘validate new schemes of human relations, production and governance … commonance’ (Bollier and Helfrich, p. xiv) that penetrate all levels of public and private life. This idea is explored in detail in many of the contributions.

The commons, then, claims to represent a philosophical stance, an intellectual framework, a moral and economic imperative, a set of organizing principles and commitments, a movement, and an emerging ‘global community of practice’ (O’Connell, 2012). It has also developed an increasingly shared discourse, which is designed to unsettle institutionalized norms and values and to reclaim or remake the language of co-operation, fairness and social justice. As the editorial points out, the language of capitalism is one that becomes ‘encoded into the epistemology of our language and internalized by people’. In community development, and elsewhere, we have become sensitized to the way in which progressive language can be appropriated to support individualistic market values. When empowerment can mean facilitated asset-stripping of local communities, and solidarity targets can be set by government (e.g. Scottish Government, 2007), then we must be wary about assuming proprietorial closure on the term ‘commons’ itself.

As Federici, in a particularly persuasive chapter, warns: ‘… capital is learning about the virtues of the common good’ (p. 46). She argues that, ‘since at least the 1990s, the language of the commons has been appropriated … by the World Bank and put at the service of privatization’. For this reason, it is important to think of the commons as a ‘quality of relations, a principle of co-operation and of responsibility to each other and to the earth, the forests, the seas, the animals’ (p. 50). This produces a different operational logic, which is explored in depth across the collection.

Deficiencies in the commons framework

To advance the commons as ‘a new paradigm’, it is necessary to locate it historically and to show the ways in which it has been colonized and compromised, as some of these pieces do. It may seem ironic that the meaning of ‘the commons’ to many people in the UK, for example, is that bear pit of parliamentary business, the House of Commons, in which adversarial rather than consensual politics is the order of the day. Reclaiming such foundational ideas is a lengthy and demanding process, as David Graeber shows in The Democracy Project, his recent account of the Occupy Movement, which for a time commanded considerable international interest. Drawing on Linebaugh, Federici contends that ‘commons have been the thread that has connected the history of the class struggle into our time’.

It is unfortunate, therefore, that the volume fails to address the relationship between organized labour and the commons, as highlighted in the introduction, because there is a distinctive contribution to be made here. As Harvey (2012) argues, decentralization and autonomy are also primary vehicles for reinforcing neoliberal class strategies of social reproduction and producing greater inequality. For example, in urban environments in particular, ‘the better the common qualities a social group creates, the more likely it is to be raided and appropriated by private profit-maximising interests’ leading inexorably to economic cleansing of whole areas. Gentrification and tourism are the clearest examples. The salience of class in general is an underdeveloped line of argument. If this authoritative collection is anything to go by, this may be a significant deficiency in the commons framework.

Without historical continuity – honouring the contribution of those ‘commoners’ who came before in various guises and places – there is a danger of falling into the contemporary trap of regarding ‘innovation’ as a way of separating us from our past. History in the past as well as in the making is as essential a part of our commons as is the present and the future – material, temporal and spiritual….”