New frontiers in social innovation research


Geoff Mulgan: “Nesta has published a new book with Palgrave which contains an introduction by me and many important chapters from leading academics around the world. I hope that many people will read it, and think about it, because it challenges, in a highly constructive way, many of the rather tired assumptions of the London media/political elite of both left and right.

The essay is by Roberto Mangabeira Unger, perhaps the world’s most creative and important contemporary intellectual. He is Professor of Law at Harvard (where he taught Obama); a philosopher and political theorist; author of one of the most interesting recent books on religion; co-author of an equally ground-breaking recent book on theoretical physics; and serves as strategy minister in the Brazilian government.

His argument is that a radically different way of thinking about politics, government and social change is emerging, which has either not been noticed by many political leaders, or misinterpreted. The essence of the argument is that practice is moving faster than theory; that systematic experimentation is a faster way to solve problems than clever authorship of pamphlets, white papers and plans; and that societies have the potential to be far more active agents of their own future than we assume.

The argument has implications for many fields. One is think-tanks. Twenty years ago I set up a think-tank, Demos. At that time the dominant model for policy making was to bring together some clever people in a capital city to write pamphlets, white papers and then laws. In the 1950s to 1970s a primary role was played by professors in universities, or royal commissions. Then it shifted to think-tanks. Sometimes teams within governments played a similar role – and I oversaw several of these, including the Strategy Unit in government. All saw policy as an essentially paper-based process, involving a linear transmission from abstract theories and analyses to practical implementation.

There’s still an important role to be played by think-tanks. But an opposite approach has now become common, and is promoted by Unger. In this approach, practice precedes theory. Experiment in the real world drives the development of new ideas – in business, civil society, and on the edges of the public sector. Learning by doing complements, and often leads, analysis. The role of academics and think-tanks shifts from inventing ideas to making sense of what’s emerging, and generalising it. Policies don’t try to specify every detail but rather set out broad directions and then enable a process of experiment and discovery.

As Unger shows, this approach has profound philosophical roots (reaching back to the 19th century pragmatists and beyond), and profound political implications (it’s almost opposite to the classic Marxist view, later adopted by the neoliberal right, in which intellectuals define solutions in theory which are then translated into practice). It also has profound implications for civil society – which he argues should adopt a maximalist rather than a minimalist view of social innovation.

The Unger approach doesn’t work for everything – for example, constitutional reform. But it is a superior method for improving most of the fields where governments have power – from welfare and health to education and economic policy – and it has worked well for Nesta: evolving new models of healthcare, working with dozens of governments to redesign business policy, and testing out new approaches to education.

The several hundred public sector labs and innovation teams around the world – from Chile to China, South Africa to Denmark – share this ethos too, as do many political leaders. Michael Bloomberg has been an exemplar, confident enough to innovate and experiment constantly in his time as New York Mayor. Won Soon Park in Korea is another…

Unger’s chapter should be required reading for anyone aspiring to play a role in 21st century politics. You don’t have to agree with what he says. But you do need to work out where you disagree and why….” (New Frontiers in Social Innovation Research)

The Problem-Solving Process That Prevents Groupthink


Art Markman at Harvard Business Review: “There are two reasons most of us aren’t very good at creative problem solving. First, few people get training in how to be creative in their education. Second, few people understand group dynamics well enough to harness their power to help groups maximize their creativity.

Resolving the first issue requires getting your employees to learn more about the way they think… a tall order for managers. The second issue, though, is well within your ability to change.

A key element of creativity is bringing existing knowledge to bear on a new problem or goal. The more people who can engage with that problem or goal, the more knowledge that is available to work on it. Unfortunately, quite a bit of research demonstrates that the traditional brainstorming methods first described by Alex Osborn in the 1950s fail. When groups simply get together and start throwing out ideas, they actually come up with fewer ideas overall and fewer novel, actionable ideas than the individuals in that group would have come up with had they worked alone.

To fix this problem, it is important to think about the two phases of group problem-solving: divergence and convergence.

Divergence happens when the group considers as many different potential solutions as possible. For example, a common test of creativity is the “alternative uses” test. People are asked questions like, “How many different uses can you find for a brick?” This test requires strategies for considering as many distinct solutions as possible.

Convergence happens when the variety of proposed solutions is evaluated. In this phase, a large number of ideas are whittled down to a smaller set of candidate solutions to the current problem.

The core principle of group creativity is that individuals working alone diverge, while group members working together converge. In group settings, as soon as one person states a potential solution to everyone else, that influences the memory of every person in the group in ways that make everyone think about the problem more similarly. That is why groups working together diverge less than individuals working alone.

To fix group idea generation, then, be aware of when you are trying to diverge and when you are trying to converge. For example, early in the process of problem-solving, think carefully about the problem itself. Have your group members work alone to craft statements describing the problem. Then, get them back together to discuss their descriptions. The individuals are likely to come up with a variety of distinct problem statements. The group discussion will lead everyone to accept one or a small number of variants of these statements to work on – this is healthy convergence.

When you start to generate solutions, you again want divergence. Again, have people work alone to start. Then collect people’s initial ideas and send them around to other group members and allow the divergence to continue as group members individually build on the ideas of their colleagues. Because people are still working alone, the way they build on other people’s ideas is still going to be different from how other group members are building on those ideas.

After this process, you can give the resulting ideas to everyone and then let the group get together to discuss them. This discussion will gradually lead the group to converge on a small number of candidate solutions….(More)”

The ‘data revolution’ will be open


Martin Tisne at Devex: “There is a huge amount of talk about a “data revolution.” The phrase emerged in the years preceding this September’s announcement of the Sustainable Development Goals, and has recently been strongly reaffirmed by the launch of a Global Partnership on Sustainable Development Data.

The case for the importance of data in measuring, assessing and verifying the new SDGs has been powerfully made, and usually includes a mention of the data needing to be “open.” However, the role of “open” has not been clearly articulated. Fundamentally, the discussion focuses on the role of data (statistics, for example) in decision-making, and not on the benefits of that data being open to the public. Until this case is made, difficult decisions to make data open will go by the wayside.

Much of the debate justly focuses on why data matters for decision-making. Knowing how many boys and girls are in primary and secondary schools, how good their education is, and the number of teachers in their schools, are examples of relevant data used in shaping education delivery, and perhaps policy. Likewise, new satellite and cellphone data can help us prevent and understand the causes of death by HIV and AIDS, tuberculosis, and malaria.

Proponents of the data revolution make powerful points, such as that 1 in 3 births go unregistered. If you are uncounted, you will be ignored. If you don’t have an identity, you do not exist.

Yet as important as this information is, I still can’t help but think: Do we change the course of history with the mere existence of more data or because people access it, mobilize and press for change?

We need an equally eloquent narrative for why open data matters and what it means.

To my thinking, we need the data to be open because we need to hold governments accountable for their promises under the SDGs, in order to incentivize action. The data needs to be available, accessible and comparable to enable journalists and civil society to prod, push and test the validity of these promises. After all, what good are the goals if governments do not deliver, beginning with the funding to implement? We will need to know what financial resources, both public and private, will be put to work and what budget allocations governments will make in their draft budgets. We need to have those debates in the open, not in smoke-filled rooms.

Second, the data needs to be open in order to be verified, quality-checked and improved. …(More)”

Creating Value through Open Data


Press Release: “Capgemini Consulting, the global strategy and transformation consulting arm of the Capgemini Group, today published two new reports on the state of play of Open Data in Europe, to mark the launch of the European Data Portal. The first report addresses “Open Data Maturity in Europe 2015: Insights into the European state of play” and the second focuses on “Creating Value through Open Data: Study on the Impact of Re-use of Public Data Resources.” The countries covered by these assessments include the EU28 countries plus Iceland, Liechtenstein, Norway, and Switzerland – commonly referred to as the EU28+ countries. The reports were requested by the European Commission within the framework of the Connecting Europe Facility program, supporting the deployment of European Open Data infrastructure.

Open Data refers to information collected, produced or paid for by public bodies that can be freely used, modified and shared by anyone. For the period 2016-2020, the direct market size for Open Data is estimated at EUR 325 billion for Europe. Capgemini’s study “Creating Value through Open Data” illustrates how Open Data can create economic value in multiple ways, from increased market transactions and job creation from producing services and products based on Open Data, to cost savings and efficiency gains. For instance, effective use of Open Data could help save 629 million hours of unnecessary waiting time on the roads in the EU, and help reduce energy consumption by 16%. The accumulated cost savings for public administrations making use of Open Data across the EU28+ in 2020 are predicted to equal EUR 1.7 billion. Reaping these benefits requires reaching a high level of Open Data maturity.

In order to address the accessibility and the value of Open Data across European countries, the European Union has launched the Beta version of the European Data Portal. The Portal addresses the whole Data Value Chain, from data publishing to data re-use. Over 240,000 data sets from 34 European countries are referenced on the Portal. It offers seamless access to public data across Europe, with 13 content categories used to organise the data, ranging from health and education to transport, science and justice. Anyone – citizens, businesses, journalists or administrations – can search, access and re-use the full data collection. A wide range of data is available, from crime records in Helsinki, labor mobility in the Netherlands and forestry maps in France to the impact of digitization in Poland…

The study, “Open Data Maturity in Europe 2015: Insights into the European state of play”, uses two key indicators: Open Data Readiness and Portal Maturity. These indicators cover both the maturity of national policies supporting Open Data and an assessment of the features made available on national data portals. The study shows that the EU28+ have completed just 44% of the journey towards achieving full Open Data Maturity, and that there are large discrepancies across countries. A third of European countries (32%) are globally recognized as leading the way, with solid policies, licensing norms, good portal traffic and many local initiatives and events to promote Open Data and its re-use….(More)”

Beyond Distrust: How Americans View Their Government


Pew Research Center: “A year ahead of the presidential election, the American public is deeply cynical about government, politics and the nation’s elected leaders in a way that has become quite familiar.

Currently, just 19% say they can trust the government always or most of the time, among the lowest levels in the past half-century. Only 20% would describe government programs as being well-run. And elected officials are held in such low regard that 55% of the public says “ordinary Americans” would do a better job of solving national problems.

Yet at the same time, most Americans have a lengthy to-do list for this object of their frustration: Majorities want the federal government to have a major role in addressing issues ranging from terrorism and disaster response to education and the environment.

And most Americans like the way the federal government handles many of these same issues, though they are broadly critical of its handling of others – especially poverty and immigration.

A new national survey by Pew Research Center, based on more than 6,000 interviews conducted between August 27 and October 4, 2015, finds that public attitudes about government and politics defy easy categorization. The study builds upon previous reports about the government’s role and performance in 2010 and 1998. This report was made possible by The Pew Charitable Trusts, which received support for the survey from The William and Flora Hewlett Foundation.

The partisan divide over the size and scope of government remains as wide as ever: Support for smaller government endures as a Republican touchstone. Fully 80% of Republicans and Republican-leaning independents say they prefer a smaller government with fewer services, compared with just 31% of Democrats and Democratic leaners.

Yet both Republicans and Democrats favor significant government involvement on an array of specific issues. Among the public overall, majorities say the federal government should have a major role in dealing with 12 of 13 issues included in the survey, all except advancing space exploration.

There is bipartisan agreement that the federal government should play a major role in dealing with terrorism, natural disasters, food and medicine safety, and roads and infrastructure. And while the presidential campaign has exposed sharp partisan divisions over immigration policy, large majorities of both Republicans (85%) and Democrats (80%) say the government should have a major role in managing the immigration system.

But the partisan differences over government’s appropriate role are revealing – with the widest gaps on several issues relating to the social safety net….(More)”

Open government data: Out of the box


The Economist on “The open-data revolution has not lived up to expectations. But it is only getting started…

The app that helped save Mr Rich’s leg is one of many that incorporate government data—in this case, supplied by four health agencies. Six years ago America became the first country to make all data collected by its government “open by default”, except for personal information and that related to national security. Almost 200,000 datasets from 170 outfits have been posted on the data.gov website. Nearly 70 other countries have also made their data available: mostly rich, well-governed ones, but also a few that are not, such as India (see chart). The Open Knowledge Foundation, a London-based group, reckons that over 1m datasets have been published on open-data portals using its CKAN software, developed in 2010.
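As a rough illustration of how these portals are typically queried, here is a minimal sketch against the standard CKAN Action API (the `package_search` endpoint). It assumes catalog.data.gov exposes that API at the usual path; other portals may sit at different base URLs, and the fields printed below are only a small subset of what CKAN returns.

```python
import json
import urllib.parse
import urllib.request

# Assumed CKAN-backed portal; swap in any other CKAN instance's base URL.
BASE_URL = "https://catalog.data.gov/api/3/action/package_search"

def search_datasets(query: str, rows: int = 5) -> list[dict]:
    """Return basic metadata for the first `rows` datasets matching `query`."""
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    with urllib.request.urlopen(f"{BASE_URL}?{params}") as response:
        payload = json.load(response)
    # A CKAN Action API response wraps its data in a "result" object
    # containing a "results" list of dataset records.
    return [
        {"name": ds.get("name"), "title": ds.get("title"),
         "organization": (ds.get("organization") or {}).get("title")}
        for ds in payload["result"]["results"]
    ]

if __name__ == "__main__":
    for dataset in search_datasets("health"):
        print(dataset["title"], "-", dataset["organization"])
```

Because so many of the portals mentioned run the same CKAN software, the same call shape tends to work across them, which is part of what makes open-data re-use practical.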

The War on Campus Sexual Assault Goes Digital


As the problem of sexual assault on college campuses has become a hot-button issue for school administrators and federal education regulators, one question keeps coming up: Why don’t more students report attacks?

According to a recent study of 27 schools, about one-quarter of female undergraduates and students who identified as queer or transgender said they had experienced nonconsensual sex or touching since entering college, but most of the students said they did not report it to school officials or support services.

Some felt the incidents weren’t serious enough. Others said they did not think anyone would believe them or they feared negative social consequences. Some felt it would be too emotionally difficult.

Now, in an effort to give students additional options — and to provide schools with more concrete data — a nonprofit software start-up in San Francisco called Sexual Health Innovations has developed an online reporting system for campus sexual violence.

Students at participating colleges can use its site, called Callisto, to record details of an assault anonymously. The site saves and time-stamps those records. That allows students to decide later whether they want to formally file reports with their schools — identifying themselves by their school-issued email addresses — or download their information and take it directly to the police. The site also offers a matching system in which a user can elect to file a report with the school electronically only if someone else names the same assailant.
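The matching option is essentially an information escrow: each record is saved and time-stamped, and a “match only” report is released to the school only once a second report names the same assailant. The sketch below is not Callisto’s actual implementation – it is a hypothetical, simplified illustration of that escrow logic (a real system would encrypt records, verify identities and involve trained staff).

```python
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Report:
    reporter_email: str          # school-issued address of the person reporting
    assailant_identifier: str    # e.g. a name or profile URL as entered by the reporter
    details: str
    match_only: bool             # True = release only if someone else names the same person
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class EscrowStore:
    """Toy information escrow: hold 'match only' reports until two name the same assailant."""

    def __init__(self):
        self._reports: list[Report] = []

    @staticmethod
    def _key(identifier: str) -> str:
        # Hash the identifier so raw names are not compared directly (illustrative only).
        return hashlib.sha256(identifier.strip().lower().encode()).hexdigest()

    def submit(self, report: Report) -> list[Report]:
        """Store a time-stamped report; return any reports now eligible for release."""
        self._reports.append(report)
        if not report.match_only:
            return [report]  # direct filings are released immediately
        key = self._key(report.assailant_identifier)
        matches = [r for r in self._reports
                   if r.match_only and self._key(r.assailant_identifier) == key]
        # Release the whole group only once at least two reporters name the same person.
        return matches if len(matches) >= 2 else []
```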

Callisto’s hypothesis is that some college students — who already socialize, study and shop online — will be more likely initially to document a sexual assault on a third-party site than to report it to school officials on the phone or in person.

“If you have to walk into a building to report, you can only go at certain times of day and you’re not certain who you have to talk to, how many people you have to talk to, what they will ask,” Jessica Ladd, the nonprofit’s founder and chief executive, said in a recent interview in New York. “Whereas online, you can fill out a form at any time of day or night from anywhere and push a button.”

Callisto is part of a wave of apps and sites that tackle different facets of the sexual assault problem on campus. Some colleges and universities have introduced third-party mobile apps that enable students to see maps of local crime hot spots, report suspicious activity, request a ride from campus security services or allow their friends to track their movements virtually as they walk home. Many schools now ask students to participate in online or in-person training programs that present different situations involving sexual assault, relationship violence and issues of consent…(More)”

Open Data as Open Educational Resources: Case studies of emerging practice


Book edited by Javiera Atenas and Leo Havemann: “…is the outcome of a collective effort that has its origins in the 5th Open Knowledge Open Education Working Group call, in which the idea of using Open Data in schools was mentioned. It occurred to us that Open Data and open educational resources seemed almost to exist in separate open worlds.

We decided to seek out evidence of the use of open data as OER, initially by conducting a bibliographical search. As we could not find published evidence, we decided to ask educators if they were, in fact, using open data in this way, and wrote a post for this blog (with Ernesto Priego) explaining our perspective, called The 21st Century’s Raw Material: Using Open Data as Open Educational Resources. We ended the post with a link to an exploratory survey, the results of which indicated a need for more awareness of the existence and potential value of Open Data amongst educators…

…the case studies themselves. They have been provided by scholars and practitioners from different disciplines and countries, and they reflect different approaches to the use of open data. The first case study presents an approach to educating both teachers and students in the use of open data for civil monitoring via Scuola di OpenCoesione in Italy, and has been written by Chiara Ciociola and Luigi Reggi. The second case, by Tim Coughlan from the Open University, UK, showcases practical applications in the use of local and contextualised open data for the development of apps. The third case, written by Katie Shamash, Juan Pablo Alperin & Alessandra Bordini from Simon Fraser University, Canada, demonstrates how publishing students can engage, through data analysis, in very current debates around scholarly communications and be encouraged to publish their own findings. The fourth case by Alan Dix from Talis and University of Birmingham, UK, and Geoffrey Ellis from University of Konstanz, Germany, is unique because the data discussed in this case is self-produced, indeed ‘quantified self’ data, which was used with students as material for class discussion and, separately, as source data for another student’s dissertation project. Finally, the fifth case, presented by Virginia Power from University of the West of England, UK, examines strategies to develop data and statistical literacies in future librarians and knowledge managers, aiming to support and extend their theoretical understanding of the concept of the ‘knowledge society’ through the use of Open Data….(More)

The book can be downloaded here: Open Data as Open Educational Resources

Predictive policing is ‘technological racism’


Shaun King at the New York Daily News: “The future is here.

For years now, the NYPD, the Miami PD, and many police departments around the country have been using new technology that claims it can predict where crime will happen and where police should focus their energies. They call it predictive policing. Months ago, I raised several red flags about such software because it does not appear to properly account for the presence of racism or racial profiling in how it predicts where crimes will be committed.

See, these systems claim to predict where crimes will happen based on prior arrest data. What they don’t account for is the widespread reality that race and racial profiling have everything to do with who is arrested and where they are arrested. For instance, study after study has shown that white people actually are more likely to sell drugs and do drugs than black people, but are exponentially less likely to be arrested for either crime. But, and this is where these systems fail, if the only data being entered into these systems is based not on the more complex reality of who sells and purchases drugs, but on a racial stereotype, then the system will only perpetuate the racism that preceded it…

In essence, it’s not predicting who will sell drugs and where they will sell it, as much as it is actually predicting where a certain race of people may sell or purchase drugs. It’s technological racism at its finest.
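The feedback loop King is describing can be made concrete with a toy simulation – a sketch under invented assumptions, not a model of any real product. Two neighbourhoods have identical underlying offence rates, but patrols are allocated in proportion to past arrest counts; because an arrest can only be recorded where officers are sent, the initial skew reinforces itself.

```python
import random

random.seed(42)

# Two neighbourhoods with an identical true rate of offences per patrol visit.
TRUE_OFFENCE_RATE = {"A": 0.3, "B": 0.3}
# Historical arrest counts are skewed: neighbourhood A was patrolled more in the past.
arrests = {"A": 30, "B": 10}

def allocate_patrols(arrest_history: dict[str, int], total_patrols: int = 10) -> dict[str, int]:
    """'Predictive' allocation: send patrols in proportion to past arrests."""
    total = sum(arrest_history.values())
    return {area: round(total_patrols * count / total) for area, count in arrest_history.items()}

for round_number in range(1, 21):
    patrols = allocate_patrols(arrests)
    for area, n_patrols in patrols.items():
        # An arrest requires both an offence and an officer present to record it,
        # so the under-patrolled area under-reports no matter what actually happens.
        recorded = sum(random.random() < TRUE_OFFENCE_RATE[area] for _ in range(n_patrols))
        arrests[area] += recorded

print("Arrest totals after 20 rounds:", arrests)
# Despite identical true offence rates, area A ends with far more recorded arrests,
# and the allocation rule keeps sending patrols there - the bias is self-reinforcing.
```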

Now, in addition to predictive policing, the state of Pennsylvania is pioneering predictive prison sentencing. Through complex questionnaires and surveys completed not by inmates, but by prison staff members, inmates may be given a lower bail and shorter sentence, or a higher bail and lengthier prison sentence. The surveys focus on family background, economic background, prior crimes, education levels and more.

When all of the data is scored, the result classifies prisoners as low, medium or high risk. While this may sound benign, it isn’t. No prisoner should ever be given a harsh sentence or an outrageous bail amount because of their family background or economic status. Even these surveys lend themselves to being racist and putting black and brown women and men in positions where it’s nearly impossible to get a good score because of prevalent problems in communities of color….(More)”
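To make the critique concrete, here is a purely hypothetical sketch of how such an instrument works: weighted questionnaire answers are summed into a score and cut into low/medium/high bands. None of the items, weights or thresholds below come from Pennsylvania’s actual tool; the point is only that once family background and economic status carry weight, people with identical conduct can land in different bands.

```python
# Hypothetical risk-assessment sketch: item weights and thresholds are invented
# for illustration and do not reflect any real instrument.
QUESTIONNAIRE_WEIGHTS = {
    "prior_convictions": 3,           # count of prior offences
    "unemployed": 4,                  # 1 if unemployed at arrest, else 0
    "unstable_housing": 3,            # 1 if no fixed address, else 0
    "family_member_incarcerated": 2,  # 1 if yes, else 0
    "no_high_school_diploma": 2,      # 1 if yes, else 0
}

def risk_score(answers: dict[str, int]) -> int:
    """Sum weighted questionnaire answers into a single score."""
    return sum(QUESTIONNAIRE_WEIGHTS[item] * value for item, value in answers.items())

def risk_band(score: int) -> str:
    """Cut the score into the low / medium / high bands used in decisions."""
    if score <= 4:
        return "low"
    if score <= 9:
        return "medium"
    return "high"

# Two people with the same criminal history but different socioeconomic answers
# land in different bands - the score is driven by circumstances, not conduct.
person_a = {"prior_convictions": 1, "unemployed": 0, "unstable_housing": 0,
            "family_member_incarcerated": 0, "no_high_school_diploma": 0}
person_b = {"prior_convictions": 1, "unemployed": 1, "unstable_housing": 1,
            "family_member_incarcerated": 1, "no_high_school_diploma": 1}

print(risk_band(risk_score(person_a)))  # "low"
print(risk_band(risk_score(person_b)))  # "high"
```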

How big data and The Sims are helping us to build the cities of the future


The Next Web: “By 2050, the United Nations predicts that around 66 percent of the world’s population will be living in urban areas. It is expected that the greatest expansion will take place in developing regions such as Africa and Asia. Cities in these parts will be challenged to meet the needs of their residents, and provide sufficient housing, energy, waste disposal, healthcare, transportation, education and employment.

So, understanding how cities will grow – and how we can make them smarter and more sustainable along the way – is a high priority among researchers and governments the world over. We need to get to grips with the inner mechanisms of cities, if we’re to engineer them for the future. Fortunately, there are tools to help us do this. And even better, using them is a bit like playing SimCity….

Cities are complex systems. Increasingly, scientists studying cities have gone from thinking about “cities as machines” to approaching “cities as organisms”. Viewing cities as complex, adaptive organisms – similar to natural systems like termite mounds or slime mould colonies – allows us to gain unique insights into their inner workings. …So, if cities are like organisms, it follows that we should examine them from the bottom-up, and seek to understand how unexpected large-scale phenomena emerge from individual-level interactions. Specifically, we can simulate how the behaviour of individual “agents” – whether they are people, households, or organisations – affects the urban environment, using a set of techniques known as “agent-based modelling”.

…These days, increases in computing power and the proliferation of big data give agent-based modelling unprecedented power and scope. One of the most exciting developments is the potential to incorporate people’s thoughts and behaviours. In doing so, we can begin to model the impacts of people’s choices on present circumstances, and the future.

For example, we might want to know how changes to the road layout might affect crime rates in certain areas. By modelling the activities of individuals who might try to commit a crime, we can see how altering the urban environment influences how people move around the city, the types of houses that they become aware of, and consequently which places have the greatest risk of becoming the targets of burglary.
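A minimal agent-based sketch of that burglary example is below, with entirely hypothetical parameters (real urban models, for instance those built with frameworks such as Mesa or NetLogo, are far richer): offender agents walk a small street grid, each house’s “exposure” is how often agents pass it, and closing a few road segments changes which houses are most exposed.

```python
import random
from collections import Counter

random.seed(1)

GRID_SIZE = 10           # toy city: a 10 x 10 grid of street cells, one house per cell
N_OFFENDERS = 20
STEPS_PER_DAY = 50
DAYS = 30

def neighbours(cell, closed_roads):
    """Cells reachable in one step, skipping any road segments that have been closed."""
    x, y = cell
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [c for c in candidates
            if 0 <= c[0] < GRID_SIZE and 0 <= c[1] < GRID_SIZE
            and (cell, c) not in closed_roads and (c, cell) not in closed_roads]

def simulate(closed_roads=frozenset()):
    """Random-walk offenders; a house's exposure is how often offenders pass it."""
    exposure = Counter()
    for _ in range(DAYS):
        positions = [(random.randrange(GRID_SIZE), random.randrange(GRID_SIZE))
                     for _ in range(N_OFFENDERS)]
        for _ in range(STEPS_PER_DAY):
            for i, cell in enumerate(positions):
                options = neighbours(cell, closed_roads)
                if options:
                    positions[i] = random.choice(options)
                exposure[positions[i]] += 1   # the agent becomes aware of this house
    return exposure

baseline = simulate()
# Hypothetical intervention: close a few road segments around a busy corridor.
closures = frozenset({((5, 5), (5, 6)), ((5, 5), (6, 5)), ((4, 5), (5, 5))})
modified = simulate(closed_roads=closures)

print("Most exposed houses before:", baseline.most_common(3))
print("Most exposed houses after :", modified.most_common(3))
```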

To fully realise the goal of simulating cities in this way, models need a huge amount of data. For example, to model the daily flow of people around a city, we need to know what kinds of things people spend their time doing, where they do them, who they do them with, and what drives their behaviour.

Without good-quality, high-resolution data, we have no way of knowing whether our models are producing realistic results. Big data could offer researchers a wealth of information to meet these twin needs. The kinds of data that are exciting urban modellers include:

  • Electronic travel cards that tell us how people move around a city.
  • Twitter messages that provide insight into what people are doing and thinking.
  • The density of mobile telephones that hint at the presence of crowds.
  • Loyalty and credit-card transactions to understand consumer behaviour.
  • Participatory mapping of hitherto unknown urban spaces, such as Open Street Map.

These data can often be refined to the level of a single person. As a result, models of urban phenomena no longer need to rely on assumptions about the population as a whole – they can be tailored to capture the diversity of a city full of individuals, who often think and behave differently from one another….(More)