State of the Commons


Creative Commons: “Creative Commoners have known all along that collaboration, sharing, and cooperation are a driving force for human evolution. And so for many it will come as no surprise that in 2015 we achieved a tremendous milestone: over 1.1 billion CC licensed photos, videos, audio tracks, educational materials, research articles, and more have now been contributed to the shared global commons…

Whether it’s open education, open data, science, research, music, video, photography, or public policy, we are putting sharing and collaboration at the heart of the Web. In doing so, we are much closer to realizing our vision: unlocking the full potential of the Internet to drive a new era of development, growth, and productivity.

I am proud to share with you our 2015 State of the Commons report, our best effort to measure the immeasurable scope of the commons by looking at the CC licensed content, along with content marked as public domain, that comprises the slice of the commons powered by CC tools. We are proud to be a leader in the commons movement, and we hope you will join us as we celebrate all we have accomplished together this year… Report at https://stateof.creativecommons.org/2015/”

New frontiers in social innovation research


Geoff Mulgan: “Nesta has published a new book with Palgrave which contains an introduction by me and many important chapters from leading academics around the world. I hope that many people will read it, and think about it, because it challenges, in a highly constructive way, many of the rather tired assumptions of the London media/political elite of both left and right.

The essay is by Roberto Mangabeira Unger, perhaps the world’s most creative and important contemporary intellectual. He is Professor of Law at Harvard (where he taught Obama); a philosopher and political theorist; author of one of the most interesting recent books on religion; co-author of an equally ground-breaking recent book on theoretical physics; and serves as strategy minister in the Brazilian government.

His argument is that a radically different way of thinking about politics, government and social change is emerging, which has either not been noticed by many political leaders, or misinterpreted. The essence of the argument is that practice is moving faster than theory; that systematic experimentation is a faster way to solve problems than clever authorship of pamphlets, white papers and plans; and that societies have the potential to be far more active agents of their own future than we assume.

The argument has implications for many fields. One is think-tanks. Twenty years ago I set up a think-tank, Demos. At that time the dominant model for policy making was to bring together some clever people in a capital city to write pamphlets, white papers and then laws. In the 1950s to 1970s a primary role was played by professors in universities, or royal commissions. Then it shifted to think-tanks. Sometimes teams within governments played a similar role – and I oversaw several of these, including the Strategy Unit in government. All saw policy as an essentially paper-based process, involving a linear transmission from abstract theories and analyses to practical implementation.

There’s still an important role to be played by think-tanks. But an opposite approach has now become common, and is promoted by Unger. In this approach, practice precedes theory. Experiment in the real world drives the development of new ideas – in business, civil society, and on the edges of the public sector. Learning by doing complements, and often leads, analysis. The role of academics and think-tanks shifts from inventing ideas to making sense of what’s emerging, and generalising it. Policies don’t try to specify every detail but rather set out broad directions and then enable a process of experiment and discovery.

As Unger shows, this approach has profound philosophical roots (reaching back to the 19th century pragmatists and beyond), and profound political implications (it’s almost opposite to the classic Marxist view, later adopted by the neoliberal right, in which intellectuals define solutions in theory which are then translated into practice). It also has profound implications for civil society – which he argues should adopt a maximalist rather than a minimalist view of social innovation.

The Unger approach doesn’t work for everything – constitutional reform, for example. But it is a superior method for improving most of the fields where governments have power – from welfare and health to education and economic policy – and it has worked well for Nesta: evolving new models of healthcare, working with dozens of governments to redesign business policy, testing out new approaches to education.

The several hundred public sector labs and innovation teams around the world – from Chile to China, South Africa to Denmark – share this ethos too, as do many political leaders. Michael Bloomberg has been an exemplar, confident enough to innovate and experiment constantly in his time as New York Mayor. Won Soon Park in Korea is another…

Unger’s chapter should be required reading for anyone aspiring to play a role in 21st century politics. You don’t have to agree with what he says. But you do need to work out where you disagree and why… (New Frontiers in Social Innovation Research)

The ‘data revolution’ will be open


Martin Tisne at Devex: “There is a huge amount of talk about a “data revolution.” The phrase emerged in the years preceding this September’s announcement of the Sustainable Development Goals, and has recently been strongly reaffirmed by the launch of a Global Partnership on Sustainable Development Data.

The importance of data in measuring, assessing and verifying the new SDGs has been powerfully made, and usually includes a mention of the data needing to be “open.” However, the role of “open” has not been clearly articulated. Fundamentally, the discussion focuses on the role of data (statistics, for example) in decision-making, and not on the benefits of that data being open to the public. Until this case is made, difficult decisions to make data open will fall by the wayside.

Much of the debate justly focuses on why data matters for decision-making. Knowing how many boys and girls are in primary and secondary schools, how good their education is, and the number of teachers in their schools, are examples of relevant data used in shaping education delivery, and perhaps policy. Likewise, new satellite and cellphone data can help us prevent and understand the causes of death by HIV and AIDS, tuberculosis, and malaria.

Proponents of the data revolution make powerful points, such as that 1 in 3 births go unregistered. If you are uncounted, you will be ignored. If you don’t have an identity, you do not exist.

Yet as important as this information is, I still can’t help but think: Do we change the course of history with the mere existence of more data or because people access it, mobilize and press for change?

We need an equally eloquent narrative for why open data matters and what it means.

To my thinking, we need the data to be open because we need to hold governments accountable for their promises under the SDGs, in order to incentivize action. The data needs to be available, accessible and comparable to enable journalists and civil society to prod, push and test the validity of these promises. After all, what good are the goals if governments do not deliver, beginning with the funding to implement? We will need to know what financial resources, both public and private, will be put to work and what budget allocations governments will make in their draft budgets. We need to have those debates in the open, not in smoke-filled rooms.

Second, the data needs to be open in order to be verified, quality-checked and improved. …(More)”

Can we achieve effective economic diplomacy without innovation diplomacy?


Throughout our conversations with other countries, DFAT’s innovationXchange is seeing that many of us are looking at the issue of innovation. …There is a great deal of interest in exploring how we can share information across borders, how we use that information to trigger new ideas, and how we leverage the skills and knowledge of others to achieve better outcomes. Innovation is fast becoming a common objective, something we all aim to embed in our respective organisations, but which we know we cannot do alone. The problems we seek to solve are global, and a collaborative, innovative approach to solving them is needed…

This makes me think: is innovation the new diplomatic tool on which we can base new or enhanced relationships? Can we use the shared goal of doing things better and more cost-effectively – harnessing the knowledge and capital that sit outside governments – not only to have a better impact but to bring countries closer together in collaborative partnership? Could these collaborative partnerships even contribute to increased regional stability?

Innovation is fuelled by collaboration – taking an idea, sharing it with others, using their knowledge and creativity to improve the idea, building on it, testing it, adapting and testing again. This collaborative process aligns very well with the intent behind diplomacy – the act of a state seeking to achieve its aims, in relation to those of others, through dialogue and negotiation.

This is already happening to some extent with like-minded countries, such as the UK and the US. But innovation is about risk-taking, trying new things and stepping outside of the familiar and comfortable. The emergence of new groupings, like MIKTA, and the increasing engagement of the private sector in partnering for social impact expand the opportunities to learn about other approaches and find complementary skills and knowledge.

This is all about making collaboration, co-creation and, through that, innovation a way of working – an approach we can take to working with other states and other organisations. While innovation is the latest buzzword in government and in the development community, it will remain just a buzzword, easily replaced by the next trend, unless we look for opportunities to work with others to co-create and innovate to solve shared problems….(More)”

Big Data as Governmentality – Digital Traces, Algorithms, and the Reconfiguration of Data in International Development


Paper by Mikkel Flyverbom, Anders Klinkby Madsen and Andreas Rasche: “This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can diminish its impact….(More)

Decoding the Future for National Security


George I. Seffers at Signal: “U.S. intelligence agencies are in the business of predicting the future, but no one has systematically evaluated the accuracy of those predictions—until now. The intelligence community’s cutting-edge research and development agency uses a handful of predictive analytics programs to measure and improve the ability to forecast major events, including political upheavals, disease outbreaks, insider threats and cyber attacks.

The Office for Anticipating Surprise at the Intelligence Advanced Research Projects Activity (IARPA) is a place where crystal balls come in the form of software, tournaments and throngs of people. The office sponsors eight programs designed to improve predictive analytics, which uses a variety of data to forecast events. The programs all focus on incidents outside of the United States, and the information is anonymized to protect privacy. The programs are in different stages, some having recently ended as others are preparing to award contracts.

But they all have one more thing in common: They use tournaments to advance the state of the predictive analytic arts. “We decided to run a series of forecasting tournaments in which people from around the world generate forecasts about, now, thousands of real-world events,” says Jason Matheny, IARPA’s new director. “All of our programs on predictive analytics do use this tournament style of funding and evaluating research.” The Open Source Indicators program used a crowdsourcing technique in which people across the globe offered their predictions on such events as political uprisings, disease outbreaks and elections.

The data analyzed included social media trends, Web search queries and even cancelled dinner reservations—an indication that people are sick. “The methods applied to this were all automated. They used machine learning to comb through billions of pieces of data to look for that signal, that leading indicator, that an event was about to happen,” Matheny explains. “And they made amazing progress. They were able to predict disease outbreaks weeks earlier than traditional reporting.”

The recently completed Aggregative Contingent Estimation (ACE) program also used a crowdsourcing competition in which people predicted events, including whether weapons would be tested, treaties would be signed or armed conflict would break out along certain borders. Volunteers were asked to provide information about their own background and what sources they used. IARPA also tested participants’ cognitive reasoning abilities. Volunteers provided their forecasts every day, and IARPA personnel kept score. Interestingly, they discovered the “deep domain” experts were not the best at predicting events. Instead, people with a certain style of thinking came out the winners. “They read a lot, not just from one source, but from multiple sources that come from different viewpoints. They have different sources of data, and they revise their judgments when presented with new information. They don’t stick to their guns,” Matheny reveals. …
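
The article does not spell out how IARPA “kept score,” but a standard measure for scoring probabilistic forecasts in this kind of research is the Brier score (the mean squared error between predicted probabilities and actual outcomes, lower is better). A minimal sketch of a tournament leaderboard built on it; the forecaster names, probabilities and events here are hypothetical:

```python
# Illustrative Brier-score leaderboard for a forecasting tournament.
# The scoring rule is a standard one from forecasting research; the
# forecasters and events below are invented for illustration.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Resolved outcomes: 1 = the event happened, 0 = it did not.
outcomes = [1, 0, 1]  # e.g. treaty signed, weapons test, border conflict

# Each forecaster gives one probability per question.
forecasters = {
    "analyst_a": [0.9, 0.2, 0.7],
    "analyst_b": [0.6, 0.5, 0.5],
}

# Rank forecasters from best (lowest) to worst score.
leaderboard = sorted(forecasters.items(), key=lambda kv: brier_score(kv[1], outcomes))
for name, probs in leaderboard:
    print(f"{name}: {brier_score(probs, outcomes):.3f}")
```

Confident, well-calibrated forecasts (analyst_a) score better than hedged 50/50 guesses (analyst_b), which is why such scoring rewards the open-minded, belief-updating style of thinking the article describes.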

The ACE research also contributed to a recently released book, Superforecasting: The Art and Science of Prediction, according to the IARPA director. The book was co-authored, along with Dan Gardner, by Philip Tetlock, the Annenberg University Professor of Psychology and Management at the University of Pennsylvania, who also served as a principal investigator for the ACE program. Like ACE, the Crowdsourcing Evidence, Argumentation, Thinking and Evaluation program uses the forecasting tournament format, but it also requires participants to explain and defend their reasoning. The initiative aims to improve analytic thinking by combining structured reasoning techniques with crowdsourcing.

Meanwhile, the Foresight and Understanding from Scientific Exposition (FUSE) program forecasts science and technology breakthroughs….(More)”

Government’s innovative approach to skills sharing


Nicole Blake Johnson at GovLoop: “For both managers and employees, it often seems there aren’t enough hours in the day to tackle every priority project.

But what if there was another option — a way for federal managers to get the skills they need internally and for employees to work on projects they’re interested in but unaware of?

Maybe you’re the employee who is really into data analytics or social media, but that’s not a part of your regular job duties. What if you had the support of your supervisor to help out on an analytics project down the hall or in a field office across the country?

I’m not making up hypothetical scenarios. These types of initiatives are actually taking shape at federal agencies, including the Environmental Protection Agency, the Social Security Administration, and the Health and Human Services and Commerce departments.

Many agencies are in the pilot phase of rolling out their programs, which are versions of a governmentwide initiative called GovConnect. The initiative was inspired by an EPA program called Skills Marketplace that dates back to 2011. (Read more about GovConnect here.)

“We felt like we had something really promising at EPA, and we wanted to share it with other government agencies,” said Noha Gaber, EPA’s Director of Internal Communications. “So we actually pitched it to OPM and several other agencies, and that ended up becoming GovConnect.”

“The goal of GovConnect is to develop federal workforce skills through cross-agency collaboration and teamwork, to enable more agile response to mission demands without being unnecessarily limited by organizational silos,” said Melissa Kline Lee, who serves as Program Manager of GovConnect at the Office of Personnel Management. “As part of the President’s Management Agenda, the Office of Personnel Management and Environmental Protection Agency are using the GovConnect pilot to help agencies test and scale new approaches to workforce development.”…

Managers post projects or tasks in the online marketplace, which was developed using the agency’s existing SharePoint environment. Projects include clear tasks that employees can accomplish using up to 20 percent of their workweek. Projects cannot be open-ended and should not exceed one year.

From there, any employee can view the projects, evaluate what skills or competencies are needed and apply for the position. Managers review the applications and conduct interviews before selecting a candidate. Here are the latest stats for Skills Marketplace as of November 2015:

  • Managers posted 358 projects in the marketplace
  • Employees submitted 577 applications
  • More than 750 people have created profiles for the marketplace
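
The posting rules described above (at most 20 percent of the workweek, no project longer than a year) can be sketched as simple validation logic. This is an illustrative model only: the class, field names and 40-hour baseline are assumptions, not the agency’s actual SharePoint implementation:

```python
# Hypothetical model of a Skills Marketplace posting and the limits the
# article describes. Field names and the 40-hour baseline are assumptions.

from dataclasses import dataclass

MAX_WEEKLY_SHARE = 0.20   # up to 20 percent of the workweek
MAX_DURATION_WEEKS = 52   # projects should not exceed one year

@dataclass
class ProjectPosting:
    title: str
    skills_needed: list
    hours_per_week: float
    duration_weeks: int

    def is_valid(self) -> bool:
        """Check a posting against the marketplace's stated limits."""
        within_share = self.hours_per_week <= MAX_WEEKLY_SHARE * 40
        within_duration = 0 < self.duration_weeks <= MAX_DURATION_WEEKS
        return within_share and within_duration

posting = ProjectPosting(
    title="GIS support for Toxic Release Inventory data",
    skills_needed=["GIS", "data analysis"],
    hours_per_week=8,      # exactly 20 percent of a 40-hour week
    duration_weeks=26,
)
print(posting.is_valid())  # True: within both limits
```

From here, the workflow the article describes (employees browse postings, apply, and managers interview and select) would operate on collections of such postings.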

Gaber shared one example involving an employee from the Office of Pesticide Programs and staff from the Office of Environmental Information (OEI), which is the main IT office at EPA. The employee brought to the team technical expertise and skills in geographic information systems to support OEI’s Toxic Release Inventory Program, which tracks data on toxic chemicals being produced by different facilities.

The benefits were twofold: The employee established new connections in a different part of the agency, and his home office benefited from the experiences and knowledge he gleaned while working on the project….(More)

Smart Cities as Democratic Ecologies


Book edited by Daniel Araya: “The concept of the ‘smart city’ as the confluence of urban planning and technological innovation has become a predominant feature of public policy discourse. Despite its expanding influence, however, there is little consensus on the precise meaning of a ‘smart city’. One reason for this ambiguity is that the term means different things to different disciplines. For some, the concept of the ‘smart city’ refers to advances in sustainability and green technologies. For others, it refers to the deployment of information and communication technologies as next generation infrastructure.

This volume focuses on a third strand in this discourse, specifically technology driven changes in democracy and civic engagement. In conjunction with issues related to power grids, transportation networks and urban sustainability, there is a growing need to examine the potential of ‘smart cities’ as ‘democratic ecologies’ for citizen empowerment and user-driven innovation. What is the potential of ‘smart cities’ to become platforms for bottom-up civic engagement in the context of next generation communication, data sharing, and application development? What are the consequences of layering public spaces with computationally mediated technologies? Foucault’s notion of the panopticon, a metaphor for a surveillance society, suggests that smart technologies deployed in the design of ‘smart cities’ should be evaluated in terms of the ways in which they enable, or curtail, new urban literacies and emergent social practices….(More)”

Using prizes to spur innovation and government savings


New report by R-Street: “In myriad sectors of the U.S. economy, from military technology to medical care, the federal government serves as the single-largest spender. As such, many of the innovations, inventions and discoveries that could propel economic growth in the future also would have a direct and measurable impact on federal spending.

To offer an incentive to research and development that yields significant taxpayer savings, we propose an “innovation savings program” that would serve as an alternative to the traditional patent system. The program would reward teams or individuals who develop discoveries or technologies that produce federal budget savings. In effect, a portion of those savings would be set aside for the discoverers. To be eligible for these rewards, the researchers and inventors would not receive patents on their discoveries or processes.

This perpetual, self-funded federal prize system would be based, in part, on the successful False Claims Act and Medicare Recovery Audit programs. Payouts would be administered by an independent or executive agency, verified by the Government Accountability Office and overseen by Congress to ensure fair and effective implementation.

New technologies developed through this process would be available immediately for generic commercialization, free of royalty fees. This could encourage innovation in sectors where patents and traditional research spending have lagged, while also bringing those innovations to market more quickly and affordably. Prize systems of this type have been in operation in the United States for more than 150 years, in the form of the False Claims Act, and date back to “qui tam” actions from the 13th century, thus predating the patent system by several hundred years. (Download PDF)

Meeting the Challenges of Big Data


Opinion by the European Data Protection Supervisor: “Big data, if done responsibly, can deliver significant benefits and efficiencies for society and individuals, in health, scientific research, the environment and other areas. But there are serious concerns with the actual and potential impact of processing huge amounts of data on the rights and freedoms of individuals, including their right to privacy. The challenges and risks of big data therefore call for more effective data protection.

Technology should not dictate our values and rights, but neither should promoting innovation and preserving fundamental rights be perceived as incompatible. New business models exploiting new capabilities for the massive collection, instantaneous transmission, combination and reuse of personal information for unforeseen purposes have placed the principles of data protection under new strains, which calls for thorough consideration of how they are applied.

European data protection law has been developed to protect our fundamental rights and values, including our right to privacy. The question is not whether to apply data protection law to big data, but rather how to apply it innovatively in new environments. Our current data protection principles, including transparency, proportionality and purpose limitation, provide the baseline we will need to protect our fundamental rights more dynamically in the world of big data. They must, however, be complemented by ‘new’ principles which have developed over the years, such as accountability and privacy by design and by default. The EU data protection reform package is expected to strengthen and modernise the regulatory framework.

The EU intends to maximise growth and competitiveness by exploiting big data. But the Digital Single Market cannot uncritically import the data-driven technologies and business models which have become economic mainstream in other areas of the world. Instead it needs to show leadership in developing accountable personal data processing. The internet has evolved in such a way that surveillance – tracking people’s behaviour – is considered the indispensable revenue model for some of the most successful companies. This development calls for critical assessment and a search for other options.

In any event, and irrespective of the business models chosen, organisations that process large volumes of personal information must comply with applicable data protection law. The European Data Protection Supervisor (EDPS) believes that responsible and sustainable development of big data must rely on four essential elements:

  • organisations must be much more transparent about how they process personal data;
  • they must afford users a higher degree of control over how their data is used;
  • they must design user-friendly data protection into their products and services; and
  • they must become more accountable for what they do….(More)