Why governments need guinea pigs for policies


Jonathan Breckon in the Guardian: “People are unlikely to react positively to the idea of using citizens as guinea pigs; many will be downright disgusted. But there are times when government must experiment on us in the search for knowledge and better policy….

Though history calls into question the ethics of experimentation, unless we try things out, we will never learn. The National Audit Office says that £66bn worth of government projects have no plans to evaluate their impact. It is unethical to roll out policies in this arbitrary way. We have to experiment on a small scale to gain a better understanding of how things work before rolling out policies across the UK. This is just as relevant to social policy as it is to science and medicine, as set out in a new report by the Alliance for Useful Evidence.

Whether it’s the best ways to teach our kids to read, designing programmes to get unemployed people back to work, or encouraging organ donation – if the old ways don’t work, we have to test new ones. And that testing can’t always be done by a committee in Whitehall or in a university lab.

Experimentation can’t happen in isolation. What works in Lewisham or Londonderry might not work in Lincoln – or indeed across the UK. For instance, there is a huge amount of debate around the current practice of teaching children to read and spell using phonics, which was based on a small-scale study in Clackmannanshire, as well as evidence from the US. A government-commissioned review of the evidence for phonics led Professor Carole Torgerson, then at York University, to warn against making national policy off the back of just one small Scottish trial.

One way round this problem is to do larger experiments. The increasing use of the internet in public services allows for more and faster experimentation, on a larger scale and at lower cost – the randomised controlled trial on voter mobilisation that went to 61 million users in the 2010 US midterm elections, for example. However, the use of the internet doesn’t get us off the ethical hook. Facebook had to apologise after a global backlash to secret psychological tests on 689,000 of its users.

Contentious experiments should be approved by ethics committees – normal practice for trials in hospitals and universities.

We are also not interested in freewheeling trial-and-error; robust and appropriate research techniques to learn from experiments are vital. It’s best to see experimentation as a continuum, ranging from the messiness of attempts to try something new to experiments using the best available social science, such as randomised controlled trials.
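The randomised controlled trials mentioned above have a simple core logic: assign participants to treatment and control at random, then compare outcomes between the arms. A minimal simulation sketch (the baseline and effect rates here are invented purely for illustration):

```python
import random
import statistics

random.seed(42)

def run_trial(n=1000, baseline=0.30, effect=0.05):
    """Simulate a simple two-arm RCT with a binary outcome.

    baseline: outcome rate in the control group (invented for illustration)
    effect:   additional outcome rate produced by the intervention
    Returns the estimated treatment effect (difference in means).
    """
    treatment, control = [], []
    for _ in range(n):
        arm = random.choice(["treatment", "control"])  # random assignment
        rate = baseline + (effect if arm == "treatment" else 0.0)
        outcome = 1 if random.random() < rate else 0
        (treatment if arm == "treatment" else control).append(outcome)
    return statistics.mean(treatment) - statistics.mean(control)

print(f"estimated effect: {run_trial():+.3f}")
```

Because assignment is random, the difference in means converges on the true effect as the sample grows – which is also why the small Clackmannanshire trial discussed earlier is a shaky basis for national policy.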

Experimental government means avoiding an approach where everything is fixed from the outset. What we need is “a spirit of experimentation, unburdened by promises of success”, as recommended by the late professor Roger Jowell, author of the 2003 Cabinet Office report, Trying it out [pdf]….(More)”

UNESCO demonstrates global impact through new transparency portal


“Opendata.UNESCO.org is intended to present comprehensive, quality and timely information about UNESCO’s projects, enabling users to find information by country/region, funding source and sector, and providing comprehensive project data, including budget, expenditure, completion status, implementing organization, project documents, and more. It publishes programme and financial information in line with the IATI (International Aid Transparency Initiative) standard and other relevant transparency initiatives. UNESCO is now one of more than 230 organizations that have published to the IATI Registry, which brings together donor and developing countries, civil society organizations and other experts in aid information who are committed to working together to increase the transparency of aid.

Since its creation 70 years ago, UNESCO has tirelessly championed the causes of education, culture, natural sciences, social and human sciences, and communication and information, globally. For instance, started in March 2010, the programme for the Enhancement of Literacy in Afghanistan (ELA) benefited from a $19.5 million contribution by Japan. It aimed to improve the level of literacy, numeracy and vocational skills of the adult population in 70 districts of 15 provinces of Afghanistan. Over the next three years, until April 2013, the ELA programme helped some 360,000 adult learners attain general literacy competency. An interactive map allows for easy identification of UNESCO’s high-impact programmes, and up-to-date information on current and future aid allocations within and across countries.

Public participation and interactivity are key to the success of any open data project. http://Opendata.UNESCO.org will evolve as Member States and partners get involved, by displaying data on their own websites and sharing data among different networks, building and sharing applications, and providing feedback, comments, and recommendations. …(More)”

How to Fight the Next Epidemic


Bill Gates in the New York Times: “The Ebola Crisis Was Terrible. But Next Time Could Be Much Worse….Much of the public discussion about the world’s response to Ebola has focused on whether the World Health Organization, the Centers for Disease Control and Prevention and other groups could have responded more effectively. These are worthwhile questions, but they miss the larger point. The problem isn’t so much that the system didn’t work well enough. The problem is that we hardly have a system at all.

To begin with, most poor countries, where a natural epidemic is most likely to start, have no systematic disease surveillance in place. Even once the Ebola crisis was recognized last year, there were no resources to effectively map where cases occurred, or to use people’s travel patterns to predict where the disease might go next….

Data is another crucial problem. During the Ebola epidemic, the database that tracks cases has not always been accurate. This is partly because the situation is so chaotic, but also because much of the case reporting has been done on paper and then sent to a central location for data entry….

I believe that we can solve this problem, just as we’ve solved many others — with ingenuity and innovation.

We need a global warning and response system for outbreaks. It would start with strengthening poor countries’ health systems. For example, when you build a clinic to deliver primary health care, you’re also creating part of the infrastructure for fighting epidemics. Trained health care workers not only deliver vaccines; they can also monitor disease patterns, serving as part of the early warning systems that will alert the world to potential outbreaks. Some of the personnel who were in Nigeria to fight polio were redeployed to work on Ebola — and that country was able to contain the disease very quickly.

We also need to invest in disease surveillance. We need a case database that is instantly accessible to the relevant organizations, with rules requiring countries to share their information. We need lists of trained personnel, from local leaders to global experts, prepared to deal with an epidemic immediately. … (More)”

31 cities agree to use EU-funded open innovation platform for better smart cities’ services


European Commission Press Release: “At CEBIT, 25 cities from 6 EU countries (Belgium, Denmark, Finland, Italy, Portugal and Spain) and 6 cities from Brazil will present the Open & Agile Smart Cities Task Force (OASC), an initiative making it easier for city councils and startups to improve smart city services (such as transport, energy efficiency, environmental or e-health services). This will be achieved thanks to FIWARE, an EU-funded, open source platform and cloud-based building blocks developed in the EU that can be used to develop a huge range of applications, from Smart Cities to eHealth, and from transport to disaster management. Many applications have already been built using FIWARE – from warnings of earthquakes to preventing food waste to Smartaxi apps. Find a full list of cities in the Background.

The OASC deal will allow cities to share their open data (collected from sensors measuring, for example, traffic flows) so that startups can develop apps and tools that benefit all citizens (for example, an app with traffic information for people on the move). Moreover, these systems will be shared between cities (so, an app with transport information developed in city A can be also adopted by city B, without the latter having to develop it from scratch); FIWARE will also give startups and app developers in these cities access to a global market for smart city services.
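To make the data-sharing idea concrete: FIWARE exposes city sensor data through its NGSI v2 context-broker API, so a developer in any participating city can query the same entity types. The broker URL below is a made-up placeholder, and “TrafficFlowObserved” is one of the FIWARE smart data models a city might (but need not) publish; this is a sketch of the query pattern, not a live endpoint.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical broker address; FIWARE's Orion Context Broker
# conventionally listens on port 1026.
BROKER = "http://broker.example-city.org:1026"

def entities_url(entity_type, limit=10):
    """Build an NGSI v2 query URL for entities of a given type."""
    query = urllib.parse.urlencode(
        {"type": entity_type, "limit": limit, "options": "keyValues"}
    )
    return f"{BROKER}/v2/entities?{query}"

def fetch_entities(entity_type):
    """Fetch entities from the context broker (requires a live broker)."""
    with urllib.request.urlopen(entities_url(entity_type)) as resp:
        return json.load(resp)

print(entities_url("TrafficFlowObserved"))
```

Because the API and data models are standardised across OASC cities, the same query – and hence the same app – works in city A and city B, which is precisely the portability the initiative promises.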

Cities from across the globe are trying to make the most of open innovation. This will allow them to include a variety of stakeholders in their activities (services are increasingly connected to other systems and innovative startups are a big part of this trend) and encourage a competitive yet attractive market for developers, thus reducing costs, increasing quality and avoiding vendor lock-in….(More)”

Data democracy – increased supply of geospatial information and expanded participatory processes in the production of data


Paper by Max Craglia & Lea Shanley: “The global landscape in the supply, co-creation and use of geospatial data is changing very rapidly, with new satellites, sensors and mobile devices reconfiguring the traditional lines of demand and supply and the number of actors involved. In this paper we chart some of these technology-led developments and then focus on the opportunities they have created for the increased participation of the public in generating and contributing information for a wide range of uses, scientific and otherwise. Not all of this information is open or geospatial, but sufficiently large portions of it are to make it one of the most significant phenomena of the last decade. In fact, we argue that while satellites and sensors have exponentially increased the volumes of geospatial information available, the participation of the public is transformative because it expands the range of participants and stakeholders in society using and producing geospatial information, with opportunities for more direct participation in science, politics and social action…(View full text)”

Study to examine Australian businesses’ use of government data


ComputerWorld: “New York University’s GovLab and the federal Department of Communications have embarked on a study of how Australian organisations are employing government data sets.

The ‘Open Data 500’ study was launched today at the Locate15 conference. It aims to provide a basis for assessing the value of open data and encourage the development of new businesses based on open data, as well as encourage discussion about how to make government data more useful to businesses and not-for-profit organisations.

The study is part of a series of studies taking place under the auspices of the OD500 Global Network.

“This study will help ensure the focus of Government is on the publication of high value datasets, with an emphasis on quality rather than quantity,” a statement issued by the Department of Communications said.

“Open Data 500 advances the government’s policy of increasing the number of high value public datasets in Australia in an effort to drive productivity and innovation, as well as its commitment to greater consultation with private sector stakeholders on open data,” Communications Minister Malcolm Turnbull said in remarks prepared for the Locate 15 conference….(More)”

Budgets for the People


Collective Intelligence or Group Think?


Paper analyzing “Engaging Participation Patterns in World Without Oil” by Nassim JafariNaimi and Eric M. Meyers: “This article presents an analysis of participation patterns in an Alternate Reality Game, World Without Oil. This game aims to bring people together in an online environment to reflect on how an oil crisis might affect their lives and communities, as a way both to counter such a crisis and to build collective intelligence about responding to it. We present a series of participation profiles based on a quantitative analysis of 1,554 contributions to the game narrative made by 322 players. We further qualitatively analyze a sample of these contributions. We outline the dominant themes, the majority of which engage the global oil crisis for its effects on commute options and present micro-sustainability solutions in response. We further draw on the quantitative and qualitative analysis of this space to discuss how the design of the game, specifically its framing of the problem, feedback mechanism, and absence of subject-matter expertise, counters its aim of generating collective intelligence, making it conducive to groupthink….(More)”
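The participation profiles described above boil down to asking how contributions are distributed across players – for instance, how much of the narrative a small core of prolific players produces. A hedged sketch of one such concentration measure (the function and toy data are illustrative, not the authors’ actual method):

```python
from collections import Counter

def participation_profile(contributions):
    """Summarise how contributions are distributed across players.

    contributions: list of player identifiers, one per contribution.
    Returns total players, total contributions, and the share made by
    the most active 10% of players (a simple concentration measure).
    """
    per_player = Counter(contributions)
    counts = sorted(per_player.values(), reverse=True)
    top_n = max(1, len(counts) // 10)  # at least one player
    top_share = sum(counts[:top_n]) / len(contributions)
    return {
        "players": len(counts),
        "contributions": len(contributions),
        "top_decile_share": round(top_share, 2),
    }

# Toy data: a few prolific players and many one-off contributors.
sample = ["p1"] * 30 + ["p2"] * 10 + ["p3", "p4", "p5", "p6", "p7"]
print(participation_profile(sample))
```

A heavily skewed distribution of this kind is one signal the paper links to groupthink: when a handful of voices dominate the narrative, the “collective” in collective intelligence shrinks.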

New research project to map the impact of open budget data


Jonathan Gray at Open Knowledge: “…a new research project to examine the impact of open budget data, undertaken as a collaboration between Open Knowledge and the Digital Methods Initiative at the University of Amsterdam, supported by the Global Initiative for Financial Transparency (GIFT).

The project will include an empirical mapping of who is active around open budget data around the world, and what the main issues, opportunities and challenges are according to different actors. On the basis of this mapping it will provide a review of the various definitions and conceptions of open budget data, arguments for why it matters, best practices for publication and engagement, as well as applications and outcomes in different countries around the world.

As well as drawing on Open Knowledge’s extensive experience and expertise around open budget data (through projects such as Open Spending), it will utilise innovative tools and methods developed at the University of Amsterdam to harness evidence from the web, social media and collections of documents to inform and enrich our analysis.

As part of this project we’re launching a collaborative bibliography of existing research and literature on open budget data and associated topics which we hope will become a useful resource for other organisations, advocates, policy-makers, and researchers working in this area. If you have suggestions for items to add, please do get in touch.

This project follows on from other research projects we’ve conducted around this area – including on data standards for fiscal transparency, on technology for transparent and accountable public finance, and on mapping the open spending community….(More)”

Crowdsourcing America’s cybersecurity is an idea so crazy it might just work


at the Washington Post: “One idea that’s starting to bubble up from Silicon Valley is the concept of crowdsourcing cybersecurity. As Silicon Valley venture capitalist Robert R. Ackerman, Jr. has pointed out, due to “the interconnectedness of our society in cyberspace,” cyber networks are best viewed as an asset that we all have a shared responsibility to protect. Push on that concept hard enough and you can see how many of the core ideas from Silicon Valley – crowdsourcing, open source software, social networking, and the creative commons – can all be applied to cybersecurity.

Silicon Valley venture capitalists are already starting to fund companies that describe themselves as crowdsourcing cybersecurity. For example, take Synack, a “crowd security intelligence” company that received $7.5 million in funding from Kleiner Perkins (one of Silicon Valley’s heavyweight venture capital firms), Allegis Ventures, and Google Ventures in 2014. Synack’s two founders are ex-NSA employees, and they are using that experience to inform an entirely new type of business model. Synack recruits and vets a global network of “white hat hackers,” and then offers their services to companies worried about their cyber networks. For a fee, these hackers are able to find and repair any security risks.

So how would crowdsourced national cybersecurity work in practice?

For one, there would be free and transparent sharing of computer code used to detect cyber threats between the government and private sector. In December, the U.S. Army Research Lab added a bit of free source code, a “network forensic analysis framework” known as Dshell, to the mega-popular code sharing site GitHub. Already, there have been 100 downloads and more than 2,000 unique visitors. The goal, says William Glodek of the U.S. Army Research Laboratory, is for this shared code to “help facilitate the transition of knowledge and understanding to our partners in academia and industry who face the same problems.”

This open sourcing of cyber defense would be enhanced with a scaled-up program of recruiting “white hat hackers” to become officially part of the government’s cybersecurity efforts. Popular annual events such as the DEF CON hacking conference could be used to recruit talented cyber sleuths to work alongside the government.

There have already been examples of communities where people facing a common cyber threat gather together to share intelligence. Perhaps the best-known example is the Conficker Working Group, a security coalition that was formed in late 2008 to share intelligence about malicious Conficker malware. Another example is the Financial Services Information Sharing and Analysis Center, which was created by presidential mandate in 1998 to share intelligence about cyber threats to the nation’s financial system.

Of course, there are some drawbacks to this crowdsourcing idea. For one, such a collaborative approach to cybersecurity might open the door to government cyber defenses being infiltrated by the enemy. Ackerman makes the point that you never really know who’s contributing to any community. Even on a site such as GitHub, it’s theoretically possible that an ISIS hacker or someone like Edward Snowden could download the code, reverse engineer it, and then use it to insert “Trojan Horses” intended for military targets into the code….  (More)