Digital Government: Turning the Rhetoric into Reality


Miguel Carrasco and Peter Goss at BCG Perspectives: “Getting better—but still plenty of room for improvement: that’s the current assessment by everyday users of their governments’ efforts to deliver online services. The public sector has made good progress, but most countries are not moving nearly as quickly as users would like. Many governments have made bold commitments, and a few countries have determined to go “digital by default.” Most are moving more modestly, often overwhelmed by complexity and slowed by bureaucratic skepticism over online delivery as well as by a lack of digital skills. Developing countries lead in the rate of online usage, but they mostly trail developed nations in user satisfaction.
Many citizens—accustomed to innovation in such sectors as retailing, media, and financial services—wish their governments would get on with it. Of the services that can be accessed online, many only provide information and forms, while users are looking to get help and transact business. People want to do more. Digital interaction is often faster, easier, and more efficient than going to a service center or talking on the phone, but users become frustrated when the services do not perform as expected. They know what good online service providers offer. They have seen a lot of improvement in recent years, and they want their governments to make even better use of digital’s capabilities.
Many governments are already well on the way to improving digital service delivery, but there is often a gap between rhetoric and reality. There is no shortage of government policies and strategies relating to “digital first,” “e-government,” and “gov2.0,” in addition to digital by default. But governments need more than a strategy. “Going digital” requires leadership at the highest levels, investments in skills and human capital, and cultural and behavioral change. Based on BCG’s work with numerous governments and new research into the usage of, and satisfaction with, government digital services in 12 countries, we see five steps that most governments will want to take:

1. Focus on value. Put the priority on services with the biggest gaps between their importance to constituents and constituents’ satisfaction with digital delivery. In most countries, this will mean services related to health, education, social welfare, and immigration.

2. Adopt service design thinking. Governments should walk in users’ shoes. What does someone encounter when he or she goes to a government service website—plain language or bureaucratic legalese? How easy is it for the individual to navigate to the desired information? How many steps does it take to do what he or she came to do? Governments can make services easy to access and use by, for example, requiring users to register once and establish a digital credential, which can be used in the future to access online services across government.

3. Lead users online, keep users online. Invest in seamless end-to-end capabilities. Most government-service sites need to advance from providing information to enabling users to transact their business in its entirety, without having to resort to printing out forms or visiting service centers.

4. Demonstrate visible senior-leadership commitment. Governments can signal—to both their own officials and the public—the importance and the urgency that they place on their digital initiatives by where they assign responsibility for the effort.

5. Build the capabilities and skills to execute. Governments need to develop or acquire the skills and capabilities that will enable them to develop and deliver digital services.

This report examines the state of government digital services through the lens of Internet users surveyed in Australia, Denmark, France, Indonesia, the Kingdom of Saudi Arabia, Malaysia, the Netherlands, Russia, Singapore, the United Arab Emirates (UAE), the UK, and the U.S. We investigated 37 different government services. (See Exhibit 1.)…”

Open for Business: How Open Data Can Help Achieve the G20 Growth Target


New Report commissioned by Omidyar Network on the Business Case for Open Data: “Economic analysis has confirmed the significant contribution to economic growth and productivity achievable through an open data agenda. Governments, the private sector, individuals and communities all stand to benefit from the innovation and information that will inform investment, drive the creation of new industries, and inform decision making and research. To mark a step change in the way valuable information is created and reused, the G20 should release information as open data.
In May 2014, Omidyar Network commissioned Lateral Economics to undertake economic analysis on the potential of open data to support the G20’s 2% growth target and illustrate how an open data agenda can make a significant contribution to economic growth and productivity. Combining all G20 economies, output could increase by USD 13 trillion cumulatively over the next five years. Implementation of open data policies would thus boost cumulative G20 GDP by around 1.1 percentage points (almost 55%) of the G20’s 2% growth target over five years.
Recommendations
Importantly, open data cuts across a number of this year’s G20 priorities: attracting private infrastructure investment, creating jobs and lifting participation, strengthening tax systems and fighting corruption. This memo suggests an open data thread that runs across all G20 priorities. The more data is opened, the more it can be used, reused, repurposed and built on—in combination with other data—for everyone’s benefit.
We call on G20 economies to sign up to the Open Data Charter.
The G20 should ensure that data released by G20 working groups and themes is in line with agreed open data standards. This will lead to more accountable, efficient and effective governments that go further to expose inadequacy, fight corruption and spur innovation.
Data is a national resource and open data is a ‘win-win’ policy. It is about making more of existing resources. We know that the cost of opening data is smaller than the economic returns, which could be significant. Privacy concerns must be respected in the way data is opened. If this is done, then as the public and private sectors share more information, there will be increasing positive returns.
The G20 opportunity
This November, leaders of the G20 Member States will meet in Australia to drive forward commitments made in the St Petersburg G20 Leaders Declaration last September and to make firm progress on stimulating growth. Actions across the G20 will include increasing investment, lifting employment and participation, enhancing trade and promoting competition.
The resulting ‘Brisbane Action Plan’ will encapsulate all of these commitments with the aim of raising the level of G20 output by at least 2% above the currently projected level over the next five years. There are major opportunities for cooperative and collective action by G20 governments.
Governments should intensify the release of existing public sector data – both government and publicly funded research data. But much more can be done to promote open data than simply releasing more government data. In appropriate circumstances, governments can mandate public disclosure of private sector data (e.g. in corporate financial reporting).
Recommendations for action

  • G20 governments should adopt the principles of the Open Data Charter to encourage the building of stronger, more interconnected societies that better meet the needs of our citizens and allow innovation and prosperity to flourish.
  • G20 governments should adopt specific open data targets under each G20 theme, as illustrated below, such as releasing open data related to beneficial owners of companies, as well as revenues from extractive industries.
  • G20 governments should consider harmonizing licensing regimes across the G20.
  • G20 governments should adopt metrics for measuring the quantity and quality of open data publication, e.g. using the Open Data Institute’s Open Data Certificates as a bottom-up mechanism for driving the adoption of common standards.

Illustrative G20 examples
Fiscal and monetary policy
Governments possess rich real-time data that is neither open nor accessed by government macro-economic managers. G20 governments should:

  • Open up models that lie behind economic forecasts and help assess alternative policy settings;
  • Publish spending and contractual data to enable comparison shopping by governments across government suppliers.

Anti-corruption
Open data may directly contribute to reduced corruption by increasing the likelihood corruption will be detected. G20 governments should:

  • Release open data related to beneficial owners of companies as well as revenues from extractive industries.
  • Collaborate on harmonised technical standards that permit the tracing of international money flows – including the tracing of beneficial owners of commercial entities, and the comparison and reconciliation of transactions across borders.

Trade
Obtaining and using trade data from multiple jurisdictions is difficult. Access fees, specific licenses, and non-machine readable formats all involve large transaction costs. G20 governments should:

  • Harmonise open data policies related to trade data.
  • Use standard trade schema and formats.

Employment
Higher quality information on employment conditions would facilitate better matching of employees to organizations, producing greater job-satisfaction and improved productivity. G20 governments should:

  • Open up centralised job vacancy registers to provide new mechanisms for people to find jobs.
  • Provide open statistical information about the demand for skills in particular areas to help those supporting training and education to hone their offerings.

Energy
Open data will help reduce the cost of energy supply and improve energy efficiency. G20 governments should:

  • Provide incentives for energy companies to publish open data from consumers and suppliers to enable cost savings through optimizing energy plans.
  • Release energy performance certifications for buildings.
  • Publish real-time energy consumption for government buildings.

Infrastructure
Current infrastructure asset information is fragmented and inefficient. Exposing current asset data would be a significant first step in understanding gaps and providing new insights. G20 governments should:

  • Publish open data on governments’ infrastructure assets and plans to better understand infrastructure gaps, enable greater efficiency and insight in infrastructure development and use, and support cost-benefit analysis.
  • Publish open infrastructure data, including contracts via Open Contracting Partnership, in a consistent and harmonised way across G20 countries…”

New Book on 25 Years of Participatory Budgeting


Tiago Peixoto at Democracy Spot: “A little while ago I mentioned the launch of the Portuguese version of the book organized by Nelson Dias, “Hope for Democracy: 25 Years of Participatory Budgeting Worldwide”.

The good news is that the English version is finally out. Here’s an excerpt from the introduction:

This book represents the effort of more than forty authors, along with many other direct and indirect contributors spread across different continents, who seek to provide an overview of Participatory Budgeting (PB) in the world. They do so from different backgrounds. Some are researchers, others are consultants, and others are activists connected to several groups and social movements. The texts reflect this diversity of approaches and perspectives well, and we do not try to influence that.
(….)
The pages that follow are an invitation to a fascinating journey on the path of democratic innovation in very diverse cultural, political, social and administrative settings. From North America to Asia, Oceania to Europe, from Latin America to Africa, the reader will find many reasons to closely follow the proposals of the different authors.

The book can be downloaded here [PDF]. I had the pleasure of being one of the book’s contributors, co-authoring an article with Rafael Sampaio on the use of ICT in PB processes: “Electronic Participatory Budgeting: False Dilemmas and True Complexities” [PDF]...”

Democracy and open data: are the two linked?


Molly Shwartz at R-Street: “Are democracies better at practicing open government than less free societies? To find out, I analyzed the 70 countries profiled in the Open Knowledge Foundation’s Open Data Index and compared the rankings against the 2013 Global Democracy Rankings. As a tenet of open government in the digital age, open data practices serve as one indicator of an open government. Overall, there is a strong relationship between democracy and transparency.
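A comparison like this is straightforward to reproduce. The sketch below is purely illustrative and is not the author's actual method: the file names, column names and the choice of a Spearman rank correlation are all assumptions.

```python
# Illustrative sketch only: file names, column names and the use of a Spearman
# rank correlation are assumptions, not the method used in the R-Street post.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical CSV exports of the two rankings, one row per country.
open_data = pd.read_csv("open_data_index_2013.csv")     # columns: country, od_rank
democracy = pd.read_csv("democracy_rankings_2013.csv")  # columns: country, dem_rank

merged = open_data.merge(democracy, on="country")

# Spearman's rho compares the two orderings: +1 means the rankings agree exactly.
rho, p_value = spearmanr(merged["od_rank"], merged["dem_rank"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f}) across {len(merged)} countries")
```

A strongly positive rho across the profiled countries would support the "strong relationship" claim, while the country-level exceptions discussed below would show up as large rank gaps in the merged table.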
Using data collected in October 2013, the top ten countries for openness include the usual bastion-of-democracy suspects: the United Kingdom, the United States, mainland Scandinavia, the Netherlands, Australia, New Zealand and Canada.
There are, however, some noteworthy exceptions. Germany ranks lower than Russia and China. All three rank well above Lithuania. Egypt, Saudi Arabia and Nepal all beat out Belgium. The chart (below) shows the democracy ranking of these same countries from 2008-2013 and highlights the obvious inconsistencies in the correlation between democracy and open data for many countries.
[Chart: democracy rankings, 2008-2013, for the countries discussed above]
There are many reasons for such inconsistencies. The implementation of open-government efforts – for instance, opening government data sets – can often be imperfect or even misguided. Drilling down into some of the data behind the Open Data Index scores reveals that even countries that score very well, such as the United States, have room for improvement. For example, the judicial branch generally does not publish data and houses most information behind a paywall. The status of legislation and amendments introduced by Congress is also often unavailable in machine-readable form.
As internationally recognized markers of political freedom and technological innovation, open government initiatives are appealing political tools for politicians looking to gain prominence in the global arena, regardless of whether or not they possess a real commitment to democratic principles. In 2012, Russia made a public push to cultivate open government and open data projects that was enthusiastically endorsed by American institutions. In a June 2012 blog post summarizing a Russian “Open Government Ecosystem” workshop at the World Bank, one World Bank consultant professed the opinion that open government innovations “are happening all over Russia, and are starting to have genuine support from the country’s top leaders.”
Given the Russian government’s penchant for corruption, cronyism, violations of press freedom and increasing restrictions on public access to information, the idea that it was ever committed to government accountability and transparency is dubious at best. This was confirmed by Russia’s May 2013 withdrawal of its letter of intent to join the Open Government Partnership. As explained by John Wonderlich, policy director at the Sunlight Foundation:

While Russia’s initial commitment to OGP was likely a surprising boon for internal champions of reform, its withdrawal will also serve as a demonstration of the difficulty of making a political commitment to openness there.

Which just goes to show that, while a democratic government does not guarantee open government practices, a government that regularly violates democratic principles may be an impossible environment for implementing open government.
A cursory analysis of the ever-evolving international open data landscape reveals three major takeaways:

  1. Good intentions for government transparency in democratic countries are not always effectively realized.
  2. Politicians will gladly pay lip-service to the idea of open government without backing up words with actions.
  3. The transparency we’ve established can go away quickly without vigilant oversight and enforcement.”

The Universe Is Programmable. We Need an API for Everything


Keith Axline in Wired: “Think about it like this: In the Book of Genesis, God is the ultimate programmer, creating all of existence in a monster six-day hackathon.
Or, if you don’t like Biblical metaphors, you can think about it in simpler terms. Robert Moses was a programmer, shaping and re-shaping the layout of New York City for more than 50 years. Drug developers are programmers, twiddling enzymes to cure what ails us. Even pickup artists and conmen are programmers, running social scripts on people to elicit certain emotional results.

The point is that, much like the computer on your desk or the iPhone in your hand, the entire Universe is programmable. Just as you can build apps for your smartphones and new services for the internet, so can you shape and re-shape almost anything in this world, from landscapes and buildings to medicines and surgeries to, well, ideas — as long as you know the code.
That may sound like little more than an exercise in semantics. But it’s actually a meaningful shift in thinking. If we look at the Universe as programmable, we can start treating it like software. In short, we can improve almost everything we do with the same simple techniques that have remade the creation of software in recent years, things like APIs, open source code, and the massively popular code-sharing service GitHub.
The great thing about the modern software world is that you don’t have to build everything from scratch. Apple provides APIs, or application programming interfaces, that can help you build apps on their devices. And though Tim Cook and company only give you part of what you need, you can find all sorts of other helpful tools elsewhere, thanks to the open source software community.
The same is true if you’re building, say, an online social network. There are countless open source software tools you can use as the basic building blocks — many of them open sourced by Facebook. If you’re creating almost any piece of software, you can find tools and documentation that will help you fashion at least a small part of it. Chances are, someone has been there before, and they’ve left some instructions for you.
Now we need to discover and document the APIs for the Universe. We need a standard way of organizing our knowledge and sharing it with the world at large, a problem for which programmers already have good solutions. We need to give everyone a way of handling tasks the way we build software. Such a system, if it can ever exist, is still years away — decades at the very least — and the average Joe is hardly ready for it. But this is changing. Nowadays, programming skills and the DIY ethos are slowly spreading throughout the population. Everyone is becoming a programmer. The next step is to realize that everything is a program.

What Is an API?

The API may sound like just another arcane computer acronym. But it’s really one of the most profound metaphors of our time, an idea hiding beneath the surface of each piece of tech we use everyday, from iPhone apps to Facebook. To understand what APIs are and why they’re useful, let’s look at how programmers operate.
If I’m building a smartphone app, I’m gonna need — among so many other things — a way of validating a signup form on a webpage to make sure a user doesn’t, say, mistype their email address. That validation has nothing to do with the guts of my app, and it’s surprisingly complicated, so I don’t really want to build it from scratch. Apple doesn’t help me with that, so I start looking on the web for software frameworks, plugins, Software Developer Kits (SDKs) — anything that will help me build my signup tool.
Hopefully, I’ll find one. And if I do, chances are it will include some sort of documentation or “Readme file” explaining how this piece of code is supposed to be used so that I can tailor it to my app. This Readme file should contain installation instructions as well as the API for the code. Basically, an API lays out the code’s inputs and outputs. It shows me what I have to send the code and what it will spit back out. It shows how I bolt it onto my signup form. So the name is actually quite explanatory: Application Programming Interface. An API is essentially an instruction manual for a piece of software.
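To make that concrete, here is a minimal, hypothetical illustration (not from the article) of what such a “Readme” looks like in code: a small signup validator whose docstring is its API, spelling out exactly what you send it and what it spits back.

```python
import re

def validate_signup(email: str, password: str) -> dict:
    """Hypothetical signup validator.

    Inputs (what you send it):
        email    -- the address typed into the signup form
        password -- the chosen password

    Output (what it spits back):
        {"ok": bool, "errors": [list of human-readable problems]}
    """
    errors = []
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        errors.append("email address looks mistyped")
    if len(password) < 8:
        errors.append("password must be at least 8 characters")
    return {"ok": not errors, "errors": errors}

# Knowing only the docstring -- the API -- you can bolt it onto any signup form:
print(validate_signup("user@example.com", "hunter22"))  # {'ok': True, 'errors': []}
```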
Now, let’s combine this with the idea that everything is an application: molecules, galaxies, dogs, people, emotional states, abstract concepts like chaos. If you do something to any of these things, they’ll respond in some way. Like software, they have inputs and outputs. What we need to do is discover and document their APIs.
We aren’t dealing with software code here. Inputs and outputs can themselves be anything. But we can closely document these inputs and their outputs — take what we know about how we interface with something and record it in a standard way that it can be used over and over again. We can create a Readme file for everything.
We can start by doing this in small, relatively easy ways. How about APIs for our cities? New Zealand just open sourced aerial images of about 95 percent of its land. We could write APIs for what we know about building in those areas, from properties of the soil to seasonal weather patterns to zoning laws. All this knowledge exists but it hasn’t been organized and packaged for use by anyone who is interested. And we could go still further — much further.
For example, between the science community, the medical industry and the billions of human experiences, we could probably have a pretty extensive API mapped out of the human stomach — one that I’d love to access when I’m up at 3am with abdominal pains. Maybe my microbiome is out of whack and there’s something I have on-hand that I could ingest to make it better. Or what if we cracked the API for the signals between our eyes and our brain? We wouldn’t need to worry about looking like Glassholes to get access to always-on augmented reality. We could just get an implant. Yes, these APIs will be slightly different for everyone, but that brings me to the next thing we need.

A GitHub for Everything

We don’t just need a Readme for the Universe. We need a way of sharing this Readme and changing it as need be. In short, we need a system like GitHub, the popular online service that lets people share and collaborate on software code.
Let’s go back to the form validator I found earlier. Say I made some modifications to it that I think other programmers would find useful. If the validator is on GitHub, I can create a separate but related version — a fork — that people can find and contribute to, in the same way I first did with the original software.

GitHub not only enables this collaboration, but every change is logged into separate versions. If someone were so inclined, they could go back and replay the building of the validator, from the very first save all the way up to my changes and whoever changes it after me. This creates a tree of knowledge, with giant groups of people creating and merging branches, working on their small section and then giving it back to the whole.
We should be able to funnel all existing knowledge of how things work — not just software code — into a similar system. That way, if my brain-eye interface needs to be different, I (or my personal eye technician) can “fork” the API. In a way, this sort of thing is already starting to happen. People are using GitHub to share government laws, policy documents, Gregorian chants, and the list goes on. The ultimate goal should be to share everything.
Yes, this idea is similar to what you see on sites like Wikipedia, but the stuff that’s shared on Wikipedia doesn’t let you build much more than another piece of text. We don’t just need to know what things are. We need to know how they work in ways that let us operate on them.

The Open Source Epiphany

If you’ve never programmed, all this can sound a bit, well, abstract. But once you enter the coding world, getting a loose grasp on the fundamentals of programming, you instantly see the utility of open source software. “Oooohhh, I don’t have to build this all myself,” you say. “Thank God for the open source community.” Because so many smart people contribute to open source, it helps get the less knowledgeable up to speed quickly. Those acolytes then pay it forward with their own contributions once they’ve learned enough.
Today, more and more people are jumping on this train. More and more people are becoming programmers of some shape or form. It wasn’t so long ago that basic knowledge of HTML was considered specialized geek speak. But now, it’s a common requirement for almost any desk job. Gone are the days when kids made fun of their parents for not being able to set the clock on the VCR. Now they get mocked for mis-cropping their Facebook profile photos.
These changes are all part of the tech takeover of our lives that is trickling down to the masses. It’s like how the widespread use of cars brought a general mechanical understanding of engines to dads everywhere. And this general increase in aptitude is accelerating along with the technology itself.
Steps are being taken to make programming a skill that most kids get early in school along with general reading, writing, and math. In the not too distant future, people will need to program in some form for their daily lives. Imagine the world before the average person knew how to write a letter, or divide two numbers, compared to now. A similar leap is around the corner…”

The promise and perils of giving the public a policy ‘nudge’


Nicholas Biddle and Katherine Curchin at the Conversation: “…These behavioural insights are more than just intellectual curiosities. They are increasingly being used by policymakers inspired by Richard Thaler and Cass Sunstein’s bestselling manifesto for libertarian paternalism, Nudge.
The British and New South Wales governments have set up behavioural insights units. Many other governments around Australia are following their lead.
Most of the attention so far has been on how behavioural insights could be employed to make people slimmer, greener, more altruistic or better savers. However, it’s time we started thinking and talking about the impact these ideas could have on social policy – programs and payments that aim to reduce disadvantage and narrow divergence in opportunity.
While applying behavioural insights can potentially improve the efficiency and effectiveness of social policy, unscrupulous or poorly thought through applications could be disturbing and damaging. It would appear behavioural insights inspired the UK government’s so-called “Nudge Unit” to force job seekers to undergo bogus personality tests – on pain of losing benefits if they refused.
The idea seemed to be that because people readily believe that any vaguely worded combination of character traits applies to them – which is why people connect with their star sign – the results of a fake psychometric test can dupe them into believing they have a go-getting personality.
In our view, this is not how behavioural insights should be applied. This UK example seems to be a particularly troubling case of the use of “nudges” in conjunction with, rather than instead of, coercion. This is the worst of both worlds: not libertarian paternalism, but authoritarian paternalism.
Ironically, this instance betrays a questionable understanding of behavioural insights or at the very least a very short-term focus. Research tells us that co-operative behaviour depends on the perception of fairness and successful framing requires trust.
Dishonest interventions, which make the government seem both unfair and untrustworthy, should have the longer-term effect of undermining its ability to elicit cooperation and successfully frame information.
Some critics have assumed nudge is inherently conservative or neoliberal. Yet these insights could inform progressive reform in many ways.
For example, taking behavioural insights seriously would encourage a redesign of employment services. There is plenty of scope for thinking more rigorously about how job seekers’ interactions with employment services unintentionally inhibit their motivation to search for work.

Beware accidental nudges

More than just a nudge here or there, behavioural insights can be used to reflect on almost all government decisions. Too often governments accidentally nudge citizens in the opposite direction to where they want them to go.
Take the disappointing take-up of the Matched Savings Scheme, which is part of New Income Management in the Northern Territory. It matches welfare recipients’ savings dollar-for-dollar up to a maximum of A$500 and is meant to get people into the habit of saving regularly.
No doubt saving is extremely hard for people on very low incomes. But another reason so few people embraced the savings program may be a quirk in its design: people had to save money out of their non-income-managed funds, but the $500 reward they received from the government went into their income-managed account.
To some people this appears to have signalled the government’s bad faith. It said to them: even if you demonstrate your responsibility with money, we still won’t trust you.
The Matched Savings Scheme was intended to be a carrot, not a stick. It was supposed to complement the coercive element of income management by giving welfare recipients an incentive to improve their budgeting. Instead it was perceived as an invitation to welfare recipients to be complicit in their own humiliation.
The promise of an extra $500 would have been a strong lure for Homo economicus, but it wasn’t for Homo sapiens. People out of work or on income support are no more or less rational than merchant bankers or economics professors. Their circumstances and choices are different though.
The idiosyncrasies of human decision-making don’t mean that the human brain is fundamentally flawed. Most of the biases that we mentioned earlier are adaptive. But they do mean that policy makers need to appreciate how we differ from rational utility maximisers.
Real humans are not worse than economic man. We’re just different and we deserve policies made for Homo sapiens, not Homo economicus.”

Democracy in Retreat


Book by Joshua Kurlantzick (Council on Foreign Relations) on “The Revolt of the Middle Class and the Worldwide Decline of Representative Government”: “Since the end of the Cold War, most political theorists have assumed that as countries develop economically, they will also become more democratic—especially if a vibrant middle class takes root. The triumph of democracy, once limited to a tiny number of states and now spread across the globe, has been considered largely inevitable.
In Democracy in Retreat: The Revolt of the Middle Class and the Worldwide Decline of Representative Government, CFR Fellow for Southeast Asia Joshua Kurlantzick identifies forces that threaten democracy and shows that conventional wisdom has blinded world leaders to a real crisis. “Today a constellation of factors, from the rise of China to the lack of economic growth in new democracies to the West’s financial crisis, has come together to hinder democracy throughout the developing world,” he writes. “Absent radical and unlikely changes in the international system, that combination of antidemocratic factors will have serious staying power.”
Kurlantzick pays particular attention to the revolt of middle class citizens, traditionally proponents of reform, who have turned against democracy in countries such as Venezuela, Pakistan, and Taiwan. He observes that countries once held up as model new democracies, such as Hungary and the Czech Republic, have since curtailed social, economic, and political freedoms. Military coups have grabbed power from Honduras to Thailand to Fiji. The number of representative governments has fallen, and the quality of democracy has deteriorated in many states where it had been making progress, including Russia, Kenya, Argentina, and Nigeria.
The renewed strength of authoritarian rule, warns Kurlantzick, means that billions of people around the world continue to live under repressive regimes.”

The GovLab Index: Open Data


Please find below the latest installment in The GovLab Index series, inspired by Harper’s Index. “The GovLab Index: Open Data — December 2013” provides an update on our previous Open Data installment, and highlights global trends in Open Data and the release of public sector information. Previous installments include Measuring Impact with Evidence, The Data Universe, Participation and Civic Engagement and Trust in Institutions.
Value and Impact

  • Potential global value of open data estimated by McKinsey: $3 trillion annually
  • Potential yearly value for the United States: $1.1 trillion 
  • Europe: $900 billion
  • Rest of the world: $1.7 trillion
  • How much the value of open data is estimated to grow per year in the European Union: 7% annually
  • Value of releasing UK’s geospatial data as open data: £13 million per year by 2016
  • Estimated worth of business reuse of public sector data in Denmark in 2010: more than €80 million a year
  • Estimated worth of business reuse of public sector data across the European Union in 2010: €27 billion a year
  • Total direct and indirect economic gains from easier public sector information re-use across the whole European Union economy, as of May 2013: €140 billion annually
  • Economic value of publishing data on adult cardiac surgery in the U.K., as of May 2013: £400 million
  • Economic value of time saved for users of live data from the Transport for London apps, as of May 2013: between £15 million and £58 million
  • Estimated increase in GDP in England and Wales in 2008-2009 due to the adoption of geospatial information by local public services providers: +£320m
  • Average decrease in borrowing costs in sovereign bond markets for emerging market economies when implementing transparent practices (measured by accuracy and frequency according to IMF policies, across 23 countries from 1999-2002): 11%
  • Open weather data supports an estimated $1.5 billion in applications in the secondary insurance market – but much greater value comes from accurate weather predictions, which save the U.S. annually more than $30 billion
  • Estimated value of GPS data: $90 billion

Efforts and Involvement

  • Number of U.S. based companies identified by the GovLab that use government data in innovative ways: 500
  • Number of open data initiatives worldwide in 2009: 2
  • Number of open data initiatives worldwide in 2013: over 300
  • Number of countries with open data portals: more than 40
  • Countries that share more information online than the U.S.: 14
  • Number of cities globally that participated in 2013 International Open Data Hackathon Day: 102
  • Number of U.S. cities with Open Data Sites in 2013: 43
  • U.S. states with open data initiatives: 40
  • Membership growth in the Open Government Partnership in two years: from 8 to 59 countries
  • Number of time series indicators (GDP, foreign direct investment, life expectancy, internet users, etc.) in the World Bank Open Data Catalog: over 8,000
  • How many of 77 countries surveyed by the Open Data Barometer have some form of Open Government Data Initiative: over 55%
  • How many OGD initiatives have dedicated resources with senior level political backing: over 25%
  • How many countries are in the Open Data Index: 70
    • How many of the 700 key datasets in the Index are open: 84
  • Number of countries in the Open Data Census: 77
    • How many of the 727 key datasets in the Census are open: 95
  • How many countries surveyed have formal data policies in 2013: 55%
  • Those who have machine-readable data available: 25%
  • Top 5 countries in Open Data rankings: United Kingdom, United States, Sweden, New Zealand, Norway
  • The different levels of Open Data Certificates a data user or publisher can achieve “along the way to world-class open data”: 4 (Raw, Pilot, Standard and Expert)
  • The number of data ecosystem categories identified by the OECD: 3 (data producers, infomediaries, and users)

Examining Datasets
FULL VERSION AT http://thegovlab.org/govlab-index-open-data-updated/
 

AU: Govt finds one third of open data was "junk"


IT News: “The number of datasets available on the Government’s open data website has slimmed by more than half after the agency discovered one third of the datasets were junk.
Since its official launch in 2011 data.gov.au grew to hold 1200 datasets from government agencies for public consumption.
In July this year the Department of Finance migrated the portal to a new open source platform – the Open Knowledge Foundation’s CKAN platform – for greater ease of use and publishing ability.
Since July the number of datasets fell from 1200 to 500.
Australian Government CTO John Sheridan said in his blog late yesterday the agency had needed to review the 1200 datasets as a result of the CKAN migration, and discovered a significant amount of them were junk.
“We unfortunately found that a third of the “datasets” were just links to webpages or files that either didn’t exist anymore, or redirected somewhere not useful to genuine seekers of data,” Sheridan said.
“In the second instance, the original 1200 number included each individual file. On the new platform, a dataset may have multiple files. In one case we have a dataset with 200 individual files where before it was counted as 200 datasets.”
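Because data.gov.au now runs on CKAN, the dataset-versus-file distinction Sheridan describes can be checked against the portal's standard CKAN Action API. The snippet below is a rough sketch under that assumption; the endpoint and field names follow generic CKAN conventions rather than anything stated in the article.

```python
# Rough sketch: counts datasets and the files ("resources") attached to them
# via the standard CKAN Action API, assuming data.gov.au exposes it as usual.
import requests

BASE = "https://data.gov.au/api/3/action"

# Total number of datasets (packages) on the portal.
total = requests.get(f"{BASE}/package_search", params={"rows": 0}).json()["result"]["count"]

# Fetch the first 100 datasets and count the individual files inside each one.
results = requests.get(f"{BASE}/package_search", params={"rows": 100}).json()["result"]["results"]
file_count = sum(len(pkg.get("resources", [])) for pkg in results)

print(f"{total} datasets on the portal")
print(f"the first {len(results)} datasets contain {file_count} individual files")
```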
The number of datasets following the clean out now sits at 529. Around 123 government bodies contributed data to the portal.
Sheridan said the number was still too low.
“A lot of momentum has built around open data in Australia, including within governments around the country and we are pleased to report that a growing number of federal agencies are looking at how they can better publish data to be more efficient, improve policy development and analysis, deliver mobile services and support greater transparency and public innovation,” he said….
The Federal Government’s approach to open data has previously been criticised as “patchy” and slow, due in part to several shortcomings in the data.gov.au website as well as slow progress in agencies adopting an open approach by default.
The Australian Information Commissioner’s February report on open data in government identified the manual uploading and updating of datasets, the lack of automated entry for metadata and the lack of specific search functions within data.gov.au as obstacles to efforts to push a whole-of-government approach to open data.
The introduction of the new CKAN platform is expected to go some way to addressing the highlighted concerns.”

Why This Company Is Crowdsourcing, Gamifying The World's Most Difficult Problems


FastCompany: “The biggest consultancy firms–the McKinseys and Janeses of the world–make many millions of dollars predicting the future and writing what-if reports for clients. This model is built on the idea that those companies know best–and that information and ideas should be handed down from on high.
But one consulting house, Wikistrat, is upending the model: Instead of using a stable of in-house analysts, the company crowdsources content and pays the crowd for its time. Wikistrat’s hundreds of analysts–primarily consultants, academics, journalists, and retired military personnel–are compensated for participating in what they call “crowdsourced simulations.” In other words, make money for brainstorming.

According to Joel Zamel, Wikistrat’s founder, approximately 850 experts in various fields rotate in and out of different simulations and project exercises for the company. While participating in a crowdsourced simulation, consultants are paid a flat fee plus performance bonuses based on a gamification engine in which experts compete to win extra cash. The company declined to reveal its fee scale, but as of 2011 bonus money appears to be in the $10,000 range.
Zamel characterizes the company’s clients as a mix of government agencies worldwide and multinational corporations. The simulations are semi-anonymous for players; consultants don’t know who their paper is being written for or who the end consumer is, but clients know which of Wikistrat’s contestants are participating in the brainstorm exercise. Once an exercise is over, the discussions from the exercise are taken by full-time employees at Wikistrat and converted into proper reports for clients.
“We’ve developed a quite significant crowd network and a lot of functionality into the platform,” Zamel tells Fast Company. “It uses a gamification engine we created that incentivizes analysts by ranking them at different levels for the work they do on the platform. They are immediately rewarded through the engine, and we also track granular changes made in real time. This allows us to track analyst activity and encourages them to put time and energy into Wiki analysis.” Zamel says projects typically run between three and four weeks, with between 50 and 100 analysts working on a project for generally between five and 12 hours per week. Most of the analysts, he says, view this as a side income on top of their regular work at day jobs but some do much more: Zamel cited one PhD candidate in Australia working 70 hours a week on one project instead of 10 to 15 hours.
Much of Wikistrat’s output is related to current events. Although Zamel says the bulk of their reports are written for clients and not available for public consumption, Wikistrat does run frequent public simulations as a way of attracting publicity and recruiting talent for the organization. Their most recent crowdsourced project is called Myanmar Moving Forward and runs from November 25 to December 9. According to Wikistrat, they are asking their “Strategic community to map out Myanmar’s current political risk factor and possible futures (positive, negative, or mixed) for the new democracy in 2015. The simulation is designed to explore the current social, political, economic, and geopolitical threats to stability–i.e. its political risk–and to determine where the country is heading in terms of its social, political, economic, and geopolitical future.”…