Opening Governance – Change, Continuity and Conceptual Ambiguity


Introduction to special issue of IDS Bulletin by Rosemary McGee and Duncan Edwards: “Open government and open data are new areas of research, advocacy and activism that have entered the governance field alongside the more established areas of transparency and accountability. This article reviews recent scholarship in these areas, pinpointing contributions to more open, transparent, accountable and responsive governance via improved practice, projects and programmes. The authors set the rest of the articles from this IDS Bulletin in the context of the ideas, relationships, processes, behaviours, policy frameworks and aid funding practices of the last five years, and critically discuss questions and weaknesses that limit the effectiveness and impact of this work. Identifying conceptual ambiguity as a key problem, they offer a series of definitions to help overcome the technical and political difficulties this causes. They also identify hype and euphemism, and offer a series of conclusions to help restore meaning and ideological content to work on open government and open data in transparent and accountable governance….(More)”

When is your problem a ‘Challenge’?


Ed Parkes at Nesta: “More NGOs, Government Departments and city governments are using challenge prizes to help develop new products and services which ‘solve’ a problem they have identified. There have been several high profile prizes (for instance, Nesta’s Longitude Prize or the recently announced $7 million ocean floor Xprize) and a growing number of platforms for running them (such as Challenge.gov or OpenIdeo). Due to this increased profile, challenge prizes are more often seen by public sector strategists and policy owners as holding the potential to solve their tricky strategic issues.

To characterise, the starting point is often “If only we could somehow get new, smart, digitally-informed organisations to solve the underfunded, awkward strategic issues we’ve been grappling with, wouldn’t it be great?”.

This approach is especially tantalising for public sector organisations as it means they can be seen to take action on an issue through ‘market shaping’, rather than resorting to developing policy or intervening with regulation or legislation.

Having worked on a series of challenge prizes on open data over the last couple of years, as well as subsequently working with organisations on how our design principles could be applied to their objectives, I’ve spent some time thinking about when it’s appropriate to run a challenge prize. The design and practicalities of running a successful challenge prize are not always straightforward. Thankfully there has already been some useful broad guidance on this from Nesta’s Centre for Challenge Prizes in their Challenge Prize Practice Guide and McKinsey and Deloitte have also published guides.

Yet despite this high-quality guidance, like many things in life, the most difficult part is knowing where to start. Organisations struggle to understand whether they have the right problem in the first place. In many instances running a challenge prize is not the appropriate organisational response to an issue, and it’s best to discover this early on. From my experience, there are two key questions worth asking when you’re trying to work out if your problem is suitable:

1. Is your problem an issue for anyone other than your own organisation?…

2. Will other people see solving this problem as an investment opportunity or worth funding?…

These two considerations come down to one thing – incentive. Firstly, does anyone other than your organisation care about this issue and, secondly, do they care enough about it to pay to solve it?….(More)”

Met Office warns of big data floods on the horizon


At V3: “The amount of data being collected by departments and agencies means government services will not be able to implement truly open data strategies, according to Met Office CIO Charles Ewen.

Ewen said the rapidly increasing amount of data being stored by companies and government departments means it will not be technologically possible to share all their data in the near future.

During a talk at the Cloud World Forum on Wednesday, he said: “The future will be bigger and bigger data. Right now we’re talking about petabytes, in the near future it will be tens of petabytes, then soon after it’ll be hundreds of petabytes and then we’ll be off into imaginary figure titles.

“We see a future where data has gotten so big the notion of open data and the idea ‘let’s share our data with everybody and anybody’ just won’t work. We’re struggling to make it work already and by 2020 the national infrastructure will not exist to shift this stuff [data] around in the way anybody could access and make use of it.”

Ewen added that to deal with the shift he expects many departments and agencies will adapt their processes to become digital curators that are more selective about the data they share, to try and ensure it is useful.

“This isn’t us wrapping our arms around our data and saying you can’t see it. We just don’t see how we can share all this big data in the way you would want it,” he said.

“We see a future where a select number of high-capacity nodes become information brokers and are used to curate and manage data. These curators will be where people bring their problems. That’s the future we see.”

Ewen added that the current expectations around open data are based on misguided views about the capabilities of cloud technology to host and provide access to huge amounts of data.

“The trendy stuff out there claims to be great at everything, but don’t get carried away. We don’t see cloud as anything but capability. We’ve been using appropriate IT and what’s available to deliver our mission services for over 50 to 60 years, and cloud is playing an increasing part of that, but purely for increased capability,” he said.

“It’s just another tool. The important thing is having the skill and knowledge to not just believe vendors but to look and identify the problem and say ‘we have to solve this’.”

The Met Office CIO’s comments follow reports from other government service providers that people’s desire for open data is growing exponentially….(More)”

Open data set to reshape charity and activism in 2016


The Guardian: “In 2015 the EU launched the world’s first international data portal, the Chinese government pledged to make state data public, and the UK lost its open data crown to Taiwan. Troves of data were unlocked by governments around the world last year, but the usefulness of much of that data is still to be determined by the civic groups, businesses and governments who use it. So what’s in the pipeline? And how will the open data ecosystem grow in 2016? We asked the experts.

1. Data will be seen as infrastructure (Heather Savory, director general for data capability, Office for National Statistics)….

2. Journalists, charities and civil society bodies will embrace open data (Hetan Shah, executive director, the Royal Statistical Society)…

3. Activists will take it upon themselves to create data (Pavel Richter, chief executive, Open Knowledge International)….

4. Data illiteracy will come at a heavy price (Sir Nigel Shadbolt, principal, Jesus College, Oxford, professorial research fellow in computer science, University of Oxford, and chairman and co-founder of the Open Data Institute)…

5. We’ll create better tools to build a web of data (Dr Elena Simperl, associate professor, electronics and computer science, University of Southampton) …(More)”

This Is How Visualizing Open Data Can Help Save Lives


Alexander Howard at the Huffington Post: “Cities are increasingly releasing online the data they use to make life better for their residents, enabling journalists and researchers to better inform the public.

Los Angeles, for example, has analyzed data about injuries and deaths on its streets and published it online. Now people can check its conclusions and understand why LA’s public department prioritizes certain intersections.

The impact from these kinds of investments can lead directly to saving lives and preventing injuries. The work is part of a broader effort around the world to make cities safer.

Like New York City, San Francisco and Portland, Oregon, Los Angeles has adopted Sweden’s “Vision Zero” program as part of its strategy for eliminating traffic deaths. California led the nation in bicycle deaths in 2014.

At visionzero.lacity.org, you can see that the City of Los Angeles is using data visualization to identify the locations of “high injury networks,” or the 6 percent of intersections that account for 65 percent of the severe injuries in the area.
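The “high injury network” idea can be sketched numerically: rank intersections by severe-injury counts and keep the smallest set that covers a target share of the citywide total. A minimal Python sketch, using made-up intersection names and counts rather than LA’s actual data:

```python
# Hypothetical sketch of a "high injury network" calculation: which
# intersections, ranked by severe-injury count, first account for a
# target share (e.g. 65%) of all severe injuries?

def high_injury_network(injuries_by_intersection, target_share=0.65):
    """Return the intersections whose combined severe-injury counts
    first reach `target_share` of the citywide total."""
    ranked = sorted(injuries_by_intersection.items(),
                    key=lambda kv: kv[1], reverse=True)
    total = sum(injuries_by_intersection.values())
    network, running = [], 0
    for name, count in ranked:
        network.append(name)
        running += count
        if running / total >= target_share:
            break
    return network

# Toy example: two intersections dominate the injury counts.
counts = {"A & 1st": 40, "B & 2nd": 30, "C & 3rd": 5,
          "D & 4th": 4, "E & 5th": 3}
print(high_injury_network(counts))  # ['A & 1st', 'B & 2nd']
```

On real data the same cumulative-share logic would run over police collision records rather than a hand-written dictionary.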


The work is the result of LA’s partnership with University of Southern California graduate students. As a result of these analyses, the Los Angeles Police Department has been cracking down on jaywalking near the University of Southern California.

Abhi Nemani, the former chief data officer for LA, explained why the city needed to “go back to school” for help.

“In resource-constrained environments — the environment most cities find themselves in these days — you often have to beg, borrow, and steal innovation; particularly so, when it comes to in-demand resources such as data science expertise,” he told the Huffington Post.

“That’s why in Los Angeles, we opted to lean on the community for support: both the growing local tech sector and the expansive academic base. The academic community, in particular, was eager to collaborate with the city. In fact, most — if not all — local institutions reached out to me at some point asking to partner on a data science project with their graduate students.”

The City of Los Angeles is now working with another member of its tech sector to eliminate traffic deaths. DataScience, based in Culver City, California, received $22 million in funding in December to provide predictive insights for customers.

“The City of Los Angeles is very data-driven,” DataScience CEO Ian Swanson told HuffPost. “I commend Mayor Eric Garcetti and the City of Los Angeles on the openness, transparency, and availability of city data. Initiatives like Vision Zero put the City of Los Angeles’ data into action and improve life in this great city.”

DataScience created an interactive online map showing the locations of collisions involving bicycles across the city….(More)”

Opening up government data for public benefit


Keiran Hardy at the Mandarin (Australia): “…This post explains the open data movement and considers the benefits and risks of releasing government data as open data. It then outlines the steps taken by the Labor and Liberal governments in accordance with this trend. It argues that the Prime Minister’s task, while admirably intentioned, is likely to prove difficult due to ongoing challenges surrounding the requirements of privacy law and a public service culture that remains reluctant to release government data into the public domain….

A key purpose of releasing government data is to improve the effectiveness and efficiency of services delivered by the government. For example, data on crops, weather and geography might be analysed to improve current approaches to farming and industry, or data on hospital admissions might be analysed alongside demographic and census data to improve the efficiency of health services in areas of need. It has been estimated that such innovation based on open data could benefit the Australian economy by up to $16 billion per year.

Another core benefit is that the open data movement is making gains in transparency and accountability, as a greater proportion of government decisions and operations are being shared with the public. These democratic values are made clear in the OGP’s Open Government Declaration, which aims to make governments ‘more open, accountable, and responsive to citizens’.

Open data can also improve democratic participation by allowing citizens to contribute to policy innovation. Events like GovHack, an annual Australian competition in which government, industry and the general public collaborate to find new uses for open government data, epitomise a growing trend towards service delivery informed by user input. The winner of the “Best Policy Insights Hack” at GovHack 2015 developed a software program for analysing which suburbs are best placed for rooftop solar investment.

At the same time, the release of government data poses significant risks to the privacy of Australian citizens. Much of the open data currently available is spatial (geographic or satellite) data, which is relatively unproblematic to post online as it poses minimal privacy risks. However, for the full benefits of open data to be gained, these kinds of data need to be supplemented with information on welfare payments, hospital admission rates and other potentially sensitive areas which could drive policy innovation.

Policy data in these areas would be de-identified — that is, all names, addresses and other obvious identifying information would be removed so that only aggregate or statistical data remains. However, debates continue as to the reliability of de-identification techniques, as there have been prominent examples of individuals being re-identified by cross-referencing datasets….
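The de-identification step described above can be sketched in a few lines: drop direct identifiers and release only aggregate counts, suppressing small cells, since rare attribute combinations are precisely what enable re-identification by cross-referencing datasets. The records and field names below are hypothetical toy data, not any real release:

```python
# Hedged sketch of basic de-identification: strip direct identifiers
# from hypothetical hospital-admission records and publish only
# suburb-level counts above a small-cell threshold.
from collections import Counter

def aggregate_admissions(records, min_cell_size=5):
    """Aggregate admission records to suburb-level counts, suppressing
    cells below `min_cell_size` (a common disclosure-control step,
    since counts of one or two point to identifiable individuals)."""
    counts = Counter(r["suburb"] for r in records)  # names/addresses dropped
    return {suburb: n for suburb, n in counts.items() if n >= min_cell_size}

# Toy data: seven admissions in one suburb, a single one in another.
records = [{"name": f"person{i}", "suburb": "Carlton"} for i in range(7)]
records += [{"name": "person7", "suburb": "Fitzroy"}]  # count of 1: suppressed
print(aggregate_admissions(records))  # {'Carlton': 7}
```

As the debates cited above note, even aggregates like these are not a guarantee: combining several such releases can still narrow down individuals.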

With regard to open data, a culture resistant to releasing government information appears to be driven by several similar factors, including:

  • A generational preference amongst public service management for maintaining secrecy of information, whereas younger generations expect that data should be made freely available;
  • Concerns about the quality or accuracy of information being released;
  • Fear that mistakes or misconduct on behalf of government employees might be exposed;
  • Limited understanding of the benefits that can be gained from open data; and
  • A lack of leadership to help drive the open data movement.

If open data policies have a similar effect on public service culture as FOI legislation, it may be that open data policies in fact hinder transparency by having a chilling effect on government decision-making for fear of what might be exposed….

These legal and cultural hurdles will pose ongoing challenges for the Turnbull government in seeking to release greater amounts of government data as open data….(More)

OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data


Paper by Taha A Kass-Hout et al. in JAMIA: “The objective of openFDA is to facilitate access and use of big, important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs).

Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges.

Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event.

Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products…(More)”
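As a rough illustration of the API access the paper describes, the sketch below builds a query URL for openFDA’s drug adverse-event endpoint (`api.fda.gov/drug/event.json`, per openFDA’s public documentation). The example drug and parameter values are arbitrary, and the snippet only constructs the URL, so it runs without network access:

```python
# Hedged sketch: build an openFDA adverse-event query URL. Endpoint and
# `search`/`limit` parameters follow openFDA's published API; the drug
# name is an arbitrary example.
from urllib.parse import urlencode

BASE = "https://api.fda.gov/drug/event.json"

def adverse_event_query(drug, limit=1):
    """Build a query URL for adverse-event reports mentioning `drug`."""
    params = {"search": f'patient.drug.medicinalproduct:"{drug}"',
              "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = adverse_event_query("aspirin")
print(url)

# An actual request would then be, e.g.:
#   import json, urllib.request
#   reports = json.load(urllib.request.urlopen(url))["results"]
```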

Open Data Index 2015


Open Knowledge: “….This year’s Index showed impressive gains from non-OECD countries with Taiwan topping the Index and Colombia and Uruguay breaking into the top ten at four and seven respectively. Overall, the Index evaluated 122 places and 1586 datasets and determined that only 9%, or 156 datasets, were both technically and legally open.

The Index ranks countries based on the availability and accessibility of data in thirteen key categories, including government spending, election results, procurement, and pollution levels. Over the summer, we held a public consultation, which saw contributions from individuals within the open data community as well as from key civil society organisations across an array of sectors. As a result of this consultation, we expanded the 2015 Index to include public procurement data, water quality data, land ownership data and weather data; we also decided to remove transport timetables due to the difficulties faced when comparing transport system data globally.

Open Knowledge International began to systematically track the release of open data by national governments in 2013, with the objective of measuring whether governments were releasing the key datasets of high social and democratic value as open data. That enables us to better understand the current state of play and, in turn, work with civil society actors to address the gaps in data release. Over the course of the last three years, the Global Open Data Index has become more than just a benchmark – we noticed that governments began to use the Index as a reference to inform their open data priorities, and civil society actors began to use the Index as an advocacy tool to encourage governments to improve their performance in releasing key datasets.

Furthermore, indices such as the Global Open Data Index are not without their challenges. The Index measures the technical and legal openness of datasets deemed to be of critical democratic and social value – it does not measure the openness of a given government. It should be clear that the release of a few key datasets is not a sufficient measure of the openness of a government. The blurring of lines between open data and open government is nothing new and has been hotly debated by civil society groups and transparency organisations since the sharp rise in popularity of open data policies over the last decade. …Index at http://index.okfn.org/”

State of the Commons


Creative Commons: “Creative Commoners have known all along that collaboration, sharing, and cooperation are a driving force for human evolution. And so for many it will come as no surprise that in 2015 we achieved a tremendous milestone: over 1.1 billion CC licensed photos, videos, audio tracks, educational materials, research articles, and more have now been contributed to the shared global commons…..

Whether it’s open education, open data, science, research, music, video, photography, or public policy, we are putting sharing and collaboration at the heart of the Web. In doing so, we are much closer to realizing our vision: unlocking the full potential of the Internet to drive a new era of development, growth, and productivity.

I am proud to share with you our 2015 State of the Commons report, our best effort to measure the immeasurable scope of the commons by looking at the CC licensed content, along with content marked as public domain, that comprise the slice of the commons powered by CC tools. We are proud to be a leader in the commons movement, and we hope you will join us as we celebrate all we have accomplished together this year. ….Report at https://stateof.creativecommons.org/2015/”

The $50 Million Competition to Remake the American City


Alex Davies at Wired: “IN THE NEXT 30 years, the American population will rise by 70 million people. This being the future, those people will love ordering stuff online even more than people do now, which will prompt a 45 percent rise in freight volume. The nation’s roads, already crumbling because Congress likes bickering more than legislating, will be home to 65 percent more trucks.

That’s just one of the ways a report, released earlier this year by the US Department of Transportation, says a growing population will strain an already overloaded highway system. Eager to avert some of these problems and get people thinking about the mobility of tomorrow, today the DOT is launching the Smart City Challenge, a contest that invites American cities to take advantage of new technologies that could change how we move.

Open data, smart gadgets, autonomous vehicles, and connected cars are among the tech already revolutionizing the road, while companies ranging from Apple and Google to Uber and Lyft promise to revolutionize how people and goods get around. The city that offers the most compelling plan gets $50 million to begin making it happen.

The challenge represents a new way of working for the DOT, one tailored to a rapidly changing world….(More)” See also www.transportation.gov/smartcity.