New Open Data Executive Order and Policy


The White House: “The Obama Administration today took groundbreaking new steps to make information generated and stored by the Federal Government more open and accessible to innovators and the public, to fuel entrepreneurship and economic growth while increasing government transparency and efficiency.
Today’s actions—including an Executive Order signed by the President and an Open Data Policy released by the Office of Management and Budget and the Office of Science and Technology Policy—declare that information is a valuable national asset whose value is multiplied when it is made easily accessible to the public.  The Executive Order requires that, going forward, data generated by the government be made available in open, machine-readable formats, while appropriately safeguarding privacy, confidentiality, and security.
The move will make troves of previously inaccessible or unmanageable data easily available to entrepreneurs, researchers, and others who can use those files to generate new products and services, build businesses, and create jobs….
Along with the Executive Order and Open Data Policy, the Administration announced a series of complementary actions:
• A new Data.Gov.  In the months ahead, Data.gov, the powerful central hub for open government data, will launch new services that include improved visualization, mapping tools, better context to help locate and understand these data, and robust Application Programming Interface (API) access for developers.
• New open source tools to make data more open and accessible.  The US Chief Information Officer and the US Chief Technology Officer are releasing free, open source tools on Github, a site that allows communities of developers to collaboratively develop solutions.  This effort, known as Project Open Data, can accelerate the adoption of open data practices by providing plug-and-play tools and best practices to help agencies improve the management and release of open data.  For example, one tool released today automatically converts simple spreadsheets and databases into APIs for easier consumption by developers.  Anyone, from government agencies to private citizens to local governments and for-profit companies, can freely use and adapt these tools starting immediately.
• Building a 21st century digital government.  As part of the Administration’s Digital Government Strategy and Open Data Initiatives in health, energy, education, public safety, finance, and global development, agencies have been working to unlock data from the vaults of government, while continuing to protect privacy and national security.  Newly available or improved data sets from these initiatives will be released today and over the coming weeks as part of the one year anniversary of the Digital Government Strategy.
• Continued engagement with entrepreneurs and innovators to leverage government data.  The Administration has convened and will continue to bring together companies, organizations, and civil society for a variety of summits to highlight how these innovators use open data to positively impact the public and address important national challenges.  In June, Federal agencies will participate in the fourth annual Health Datapalooza, hosted by the nonprofit Health Data Consortium, which will bring together more than 1,800 entrepreneurs, innovators, clinicians, patient advocates, and policymakers for information sessions, presentations, and “code-a-thons” focused on how the power of data can be harnessed to help save lives and improve healthcare for all Americans.
For more information on open data highlights across government visit: http://www.whitehouse.gov/administration/eop/ostp/library/docsreports”
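The spreadsheet-to-API converter mentioned under Project Open Data can be illustrated with a minimal sketch: parse a CSV into row dictionaries, then serve them as JSON over HTTP. This is an illustration of the idea only, not the actual Project Open Data tool; the sample data and server wiring are hypothetical.

```python
# Minimal sketch of a spreadsheet-to-API converter: parse CSV rows
# into dictionaries, then serve them as a JSON array over HTTP.
# (Illustrative only -- not the actual Project Open Data tool.)
import csv
import io
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def csv_to_records(csv_text):
    """Turn CSV text into a list of {column: value} row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

class CsvApiHandler(BaseHTTPRequestHandler):
    records = []  # populated with csv_to_records(...) before serving

    def do_GET(self):
        # Every GET returns the full spreadsheet as JSON.
        body = json.dumps(self.records).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

sample = "agency,datasets\nDOL,12\nHHS,40\n"
print(csv_to_records(sample))
# To expose the rows as an API (this call blocks):
#   CsvApiHandler.records = csv_to_records(sample)
#   HTTPServer(("localhost", 8000), CsvApiHandler).serve_forever()
```

Real converters layer filtering, pagination, and per-row endpoints on top of this, but the core transformation is just spreadsheet rows to JSON objects.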

Open government data shines a light on hospital billing and health care costs


Alex Howard: “If transparency is the best disinfectant, casting sunlight upon the cost of care in hospitals across the United States will make the health care system itself healthier.
The Department of Health and Human Services has released open data that compares the billing for the 100 most common treatments and procedures performed at more than 3,000 hospitals in the U.S. The Medicare provider charge data shows significant variation within communities and across the country for the same procedures.
One hospital charged $8,000, another $38,000 — for the same condition. This data is enabling newspapers like the Washington Post to show people the actual costs of health care and create interactive features that enable people to search for individual hospitals and see how they compare. The New York Times explored the potential reasons behind wild disparities in billing at length today, from sicker patients to longer hospitalizations to higher labor costs.”
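The kind of comparison the Post and the Times built can be sketched in a few lines: group the Medicare provider charge data by procedure and report the spread in average charges across hospitals. The column names below follow the 2013 CMS release but should be verified against the actual download; the sample rows are illustrative.

```python
# Sketch: surface billing variation by grouping charge data per
# procedure (DRG) and computing the min/max across hospitals.
# Column names are assumptions based on the 2013 CMS file.
from collections import defaultdict

def charge_spread(rows):
    """Return {procedure: (min_charge, max_charge)} across hospitals."""
    by_drg = defaultdict(list)
    for row in rows:
        by_drg[row["DRG Definition"]].append(float(row["Average Covered Charges"]))
    return {drg: (min(c), max(c)) for drg, c in by_drg.items()}

rows = [
    {"DRG Definition": "470 - JOINT REPLACEMENT", "Average Covered Charges": "8000"},
    {"DRG Definition": "470 - JOINT REPLACEMENT", "Average Covered Charges": "38000"},
]
print(charge_spread(rows))  # {'470 - JOINT REPLACEMENT': (8000.0, 38000.0)}
```

Running the same grouping over the full CMS file, keyed additionally by region, is essentially what the newspapers' interactive lookup tools do.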

The Uncertain Relationship Between Open Data and Accountability


Tiago Peixoto’s Response to Yu and Robinson’s paper on The New Ambiguity of “Open Government”: “By looking at the nature of data that may be disclosed by governments, Harlan Yu and David Robinson provide an analytical framework that evinces the ambiguities underlying the term “open government data.” While agreeing with their core analysis, I contend that the authors ignore the enabling conditions under which transparency may lead to accountability, notably the publicity and political agency conditions. I argue that the authors also overlook the role of participatory mechanisms as an essential element in unlocking the potential for open data to produce better government decisions and policies. Finally, I conduct an empirical analysis of the publicity and political agency conditions in countries that have launched open data efforts, highlighting the challenges associated with open data as a path to accountability.”
 

Measuring Impact of Open and Transparent Governance


Mark Robinson @ OGP blog: “Eighteen months on from the launch of the Open Government Partnership in New York in September 2011, there is growing attention to what has been achieved to date.  In the recent OGP Steering Committee meeting in London, government and civil society members were unanimous in the view that the OGP must demonstrate results and impact to retain its momentum and wider credibility.  This will be a major focus of the annual OGP conference in London on 31 October and 1 November, with an emphasis on showcasing innovations, highlighting results and sharing lessons.
Much has been achieved in eighteen months.  Membership has grown from 8 founding governments to 58.  Many action plan commitments have been realised for the majority of OGP member countries. The Independent Reporting Mechanism has been approved and launched. Lesson learning and sharing experience is moving ahead….
The third type of result is the trickiest to measure: What has been the impact of openness and transparency on the lives of ordinary citizens?  In the two years since the OGP was launched it may be difficult to find many convincing examples of such impact, but it is important to make a start in collecting such evidence.
Impact on the lives of citizens would be evident in improvements in the quality of service delivery, achieved by making information on quality, access and complaint redressal public. A related example would be efficiency savings realised from publishing government contracts.  Misallocation of public funds exposed through enhanced budget transparency is another. Action on corruption arising from bribes for services, misuse of public funds, or illegal procurement practices would all be significant results from these transparency reforms.  A final example relates to jobs and prosperity, where the private sector utilises government data in the public domain to inform business investment decisions and create employment.
Generating convincing evidence on the impact of transparency reforms is critical to the longer-term success of the OGP. It is the ultimate test of whether lofty public ambitions announced in country action plans achieve real impacts to the benefit of citizens.”

Department of Better Technology


Next City reports: “…opening up government can get expensive. That’s why two developers this week launched the Department of Better Technology, an effort to make open government tools cheaper, more efficient and easier to engage with.

As founder Clay Johnson explains in a post on the site’s blog, a federal website that catalogues databases on government contracts, which launched last year, cost $181 million to build — $81 million more than a recent research initiative to map the human brain.

“I’d like to say that this is just a one-off anomaly, but government regularly pays millions of dollars for websites,” writes Johnson, the former director of Sunlight Labs at the Sunlight Foundation and author of the 2012 book The Information Diet.

The first undertaking of Johnson and his partner, GovHub co-founder Adam Becker, is a tool meant to make it simpler for businesses to find government projects to bid on, as well as help officials streamline the process of managing procurements. In a pilot experiment, Johnson writes, the pair found that not only were bids coming in faster and at a reduced price, but more people were doing the bidding.

Per Johnson, “many of the bids that came in were from businesses that had not ordinarily contracted with the federal government before.”
The Department of Better Technology will accept five cities to test a beta version of this tool, called Procure.io, in 2013.”

Open Data Research Announced


WWW Foundation Press Release: “Speaking at an Open Government Partnership reception last night in London, Sir Tim Berners-Lee, founder of the World Wide Web Foundation (Web Foundation) and inventor of the Web, unveiled details of the first-ever in-depth study into how the power of open data could be harnessed to tackle social challenges in the developing world. The 14-country study is funded by Canada’s International Development Research Centre (IDRC) and will be overseen by the Web Foundation’s world-leading open data experts. An interim progress update will be made at an October 2013 meeting of the Open Government Partnership, with in-depth results expected in 2014…

Sir Tim Berners-Lee, founder of the World Wide Web Foundation and inventor of the Web said:

“Open Data, accessed via a free and open Web, has the potential to create a better world. However, best practice in London or New York is not necessarily best practice in Lima or Nairobi.  The Web Foundation’s research will help to ensure that Open Data initiatives in the developing world will unlock real improvements in citizens’ day-to-day lives.”

José M. Alonso, program manager at the World Wide Web Foundation, added:

“Through this study, the Web Foundation hopes not only to contribute to global understanding of open data, but also to cultivate the ability of developing world researchers and development workers to understand and apply open data for themselves.”

Further details on the project, including case study outlines, are available here: http://oddc.opendataresearch.org/

Taking Open Government to the Next Level


Carl Fillichio, who heads the Labor Department’s Office of Public Affairs, at Work in Progress: “Since we published a department-wide API two years ago, developers across the country have used it to create apps that educate users about workplace safety and health, track employers’ compliance with wage and hour laws, and improve employment opportunities for disabled workers, just to name a few!
Releasing data through an API was a big step forward, but it was not exactly groundbreaking.  However, since then, my team has been working hard to develop software development kits that are truly innovative because they make using our API even easier.
These kits (also known as SDKs) contain application code for six different platforms (iOS, Android, BlackBerry, .NET, PHP and Ruby) that anyone creating a mobile or Web-based app using our data could incorporate. By using the kits, experienced developers will save time and novice developers will be able to work with DOL data in just a few minutes…. All of these kits can be downloaded from our developer site. Additionally, in keeping with the federal digital government strategy, each has been published as an open source project on GitHub, a popular code-sharing site. For a list of federal APIs that are supported by our kits, check the GitHub repository’s wiki page. This list will be updated as the kits are tested with additional federal APIs.”
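What SDKs like these abstract away is essentially request construction: compose the dataset URL, attach credentials, and decode the JSON response. The sketch below illustrates that pattern; the base URL, dataset name, and header format are placeholders, not the real DOL endpoints, so consult the developer site for actual values.

```python
# Sketch of the request-building work an API SDK handles for you.
# The endpoint and auth header format here are hypothetical.
import json
from urllib.request import Request, urlopen

def build_request(base_url, dataset, api_key):
    """Compose the URL and headers for one dataset call."""
    url = f"{base_url}/{dataset}?format=json"
    headers = {"Authorization": f"ApiKey {api_key}"}
    return url, headers

def fetch_dataset(base_url, dataset, api_key):
    """Fetch a dataset and return the decoded JSON payload."""
    url, headers = build_request(base_url, dataset, api_key)
    with urlopen(Request(url, headers=headers)) as resp:
        return json.loads(resp.read().decode("utf-8"))

url, headers = build_request("https://api.example.gov/V1", "Agencies", "MY_KEY")
print(url)  # https://api.example.gov/V1/Agencies?format=json
```

An SDK wraps exactly this boilerplate per platform, which is why it turns a multi-step integration into a few minutes of work for a novice developer.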
 

Mental Geography, Wonky Maps and a Long Way Ahead


Paper by Alan Dix at GeoHCI Workshop at CHI 2013, April 27–28, 2013: “It has never been easier to create your own maps, creating data mashups with Google Maps and similar tools and embedding them in web pages. This has benefited tourism and commerce, and has also revolutionised many areas of social activism, allowing open government data and other public (or leaked) data to be visualised in ways that may subvert or offer alternative views to the official narrative. However, like all maps, digital mapping embodies a particular politics and world view…”

Interpretative Communities


Communities to make sense locally of open government data.

A new paper on “Local governance in the new information ecology” in the journal Public Money & Management calls for the creation of “interpretative communities” to make sense locally of open government data. In particular, its authors argue that:

“The availability of this open government data… solves nothing: as many writers have pointed out, such data needs to be interpreted and interpretation is always a function of a collective—what has been called an ‘interpretative’ or ‘epistemic’ community.”

The call mirrors the emerging view that the next stage regarding open data is to focus on making sense of the data and using it to serve the public good.

The authors identify “three different models of ‘interpretative communities’ that have emerged over the last few decades, drawing on, respectively, literary theory, science and technology studies and international politics”—including reference groups, epistemic communities, and expert networks. As to developing these communities locally, the paper states that it will require us…

“to rethink the resources and institutions that could support a local use of OGD and other resources. Such institutional support for local interpretation cannot be wholly local but needs to draw, in a critical and interactive manner, on wider knowledge bases. In the end, the model of the local that needs to be mobilized is not one based on spatial propinquity alone, or on a bounded sense of local, but one in which the local is seen as relational, connected and dynamic…we might look to the new technologies, and the newly-emerged social networks or Web 2.0 in particular, for such a balance of the local and the extra local”.

The paper ultimately is a critique of the current movement to provide open data that is presented as objective knowledge but is based on a “view from nowhere”…

“To make it into the view from somewhere will require the construction of powerful, yet open, interpretative communities to enact local governance with all of the subsequent questions arising about current modes of democratic representation, sectional interests in the third and private sector and central-local government dynamics. The prospects of such an information ecology to support interpretative community building emerging in the current environment in anything other than a piecemeal and reactive way do not appear promising.”