UK launches Information Economy Strategy


Open Data Institute: “The Information Economy Strategy sets out a range of key actions, including:

  • Digitally transforming 25 of the top 50 UK public services over the next 300 days, including plans to give businesses a single, online view of their tax records.
  • Launching a new programme to help 1.6 million SMEs scale up their business online over the next five years.
  • Publishing a data capability strategy in October 2013, developed in partnership with government, industry and academia. The strategy will build on the recommendations in Stephan Shakespeare’s review of Public Sector Information and the Prime Minister’s Council for Science and Technology’s report on algorithms, and will be published alongside the Open Government Partnership National Action Plan.
  • Establishing the world’s first facility for testing state-of-the-art 5G mobile technology, working with industry and the University of Surrey.”

Open Data Is Re-Defining Government in the 21st Century


Kevin Merritt, the founder and CEO of Socrata, in Nextgov: “…a new movement, spurred by digital and social activism, is taking root to renovate and redefine the public sector.
This movement is based on democratizing the vast treasure trove of data that governments have accumulated over the years, transparently releasing it so citizens and companies can drive meaningful change and solve problems that government, on its own, cannot solve…. This emerging digital collaboration between the public sector and scores of entrepreneurs across the nation has the potential to profoundly transform the role of government….
As a software entrepreneur, I see open data as the transformation of governments from monolithic service providers to open innovation platforms, fueled by data. This shift may hold the answers to some age-old problems in government, like chronic inefficiency and a citizen experience that’s out of step with the modern consumer era.
“This is the right way to frame the question of Government 2.0,” explains Tim O’Reilly, a leading open data advocate. “How does government become an open platform that allows people inside and outside government to innovate? How do you design a system in which all of the outcomes aren’t specified beforehand, but, instead, evolve through interactions between government and its citizens, as a service provider enabling its user community?”
The answers to these questions are still taking shape, but one thing we do know is that the strategic use of data is clearly re-defining government’s role in the 21st century.”

The five elements of an open source city


Jason Hibbets in Opensource.com: “How can you apply the concepts of open source to a living, breathing city? An open source city is a blend of open culture, open government policies, and economic development. I derived these characteristics from my experiences while writing my book, The foundation for an open source city. Characteristics such as collaboration, participation, transparency, rapid prototyping, and many others can be applied to any city that wants to create an open source culture. Let’s take a look at these characteristics in more detail.

Five characteristics of an open source city

  1. Fostering a culture of citizen participation
  2. Having an effective open government policy
  3. Having an effective open data initiative
  4. Promoting open source user groups and conferences
  5. Being a hub for innovation and open source businesses

In my book, I take a look at how these five principles are being actively applied in Raleigh, North Carolina. I also incorporate other experiences from my open government adventures such as CityCamps and my first Code for America Summit. Although Raleigh is the case study, the book is a guide for how cities across the country, and world, can implement the open source city brand.”

The Use of Data Visualization in Government


Report by Genie Stowers for The IBM Center for The Business of Government: “The purpose of this report is to help public sector managers understand one of the more important areas of data analysis today—data visualization. Data visualizations are more sophisticated, fuller graphic designs than the traditional spreadsheet charts, usually with more than two variables and, typically, incorporating interactive features. Data are here to stay, growing exponentially, and data analysis is taking off, pushed forward as a result of the convergence of:
• New technologies
• Open data and big data movements
• The drive to more effectively engage citizens
• The creation and distribution of more and more data…
This report contains numerous examples of visualizations that include geographical and health data, or population and time data, or financial data represented in both absolute and relative terms—and each communicates more than simply the data that underpin it. In addition to these many examples of visualizations, the report discusses the history of this technique, and describes tools that can be used to create visualizations from many different kinds of data sets. Government managers can use these tools—including Many Eyes, Tableau, and HighCharts—to create their own visualizations from their agency’s data.
The report presents case studies on how visualization techniques are now being used by two local governments, one state government, and three federal government agencies. Each case study discusses the audience for visualization. Understanding audience is important, as government organizations provide useful visualizations to different audiences, including the media, political oversight organizations, constituents, and internal program teams. To assist in effectively communicating to these audiences, the report details attributes of meaningful visualizations: relevance, meaning, beauty, ease of use, legibility, truthfulness, accuracy, and consistency among them.”
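The report’s central idea—charts that carry more than two variables at once—is easy to sketch. Below is a minimal illustration in Python with matplotlib (not one of the tools named in the report); the figures are invented for demonstration.

```python
# A minimal sketch of a chart carrying four variables at once (x, y, point
# size, point color). All figures here are invented for illustration.
import matplotlib.pyplot as plt

spending = [1.2, 2.4, 3.1, 4.8, 5.5]   # program spending, $ millions
caseload = [300, 510, 620, 900, 1100]  # cases handled per year
staff = [12, 25, 30, 48, 60]           # full-time employees
region = [0, 1, 0, 2, 1]               # categorical region code

fig, ax = plt.subplots()
ax.scatter(spending, caseload,
           s=[n * 10 for n in staff],  # point size encodes staff count
           c=region, cmap="viridis")   # point color encodes region
ax.set_xlabel("Program spending ($M)")
ax.set_ylabel("Annual caseload")
ax.set_title("Four variables in one view")
plt.show()
```

The interactive features the report describes would layer tooltips and filters on top of a static encoding like this one.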

Smart Citizen Kit enables crowdsourced environmental monitoring


Emma Hutchings at PSFK: “The Smart Citizen Kit is a crowdsourced environmental monitoring platform. By scattering devices around the world, the creators hope to build a global network of sensors that report local environmental conditions like CO and NO2 levels, light, noise, temperature and humidity.
Organized by the Fab Lab at the Institute for Advanced Architecture of Catalonia, a team of scientists, architects, and engineers is paving the way to humanize environmental monitoring. The open-source platform consists of Arduino-compatible hardware, a data-visualization web API, and a mobile app. Users are invited to take part in the interactive global environmental database, visualizing their data and comparing it with others around the world.”
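To make the platform’s shape concrete, here is a hypothetical sketch of a script submitting one sensor reading to a Smart Citizen-style web API. The endpoint URL, payload fields, and token scheme below are assumptions for illustration, not the project’s documented interface.

```python
# A hypothetical sketch of submitting one reading to a Smart Citizen-style
# API. The URL, field names, and auth header are illustrative assumptions.
import requests

reading = {
    "temperature_c": 21.4,
    "humidity_pct": 48.0,
    "noise_db": 55.2,
    "light_lux": 310,
    "co_ppm": 0.4,
    "no2_ppb": 18.0,
}

resp = requests.post(
    "https://api.example.org/v1/devices/my-device/readings",  # placeholder URL
    json=reading,
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=10,
)
resp.raise_for_status()
print("Reading accepted:", resp.status_code)
```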

If My Data Is an Open Book, Why Can’t I Read It?


Natasha Singer in the New York Times: “Never mind all the hoopla about the presumed benefits of an “open data” society. In our day-to-day lives, many of us are being kept in the data dark.

“The fact that I am producing data and companies are collecting it to monetize it, if I can’t get a copy myself, I do consider it unfair,” says Latanya Sweeney, the director of the Data Privacy Lab at Harvard, where she is a professor of government and technology….

In fact, a few companies are challenging the norm of corporate data hoarding by actually sharing some information with the customers who generate it — and offering tools to put it to use. It’s a small but provocative trend in the United States, where only a handful of industries, like health care and credit, are required by federal law to provide people with access to their records.

Last year, San Diego Gas and Electric, a utility, introduced an online energy management program in which customers can view their electricity use in monthly, daily or hourly increments. There is even a practical benefit: customers can earn credits by reducing energy consumption during peak hours….
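The monthly, daily, and hourly views described above boil down to resampling a time series of meter readings. Here is a minimal sketch with pandas, using simulated data rather than real utility readings.

```python
# A minimal sketch of usage aggregation, assuming hourly meter readings.
# The readings below are simulated, not real utility data.
import numpy as np
import pandas as pd

hours = pd.date_range("2013-05-01", "2013-05-31 23:00", freq="H")
usage = pd.Series(np.random.uniform(0.2, 2.5, len(hours)), index=hours)  # kWh

daily = usage.resample("D").sum()  # the "daily increments" view
print("Monthly total (kWh):", round(usage.sum(), 1))

# Share of consumption during peak hours (assumed here to be 4-9 pm),
# the window a peak-reduction credit program might target.
peak = usage[(usage.index.hour >= 16) & (usage.index.hour < 21)].sum()
print(f"Peak-hour share: {peak / usage.sum():.1%}")
```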

Deepbills project


Cato Institute: “The Deepbills project takes the raw XML of Congressional bills (available at FDsys and Thomas) and adds semantic information to them inside the text.

You can download the continuously-updated data at http://deepbills.cato.org/download

Congress already produces machine-readable XML of almost every bill it proposes, but that XML is designed primarily for formatting a paper copy, not for extracting information. For example, it’s not currently possible to find every mention of an Agency, every legal reference, or even every spending authorization in a bill without having a human being read it….
Currently the following information is tagged:

  • Legal citations…
  • Budget Authorities (both Authorizations of Appropriations and Appropriations)…
  • Agencies, bureaus, and subunits of the federal government.
  • Congressional committees
  • Federal elective officeholders (Congressmen)”
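To get a feel for what the added markup enables, here is a hedged sketch of scanning a downloaded bill for tagged entities with Python’s standard library. The namespace URI, element name, and attribute name are assumptions made for illustration; consult the project’s schema for the actual markup.

```python
# A hedged sketch of listing tagged entities in a Deepbills-enhanced bill.
# The namespace URI and the element/attribute names are assumptions for
# illustration; check the project's documentation for the real schema.
from xml.etree import ElementTree as ET

CATO_NS = "{http://namespaces.cato.org/catoxml}"  # assumed namespace

tree = ET.parse("bill.xml")  # a bill downloaded from deepbills.cato.org
for entity in tree.iter(CATO_NS + "entity"):
    kind = entity.get("entity-type")           # e.g. agency, committee, citation
    text = "".join(entity.itertext()).strip()  # the tagged span of bill text
    print(f"{kind}: {text}")
```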

Introducing: Project Open Data


White House Blog: “Technology evolves rapidly, and it can be challenging for policy and its implementation to evolve at the same pace.  Last week, President Obama launched the Administration’s new Open Data Policy and Executive Order aimed at ensuring that data released by the government will be as accessible and useful as possible.  To make sure this tech-focused policy can keep up with the speed of innovation, we created Project Open Data.
Project Open Data is an online, public repository intended to foster collaboration and promote the continual improvement of the Open Data Policy. We wanted to foster a culture change in government where we embrace collaboration and where anyone can help us make open data work better. The project is published on GitHub, an open source platform that allows communities of developers to collaboratively share and enhance code.  The resources and plug-and-play tools in Project Open Data can help accelerate the adoption of open data practices.  For example, one tool instantly converts spreadsheets and databases into APIs for easier consumption by developers.  The idea is that anyone, from Federal agencies to state and local governments to private citizens, can freely use and adapt these open source tools—and that’s exactly what’s happening.
Within the first 24 hours after Project Open Data was published, more than two dozen contributions (or “pull requests” in GitHub speak) were submitted by the public. The submissions included everything from fixing broken links, to providing policy suggestions, to contributing new code and tools. One pull request even included new code that translates geographic data from locked formats into open data that is freely available for use by anyone…”
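The “spreadsheet to API” tool mentioned above is a good example of the plug-and-play idea. As a toy stand-in, not the actual Project Open Data code, a few lines of Python with Flask can serve a CSV file’s rows as JSON:

```python
# An illustrative stand-in for the "spreadsheet to API" idea: serve the rows
# of a CSV file as JSON over HTTP. The file name is hypothetical; this is
# not the actual Project Open Data tool.
import csv

from flask import Flask, jsonify

app = Flask(__name__)

with open("dataset.csv", newline="") as f:
    ROWS = list(csv.DictReader(f))  # load the spreadsheet once at startup

@app.route("/api/records")
def all_records():
    return jsonify(ROWS)  # the whole sheet as a JSON array

@app.route("/api/records/<int:index>")
def one_record(index):
    if 0 <= index < len(ROWS):
        return jsonify(ROWS[index])  # one row as a JSON object
    return jsonify({"error": "not found"}), 404

if __name__ == "__main__":
    app.run()
```

With the server running, a request such as /api/records/0 would return the first row of the spreadsheet as a JSON object.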

IRS: Turn over a new leaf, Open up Data


Beth Simone Noveck and Stefaan Verhulst in Forbes: “The core task for Danny Werfel, the new acting commissioner of the IRS, is to repair the agency’s tarnished reputation and achieve greater efficacy and fairness in IRS investigations. Mr. Werfel can show true leadership by restructuring how the IRS handles its tax-exempt enforcement processes.
One of Mr. Werfel’s first actions on the job should be the immediate implementation of the groundbreaking Presidential Executive Order and Open Data policy, released last week, that requires data captured and generated by the government be made available in open, machine-readable formats. Doing so will make the IRS a beacon to other agencies in how to use open data to screen any wrongdoing and strengthen law enforcement.
By sharing readily available IRS data on tax-exempt organizations, encouraging Congress to pass a budget proposal that mandates release of all tax-exempt returns in a machine-readable format, and increasing the transparency of its own processes, the agency can begin to turn the page on this scandal and help rebuild trust and partnership between government and its citizens.”

Economic effects of open data policy still 'anecdotal'


Adam Mazmanian in FCW: “A year after the launch of the government’s digital strategy, there’s no official tally of the economic activity generated by the release of government datasets for use in commercial applications.
“We have anecdotal examples, but nothing official yet,” said federal CIO Steven VanRoekel in an invitation-only meeting with reporters at the FOSE conference on May 15. “It’s an area where we have an opportunity to start to talk about this, because it’s starting to tick up a bit, and the numbers are looking pretty good.” (Related story: APIs help agencies say yes)…
The Obama administration is banking on an explosion in the use of federal datasets for commercial and government applications alike. Last week’s executive order and accompanying directive from the Office of Management and Budget task agencies with making open and machine-readable data the new default setting for government information.
VanRoekel said that the merits of the open data standard don’t necessarily need to be justified by economic activity….
The executive order also spells out privacy concerns arising from the so-called ‘mosaic effect,’ by which information from disparate datasets can be overlaid to decipher personally identifiable information.”
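The mosaic effect is easy to demonstrate: two releases that look harmless in isolation can re-identify individuals once joined on shared quasi-identifiers. Here is a toy illustration in Python with pandas, using fabricated records.

```python
# A toy illustration of the "mosaic effect". All records are fabricated.
import pandas as pd

# An "anonymized" release: names removed, but ZIP, birth year, and sex remain.
health = pd.DataFrame({
    "zip": ["20500", "02138"],
    "birth_year": [1960, 1973],
    "sex": ["F", "M"],
    "diagnosis": ["condition A", "condition B"],
})

# A public roll that pairs names with the same quasi-identifiers.
voters = pd.DataFrame({
    "name": ["Jane Doe", "John Roe"],
    "zip": ["20500", "02138"],
    "birth_year": [1960, 1973],
    "sex": ["F", "M"],
})

# Overlaying the two datasets links names back to diagnoses.
linked = health.merge(voters, on=["zip", "birth_year", "sex"])
print(linked[["name", "diagnosis"]])
```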