Open Data: A 21st Century Asset for Small and Medium Sized Enterprises


“The economic and social potential of open data is widely acknowledged. In particular, the business opportunities have received much attention. But for all the excitement, we still know very little about how and under what conditions open data really works.

To broaden our understanding of the use and impact of open data, the GovLab has a variety of initiatives and studies underway. Today, we share publicly our findings on how Small and Medium Sized Enterprises (SMEs) are leveraging open data for a variety of purposes. Our paper “Open Data: A 21st Century Asset for Small and Medium Sized Enterprises” seeks to build a portrait of the lifecycle of open data—how it is collected, stored and used. It outlines some of the most important parameters of an open data business model for SMEs….

The paper analyzes ten aspects of open data and establishes ten principles for its effective use by SMEs. Taken together, these offer a roadmap for any SME considering greater use or adoption of open data in its business.

Among the key findings included in the paper:

  • SMEs, which often lack access to data or sophisticated analytical tools to process large datasets, are likely to be one of the chief beneficiaries of open data.
  • Government data is the main category of open data being used by SMEs. A number of SMEs are also using open scientific and shared corporate data.
  • Open data is used primarily to serve the Business-to-Business (B2B) markets, followed by the Business-to-Consumer (B2C) markets. A number of the companies studied serve two or three market segments simultaneously.
  • Open data is usually a free resource, but SMEs are monetizing their open-data-driven services to build viable businesses. The most common revenue models include subscription-based services, advertising, fees for products and services, freemium models, licensing fees, lead generation and philanthropic grants.
  • The most significant challenges SMEs face in using open data include those concerning data quality and consistency, insufficient financial and human resources, and issues surrounding privacy.

This is just a sampling of findings and observations. The paper includes a number of additional observations concerning business and revenue models, product development, customer acquisition, and other subjects of relevance to any company considering an open data strategy.”

President Obama Signs Executive Order Making Presidential Innovation Fellows Program Permanent


White House Press Release: “My hope is this continues to encourage a culture of public service among our innovators, and tech entrepreneurs, so that we can keep building a government that’s as modern, as innovative, and as engaging as our incredible tech sector is.  To all the Fellows who’ve served so far – thank you.  I encourage all Americans with bold ideas to apply.  And I can’t wait to see what those future classes will accomplish on behalf of the American people.” – President Barack Obama

Today, President Obama signed an executive order that makes the Presidential Innovation Fellows Program a permanent part of the Federal government going forward. The program brings executives, entrepreneurs, technologists, and other innovators into government, and teams them up with Federal employees to improve programs that serve more than 150 million Americans.

The Presidential Innovation Fellows Program is built on four key principles:

  • Recruit the best our nation has to offer: Fellows include entrepreneurs, startup founders, and innovators with experience at large technology companies and startups, each of whom leverages their proven skills and technical expertise to create huge value for the public.
  • Partner with innovators inside government: Working as teams, the Presidential Innovation Fellows and their partners across the government create products and services that are responsive, user-friendly, and help to improve the way the Federal government interacts with the American people.
  • Deploy proven private sector strategies: Fellows leverage best practices from the private sector to deliver better, more effective programs and policies across the Federal government.
  • Focus on some of the Nation’s biggest and most pressing challenges: Projects focus on topics such as improving access to education, fueling job creation and the economy, and expanding the public’s ability to access their personal health data.

Additional Details on Today’s Announcements

The Executive Order formally establishes the Presidential Innovation Fellows Program within the General Services Administration (GSA), where it will continue to serve departments and agencies throughout the Executive Branch. The Presidential Innovation Fellows Program will be administered by a Director and guided by a newly-established Advisory Board. The Director will outline steps for the selection, hiring, and deployment of Fellows within government….

Fellows have partnered with leaders at more than 25 government agencies, delivering impressive results in months, not years, driving extraordinary work and innovative solutions in areas such as health care; open data and data science; crowd-sourcing initiatives; education; veterans affairs; jobs and the economy; and disaster response and recovery. Examples of projects include:

Open Data

When government acts as a platform, entrepreneurs, startups, and the private sector can build value-added services and tools on top of federal datasets supported by federal policies. Taking this approach, Fellows and agency stakeholders have supported the creation of new products and services focused on education, health, the environment, and social justice. As a result of their efforts and the agencies they have worked with:….

Jobs and the Economy

Fellows continue to work on solutions that will give the government better access to innovative tools and services. This is also helping small and medium-sized companies create jobs and compete for Federal government contracts….

Digital Government

The Presidential Innovation Fellows Program is a part of the Administration’s strategy to create lasting change across the Federal Government by improving how it uses technology. The Fellows played a part in launching 18F within the General Services Administration (GSA) and the U.S. Digital Service (USDS) team within the Office of Management and Budget….

Supporting Our Veterans

  • …Built a one-stop shop for finding employment opportunities. The Veterans Employment Center was developed by a team of Fellows working with the Department of Veterans Affairs in connection with the First Lady’s Joining Forces Initiative and the Department of Labor. This is the first interagency website connecting Veterans, transitioning Servicemembers, and their spouses to meaningful employment opportunities. The portal has resulted in cost savings of over $27 million to the Department of Veterans Affairs.

Education

  • …More than 1,900 superintendents pledged to more effectively leverage education technology in their schools. Fellows working at the Department of Education helped develop the idea of Future Ready, which later informed the creation of the Future Ready District Pledge. The Future Ready District Pledge is designed to set out a roadmap to achieve successful personalized digital learning for every student and to commit districts to move as quickly as possible towards our shared vision of preparing students for success. Following the President’s announcement of this effort in 2014, more than 1,900 superintendents have signed this pledge, representing 14 million students.

Health and Patient Care

  • More than 150 million Americans are able to access their health records online. Multiple rounds of Fellows have worked with the Department of Health and Human Services (HHS) and the Department of Veterans Affairs (VA) to expand the reach of the Blue Button Initiative. As a result, patients are able to access their electronic health records to make more informed decisions about their own health care. The Blue Button Initiative has received more than 600 commitments from organizations to advance health information access efforts across the country and has expanded into other efforts that support health care system interoperability….

Disaster Response and Recovery

  • Communities are piloting crowdsourcing tools to assess damage after disasters. Fellows worked with FEMA and the National Geospatial-Intelligence Agency to develop the GeoQ platform, which crowdsources photos of disaster-affected areas to assess damage over large regions. This information helps the Federal government better allocate critical response and recovery efforts following a disaster and allows local governments to use geospatial information in their communities…. (More)

One way traffic: The open data initiative project and the need for an effective demand side initiative in Ghana


Paper by Frank L. K. Ohemeng and Kwaku Ofosu-Adarkwa in the Government Information Quarterly: “In recent years the necessity for governments to develop new public values of openness and transparency, and thereby increase their citizenries’ sense of inclusiveness, and their trust in and confidence about their governments, has risen to the point of urgency. The decline of trust in governments, especially in developing countries, has been unprecedented and continuous. A new paradigm that signifies a shift to citizen-driven initiatives over and above state- and market-centric ones calls for innovative thinking that requires openness in government. The need for this new synergy notwithstanding, Open Government cannot be considered truly open unless it also enhances citizen participation and engagement. The Ghana Open Data Initiative (GODI) project strives to create an open data community that will enable government (supply side) and civil society in general (demand side) to exchange data and information. We argue that the GODI is too narrowly focused on the supply side of the project, and suggest that it should generate an even platform to improve interaction between government and citizens to ensure a balance in knowledge sharing with and among all constituencies….(More)”

Making data open for everyone


Kathryn L.S. Pettit and Jonathan Schwabish at UrbanWire: “Over the past few years, there have been some exciting developments in open source tools and programming languages, business intelligence tools, big data, open data, and data visualization. These trends, and others, are changing the way we interact with and consume information and data. And that change is driving more organizations and governments to consider better ways to provide their data to more people.

The World Bank, for example, has a concerted effort underway to open its data in better and more visual ways. Google’s Public Data Explorer brings together large datasets from around the world into a single interface. For-profit providers like OpenGov and Socrata are helping local, state, and federal governments open their data (both internally and externally) in newer platforms.

We are firm believers in open data. (There are, of course, limitations to open data because of privacy or security, but that’s a discussion for another time). But open data is not simply about putting more data on the Internet. It’s not only about posting files and telling people where to find them. To allow and encourage more people to use and interact with data, that data needs to be useful and readable not only by researchers, but also by the dad in northern Virginia or the student in rural Indiana who wants to know more about their public libraries.

Open data should be easy to access, analyze, and visualize

Many are working hard to provide more data in better ways, but we have a long way to go. Take, for example, the Congressional Budget Office (full disclosure, one of us used to work at CBO). Twice a year, CBO releases its Budget and Economic Outlook, which provides the 10-year budget projections for the federal government. Say you want to analyze 10-year budget projections for the Pell Grant program. You’d need to select “Get Data” and click on “Baseline Projections for Education” and then choose “Pell Grant Programs.” This brings you to a PDF report, where you can copy the data table you’re looking for into a format you can actually use (say, Excel). You would need to repeat the exercise to find projections for the 21 other programs for which the CBO provides data.

In another case, the Bureau of Labor Statistics has tried to provide users with query tools that avoid the use of PDFs, but still require extra steps to process. You can get the unemployment rate data through their Java Applet (which doesn’t work on all browsers, by the way), select the various series you want, and click “Get Data.” On the subsequent screen, you are given some basic formatting options, but the default display shows all of your data series as separate Excel files. You can then copy and paste or download each one and then piece them together.
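The manual “copy and paste or download each one and then piece them together” step described above is exactly the kind of work that machine-readable data should make trivial. As a minimal sketch, here is how separately downloaded time-series extracts could be merged by date in Python; the file contents, series names, and values below are illustrative assumptions, not real BLS output.

```python
# Sketch: merging two separately downloaded time-series extracts by date.
# In practice each StringIO below would be a CSV file saved from a query tool.
import csv
import io

series_a = io.StringIO("date,unemployment_rate\n2015-01,5.7\n2015-02,5.5\n")
series_b = io.StringIO("date,labor_force_participation\n2015-01,62.9\n2015-02,62.8\n")

# Combine both extracts into one record per date.
merged = {}
for stream in (series_a, series_b):
    for row in csv.DictReader(stream):
        merged.setdefault(row["date"], {}).update(
            {key: value for key, value in row.items() if key != "date"}
        )

print(merged["2015-01"])  # → {'unemployment_rate': '5.7', 'labor_force_participation': '62.9'}
```

With well-published machine-readable data, this join would be a one-line operation against a single consistent file or API response rather than a stitching exercise.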

Taking a step closer to the ideal of open data, the Institute of Museum and Library Services (IMLS) followed President Obama’s May 2013 executive order to make their data open in a machine-readable format. That’s great, but it only goes so far. The IMLS platform, for example, allows you to explore information about your own public library. But the data are labeled with variable names such as BRANLIB and BKMOB that are not intuitive or clear. Users then have to find the data dictionary to understand what data fields mean, how they’re defined, and how to use them.
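The data-dictionary hunt the paragraph describes can be sketched in code: a user maps cryptic field names to readable labels before analysis. The mapping below uses the BRANLIB and BKMOB names mentioned above, but the file contents and the expanded labels are illustrative assumptions rather than the actual IMLS dictionary.

```python
# Hypothetical sketch: renaming cryptic survey fields using a data dictionary.
# The expanded labels and sample values are assumptions for illustration.
import csv
import io

data_dictionary = {
    "BRANLIB": "branch_libraries",
    "BKMOB": "bookmobiles",
}

raw = io.StringIO("BRANLIB,BKMOB\n3,1\n")
rows = [
    {data_dictionary.get(field, field): value for field, value in row.items()}
    for row in csv.DictReader(raw)
]
print(rows)  # → [{'branch_libraries': '3', 'bookmobiles': '1'}]
```

Publishing data with self-describing field names, or shipping the dictionary alongside the file in machine-readable form, would remove this translation step entirely.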

These efforts to provide more data represent real progress, but often fail to be useful to the average person. They move from publishing data that are not readable (buried in PDFs or systems that allow the user to see only one record at a time) to data that are machine-readable (libraries of raw data files or APIs, from which data can be extracted using computer code). We now need to move from a world in which data are simply machine-readable to one in which data are human-readable….(More)”

Open data can unravel the complex dealings of multinationals


 in The Guardian: “…Just like we have complementary currencies to address shortcomings in national monetary systems, we now need to encourage an alternative accounting sector to address shortcomings in global accounting systems.

So what might this look like? We already are seeing the genesis of this in the corporate open data sector. OpenCorporates in London has been a pioneer in this field, creating a global unique identifier system to make it easier to map corporations. Groups like OpenOil in Berlin are now using the OpenCorporates classification system to map companies like BP. Under the tagline “Imagine an open oil industry”, they have also begun mapping ground-level contract and concession data, and are currently building tools to allow the public to model the economics of particular mines and oil fields. This could prove useful in situations where doubt is cast on the value of particular assets controlled by public companies in politically fragile states.

 OpenOil’s objective is not just corporate transparency. Merely disclosing information does not advance understanding. OpenOil’s real objective is to make reputable sources of information on oil companies usable to the general public. In the case of BP, company data is already deposited in repositories like Companies House, but in unusable, jumbled and jargon-filled PDF formats. OpenOil seeks to take such transparency, and turn it into meaningful transparency.

According to OpenOil’s Anton Rühling, a variety of parties have started to use their information. “During the recent conflicts in Yemen we had a sudden spike in downloads of our Yemeni oil contract information. We traced this to UAE, where a lot of financial lawyers and investors are based. They were clearly wanting to see how the contracts could be affected.” Their BP map even raised interest from senior BP officials. “We were contacted by finance executives who were eager to discuss the results.”

Open mapping

Another emerging pillar of the alternative accounting sector is supply chain mapping. The supply chain largely remains a mystery: in standard corporate accounts, suppliers appear as mere expenses, with no information about where they are based or what their standards are. In the absence of corporate management volunteering that information, Sourcemap has created an open platform for people to create supply chain maps themselves. Progressively-minded companies – such as Fairphone – have now begun to volunteer supply chain information on the platform.

One industry forum that is actively pondering alternative accounting is ICAEW’s AuditFutures programme. They recently teamed up with the Royal College of Art’s service design programme to build design thinking into accounting practice. AuditFutures’ Martin Martinoff wants accountants to perceive themselves as creative innovators for the public interest. “Imagine getting 10,000 auditors online together to develop an open crowdsourced audit platform.”…(More)

Push, Pull, and Spill: A Transdisciplinary Case Study in Municipal Open Government


New paper by Jan Whittington et al: “Cities hold considerable information, including details about the daily lives of residents and employees, maps of critical infrastructure, and records of the officials’ internal deliberations. Cities are beginning to realize that this data has economic and other value: If done wisely, the responsible release of city information can also release greater efficiency and innovation in the public and private sector. New services are cropping up that leverage open city data to great effect.

Meanwhile, activist groups and individual residents are placing increasing pressure on state and local government to be more transparent and accountable, even as others sound an alarm over the privacy issues that inevitably attend greater data promiscuity. This takes the form of political pressure to release more information, as well as increased requests for information under the many public records acts across the country.

The result of these forces is that cities are beginning to open their data as never before. It turns out there is surprisingly little research to date into the important and growing area of municipal open data. This article is among the first sustained, cross-disciplinary assessments of an open municipal government system. We are a team of researchers in law, computer science, information science, and urban studies. We have worked hand-in-hand with the City of Seattle, Washington for the better part of a year to understand its current procedures from each disciplinary perspective. Based on this empirical work, we generate a set of recommendations to help the city manage risk latent in opening its data….(More)”

Open Data and Sub-national Governments: Lessons from Developing Countries


WebFoundation: “Open government data (OGD) as a concept is gaining currency globally due to the strong advocacy of global organisations such as the Open Government Partnership. In recent years, there has been increased commitment on the part of national governments to proactively disclose information. However, much of the discussion on OGD is at the national level, especially in developing countries, where commitments to proactive disclosure are conditioned by the commitments of national governments as expressed through the OGP national action plans. However, the local is important in the context of open data. In decentralized contexts, the local is where data is collected and stored, where there is strong feasibility that data will be published, and where data can generate the most impact when used. This synthesis paper seeks to refocus the discussion of open government data in sub-national contexts by analysing nine country papers produced through the Open Data in Developing Countries research project.

Using a common research framework that focuses on context, governance setting, and open data initiatives, the study found that there is substantial effort on the part of sub-national governments to proactively disclose data; however, the design limits citizen participation and, eventually, use. Second, context demands different roles for intermediaries and different types of initiatives to create an enabling environment for open data. Finally, data quality will remain a critical challenge for sub-national governments in developing countries, and it will temper the potential impact that open data will be able to generate. Download the full research paper here

100 parliaments as open data, ready for you to use


Myfanwy Nixon at mySociety’s blog and OpeningParliament: “If you need data on the people who make up your parliament, another country’s parliament, or indeed all parliaments, you may be in luck.

Every Politician, the latest Poplus project, aims to collect, store and share information about every parliament in the world, past and present—and it already contains 100 of them.

What’s more, it’s all provided as Open Data to anyone who would like to use it to power a civic tech project. We’re thinking parliamentary monitoring organisations, journalists, groups who run access-to-democracy sites like our own WriteToThem, and especially researchers who want to do analysis across multiple countries.

But isn’t that data already available?

Yes and no. There’s no doubt that you can find details of most parliaments online, either on official government websites, on Wikipedia, or on a variety of other places online.

But, as you might expect from data that’s coming from hundreds of different sources, it’s in a multitude of different formats. That makes it very hard to work with in any kind of consistent fashion.

Every Politician standardises all of its data into the Popolo standard and then provides it in two simple downloadable formats:

  • CSV, which contains basic data that’s easy to work with in spreadsheets
  • JSON, which contains richer data on each person and is ideal for developers

This standardisation means that it should now be a lot easier to work on projects across multiple countries, or to compare one country’s data with another. It also means that data works well with other Poplus Components….(More)”
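The benefit of standardising on Popolo is that the same small piece of code works for any of the 100 parliaments. As a minimal sketch, here is how a Popolo-style JSON download could be read in Python; the inline sample stands in for a real downloaded file, and the person records in it are invented for illustration, though the top-level "persons" key follows the Popolo convention.

```python
# Sketch: reading a Popolo-style JSON file such as the Every Politician downloads.
# The sample records below are illustrative, not real parliamentary data.
import json

popolo_json = """
{
  "persons": [
    {"id": "p1", "name": "Jane Example", "gender": "female"},
    {"id": "p2", "name": "John Sample", "gender": "male"}
  ]
}
"""

data = json.loads(popolo_json)
names = [person["name"] for person in data["persons"]]
print(names)  # → ['Jane Example', 'John Sample']
```

Because every country’s file shares this shape, cross-country analysis becomes a loop over files rather than a bespoke parser per parliament.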

Local open data ecosystems – a prototype map


Ed Parkes and Gail Dawes at Nesta: “It is increasingly recognised that some of the most important open data is published by local authorities (LAs) – data which is important to us like bin collection days, planning applications and even where your local public toilet is. Given the likely move towards greater decentralisation, firstly through devolution to cities, the publication of local open data will arguably become more important over the next couple of years. In addition, as of 1st April, there is a new transparency code for local government requiring local authorities to publish further information on everything from spending to local land assets. To pre-empt this likely renewed focus on local open data we have begun to develop a prototype map to highlight the UK’s local open data ecosystem.

Already there is some great practice in the publication of open data at a local level – such as Leeds Data Mill, London Datastore, and Open Data Sheffield. This regional activity is also characterised not just by high quality data publication, but also by pulling together through hackdays, challenges and meetups a community interested in the power of open data. This creates an ecosystem of publishers and re-users at a local level. Some of the best practice in relation to developing such an ecosystem was recognised by the last government in the announcement of a group of Local Authority Open Data Champions. Some of these were also recipients of the funding for projects from both the Cabinet Office and through the Open Data User Group.

Outside of this best practice it isn’t always easy to understand how developed smaller, less urban open data agendas are. Other than looking at each council’s website, or at the data portals that forward-thinking councils increasingly provide, there is a surprisingly large number of places where local authorities can make their open data available. The best known of these is the Openly Local project, but at the time of writing this now seems to be retired. Perhaps the best catalogue of local authority data is on Data.gov.uk itself. This has 1,449 datasets published by LAs across 200 different organisations. Following that there is the Open Data Communities website which hosts links to LA linked datasets. Using data from the latter, Steve Peters has developed the local data dashboard (which was itself based on the UK Local Government Open Data resource map from Owen Boswarva). In addition, local authorities can also register their open data in the LGA’s Open Data Inventory Service and take it through the ODI’s data certification process.

Prototype map of local open data ecosystems

To try to highlight patterns in local authority open data publication we decided to make a map of activity around the country (although in the first instance we’ve focused on England)….(More)

Yelp’s Consumer Protection Initiative: ProPublica Partnership Brings Medical Info to Yelp


Yelp Official Blog: “…exists to empower and protect consumers, and we’re continually focused on how we can enhance our service while enhancing the ability for consumers to make smart transactional decisions along the way.

A few years ago, we partnered with local governments to launch the LIVES open data standard. Now, millions of consumers find restaurant inspection scores when that information is most relevant: while they’re in the middle of making a dining decision (instead of when they’re signing the check). Studies have shown that displaying this information more prominently has a positive impact.

Today we’re excited to announce we’ve joined forces with ProPublica to incorporate health care statistics and consumer opinion survey data onto the Yelp business pages of more than 25,000 medical treatment facilities. Read more in today’s Washington Post story.

We couldn’t be more excited to partner with ProPublica, the Pulitzer Prize winning non-profit newsroom that produces investigative journalism in the public interest.

The information is compiled by ProPublica from their own research and the Centers for Medicare and Medicaid Services (CMS) for 4,600 hospitals, 15,000 nursing homes, and 6,300 dialysis clinics in the US and will be updated quarterly. Hover text on the business page will explain the statistics, which include number of serious deficiencies and fines per nursing home and emergency room wait times for hospitals. For example, West Kendall Baptist Hospital has better than average doctor communication and an average 33 minute ER wait time, Beachside Nursing Center currently has no deficiencies, and San Mateo Dialysis Center has a better than average patient survival rate.

Now the millions of consumers who use Yelp to find and evaluate everything from restaurants to retail will have even more information at their fingertips when they are in the midst of the most critical life decisions, like which hospital to choose for a sick child or which nursing home will provide the best care for aging parents….(More)