Putting Government Data to Work


U.S. Department of Commerce Press Release: “The Governance Lab (GovLab) at New York University today released “Realizing The Potential of Open Government Data: A Roundtable with the U.S. Department of Commerce,” a report on findings and recommendations for ways the U.S. Commerce Department can improve its data management, dissemination and use. The report summarizes a June 2014 Open Data Roundtable, co-hosted by The GovLab and the White House Office of Science and Technology Policy with the Commerce Department, which brought together Commerce data providers and 25 representatives from the private sector and nonprofit organizations for an action-oriented dialogue on data issues and potential solutions. The GovLab is convening a series of other Open Data Roundtables in its mission to help make government more effective and connected to the public through technology.

“We were honored to work with the White House and the Department of Commerce to convene this event,” said Joel Gurin, senior advisor at The GovLab and project director of the Open Data 500 and the Roundtable Series. “The Department’s commitment to engaging with its data customers opens up great opportunities for public-private collaboration.”
Under Secretary of Commerce for Economic Affairs Mark Doms said, “At the Commerce Department, we are only at the beginning of our open data effort. We share the goals and objectives embodied by the call of the Open Data 500: to deliver data that is valuable to industry and that provides greater economic opportunity for millions of Americans.” …”

Big Thinkers. Big Data. Big Opportunity: Announcing The LinkedIn Economic Graph Challenge


at LinkedIn Official Blog: “LinkedIn’s vision is to create economic opportunity for every member of the global workforce. Facilitating economic empowerment is a big task that will require bold thinking by smart, passionate individuals and groups. Today, we’re kicking off an initiative that aims to encourage this type of big thinking: the LinkedIn Economic Graph Challenge.
The LinkedIn Economic Graph Challenge is an idea that emerged from the development of the Economic Graph, a digital mapping of the global economy comprising a profile for every professional, company, job opportunity, the skills required to obtain those opportunities, every higher education organization, and all the professionally relevant knowledge associated with each of these entities. With these elements in place, we can connect talent with opportunity at massive scale.
We are launching the LinkedIn Economic Graph Challenge to encourage researchers, academics, and data-driven thinkers to propose how they would use data from LinkedIn to solve some of the most challenging economic problems of our times. We invite anyone who is interested to submit their most innovative, ambitious ideas. In return, we will recognize the three strongest proposals for using data from LinkedIn to generate a positive impact on the global economy, and present the team and/or individual with a $25,000 (USD) research award and the resources to complete their proposed research, with the potential to have it published….
We look forward to your submissions! For more information, please visit the LinkedIn Economic Graph Challenge website….”

Canada's Action Plan on Open Government 2014-2016


Draft action plan: “Canada’s second Action Plan on Open Government consists of twelve commitments that will advance open government principles in Canada over the next two years and beyond. The Directive on Open Government, a new policy direction to federal departments and agencies, will provide foundational support for each of the additional commitments, which fall under three streams: Open Data, Open Information, and Open Dialogue.
Figure 1: Our Commitments (Open Government Directive diagram)

 


Data revolution: How the UN is visualizing the future


Kate Krukiel at Microsoft Government: “…world leaders met in New York for the 69th session of the United Nations (UN) General Assembly. Progress toward achieving the eight Millennium Development Goals (MDGs) by the December 2015 target date—just 454 days away—was top of mind. So was the post-2015 agenda, which will pick up where the MDGs leave off. Ahead of the meetings, the UN Millennium Campaign asked Microsoft to build real-time visualizations of the progress on each goal—based on data spanning 21 targets, 60 indicators, and about 190 member countries. With the data visualizations we created (see them at http://www.mdgleaders.org/), UN and global leaders can decide where to focus in the next 15 months and, more importantly, where change needs to happen post-2015. Their experience offers three lessons for governments:

1. Data has a shelf life.

Since the MDGs were launched in 2000, the UN has relied on annual reports to assess its progress. But in August, UN Secretary-General Ban Ki-moon called for a “data revolution for sustainable development”, which in effect makes real-time data visualization a requirement, not just for tracking the MDGs, but for everything from Ebola to climate change….

2. Governments need visualization tools.

Just as the UN is using data visualization to track its progress and plan for the future, you can use the technology to better understand the massive amounts of data you collect—on everything from water supply and food prices to child mortality and traffic jams. Data visualization technology makes it possible to pull insights from historical data, develop forecasts, and spot gaps in your data far more easily than you could with raw data alone. As they say, a picture is worth a thousand words. To get a better idea of what’s possible, check out the MDG visualizations Microsoft created for the UN using our Power BI tool.
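The core of the forecasting such tools automate can be sketched in a few lines: fit a trend to historical indicator values and project it to the target year. (This is an illustrative sketch with made-up numbers, not the UN’s actual methodology or a Power BI feature.)

```python
import numpy as np

def project_to_target(years, values, target_year):
    """Fit a linear trend to historical indicator values and
    extrapolate it to a target year."""
    slope, intercept = np.polyfit(years, values, 1)
    return slope * target_year + intercept

# Hypothetical indicator: share of population with improved water access (%)
years = np.array([2000.0, 2005.0, 2010.0, 2013.0])
values = np.array([76.0, 80.5, 85.0, 87.7])

print(f"Projected 2015 value: {project_to_target(years, values, 2015):.1f}%")
# → Projected 2015 value: 89.5%
```

A visualization layer would chart the historical points alongside the projection, making an off-track indicator visible at a glance rather than buried in a table.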

3. The private sector can help.

The UN called on the private sector to help measure exact MDG progress and to inspire ongoing global efforts. …

Follow the UN’s lead and join the #datarevolution now, if you haven’t already. It’s an opportunity to work across silos and political boundaries to address the world’s most pressing problems. It takes citizens’ points of view into account through What People Want. And it extends to the private sector, where expertise in using technology to create a sustainable future already exists. I encourage all government leaders to engage. To follow where the UN takes its revolution, watch for updates on the Data Revolution Group website or follow them on Twitter @data_rev….”

Brazilian Government Develops Toolkit to Guide Institutions in both Planning and Carrying Out Open Data Initiatives


Nitai Silva at the Open Knowledge Blog: “Recently, the Brazilian government released the Kit de Dados Abertos (open data toolkit). The toolkit is made up of documents describing the process, methods and techniques for implementing an open data policy within an institution. Its goal is both to demystify the logic of opening up data and to share with public employees the best practices that have emerged from a number of Brazilian government initiatives.

The toolkit focuses on the Plano de Dados Abertos – PDA (Open Data Plan) as the guiding instrument where commitments, agenda, and policy implementation cycles in the institution are registered. We believe that making each public agency build its own PDA is a way to perpetuate the open data policy, making it a state policy and not just a transitory governmental action.
It is organized to facilitate the implementation of the main activity cycles that an institution must observe, and it provides links and manuals to assist in these activities. Emphasis is given to the actors/roles involved in each step and their responsibilities. It also helps to define a central person to monitor and maintain the PDA. The following diagram summarizes the macro steps of implementing an open data policy in an institution:
 

Hey Uncle Sam, Eat Your Own Dogfood


It’s been five years since Tim O’Reilly published his screed on Government as Platform. In that time, we’ve seen “civic tech” and “open data” gain in popularity and acceptance. The Federal Government has an open data platform, data.gov. And so too do states and municipalities across America. Code for America is the hottest thing around, and the healthcare.gov fiasco made fixing public technology a top concern in government. We’ve successfully laid the groundwork for a new kind of government technology. We’re moving towards a day when, rather than building user-facing technology, the government opens up interfaces to data that allow the private sector to create applications and websites that consume public data and surface it to users.

However, we appear to have stalled out a bit in our progress towards government as platform. It’s incredibly difficult to ingest the data for successful commercial products. The kaleidoscope of data formats in open data portals like data.gov might politely be called ‘obscure’, and perhaps more accurately, ‘perversely unusable’. Some of the data hasn’t been updated since first publication and is simply too stale to use. If documentation exists, most of the time it’s incomprehensible….

What we actually need, is for Uncle Sam to start dogfooding his own open data.

For those of you who aren’t familiar with the term, dogfooding is a slang term used by engineers who are using their own product. So, for example, Google employees use Gmail and Google Drive to organize their own work. This term also applies to engineering teams that consume their public APIs to access internal data. Dogfooding helps teams deeply understand their own work from the same perspective as external users. It also provides a keen incentive to make products work well.

Dogfooding is the golden rule of platforms. And currently, open government portals are flagrantly violating this golden rule. I’ve asked around, and I can’t find a single example of a government entity consuming the data they publish…”
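Dogfooding here would mean agencies building their own internal tools on the very endpoints the public gets. As an illustrative sketch: data.gov’s catalog runs CKAN, whose standard `package_search` endpoint returns JSON. The parsing below assumes the stock CKAN response shape; a production client would add error handling and paging.

```python
import json
from urllib.parse import quote_plus
from urllib.request import urlopen

# Standard CKAN search endpoint exposed by catalog.data.gov
SEARCH_URL = ("https://catalog.data.gov/api/3/action/"
              "package_search?q={query}&rows={rows}")

def summarize_datasets(response):
    """Reduce a CKAN package_search response to the fields an
    application actually needs: name, title, resource formats."""
    return [
        {
            "name": pkg["name"],
            "title": pkg["title"],
            "formats": sorted({r.get("format", "") for r in pkg.get("resources", [])}),
        }
        for pkg in response["result"]["results"]
    ]

def search(query, rows=5):
    """Query the live catalog (network access required)."""
    with urlopen(SEARCH_URL.format(query=quote_plus(query), rows=rows)) as resp:
        return summarize_datasets(json.load(resp))
```

An agency dashboard built on `search()` would immediately surface stale datasets and unusable formats to the publishers themselves, exactly the feedback loop dogfooding is meant to create.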

Open Data Beyond the Big City


at PBS MediaShift: “…Open data is the future — of how we govern, of how public services are delivered, of how governments engage with those that they serve. And right now, it is unevenly distributed. I think there is a strong argument to be made that data standards can provide a number of benefits to small and midsized municipal governments and could provide a powerful incentive for these governments to adopt open data.
One way we can use standards to drive the adoption of open data is to partner with companies like Yelp, Zillow, Google, and others that can use open data to enhance their services. But how do we get companies with tens and hundreds of millions of users to take an interest in data from smaller municipal governments?
In a word – standards.

Why do we care about cities?

When we talk about open data, it’s important to keep in mind that there is a lot of good work happening at the federal, state, and local levels all over the country, with plenty of states and even counties doing good things on the open data front. But for me it’s important to evaluate where we are on open data with respect to cities.
States typically occupy a different space in the service delivery ecosystem than cities, and the kinds of data that they typically make available can be vastly different from city data. State capitals are often far removed from our daily lives and we may hear about them only when a budget is adopted or when the state legislature takes up a controversial issue.
In cities, the people that represent and serve us can be our neighbors — the guy behind you at the car wash, or the woman whose child is in your son’s preschool class. Cities matter.
As cities go, we need to consider carefully the importance of smaller cities — there are a lot more of them than large cities, and a non-trivial number of people live in them….”

New Technology and the Prevention of Violence and Conflict


Report edited by Francesco Mancini for the International Peace Institute: “In an era of unprecedented interconnectivity, this report explores the ways in which new technologies can assist international actors, governments, and civil society organizations to more effectively prevent violence and conflict. It examines the contributions that cell phones, social media, crowdsourcing, crisis mapping, blogging, and big data analytics can make to short-term efforts to forestall crises and to long-term initiatives to address the root causes of violence.
Five case studies assess the use of such tools in a variety of regions (Africa, Asia, Latin America) experiencing different types of violence (criminal violence, election-related violence, armed conflict, short-term crisis) in different political contexts (restrictive and collaborative governments).
Drawing on lessons and insights from across the cases, the authors outline a how-to guide for leveraging new technology in conflict-prevention efforts:
1. Examine all tools.
2. Consider the context.
3. Do no harm.
4. Integrate local input.
5. Help information flow horizontally.
6. Establish consensus regarding data use.
7. Foster partnerships for better results.”

The Web Observatory: A Middle Layer for Broad Data


New paper by Thanassis Tiropanis, Wendy Hall, James Hendler, and Christian de Larrinaga in Big Data: “The Web Observatory project is a global effort that is being led by the Web Science Trust, its network of WSTnet laboratories, and the wider Web Science community. The goal of this project is to create a global distributed infrastructure that will foster communities exchanging and using each other’s web-related datasets as well as sharing analytic applications for research and business web applications. It will provide the means to observe the digital planet, explore its processes, and understand their impact on different sectors of human activity.
The project is creating a network of separate web observatories, collections of datasets and tools for analyzing data about the Web and its use, each with their own use community. This allows researchers across the world to develop and share data, analytic approaches, publications related to their datasets, and tools (Fig. 1). The network of web observatories aims to bridge the gap that currently exists between big data analytics and the rapidly growing web of “broad data,” a gap that makes it difficult for a large number of people to engage with these resources….”