Innovating and changing the policy-cycle: Policy-makers be prepared!


Marijn Janssen and Natalie Helbig in Government Information Quarterly: "Many policy-makers are struggling to understand participatory governance in the midst of technological changes. Advances in information and communication technologies (ICTs) continue to have an impact on the ways that policy-makers and citizens engage with each other throughout the policy-making process. A set of developments in the areas of opening government data, advanced analytics, visualization, simulation, and gaming, and ubiquitous citizen access using mobile and personalized applications is shaping the interactions between policy-makers and citizens. Yet the impact of these developments on policy-makers is unclear. The changing roles and the new capabilities required from government are analyzed in this paper using two case studies. Salient new roles for policy-makers are outlined, focused on orchestrating the policy-making process. Research directions are identified, including understanding the behavior of users, aggregating and analyzing content from scattered resources, and the effective use of the new tools. Understanding new policy-maker roles will help to bridge the gap between the potential of tools and technologies and the organizational realities and political contexts. We argue that many examples are available that enable learning from others, in both directions: developed countries' experiences are useful for developing countries, and experiences from the latter are valuable for the former…(More)"

The Open (Data) Market


Sean McDonald at Medium: "Open licensing privatizes technology and data usability. How does that affect equality and accessibility?…The open licensing movement (open data, open source software, etc.) predicates its value on increasing accessibility and transparency by removing legal and ownership restrictions on use. The groups that advocate for open data and open source code, especially in government and publicly subsidized industries, often come from transparency, accountability, and freedom of information backgrounds. These efforts, however, significantly underestimate the costs of refining, maintaining, targeting, defining a value proposition, marketing, and presenting both data and products in ways that are effective and useful for the average person. Recent research suggests the primary beneficiaries of civic technologies — those specifically built on government data or services — are privileged populations. The World Bank's recent World Development Report goes further to point out that public digitization can be a driver of inequality.

The dynamic of self-replicating privilege in both technology and open markets is not a new phenomenon. Social science research refers to it as the Matthew Effect, which says that in open or unregulated spaces, the privileged tend to become more privileged, while the poor become poorer. While there's no question the advent of technology brings massive potential, it is already creating significant access and achievement divides. According to the Federal Communications Commission's annual Broadband Progress report in 2015, 42% of students in the U.S. struggle to do their homework because of web access — and 39% of rural communities don't even have a broadband option. Internet access skews toward urban, wealthy communities, with income, geography, and demographics all playing a role in adoption. Even further, research suggests that the rich and poor use technology differently. This runs counter to the narrative of Internet eventualism, which insists that it's simply a (small) matter of time before these access (and skills) gaps close. Evidence suggests that for upper- and middle-income groups access is almost universal, but the gaps for low-income groups are growing…(More)"

Open Data Is Changing the World in Four Ways…


 at The GovLab Blog: “New repository of case studies documents the impact of open data globally: odimpact.org.


Despite global commitments to and increasing enthusiasm for open data, little is actually known about its use and impact. What kinds of social and economic transformation has open data brought about, and what is its future potential? How—and under what circumstances—has it been most effective? How have open data practitioners mitigated risks and maximized social good?

Even as proponents of open data extol its virtues, the field continues to suffer from a paucity of empirical evidence. This limits our understanding of open data and its impact.

Over the last few months, The GovLab (@thegovlab), in collaboration with Omidyar Network (@OmidyarNetwork), has worked to address these shortcomings by developing 19 detailed open data case studies from around the world. The case studies have been selected for their sectoral and geographic representativeness. They are built in part from secondary sources ("desk research"), and also from more than 60 first-hand interviews with important players and key stakeholders. In a related collaboration with Omidyar Network, Becky Hogge (@barefoot_techie), an independent researcher, has developed an additional six open data case studies, all focused on the United Kingdom. Together, these case studies seek to provide a more nuanced understanding of the various processes and factors underlying the demand, supply, release, use and impact of open data.

Today, after receiving and integrating comments from dozens of peer reviewers through a unique open process, we are delighted to share an initial batch of 10 case studies, as well as three of Hogge's UK-based stories. These are being made available at a new custom-built repository, Open Data's Impact (http://odimpact.org), that will eventually house all the case studies, key findings across the studies, and additional resources related to the impact of open data. All this information will be stored in machine-readable HTML and PDF format, and will be searchable by area of impact, sector and region….(More)

Iowa fights snow with data


Patrick Marshall at GCN: “Most residents of the Mid-Atlantic states, now digging out from the recent record-setting snowstorm, probably don’t know how soon their streets will be clear.  If they lived in Iowa, however, they could simply go to the state’s Track a Plow website to see in near real time where snow plows are and in what direction they’re heading.

In fact, the Track a Plow site — the first iteration of which launched three years ago — shows much more than just the location and direction of the state's more than 900 plows. Because they are equipped with geolocation equipment and a variety of sensors, the plows also provide information on road conditions, road closures and whether trucks are applying liquid or solid materials to counter snow and ice. That data is regularly uploaded to Track a Plow, which also offers near-real-time video and photos of conditions.


According to Eric Abrams, geospatial manager at the Iowa Department of Transportation, the service is very popular and is being used for a variety of purposes. "It's been one of the greatest public interface things that DOT has ever done," he said. In addition to citizens considering travel, Abrams said, the site's heavy users include news stations, freight companies routing vehicles and school districts determining whether to delay opening or cancel classes.

How it works

While Track a Plow launched with just location information, it has been frequently enhanced over the past two years, beginning with the installation of video cameras.  “The challenge was to find a cost-effective way to put cams in the plows and then get those images not just to supervisors but to the public,” Abrams said.  The solution he arrived at was dashboard-mounted iPhones that transmit time and location data in addition to images.  These were especially cost-effective because they were free with the department’s Verizon data plan. “Our IT division built a custom iPhone app that is configurable for how often it sends pictures back to headquarters here, where we process them and get them out to the feed,” he explained….(More)”
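The architecture Abrams describes — a phone periodically bundling its position, a timestamp and a camera frame, then posting the package back to a headquarters feed — can be sketched in a few lines. This is an illustrative sketch only: the endpoint URL, field names and reporting interval below are hypothetical stand-ins, not the Iowa DOT app's actual API.

```python
import base64
import json
import time
import urllib.request

FEED_URL = "https://example.org/plow-feed/upload"  # hypothetical endpoint
REPORT_INTERVAL_SECONDS = 60  # configurable, as in the DOT's custom app

def build_report(plow_id, lat, lon, image_bytes):
    """Bundle one plow's position, timestamp and camera frame into a JSON-ready dict."""
    return {
        "plow_id": plow_id,
        "timestamp": time.time(),
        "lat": lat,
        "lon": lon,
        # binary image data is base64-encoded so it can travel in a JSON payload
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    }

def post_report(report):
    """POST one report to the feed; a device would call this every REPORT_INTERVAL_SECONDS."""
    req = urllib.request.Request(
        FEED_URL,
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

Headquarters would then decode each report, attach it to the public feed, and discard frames older than the display window.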

Opening Governance – Change, Continuity and Conceptual Ambiguity


Introduction to special issue of IDS Bulletin by Rosemary McGee and Duncan Edwards: “Open government and open data are new areas of research, advocacy and activism that have entered the governance field alongside the more established areas of transparency and accountability. This article reviews recent scholarship in these areas, pinpointing contributions to more open, transparent, accountable and responsive governance via improved practice, projects and programmes. The authors set the rest of the articles from this IDS Bulletin in the context of the ideas, relationships, processes, behaviours, policy frameworks and aid funding practices of the last five years, and critically discuss questions and weaknesses that limit the effectiveness and impact of this work. Identifying conceptual ambiguity as a key problem, they offer a series of definitions to help overcome the technical and political difficulties this causes. They also identify hype and euphemism, and offer a series of conclusions to help restore meaning and ideological content to work on open government and open data in transparent and accountable governance….(More)”

When is your problem a ‘Challenge’?


Ed Parkes at NESTA: “More NGOs, Government Departments and city governments are using challenge prizes to help develop new products and services which ‘solve’ a problem they have identified. There have been several high profile prizes (for instance, Nesta’s Longitude Prize or the recently announced $7 million ocean floor Xprize) and a growing number of platforms for running them (such as Challenge.gov or OpenIdeo). Due to this increased profile, challenge prizes are more often seen by public sector strategists and policy owners as holding the potential to solve their tricky strategic issues.

To characterise it, the starting point is often "If only we could somehow get new, smart, digitally-informed organisations to solve the underfunded, awkward strategic issues we've been grappling with, wouldn't it be great?".

This approach is especially tantalising for public sector organisations as it means they can be seen to take action on an issue through ‘market shaping’, rather than resorting to developing policy or intervening with regulation or legislation.

Having worked on a series of challenge prizes on open data over the last couple of years, as well as subsequently working with organisations on how our design principles could be applied to their objectives, I’ve spent some time thinking about when it’s appropriate to run a challenge prize. The design and practicalities of running a successful challenge prize are not always straightforward. Thankfully there has already been some useful broad guidance on this from Nesta’s Centre for Challenge Prizes in their Challenge Prize Practice Guide and McKinsey and Deloitte have also published guides.

Nevertheless, despite this high-quality guidance, like many things in life, the most difficult part is knowing where to start. Organisations struggle to understand whether they have the right problem in the first place. In many instances running a challenge prize is not the appropriate organisational response to an issue, and it's best to discover this early on. From my experience, there are two key questions worth asking when you're trying to work out if your problem is suitable:

1. Is your problem an issue for anyone other than your own organisation?…

2. Will other people see solving this problem as an investment opportunity or worth funding?…

These two considerations come down to one thing — incentive. Firstly, does anyone other than your organisation care about this issue, and secondly, do they care enough about it to pay to solve it?…(More)"

Met Office warns of big data floods on the horizon


at V3: "The amount of data being collected by departments and agencies means government services will not be able to implement truly open data strategies, according to Met Office CIO Charles Ewen.

Ewen said the rapidly increasing amount of data being stored by companies and government departments means it will not be technologically possible to share all their data in the near future.

During a talk at the Cloud World Forum on Wednesday, he said: “The future will be bigger and bigger data. Right now we’re talking about petabytes, in the near future it will be tens of petabytes, then soon after it’ll be hundreds of petabytes and then we’ll be off into imaginary figure titles.

"We see a future where data has gotten so big the notion of open data and the idea 'let's share our data with everybody and anybody' just won't work. We're struggling to make it work already and by 2020 the national infrastructure will not exist to shift this stuff [data] around in the way anybody could access and make use of it."

Ewen added that to deal with the shift he expects many departments and agencies will adapt their processes to become digital curators that are more selective about the data they share, to try and ensure it is useful.

“This isn’t us wrapping our arms around our data and saying you can’t see it. We just don’t see how we can share all this big data in the way you would want it,” he said.

“We see a future where a select number of high-capacity nodes become information brokers and are used to curate and manage data. These curators will be where people bring their problems. That’s the future we see.”

Ewen added that the current expectations around open data are based on misguided views about the capabilities of cloud technology to host and provide access to huge amounts of data.

“The trendy stuff out there claims to be great at everything, but don’t get carried away. We don’t see cloud as anything but capability. We’ve been using appropriate IT and what’s available to deliver our mission services for over 50 to 60 years, and cloud is playing an increasing part of that, but purely for increased capability,” he said.

“It’s just another tool. The important thing is having the skill and knowledge to not just believe vendors but to look and identify the problem and say ‘we have to solve this’.”

The Met Office CIO’s comments follow reports from other government service providers that people’s desire for open data is growing exponentially….(More)”

Open data set to reshape charity and activism in 2016


The Guardian: “In 2015 the EU launched the world’s first international data portal, the Chinese government pledged to make state data public, and the UK lost its open data crown to Taiwan. Troves of data were unlocked by governments around the world last year, but the usefulness of much of that data is still to be determined by the civic groups, businesses and governments who use it. So what’s in the pipeline? And how will the open data ecosystem grow in 2016? We asked the experts.

1. Data will be seen as infrastructure (Heather Savory, director general for data capability, Office for National Statistics)….

2. Journalists, charities and civil society bodies will embrace open data (Hetan Shah, executive director, the Royal Statistical Society)…

3. Activists will take it upon themselves to create data (Pavel Richter, chief executive, Open Knowledge International)….


4. Data illiteracy will come at a heavy price (Sir Nigel Shadbolt, principal, Jesus College, Oxford, professorial research fellow in computer science, University of Oxford and chairman and co-founder of the Open Data Institute…)

5. We’ll create better tools to build a web of data (Dr Elena Simperl, associate professor, electronics and computer science, University of Southampton) …(More)”

This Is How Visualizing Open Data Can Help Save Lives


Alexander Howard at the Huffington Post: “Cities are increasingly releasing data that they can use to make life better for their residents online — enabling journalists and researchers to better inform the public.

Los Angeles, for example, has analyzed data about injuries and deaths on its streets and published it online. Now people can check its conclusions and understand why the city prioritizes certain intersections.

The impact from these kinds of investments can lead directly to saving lives and preventing injuries. The work is part of a broader effort around the world to make cities safer.

Like New York City, San Francisco and Portland, Oregon, Los Angeles has adopted Sweden's "Vision Zero" program as part of its strategy for eliminating traffic deaths. California led the nation in bicycle deaths in 2014.

At visionzero.lacity.org, you can see that the City of Los Angeles is using data visualization to identify the locations of “high injury networks,” or the 6 percent of intersections that account for 65 percent of the severe injuries in the area.
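The "high injury network" statistic — 6 percent of intersections accounting for 65 percent of severe injuries — is a concentration calculation: rank intersections by injury count and take the smallest top slice that covers the target share. A minimal sketch with made-up counts (the real analysis runs on LA's actual collision data):

```python
def high_injury_network(injury_counts, target_share=0.65):
    """Return the smallest set of intersections, taken in descending order of
    injury count, that together account for at least target_share of injuries."""
    total = sum(injury_counts.values())
    ranked = sorted(injury_counts.items(), key=lambda kv: kv[1], reverse=True)
    network, covered = [], 0
    for intersection, count in ranked:
        network.append(intersection)
        covered += count
        if covered / total >= target_share:
            break
    return network

# Hypothetical severe-injury counts per intersection
counts = {"A": 50, "B": 30, "C": 10, "D": 5, "E": 5}
print(high_injury_network(counts))  # → ['A', 'B'] (80% of injuries at 2 of 5 intersections)
```

Mapping the resulting small set of intersections is what turns a citywide injury table into an actionable enforcement and engineering target list.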


The work is the result of LA's partnership with University of Southern California graduate students. As a result of these analyses, the Los Angeles Police Department has been cracking down on jaywalking near the University of Southern California.

Abhi Nemani, the former chief data officer for LA, explained why the city needed to “go back to school” for help.

“In resource-constrained environments — the environment most cities find themselves in these days — you often have to beg, borrow, and steal innovation; particularly so, when it comes to in-demand resources such as data science expertise,” he told the Huffington Post.

“That’s why in Los Angeles, we opted to lean on the community for support: both the growing local tech sector and the expansive academic base. The academic community, in particular, was eager to collaborate with the city. In fact, most — if not all — local institutions reached out to me at some point asking to partner on a data science project with their graduate students.”

The City of Los Angeles is now working with another member of its tech sector to eliminate traffic deaths. DataScience, based in Culver City, California, received $22 million in funding in December to deliver predictive insights for customers.

"The City of Los Angeles is very data-driven," DataScience CEO Ian Swanson told HuffPost. "I commend Mayor Eric Garcetti and the City of Los Angeles on the openness, transparency, and availability of city data. Initiatives like Vision Zero put the City of Los Angeles' data into action and improve life in this great city."

DataScience created an interactive online map showing the locations of collisions involving bicycles across the city….(More)”

50 states, 50 public records stories


 at Poynter: “I try to feature journalists who are telling important stories using public records. For my final column of 2015, I wanted to do something big and decided to find public records stories from all 50 states (plus, a bonus: Washington, D.C.).

This is not meant to be a “best of” list. It’s simply a collection of public records stories from the past year that intrigued me. I found many of the stories by searching the National Freedom of Information Coalition’s website, as well as Investigative Reporters & Editors.…check out my list of public records stories from around the country and see what records journalists are requesting.  It’s full of great story ideas:

Alabama

Auburn spent $1.67 million on Outback Bowl trip

(Montgomery Advertiser)

Auburn spent more than $1.6 million on its Outback Bowl trip, according to the Institutional Bowl Expense report summary submitted to the NCAA and released in response to a Freedom of Information Act request.

Alaska

KMXT sues Kodiak City for documents in police brutality case

(KMXT)

The public radio station filed suit against the City of Kodiak to get records from police after three officers handcuffed and pepper-sprayed a man with autism.

Arizona

Legislature redacts, delays and denies access to messages

(Arizona Capitol Times)

The newspaper requested electronic messages sent among top state elected officials of both parties and their top staff. But getting access to those messages was difficult.

Arkansas

Some question email deletion policies

(Arkansas News)

After the state treasurer’s office instituted a policy requiring employees to delete all emails after 30 days, critics questioned whether it was necessary and whether it was consistent with the spirit of open government.

California

Collapsed I-10 bridge given an A rating just last year

(The Desert Sun)

After a bridge collapsed on Interstate 10, the newspaper reviewed Federal Highway Administration data and found that the bridge had been given an “A” rating and one of the highest possible flood safety ratings.

Colorado

Students accuse CU-Boulder of delaying release of debate documents

(Daily Camera)

University of Colorado students accused administrators of dragging their feet on an open records request the students filed to get letters, emails and documents related to the Republican presidential debate held on campus….(More)”