Digital Strategy: Delivering Better Results for the Public


The White House Blog: “Today marks one year since we released the Digital Government Strategy (PDF/HTML5), as part of the President’s directive to build a 21st Century Government that delivers better services to the American people.
The Strategy is built on the proposition that all Americans should be able to access information from their Government anywhere, anytime, and on any device; that open government data – data that are publicly accessible in easy-to-use formats – can fuel innovation and economic growth; and that technology can make government more transparent, more efficient, and more effective.
A year later, there’s a lot to be proud of:
Information-Centric
In twelve months, the Federal Government has significantly shifted how it thinks about digital information – treating data as a valuable national asset that should be open and available to the public, to entrepreneurs, and others, instead of keeping it trapped in government systems. …
Shared Platform
The Federal Government and the American people cannot afford to have each agency build isolated and duplicative technology solutions. Instead, we must use modern platforms for digital services that can be shared across agencies….
Customer-Centric
Citizens shouldn’t have to struggle to access the information they need. To ensure that the American people can easily find government services, we implemented a government-wide Digital Analytics Program across all Federal websites….
Security and Privacy
Throughout all of these efforts, maintaining cyber security and protecting privacy have been paramount….
In the end, the digital strategy is all about connecting people to government resources in useful ways. And by “connecting” we mean a two-way street….
Learn more at: http://www.whitehouse.gov/digitalgov/strategy-milestones and http://www.whitehouse.gov/digitalgov/deliverables.”

Introducing: Project Open Data


White House Blog: “Technology evolves rapidly, and it can be challenging for policy and its implementation to evolve at the same pace.  Last week, President Obama launched the Administration’s new Open Data Policy and Executive Order aimed at ensuring that data released by the government will be as accessible and useful as possible.  To make sure this tech-focused policy can keep up with the speed of innovation, we created Project Open Data.
Project Open Data is an online, public repository intended to foster collaboration and promote the continual improvement of the Open Data Policy. We wanted to foster a culture change in government where we embrace collaboration and where anyone can help us make open data work better. The project is published on GitHub, an open source platform that allows communities of developers to collaboratively share and enhance code.  The resources and plug-and-play tools in Project Open Data can help accelerate the adoption of open data practices.  For example, one tool instantly converts spreadsheets and databases into APIs for easier consumption by developers.  The idea is that anyone, from Federal agencies to state and local governments to private citizens, can freely use and adapt these open source tools—and that’s exactly what’s happening.
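The spreadsheet-to-API idea mentioned above is easy to picture. The following is a minimal Python sketch of the general pattern, not the actual Project Open Data tool: read a CSV file and serve its rows as JSON over HTTP. The file name and port are placeholders.

```python
# Minimal sketch of a "spreadsheet to API" converter: serve the rows of a CSV
# file as JSON over HTTP. Illustrative only -- not the actual Project Open Data
# tool. The file name and port are placeholders.
import csv
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CSV_PATH = "data.csv"  # hypothetical spreadsheet exported as CSV

def load_rows(path):
    """Read the CSV into a list of dicts keyed by the header row."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

class CsvApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(load_rows(CSV_PATH)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # GET http://localhost:8000/ returns the spreadsheet as a JSON array.
    HTTPServer(("", 8000), CsvApiHandler).serve_forever()
```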
Within the first 24 hours after Project Open Data was published, more than two dozen contributions (or “pull requests” in GitHub speak) were submitted by the public. The submissions included everything from fixing broken links, to providing policy suggestions, to contributing new code and tools. One pull request even included new code that translates geographic data from locked formats into open data that is freely available for use by anyone…”
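The contributed geographic-data code itself is not reproduced here, but the general pattern it describes, converting spatial data from a proprietary format into an open one, could look roughly like the sketch below. It assumes the third-party geopandas library and uses an Esri shapefile purely as an example of a less open format; file names are placeholders.

```python
# Illustrative sketch (not the contributed code): convert geographic data from
# a proprietary format such as an Esri shapefile into open GeoJSON.
# Requires the third-party geopandas package; file names are placeholders.
import geopandas as gpd

def shapefile_to_geojson(src="parcels.shp", dst="parcels.geojson"):
    gdf = gpd.read_file(src)            # read the source geometry and attributes
    gdf.to_file(dst, driver="GeoJSON")  # write them back out as GeoJSON
    return dst

if __name__ == "__main__":
    print(shapefile_to_geojson())
```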

"ambient accountability"


New blog by Dieter Zinnbauer: “What is ambient accountability?
Big words for a simple idea: how to systematically use the built environment and physical space to help people right at the place and time when they need it most to:

  1. understand their rights and entitlements (the what is supposed to happen)
  2. monitor the performance of public officials and service providers (the what is actually happening)
  3. figure out who is responsible and offer easy ways to take action if things go wrong and 1) does not match up with 2)

Ambient accountability is a very elastic concept. It ranges from the very simple (stickers, placards, billboards) to the artistically nifty (murals, projections) and the very futuristic (urban screens, augmented reality). It can include the official advisory, the NGO poster, as well as the bottom-up urban intervention…
For more, see the blog entries with a quick overview and the historical backdrop, and this working paper with lots of visual examples and a more in-depth account of why ambient accountability has a lot of potential to complement the existing anti-corruption repertoire and, at the same time, offer an interesting area of application for all those urban computing or open government initiatives.”

OpenData Latinoamérica


Mariano Blejman and Miguel Paz @ IJNet Blog: “We need a central repository where you can share the data that you have proved to be reliable. Our answer to this need: OpenData Latinoamérica, which we are leading as ICFJ Knight International Journalism Fellows.
Inspired by the open data portal created by ICFJ Knight International Journalism Fellow Justin Arenstein in Africa, OpenData Latinoamérica aims to improve the use of data in a region where data sets too often fail to show up where they should, and when they do, they are scattered across the web among governmental repositories and multiple independent repositories from which the data is removed too quickly.

The portal will be used at two big upcoming events: Bolivia’s first DataBootCamp and the Conferencia de Datos Abiertos (Open Data Conference) in Montevideo, Uruguay. Then, we’ll hold a series of hackathons and scrape-athons in Chile, which is in a period of presidential elections in which citizens increasingly demand greater transparency. Releasing data and developing applications for accountability will be the key.”

D4D Challenge Winners announced


Global Pulse Blog: “The winners of the Data for Development challenge – an international research challenge using a massive anonymized dataset provided by telecommunications company Orange – were announced at the NetMob 2013 Conference in Boston last week….
In this post we’ll look at the winners and how their research could be put to use.

Best Visualization prize winner: “Exploration and Analysis of Massive Mobile Phone Data: A Layered Visual Analytics Approach”

Best Development prize winner: “AllAboard: a System for Exploring Urban Mobility and Optimizing Public Transport Using Cellphone Data”

Best Scientific prize winner: “Analyzing Social Divisions Using Cell Phone Data”

First prize winner: “Exploiting Cellular Data for Disease Containment and Information Campaigns Strategies in Country-Wide Epidemics””

Measuring Impact of Open and Transparent Governance


Mark Robinson @ OGP blog: “Eighteen months on from the launch of the Open Government Partnership in New York in September 2011, there is growing attention to what has been achieved to date.  In the recent OGP Steering Committee meeting in London, government and civil society members were unanimous in the view that the OGP must demonstrate results and impact to retain its momentum and wider credibility.  This will be a major focus of the annual OGP conference in London on 31 October and 1 November, with an emphasis on showcasing innovations, highlighting results and sharing lessons.
Much has been achieved in eighteen months.  Membership has grown from 8 founding governments to 58.  Many action plan commitments have been realised for the majority of OGP member countries. The Independent Reporting Mechanism has been approved and launched. Lesson learning and sharing experience is moving ahead….
The third type of results is the trickiest to measure: What has been the impact of openness and transparency on the lives of ordinary citizens?  In the two years since the OGP was launched it may be difficult to find many convincing examples of such impact, but it is important to make a start in collecting such evidence.
Impact on the lives of citizens would be evident in improvements in the quality of service delivery, by making information on quality, access and complaint redressal public. A related example would be efficiency savings realised from publishing government contracts.  Misallocation of public funds exposed through enhanced budget transparency is another. Action on corruption arising from bribes for services, misuse of public funds, or illegal procurement practices would all be significant results from these transparency reforms.  A final example relates to jobs and prosperity, where the private sector uses government data in the public domain to inform business investment decisions and create employment.
Generating convincing evidence on the impact of transparency reforms is critical to the longer-term success of the OGP. It is the ultimate test of whether lofty public ambitions announced in country action plans achieve real impacts to the benefit of citizens.”

An API for "We the People"


The White House Blog: “We can’t talk about We the People without getting into the numbers — more than 8 million users, more than 200,000 petitions, more than 13 million signatures. The sheer volume of participation is, to us, a sign of success.
And there’s a lot we can learn from a set of data that rich and complex, but we shouldn’t be the only people drawing from its lessons.
So starting today, we’re making it easier for anyone to do their own analysis or build their own apps on top of the We the People platform. We’re introducing the first version of our API, and we’re inviting you to use it.
Get started here: petitions.whitehouse.gov/developers
This API provides read-only access to data on all petitions that passed the 150 signature threshold required to become publicly-available on the We the People site. For those who don’t need real-time data, we plan to add the option of a bulk data download in the near future. Until that’s ready, an incomplete sample data set is available for download here.”
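As a rough illustration of how a developer might consume the API, here is a hedged Python sketch. The base URL, parameters, and response field names below are assumptions modeled on a typical v1 JSON endpoint; consult petitions.whitehouse.gov/developers for the authoritative details.

```python
# Sketch of reading petition data from the We the People API.
# The endpoint, parameters, and field names are assumptions -- check
# petitions.whitehouse.gov/developers for the authoritative documentation.
import json
from urllib.request import urlopen

BASE_URL = "https://api.whitehouse.gov/v1/petitions.json"  # assumed endpoint

def fetch_petitions(limit=10, offset=0):
    """Return one page of petitions as parsed JSON."""
    url = f"{BASE_URL}?limit={limit}&offset={offset}"
    with urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    page = fetch_petitions(limit=5)
    # "results", "title", and "signatureCount" are assumed field names.
    for petition in page.get("results", []):
        print(petition.get("title"), "-", petition.get("signatureCount"))
```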

Department of Better Technology


Next City reports: “…opening up government can get expensive. That’s why two developers this week launched the Department of Better Technology, an effort to make open government tools cheaper, more efficient and easier to engage with.

As founder Clay Johnson explains in a post on the site’s blog, a federal website that catalogues databases on government contracts, which launched last year, cost $181 million to build — $81 million more than a recent research initiative to map the human brain.

“I’d like to say that this is just a one-off anomaly, but government regularly pays millions of dollars for websites,” writes Johnson, the former director of Sunlight Labs at the Sunlight Foundation and author of the 2012 book The Information Diet.

The first undertaking of Johnson and his partner, GovHub co-founder Adam Becker, is a tool meant to make it simpler for businesses to find government projects to bid on, as well as help officials streamline the process of managing procurements. In a pilot experiment, Johnson writes, the pair found that not only were bids coming in faster and at a reduced price, but more people were doing the bidding.

Per Johnson, “many of the bids that came in were from businesses that had not ordinarily contracted with the federal government before.”
The Department of Better Technology will accept five cities to test a beta version of this tool, called Procure.io, in 2013.”

Demystifying data centers


Wired: “If you walk into the lobby of the data center Facebook operates in the high desert in Prineville, Oregon, you’ll find a flatscreen display on the wall where you can check the pulse of this massive computing facility.
The display tracks the efficiency of the operation, which spans 333,400 square feet and tens of thousands of computer servers. Facebook built this data center in an effort to significantly reduce the power and dollars needed to serve up the world’s most popular social network, and — driven by CEO Mark Zuckerberg’s deep-seated belief in the free exchange of ideas — the company aims to push the computing world in a similar direction. The display — which shows much the same information Facebook engineers use to monitor the facility — is an advertisement for the Facebook way.
Now, the company is taking this idea a step further. On Thursday, Facebook uncloaked a pair of web services that let anyone in the world track the efficiency of the Prineville data center and its sister facility in Forest City, North Carolina. “We’re pulling back the curtain to share some of the same information that our data center technicians view every day,” Facebook’s Lyrica McTiernan said in a blog post. “We think it’s important to demystify data centers and share more about what our operations really look like.”
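The figures on dashboards like these center on standard data-center efficiency metrics such as power usage effectiveness (PUE), the ratio of total facility power to the power actually delivered to IT equipment. A minimal sketch of that calculation, with made-up readings:

```python
# Power usage effectiveness (PUE): total facility power divided by IT equipment
# power. A value of 1.0 would mean every watt reaches the computing gear.
# The readings below are made up for illustration.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

if __name__ == "__main__":
    # Hypothetical snapshot: 10,700 kW total draw, 9,800 kW of it reaching servers.
    print(round(pue(10_700, 9_800), 2))  # -> 1.09
```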

Surfing Logs Reveal Global Eating Patterns


From The Physics arXiv Blog:  “The way we view online recipes reveals how our eating habits change over time, say computational sociologists…. It’s no surprise that computational sociologists have begun to mine the data associated with our browsing habits to discover more about our diets and eating habits. Last year we looked at some fascinating work examining networks of ingredients and the flavours they contain, gathered from online recipe websites.  It turns out this approach gives fascinating insights into the way recipes vary geographically and into the possibility of unexplored combinations of flavours.
Today, Robert West at Stanford University and Ryen White and Eric Horvitz from Microsoft Research in Redmond take a deeper look at the electronic trails we leave when we hunt for food on the web. They say the data reveals important trends in the way our diets change with the season, with our geographical location and with certain special days such as Thanksgiving and Christmas. And they conclude that the data could become an important tool for monitoring public health.”
See also: arxiv.org/abs/1304.3742: “From Cookies to Cooks: Insights on Dietary Patterns via Analysis of Web Usage Logs”
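As a toy illustration of the kind of aggregation such studies rely on, the sketch below counts hypothetical recipe page views per ingredient per calendar month to surface seasonal patterns. It uses the third-party pandas library; the column names and data are invented.

```python
# Toy illustration of mining browsing logs for seasonal eating patterns:
# count (invented) recipe page views per ingredient per month.
import pandas as pd

log = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2012-11-20", "2012-11-22", "2012-12-24", "2013-06-15", "2013-06-16",
    ]),
    "recipe_ingredient": ["turkey", "turkey", "ham", "strawberry", "strawberry"],
})

# Views per ingredient per month reveal peaks around holidays (turkey near
# Thanksgiving) and seasons (strawberries in summer).
monthly = (
    log.assign(month=log["timestamp"].dt.month)
       .groupby(["recipe_ingredient", "month"])
       .size()
       .rename("views")
       .reset_index()
)
print(monthly)
```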