Press Release: “Today the Governance Lab (The GovLab) launches The GovLab Academy at the Open Government Partnership Annual Meeting in London.
Available at www.thegovlabacademy.org, the Academy is a free online community for those wanting to teach and learn how to solve public problems and improve lives using innovations in governance. A partnership between The GovLab at New York University and MIT Media Lab’s Online Learning Initiative, the site launching today offers curated videos, podcasts, readings and activities designed to enable purpose-driven learners to deepen their practical knowledge at their own pace.
The GovLab Academy is funded by a grant from the John S. and James L. Knight Foundation. “The GovLab Academy addresses a growing need among policy makers at all levels – city, federal and global – to leverage advances in technology to govern differently,” says Carol Coletta, Vice President of Community and National Initiatives at the Knight Foundation. “By connecting the latest technological innovations to a community of willing mentors, the Academy has the potential to catalyze more experimentation in a sector that badly needs it.”
Initial topics include using data to improve policymaking and cover the role of big data, urban analytics, smart disclosure and open data in governance. A second track focuses on online engagement and includes practical strategies for using crowdsourcing to solicit ideas, organize distributed work and gather data. The site features both curated content drawn from a variety of sources and original interviews with innovators from government, civil society, the tech industry, the arts and academia, who talk about their work implementing innovations around the world, what worked and what didn’t, to improve real people’s lives.
Beth Noveck, Founder and Director of The GovLab, describes its mission: “The Academy is an experiment in peer production where every teacher is a learner and every learner a teacher. Consistent with The GovLab’s commitment to measuring what works, we want to measure our success by the people contributing as well as consuming content. We invite everyone with ideas, stories, insights and practical wisdom to contribute to what we hope will be a thriving and diverse community for social change.””
New U.S. Open Government National Action Plan
The White House Fact Sheet: “In September 2011, President Obama joined the leaders of seven other nations in announcing the launch of the Open Government Partnership (OGP) – a global effort to encourage transparent, effective, and accountable governance.
Two years later, OGP has grown to 60 countries that have made more than 1000 commitments to improve the governance of more than two billion people around the globe. OGP is now a global community of government reformers, civil society leaders, and business innovators working together to develop and implement ambitious open government reforms and advance good governance…
Today at the OGP summit in London, the United States announced a new U.S. Open Government National Action Plan that includes six ambitious new commitments that will advance these efforts even further. Those commitments include expanding open data, modernizing the Freedom of Information Act (FOIA), increasing fiscal transparency, increasing corporate transparency, advancing citizen engagement and empowerment, and more effectively managing public resources.
Expand Open Data: Open Data fuels innovation that grows the economy and advances government transparency and accountability. Government data has been used by journalists to uncover variations in hospital billings, by citizens to learn more about the social services provided by charities in their communities, and by entrepreneurs building new software tools to help farmers plan and manage their crops. Building upon the successful implementation of open data commitments in the first U.S. National Action Plan, the new Plan will include commitments to make government data more accessible and useful for the public, such as reforming how Federal agencies manage government data as a strategic asset, launching a new version of Data.gov, and expanding agriculture and nutrition data to help farmers and communities.
Modernize the Freedom of Information Act (FOIA): The FOIA encourages accountability through transparency and represents a profound national commitment to open government principles. Improving FOIA administration is one of the most effective ways to make the U.S. Government more open and accountable. Today, the United States announced a series of commitments to further modernize FOIA processes, including launching a consolidated online FOIA service to improve customers’ experience and making training resources available to FOIA professionals and other Federal employees.
Increase Fiscal Transparency: The Administration will further increase the transparency of where Federal tax dollars are spent by making federal spending data more easily available on USASpending.gov; facilitating the publication of currently unavailable procurement contract information; and enabling Americans to more easily identify who is receiving tax dollars, where those entities or individuals are located, and how much they receive.
Increase Corporate Transparency: Preventing criminal organizations from concealing the true ownership and control of businesses they operate is a critical element in safeguarding U.S. and international financial markets, addressing tax avoidance, and combatting corruption in the United States and abroad. Today we committed to take further steps to enhance transparency of legal entities formed in the United States.
Advance Citizen Engagement and Empowerment: OGP was founded on the principle that an active and robust civil society is critical to open and accountable governance. In the next year, the Administration will intensify its efforts to roll back and prevent new restrictions on civil society around the world in partnership with other governments, multilateral institutions, the philanthropy community, the private sector, and civil society. This effort will focus on improving the legal and regulatory framework for civil society, promoting best practices for government-civil society collaboration, and conceiving of new and innovative ways to support civil society globally.
More Effectively Manage Public Resources: Two years ago, the Administration committed to ensuring that American taxpayers receive every dollar due for the extraction of the nation’s natural resources by committing to join the Extractive Industries Transparency Initiative (EITI). We continue to work toward achieving full EITI compliance in 2016. Additionally, the U.S. Government will disclose revenues on geothermal and renewable energy and discuss future disclosure of timber revenues.
For more information on OGP, please visit www.opengovpartnership.org or follow @opengovpart on Twitter.”
See also White House Plans a Single FOIA Portal Across Government
Open Data Barometer
Press Release by the Open Data Research Network: “New research by the World Wide Web Foundation and the Open Data Institute shows that 55% of countries surveyed have open data initiatives in place, yet less than 10% of key government datasets across the world are truly open to the public… the Open Data Barometer. This 77-country study, which considers the interlinked areas of policy, implementation and impact, ranks the UK at number one. The USA, Sweden, New Zealand, Denmark and Norway (tied) make up the rest of the top five. Kenya is ranked as the most advanced developing country, outperforming richer countries such as Ireland, Italy and Belgium in global comparisons.
The Barometer reveals that:
- 55% of countries surveyed have formal open data policies in place.
- Valuable but potentially controversial datasets – such as company registers and land registers – are among the least likely to be openly released. It is unclear whether this stems from reluctance to drop lucrative access charges, or from a desire to keep a lid on politically sensitive information, or both. However, the net effect is to severely limit the accountability benefits of open data.
- When they are released, government datasets are often issued in inaccessible formats. Across the nations surveyed, fewer than 1 in 10 key datasets that could be used to hold governments to account, stimulate enterprise, and promote better social policy are available and truly open for re-use.
The research also makes the case that:
- Efforts should be made to empower civil society, entrepreneurs and members of the public to use government data made available, rather than simply publishing data online.
- Business activity and innovation can be boosted by strong open data policies. In Denmark, for example, free of charge access to address data has had a significant economic impact. In 2010, an evaluation recorded an estimated financial benefit to society of EUR 62 million against costs of EUR 2 million.”
Seizing the data opportunity: UK data capability strategy
New UK Policy Paper by the Department for Business, Innovation & Skills: “In the information economy, the ability to handle and analyse data is essential for the UK’s competitive advantage and business transformation. The volume, velocity and variety of data being created and analysed globally are rising every day, and using data intelligently has the potential to transform public sector organisations, drive research and development, and enable market-changing products and services. The social and economic potential is significant, and the UK is well placed to compete in the global market for data analytics. Through this strategy, the government aims to place the UK at the forefront of this process by building our capability to exploit data for the benefit of citizens, business, and academia. This is our action plan for making the UK a data success story.”
Big Data
Special Report on Big Data by Volta – A newsletter on Science, Technology and Society in Europe: “Locating crime spots, or the next outbreak of a contagious disease, Big Data promises benefits for society as well as business. But more means messier. Do policy-makers know how to use this scale of data-driven decision-making in an effective way for their citizens and ensure their privacy? 90% of the world’s data have been created in the last two years. Every minute, more than 100 million new emails are created, 72 hours of new video are uploaded to YouTube and Google processes more than 2 million searches. Nowadays, almost everyone walks around with a small computer in their pocket, uses the internet on a daily basis and shares photos and information with their friends, family and networks. The digital exhaust we leave behind every day contributes to an enormous amount of data produced, and at the same time leaves electronic traces that contain a great deal of personal information….
Until recently, traditional technology and analysis techniques have not been able to handle this quantity and type of data. But recent technological developments have enabled us to collect, store and process data in new ways. There seem to be no limits, either to the volume of data or to the technology for storing and analyzing them. Big Data can map a driver’s sitting position to identify a car thief, it can use Google searches to predict outbreaks of the H1N1 flu virus, it can data-mine Twitter to predict the price of rice or use mobile phone top-ups to describe unemployment in Asia.
The word ‘data’ means ‘given’ in Latin. It commonly refers to a description of something that can be recorded and analyzed. While there is no clear definition of the concept of ‘Big Data’, it usually refers to processing huge amounts and new types of data in ways that were not possible with traditional tools.
The notion of Big Data is kind of misleading, argues Robindra Prabhu, a project manager at the Norwegian Board of Technology. “The new development is not necessarily that there are so much more data. It’s rather that data is available to us in a new way. The digitalization of society gives us access to both ‘traditional’, structured data – like the content of a database or register – and unstructured data, for example the content in a text, pictures and videos. Information designed to be read by humans is now also readable by machines. And this development makes a whole new world of data gathering and analysis available. Big Data is exciting not just because of the amount and variety of data out there, but that we can process data about so much more than before.”
Open data: Unlocking innovation and performance with liquid information
New report by McKinsey Global Institute: “Open data—machine-readable information, particularly government data, that’s made available to others—has generated a great deal of excitement around the world for its potential to empower citizens, change how government works, and improve the delivery of public services. It may also generate significant economic value, according to a new McKinsey report. Our research suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data, which is already giving rise to hundreds of entrepreneurial businesses and helping established companies to segment markets, define new products and services, and improve the efficiency and effectiveness of operations.
Although the open-data phenomenon is in its early days, we see a clear potential to unlock significant economic value by applying advanced analytics to both open and proprietary knowledge. Open data can become an instrument for breaking down information gaps across industries, allowing companies to share benchmarks and spread best practices that raise productivity. Blended with proprietary data sets, it can propel innovation and help organizations replace traditional and intuitive decision-making approaches with data-driven ones. Open-data analytics can also help uncover consumer preferences, allowing companies to improve new products and to uncover anomalies and needless variations. That can lead to leaner, more reliable processes.
However, investments in technology and expertise are required to use the data effectively. And there is much work to be done by governments, companies, and consumers to craft policies that protect privacy and intellectual property, as well as establish standards to speed the flow of data that is not only open but also “liquid.” After all, consumers have serious privacy concerns, and companies are reluctant to share proprietary information—even when anonymity is assured—for fear of losing competitive advantage…”
See also Executive Summary and Full Report
Smart Citizens
FutureEverything: “This publication aims to shift the debate on the future of cities towards the central place of citizens, and of decentralised, open urban infrastructures. It provides a global perspective on how cities can create the policies, structures and tools to engender a more innovative and participatory society. The publication contains a series of 23 short essays representing some of the key voices developing an emerging discourse around Smart Citizens. Contributors include:
- Dan Hill, Smart Citizens pioneer and CEO of communications research centre and transdisciplinary studio Fabrica, on why Smart Citizens Make Smart Cities.
- Anthony Townsend, urban planner, forecaster and author of Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, on the tensions between place-making and city-making and the role of mobile technologies in changing the way that people interact with their surroundings.
- Paul Maltby, Director of the Government Innovation Group and of Open Data and Transparency in the UK Cabinet Office, on how government can support a smarter society.
- Aditya Dev Sood, Founder and CEO of the Center for Knowledge Societies, presents polarised hypothetical futures for India in 2025, arguing for the use of technology to bridge gaps in social inequality.
- Adam Greenfield, New York City-based writer and urbanist, on Recuperating the Smart City.
Editors: Drew Hemment, Anthony Townsend
Download Here.”
What Government Can and Should Learn From Hacker Culture
Alexis Wichowski in The Atlantic: “Can the open-source model work for federal government? Not in every way—for security purposes, the government’s inner workings will never be completely open to the public. Even in the inner workings of government, fears of triggering the next Wikileaks or Snowden scandal may scare officials away from being more open with one another. While not every area of government can be more open, there are a few areas ripe for change.
Perhaps the most glaring need for an open-source approach is in information sharing. Today, among and within several federal agencies, a culture of reflexive and unnecessary information withholding prevails. This knee-jerk secrecy can backfire with fatal consequences, as seen in the 1998 embassy bombings in Africa, the 9/11 attacks, and the Boston Marathon bombings. What’s most troubling is that decades after the dangers of withholding information were identified, the problem persists.
What’s preventing reform? The answer starts with the government’s hierarchical structure—though an information-is-power mentality and “need to know” Cold War-era culture contribute too. To improve the practice of information sharing, government needs to change the structure of information sharing. Specifically, it needs to flatten the hierarchy.
Former Obama Administration regulation czar Cass Sunstein’s “nudge” approach shows how this could work. In his book Simpler: The Future of Government, he describes how making even small changes to an environment can effect significant changes in behavior. While Sunstein focuses on regulations, the broader lesson is clear: change the environment to encourage better behavior and people tend to exhibit better behavior. Without such strict adherence to the many tiers of the hierarchy, those working within it could be nudged towards sharing information rather than having to fight to do so.
One example of where this worked is the State Department’s annual Religious Engagement Report (RER). In 2011, the office in charge of the RER decided that instead of having every embassy submit its data via email, each would post it on a secure wiki. On the surface, this was a decision to change an information-sharing procedure. But it also changed the information-sharing culture. Instead of sharing information only along the supervisor-subordinate axis, it created a norm of sharing laterally, among colleagues.
Another advantage to flattening information-sharing hierarchies is that it reduces the risk of creating “single points of failure,” to quote technology scholar Beth Noveck. The massive amounts of data now available to us may need massive amounts of eyeballs in order to spot patterns of problems—small pools of supervisors atop the hierarchy cannot be expected to shoulder those burdens alone. And while having the right tech tools to share information is part of the solution—as the wiki was for the RER—it’s not enough. Leadership must also create a culture that nudges staff to use these tools, even if that means relinquishing a degree of their own power.
Finally, a more open work culture would help connect interested parties across government to let them share the hard work of bringing new ideas to fruition. Government is filled with examples of interesting new projects that stall in their infancy. Creating a large pool of collaborators dedicated to a project increases the likelihood that when one torchbearer burns out, others in the agency will pick up for them.
When Linus Torvalds released Linux, it was considered, in Eric S. Raymond’s words, “subversive” and “a distinct shock.” Could the federal government withstand such a shock?
Evidence suggests it can—and the transformation is already happening in small ways. One of the winners of the Harvard Kennedy School’s Innovations in Government award is State’s Consular Team India (CTI), which won for joining their embassy and four consular posts—each of which used to have its own distinct set of procedures—into a single, more effective unit that could deliver standardized services. As CTI describes it, “this is no top-down bureaucracy” but one that shares “a common base of information and shared responsibilities.” They flattened the hierarchy, and not only lived, but thrived.”
Open Data Index provides first major assessment of state of open government data
Press Release from the Open Knowledge Foundation: “In the week of a major international summit on government transparency in London, the Open Knowledge Foundation has published its 2013 Open Data Index, showing that governments are still not providing enough information in an accessible form to their citizens and businesses.
The UK and US top the 2013 Index, which is a result of community-based surveys in 70 countries. They are followed by Denmark, Norway and the Netherlands. Of the countries assessed, Cyprus, St Kitts & Nevis, the British Virgin Islands, Kenya and Burkina Faso ranked lowest. There are many countries where governments are less open but that were not assessed because they lack openness or a sufficiently engaged civil society. This includes 30 countries that are members of the Open Government Partnership.
The Index ranks countries based on the availability and accessibility of information in ten key areas, including government spending, election results, transport timetables, and pollution levels, and reveals that whilst some good progress is being made, much remains to be done.
Rufus Pollock, Founder and CEO of the Open Knowledge Foundation, said:
Opening up government data drives democracy, accountability and innovation. It enables citizens to know and exercise their rights, and it brings benefits across society: from transport, to education and health. There has been a welcome increase in support for open data from governments in the last few years, but this Index reveals that too much valuable information is still unavailable.
The UK and US are leaders on open government data but even they have room for improvement: the US for example does not provide a single consolidated and open register of corporations, while the UK Electoral Commission lets down the UK’s good overall performance by not allowing open reuse of UK election data.
There is a very disappointing degree of openness of company registers across the board: only 5 out of the 20 leading countries have even basic information available via a truly open licence, and only 10 allow any form of bulk download. This information is critical for a range of reasons – including tackling tax evasion and other forms of financial crime and corruption.
Less than half of the key datasets in the top 20 countries are available to re-use as open data, showing that even the leading countries do not fully understand the importance of citizens and businesses being able to legally and technically use, reuse and redistribute data. This enables them to build and share commercial and non-commercial services.
To see the full results: https://index.okfn.org. For graphs of the data: https://index.okfn.org/visualisations.”
Google’s flu fail shows the problem with big data
Adam Kucharski in The Conversation: “When people talk about ‘big data’, there is an oft-quoted example: a proposed public health tool called Google Flu Trends. It has become something of a pin-up for the big data movement, but it might not be as effective as many claim.
The idea behind big data is that large amounts of information can help us do things which smaller volumes cannot. Google first outlined the Flu Trends approach in a 2008 paper in the journal Nature. Rather than relying on the disease surveillance used by the US Centers for Disease Control and Prevention (CDC) – such as visits to doctors and lab tests – the authors suggested it would be possible to predict epidemics through Google searches. When suffering from flu, many Americans will search for information related to their condition….
Between 2003 and 2008, flu epidemics in the US had been strongly seasonal, appearing each winter. However, in 2009, the first cases (as reported by the CDC) started around Easter. Flu Trends had already made its predictions when the CDC data was published, but it turned out that the Google model didn’t match reality. It had substantially underestimated the size of the initial outbreak.
The problem was that Flu Trends could only measure what people search for; it didn’t analyse why they were searching for those words. By removing human input, and letting the raw data do the work, the model had to make its predictions using only search queries from the previous handful of years. Although the 45 search terms used in the model matched the regular seasonal outbreaks from 2003–8, they didn’t reflect the pandemic that appeared in 2009.
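To make the mechanics the article describes concrete, here is a minimal, hypothetical sketch of that general approach: regress a flu-activity indicator on weekly search-term frequencies using only past seasonal data, then apply the fitted model to an out-of-season outbreak in which search behaviour differs. Everything below is a synthetic stand-in; it is not Google’s actual query data, term list or model.

```python
# Minimal illustration of the Flu Trends idea: fit flu activity to search-term
# frequencies on past winters, then apply the model to a spring outbreak.
# All numbers are synthetic stand-ins, not Google's data or model.
import numpy as np

rng = np.random.default_rng(0)
weeks = 52 * 5                      # five training seasons (2003-08 in spirit)
t = np.arange(weeks)

# Synthetic "true" flu activity: a regular winter peak each year.
ili_train = 1.0 + 2.0 * np.clip(np.sin(2 * np.pi * (t % 52) / 52), 0, None)

# Synthetic search-term frequencies that track seasonal flu, plus noise.
n_terms = 5                         # stand-in for the 45 query terms in the real model
weights = rng.uniform(0.5, 1.5, n_terms)
X_train = ili_train[:, None] * weights + rng.normal(0.0, 0.2, (weeks, n_terms))

# Ordinary least-squares fit: flu activity ~ search frequencies + intercept.
A = np.hstack([X_train, np.ones((weeks, 1))])
coef, *_ = np.linalg.lstsq(A, ili_train, rcond=None)

# An out-of-season outbreak (spring 2009 in spirit): people now search for flu
# in different proportions, so the learned coefficients no longer apply.
ili_true = np.full(10, 4.0)                                  # large spring outbreak
X_new = ili_true[:, None] * rng.uniform(0.2, 0.6, n_terms)   # weaker search signal
pred = np.hstack([X_new, np.ones((10, 1))]) @ coef
print("true level ~4.0, model estimates:", np.round(pred, 2))
```

Because the coefficients only encode how searches tracked flu in past winters, an outbreak in which that relationship shifts, as happened in spring 2009, is estimated poorly.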
Six months after the pandemic started, Google – which now had the benefit of hindsight – updated its model so that it matched the 2009 CDC data. Despite these changes, the updated version of Flu Trends ran into difficulties again last winter, when it overestimated the size of the influenza epidemic in New York State. The incidents in 2009 and 2012 raised the question of how good Flu Trends is at predicting future epidemics, as opposed to merely finding patterns in past data.
In a new analysis, published in the journal PLOS Computational Biology, US researchers report that there are “substantial errors in Google Flu Trends estimates of influenza timing and intensity”. This is based on a comparison of Google Flu Trends predictions and the actual epidemic data at the national, regional and local levels between 2003 and 2013.
Even when search behaviour was correlated with influenza cases, the model sometimes misestimated important public health metrics such as peak outbreak size and cumulative cases. The predictions were particularly wide of the mark in 2009 and 2012.
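As an illustration of the kind of comparison that analysis reports, the sketch below takes one hypothetical season of model estimates alongside corresponding CDC figures and computes the error in peak intensity, the offset in peak timing and the error in cumulative cases; the weekly numbers are invented for illustration and are not the study’s data.

```python
# Hypothetical comparison of model estimates against CDC figures for one season;
# the arrays below are invented for illustration, not the study's data.
import numpy as np

cdc = np.array([1.2, 1.8, 2.9, 4.1, 5.6, 4.8, 3.2, 2.1, 1.4])   # weekly ILI rate (%)
gft = np.array([1.3, 2.2, 3.8, 6.0, 8.9, 7.5, 4.9, 3.0, 1.8])   # model estimates

peak_error = (gft.max() - cdc.max()) / cdc.max()        # relative error in peak intensity
timing_offset = int(gft.argmax()) - int(cdc.argmax())   # weeks between estimated and observed peak
cumulative_error = (gft.sum() - cdc.sum()) / cdc.sum()  # relative error in season total

print(f"peak intensity error: {peak_error:+.0%}")
print(f"peak timing offset:   {timing_offset:+d} weeks")
print(f"cumulative error:     {cumulative_error:+.0%}")
```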

Although they criticised certain aspects of the Flu Trends model, the researchers think that monitoring internet search queries might yet prove valuable, especially if it were linked with other surveillance and prediction methods.
Other researchers have also suggested that other sources of digital data – from Twitter feeds to mobile phone GPS – have the potential to be useful tools for studying epidemics. As well as helping to analyse outbreaks, such methods could allow researchers to analyse human movement and the spread of public health information (or misinformation).
Although much attention has been given to web-based tools, there is another type of big data that is already having a huge impact on disease research. Genome sequencing is enabling researchers to piece together how diseases transmit and where they might come from. Sequence data can even reveal the existence of a new disease variant: earlier this week, researchers announced a new type of dengue fever virus….”