Open Data Barometer


Press Release by the Open Data Research Network: “New research by the World Wide Web Foundation and the Open Data Institute shows that 55% of countries surveyed have open data initiatives in place, yet less than 10% of key government datasets across the world are truly open to the public…the Open Data Barometer. This 77-country study, which considers the interlinked areas of policy, implementation and impact, ranks the UK at number one. The USA, Sweden, New Zealand, Denmark and Norway (tied) make up the rest of the top five. Kenya is ranked as the most advanced developing country, outperforming richer countries such as Ireland, Italy and Belgium in global comparisons.

The Barometer reveals that:

  • 55% of countries surveyed have formal open data policies in place.

  • Valuable but potentially controversial datasets – such as company registers and land registers – are among the least likely to be openly released. It is unclear whether this stems from reluctance to drop lucrative access charges, from a desire to keep a lid on politically sensitive information, or both. However, the net effect is to severely limit the accountability benefits of open data.

  • When they are released, government datasets are often issued in inaccessible formats. Across the nations surveyed, fewer than 1 in 10 of the key datasets that could be used to hold governments to account, stimulate enterprise, and promote better social policy are available and truly open for re-use.

The research also makes the case that:

  • Efforts should be made to empower civil society, entrepreneurs and members of the public to use the government data that is made available, rather than governments simply publishing data online.

  • Business activity and innovation can be boosted by strong open data policies. In Denmark, for example, free-of-charge access to address data has had a significant economic impact. In 2010, an evaluation recorded an estimated financial benefit to society of EUR 62 million against costs of EUR 2 million.”

What Government Can and Should Learn From Hacker Culture


In The Atlantic: “Can the open-source model work for federal government? Not in every way—for security purposes, the government’s inner workings will never be completely open to the public. Even in the inner workings of government, fears of triggering the next WikiLeaks or Snowden scandal may scare officials away from being more open with one another. While not every area of government can be more open, there are a few areas ripe for change.

Perhaps the most glaring need for an open-source approach is in information sharing. Today, among and within several federal agencies, a culture of reflexive and unnecessary information withholding prevails. This knee-jerk secrecy can backfire with fatal consequences, as seen in the 1998 embassy bombings in Africa, the 9/11 attacks, and the Boston Marathon bombings. What’s most troubling is that decades after the dangers of withholding information were identified, the problem persists.
What’s preventing reform? The answer starts with the government’s hierarchical structure—though an information-is-power mentality and “need to know” Cold War-era culture contribute too. To improve the practice of information sharing, government needs to change the structure of information sharing. Specifically, it needs to flatten the hierarchy.
Former Obama Administration regulation czar Cass Sunstein’s “nudge” approach shows how this could work. In his book Simpler: The Future of Government, he describes how making even small changes to an environment can effect significant changes in behavior. While Sunstein focuses on regulations, the broader lesson is clear: Change the environment to encourage better behavior and people tend to exhibit better behavior. Without such strict adherence to the many tiers of the hierarchy, those working within it could be nudged toward sharing information rather than fighting to do so.
One example of where this worked is the State Department’s annual Religious Engagement Report (RER). In 2011, the office in charge of the RER decided that instead of having every embassy submit its data via email, they would post it on a secure wiki. On the surface, this was a decision to change an information-sharing procedure. But it also changed the information-sharing culture. Instead of sharing information only along the supervisor-subordinate axis, it created a norm of sharing laterally, among colleagues.
Another advantage to flattening information-sharing hierarchies is that it reduces the risk of creating “single points of failure,” to quote technology scholar Beth Noveck. The massive amounts of data now available to us may need massive amounts of eyeballs in order to spot patterns of problems—small pools of supervisors atop the hierarchy cannot be expected to shoulder those burdens alone. And while having the right tech tools to share information is part of the solution—as the wiki was for the RER—it’s not enough. Leadership must also create a culture that nudges its staff to use these tools, even if that means relinquishing a degree of their own power.
Finally, a more open work culture would help connect interested parties across government to let them share the hard work of bringing new ideas to fruition. Government is filled with examples of interesting new projects that stall in their infancy. Creating a large pool of collaborators dedicated to a project increases the likelihood that when one torchbearer burns out, others in the agency will pick up the torch.
When Linus Torvalds released Linux, it was considered, in Eric S. Raymond’s words, “subversive” and “a distinct shock.” Could the federal government withstand such a shock?
Evidence suggests it can—and the transformation is already happening in small ways. One of the winners of the Harvard Kennedy School’s Innovations in Government award is State’s Consular Team India (CTI), which won for joining their embassy and four consular posts—each of which used to have its own distinct set of procedures—into a single, more effective unit that could deliver standardized services. As CTI describes it, “this is no top-down bureaucracy” but one that shares “a common base of information and shared responsibilities.” They flattened the hierarchy, and not only lived, but thrived.”

Open Data Index provides first major assessment of state of open government data


Press Release from the Open Knowledge Foundation: “In the week of a major international summit on government transparency in London, the Open Knowledge Foundation has published its 2013 Open Data Index, showing that governments are still not providing enough information in an accessible form to their citizens and businesses.
The UK and US top the 2013 Index, which is a result of community-based surveys in 70 countries. They are followed by Denmark, Norway and the Netherlands. Of the countries assessed, Cyprus, St Kitts & Nevis, the British Virgin Islands, Kenya and Burkina Faso ranked lowest. There are many countries where governments are less open but that were not assessed because they lack either openness or a sufficiently engaged civil society. This includes 30 countries that are members of the Open Government Partnership.
The Index ranks countries based on the availability and accessibility of information in ten key areas, including government spending, election results, transport timetables, and pollution levels, and reveals that whilst some good progress is being made, much remains to be done.
Rufus Pollock, Founder and CEO of the Open Knowledge Foundation said:

Opening up government data drives democracy, accountability and innovation. It enables citizens to know and exercise their rights, and it brings benefits across society: from transport, to education and health. There has been a welcome increase in support for open data from governments in the last few years, but this Index reveals that too much valuable information is still unavailable.

The UK and US are leaders on open government data, but even they have room for improvement: the US, for example, does not provide a single consolidated and open register of corporations, while the UK Electoral Commission lets down the UK’s good overall performance by not allowing open reuse of UK election data.
There is a very disappointing degree of openness of company registers across the board: only 5 out of the 20 leading countries have even basic information available via a truly open licence, and only 10 allow any form of bulk download. This information is critical for a range of reasons – including tackling tax evasion and other forms of financial crime and corruption.
Less than half of the key datasets in the top 20 countries are available to re-use as open data, showing that even the leading countries do not fully understand the importance of citizens and businesses being able to legally and technically use, reuse and redistribute data, which enables them to build and share commercial and non-commercial services.
To see the full results: https://index.okfn.org. For graphs of the data: https://index.okfn.org/visualisations.”
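The Index arrives at these rankings by scoring a set of key datasets per country against openness criteria such as open licensing, machine readability and bulk availability. The sketch below illustrates how such a weighted checklist might be aggregated into a country score; the criteria names and weights here are illustrative assumptions, not the Index’s published methodology.

```python
# Illustrative sketch of a weighted openness checklist, loosely modelled on
# the Open Data Index approach. Criteria names and weights are hypothetical.
CRITERIA_WEIGHTS = {
    "exists": 5,
    "machine_readable": 15,
    "available_in_bulk": 10,
    "openly_licensed": 30,
    "free_of_charge": 15,
    "online": 5,
    "up_to_date": 10,
}

def dataset_score(answers: dict[str, bool]) -> float:
    """Score one dataset: sum of weights for the criteria it satisfies."""
    return sum(w for c, w in CRITERIA_WEIGHTS.items() if answers.get(c, False))

def country_score(datasets: dict[str, dict[str, bool]]) -> float:
    """Average the per-dataset scores across a country's surveyed datasets."""
    return sum(dataset_score(a) for a in datasets.values()) / len(datasets)

# Example: a hypothetical country surveyed on two key datasets.
country = {
    "election_results": {"exists": True, "machine_readable": True,
                         "openly_licensed": False, "free_of_charge": True},
    "company_register": {"exists": True, "machine_readable": True,
                         "available_in_bulk": True, "openly_licensed": True},
}
print(f"Country score: {country_score(country):.1f}")
```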

The Decline of Wikipedia


Tom Simonite in MIT Technology Review: “The sixth most widely used website in the world is not run anything like the others in the top 10. It is not operated by a sophisticated corporation but by a leaderless collection of volunteers who generally work under pseudonyms and habitually bicker with each other. It rarely tries new things in the hope of luring visitors; in fact, it has changed little in a decade. And yet every month 10 billion pages are viewed on the English version of Wikipedia alone. When a major news event takes place, such as the Boston Marathon bombings, complex, widely sourced entries spring up within hours and evolve by the minute. Because there is no other free information source like it, many online services rely on Wikipedia. Look something up on Google or ask Siri a question on your iPhone, and you’ll often get back tidbits of information pulled from the encyclopedia and delivered as straight-up facts.
Yet Wikipedia and its stated ambition to “compile the sum of all human knowledge” are in trouble. The volunteer workforce that built the project’s flagship, the English-language Wikipedia—and must defend it against vandalism, hoaxes, and manipulation—has shrunk by more than a third since 2007 and is still shrinking. The participants who remain seem incapable of fixing the flaws that keep Wikipedia from becoming a high-quality encyclopedia by any standard, including the project’s own. Among the significant problems that aren’t getting resolved is the site’s skewed coverage: its entries on Pokémon and female porn stars are comprehensive, but its pages on female novelists or places in sub-Saharan Africa are sketchy. Authoritative entries remain elusive. Of the 1,000 articles that the project’s own volunteers have tagged as forming the core of a good encyclopedia, most don’t earn even Wikipedia’s own middle-ranking quality scores.
The main source of those problems is not mysterious….”

Talking About a (Data) Revolution


Dave Banisar at Article 19: “It is important to recognize the utility that data can bring. Data can ease analysis, reveal important patterns and facilitate comparisons. For example, the Transactional Records Access Clearinghouse (TRAC – http://www.trac.org) at Syracuse University uses data sets from the US Department of Justice to analyze how the federal government enforces its criminal and civil laws, showing how laws are applied differently across the US.
The excitement over “E-government” in the late 1990s (manufactured in part by ICT companies) imagined a brave new e-world where governments would quickly and easily provide needed information and services to their citizens. This was presented as an alternative to the “reactive” and “confrontational” right to information laws, but it eventually led to the realization that ministerial web pages and the ability to pay tickets online did not amount to open government. Singapore ranks near the top every year on e-government but is clearly not an ‘open government’. Similarly, it is important to recognize that governments providing data through voluntary measures is not enough.
For open data to promote open government, it needs to operate within a framework of law and regulation that ensures that information is collected, organized and stored, and then made public in a timely, accurate and useful form. The information must be more than just what government bodies find useful to release; it must include what is important for the public to know to ensure that those bodies are accountable.
Otherwise, it is in danger of just being propaganda, subject to manipulation to make government bodies look good. TRAC has had to sue the US federal government dozens of times under the Freedom of Information Act to obtain government data, and after TRAC publishes it, some government bodies still claim that the information is incorrect. Voluntary systems of publication tend to fail whenever publication would embarrass the bodies doing the publishing.
In the countries where open data has been most successful, such as the US and the UK, there also exists a legal right to demand information, which keeps bodies honest. Most open government laws around the world now include requirements for affirmative publication of key information, and they are slowly being amended to include open data requirements to ensure that the information is more easily usable.
Where open government laws are weak or absent, many barriers can obstruct open data. In Kenya, which has been championing its open data portal while being slow to adopt a law on freedom of information, a recent review found that the portal was stagnating. In part, the problem was that in the absence of laws mandating openness, there remains a culture of secrecy and fear of releasing information.
Further, mere access to data is not enough to ensure informed participation by citizens and enable them to affect decision-making processes. Legal rights to all information held by governments – right to information laws – are essential to tell the “why”. RTI reveals how and why decisions and policy are made – secret meetings, questionable contracts, dubious emails and other information. These are essential elements for oversight and accountability. Being able to document that a road was built for political reasons is as crucial for change as recognizing that it’s in the wrong place. The TRAC users, mostly journalists, use the system as a starting point to ask questions about why enforcement is so uneven or why taxes are not being collected. They need sources and open government laws to ask these questions.
Of course, even open government laws are not enough. There need to be strong rights for citizen consultation and participation, and the ability to enforce those rights, such as is mandated by the UNECE Convention on Access to Information, Public Participation in Decision-making and Access to Justice in Environmental Matters (the Aarhus Convention). A protocol to that convention has led to a Europe-wide data portal on environmental pollution.
For open data to be truly effective, there needs to be a right to information enshrined in law that requires that information is made available in a timely, reliable format that people want, not just what the government body wants to release. And it needs to be backed up with rights of engagement and participation. From this, open data can flourish. The OGP needs to refocus on the building blocks of open government – good law and policy – and not just the flashy apps.”

Crowdsourcing Mobile App Takes the Globe’s Economic Pulse


Tom Simonite in MIT Technology Review: “In early September, news outlets reported that the price of onions in India had suddenly spiked nearly 300 percent over prices a year before. Analysts warned that the jump in price for this food staple could signal an impending economic crisis, and the Reserve Bank of India quickly raised interest rates.
A startup company called Premise might’ve helped make the response to India’s onion crisis timelier. As part of a novel approach to tracking the global economy from the bottom up, the company has a daily feed of onion prices from stores around India. More than 700 people in cities around the globe use a mobile app to log the prices of key products in local stores each day.

Premise’s cofounder David Soloff says it’s a valuable way to take the pulse of economies around the world, especially since stores frequently update their prices in response to economic pressures such as wholesale costs and consumer confidence. “All this information is hiding in plain sight on store shelves,” he says, “but there’s no way of capturing and aggregating it in any meaningful way.”
That information could provide a quick way to track and even predict inflation measures such as the U.S. Consumer Price Index. Inflation figures influence the financial industry and are used to set governments’ monetary and fiscal policy, but they are typically updated only once a month. Soloff says Premise’s analyses have shown that for some economies, the data the company collects can reliably predict monthly inflation figures four to six weeks in advance. “You don’t look at the weather forecast once a month,” he says….
Premise’s data may have other uses outside the financial industry. As part of a United Nations program called Global Pulse, economist Alberto Cavallo and PriceStats, which was founded after financial professionals began relying on data from an ongoing academic price-indexing effort called the Billion Prices Project, devised bread price indexes for several Latin American countries. Such indexes typically predict street prices and help governments and NGOs spot emerging food crises. Premise’s data could be used in the same way. The information could also be used to monitor areas of the world, such as Africa, where tracking online prices is unreliable, he says.”
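Turning crowd-logged store prices into an indicator that can anticipate official inflation figures is, at its core, an index-number calculation. The sketch below shows one minimal approach, a daily geometric mean of price relatives against a base day, broadly in the spirit of academic efforts like the Billion Prices Project; the data schema and figures are made up, and this is not Premise’s actual pipeline.

```python
import math
from collections import defaultdict

# Minimal sketch: turn crowd-logged (day, product, price) observations into
# a daily price index via the geometric mean of price relatives vs. a base
# day. Data and method are illustrative, not Premise's actual pipeline.
observations = [
    ("2013-09-01", "onions_1kg", 40.0),
    ("2013-09-01", "rice_1kg", 55.0),
    ("2013-09-02", "onions_1kg", 52.0),
    ("2013-09-02", "rice_1kg", 56.0),
]

# Average multiple store reports for the same product on the same day.
daily = defaultdict(lambda: defaultdict(list))
for day, product, price in observations:
    daily[day][product].append(price)
avg = {day: {p: sum(v) / len(v) for p, v in prods.items()}
       for day, prods in daily.items()}

base_day = min(avg)  # earliest day serves as the base period (index = 100)
for day in sorted(avg):
    relatives = [avg[day][p] / avg[base_day][p]
                 for p in avg[day] if p in avg[base_day]]
    index = 100 * math.exp(sum(math.log(r) for r in relatives) / len(relatives))
    print(day, round(index, 1))
```

Because each day’s index needs only that day’s observations and the base day’s, such a feed can be updated daily, which is what lets store-shelf data run weeks ahead of monthly official statistics.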

Where in the World are Young People Using the Internet?


Georgia Tech: “According to a common myth, today’s young people are all glued to the Internet. But in fact, only 30 percent of the world’s youth population between the ages of 15 and 24 has been active online for at least five years. In South Korea, 99.6 percent of young people are active, the highest percentage in the world. The least? The Asian island nation of Timor-Leste, with less than 1 percent.

[Figure: Digital natives as a percentage of total population, 2012 (Courtesy: ITU)]

Those are among the many findings in a study from the Georgia Institute of Technology and the International Telecommunication Union (ITU). The study is the first attempt to measure, by country, the world’s “digital natives.” The term is typically used to describe young people who were born around the time the personal computer was introduced and have spent their lives connected with technology.
Nearly 96 percent of American millennials are digital natives. That figure is behind Japan (99.5 percent) and several European countries, including Finland, Denmark and the Netherlands.
But the figure that Georgia Tech Associate Professor Michael Best considers most important is the number of digital natives as a share of a country’s total population….
The countries with the highest proportion of digital natives among their population are mostly rich nations, which have high levels of overall Internet penetration. Iceland is at the top of the list with 13.9 percent. The United States is sixth (13.1 percent). A big surprise is Malaysia, a middle-income country with one of the highest proportions of digital natives (ranked 4th at 13.4 percent). Malaysia has a strong history of investing in educational technology.
The countries with the smallest estimated proportion of digital natives are Timor-Leste, Myanmar and Sierra Leone. The bottom 10 consists entirely of African or Asian nations, many of which are suffering from conflict and/or have very low Internet availability.”
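The two statistics in this excerpt are easy to conflate: the share of youth who are digital natives (30 percent worldwide, 99.6 percent in South Korea) and digital natives as a share of total population (13.9 percent in Iceland). A small worked calculation under the ITU definition, with hypothetical population figures, shows how the same group yields both numbers.

```python
# Worked example of the two metrics in the study. Under the ITU definition
# (digital natives = 15-24-year-olds online for 5+ years), the same group
# yields a youth share and a total-population share. Figures are made up.
population = 10_000_000          # total population (hypothetical)
youth = 1_500_000                # residents aged 15-24 (hypothetical)
digital_natives = 1_390_000      # youth online for 5+ years (hypothetical)

share_of_youth = 100 * digital_natives / youth
share_of_population = 100 * digital_natives / population

print(f"Digital natives as % of youth:            {share_of_youth:.1f}%")
print(f"Digital natives as % of total population: {share_of_population:.1f}%")
```

The population-share metric Best highlights rewards countries with both high connectivity and a young age structure, which is why a middle-income country like Malaysia can outrank richer nations.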

The Solution Revolution


New book by William D. Eggers and Paul Macmillan from Deloitte: “Where tough societal problems persist, citizens, social enterprises, and yes, even businesses, are relying less and less on government-only solutions. More likely, they are crowdfunding, ride-sharing, app-developing or impact-investing to design lightweight solutions for seemingly intractable problems. No challenge is too daunting, from malaria in Africa to traffic congestion in California.
These wavemakers range from edgy social enterprises growing at a clip of 15% a year, to mega-foundations that are eclipsing development aid, to Fortune 500 companies delivering social good on the path to profit. The collective force of these new problem solvers is creating dynamic and rapidly evolving markets for social good. They trade solutions instead of dollars to fill the gap between what government can provide and what citizens need. By erasing public-private sector boundaries, they are unlocking trillions of dollars in social benefit and commercial value.
The Solution Revolution explores how public and private are converging to form the Solution Economy. By examining scores of examples, Eggers and Macmillan reveal the fundamentals of this new – globally prevalent – economic and social order. The book is designed to help guide those willing to invest time, knowledge or capital toward sustainable, social progress.”

Using Participatory Crowdsourcing in South Africa to Create a Safer Living Environment


New Paper by Bhaveer Bhana, Stephen Flowerday, and Aharon Satt in the International Journal of Distributed Sensor Networks: “The increase in urbanisation is making the management of city resources a difficult task. Data collected through observations (utilising humans as sensors) of the city surroundings can be used to improve decision making in terms of managing these resources. However, the data collected must be of a certain quality in order to ensure that effective and efficient decisions are made. This study is focused on the improvement of emergency and non-emergency services (city resources) through the use of participatory crowdsourcing (humans as sensors) as a data collection method (to collect public safety data), utilising voice technology in the form of an interactive voice response (IVR) system.
The study illustrates how participatory crowdsourcing (specifically humans as sensors) can be used as a Smart City initiative focusing on public safety by illustrating what is required to contribute to the Smart City, and developing a roadmap in the form of a model to assist decision making when selecting an optimal crowdsourcing initiative. Public safety data quality criteria were developed to assess and identify the problems affecting data quality.
This study is guided by design science methodology and applies three driving theories: the Data Information Knowledge Action Result (DIKAR) model, the characteristics of a Smart City, and a credible Data Quality Framework. Four critical success factors were developed to ensure that high-quality public safety data is collected through participatory crowdsourcing utilising voice technologies.”
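The paper defines its own quality criteria and model; as a rough illustration of what screening incoming crowdsourced reports against simple quality criteria could look like, the sketch below checks completeness, category validity, location plausibility, and timeliness. The criteria, thresholds, and bounding box are assumptions for illustration, not the authors’ framework.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Rough sketch of screening crowdsourced public-safety reports against
# simple quality criteria. Criteria and thresholds are assumptions for
# illustration; the paper's actual framework defines its own criteria.

@dataclass
class Report:
    timestamp: datetime
    latitude: float
    longitude: float
    category: str          # e.g. "fire", "crime", "water_outage"
    description: str

VALID_CATEGORIES = {"fire", "crime", "water_outage", "medical"}

def quality_checks(report: Report, now: datetime) -> dict[str, bool]:
    """Return pass/fail per criterion for one incoming report."""
    return {
        "complete": bool(report.description.strip()),
        "valid_category": report.category in VALID_CATEGORIES,
        # Rough bounding box for South Africa (assumption).
        "in_service_area": -35.0 < report.latitude < -22.0
                           and 16.0 < report.longitude < 33.0,
        "timely": now - report.timestamp < timedelta(hours=24),
    }

def accept(report: Report, now: datetime) -> bool:
    """Accept a report only if it passes every quality check."""
    return all(quality_checks(report, now).values())

# Example with a hypothetical report logged near Cape Town:
r = Report(datetime(2013, 11, 1, 9, 30), -33.9, 18.4, "fire", "Smoke near depot")
print(accept(r, now=datetime(2013, 11, 1, 12, 0)))  # True
```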

Mobile phone data are a treasure-trove for development


Paul van der Boor and Amy Wesolowski in SciDevNet: “Each of us generates streams of digital information — a digital ‘exhaust trail’ that provides real-time information to guide decisions that affect our lives. For example, Google informs us about traffic by using both its ‘My Location’ feature on mobile phones and third-party databases to aggregate location data. BBVA, one of Spain’s largest banks, analyses transactions such as credit card payments as well as ATM withdrawals to find out when and where peak spending occurs. This type of data harvest is of great value. But, often, there is so much data that its owners lack the know-how to process it and fail to realise its potential value to policymakers.
Meanwhile, many countries, particularly in the developing world, have a dearth of information. In resource-poor nations, the public sector often lives in an analogue world where piles of paper impede operations and policymakers are hindered by uncertainty about their own strengths and capabilities. Nonetheless, mobile phones have quickly pervaded the lives of even the poorest: 75 per cent of the world’s 5.5 billion mobile subscriptions are in emerging markets. These people are also generating digital trails of anything from their movements to mobile phone top-up patterns. It may seem that putting this information to use would take vast analytical capacity. But using relatively simple methods, researchers can analyse existing mobile phone data, especially in poor countries, to improve decision-making.
Think of existing, available data as low-hanging fruit that we — two graduate students — could analyse in less than a month. This is not a test of data-scientist prowess, but more a way of saying that anyone could do it.
There are three areas that should be ‘low-hanging fruit’ in terms of their potential to dramatically improve decision-making in information-poor countries: coupling healthcare data with mobile phone data to predict disease outbreaks; using mobile phone money transactions and top-up data to assess economic growth; and predicting travel patterns after a natural disaster using historical movement patterns from mobile phone data to design robust response programmes.
Another possibility is using call-data records to analyse urban movement to identify traffic congestion points. Nationally, this can be used to prioritise infrastructure projects such as road expansion and bridge building.
The information that these analyses could provide would be lifesaving — not just informative or revenue-increasing, as is much of this kind of work currently performed in developed countries.
But some work of high social value is being done. For example, different teams of European and US researchers are trying to estimate the links between mobile phone use and regional economic development. They are using various techniques, such as merging night-time satellite imagery from NASA with mobile phone data to create behavioural fingerprints. They have found that this may be a cost-effective way to understand a country’s economic activity and, potentially, guide government spending.
Another example is given by researchers (including one of this article’s authors) who have analysed call-data records from subscribers in Kenya to understand malaria transmission within the country and design better strategies for its elimination. [1]
In this study, published in Science, the location data of the mobile phones of more than 14 million Kenyan subscribers was combined with national malaria prevalence data. After identifying the sources and sinks of malaria parasites and overlaying these with phone movements, analysis was used to identify likely transmission corridors. UK scientists later used similar methods to create different epidemic scenarios for Côte d’Ivoire.”
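A first step in studies like the Kenya analysis is reducing raw call-data records to aggregate origin-destination flows, which can then be weighted by disease prevalence to locate likely parasite sources and sinks. The sketch below shows that aggregation step under a made-up CDR schema; it is not the study’s actual data or code.

```python
from collections import Counter

# Minimal sketch: aggregate call-data records (CDRs) into origin-destination
# travel flows. Schema and data are made up for illustration.
# Each CDR: (subscriber_id, timestamp, cell_tower_region)
cdrs = [
    ("A", 1, "Nairobi"), ("A", 2, "Nairobi"), ("A", 3, "Kisumu"),
    ("B", 1, "Mombasa"), ("B", 2, "Nairobi"),
    ("C", 1, "Kisumu"),  ("C", 2, "Kisumu"),
]

# Walk each subscriber's records in time order and count region-to-region moves.
flows = Counter()
last_region = {}
for sub, t, region in sorted(cdrs, key=lambda r: (r[0], r[1])):
    prev = last_region.get(sub)
    if prev is not None and prev != region:
        flows[(prev, region)] += 1
    last_region[sub] = region

for (src, dst), n in flows.most_common():
    print(f"{src} -> {dst}: {n} trips")

# Weighting each flow by malaria prevalence at its source region would then
# highlight likely parasite "sources" and "sinks", as in the Science study.
```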