Growing Data Collection Inspires Openness at NGA


From Secrecy News: “A flood of information from the ongoing proliferation of space-based sensors and ground-based data collection devices is promoting a new era of transparency in at least one corner of the U.S. intelligence community.

The “explosion” of geospatial information “makes geospatial intelligence increasingly transparent because of the huge number and diversity of commercial and open sources of information,” said Robert Cardillo, director of the National Geospatial-Intelligence Agency (NGA), in a speech last month.

Hundreds of small satellites are expected to be launched within the next three years — what Mr. Cardillo called a “darkening of the skies” — and they will provide continuous, commercially available coverage of the entire Earth’s surface.

“The challenges of taking advantage of all of that data are daunting for all of us,” Mr. Cardillo said.

Meanwhile, the emerging “Internet of Things” is “spreading rapidly as more people carry more handheld devices to more places,” generating an abundance of geolocation data.

This is, of course, a matter of intelligence interest since “Every local, regional, and global challenge — violent extremism in the Middle East and Africa, Russian aggression, the rise of China, Iranian and North Korean nuclear weapons, cyber security, energy resources, and many more — has geolocation at its heart.”

Consequently, “We must open up GEOINT far more toward the unclassified world,” Director Cardillo said in another speech last week.

“In the past, we have excelled in our closed system. We enjoyed a monopoly on sources and methods. That monopoly has long since ended. Today and in the future, we must thrive and excel in the open.”

NGA has already distinguished itself in the area of disaster relief, Mr. Cardillo said.

“Consider Team NGA’s response to the Ebola crisis. We are the first intelligence agency to create a World Wide Web site with access to our relevant unclassified content. It is open to everyone — no passwords, no closed groups.”

NGA provided “more than a terabyte of up-to-date commercial imagery.”

“You can imagine how important it is for the Liberian government to have accurate maps of the areas hardest hit by the Ebola epidemic as well as the medical and transportation infrastructure to combat the disease,” Mr. Cardillo said.

But there are caveats. Just because information is unclassified does not mean that it is freely available.

“Although 99 percent of all of our Ebola data is unclassified, most of that is restricted by our agreements [with commercial providers],” Mr. Cardillo said. “We are negotiating with many sources to release more data.”

Last week, Director Cardillo announced a new project called GEOINT Pathfinder that will attempt “to answer key intelligence questions using only unclassified data.”….(More)

Mission Control: A History of the Urban Dashboard


Futuristic control rooms have proliferated in dozens of global cities. Baltimore has its CitiStat Room, where department heads stand at a podium before a wall of screens and account for their units’ performance. The Mayor’s office in London’s City Hall features a 4×3 array of iPads mounted in a wooden panel, which seems an almost parodic, Terry Gilliam-esque take on Rio de Janeiro’s Operations Center. Meanwhile, British Prime Minister David Cameron commissioned an iPad app – the “No. 10 Dashboard” (a reference to his residence at 10 Downing Street) – which gives him access to financial, housing, employment, and public opinion data. As The Guardian reported, “the prime minister said that he could run government remotely from his smartphone.”

This is the age of Dashboard Governance, heralded by gurus like Stephen Few, founder of the “visual business intelligence” and “sensemaking” consultancy Perceptual Edge, who defines the dashboard as a “visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.” A well-designed dashboard, he says — one that makes proper use of bullet graphs, sparklines, and other visualization techniques informed by the “brain science” of aesthetics and cognition — can afford its users not only a perceptual edge, but a performance edge, too. The ideal display offers a big-picture view of what is happening in real time, along with information on historical trends, so that users can divine the how and why and redirect future action. As David Nettleton emphasizes, the dashboard’s utility extends beyond monitoring “the current situation”; it also “allows a manager to … make provisions, and take appropriate actions.”….
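Few’s “at a glance” ideal can be made concrete with a minimal sketch: rendering a metric’s history as a one-line text sparkline. The sample series and metric name below are invented for illustration.

```python
# Minimal sketch of a text sparkline, the kind of compact visualization
# Few recommends for at-a-glance dashboards. The sample data is invented.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Map each value onto one of eight block characters by relative height."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero on flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

monthly_complaints = [120, 132, 101, 90, 84, 95, 110, 87]
print("311 complaints:", sparkline(monthly_complaints))
```

A single line like this conveys both the current level and the historical trend Few asks for, without leaving the dashboard’s one-screen constraint.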

The dashboard market now extends far beyond the corporate world. In 1994, New York City police commissioner William Bratton adapted former officer Jack Maple’s analog crime maps to create the CompStat model of aggregating and mapping crime statistics. Around the same time, the administrators of Charlotte, North Carolina, borrowed a business idea — Robert Kaplan and David Norton’s performance-measurement framework known as the “Balanced Scorecard” — and began tracking performance in five “focus areas” defined by the City Council: housing and neighborhood development, community safety, transportation, economic development, and the environment. Atlanta followed Charlotte’s example in creating its own city dashboard.

In 1999, Baltimore mayor Martin O’Malley, confronting a crippling crime rate and high taxes, designed CitiStat, “an internal process of using metrics to create accountability within his government.” (This rhetoric of data-tested internal “accountability” is prevalent in early dashboard development efforts.) The project turned to face the public in 2003, when Baltimore launched a website of city operational statistics, which inspired DCStat (2005), Maryland’s StateStat (2007), and NYCStat (2008). Since then, myriad other states and metro areas — driven by a “new managerialist” approach to urban governance, committed to “benchmarking” their performance against other regions, and obligated to demonstrate compliance with sustainability agendas — have developed their own dashboards.

The Open Michigan Mi Dashboard is typical of these efforts. The state website presents data on education, health and wellness, infrastructure, “talent” (employment, innovation), public safety, energy and environment, financial health, and seniors. You (or “Mi”) can monitor the state’s performance through a side-by-side comparison of “prior” and “current” data, punctuated with a thumbs-up or thumbs-down icon indicating the state’s “progress” on each metric. Another click reveals a graph of annual trends and a citation for the data source, but little detail about how the data are actually derived. How the public is supposed to use this information is an open question….(More)”
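The prior/current comparison with a thumbs-up or thumbs-down icon reduces to a simple rule: decide whether the change counts as progress given the metric’s direction. The sketch below illustrates that logic; the metric names, figures, and higher-is-better flags are invented, not Michigan’s actual data model.

```python
# Sketch of the prior/current "progress" logic a dashboard like
# Mi Dashboard might use. All metrics and values are hypothetical.
def progress_icon(prior, current, higher_is_better=True):
    """Return a thumbs-up/down indicator for a metric's change."""
    improved = current > prior if higher_is_better else current < prior
    return "thumbs-up" if improved else "thumbs-down"

metrics = {
    "high-school graduation rate (%)": (76.2, 78.1, True),
    "violent crimes per 100k residents": (450, 430, False),
}

for name, (prior, current, up_good) in metrics.items():
    icon = progress_icon(prior, current, up_good)
    print(f"{name}: {prior} -> {current} [{icon}]")
```

Note how much the icon hides: the rule says nothing about significance, data quality, or how the underlying figure was derived — exactly the opacity the passage above points out.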

Why governments need guinea pigs for policies


Jonathan Breckon in the Guardian: “People are unlikely to react positively to the idea of using citizens as guinea pigs; many will be downright disgusted. But there are times when government must experiment on us in the search for knowledge and better policy….

Though history calls into question the ethics of experimentation, unless we try things out, we will never learn. The National Audit Office says that £66bn worth of government projects have no plans to evaluate their impact. It is unethical to roll out policies in this arbitrary way. We have to experiment on a small scale to have a better understanding of how things work before rolling out policies across the UK. This is just as relevant to social policy as it is to science and medicine, as set out in a new report by the Alliance for Useful Evidence.

Whether it’s the best ways to teach our kids to read, designing programmes to get unemployed people back to work, or encouraging organ donation – if the old ways don’t work, we have to test new ones. And that testing can’t always be done by a committee in Whitehall or in a university lab.

Experimentation can’t happen in isolation. What works in Lewisham or Londonderry might not work in Lincoln – or indeed across the UK. For instance, there is a huge amount of debate around the current practice of teaching children to read and spell using phonics, which was based on a small-scale study in Clackmannanshire, as well as evidence from the US. A government-commissioned review on the evidence for phonics led Professor Carole Torgerson, then at York University, to warn against making national policy off the back of just one small Scottish trial.

One way round this problem is to do larger experiments. The increasing use of the internet in public services allows for more and faster experimentation, on a larger scale for lower cost – the randomised controlled trial on voter mobilisation that went to 61 million users in the 2010 US midterm elections, for example. However, the use of the internet doesn’t get us off the ethical hook. Facebook had to apologise after a global backlash to secret psychological tests on 689,000 of its users.
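The core of a randomised controlled trial like the voter-mobilisation study is random assignment followed by a comparison of group means. The toy simulation below sketches that logic; the base rate and effect size are invented, not figures from the actual study.

```python
import random

random.seed(0)

# Toy sketch of a randomised controlled trial: randomly assign units to
# treatment or control, then estimate the effect as the difference in
# mean outcomes. The 30% base rate and 5-point lift are invented.
def run_trial(population, treat_effect=0.05, base_rate=0.30):
    treated, control = [], []
    for _ in range(population):
        group = treated if random.random() < 0.5 else control
        rate = base_rate + (treat_effect if group is treated else 0.0)
        group.append(1 if random.random() < rate else 0)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(treated) - mean(control)  # estimated treatment effect

print(f"estimated lift in turnout: {run_trial(100_000):.3f}")
```

Randomisation is what licenses the causal reading of the difference: because assignment is independent of everything else about a participant, the two groups differ systematically only in the treatment.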

Contentious experiments should be approved by ethics committees – normal practice for trials in hospitals and universities.

We are also not interested in freewheeling trial-and-error; robust and appropriate research techniques to learn from experiments are vital. It’s best to see experimentation as a continuum, ranging from the messiness of attempts to try something new to experiments using the best available social science, such as randomised controlled trials.

Experimental government means avoiding an approach where everything is fixed from the outset. What we need is “a spirit of experimentation, unburdened by promises of success”, as recommended by the late Professor Roger Jowell, author of the 2003 Cabinet Office report, Trying it out [pdf]….(More)”

UNESCO demonstrates global impact through new transparency portal


“Opendata.UNESCO.org is intended to present comprehensive, quality and timely information about UNESCO’s projects, enabling users to find information by country/region, funding source, and sector and providing comprehensive project data, including budget, expenditure, completion status, implementing organization, project documents, and more. It publishes programme and financial information in line with the IATI (International Aid Transparency Initiative) standard and other relevant transparency initiatives. UNESCO is now one of more than 230 organizations that have published to the IATI Registry, which brings together donor and developing countries, civil society organizations and other experts in aid information who are committed to working together to increase the transparency of aid.
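IATI data is published as XML activity records that any consumer can parse. The sketch below reads a minimal record loosely modeled on the IATI activity schema; the identifier, title, and budget figure are invented for illustration, not actual UNESCO data.

```python
import xml.etree.ElementTree as ET

# A minimal activity record loosely modeled on the IATI XML schema that
# portals such as opendata.unesco.org publish to. All values are invented.
SAMPLE = """
<iati-activities version="2.01">
  <iati-activity>
    <iati-identifier>XM-EX-00001</iati-identifier>
    <title><narrative>Example literacy programme</narrative></title>
    <budget><value currency="USD">19500000</value></budget>
  </iati-activity>
</iati-activities>
"""

root = ET.fromstring(SAMPLE)
for activity in root.findall("iati-activity"):
    ident = activity.findtext("iati-identifier")
    title = activity.findtext("title/narrative")
    budget = float(activity.findtext("budget/value"))
    print(f"{ident}: {title} (budget ${budget:,.0f})")
```

Because every IATI publisher emits the same element structure, a script like this works across donors, which is precisely the interoperability the standard is meant to buy.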

Since its creation 70 years ago, UNESCO has tirelessly championed the causes of education, culture, natural sciences, social and human sciences, communication and information, globally. For instance, the programme for the Enhancement of Literacy in Afghanistan (ELA), started in March 2010, benefited from a $19.5 million contribution by Japan. It aimed to improve the level of literacy, numeracy and vocational skills of the adult population in 70 districts of 15 provinces of Afghanistan. Over the next three years, until April 2013, the ELA programme helped some 360,000 adult learners achieve general literacy competency. An interactive map allows for an easy identification of UNESCO’s high-impact programmes, and up-to-date information on current and future aid allocations within and across countries.

Public participation and interactivity are key to the success of any open data project. http://Opendata.UNESCO.org will evolve as Member States and partners get involved, by displaying data on their own websites and sharing data among different networks, building and sharing applications, providing feedback, comments, and recommendations. …(More)”

How to Fight the Next Epidemic


Bill Gates in the New York Times: “The Ebola Crisis Was Terrible. But Next Time Could Be Much Worse….Much of the public discussion about the world’s response to Ebola has focused on whether the World Health Organization, the Centers for Disease Control and Prevention and other groups could have responded more effectively. These are worthwhile questions, but they miss the larger point. The problem isn’t so much that the system didn’t work well enough. The problem is that we hardly have a system at all.

To begin with, most poor countries, where a natural epidemic is most likely to start, have no systematic disease surveillance in place. Even once the Ebola crisis was recognized last year, there were no resources to effectively map where cases occurred, or to use people’s travel patterns to predict where the disease might go next….

Data is another crucial problem. During the Ebola epidemic, the database that tracks cases has not always been accurate. This is partly because the situation is so chaotic, but also because much of the case reporting has been done on paper and then sent to a central location for data entry….

I believe that we can solve this problem, just as we’ve solved many others — with ingenuity and innovation.

We need a global warning and response system for outbreaks. It would start with strengthening poor countries’ health systems. For example, when you build a clinic to deliver primary health care, you’re also creating part of the infrastructure for fighting epidemics. Trained health care workers not only deliver vaccines; they can also monitor disease patterns, serving as part of the early warning systems that will alert the world to potential outbreaks. Some of the personnel who were in Nigeria to fight polio were redeployed to work on Ebola — and that country was able to contain the disease very quickly.

We also need to invest in disease surveillance. We need a case database that is instantly accessible to the relevant organizations, with rules requiring countries to share their information. We need lists of trained personnel, from local leaders to global experts, prepared to deal with an epidemic immediately. … (More)”

31 cities agree to use EU-funded open innovation platform for better smart cities’ services


European Commission Press Release: “At CEBIT, 25 cities from 6 EU countries (Belgium, Denmark, Finland, Italy, Portugal and Spain) and 6 cities from Brazil will present Open & Agile Smart Cities Task Force (OASC), an initiative making it easier for city councils and startups to improve smart city services (such as transport, energy efficiency, environmental or e-health services). This will be achieved thanks to FIWARE, an EU-funded, open source platform and cloud-based building blocks developed in the EU that can be used to develop a huge range of applications, from Smart Cities to eHealth, and from transport to disaster management. Many applications have already been built using FIWARE – from warnings of earthquakes to preventing food waste to Smartaxi apps. Find a full list of cities in the Background.

The OASC deal will allow cities to share their open data (collected from sensors measuring, for example, traffic flows) so that startups can develop apps and tools that benefit all citizens (for example, an app with traffic information for people on the move). Moreover, these systems will be shared between cities (so, an app with transport information developed in city A can be also adopted by city B, without the latter having to develop it from scratch); FIWARE will also give startups and app developers in these cities access to a global market for smart city services.
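The portability the press release describes (an app built in city A working in city B) depends on cities publishing sensor data in a shared entity format. The sketch below builds a context entity of the NGSI-style shape a FIWARE context broker works with; the entity id, type, and attribute names are illustrative assumptions, not a normative schema.

```python
import json

# Sketch of the kind of context entity a city might publish through a
# FIWARE context broker (NGSI-style attributes). The id, type, and
# attribute names below are illustrative assumptions.
def traffic_entity(sensor_id, vehicles_per_min):
    return {
        "id": f"urn:ngsi:TrafficSensor:{sensor_id}",
        "type": "TrafficSensor",
        "intensity": {"type": "Number", "value": vehicles_per_min},
    }

payload = traffic_entity("A42", 17)
print(json.dumps(payload, indent=2))
# An app in another city can consume the same entity shape unchanged.
```

Agreeing on the entity shape, rather than on any particular vendor’s database, is what lets the same traffic app run against either city’s broker and avoids the vendor lock-in the initiative targets.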

Cities from across the globe are trying to make the most of open innovation. This will allow them to include a variety of stakeholders in their activities (services are increasingly connected to other systems and innovative startups are a big part of this trend) and encourage a competitive yet attractive market for developers, thus reducing costs, increasing quality and avoiding vendor lock-in….(More)”

Data democracy – increased supply of geospatial information and expanded participatory processes in the production of data


Paper by Max Craglia & Lea Shanley: “The global landscape in the supply, co-creation and use of geospatial data is changing very rapidly with new satellites, sensors and mobile devices reconfiguring the traditional lines of demand and supply and the number of actors involved. In this paper we chart some of these technology-led developments and then focus on the opportunities they have created for the increased participation of the public in generating and contributing information for a wide range of uses, scientific and otherwise. Not all this information is open or geospatial, but sufficiently large portions of it are to make it one of the most significant phenomena of the last decade. In fact, we argue that while satellites and sensors have exponentially increased the volumes of geospatial information available, the participation of the public is transformative because it expands the range of participants and stakeholders in society using and producing geospatial information, with opportunities for more direct participation in science, politics and social action…(View full text)”

Study to examine Australian businesses’ use of government data


ComputerWorld: “New York University’s GovLab and the federal Department of Communications have embarked on a study of how Australian organisations are employing government data sets.

The ‘Open Data 500’ study was launched today at the Locate15 conference. It aims to provide a basis for assessing the value of open data and encourage the development of new businesses based on open data, as well as encourage discussion about how to make government data more useful to businesses and not-for-profit organisations.

The study is part of a series of studies taking place under the auspices of the OD500 Global Network.

“This study will help ensure the focus of Government is on the publication of high value datasets, with an emphasis on quality rather than quantity,” a statement issued by the Department of Communications said.

“Open Data 500 advances the government’s policy of increasing the number of high value public datasets in Australia in an effort to drive productivity and innovation, as well as its commitment to greater consultation with private sector stakeholders on open data,” Communications Minister Malcolm Turnbull said in remarks prepared for the Locate 15 conference….(More)”

Budgets for the People


Collective Intelligence or Group Think?


Paper analyzing “Engaging Participation Patterns in World Without Oil” by Nassim JafariNaimi and Eric M. Meyers: “This article presents an analysis of participation patterns in an Alternate Reality Game, World Without Oil. This game aims to bring people together in an online environment to reflect on how an oil crisis might affect their lives and communities as a way to both counter such a crisis and to build collective intelligence about responding to it. We present a series of participation profiles based on a quantitative analysis of 1554 contributions to the game narrative made by 322 players. We further qualitatively analyze a sample of these contributions. We outline the dominant themes, the majority of which engage the global oil crisis for its effects on commute options and present micro-sustainability solutions in response. We further draw on the quantitative and qualitative analysis of this space to discuss how the design of the game, specifically its framing of the problem, feedback mechanism, and absence of subject-matter expertise, counter its aim of generating collective intelligence, making it conducive to groupthink….(More)”
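The paper’s participation profiles rest on a simple quantitative step: counting contributions per player and asking how concentrated activity is. The sketch below illustrates that step on a simulated contribution log; the log is invented, standing in for the paper’s 1554 real contributions by 322 players.

```python
import random
from collections import Counter

random.seed(1)

# Toy sketch of participation profiling: count contributions per player
# and measure how concentrated the activity is. The simulated log below
# is invented; the paper analyzed 1554 real contributions by 322 players.
contributions = [f"player{random.randint(1, 322)}" for _ in range(1554)]
counts = Counter(contributions)

top10 = sum(n for _, n in counts.most_common(10))
print(f"players: {len(counts)}, contributions: {len(contributions)}")
print(f"share by 10 most active players: {top10 / len(contributions):.1%}")
```

A heavily skewed distribution, where a handful of players dominate the narrative, is one of the mechanisms by which a game designed for collective intelligence can drift toward groupthink.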