Helping Smart Cities Harness Big Data


Linda Poon in CityLab: “Harnessing the power of open data is key to developing the smart cities of the future. But not all governments have the capacity—be that funding or human capital—to collect all the necessary information and turn it into a tool. That’s where Mapbox comes in.

Mapbox offers open-source mapping platforms, and is no stranger to turning complex data into visualizations cities can use, whether it’s mapping traffic fatalities in the U.S. or the conditions of streets in Washington, D.C., during last year’s East Coast blizzard. As part of the White House Smart Cities Initiative, which announced this week that it would make more than $80 million in tech investments this year, the company is rolling out Mapbox Cities, a new “mentorship” program that, for now, will give three cities the tools and support they need to solve some of their most pressing urban challenges. It issued a call for applications earlier this week, and responses have poured in from across the globe, says Christina Franken, who specializes in smart cities at Mapbox.

“It’s very much an experimental approach to working with cities,” she says. “A lot of cities have open-data platforms but they don’t really do something with the data. So we’re trying to bridge that gap.”

During Hurricane Sandy, Mapbox launched a tool to help New Yorkers figure out if they were in an evacuation zone. (Mapbox)

But the company isn’t approaching the project blindly. In a way, Mapbox has the necessary experience to help cities jumpstart their own projects. Its résumé includes, for example, a map that visualizes the sheer quantity of traffic fatalities along any commuting route in the U.S., showcasing its ability to turn a whopping five years’ worth of data into a public-safety tool. During 2012’s Hurricane Sandy, it created a disaster-relief tool to help New Yorkers find shelter.

And that’s just in the United States. Mapbox also recently started a group focused primarily on humanitarian issues, bringing its mapping and data-collection tools to aid organizations all over the world in times of crisis. It provides free access to its vast collection of resources, and works closely with collaborators to help them customize maps based on specific needs….(More)”

Bringing together the United States of data


The U.S. Data Federation will support government-wide data standardization and data federation initiatives across both Federal agencies and local governments. This is intended to be a fundamental coordinating mechanism for a more open and interconnected digital government, profiling and supporting use cases that demonstrate unified and coherent data architectures across disparate government agencies. These examples will highlight emerging data standards and API initiatives across all levels of government, convey the level of maturity for each effort, and facilitate greater participation by government agencies. Initiatives that may be profiled within the U.S. Data Federation include Open311, DOT’s National Transit Map, the Project Open Data metadata schema, Contact USA, and the Police Data Initiative. As part of the U.S. Data Federation, GSA will also pilot the development of reusable components needed for a successful data federation strategy, including schema documentation tools, schema validation tools, and automated data aggregation and normalization capabilities. The U.S. Data Federation will provide more sophisticated and seamless opportunities on the foundation of U.S. open data initiatives by allowing the public to more easily do comparative data analysis across government bodies and create applications that work across multiple government agencies….(More)”
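As an illustration of the schema-validation tooling described above, a minimal check of a catalog record against the required fields of the Project Open Data metadata schema might look like the sketch below. The field list follows the POD v1.1 schema; the sample record itself is invented.

```python
# Minimal sketch: check a data.json-style record against the required fields
# of the Project Open Data metadata schema (v1.1). The sample record is invented.
REQUIRED_FIELDS = {
    "title", "description", "keyword", "modified",
    "publisher", "contactPoint", "identifier", "accessLevel",
}

def missing_pod_fields(record: dict) -> set:
    """Return the required POD fields absent from a dataset record."""
    return REQUIRED_FIELDS - record.keys()

record = {
    "title": "National Transit Map Stops",
    "description": "Stop locations reported by participating agencies.",
    "keyword": ["transit", "GTFS"],
    "modified": "2016-09-01",
    "identifier": "dot-ntm-stops",
    "accessLevel": "public",
}

print(sorted(missing_pod_fields(record)))  # → ['contactPoint', 'publisher']
```

A production validator would also check field types and formats (for example, that `modified` is an ISO 8601 date), which is what full JSON Schema tooling provides.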

Privacy and Open Data


A Research Briefing by Alexandra Wood, David O’Brien, and Urs Gasser: “Political leaders and civic advocates are increasingly recommending that open access be the “default state” for much of the information held by government agencies. Over the past several years, they have driven the launch of open data initiatives across hundreds of national, state, and local governments. These initiatives are founded on a presumption of openness for government data and have led to the public release of large quantities of data through a variety of channels. At the same time, much of the data that have been released, or are being considered for release, pertain to the behavior and characteristics of individual citizens, highlighting tensions between open data and privacy. This research briefing offers a snapshot of recent developments in the open data and privacy landscape, outlines an action map of various governance approaches to protecting privacy when releasing open data, and identifies key opportunities for decision-makers seeking to respond to challenges in this space….(More)”

Informed Choice? Motivations and methods of data usage among public officials in India


Report by Rwitwika Bhattacharya and Mohitkumar Daga: “The importance of data in informing the policy-making process is being increasingly realized across the world. With India facing significant developmental challenges, use of data offers an important opportunity to improve the quality of public services. However, lack of formal structures to internalize a data-informed decision-making process impedes the path to robust policy formation. This paper seeks to highlight these challenges through a case study of data dashboard implementation in the state of Andhra Pradesh. The study suggests capacity building, improvement of data collection, and engagement of non-governmental players as measures to address these issues….(More)”

Designing the Next Generation of Open Data Policy


Andrew Young and Stefaan Verhulst at the Open Data Charter Blog: “The international Open Data Charter has emerged from the global open data community as a galvanizing document to place open government data directly in the hands of citizens and organizations. To drive this process forward, and ensure that the outcomes are both systemic and transformational, new open data policy needs to be based on evidence of how and when open data works in practice. To support this work, the GovLab, in collaboration with Omidyar Network, has recently completed research which provides vital evidence of open data projects around the world, including an analysis of 19 in-depth, impact-focused case studies and a key findings paper. All of the research is now available in an eBook published by O’Reilly Media.

The research found that open data is making an impact in four core ways, including:…(More)”

How Technology is Crowd-Sourcing the Fight Against Hunger


Beth Noveck at Media Planet: “There is more than enough food produced to feed everyone alive today. Yet access to nutritious food is a challenge everywhere and depends on getting every citizen involved, not just large organizations. Technology is helping to democratize and distribute the job of tackling the problem of hunger in America and around the world.

Real-time research

One of the hardest problems is the difficulty of gaining real-time insight into food prices and shortages. Enter technology. We no longer have to rely on professional inspectors slowly collecting information face-to-face. The UN World Food Programme, which provides food assistance to 80 million people each year, together with Nielsen is conducting mobile phone surveys in 15 countries (with plans to expand to 30), asking people by voice and text about what they are eating. Formerly blank maps are now filled in with information provided quickly and directly by the most affected people, making it easy to prioritize the allocation of resources.
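The roll-up step behind such surveys can be sketched simply: responses come in tagged by location, and get aggregated into per-region indicators. The sketch below uses invented responses and field names, not the WFP/Nielsen data model.

```python
from collections import defaultdict

# Sketch: aggregate invented mobile-survey responses into average reported
# food prices per region, the kind of roll-up a remote survey feed enables.
# The records and field names are illustrative, not a real survey schema.
responses = [
    {"region": "North", "price": 1.20},
    {"region": "North", "price": 1.40},
    {"region": "South", "price": 0.90},
]

def average_price_by_region(rows):
    """Compute the mean reported price per region, rounded for display."""
    totals = defaultdict(lambda: [0.0, 0])
    for r in rows:
        totals[r["region"]][0] += r["price"]
        totals[r["region"]][1] += 1
    return {region: round(s / n, 2) for region, (s, n) in totals.items()}

print(average_price_by_region(responses))  # → {'North': 1.3, 'South': 0.9}
```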

Technology helps the information flow in both directions, enabling those in need to reach out, but also to become more effective at helping themselves. The Indian Ministry of Agriculture, in collaboration with Reuters Market Light, provides information services in nine Indian languages to 1.4 million registered farmers in 50,000 villages across 17 Indian states via text and voice messages.


Data to the people

New open data laws and policies that encourage more transparent publication of public information complement data collection and dissemination technologies such as phones and tablets. About 70 countries and hundreds of regions and cities have adopted open data policies, which guarantee that the information these public institutions collect be available for free use by the public. As a result, there are millions of open datasets now online on websites such as the Humanitarian Data Exchange, which hosts 4,000 datasets such as country-by-country stats on food prices and undernourishment around the world.
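Portals like the Humanitarian Data Exchange are built on CKAN, whose search API returns JSON of roughly the shape below. This is a sketch of pulling dataset titles out of such a response; the payload here is invented, not a live HDX result.

```python
import json

# Invented sample of a CKAN-style package_search response, the API family
# behind open data portals such as the Humanitarian Data Exchange.
response_text = json.dumps({
    "success": True,
    "result": {
        "count": 2,
        "results": [
            {"name": "wfp-food-prices", "title": "Global Food Prices"},
            {"name": "fao-undernourishment", "title": "Prevalence of Undernourishment"},
        ],
    },
})

def dataset_titles(payload: str) -> list:
    """Extract dataset titles from a CKAN-style package_search JSON payload."""
    data = json.loads(payload)
    if not data.get("success"):
        return []
    return [d["title"] for d in data["result"]["results"]]

print(dataset_titles(response_text))  # → ['Global Food Prices', 'Prevalence of Undernourishment']
```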

Companies are compiling and sharing data to combat food insecurity, too. Anyone can dig into the data on the Global Open Data for Agriculture and Nutrition platform, a data collaborative where 300 private and public partners are sharing information.

Importantly, this vast quantity of open data is available to anyone, not only to governments. As a result, large and small entrepreneurs are able to create new apps and programs to combat food insecurity, such as Plantwise, which uses government data to offer a knowledge bank and run “plant clinics” that help farmers lose less of what they grow to pests. Google uses open government data to show people the location of farmers markets near their homes.

Students, too, can learn to play a role. For the second summer in a row, the Governance Lab at New York University, in partnership with the United States Department of Agriculture (USDA), mounted a two-week open data summer camp for 40 middle and high school students. The next generation of problem solvers is learning new data science skills by working on food safety and other projects using USDA open data.

Enhancing connection

Ultimately, technology enables greater communication and collaboration among the public, social service organizations, restaurants, farmers and other food producers who must work together to avoid food crises. The European Food Safety Authority in Italy has begun exploring how to use internet-based collaboration (often called citizen science or crowdsourcing) to get more people involved in food and feed risk assessment.

In the United States, 40 percent of the food produced here is wasted, and yet 1 in 4 American children (and 1 in 6 adults) remain food insecure, according to the Rockefeller Foundation. Copia, a San Francisco-based smartphone app, facilitates donations and deliveries from those with excess food in six Bay Area cities. Zero Percent in Chicago similarly attacks the distribution problem by connecting restaurants to charities to donate their excess food. Full Harvest is a tech platform that facilitates the selling of surplus produce that otherwise would not have a market.

Mobilizing the world

Prize-backed challenges create the incentives for more people to collaborate online and get involved in the fight against hunger….(More)”

Open Government Implementation Model


KDZ: “The City of Vienna was the first public agency in a German-speaking country to develop an Open Government Initiative and to commit itself to the concept of Open Data – an open and transparent system that makes city data available to citizens for their further use. Vienna’s first Open Data catalogue has been presented to the public.

The KDZ – Centre for Public Administration Research was contracted by the Chief Executive Office of Vienna to contribute to the Open Government strategy of the City of Vienna. In order to bring the insights and propositions gained to the attention of a wider public, the Open Government Implementation Model has been translated into English.

The KDZ Implementation Model is based on, and significantly elaborates, the “Open Government Implementation Model” by Lee/Kwak (2011). …(More)”


Responsible Data in Agriculture


Report by Lindsay Ferris and Zara Rahman for GODAN: “The agriculture sector is creating increasing amounts of data, from many different sources. From tractors equipped with GPS tracking, to open data released by government ministries, data is becoming ever more valuable, as agricultural business development and global food policy decisions are being made based upon data. But the sector is also home to severe resource inequality. The largest agricultural companies make billions of dollars per year, in comparison with subsistence farmers growing just enough to feed themselves, or smallholder farmers who grow enough to sell on a year-by-year basis. When it comes to data and technology, these differences in resources translate to stark power imbalances in data access and use. The most well resourced actors are able to delve into new technologies and make the most of those insights, whereas others are unable to take any such risks or divert any of their limited resources. Access to and use of data has radically changed the business models and behaviour of some of those well resourced actors, but in contrast, those with fewer resources are receiving the same, limited access to information that they always have.

In this paper, we have approached these issues from a responsible data perspective, drawing upon the experience of the Responsible Data community, who over the past three years have created tools, questions and resources to deal with the ethical, legal, privacy and security challenges that come from new uses of data in various sectors. This piece aims to provide a broad overview of some of the responsible data challenges facing these actors, with a focus on the power imbalance between actors, and looking into how that inequality affects behaviour when it comes to the agricultural data ecosystem. What are the concerns of those with limited resources, when it comes to this new and rapidly changing data environment? In addition, what are the ethical grey areas or uncertainties that we need to address in the future? As a first attempt to answer these questions, we spoke to 14 individuals with various perspectives on the sector to understand what the challenges are for them and for the people they work with. We also carried out desk research to dive deeper into these issues, and we provide here an analysis of our findings and responsible data challenges….(More)”

How to advance open data research: Towards an understanding of demand, users, and key data


Danny Lämmerhirt and Stefaan Verhulst at IODC blog: “…Lord Kelvin’s famous quote “If you can not measure it, you can not improve it” equally applies to open data. Without more evidence of how open data contributes to meeting users’ needs and addressing societal challenges, efforts and policies toward releasing and using more data may be misinformed and based upon untested assumptions.

When done well, assessments, metrics, and audits can guide both (local) data providers and users to understand, reflect upon, and change how open data is designed. What we measure and how we measure is therefore decisive to advance open data.

Back in 2014, the Web Foundation and the GovLab at NYU brought together open data assessment experts from Open Knowledge, Organisation for Economic Co-operation and Development, United Nations, Canada’s International Development Research Centre, and elsewhere to explore the development of common methods and frameworks for the study of open data. It resulted in a draft template or framework for measuring open data. Despite the increased awareness for more evidence-based open data approaches, since 2014 open data assessment methods have only advanced slowly. At the same time, governments publish more of their data openly, and more civil society groups, civil servants, and entrepreneurs employ open data to manifold ends: the broader public may detect environmental issues and advocate for policy changes, neighbourhood projects employ data to enable marginalized communities to participate in urban planning, public institutions may enhance their information exchange, and entrepreneurs embed open data in new business models.

In 2015, the International Open Data Conference roadmap made the following recommendations on how to improve the way we assess and measure open data.

  1. Reviewing and refining the Common Assessment Methods for Open Data framework. This framework lays out four areas of inquiry: context of open data, the data published, use practices and users, as well as the impact of opening data.
  2. Developing a catalogue of assessment methods to monitor progress against the International Open Data Charter (based on the Common Assessment Methods for Open Data).
  3. Networking researchers to exchange common methods and metrics. This helps to build methodologies that are reproducible and increase credibility and impact of research.
  4. Developing sectoral assessments.

In short, the IODC called for refining our assessment criteria and metrics by connecting researchers, and applying the assessments to specific areas. It is hard to tell how much progress has been made in answering these recommendations, but there is a sense among researchers and practitioners that the first two goals are yet to be fully addressed.

Instead we have seen various disparate, yet well meaning, efforts to enhance the understanding of the release and impact of open data. A working group was created to measure progress on the International Open Data Charter, which provides governments with principles for implementing open data policies. While this working group compiled a list of studies and their methodologies, it did not (yet) deepen the common framework of definitions and criteria to assess and measure the implementation of the Charter.

In addition, there is an increase of sector- and case-specific studies that are often more descriptive and context specific in nature, yet do contribute to the need for examples that illustrate the value proposition for open data.

As such, there seems to be a disconnect between top-level frameworks and on-the-ground research, preventing the sharing of common methods and distilling replicable experiences about what works and what does not….(More)”

National Transit Map Seeks to Close the Transit Data Gap


Ben Miller at GovTech: “In bringing together the first-ever map illustrating the nation’s transit system, the U.S. Department of Transportation isn’t just making data more accessible — it’s also aiming to modernize data collection and dissemination for many of the country’s transit agencies.

With more than 10,000 routes and 98,000 stops represented, the National Transit Map is already enormous. But Dan Morgan, chief data officer of the department, says it’s not enough. When measuring vehicles operated in maximum service — a metric illustrating peak service at a transit agency — the National Transit Map captures only about half of all transit in the U.S.
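“Vehicles operated in maximum service” is, in essence, the peak number of vehicles running simultaneously. Given trip start and end times, that peak can be computed with a simple sweep-line count, sketched below with an invented schedule (times are minutes after midnight); this is an illustration of the metric’s idea, not DOT’s reporting methodology.

```python
# Sketch: estimate peak concurrent vehicles (the idea behind the
# "vehicles operated in maximum service" metric) from trip intervals.
# Times are minutes after midnight; the sample schedule is invented.
def peak_vehicles(trips):
    """Sweep-line count of the maximum number of overlapping trips."""
    events = []
    for start, end in trips:
        events.append((start, 1))   # a vehicle enters service
        events.append((end, -1))    # a vehicle leaves service
    events.sort()  # at equal times, ends (-1) sort before starts (+1)
    running = peak = 0
    for _, delta in events:
        running += delta
        peak = max(peak, running)
    return peak

trips = [(360, 480), (420, 540), (430, 500), (600, 660)]
print(peak_vehicles(trips))  # → 3 (three trips overlap between 430 and 480)
```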

“Not all of these transit agencies have this data available,” Morgan said, “so this is an ongoing project to really close the transit data gap.”

Which is why, in the process of building out the map, the DOT is working with transit agencies to make their data available.

On the whole, transit data is easier to collect and process than a lot of transportation data because many agencies have adopted a standard called General Transit Feed Specification (GTFS) that applies to schedule-related data. That’s what made the National Transit Map an easy candidate for completion, Morgan said.
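Part of what makes GTFS approachable is that its files are plain CSV. A minimal sketch of reading a GTFS `stops.txt` file follows; the rows are invented, and real feeds ship `stops.txt` inside a zip alongside `routes.txt`, `trips.txt`, and the other feed files.

```python
import csv
import io

# Minimal sketch of parsing a GTFS stops.txt file, the CSV format defined by
# the General Transit Feed Specification. The sample rows are invented.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
1001,Main St & 1st Ave,38.8977,-77.0365
1002,Main St & 2nd Ave,38.8990,-77.0350
"""

def load_stops(text):
    """Parse stops.txt into a list of (id, name, lat, lon) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        (row["stop_id"], row["stop_name"],
         float(row["stop_lat"]), float(row["stop_lon"]))
        for row in reader
    ]

stops = load_stops(stops_txt)
print(len(stops), stops[0][1])  # → 2 Main St & 1st Ave
```

The learning curve the article mentions lies less in the file format than in assembling accurate schedule, route, and geometry data to fill it.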

But as popular as GTFS has become, many agencies — especially smaller ones — haven’t been able to use it. The tools to convert to GTFS come with a learning curve.

“It’s really a matter of priority and availability of resources,” he said.

Bringing those agencies into the mainstream is important to achieving the goals of the map. In the map, Morgan said he sees an opportunity to achieve a new level of clarity where it has never existed before.

That’s because transit has long suffered from difficulty in seeing its own history. Transit officials can describe their systems as they exist, but looking at how they got there is trickier.

“There’s no archive,” Morgan said, “there’s no picture of how transit changes over time.”

And that’s a problem for assessing what works and what doesn’t, for understanding why the system operates the way it does and how it responds to changes. …(More)”