Brazilian Government Develops Toolkit to Guide Institutions in both Planning and Carrying Out Open Data Initiatives


Nitai Silva at the Open Knowledge Blog: “The Brazilian government recently released the Kit de Dados Abertos (open data toolkit). The toolkit is made up of documents describing the process, methods and techniques for implementing an open data policy within an institution. Its goal is both to demystify the logic of opening up data and to share with public employees the best practices that have emerged from a number of Brazilian government initiatives.

The toolkit focuses on the Plano de Dados Abertos – PDA (Open Data Plan) as the guiding instrument in which an institution’s commitments, agenda and policy implementation cycles are registered. We believe that having each public agency build its own PDA is a way to perpetuate the open data policy, making it a state policy and not just a transitory governmental action.
It is organized to facilitate the implementation of the main activity cycles that must be observed in an institution and provides links and manuals to assist in these activities. Emphasis is given to the actors/roles involved in each step and their responsibilities. It also helps to designate a central person to monitor and maintain the PDA. The following diagram summarizes the macro steps of implementing an open data policy in an institution:
 

Hey Uncle Sam, Eat Your Own Dogfood


It’s been five years since Tim O’Reilly published his screed on Government as Platform. In that time, we’ve seen “civic tech” and “open data” gain in popularity and acceptance. The Federal Government has an open data platform, data.gov, and so do states and municipalities across America. Code for America is the hottest thing around, and the healthcare.gov fiasco made fixing public technology a top concern in government. We’ve successfully laid the groundwork for a new kind of government technology. We’re moving toward a day when, rather than building user-facing technology, the government opens up interfaces to data that allow the private sector to create applications and websites that consume public data and surface it to users.

However, we appear to have stalled a bit in our progress toward government as platform. It’s incredibly difficult to ingest this data into successful commercial products. The kaleidoscope of data formats in open data portals like data.gov might politely be called ‘obscure’, and perhaps more accurately, ‘perversely unusable’. Some of the data hasn’t been updated since first publication and is simply too stale to use. If documentation exists, most of the time it’s incomprehensible….

What we actually need is for Uncle Sam to start dogfooding his own open data.

For those of you who aren’t familiar with the term, dogfooding is engineers’ slang for using their own product. So, for example, Google employees use Gmail and Google Drive to organize their own work. The term also applies to engineering teams that consume their own public APIs to access internal data. Dogfooding helps teams deeply understand their own work from the same perspective as external users. It also provides a keen incentive to make products work well.
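To make the idea concrete, here is a minimal sketch of what consuming an open data portal the way an external developer must would look like. It assumes data.gov’s standard CKAN catalog API (`package_search`); the query and the sample response below are illustrative, not taken from a live call.

```python
from urllib.parse import urlencode

# CKAN is the open-source catalog software behind data.gov; its search
# endpoint is /api/3/action/package_search.
CKAN_BASE = "https://catalog.data.gov/api/3/action/package_search"

def search_url(query, rows=5):
    """Build a CKAN package_search URL for a keyword query."""
    return CKAN_BASE + "?" + urlencode({"q": query, "rows": rows})

def dataset_titles(response):
    """Pull dataset titles out of a CKAN package_search JSON response."""
    if not response.get("success"):
        raise ValueError("CKAN call failed")
    return [pkg["title"] for pkg in response["result"]["results"]]

# A trimmed-down sample of the JSON shape CKAN returns, so the parsing
# can be shown without a live network call; the titles are made up.
sample = {
    "success": True,
    "result": {"count": 2, "results": [
        {"title": "Consumer Complaint Database"},
        {"title": "Electric Vehicle Population Data"},
    ]},
}

print(search_url("energy"))
print(dataset_titles(sample))
```

If agency teams routinely ran even this trivial round trip against their own portals, the stale datasets and broken documentation the author describes would surface immediately.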

Dogfooding is the golden rule of platforms. And currently, open government portals are flagrantly violating this golden rule. I’ve asked around, and I can’t find a single example of a government entity consuming the data they publish…”

Open Data Beyond the Big City


at PBS MediaShift: “…Open data is the future — of how we govern, of how public services are delivered, of how governments engage with those that they serve. And right now, it is unevenly distributed. I think there is a strong argument to be made that data standards can provide a number of benefits to small and midsized municipal governments and could provide a powerful incentive for these governments to adopt open data.
One way we can use standards to drive the adoption of open data is to partner with companies like Yelp, Zillow, Google and others that can use open data to enhance their services. But how do we get companies with tens and hundreds of millions of users to take an interest in data from smaller municipal governments?
In a word – standards.

Why do we care about cities?

When we talk about open data, it’s important to keep in mind that there is a lot of good work happening at the federal, state and local levels all over the country; plenty of states and even counties are doing good things on the open data front. But for me, it’s important to evaluate where we are on open data with respect to cities.
States typically occupy a different space in the service delivery ecosystem than cities, and the kinds of data that they typically make available can be vastly different from city data. State capitals are often far removed from our daily lives and we may hear about them only when a budget is adopted or when the state legislature takes up a controversial issue.
In cities, the people who represent and serve us can be our neighbors — the guy behind you at the car wash, or the woman whose child is in your son’s preschool class. Cities matter.
As cities go, we need to consider carefully the importance of smaller cities — there are a lot more of them than large cities, and a non-trivial number of people live in them….”

New Technology and the Prevention of Violence and Conflict


Report edited by Francesco Mancini for the International Peace Institute: “In an era of unprecedented interconnectivity, this report explores the ways in which new technologies can assist international actors, governments, and civil society organizations to more effectively prevent violence and conflict. It examines the contributions that cell phones, social media, crowdsourcing, crisis mapping, blogging, and big data analytics can make to short-term efforts to forestall crises and to long-term initiatives to address the root causes of violence.
Five case studies assess the use of such tools in a variety of regions (Africa, Asia, Latin America) experiencing different types of violence (criminal violence, election-related violence, armed conflict, short-term crisis) in different political contexts (restrictive and collaborative governments).
Drawing on lessons and insights from across the cases, the authors outline a how-to guide for leveraging new technology in conflict-prevention efforts:
1. Examine all tools.
2. Consider the context.
3. Do no harm.
4. Integrate local input.
5. Help information flow horizontally.
6. Establish consensus regarding data use.
7. Foster partnerships for better results.”

The Web Observatory: A Middle Layer for Broad Data


New paper by Thanassis Tiropanis, Wendy Hall, James Hendler, and Christian de Larrinaga in Big Data: “The Web Observatory project is a global effort that is being led by the Web Science Trust, its network of WSTnet laboratories, and the wider Web Science community. The goal of this project is to create a global distributed infrastructure that will foster communities exchanging and using each other’s web-related datasets, as well as sharing analytic applications for research and business web applications. It will provide the means to observe the digital planet, explore its processes, and understand their impact on different sectors of human activity.
The project is creating a network of separate web observatories, collections of datasets and tools for analyzing data about the Web and its use, each with its own user community. This allows researchers across the world to develop and share data, analytic approaches, publications related to their datasets, and tools (Fig. 1). The network of web observatories aims to bridge the gap that currently exists between big data analytics and the rapidly growing web of “broad data,” a gap that makes it difficult for a large number of people to engage with these data….”

The View From Your Window Is Worth Cash to This Company


Eric Jaffe in Atlantic CityLab: “A city window overlooking the street has always been a score in its own right, what with so many apartments stuck opening onto back alleys and dumpsters and fire escapes. And now, a company wants to straight up monetize the view. New York startup Placemeter is paying city residents up to $50 a month for street views captured via old smartphones. The idea is to quantify sidewalk life in the service of making the city a more efficient place.

“Measuring data about how the city moves in real time, being able to make predictions on that, is definitely a good way to help cities work better,” says founder Alex Winter. “That’s the vision of Placemeter—to build a data platform where anyone at any time can know how busy the city is, and use that.”
Here’s how it works: City residents send Placemeter a little information about where they live and what they see from their window. In turn, Placemeter sends participants a kit (complete with window suction cup) to convert their unused smartphone into a street sensor, and agrees to pay cash so long as the device stays on and collects data. The more action outside—the more shops, pedestrians, traffic, and public space—the more the view is worth.
On the back end, Placemeter converts the smartphone images into statistical data using proprietary computer vision. The company first detects moving objects (the green splotches in the video below) and classifies them as people or as one of 11 types of vehicles and other common urban elements, such as food carts. A second layer of analysis connects this movement with behavioral patterns based on the location — how many cars are speeding down a street, for instance, or how many people are going into a store….
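Placemeter’s actual pipeline is proprietary, but the first step it describes, detecting moving objects between frames, can be illustrated with a toy frame-differencing sketch. Everything here (the frame size, the threshold value) is invented for illustration; production systems use far more robust techniques such as background modeling.

```python
# Toy motion detection: threshold per-pixel differences between two
# grayscale "frames", represented as 2D lists of 0-255 intensities.

THRESHOLD = 30  # intensity change below this is treated as noise

def motion_mask(prev_frame, curr_frame, threshold=THRESHOLD):
    """Return a binary mask marking pixels that changed between frames."""
    return [
        [1 if abs(a - b) > threshold else 0 for a, b in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

def moving_pixels(mask):
    """Count how many pixels the mask flags as moving."""
    return sum(sum(row) for row in mask)

# Two 4x4 frames: a bright 2x2 "object" appears in the second frame.
prev = [[10] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
for r in (1, 2):
    for c in (1, 2):
        curr[r][c] = 200

mask = motion_mask(prev, curr)
print(moving_pixels(mask))  # prints 4
```

The hard part Placemeter solves comes after this step: deciding whether a blob of changed pixels is a pedestrian, a bus, or a food cart, which requires trained classifiers rather than simple thresholds.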
Efforts to quantify city life with big data aren’t new, but Placemeter’s clear advance is its ability to count pedestrians. Cities often track sidewalk traffic with little more than a hired hand, a manual clicker, and a few spot locations. With its army of smartphone eyes, Placemeter promises a much wider net of real-time data, dynamic enough to recognize not only that a person exists but also that person’s behavior, from walking speed to retail interest to general interaction with streets or public spaces…”

Things Fall Apart: How Social Media Leads to a Less Stable World


Commentary by Curtis Hougland at Knowledge@Wharton: “James Foley. David Haines. Steven Sotloff. The list of people beheaded by followers of the Islamic State of Iraq and Syria (ISIS) keeps growing. The filming of these acts on video and distribution via social media platforms such as Twitter represent a geopolitical trend in which social media has become the new frontline for proxy wars across the globe. While social media does indeed advance connectivity and wealth among people, its proliferation at the same time results in a markedly less stable world.
That social media benefits mankind is irrefutable. I have been an evangelist for the power of new media for 20 years. However, technology in the form of globalized communication, transportation and supply chains conspires to make today’s world more complex. Events in any corner of the world now impact the rest of the globe quickly and sharply. Nations are being pulled apart along sectarian seams in Iraq, tribal divisions in Afghanistan, national interests in Ukraine and territorial fences in Gaza. These conflicts portend a quickening of global unrest, confirmed by Foreign Policy magazine’s map of civil protest. The ISIS videos are simply the exposed wire. I believe that over the next century, even great nations will Balkanize — break into smaller nations. One of the principal drivers of this Balkanization is social media.
Social media is a behavior, an expression of the innate human need to socialize and share experiences. Social media is not simply a set of technology channels and networks. Both the public and private sectors have underestimated the human imperative to behave socially. The evidence is now clear with more than 52% of the population living in cities and approximately 2 billion people active in social media globally. Some 96% of content emanates from individuals, not brands, media or governments — a volume that far exceeds participation in democratic elections.
Social media is not egalitarian, though. Despite the exponential growth of user-generated content, people prefer to congregate online around like-minded individuals. Rather than seek out new beliefs, people choose to reinforce their existing political opinions through their actions online. This is illustrated in Pew Internet’s 2014 study, “Mapping Twitter Topic Networks from Polarized Crowds to Community Clusters.” Individuals self-organize by affinity, and within affinity, by sensibility and personality. The ecosystem of social media is predicated on delivering more of what the user already likes. This, precisely, is the function of a Follow or Like. In this way, media coagulates rather than fragments online….”

New Data for a New Energy Future


(This post originally appeared on the blog of the U.S. Chamber of Commerce Foundation.)

Two growing concerns—climate change and U.S. energy self-sufficiency—have accelerated the search for affordable, sustainable approaches to energy production and use. In this area, as in many others, data-driven innovation is a key to progress. Data scientists are working to help improve energy efficiency and make new forms of energy more economically viable, and are building new, profitable businesses in the process.
In the same way that government data has been used by other kinds of new businesses, the Department of Energy is releasing data that can help energy innovators. At a recent “Energy Datapalooza” held by the department, John Podesta, counselor to the President, summed up the rationale: “Just as climate data will be central to helping communities prepare for climate change, energy data can help us reduce the harmful emissions that are driving climate change.” With electric power accounting for one-third of greenhouse gas emissions in the United States, the opportunities for improvement are great.
The GovLab has been studying the business applications of public government data, or “open data,” for the past year. The resulting study, the Open Data 500, now provides structured, searchable information on more than 500 companies that use open government data as a key business driver. A review of those results shows four major areas where open data is creating new business opportunities in energy and is likely to build many more in the near future.

Commercial building efficiency
Commercial buildings are major energy consumers, and energy costs are a significant business expense. Despite programs like LEED Certification, many commercial buildings waste large amounts of energy. Now a company called FirstFuel, based in Boston, is using open data to drive energy efficiency in these buildings. At the Energy Datapalooza, Swap Shah, the company’s CEO, described how analyzing energy data together with geospatial, weather, and other open data can give a very accurate view of a building’s energy consumption and ways to reduce it. (Sometimes the solution is startlingly simple: According to Shah, the largest source of waste is running heating and cooling systems at the same time.) Other companies are taking on the same kind of task – like Lucid, which provides an operating system that can track a building’s energy use in an integrated way.

Home energy use
A number of companies are finding data-driven solutions for homeowners who want to save money by reducing their energy usage. A key to success is putting together measurements of energy use in the home with public data on energy efficiency solutions. PlotWatt, for example, promises to help consumers “save money with real-time energy tracking” through the data it provides. One of the best-known companies in this area, Opower, uses a psychological strategy: it simultaneously gives people access to their own energy data and lets them compare their energy use to their neighbors’ as an incentive to save. Opower partners with utilities to provide this information, and the Virginia-based company has been successful enough to open offices in San Francisco, London, and Singapore. Soon more and more people will have access to data on their home energy use: Green Button, a government-promoted program implemented by utilities, now gives about 100 million Americans data about their energy consumption.

Solar power and renewable energy
As solar power becomes more efficient and affordable, a number of companies are emerging to support this energy technology. Clean Power Finance, for example, uses its database to connect solar entrepreneurs with sources of capital. In a different way, a company called Solar Census is analyzing publicly available data to find exactly where solar power can be produced most efficiently. The kind of analysis that used to require an on-site survey over several days can now be done in less than a minute with their algorithms.
Other kinds of geospatial and weather data can support other forms of renewable energy. The data will make it easier to find good sites for wind power stations, water sources for small-scale hydroelectric projects, and the best opportunities to tap geothermal energy.

Supporting new energy-efficient vehicles
The Tesla and other electric vehicles are becoming commercially viable, and we will soon see even more efficient vehicles on the road. Toyota has announced that its first fuel-cell cars, which run on hydrogen, will be commercially available by mid-2015, and other auto manufacturers have announced plans to develop fuel-cell vehicles as well. But these vehicles can’t operate without a network to supply power, be it electricity for a Tesla battery or hydrogen for a fuel cell.
It’s a chicken-and-egg problem: People won’t buy large numbers of electric or fuel-cell cars unless they know they can power them, and power stations will be scarce until there are enough vehicles to support their business. Now some new companies are facilitating this transition by giving drivers data-driven tools to find and use the power sources they need. Recargo, for example, provides tools to help electric car owners find charging stations and operate their vehicles.
The development of new energy sources will involve solving social, political, economic, and technological issues. Data science can help develop solutions and bring us more quickly to a new kind of energy future.
Joel Gurin is senior advisor at the GovLab and project director of the Open Data 500. He also currently serves as a fellow of the U.S. Chamber of Commerce Foundation.

Driving Innovation With Open Data


Research Article by The GovLab’s Joel Gurin (Chapter 6 in the report, “The Future of Data-Driven Innovation”): The chapters in this report provide ample evidence of the power of data and its business potential. But like any business resource, data is only valuable if the benefit of using it outweighs its cost. Data collection, management, distribution, quality control, and application all come at a price—a potential obstacle for companies of any size, though especially for small and medium-sized enterprises.
Over the last several years, however, the “I” of data’s return on investment (ROI) has become less of a hurdle, and new data-driven companies are developing rapidly as a result. One major reason is that governments at the federal, state, and local level are making more data available at little or no charge for the private sector and the public to use. Governments collect data of all kinds—including scientific, demographic, and financial data—at taxpayer expense.
Now, public sector agencies and departments are increasingly repaying that public investment by making their data available to all for free or at a low cost. This is Open Data. While there are still costs in putting the data to use, the growing availability of this national resource is becoming a significant driver for hundreds of new businesses. This chapter describes the growing potential of Open Data and the data-driven innovation it supports, the types of data and applications that are most promising, and the policies that will encourage innovation going forward. Read and download this article in PDF format.

Building a Smarter University: Big Data, Innovation, and Analytics


New book edited by Jason E. Lane: “The Big Data movement and the renewed focus on data analytics are transforming everything from healthcare delivery systems to the way cities deliver services to residents. Now is the time to examine how this Big Data could help build smarter universities. While much of the cutting-edge research that is being done with Big Data is happening at colleges and universities, higher education has yet to turn the digital mirror on itself to advance the academic enterprise. Institutions can use the huge amounts of data being generated to improve the student learning experience, enhance research initiatives, support effective community outreach, and develop campus infrastructure. This volume focuses on three primary themes related to creating a smarter university: refining the operations and management of higher education institutions, cultivating the education pipeline, and educating the next generation of data scientists. Through an analysis of these issues, the contributors address how universities can foster innovation and ingenuity in the academy. They also provide scholarly and practical insights in order to frame these topics for an international discussion.”