Smart Chicago Collaborative: “…we’re happy to announce the launch of our latest project, the Chicago Health Atlas, where you can view citywide information about health trends and take action near you to improve your own health…. Read more about data sources on the Chicago Health Atlas About page.”
Canada's Action Plan on Open Government
Introduction: “Canada’s commitment to open government is part of the federal government’s efforts to foster greater openness and accountability, to provide Canadians with more opportunities to learn about and participate in government, to drive innovation and economic opportunities for all Canadians and, at the same time, create a more cost effective, efficient and responsive government.
The Government of Canada first launched its Open Government strategy in March 2011, and then further enhanced its commitment by announcing its intention to join the Open Government Partnership in September 2011.
Over the past two years, we have consulted Canadians on both the development of a Digital Economy Strategy and on Open Government. Our Digital Economy consultation sought feedback from all Canadians on how to improve innovation and creativity, and achieve the shared goal of making Canada a global leader in the digital economy. More recently, in the fall of 2011, we launched a consultation to explore Canadians’ perspectives on Open Government in order to inform the development of Canada’s Action Plan on Open Government.
The results of these consultations stressed the importance of providing open access to public sector information and data and, in particular, the need to improve the availability of data to researchers and the private sector with fewer restrictions on reuse of these information assets. Canadians also want the opportunity to engage in an ongoing dialogue with government on policies and priorities. Cumulatively, the valuable information and insight received from Canadians have helped us shape the direction for open government in Canada. As we move forward, we will continue to consult with Canadians and Canada’s active open government community on how best to implement this plan.
Our Action Plan on Open Government sets out our commitments to Canadians and for the Open Government Partnership, which we will achieve over a three-year period through the effective and prudent use of resources. It is structured along the three streams of our Open Government Strategy: Open Information, Open Data, and Open Dialogue.”
Targeting Transparency
New paper by David Weil, Mary Graham, and Archon Fung in Science Magazine: “When rules, taxes, or subsidies prove impractical as policy tools, governments increasingly employ “targeted transparency,” compelling disclosure of information as an alternative means of achieving specific objectives. For example, the U.S. Affordable Care Act of 2010 requires calories be posted on menus to enlist both restaurants and patrons in the effort to reduce obesity. It is crucial to understand when and how such targeted transparency works, as well as when it is inappropriate. Research about its use and effectiveness has begun to take shape, drawing on social and behavioral scientists, economists, and legal scholars. We explore questions central to the performance of targeted transparency policies.
Targeted transparency differs from broader “right-to-know” and “open-government” policies that span from the 1966 Freedom of Information Act to the Obama Administration’s “open-government” initiative encouraging officials to make existing data sets readily available and easy to parse as an end in itself (1, 2). Targeted transparency offers a more focused approach often used to introduce new scientific evidence of public risks into market choices. Government compels companies or agencies to disclose information in standardized formats to reduce specific risks, to ameliorate externalities arising from a failure of consumers or producers to fully consider social costs associated with a product, or to improve provision of public goods and services. Such policies are more light-handed than conventional regulation, relying on the power of information rather than on enforcement of rules and standards or financial inducements….”
See also the Transparency Policy Project at http://transparencypolicy.net/
Visualizing 3 Billion Tweets
Eric Gundersen from Mapbox: “This is a look at 3 billion tweets – every geotagged tweet since September 2011, mapped, showing facets of Twitter’s ecosystem and userbase in incredible new detail, revealing demographic, cultural, and social patterns down to city level detail, across the entire world. We were brought in by the data team at Gnip, who have awesome APIs and raw access to the Twitter firehose, and together Tom and data artist Eric Fischer used our open source tools to visualize the data and build interfaces that let you explore the stories of space, language, and access to technology.
This is big data, and there’s a significant level of geographic overlap between tweets, so Eric wrote an open-source tool that de-duplicated 2.7 billion overlapping datapoints, leaving 280 million unique locations…”
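The excerpt doesn’t describe how that de-duplication works, and Fischer’s actual open-source tool isn’t reproduced here. As a rough illustration only, a minimal Python sketch of the basic idea (collapsing points that snap to the same fixed-precision latitude/longitude cell; the function name and the 4-decimal precision are our assumptions, not taken from the tool) could look like this:

# Minimal sketch: collapse geotagged points that round to the same
# lat/lon cell. Four decimal places is roughly 11 metres of latitude.
def dedupe_points(points, precision=4):
    seen = set()
    unique = []
    for lat, lon in points:
        key = (round(lat, precision), round(lon, precision))
        if key not in seen:
            seen.add(key)
            unique.append((lat, lon))
    return unique

# Two nearly identical Chicago points collapse to one; the New York point survives.
tweets = [(41.8781, -87.6298), (41.87812, -87.62979), (40.7128, -74.0060)]
print(dedupe_points(tweets))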
Visualizing the Stunning Growth of 8 Years of OpenStreetMap
Emily Badger in Atlantic Cities: “The U.S. OpenStreetMap community gathered in San Francisco over the weekend for its annual conference, the State of the Map. The loose citizen-cartography collective has now been incrementally mapping the world since 2004. While they were taking stock, it turns out the global open mapping effort has now mapped data on more than 78 million buildings and 21 million miles of road (if you wanted to drive all those roads at, say, 60 miles an hour, it would take you some 40 years to do it).
And more than a million people have chipped away at this in an impressively democratic manner: 83.6 percent of the changes in the whole database have been made by 99.9 percent of contributors.
These numbers come from the OpenStreetMap 2013 Data Report, which also contains, of course, more maps. The report, created by MapBox, includes a beautiful worldwide visualization of all the road updates made as OpenStreetMap has grown, with some of the earliest imports of data shown in green and blue, and more recent ones in white. You can navigate the full map here (scroll down), but we’ve grabbed a couple of snapshots for you as well.”
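As a quick editorial check, the 40-year estimate in the excerpt above follows directly from the quoted figures (21 million miles of road driven nonstop at 60 miles per hour):

# 21 million miles at 60 mph, driven around the clock.
miles, mph = 21_000_000, 60
hours = miles / mph          # 350,000 hours behind the wheel
years = hours / 24 / 365     # about 39.95, i.e. roughly 40 years
print(f"{years:.1f} years of continuous driving")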
Brazilian Students Dig for Corruption (Video)
New York Times Video: “Student protesters at a public university in Rio de Janeiro are teaching each other how to expose data about the city’s transport system.”
G8 Open Data Charter: "Open Data by Default" will "fuel innovation"
G8 Open Data Charter, June 2013: “Principle 1: Open Data by Default
13. We recognise that free access to, and subsequent re-use of, open data are of significant value to society and the economy.
14. We agree to orient our governments towards open data by default.
15. We recognise that the term government data is meant in the widest sense possible. This could apply to data owned by national, federal, local, or international government bodies, or by the wider public sector.
16. We recognise that there is national and international legislation, in particular pertaining to intellectual property, personally-identifiable and sensitive information, which must be observed.
17. We will: establish an expectation that all government data be published openly by default, as outlined in this Charter, while recognising that there are legitimate reasons why some data cannot be released….
Principle 4: Releasing Data for Improved Governance
25. We recognise that the release of open data strengthens our democratic institutions and encourages better policy-making to meet the needs of our citizens. This is true not only in our own countries but across the world.
26. We also recognise that interest in open data is growing in other multilateral organisations and initiatives.
27. We will: share technical expertise and experience with each other and with other countries across the world so that everyone can reap the benefits of open data; and be transparent about our own data collection, standards, and publishing processes, by documenting all of these related processes online.
Principle 5: Releasing Data for Innovation
28. Recognising the importance of diversity in stimulating creativity and innovation, we agree that the more people and organisations that use our data, the greater the social and economic benefits that will be generated. This is true for both commercial and non-commercial uses.
29. We will: work to increase open data literacy and encourage people, such as developers of applications and civil society organisations that work in the field of open data promotion, to unlock the value of open data; empower a future generation of data innovators by providing data in machine-readable formats.”
See also:
Professor Sir Nigel Shadbolt, Chairman and Co-Founder, Open Data Institute on G8 Open Data Charter: why it matters
Nick Sinai and Marina Martin from the White House on Open Data Going Global
Weather Could Be Next On The Auction Block For Crowdsourced Data
Darrell Etherington in TechCrunch: “Waze’s big exit to Google proved one thing: if companies can harness the power of the crowd to deliver real-time, granular data, big tech corporations will be watching them closely as potential acquisition targets. There’s another category ripe for the picking, even if the problem being solved isn’t as apparent or immediately useful as traffic and navigation data: weather. A few apps are trying to harness the crowd to provide accurate, ground-level forecasts and conditions, and they’re catching on with consumers, too.
Montreal-based startup SkyMotion is one such firm, and it recently launched its 4.0 update, which not only harnesses crowdsourced weather reports, but also allows other businesses to plug into that data using a public API, to integrate real-time reporting data from SkyMotion’s users into their own products. That provides an up-to-the-minute forecast, one that probably won’t show you weather conditions completely dissimilar from the ones you’re actually feeling outside at any given moment, as can still be the case with apps that pull weather data only from specific weather monitoring stations….
SkyMotion isn’t alone in crowdsourcing weather data. There’s also Weddar, the “people-powered” weather service and mobile app that encourages location-based reporting with a very human element, since it asks people how conditions generally feel on the ground, instead of seeking out specifics…”
Experiments in Democracy
Jeremy Rozansky, assistant editor of National Affairs, in The New Atlantis: “In his debut book Uncontrolled, entrepreneur and policy analyst Jim Manzi argues that social scientists and policymakers should instead adopt the “experimental method.” The essential tool of this method is the randomized field trial (RFT), a technique that already informs many of our successful private enterprises. Perhaps the best known example of RFTs — one that Manzi uses to illustrate the concept — is the kind of clinical trial performed to test new medicines, wherein researchers “undertake a painstaking series of replicated controlled experiments to measure the effects of various interventions under various conditions,” as he puts it.
The central argument of Uncontrolled is that RFTs should be adopted more widely by businesses as well as government. The book is helpful and holds much wisdom — although the approach he recommends is ultimately just another streetlamp in the night, casting a pale light that tapers off after a few yards. Much still lies beyond its glow….
The econometric method now dominates the social sciences because it helps to cope with the problem of high causal density. It begins with a large data set: economic records, election results, surveys, and other similar big pools of data. Then the social scientist uses statistical techniques to model the interactions of sundry independent variables (causes) and a dependent variable (the effect). But for this method to work properly, social scientists must know all the causally important variables beforehand, because a hidden conditional could easily yield a false positive.
The experimental method, which Manzi prefers, offers a different way of coping with high causal density: sidestepping the problem of isolating exact causes. To sort out whether a given treatment or policy works, a scientist or social scientist can try it out on a random section of a population, and compare the results to a different section of the population where the treatment or policy was not implemented. So while econometric models aim to identify which particular variables are responsible for different results, RFTs have more modest aims, as they do not seek to identify every hidden conditional. By using the RFT approach, we may not know precisely why we achieved a desired effect, since we do not model all possible variables. But we can gain some ability to know that we will achieve a desired effect, at least under certain conditions.
Strictly speaking, even a randomized field trial only tells us with certainty that some exact technique worked with some specific population on some specific date in the past when conducted by some specific experimenters. We cannot know whether a given treatment or policy will work again under the same conditions at a later date, much less on a different population, much less still on the population as a whole. But scientists must always be cautious about moving from particular results to general conclusions; this is why experiments need to be replicated. And the more we do replicate them, the more information we can gain from those particular results, and the more reliably they can build toward teaching us which treatments or policies might work or (more often) which probably won’t. The result is that the RFT approach is very well suited to the business of government, since policymakers usually only need to know whether a given policy will work — whether it will produce a desired outcome.”
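To make the contrast with the econometric approach concrete, here is a minimal Python sketch of the difference-in-means logic a randomized field trial relies on. Everything in it is synthetic and illustrative; the population size, the assignment rule, and the +0.2 'true effect' are our assumptions, not figures from Manzi's book.

import random

random.seed(42)

# Randomly assign a synthetic population of 10,000 people to treatment or control.
treatment, control = [], []
for person in range(10_000):
    (treatment if random.random() < 0.5 else control).append(person)

def outcome(treated):
    # Hypothetical outcome: baseline noise plus a small true effect (+0.2)
    # for treated individuals. In a real trial this effect is unknown.
    return random.gauss(0.0, 1.0) + (0.2 if treated else 0.0)

treated_outcomes = [outcome(True) for _ in treatment]
control_outcomes = [outcome(False) for _ in control]

# The difference in mean outcomes estimates the average treatment effect
# without modelling any of the hidden variables an econometric model
# would have to specify.
effect = (sum(treated_outcomes) / len(treated_outcomes)
          - sum(control_outcomes) / len(control_outcomes))
print(f"Estimated average treatment effect: {effect:.3f}")  # should land near +0.2

Re-running the sketch with different random seeds mirrors the point about replication above: any single trial only speaks to its own particular population and conditions, but repeated trials build confidence in whether the intervention works.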
Data-Smart City Solutions
Press Release: “Today the Ash Center for Democratic Governance and Innovation at Harvard Kennedy School announced the launch of Data-Smart City Solutions, a new initiative aimed at using big data and analytics to transform the way local government operates. Bringing together leading industry, academic, and government officials, the initiative will offer city leaders a national depository of cases and best practice examples where cities and private partners use analytics to solve city problems. Data-Smart City Solutions is funded by Bloomberg Philanthropies and the John D. and Catherine T. MacArthur Foundation.
Data-Smart City Solutions highlights best practices, curates resources, and supports cities embarking on new data projects. The initiative’s website contains feature-length articles on how data drives innovation in different policy areas, profile pieces on municipal leaders at the forefront of implementing data analytics in their cities, and resources for interested officials to begin data projects in their own communities.
Recent articles include an assessment of Boston’s Adopt-a-Hydrant program as a potential harbinger of future city work promoting civic engagement and infrastructure maintenance, and a feature on how predictive technology is transforming police work. The site also spotlights municipal use of data such as San Francisco’s efforts to integrate data from different social service departments to better identify and serve at-risk youth. In addition to visiting the initiative’s website, Data-Smart City Solutions’ work is chronicled in their newsletter as well as on their Twitter page.”