How Internet surveillance predicts disease outbreaks before the WHO


Kurzweil News: “Have you ever Googled for an online diagnosis before visiting a doctor? If so, you may have helped provide early warning of an infectious disease epidemic.
In a new study published in The Lancet Infectious Diseases, Internet-based surveillance has been found to detect infectious diseases such as dengue fever and influenza up to two weeks earlier than traditional surveillance methods, according to Queensland University of Technology (QUT) research fellow and senior author of the paper Wenbiao Hu.
Hu, based at the Institute for Health and Biomedical Innovation, said there was often a lag time of two weeks before traditional surveillance methods could detect an emerging infectious disease.
“This is because traditional surveillance relies on the patient recognizing the symptoms and seeking treatment before diagnosis, along with the time taken for health professionals to alert authorities through their health networks. In contrast, digital surveillance can provide real-time detection of epidemics.”
Hu said the study used search engine algorithms such as Google Trends and Google Insights. It found that detecting the 2005–06 avian influenza outbreak “Bird Flu” would have been possible between one and two weeks earlier than official surveillance reports.
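To make the reported lead time concrete, the gap between search activity and laboratory-confirmed cases can be estimated with a simple lagged correlation between the two weekly series. The sketch below is illustrative only; it is not the method used in the study, and the data are invented.

```python
import numpy as np

def lead_time_weeks(search_volume, confirmed_cases, max_lag=6):
    """Find the lag (in weeks) at which search volume best correlates
    with confirmed cases; a positive lag means searches peak earlier."""
    best_lag, best_corr = 0, -np.inf
    for lag in range(max_lag + 1):
        s = search_volume[:-lag] if lag else search_volume
        c = confirmed_cases[lag:]
        corr = np.corrcoef(s, c)[0, 1]
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag, best_corr

# Invented weekly data in which searches rise about two weeks before cases.
search = np.array([10, 12, 30, 80, 140, 160, 120, 70, 40, 20, 15, 12], float)
cases = np.array([2, 3, 4, 8, 35, 90, 150, 170, 130, 75, 45, 25], float)

lag, corr = lead_time_weeks(search, cases)
print(f"Search activity leads confirmed cases by ~{lag} week(s) (r={corr:.2f})")
```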
“In another example, a digital data collection network was found to be able to detect the SARS outbreak more than two months before the first publications by the World Health Organization (WHO),” Hu said.
According to this week’s CDC FluView report published Jan. 17, 2014, influenza activity in the United States remains high overall, with 3,745 laboratory-confirmed influenza-associated hospitalizations reported since October 1, 2013 (credit: CDC)
“Early detection means early warning and that can help reduce or contain an epidemic, as well as alert public health authorities to ensure risk management strategies such as the provision of adequate medication are implemented.”
Hu said the study found that social media, including Twitter, Facebook, and microblogs, could also be effective in detecting disease outbreaks. “The next step would be to combine the approaches currently available such as social media, aggregator websites, and search engines, along with other factors such as climate and temperature, and develop a real-time infectious disease predictor.”
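A “real-time infectious disease predictor” of the kind Hu describes would fuse several such data streams. As a minimal, hypothetical sketch of that fusion step (the signals, weights, and numbers below are assumptions, not anything from the study), a plain least-squares regression can nowcast current case counts from signals that arrive faster than laboratory confirmations:

```python
import numpy as np

# Invented weekly signals, for illustration only.
search_volume = np.array([20, 35, 60, 110, 150, 130, 90, 55], float)
tweet_mentions = np.array([5, 12, 25, 60, 95, 80, 50, 30], float)
mean_temp_c = np.array([18, 16, 14, 11, 9, 8, 9, 11], float)
reported_cases = np.array([3, 6, 14, 40, 75, 70, 48, 25], float)

# Fit reported cases against the faster-moving signals (with an intercept).
X = np.column_stack([np.ones_like(search_volume),
                     search_volume, tweet_mentions, mean_temp_c])
coef, *_ = np.linalg.lstsq(X, reported_cases, rcond=None)

# Nowcast this week's burden before official counts are in.
this_week = np.array([1.0, 170.0, 120.0, 7.0])
print(f"Nowcast of current cases: {this_week @ coef:.0f}")
```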
“The international nature of emerging infectious diseases, combined with the globalization of travel and trade, has increased the interconnectedness of all countries, and that means detecting, monitoring and controlling these diseases is a global concern.”
The other authors of the paper were Gabriel Milinovich (first author), Gail Williams, and Archie Clements from the University of Queensland School of Population Health.
Supramap 
Another powerful tool is Supramap, a web application that synthesizes large, diverse datasets so that researchers can better understand the spread of infectious diseases across hosts and geography by integrating genetic, evolutionary, geospatial, and temporal data. It is now open source, and anyone can use it to create their own maps.
Associate Professor Daniel Janies, Ph.D., an expert in computational genomics at the Wexner Medical Center at The Ohio State University (OSU), worked with software engineers at the Ohio Supercomputer Center (OSC) to allow researchers and public safety officials to develop other front-end applications that draw on the logic and computing resources of Supramap.
It was originally developed in 2007 to track the spread and evolution of pandemic influenza (H1N1) and avian influenza (H5N1).
“Using SUPRAMAP, we initially developed maps that illustrated the spread of drug-resistant influenza and host shifts in H1N1 and H5N1 influenza and in coronaviruses, such as SARS,” said Janies. “SUPRAMAP allows the user to track strains carrying key mutations in a geospatial browser such as Google Earth. Our software allows public health scientists to update and view maps on the evolution and spread of pathogens.”
Grant funding through the U.S. Army Research Laboratory and Office supports this Innovation Group on Global Infectious Disease Research project. Support for the computational requirements of the project comes from the American Museum of Natural History (AMNH) and OSC. Ohio State’s Wexner Medical Center, Department of Biomedical Informatics and offices of Academic Affairs and Research provide additional support.”

Innovation by Competition: How Challenges and Competition Get the Most Out of the Crowd


Innocentive: “Crowdsourcing has become the 21st century’s alternative to traditional, employee-based problem solving. It has become the modern way to source needed services, content, and ideas. Crowdsourced ideas are paving the way for organizations to tackle the innovation challenges that confront them in today’s competitive global marketplace. To put it all in perspective, crowds used to be thought of as angry mobs. Today, crowds are more like friendly and helpful contributors. What an interesting juxtaposition, eh?
Case studies proving the effectiveness of crowdsourcing in conquering innovation challenges, particularly in the fields of science and engineering, abound. Yet despite the plentiful success stories, very few firms are really putting crowdsourcing’s full potential to use. ALS and AIDS research, to name just two examples, have both made huge advances thanks to crowdsourcing.
Biologists at the University of Washington were able to map the structure of an AIDS-related virus thanks to the collaboration that crowdsourcing enables. How did they do it? With the help of gamers playing a game designed to produce exactly the information the University of Washington needed. It was a solution that had remained unattainable for over a decade, until top-notch scientific minds from around the world were engaged through effective crowdsourcing techniques.
Dr. Seward Rutkove discovered an ALS biomarker that accurately measures the progression of the disease in patients through a prize contest run by an organization named Prize4Life, which used our Challenge Driven Innovation approach to engage the crowd.
The truth is, the concept of crowdsourcing to innovate has been around for centuries. But with the growing connectedness of the world brought about by widespread Internet access, the power and ability to crowdsource effectively has increased exponentially. It’s time for corporations to realize this and stop relying on stale sources of innovation…”

Tech Policy Is Not A Religion


Opinion Piece by Robert Atkinson: “‘Digital libertarians’ and ‘digital technocrats’ want us to believe their way is the truth and the light. It’s not that black and white. Manichaeism, an ancient religion, took a dualistic view of the world. It described the struggle between a good, spiritual world of light, and an evil, material world of darkness. Listening to tech policy debates, especially in America, one would presume that Manichaeism is alive and well.
On one side (light or dark, depending on your view) are the folks who embrace free markets, bottom-up processes, multi-stakeholderism, open-source systems, and crowdsourced innovations. On the other are those who embrace government intervention, top-down processes, additional regulation, proprietary systems, and expert-based innovations.
For the first group, whom I’ll call the digital libertarians, government is the problem, not the solution. Tech enables freedom, and statist actions can only limit it.
According to this camp, tech is moving so fast that government can’t hope to keep up — the only workable governance system is a nimble one based on multi-stakeholder processes, such as ICANN and W3C. With Web 2.0, everyone can be a contributor, and it is through the proliferation of multiple and disparate voices that we discover the truth. And because of the ability of communities of coders to add their contributions, the only viable tech systems are based on open-source models.
For the second group, the digital technocrats, the problem is the anarchic, lawless, corporate-dominated nature of the digital world. Tech is so disruptive, including to long-established norms and laws, it needs to be limited and shaped, and only the strong hand of the state can do that. Because of the influence of tech on all aspects of society, any legitimate governance process must stem from democratic institutions — not from a select group of insiders — and that can only happen with government oversight such as through the UN’s International Telecommunication Union.
According to this camp, because there are so many uninformed voices on the Internet spreading urban myths like wildfire, we need carefully vetted experts, whether in media or other organizations, to sort through the mass of information and provide expert, unbiased analysis. And because IT systems are so critical to the safety and well-functioning of  society, we need companies to build and profit from them through a closed-source model.
Of course, just as religious Manichaeism leads to distorted practices of faith, tech Manichaeism leads to distorted policy practices and views. Take Internet governance. The process of ensuring Internet governance and evolution is complex and rapidly changing. A strong case can be made for the multi-stakeholder process as the driving force.
But this situation doesn’t mean, as digital libertarians would assert, that governments should stay out of the Internet altogether. Governments are not, as digital libertarian John Perry Barlow arrogantly asserts, “weary giants of flesh and steel.” Governments can and do play legitimate roles in many Internet policy issues, from establishing cybersecurity guidelines to setting online sales tax policy to combatting spam and digital piracy to setting rules governing unfair and deceptive online marketing practices.
This assertion doesn’t mean governments always get things right. They don’t. But as the Information Technology and Innovation Foundation writes in its recent response to Barlow’s manifesto, to deny people the right to regulate Internet activity through their government officials ignores the significant role the government can play in promoting the continued development of the Internet and digital economy.
At the same time, the digital technocrats must understand that the digital world is different from the analog one, and that old rules, regulations, and governing structures simply don’t apply. When ITU Secretary General Hamadoun Toure argues that “at the behest of all the world’s nations, the UN must lead this effort” to manage the global Internet, and that “for big commercial interests, it’s about maximizing the bottom line,” he’s ignoring the critical role that tech companies and other non-government stakeholders play in the Internet ecosystem.
Because digital technology is such a vastly complex system, digital libertarians claim that their “light” approach is superior to the “dark,” controlling, technocratic approach. In fact, this very complexity requires that we base Internet policy on pragmatism, not religion.
Conversely, because technology is so important to opportunity and the functioning of societies, digital technocrats assert that only governments can maximize these benefits. In fact, its importance requires us to respect its complexity and the role of private sector innovators in driving digital progress.
In short, the belief that one or the other of these approaches is sufficient in itself to maximize tech innovation is misleading at best and damaging at worst.”

Crowdsourcing forecasts on science and technology events and innovations

George Mason University’s just-launched SciCast is largest and most advanced science and technology prediction market in the world
January 10, 2014


Example of SciCast crowdsourced forecast (credit: George Mason University)
George Mason University launched today, Jan. 10, the largest and most advanced science and technology prediction market in the world: SciCast.
The federally funded research project aims to improve the accuracy of science and technology forecasts. George Mason research assistant professor Charles Twardy is the principal investigator of the project.
SciCast crowdsources forecasts on science and technology events and innovations from aerospace to zoology.
For example, will Amazon use drones for commercial package delivery by the end of 2017? Today, SciCast estimates the chance at slightly more than 50 percent. If you think that is too low, you can estimate a higher chance. SciCast will use your estimate to adjust the combined forecast.
Forecasters can update their forecasts at any time; in the above example, perhaps after the Federal Aviation Administration (FAA) releases its new guidelines for drones. The continually updated and reshaped information helps both the public and private sectors better monitor developments in a variety of industries. SciCast is a real-time indicator of what participants think is going to happen in the future.
“Combinatorial” prediction market better than simple average


How SciCast works (Credit: George Mason University)
The idea is that collective wisdom from diverse, informed opinions can provide more accurate predictions than individual forecasters, a notion borne out by other crowdsourcing projects. Simply taking an average is almost always better than going with the “best” expert. But in a two-year test on geopolitical questions, the SciCast method did 40 percent better than the simple average.
SciCast uses the first general “combinatorial” prediction market. In a prediction market, forecasters spend points to adjust the group forecast. Significant changes “cost” more — but “pay” more if they turn out to be right. So better forecasters gain more points and therefore more influence, improving the accuracy of the system.
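One standard way to implement “spend points to move the forecast, with bigger moves costing and paying more” is the logarithmic market scoring rule (LMSR) developed by Robin Hanson, a member of the SciCast team. Whether SciCast’s point accounting follows exactly this form is an assumption here; the sketch below is illustrative:

```python
import math

def lmsr_cost(shares, b=100.0):
    """LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in shares))

def lmsr_prices(shares, b=100.0):
    """Market probabilities implied by the outstanding shares."""
    z = sum(math.exp(q / b) for q in shares)
    return [math.exp(q / b) / z for q in shares]

# Binary question, e.g. "Will X happen by 2017?": shares for [yes, no].
q = [0.0, 0.0]                    # market opens at 50/50
print("prices before:", lmsr_prices(q))

trade = 60.0                      # a forecaster buys 60 "yes" shares
points_spent = lmsr_cost([q[0] + trade, q[1]]) - lmsr_cost(q)
q[0] += trade
print("prices after: ", lmsr_prices(q))
print(f"points spent: {points_spent:.1f}; payout if 'yes' resolves true: {trade:.0f}")
```

A bigger trade moves the price further, costs more points up front, and returns more points if the forecast proves right, which is the incentive structure described above.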
In a combinatorial market like SciCast, forecasts can influence each other. For example, forecasters might have linked cherry production to honeybee populations. Then, if forecasters increase the estimated percentage of honeybee colonies lost this winter, SciCast automatically reduces the estimated 2014 cherry production. This connectivity among questions makes SciCast more sophisticated than other prediction markets.
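The honeybee/cherry example amounts to a small conditional-probability link: the cherry forecast is expressed conditional on the honeybee outcome, so changing the parent forecast automatically shifts the implied child forecast. The numbers below are invented and SciCast’s actual link model is not spelled out here; this is only a sketch of the propagation idea:

```python
# Parent question: probability of a high winter honeybee colony loss.
p_high_bee_loss = 0.30

# Link strengths (hypothetical): P(low 2014 cherry production | bee-loss outcome).
p_low_cherry_given = {"high_loss": 0.70, "low_loss": 0.20}

def implied_low_cherry(p_high_loss):
    """Marginal forecast on the cherry question implied by the bee question."""
    return (p_high_loss * p_low_cherry_given["high_loss"]
            + (1.0 - p_high_loss) * p_low_cherry_given["low_loss"])

print(f"P(low cherry production) = {implied_low_cherry(p_high_bee_loss):.2f}")

# A forecaster raises the expected colony loss; the implied chance of low
# cherry production rises, i.e. the estimated 2014 production falls.
p_high_bee_loss = 0.60
print(f"P(low cherry production) = {implied_low_cherry(p_high_bee_loss):.2f}")
```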
SciCast topics include agriculture, biology and medicine, chemistry, computational sciences, energy, engineered technologies, global change, information systems, mathematics, physics, science and technology business, social sciences, space sciences and transportation.
Seeking futurists to improve forecasts, pose questions


(Credit: George Mason University)
“With so many science and technology questions, there are many niches,” says Twardy, a researcher in the Center of Excellence in Command, Control, Communications, Computing and Intelligence (C4I), based in Mason’s Volgenau School of Engineering.
“We seek scientists, statisticians, engineers, entrepreneurs, policymakers, technical traders, and futurists of all stripes to improve our forecasts, link questions together and pose new questions.”
Forecasters discuss the questions, and that discussion can lead to new, related questions. For example, someone asked, “Will Amazon deliver its first package using an unmanned aerial vehicle by Dec. 31, 2017?”
An early forecaster suggested that this technology is likely to first be used in a mid-sized town with fewer obstructions or local regulatory issues. Another replied that Amazon is more likely to use robots to deliver packages within a short radius of a conventional delivery vehicle. A third offered information about an FAA report related to the subject.
Any forecaster could then write a question about upcoming FAA rulings, and link that question to the Amazon drones question. Forecasters could then adjust the strength of the link.
“George Mason University has succeeded in launching the world’s largest forecasting tournament for science and technology,” says Jason Matheny, program manager of Forecasting Science and Technology at the Intelligence Advanced Research Projects Activity, based in Washington, D.C. “SciCast can help the public and private sectors to better understand a range of scientific and technological trends.”
Collaborative but Competitive
More than 1,000 experts and enthusiasts from science and tech-related associations, universities and interest groups preregistered to participate in SciCast. The group is collaborative in spirit but also competitive. Participants are rewarded for accurate predictions by moving up on the site leaderboard, receiving more points to spend influencing subsequent prognostications. Participants can (and should) continually update their predictions as new information is presented.
SciCast has partnered with the American Association for the Advancement of Science, the Institute of Electrical and Electronics Engineers, and multiple other science and technology professional societies.
Mason members of the SciCast project team include Twardy; Kathryn Laskey, associate director for the C4I and a professor in the Department of Systems Engineering and Operations Research; associate professor of economics Robin Hanson; C4I research professor Tod Levitt; and C4I research assistant professors Anamaria Berea, Kenneth Olson and Wei Sun.
To register for SciCast, visit www.SciCast.org, or for more information, e-mail support@scicast.org. SciCast is open to anyone age 18 or older.”

New Book: Open Data Now


New book by Joel Gurin (The GovLab): “Open Data is the world’s greatest free resource–unprecedented access to thousands of databases–and it is one of the most revolutionary developments since the Information Age began. Combining two major trends–the exponential growth of digital data and the emerging culture of disclosure and transparency–Open Data gives you and your business full access to information that has never been available to the average person until now. Unlike most Big Data, Open Data is transparent, accessible, and reusable in ways that give it the power to transform business, government, and society.
Open Data Now is an essential guide to understanding all kinds of open databases–business, government, science, technology, retail, social media, and more–and using those resources to your best advantage. You’ll learn how to tap crowds for fast innovation, conduct research through open collaboration, and manage and market your business in a transparent marketplace.
Open Data is open for business–and the opportunities are as big and boundless as the Internet itself. This powerful, practical book shows you how to harness the power of Open Data in a variety of applications:

  • HOT STARTUPS: turn government data into profitable ventures
  • SAVVY MARKETING: understand how reputational data drives your brand
  • DATA-DRIVEN INVESTING: apply new tools for business analysis
  • CONSUMER INFORMATION: connect with your customers using smart disclosure
  • GREEN BUSINESS: use data to bet on sustainable companies
  • FAST R&D: turn the online world into your research lab
  • NEW OPPORTUNITIES: explore open fields for new businesses

Whether you’re a marketing professional who wants to stay on top of what’s trending, a budding entrepreneur with a billion-dollar idea and limited resources, or a struggling business owner trying to stay competitive in a changing global market–or if you just want to understand the cutting edge of information technology–Open Data Now offers a wealth of big ideas, strategies, and techniques that wouldn’t have been possible before Open Data leveled the playing field.
The revolution is here and it’s now. It’s Open Data Now.”

Supporting open government in New Europe


Google Europe Blog: “The “New Europe” countries that joined the European Union over the past decade are moving ahead fast to use the Internet to improve transparency and open government. We recently partnered with Techsoup Global to support online projects driving forward good governance in Romania, the Czech Republic, and most recently, in Slovakia.
Techsoup Global, in partnership with the Slovak Center for Philanthropy, recently held an exciting social-startups awards ceremony, Restart Slovakia 2013, in Bratislava. Slovakia’s Deputy Minister of Finance and Digital Champion Peter Pellegrini delivered a keynote promoting the Internet and Open Data and announced the winners of this year’s contest. Ambassadors from the U.S., Israel, and Romania, as well as several distinguished Slovak NGOs, also attended the ceremony.
Winning projects included:

  • Vzdy a vsade – Always and Everywhere – a volunteer portal offering online and anonymous psychological advice to internet users via chat.
  • Nemlcme.sk – a portal providing counsel for victims of sexual assaults.
  • Co robim – an educational online library of job careers advising young people how to choose their career paths and dream jobs.
  • Mapa zlocinu – an online map displaying various rates of criminality in different neighbourhoods.
  • Demagog.sk – a platform focused on analyzing public statements of politicians and releasing information about politicians and truthfulness of their speeches in a user-friendly format.”

From Faith-Based to Evidence-Based: The Open Data 500 and Understanding How Open Data Helps the American Economy


Beth Noveck in Forbes: “Public funds have, after all, paid for the collection of government data, and the law says that federal government data are not protected by copyright. By the end of 2009, the US and the UK had the only two open data one-stop websites where agencies could post and citizens could find open data. Now there are over 300 such portals for government data around the world, with over 1 million available datasets. This kind of Open Data — including weather, safety and public health information as well as information about government spending — can serve the country by increasing government efficiency, shedding light on regulated industries, and driving innovation and job creation.

It’s becoming clear that open data has the potential to improve people’s lives. With huge advances in data science, we can take this data and turn it into tools that help people choose a safer hospital, pick a better place to live, improve the performance of their farm or business by having better climate models, and know more about the companies with whom they are doing business. Done right, people can even contribute data back, giving everyone a better understanding, for example of nuclear contamination in post-Fukushima Japan or incidences of price gouging in America’s inner cities.

The promise of open data is limitless. (see the GovLab index for stats on open data) But it’s important to back up our faith with real evidence of what works. Last September the GovLab began the Open Data 500 project, funded by the John S. and James L. Knight Foundation, to study the economic value of government Open Data extensively and rigorously.  A recent McKinsey study pegged the annual global value of Open Data (including free data from sources other than government), at $3 trillion a year or more. We’re digging in and talking to those companies that use Open Data as a key part of their business model. We want to understand whether and how open data is contributing to the creation of new jobs, the development of scientific and other innovations, and adding to the economy. We also want to know what government can do better to help industries that want high quality, reliable, up-to-date information that government can supply. Of those 1 million datasets, for example, 96% are not updated on a regular basis.

The GovLab just published an initial working list of 500 American companies that we believe to be using open government data extensively.  We’ve also posted in-depth profiles of 50 of them — a sample of the kind of information that will be available when the first annual Open Data 500 study is published in early 2014. We are also starting a similar study for the UK and Europe.

Even at this early stage, we are learning that Open Data is a valuable resource. As my colleague Joel Gurin, author of Open Data Now: the Secret to Hot Start-Ups, Smart Investing, Savvy Marketing and Fast Innovation, who directs the project, put it, “Open Data is a versatile and powerful economic driver in the U.S. for new and existing businesses around the country, in a variety of ways, and across many sectors. The diversity of these companies in the kinds of data they use, the way they use it, their locations, and their business models is one of the most striking things about our findings so far.” Companies are paradoxically building value-added businesses on top of public data that anyone can access for free….”


A permanent hacker space in the Brazilian Congress


Blog entry by Dan Swislow at OpeningParliament: “On December 17, the presidency of the Brazilian Chamber of Deputies passed a resolution that creates a permanent Laboratório Ráquer or “Hacker Lab” inside the Chamber—a global first.
Read the full text of the resolution in Portuguese.
The resolution mandates the creation of a physical space at the Chamber that is “open for access and use by any citizen, especially programmers and software developers, members of parliament and other public workers, where they can utilize public data in a collaborative fashion for actions that enhance citizenship.”
The idea was born out of a week-long hackathon (or “hacker marathon”) hosted by the Chamber of Deputies in November, with the goal of using technology to enhance the transparency of legislative work and increase citizen understanding of the legislative process. More than 40 software developers and designers worked to create 22 applications for computers and mobile devices. The applications were voted on and the top three awarded prizes.
The winner was Meu Congress, a website that allows citizens to track the activities of their elected representatives and monitor their expenses. Runners-up included Monitora, Brasil!, an Android application that allows users to track proposed bills, attendance, and the Twitter feeds of members; and Deliberatório, an online card game that simulates the deliberation of bills in the Chamber of Deputies.
The hackathon engaged the software developers directly with members and staff of the Chamber of Deputies, including the Chamber’s President, Henrique Eduardo Alves. Hackathon organizer Pedro Markun of Transparencia Hacker made a formal proposal to the President of the Chamber for a permanent outpost, where, as Markun said in an email, “we could hack from inside the leviathan’s belly.”
The Chamber’s Director-General has established nine staff positions for the Hacker Lab under the leadership of Cristiano Ferri Faria, who spoke with me about the new project.
Faria explained that the hackathon event was a watershed moment for many public officials: “For 90-95% of parliamentarians and probably 80% of civil servants, they didn’t know how amazing a simple app, for instance, can make it much easier to analyze speeches.” Faria pointed to one of the hackathon contest entries, Retórica Parlamentar, which provides an interactive visualization of plenary remarks by members of the Chamber. “When members saw that, they got impressed and wondered, ‘There’s something new going on and we need to understand it and support it.’”

The GovLab Index: Open Data


Please find below the latest installment in The GovLab Index series, inspired by Harper’s Index. “The GovLab Index: Open Data — December 2013” provides an update on our previous Open Data installment, and highlights global trends in Open Data and the release of public sector information. Previous installments include Measuring Impact with Evidence, The Data Universe, Participation and Civic Engagement and Trust in Institutions.
Value and Impact

  • Potential global value of open data estimated by McKinsey: $3 trillion annually
  • Potential yearly value for the United States: $1.1 trillion 
  • Europe: $900 billion
  • Rest of the world: $1.7 trillion
  • How much the value of open data is estimated to grow per year in the European Union: 7% annually
  • Value of releasing the UK’s geospatial data as open data: £13 million per year by 2016
  • Estimated worth of business reuse of public sector data in Denmark in 2010: more than €80 million a year
  • Estimated worth of business reuse of public sector data across the European Union in 2010: €27 billion a year
  • Total direct and indirect economic gains from easier public sector information re-use across the whole European Union economy, as of May 2013: €140 billion annually
  • Economic value of publishing data on adult cardiac surgery in the U.K., as of May 2013: £400 million
  • Economic value of time saved for users of live data from the Transport for London apps, as of May 2013: between £15 million and £58 million
  • Estimated increase in GDP in England and Wales in 2008-2009 due to the adoption of geospatial information by local public services providers: +£320m
  • Average decrease in borrowing costs in sovereign bond markets for emerging market economies when implementing transparent practices (measured by accuracy and frequency according to IMF policies, across 23 countries from 1999-2002): 11%
  • Open weather data supports an estimated $1.5 billion in applications in the secondary insurance market – but much greater value comes from accurate weather predictions, which save the U.S. annually more than $30 billion
  • Estimated value of GPS data: $90 billion

Efforts and Involvement

  • Number of U.S. based companies identified by the GovLab that use government data in innovative ways: 500
  • Number of open data initiatives worldwide in 2009: 2
  • Number of open data initiatives worldwide in 2013: over 300
  • Number of countries with open data portals: more than 40
  • Countries that share more information online than the U.S.: 14
  • Number of cities globally that participated in 2013 International Open Data Hackathon Day: 102
  • Number of U.S. cities with Open Data Sites in 2013: 43
  • U.S. states with open data initiatives: 40
  • Membership growth in the Open Government Partnership in two years: from 8 to 59 countries
  • Number of time series indicators (GDP, foreign direct investment, life expectancy, internet users, etc.) in the World Bank Open Data Catalog: over 8,000
  • How many of 77 countries surveyed by the Open Data Barometer have some form of Open Government Data Initiative: over 55%
  • How many OGD initiatives have dedicated resources with senior level political backing: over 25%
  • How many countries are in the Open Data Index: 70
    • How many of the 700 key datasets in the Index are open: 84
  • Number of countries in the Open Data Census: 77
    • How many of the 727 key datasets in the Census are open: 95
  • How many countries surveyed have formal data policies in 2013: 55%
  • Those who have machine-readable data available: 25%
  • Top 5 countries in Open Data rankings: United Kingdom, United States, Sweden, New Zealand, Norway
  • The different levels of Open Data Certificates a data user or publisher can achieve “along the way to world-class open data”: 4 (Raw, Pilot, Standard, and Expert)
  • The number of data ecosystem categories identified by the OECD: 3 (data producers, infomediaries, and users)

Examining Datasets
FULL VERSION AT http://thegovlab.org/govlab-index-open-data-updated/
 

A Bottom-Up Smart City?


Alicia Rouault at Data-Smart City Solutions: “America’s shrinking cities face a tide of disinvestment, abandonment, vacancy, and a shift toward deconstruction and demolition followed by strategic reinvestment, rightsizing, and a host of other strategies designed to renew once-great cities. Thriving megacity regions are experiencing rapid growth in population, offering a different challenge for city planners to redefine density, housing, and transportation infrastructure. As cities shrink and grow, policymakers are increasingly called to respond to these changes by making informed, data-driven decisions. What is the role of the citizen in this process of collecting and understanding civic data?
Writing for Forbes in “Open Sourcing the Neighborhood,” Professor of Sociology at Columbia University Saskia Sassen calls for “open source urbanism” as an antidote to the otherwise top-down smart city movement. This form of urbanism involves opening traditional verticals of information within civic and governmental institutions. Citizens can engage with and understand the logic behind decisions by exploring newly opened administrative data. Beyond opening these existing datasets, Sassen points out that citizen experts hold invaluable institutional memory that can serve as an alternate and legitimate resource for policymakers, economists, and urban planners alike.
In 2012, we created a digital platform called LocalData to address the production and use of community-generated data in a municipal context. LocalData is a digital mapping service used globally by universities, non-profits, and municipal governments to gather and understand data at a neighborhood scale. In contrast to traditional Census or administrative data, which is produced by a central agency and collected infrequently, our platform provides a simple method for both community-based organizations and municipal employees to gather real-time data on project-specific indicators: property conditions, building inspections, environmental issues or community assets. Our platform then visualizes data and exports it into formats integrated with existing systems in government to seamlessly provide accurate and detailed information for decision makers.
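As a concrete, hypothetical illustration of that export step, a handful of parcel-condition survey responses can be flattened into GeoJSON, a format most municipal GIS systems already read. The field names and structure below are assumptions for illustration, not LocalData’s actual schema or API:

```python
import json

# Hypothetical parcel-condition responses collected in the field.
responses = [
    {"parcel_id": "DET-0412", "lon": -83.0458, "lat": 42.3314,
     "condition": "fire-damaged", "occupied": False},
    {"parcel_id": "DET-0413", "lon": -83.0461, "lat": 42.3316,
     "condition": "good", "occupied": True},
]

def to_geojson(rows):
    """Convert flat survey rows into a GeoJSON FeatureCollection."""
    features = [{
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
        "properties": {k: v for k, v in r.items() if k not in ("lon", "lat")},
    } for r in rows]
    return {"type": "FeatureCollection", "features": features}

print(json.dumps(to_geojson(responses), indent=2))
```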
LocalData began as a project in Detroit, Michigan, where the city was tackling a very real lack of standard, updated, and consistent condition information on the quality and status of vacant and abandoned properties. Many of these properties were owned by the city and county due to high foreclosure rates. One of Detroit’s strategies for combating crime and stabilizing neighborhoods is to demolish property in a targeted fashion. This strategy is as much a political win as it is an effective way to curb the secondary effects of vacancy: crime, drug use, and arson. Using LocalData, the city mapped critical corridors of emergent commercial property as an analysis tool for where to place investment, and documented thousands of vacant properties to understand where to target demolition.
Vacancy is not unique to the Midwest. Following our work with the Detroit Mayor’s office and planning department, LocalData has been used in dozens of other cities in the U.S. and abroad. Currently the Smart Chicago Collaborative is using LocalData to conduct a similar audit of vacant and abandoned property in southwest Chicago. Though an effective tool for capturing building-specific information, LocalData has also been used to capture behavior and the movement of goods. The MIT Megacities Logistics Lab has used LocalData to map and understand the intensity of urban supply chains by interviewing shop owners and mapping delivery routes in global megacities in Mexico, Colombia, Brazil and the U.S. The resulting information has been used with analytical models to help both city officials and companies design better city logistics policies and operations….”