From Faith-Based to Evidence-Based: The Open Data 500 and Understanding How Open Data Helps the American Economy


Beth Noveck in Forbes: “Public funds have, after all, paid for their collection, and the law says that federal government data are not protected by copyright. By the end of 2009, the US and the UK had the only two open data one-stop websites where agencies could post and citizens could find open data. Now there are over 300 such portals for government data around the world with over 1 million available datasets. This kind of Open Data — including weather, safety and public health information as well as information about government spending — can serve the country by increasing government efficiency, shedding light on regulated industries, and driving innovation and job creation.

It’s becoming clear that open data has the potential to improve people’s lives. With huge advances in data science, we can take this data and turn it into tools that help people choose a safer hospital, pick a better place to live, improve the performance of their farm or business by having better climate models, and know more about the companies with whom they are doing business. Done right, people can even contribute data back, giving everyone a better understanding, for example of nuclear contamination in post-Fukushima Japan or incidences of price gouging in America’s inner cities.

The promise of open data is limitless (see the GovLab Index for stats on open data). But it’s important to back up our faith with real evidence of what works. Last September the GovLab began the Open Data 500 project, funded by the John S. and James L. Knight Foundation, to study the economic value of government Open Data extensively and rigorously. A recent McKinsey study pegged the global value of Open Data (including free data from sources other than government) at $3 trillion a year or more. We’re digging in and talking to the companies that use Open Data as a key part of their business model. We want to understand whether and how open data is contributing to the creation of new jobs, the development of scientific and other innovations, and economic growth. We also want to know what government can do better to help industries that want high-quality, reliable, up-to-date information that government can supply. Of those 1 million datasets, for example, 96% are not updated on a regular basis.

The GovLab just published an initial working list of 500 American companies that we believe to be using open government data extensively.  We’ve also posted in-depth profiles of 50 of them — a sample of the kind of information that will be available when the first annual Open Data 500 study is published in early 2014. We are also starting a similar study for the UK and Europe.

Even at this early stage, we are learning that Open Data is a valuable resource. As my colleague Joel Gurin, author of Open Data Now: the Secret to Hot Start-Ups, Smart Investing, Savvy Marketing and Fast Innovation, who directs the project, put it, “Open Data is a versatile and powerful economic driver in the U.S. for new and existing businesses around the country, in a variety of ways, and across many sectors. The diversity of these companies in the kinds of data they use, the way they use it, their locations, and their business models is one of the most striking things about our findings so far.” Companies are paradoxically building value-added businesses on top of public data that anyone can access for free….”

FULL article can be found here.

A permanent hacker space in the Brazilian Congress


Blog entry by Dan Swislow at OpeningParliament: “On December 17, the presidency of the Brazilian Chamber of Deputies passed a resolution that creates a permanent Laboratório Ráquer or “Hacker Lab” inside the Chamber—a global first.
Read the full text of the resolution in Portuguese.
The resolution mandates the creation of a physical space at the Chamber that is “open for access and use by any citizen, especially programmers and software developers, members of parliament and other public workers, where they can utilize public data in a collaborative fashion for actions that enhance citizenship.”
The idea was born out of a week-long hackathon (or “hacker marathon”) event hosted by the Chamber of Deputies in November, with the goal of using technology to enhance the transparency of legislative work and increase citizen understanding of the legislative process. More than 40 software developers and designers worked to create 22 applications for computers and mobile devices. The applications were voted on and the top three awarded prizes.
The winner was Meu Congress, a website that allows citizens to track the activities of their elected representatives and monitor their expenses. Runners-up included Monitora, Brasil!, an Android application that allows users to track proposed bills, attendance and the Twitter feeds of members; and Deliberatório, an online card game that simulates the deliberation of bills in the Chamber of Deputies.
The hackathon engaged the software developers directly with members and staff of the Chamber of Deputies, including the Chamber’s President, Henrique Eduardo Alves. Hackathon organizer Pedro Markun of Transparencia Hacker made a formal proposal to the President of the Chamber for a permanent outpost, where, as Markun said in an email, “we could hack from inside the leviathan’s belly.”
The Chamber’s Director-General has established nine staff positions for the Hacker Lab under the leadership of Cristiano Ferri Faria, who spoke with me about the new project.
Faria explained that the hackathon event was a watershed moment for many public officials: “For 90-95% of parliamentarians and probably 80% of civil servants, they didn’t know how amazing a simple app, for instance, can make it much easier to analyze speeches.” Faria pointed to one of the hackathon contest entries, Retórica Parlamentar, which provides an interactive visualization of plenary remarks by members of the Chamber. “When members saw that, they got impressed and wondered, ‘There’s something new going on and we need to understand it and support it.’”

The GovLab Index: Open Data


Please find below the latest installment in The GovLab Index series, inspired by Harper’s Index. “The GovLab Index: Open Data — December 2013” provides an update on our previous Open Data installment, and highlights global trends in Open Data and the release of public sector information. Previous installments include Measuring Impact with Evidence, The Data Universe, Participation and Civic Engagement and Trust in Institutions.
Value and Impact

  • Potential global value of open data estimated by McKinsey: $3 trillion annually
  • Potential yearly value for the United States: $1.1 trillion 
  • Europe: $900 billion
  • Rest of the world: $1.7 trillion
  • Estimated annual growth in the value of open data in the European Union: 7%
  • Value of releasing the UK’s geospatial data as open data: £13 million per year by 2016
  • Estimated worth of business reuse of public sector data in Denmark in 2010: more than €80 million a year
  • Estimated worth of business reuse of public sector data across the European Union in 2010: €27 billion a year
  • Total direct and indirect economic gains from easier public sector information re-use across the whole European Union economy, as of May 2013: €140 billion annually
  • Economic value of publishing data on adult cardiac surgery in the U.K., as of May 2013: £400 million
  • Economic value of time saved for users of live data from the Transport for London apps, as of May 2013: between £15 million and £58 million
  • Estimated increase in GDP in England and Wales in 2008-2009 due to the adoption of geospatial information by local public services providers: +£320m
  • Average decrease in borrowing costs in sovereign bond markets for emerging market economies when implementing transparent practices (measured by accuracy and frequency according to IMF policies, across 23 countries from 1999-2002): 11%
  • Open weather data supports an estimated $1.5 billion in applications in the secondary insurance market – but much greater value comes from accurate weather predictions, which save the U.S. annually more than $30 billion
  • Estimated value of GPS data: $90 billion

Efforts and Involvement

  • Number of U.S. based companies identified by the GovLab that use government data in innovative ways: 500
  • Number of open data initiatives worldwide in 2009: 2
  • Number of open data initiatives worldwide in 2013: over 300
  • Number of countries with open data portals: more than 40
  • Countries that share more information online than the U.S.: 14
  • Number of cities globally that participated in 2013 International Open Data Hackathon Day: 102
  • Number of U.S. cities with Open Data Sites in 2013: 43
  • U.S. states with open data initiatives: 40
  • Membership growth in the Open Government Partnership in two years: from 8 to 59 countries
  • Number of time series indicators (GDP, foreign direct investment, life expectancy, internet users, etc.) in the World Bank Open Data Catalog: over 8,000
  • How many of 77 countries surveyed by the Open Data Barometer have some form of Open Government Data Initiative: over 55%
  • How many OGD initiatives have dedicated resources with senior level political backing: over 25%
  • How many countries are in the Open Data Index: 70
    • How many of the 700 key datasets in the Index are open: 84
  • Number of countries in the Open Data Census: 77
    • How many of the 727 key datasets in the Census are open: 95
  • How many countries surveyed have formal data policies in 2013: 55%
  • Those who have machine-readable data available: 25%
  • Top 5 countries in Open Data rankings: United Kingdom, United States, Sweden, New Zealand, Norway
  • The different levels of Open Data Certificates a data user or publisher can achieve “along the way to world-class open data”: 4 levels, Raw, Pilot, Standard and Expert
  • The number of data ecosystems categories identified by the OECD: 3, data producers, infomediaries, and users

Examining Datasets
FULL VERSION AT http://thegovlab.org/govlab-index-open-data-updated/
 

A Bottom-Up Smart City?


Alicia Rouault at Data-Smart City Solutions: “America’s shrinking cities face a tide of disinvestment, abandonment, vacancy, and a shift toward deconstruction and demolition followed by strategic reinvestment, rightsizing, and a host of other strategies designed to renew once-great cities. Thriving megacity regions are experiencing rapid growth in population, offering a different challenge for city planners to redefine density, housing, and transportation infrastructure. As cities shrink and grow, policymakers are increasingly called to respond to these changes by making informed, data-driven decisions. What is the role of the citizen in this process of collecting and understanding civic data?
Writing for Forbes in “Open Sourcing the Neighborhood,” Professor of Sociology at Columbia University Saskia Sassen calls for “open source urbanism” as an antidote to the otherwise top-down smart city movement. This form of urbanism involves opening traditional verticals of information within civic and governmental institutions. Citizens can engage with and understand the logic behind decisions by exploring newly opened administrative data. Beyond opening these existing datasets, Sassen points out that citizen experts hold invaluable institutional memory that can serve as an alternate and legitimate resource for policymakers, economists, and urban planners alike.
In 2012, we created a digital platform called LocalData to address the production and use of community-generated data in a municipal context. LocalData is a digital mapping service used globally by universities, non-profits, and municipal governments to gather and understand data at a neighborhood scale. In contrast to traditional Census or administrative data, which is produced by a central agency and collected infrequently, our platform provides a simple method for both community-based organizations and municipal employees to gather real-time data on project-specific indicators: property conditions, building inspections, environmental issues or community assets. Our platform then visualizes data and exports it into formats integrated with existing systems in government to seamlessly provide accurate and detailed information for decision makers.
LocalData began as a project in Detroit, Michigan, where the city was tackling a very real lack of standard, updated, and consistent information on the condition and status of vacant and abandoned properties. Many of these properties were owned by the city and county due to high foreclosure rates. One of Detroit’s strategies for combating crime and stabilizing neighborhoods is to demolish property in a targeted fashion. This strategy is as much a political win as an effective way to curb the secondary effects of vacancy: crime, drug use, and arson. Using LocalData, the city mapped critical corridors of emergent commercial property as an analysis tool for where to place investment, and documented thousands of vacant properties to understand where to target demolition.
Vacancy is not unique to the Midwest. Following our work with the Detroit Mayor’s office and planning department, LocalData has been used in dozens of other cities in the U.S. and abroad. Currently the Smart Chicago Collaborative is using LocalData to conduct a similar audit of vacant and abandoned property in southwest Chicago. Though an effective tool for capturing building-specific information, LocalData has also been used to capture behavior and movement of goods. The MIT Megacities Logistics Lab has used LocalData to map and understand the intensity of urban supply chains by interviewing shop owners and mapping delivery routes in global megacities in Mexico, Colombia, Brazil and the U.S. The resulting information has been used with analytical models to help both city officials and companies to design better city logistics policies and operations….”

Web Science: Understanding the Emergence of Macro-Level Features on the World Wide Web


Monograph by Kieron O’Hara, Noshir S. Contractor, Wendy Hall, James A. Hendler and Nigel Shadbolt in Foundations and Trends in Web Science: “Web Science considers the development of Web Science since the publication of ‘A Framework for Web Science’ (Berners-Lee et al., 2006). This monograph argues that the requirement for understanding should ideally be accompanied by some measure of control, which makes Web Science crucial in the future provision of tools for managing our interactions, our politics, our economics, our entertainment, and – not least – our knowledge and data sharing…
In this monograph we consider the development of Web Science since the launch of this journal and its inaugural publication ‘A Framework for Web Science’ [44]. The theme of emergence is discussed as the characteristic phenomenon of Web-scale applications, where many unrelated micro-level actions and decisions, uninformed by knowledge about the macro-level, still produce noticeable and coherent effects at the scale of the Web. A model of emergence is mapped onto the multitheoretical multilevel (MTML) model of communication networks explained in [252]. Four specific types of theoretical problem are outlined. First, there is the need to explain local action. Second, the global patterns that form when local actions are repeated at scale have to be detected and understood. Third, those patterns feed back into the local, with intricate and often fleeting causal connections to be traced. Finally, as Web Science is an engineering discipline, issues of control of this feedback must be addressed. The idea of a social machine is introduced, where networked interactions at scale can help to achieve goals for people and social groups in civic society; an important aim of Web Science is to understand how such networks can operate, and how they can control the effects they produce on their own environment.”

Open Data in Action


Nick Sinai at the White House: “Over the past few years, the Administration has launched a series of Open Data Initiatives, which have released troves of valuable data in areas such as health, energy, education, public safety, finance, and global development…
Today, in furtherance of this exciting economic dynamic, The Governance Lab (The GovLab) —a research institution at New York University—released the beta version of its Open Data 500 project—an initiative designed to identify, describe, and analyze companies that use open government data in order to study how these data can serve business needs more effectively. As part of this effort, the organization is compiling a list of 500+ companies that use open government data to generate new business and develop new products and services.
This working list of 500+ companies, from sectors ranging from real estate to agriculture to legal services, shines a spotlight on a surprising array of innovative and creative ways that open government data is being used to grow the economy – across different company sizes, different geographies, and different industries. The project includes information about the companies and the government datasets they have identified as critical resources for their business.
Some examples from the Open Data 500 Project include:
  • Brightscope, a San Diego-based company that leverages data from the Department of Labor, the Securities and Exchange Commission, and the Census Bureau to rate consumers’ 401(k) plans objectively on performance and fees, so companies can choose better plans and employees can make better decisions about their retirement options.
  • AllTuition, a Chicago-based startup that provides services—powered by data from the Department of Education on Federal student financial aid programs and student loans—to help students and parents manage the financial-aid process for college, in part by helping families keep track of deadlines and walking them through the required forms.
  • Archimedes, a San Francisco healthcare modeling and analytics company that leverages Federal open data from the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare and Medicaid Services to provide doctors with more effective individualized treatment plans and to enable patients to make informed health decisions.
You can learn more here about the project and view the list of open data companies here.”

See also:
Open Government Data: Companies Cash In

NYU project touts 500 top open-data firms

Open data and transparency: a look back at 2013


Zoe Smith in the Guardian on open data and development in 2013: “The clarion call for a “data revolution” made in the post-2015 high-level panel report is a sign of a growing commitment to see freely flowing data become a tool for social change.

Web-based technology continued to offer increasing numbers of people the ability to share standardised data and statistics to demand better governance and strengthen accountability. 2013 seemed to herald the moment that the open data/transparency movement entered the mainstream.
Yet for those who have long campaigned on the issue, the call was more than just a catchphrase, it was a unique opportunity. “If we do get a global drive towards open data in relation to development or anything else, that would be really transformative and it’s quite rare to see such bold statements at such an early stage of the process. I think it set the tone for a year in which transparency was front and centre of many people’s agendas,” says David Hall-Matthews, of Publish What You Fund.
This year saw high-level discussions translated into commitments at the policy level. David Cameron used the UK’s presidency of the G8 to trigger international action on the three Ts (tax, trade and transparency) through the IF campaign. The pledge at Lough Erne, in Northern Ireland, reaffirmed the commitment to the Busan open data standard as well as the specific undertaking that all G8 members would implement International Aid Transparency Initiative (IATI) standards by the end of 2015.
2013 was a particularly good year for the US Millennium Challenge Corporation (MCC), which topped the aid transparency index. While at the very top MCC and the UK’s DfID were examples of best practice, there was still much room for improvement. “There is a really long tail of agencies who are not really taking transparency seriously at all, yet. This includes important donors, the whole of France and the whole of Japan, who are not doing anything credible,” says Hall-Matthews.
Yet given the increasing number of emerging and ‘frontier’ markets whose growth is driven in large part by wealth derived from natural resources, 2013 saw a growing sense of urgency for transparency to be applied to revenues from oil, gas and mineral resources that may far outstrip aid. In May, the new Extractive Industries Transparency Initiative (EITI) standard was adopted, which is said to be far broader and deeper than its previous incarnation.
Several countries have done much to ensure that transparency leads to accountability in their extractive industries. In Nigeria, for example, EITI reports are playing an important role in the debate about how resources should be managed in the country. “In countries such as Nigeria they’re taking their commitment to transparency and EITI seriously, and are going beyond disclosing information but also ensuring that those findings are acted upon and lead to accountability. For example, the tax collection agency has started to collect more of the revenues that were previously missing,” says Jonas Moberg, head of the EITI International Secretariat.
But just the extent to which transparency and open data can actually deliver on its revolutionary potential has also been called into question. Governments and donors agencies can release data but if the power structures within which this data is consumed and acted upon do not shift is there really any chance of significant social change?
The complexity of the challenge is illustrated by the case of Mexico, which, in 2014, will succeed Indonesia as chair of the Open Government Partnership. At this year’s London summit, Mexico’s acting civil service minister spoke of the great strides his country has made in opening up the public procurement process, which accounts for around 10% of GDP and is a key area in which transparency and accountability can help tackle corruption.
There is, however, a certain paradox. As SOAS professor Leandro Vergara Camus, who has written extensively on peasant movements in Mexico, explains: “The NGO sector in Mexico has more of a positive view of these kinds of processes than the working class or peasant organisations. The process of transparency and accountability have gone further in urban areas than they have in rural areas.”…
With increasing numbers of organisations likely to jump on the transparency bandwagon in the coming year the greatest challenge is using it effectively and adequately addressing the underlying issues of power and politics.

Top 2013 transparency publications

Open data, transparency and international development, The North South Institute
Data for development: The new conflict resource?, Privacy International
The fix-rate: a key metric for transparency and accountability, Integrity Action
Making UK aid more open and transparent, DfID
Getting a seat at the table: Civil Society advocacy for budget transparency in “untransparent” countries, International Budget Partnership

The dates that mattered

23-24 May: New Extractive Industries Transparency Initiative standard adopted
30 May: Post-2015 high-level panel report calling for a ‘data revolution’ is published
17-18 June: UK premier, David Cameron, campaigns for tax, trade and transparency during the G8
24 October: US Millennium Challenge Corporation tops the aid transparency index”
30 October – 1 November: Open Government Partnership in London gathers civil society, governments and data experts

Ten thoughts for the future


The Economist: “CASSANDRA has decided to revisit her fellow forecasters Thomas Malnight and Tracey Keys to find out what their predictions are for 2014. Once again they have produced a collection of trends for the year ahead, in their “Global Trends Report”.
The possibilities of mind control seem alarming (point 6), as do the implications of growing income inequality (point 10). Cassandra also hopes that “unemployability” and “unemployerability”, as discussed in point 9, are contested next year (on both linguistic and social fronts).
Nevertheless, the forecasts make for intriguing reading and highlights appear below.
1. From social everything to being smart socially
Social technologies are everywhere, but these vast repositories of digital “stuff” bury the exceptional among the unimportant. It’s time to get socially smart. Users are moving to niche networks to bring back the community feel and intelligence to social interactions. Businesses need to get smarter about extracting and delivering value from big data including challenging business models. For social networks, mobile is the great leveller. Competition for attention with other apps will intensify the battle to own key assets from identity to news sharing, demanding radical reinvention.
2. Information security: The genie is out of the bottle
Thought your information was safe? Think again. The information security genie is out of the bottle as cyber-surveillance and data mining by public and private organizations increases – and don’t forget criminal networks and whistleblowers. It will be increasingly hard to tell friend from foe in cyberspace as networks build artificial intelligence to decipher your emotions and smart cities track your every move. Big brother is here: Protecting identity, information and societies will be a priority for all.
3. Who needs shops anyway?
Retailers are facing a digitally driven perfect storm. Connectivity, rising consumer influence, time scarcity, mobile payments, and the internet of things, are changing where, when and how we shop – if smart machines have not already done the job. Add the sharing economy, driven by younger generations where experience and sustainable consumption are more important than ownership, and traditional retail models break down. The future of shops will be increasingly defined by experiential spaces offering personalized service, integrated online and offline value propositions, and pop-up stores to satisfy demands for immediacy and surprise.
4. Redistributing the industrial revolution
Complex, global value chains are being redistributed by new technologies, labour market shifts and connectivity. Small-scale manufacturing, including 3D and soon 4D printing, and shifting production economics are moving production closer to markets and enabling mass customization – not just by companies but by the tech-enabled maker movement which is going mainstream. Rising labour costs in developing markets, high unemployment in developed markets, global access to online talent and knowledge, plus advances in robotics mean reshoring of production to developed markets will increase. Mobility, flexibility and networks will define the future industrial landscape.
5. Hubonomics: The new face of globalization
As production and consumption become more distributed, hubs will characterize the next wave of “globalization.” They will specialize to support the needs of growing regional trade, emerging city states, on-line communities of choice, and the next generation of flexible workers and entrepreneurs. Underpinning these hubs will be global knowledge networks and new business and governance models based on hubonomics™ that leverage global assets and hub strengths to deliver local value.
6. Sci-Fi is here: Making the impossible possible
Cross-disciplinary approaches and visionary entrepreneurs are driving scientific breakthroughs that could change not just our lives and work but our bodies and intelligence. Labs worldwide are opening up the vast possibilities of mind control and artificial intelligence, shape-shifting materials and self-organizing nanobots, cyborgs and enhanced humans, space exploration, and high-speed, intelligent transportation. Expect great debate around the ethics, financing, and distribution of public and private benefits of these advances – and the challenge of translating breakthroughs into replicable benefits.
7. Growing pains: Transforming markets and generations
The BRICS are succumbing to Newton’s law of gravitation: Brazil’s lost it, India’s losing it, China’s paying the price for growth, Russia’s failing to make a superpower come-back, and South Africa’s economy is in disarray. In other developing markets currencies have tumbled, Arab Spring governments are still in turmoil and social unrest is increasing along with the number of failing states. But the BRICS & Beyond growth engine is far from dead. Rather it is experiencing growing pains which demand significant shifts in governance, financial systems, education and economic policies to catch up. The likely transformers will be younger generations who aspire to greater freedom and quality of life than their parents.
8. Panic versus denial: The resource gap grows, the global risks rise – but who is listening?
The complex nexus of food, water, energy and climate change presents huge global economic, environmental and societal challenges – heating up the battle to access new resources from the Arctic to fracking. Risks are growing, even as multilateral action stalls. It’s a crisis of morals, governance, and above all marketing and media, pitting crisis deniers against those who recognize the threats but are communicating panic versus reasoned solutions. Expect more debate and calls for responsible capitalism – those that are listening will be taking action at multiple levels in society and business.
9. Fighting unemployability and unemployerability
Companies are desperate for talented workers – yet unemployment rates remain high. Polarization towards higher and lower skill levels is squeezing mid-level jobs, even as employers complain that education systems are not preparing students for the jobs of the future. Fighting unemployability is driving new government-business partnerships worldwide, and will remain a critical issue given massive youth unemployment. Employers must also focus on organizational unemployerability – not being able to attract and retain desired talent – as new generations demand exciting and meaningful work where they can make an impact. If they can’t find it, they will quickly move on or swell the growing ranks of young entrepreneurs.
10. Surviving in a bipolar world: From expecting consistency to embracing ambiguity
Life is not fair, nor is it predictable.  Income inequality is growing. Intolerance and nationalism are rising but interdependence is the currency of a connected world. Pressure on leaders to deliver results today is intense but so too is the need for fundamental change to succeed in the long term. The contradictions of leadership and life are increasing faster than our ability to reconcile the often polarized perspectives and values each embodies. Increasingly, they are driving irrational acts of leadership (think the US debt ceiling), geopolitical, social and religious tensions, and individual acts of violence. Surviving in this world will demand stronger, responsible leadership comfortable with and capable of embracing ambiguity and uncertainty, as opposed to expecting consistency and predictability.”

Tech challenge develops algorithms to predict atrocities


SciDevNet: “Mathematical models that use existing socio-political data to predict mass atrocities could soon inform governments and NGOs on how and where to take preventative action.
The models emerged from one strand of the Tech Challenge for Atrocity Prevention, a competition run by the US Agency for International Development (USAID) and NGO Humanity United. The winners were announced last month (18 November) and will now work with the organiser to further develop and pilot their innovations.
The five winners, from different countries, won between US$1,000 and US$12,000 and were among nearly 100 entrants who developed algorithms to predict when and where mass atrocities are likely to happen.
Around 1.5 billion people live in countries affected by conflict, sometimes including atrocities such as genocides, mass rape and ethnic cleansing, according to the World Bank’s World Development Report 2011. Many of these countries are in the developing world.
The competition organisers hope the new algorithms could help governments and human rights organisations identify at-risk regions, potentially allowing them to intervene before mass atrocities happen.
The competition started from the premise that certain social and political measurements are linked to increased likelihood of atrocities. Yet because such factors interact in complex ways, organisations working to prevent atrocities lack a reliable method of predicting when and where they might happen next.
The algorithms use sociopolitical indicators and data on past atrocities as their inputs. The data was drawn from archives such as the Global Database of Events, Language and Tone, a data set that encodes more than 200 million globally newsworthy events, recording cultural information such as the people involved, their location and any religious connections.”
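The winning models themselves are not public, but the basic approach described above – combining sociopolitical indicators and past-event data into a risk estimate – can be illustrated with a toy logistic model. The indicator names and weights below are purely hypothetical assumptions for the sketch, not the actual features used by any Tech Challenge entrant:

```python
from math import exp

# Hypothetical sociopolitical indicators for a region, each scaled to [0, 1].
# These names are illustrative assumptions; the real models' features differ.
INDICATOR_NAMES = ["past_atrocity_events", "political_instability",
                   "ethnic_polarization", "press_freedom"]

def atrocity_risk(indicators, weights, bias=-2.0):
    """Logistic risk score in (0, 1): higher means greater predicted risk."""
    z = bias + sum(w * x for w, x in zip(weights, indicators))
    return 1.0 / (1.0 + exp(-z))

# Illustrative weights: a history of atrocities and instability raise risk;
# press freedom (a rough proxy for accountability) lowers it.
weights = [2.5, 1.8, 1.2, -1.5]

stable = atrocity_risk([0.1, 0.2, 0.1, 0.9], weights)   # low-risk profile
at_risk = atrocity_risk([0.9, 0.8, 0.7, 0.1], weights)  # high-risk profile
print(round(stable, 3), round(at_risk, 3))
```

In practice such weights would be fitted against historical outcomes drawn from archives like the event database mentioned above, and the scores used to rank regions for closer monitoring rather than as point predictions.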
Link to the winners of the Model Challenge

The Brainstorm Begins: Initial Ideas for Evolving ICANN


“The ICANN Strategy Panel on Multistakeholder Innovation (MSI Panel) is working to curate a set of concrete proposals for ways that the Internet Corporation for Assigned Names & Numbers (ICANN) could prototype new institutional arrangements for the 21st century. The Panel is working to identify how ICANN can open itself to more global participation in its governance functions. Specifically, the MSI Panel is charged with:

  • Proposing new models for international engagement, consensus-driven policymaking and institutional structures to support such enhanced functions; and
  • Designing processes, tools and platforms that enable the global ICANN community to engage in these new forms of participatory decision-making.

To help answer this charter, the MSI Panel launched an “Idea Generation” or ideation platform, designed to brainstorm with the global public on how ICANN could evolve the way it operates, given the innovations in governance happening across the world.

We’re now three weeks into this Idea Generation stage – taking place online here: thegovlab.ideascale.com – and we wanted to share with you what the Panel and The GovLab have heard so far regarding what tools, technologies, platforms and techniques ICANN could learn from or adapt to help design an innovative approach to problem-solving within the Domain Name System going forward.

These initial ideas begin to paint a picture of what 21st century coordination of a shared global commons might involve. These brainstorms all point to certain core principles the Panel believes provide the groundwork for an institution to legitimately operate in the global public interest today. These principles include:

  • Openness – Ensuring open channels as well as very low or no barriers to meaningful participation.
  • Transparency – Providing public access to information and deliberation data.
  • Accessibility – Developing simple and legible organizational communications.
  • Inclusivity and Lack of Domination – Ensuring access to global participation and that no one player, entity or interest dominates processes or outcomes.
  • Accountability – Creating mechanisms for the global public to check institutional power.
  • Effectiveness – Improving decision-making through greater reliance on evidence and a focus on flexibility and agility.
  • Efficiency – Streamlining processes to better leverage time, resources and human capital.

With these core principles as the backdrop, the ideas we’ve heard so far roughly fall within the following categories…
See also thegovlab.ideascale.com