How Open Data Policies Unlock Innovation


Tim Cashman at Socrata: “Several trends made the Web 2.0 world we now live in possible. Arguably, the most important of these has been the evolution of online services as extensible technology platforms that enable users, application developers, and other collaborators to create value that extends far beyond the original offering itself.

The Era of ‘Government-as-a-Platform’

The same principles that have shaped the consumer web are now permeating government. Forward-thinking public sector organizations are catching on to the idea that, to stay relevant and vital, governments must go beyond offering a few basic services online. Some have even come to the realization that they are custodians of an enormously valuable resource: the data they collect through their day-to-day operations.  By opening up this data for public consumption online, innovative governments are facilitating the same kind of digital networks that consumer web services have fostered for years.  The era of government as a platform is here, and open data is the catalyst.

The Role of Open Data Policy in Unlocking Innovation in Government

The open data movement continues to transition from an emphasis on transparency to measuring the civic and economic impact of open data programs. As part of this transition, governments are realizing the importance of creating a formal policy to define strategic goals, describe the desired benefits, and provide the scope for data publishing efforts over time.  When well executed, open data policies yield a clear set of benefits. These range from spurring slow-moving bureaucracies into action to procuring the necessary funding to sustain open data initiatives beyond a single elected official’s term.

Four Types of Open Data Policies

There are four main types of policy levers currently in use for open data: executive orders, non-binding resolutions, internal regulations, and codified laws. Each of these tools has specific advantages and potential limitations.

Executive Orders

The prime example of an open data executive order in action is President Barack Obama’s Open Data Initiative. While this executive order was short – only four paragraphs on two pages – the real policy magic was a mandate-by-reference that required all U.S. federal agencies to comply with a detailed set of time-bound actions. All of these requirements are publicly viewable on a GitHub repository – a free hosting service for open source software development projects – which is revolutionary in and of itself. Detailed discussions on government transparency took place not in closed-door boardrooms, but online for everyone to see, edit, and improve.

Non-Binding Resolutions

A classic example of a non-binding resolution can be found by doing an online search for the resolution of Palo Alto, California. Short and sweet, this town-square-like exercise delivers additional attention to the movement inside and outside of government. The lightweight policy tool also has the benefit of lasting a bit longer than any particular government official. Still, given the countless resolutions that have come out of every small town, resolutions are only as timeless as people’s memory.

Internal Regulations

The New York State Handbook on Open Data is a great example of internal regulations put to good use. Originating from the Office of Information Technology Resources, the handbook is a comprehensive, clear, and authoritative guide on how open data is actually supposed to work. Also available on GitHub, the handbook resembles the federal open data project in many ways.

Codified Laws

The archetypal example of open data law comes from San Francisco.
Interestingly, what started as an “Executive Directive” from Mayor Gavin Newsom later turned into legislation and brought with it the power of stronger department mandates and a significant budget. Once enacted, laws are generally hard to revise. In the case of San Francisco, however, the city council has already revised the law twice in four years.
At the federal government level, the Digital Accountability and Transparency Act, or DATA Act, was introduced in both the U.S. House of Representatives (H.R. 2061) and the U.S. Senate (S. 994) in 2013. The act mandates the standardization and publication of a wide variety of the federal government’s financial reports as open data. Although the House voted to pass the DATA Act, it still awaits a vote in the Senate.

The Path to Government-as-a-Platform

Open data policies are an effective way to motivate action and provide clear guidance for open data programs. But they are not a precondition for public-sector organizations to embrace the government-as-a-platform model. In fact, the first step does not involve technology at all. Instead, it involves government leaders realizing that public data belongs to the people. And, it requires the vision to appreciate this data as a shared resource that only increases in value the more widely it is distributed and re-used for analytics, web and mobile apps, and more.
The consumer web has shown the value of open data networks in spades (think Facebook). Now, it’s government’s turn to create the next web.”

Online Tools for Engaging Citizens in the Legislative Process


Andrew Mandelbaum from OpeningParliament.org: “Around the world, parliaments, governments, civil society organizations, and even individual parliamentarians are taking measures to make the legislative process more participatory. Some are creating their own tools — often open source, which allows others to use these tools as well — that enable citizens to mark up legislation or share ideas on targeted subjects. Others are purchasing and implementing tools developed by private companies to good effect. In several instances, these initiatives are being conducted through collaboration between public institutions and civil society, while many combine online and offline experiences to help ensure that a broader population of citizens is reached.
The list below provides examples of some of the more prominent efforts to engage citizens in the legislative process.
Brazil
Implementer: Brazilian Chamber of Deputies
Website: http://edemocracia.camara.gov.br/
Additional Information: OpeningParliament.org Case Study
Estonia
Implementer: Estonian President & Civil Society
Project Name: Rahvakogu (The People’s Assembly)
Website: http://www.rahvakogu.ee/
Additional Information: Enhancing Estonia’s Democracy Through Rahvakogu
Finland
Implementer: Finnish Parliament
Project Name: Inventing Finland again! (Keksitään Suomi uudelleen!)
Website: http://www.suomijoukkoistaa.fi/
Additional Information: Democratic Participation and Deliberation in Crowdsourced Legislative Processes: The Case of the Law on Off-Road Traffic in Finland
France
Implementer: SmartGov – Démocratie Ouverte
Website: https://www.parlement-et-citoyens.fr/
Additional Information: OpeningParliament Case Study
Italy
Implementer: Government of Italy
Project Name: Public consultation on constitutional reform
Website: http://www.partecipa.gov.it/
Spain
Implementer: Basque Parliament
Website: http://www.adi.parlamentovasco.euskolegebiltzarra.org/es/
Additional Information: Participation in Parliament
United Kingdom
Implementer: Cabinet Office
Project Name: Open Standards Consultation
Website: http://consultation.cabinetoffice.gov.uk/openstandards/
Additional Information: Open Policy Making, Open Standards Consultation; Final Consultation Documents
United States
Implementer: OpenGov Foundation
Project Name: The Madison Project
Tool: The Madison Project

Open Data is a Civil Right


Yo Yoshida, Founder & CEO, Appallicious in GovTech: “As Americans, we expect a certain standardization of basic services, infrastructure and laws — no matter where we call home. When you live in Seattle and take a business trip to New York, the electric outlet in the hotel you’re staying in is always compatible with your computer charger. When you drive from San Francisco to Los Angeles, I-5 doesn’t all of a sudden turn into a dirt country road because some cities won’t cover maintenance costs. If you take a 10-minute bus ride from Boston to the city of Cambridge, you know the money in your wallet is still considered legal tender.

But what if these expectations of consistency were not always a given? What if cities, counties and states had absolutely zero coordination when it came to basic services? This is what it is like for us in the open data movement. There are so many important applications and products that have been built by civic startups and concerned citizens. However, all too often these efforts are confined to city limits and unavailable to anyone outside of them. It’s time to start reimagining the way cities function and how local governments operate. There is a wealth of information housed in local governments that should be public by default to help fuel a new wave of civic participation.
Appallicious’ Neighborhood Score provides an overall health and sustainability score, block-by-block, for every neighborhood in the city of San Francisco. This is the first time metrics have been applied to neighborhoods, so we can judge how government allocates our resources and better plan how to move forward. But if you’re thinking about moving to Oakland, just a subway stop away from San Francisco, and want to see the score for a neighborhood, our app can’t help you, because that city has yet to release the data sets we need.
In Contra Costa County, there is the lifesaving PulsePoint app, which notifies smartphone users who are trained in CPR when someone nearby may be in need of help. This is an amazing app—for residents of Contra Costa County. But if someone in neighboring Alameda County needs CPR, the app, unfortunately, is completely useless.
Buildingeye visualizes planning and building permit data to allow users to see what projects are being proposed in their area or city. However, Buildingeye is only available in a handful of places, simply because most cities have yet to make permits publicly available. Think about what this could do for the construction sector — an industry that provides millions of jobs for Americans. Buildingeye also gives concerned citizens access to public documents like never before, so they can see what might be built in their cities or on their streets.
Along with other open data advocates, I have been going from city-to-city, county-to-county and state-to-state, trying to get governments and departments to open up their massive amounts of valuable data. Each time one city, or one county, agrees to make their data publicly accessible, I can’t help but think it’s only a drop in the bucket. We need to think bigger.
Every government, every agency and every department in the country that has already released this information to the public is a case study that points to the success of open data — and why every public entity should follow their lead. There needs to be a national referendum declaring that all government data should be open and accessible to the public.
Last May, President Obama issued an executive order requiring that going forward, any data generated by the federal government must be made available to the public in open, machine-readable formats. In the executive order, Obama stated that, “openness in government strengthens our democracy, promotes the delivery of efficient and effective services to the public, and contributes to economic growth.”
If this is truly the case, Washington has an obligation to compel local and state governments to release their data as well. Many have tried to spur this effort. California Lt. Gov. Gavin Newsom created the Citizenville Challenge to speed up adoption on the local level. The U.S. Conference of Mayors has also been vocal in promoting open data efforts. But none of these initiatives could have the same effect of a federal mandate.
What I am proposing is no small feat, and it won’t happen overnight. But there should be a concerted effort by those in the technology industry, specifically civic startups, to call on Congress to draft legislation that would require every city in the country to make their data open, free and machine readable. Passing federal legislation will not be an easy task — but creating a “universal open data” law is possible. It would require little to no funding, and it is completely nonpartisan. It’s actually not a political issue at all; it is, for lack of a better word, an administrative issue.
Often good legislation is blocked because lawmakers and citizens are concerned about project funding. While there should be support to help cities and towns achieve the capability of opening their data, a lot of the time they don’t need it. In 2009, the city and county of San Francisco opened up its data with zero dollars. Many other cities have done the same. There will be cities and municipalities that will need financial assistance to accomplish this. But it is worth it, and it will not require a significant investment for a substantial return. There are free online open data portals, like CKAN, DKAN and a new effort from Accela, CivicData.com, to centralize open data efforts.
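As a concrete illustration of how little plumbing these portals require, here is a minimal sketch of querying a CKAN-style catalog through its public search API; the portal URL and the search term are illustrative assumptions on my part, not details from the article:

    import json
    import urllib.parse
    import urllib.request

    # CKAN portals expose a documented Action API under /api/3/action/.
    # catalog.data.gov is used here only as an example of a CKAN-backed
    # catalog; substitute your own city's or county's portal URL.
    BASE_URL = "https://catalog.data.gov/api/3/action/package_search"

    def search_datasets(query, rows=5):
        """Return up to `rows` dataset records matching `query`."""
        url = BASE_URL + "?" + urllib.parse.urlencode({"q": query, "rows": rows})
        with urllib.request.urlopen(url) as response:
            payload = json.load(response)
        if not payload.get("success"):
            raise RuntimeError("CKAN API call failed")
        return payload["result"]["results"]

    # A civic app could, for instance, look for the permit data an app
    # like Buildingeye needs:
    for dataset in search_datasets("building permits"):
        print(dataset["title"])

The point of the sketch is that once a government publishes to a standard portal, one small script like this works against any city that does the same — which is exactly the kind of consistency being argued for here.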
When the UK Government recently announced a £1.5 million investment to support open data initiatives, its Cabinet Office Minister said, “We know that it creates a more accountable, efficient and effective government. Open Data is a raw material for economic growth, supporting the creation of new markets, business and jobs and helping us compete in the global race.”
We should not fall behind these efforts. There is too much at stake for our citizens, not to mention our economy. A recent McKinsey report found that open data has the potential to create $3 trillion in value worldwide.
Former Speaker Tip O’Neill famously said, “all politics is local.” But we in the civic startup space believe all data is local. Data is reporting potholes in your neighborhood and identifying high-crime areas in your communities. It’s seeing how many farmers’ markets there are in your town compared to liquor stores. Data helps predict which areas of a city are most at risk during a heat wave and other natural disasters. A federal open data law would provide the raw material needed to create tools to improve the lives of all Americans, not just those who are lucky enough to live in a city that has released this information on its own.
It’s a different way of thinking about how a government operates and the relationship it has with its citizens. Open data gives the public an amazing opportunity to be more involved with governmental decisions. We can increase accountability and transparency, but most importantly we can revolutionize the way local residents communicate and work with their government.
Access to this data is a civil right. If this is truly a government by, of and for the people, then its data needs to be available to all of us. By opening up this wealth of information, we will design a better government that takes advantage of the technology and skills of civic startups and innovative citizens….”

New Research Network to Study and Design Innovative Ways of Solving Public Problems



MacArthur Foundation Research Network on Opening Governance formed to gather evidence and develop new designs for governing 

NEW YORK, NY, March 4, 2014 – The Governance Lab (The GovLab) at New York University today announced the formation of a Research Network on Opening Governance, which will seek to develop blueprints for more effective and legitimate democratic institutions to help improve people’s lives.
Convened and organized by the GovLab, the MacArthur Foundation Research Network on Opening Governance is made possible by a three-year grant of $5 million from the John D. and Catherine T. MacArthur Foundation as well as a gift from Google.org, which will allow the Network to tap the latest technological advances to further its work.
Combining empirical research with real-world experiments, the Research Network will study what happens when governments and institutions open themselves to diverse participation, pursue collaborative problem-solving, and seek input and expertise from a range of people. Network members include twelve experts (see below) in computer science, political science, policy informatics, social psychology and philosophy, law, and communications. This core group is supported by an advisory network of academics, technologists, and current and former government officials. Together, they will assess existing innovations in governing and experiment with new practices in how institutions make decisions at the local, national, and international levels.
Support for the Network from Google.org will be used to build technology platforms to solve problems more openly and to run agile, real-world, empirical experiments with institutional partners such as governments and NGOs to discover what can enhance collaboration and decision-making in the public interest.
The Network’s research will be complemented by theoretical writing and compelling storytelling designed to articulate and demonstrate clearly and concretely how governing agencies might work better than they do today. “We want to arm policymakers and practitioners with evidence of what works and what does not,” says Professor Beth Simone Noveck, Network Chair and author of Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, “which is vital to drive innovation, re-establish legitimacy and more effectively target scarce resources to solve today’s problems.”
“From prize-backed challenges to spur creative thinking to the use of expert networks to get the smartest people focused on a problem no matter where they work, this shift from top-down, closed, and professional government to decentralized, open, and smarter governance may be the major social innovation of the 21st century,” says Noveck. “The MacArthur Research Network on Opening Governance is the ideal crucible for helping the transition from closed and centralized to open and collaborative institutions of governance in a way that is scientifically sound and yields new insights to inform future efforts, always with an eye toward real-world impacts.”
MacArthur Foundation President Robert Gallucci added, “Recognizing that we cannot solve today’s challenges with yesterday’s tools, this interdisciplinary group will bring fresh thinking to questions about how our governing institutions operate, and how they can develop better ways to help address seemingly intractable social problems for the common good.”
Members
The MacArthur Research Network on Opening Governance comprises:
Chair: Beth Simone Noveck
Network Coordinator: Andrew Young
Chief of Research: Stefaan Verhulst
Faculty Members:

  • Sir Tim Berners-Lee (Massachusetts Institute of Technology (MIT)/University of Southampton, UK)
  • Deborah Estrin (Cornell Tech/Weill Cornell Medical College)
  • Erik Johnston (Arizona State University)
  • Henry Farrell (George Washington University)
  • Sheena S. Iyengar (Columbia Business School/Jerome A. Chazen Institute of International Business)
  • Karim Lakhani (Harvard Business School)
  • Anita McGahan (University of Toronto)
  • Cosma Shalizi (Carnegie Mellon/Santa Fe Institute)

Institutional Members:

  • Christian Bason and Jesper Christiansen (MindLab, Denmark)
  • Geoff Mulgan (National Endowment for Science, Technology and the Arts – NESTA, United Kingdom)
  • Lee Rainie (Pew Research Center)

The Network is eager to hear from and engage with the public as it undertakes its work. Please contact Stefaan Verhulst to share your ideas or identify opportunities to collaborate.”

The Economics of Access to Information


Article by Mariano Mosquera at Edmond J. Safra Research Lab: “There has been an important development in the study of the right of access to public information and the so-called economics of information: by combining these two premises, it is possible to outline an economic theory of access to public information.


Moral Hazard
The legal development of the right of access to public information has been remarkable. Many international conventions, laws and national regulations have been passed on this matter. In this regard, access to information has consolidated within the framework of international human rights law.
The Inter-American Court of Human Rights was the first international court to acknowledge that access to information is a human right that is part of the right to freedom of speech. The Court recognized this right in two parts, as the individual right of any person to search for information and as a positive obligation of the state to ensure the individual’s right to receive the requested information.
This right and obligation can also be seen as the demand and supply of information.
The so-called economics of information has focused on the issue of information asymmetry between the principal and the agent. The principal (society) and the agent (state) enter into a contract. This contract is based on the idea that the agent’s specialization and professionalism (or the politician’s, according to Weber) enables him to attend to the principal’s affairs, such as public affairs in this case. This representation contract does not provide for a complete delegation, but rather it involves the principal’s commitment to monitoring the agent.
When we study corruption, it is important to note that monitoring aims to ensure that the agent adjusts its behavior to comply with the contract, in order to pursue public goals, and not to serve private interests. Stiglitz describes moral hazard as a situation arising from information asymmetry between the principal and the agent. The principal takes a risk when acting without comprehensive information about the agent’s actions. The moral hazard means that the handling of closed, privileged information by the agent could bring about negative consequences for the principal.
In this case, it is a risk related to corrupt practices, since a public official could use the state’s power and information to achieve private benefits, and not to resolve public issues in accordance with the principal-agent contract. This creates negative social consequences.
In this model, there are a number of safeguards against moral hazard, such as monitoring institutions (with members of the opposition) and rewards for efficient and effective administration, among others. Access to public information could also serve as an effective means of monitoring the agent, so that the agent adjusts its behavior to comply with the contract.
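For readers who want the intuition in symbols, the canonical moral-hazard setup from the economics of information can be written roughly as follows; this is the standard textbook formulation, not Mosquera’s own formalism:

    % Standard moral-hazard model (requires amsmath/amssymb): the agent
    % privately chooses effort a; the principal observes only a noisy
    % outcome x and pays a wage w(x).
    \begin{align*}
      \text{Agent:}\quad     & \max_{a}\; \mathbb{E}\big[\,u(w(x)) \mid a\,\big] - c(a) \\
      \text{Principal:}\quad & \max_{w(\cdot)}\; \mathbb{E}\big[\,v(x - w(x)) \mid a\,\big] \\
                             & \text{subject to the agent's participation and incentive constraints.}
    \end{align*}

Because the principal cannot observe the effort a directly, the agent can pursue private benefits at the principal’s expense. On this reading, access-to-information rules work by enlarging the set of outcomes x the principal gets to observe, shrinking the asymmetry that makes moral hazard possible.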
The Economic Principle of Public Information
According to this principal-agent model, public information should be defined as:
information whose social interpretation enables the state to act in the best interests of society. This definition is based on the idea of information for monitoring purposes and uses a systematic approach to feedback. This definition also implies that the state is not entirely effective at adjusting its behavior by itself.
Technically, as an economic principle of public information, public information is:
information whose interpretation by the principal is useful for the agent, so that the latter adjusts its behavior to comply with the principal-agent contract. It should be noted that this is very different from the legal definition of public information, such as “any information produced or held by the state.” This type of legal definition is focused only on supply, but not on demand.
In this principal-agent model, public information stems from two different rationales: the principal’s interpretation and the usefulness for the agent. The measure of the principal’s interpretation is the likelihood of being useful for the agent. The measure of usefulness for the agent is the likelihood of adjusting the principal-agent contract.
Another, totally different situation is the development of institutions that ensure the application of this principle. For example, the channels of supplied and demanded information, and the channels of feedback, could be strengthened so that the social interpretation that is useful for the state actually reaches the public authorities that are able to adjust policies….”

Smart Governance: A Roadmap for Research and Practice


New report by Hans J. Scholl and Margit C. Scholl: “It has been the object of this article to make the case and present a roadmap for the study of the phenomena of smart governance as well as smart and open governance as an enactment of smart governance in practice. As a concept paper, this contribution aimed at sparking interest and at inspiring scholarly and practitioner discourse in this area of study inside the community of electronic government research and practice, and beyond. The roadmap presented here comprises and details seven elements of smart governance along with eight areas of focus in practice.
Smart governance, along with its administrative enactment of smart and open government, it was argued, can help effectively address the three grand challenges to 21st century societal and individual well-being, which are (a) the Third Industrial Revolution with the information revolution at its core, (b) the rapidity of change and the lack of timely and effective government intervention, and (c) expansive government spending and exorbitant public debt financing. Although not seen as a panacea, it was also argued that smart governance principles could guide the relatively complex administrative enactment of smart and open government more intelligently than traditional static and inflexible governance approaches could do.
Since much of the road ahead metaphorically speaking leads through uncharted territory, dedicated research is needed that accompanies projects in this area and evaluates them. Research could further be embedded into practical projects providing for fast and systematic learning. We believe that such embedding of research into smart governance projects should become an integral part of smart projects’ agendas.”

The Problem With Serious Games–Solved


Emerging Technology From the arXiv: “Serious games are becoming increasingly popular, but the inability to generate realistic new content has hampered their progress. Until now.

Here’s an imaginary scenario: you’re a law enforcement officer confronted with John, a 21-year-old male suspect who is accused of breaking into a private house on Sunday evening and stealing a laptop, jewellery and some cash. Your job is to find out whether John has an alibi and if so whether it is coherent and believable.
That’s exactly the kind of scenario that police officers the world over face on a regular basis. But how do you train for such a situation? How do you learn the skills necessary to gather the right kind of information?
An increasingly common way of doing this is with serious games, those designed primarily for purposes other than entertainment. In the last 10 years or so, medical, military and commercial organisations all over the world began to experiment with game-based scenarios that are designed to teach people how to perform their jobs and tasks in realistic situations.
But there is a problem with serious games that require realistic interaction with another person. It’s relatively straightforward to design one or two scenarios that are coherent, lifelike and believable, but it’s much harder to generate them continually on an ongoing basis.
Imagine, in the example above, that John is a computer-generated character. What kind of activities could he describe that would serve as a believable, coherent alibi for Sunday evening? And how could he do it a thousand times, each time describing a different realistic alibi? Therein lies the problem.
Today, Sigal Sina at Bar-Ilan University in Israel, and a couple of pals, say they’ve solved this problem. These guys have come up with a novel way of generating ordinary, realistic scenarios that can be cut and pasted into a serious game to serve exactly this purpose. The secret sauce in their new approach is to crowdsource the new scenarios from real people using Amazon’s Mechanical Turk service.
The approach is straightforward. Sina and co simply ask Turkers to answer a set of questions asking what they did during each one-hour period throughout various days, offering bonuses to those who provide the most varied detail.
They then analyse the answers, categorising activities by factors such as the times they are performed, the age and sex of the person doing it, the number of people involved and so on.
This then allows a computer game to cut and paste activities into the action at appropriate times. So, for example, the computer can select an appropriate alibi for John on a Sunday evening by choosing an activity described by a male Turker for the same time, while avoiding activities that a woman might describe for a Friday morning, which might otherwise seem unbelievable. The computer also changes certain details in the narrative, such as names, locations and so on, to make the narrative coherent with John’s profile….
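A minimal sketch of that selection step might look like the following; the data structure and field names are invented for illustration, since the paper’s actual schema is not given here:

    import random
    from dataclasses import dataclass

    @dataclass
    class Activity:
        """One crowdsourced activity, as reported by a Mechanical Turk worker."""
        description: str   # e.g. "watched a football game at a friend's house"
        day: str           # day of week the Turker reported
        hour: int          # start of the one-hour slot (0-23)
        sex: str           # sex of the reporting Turker
        age: int           # age of the reporting Turker

    def pick_alibi(pool, character_sex, character_age, day, hour, age_tolerance=5):
        """Pick an activity plausible for the character at the given time.

        Filters the crowdsourced pool to activities reported for the same
        day and hour by Turkers demographically similar to the game
        character, then samples one at random so repeated plays yield
        varied alibis.
        """
        candidates = [
            a for a in pool
            if a.day == day and a.hour == hour
            and a.sex == character_sex
            and abs(a.age - character_age) <= age_tolerance
        ]
        if not candidates:
            return None  # fall back to hand-authored content
        return random.choice(candidates)

    # John, 21, needs an alibi for Sunday evening:
    pool = [Activity("played five-a-side football", "Sunday", 19, "male", 22)]
    alibi = pick_alibi(pool, "male", 21, "Sunday", 19)
    print(alibi.description if alibi else "no crowdsourced alibi available")

The design choice worth noting is the random draw over a filtered pool: it is what lets the game present a different yet demographically plausible alibi on every playthrough, which is the gap hand-authored content cannot fill.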
That solves a significant problem with serious games. Until now, developers have had to spend an awful lot of time producing realistic content, a process known as procedural content generation. That’s always been straightforward for things like textures, models and terrain in game settings. Now, thanks to this new crowdsourcing technique, it can be just as easy for human interactions in serious games too.
Ref: arxiv.org/abs/1402.5034: Using the Crowd to Generate Content for Scenario-Based Serious-Games”

Can We Balance Data Protection With Value Creation?


A “privacy perspective” by Sara Degli Esposti: “In the last few years there has been a dramatic change in the opportunities organizations have to generate value from the data they collect about customers or service users. Customers and users are rapidly becoming collections of “data points” and organizations can learn an awful lot from the analysis of this huge accumulation of data points, also known as “Big Data.”

Organizations are perhaps thrilled, dreaming about new potential applications of digital data but also a bit concerned about hidden risks and unintended consequences. Take, for example, the human rights protections placed on personal data by the EU.  Regulators are watching closely, intending to preserve the eight basic privacy principles without compromising the free flow of information.
Some may ask whether it’s even possible to balance the two.
Enter the Big Data Protection Project (BDPP): an Open University study on organizations’ ability to leverage Big Data while complying with EU data protection principles. The study represents a chance for you to contribute to, and learn about, the debate on the reform of the EU Data Protection Directive. It is open to staff with interests in data management or use, from all types of organizations, both for-profit and nonprofit, that have interests in Europe.
Join us by visiting the study’s page on the Open University website. Participants will receive a report with all the results. The BDPP is a scientific project—no commercial organization is involved—with implications relevant to both policy-makers and industry representatives.
What kind of legislation do we need to create that positive system of incentive for organizations to innovate in the privacy field?
There is no easy answer.
That’s why we need to undertake empirical research into actual information management practices to understand the effects of regulation on people and organizations. Legal instruments conceived with the best intentions can be ineffective or detrimental in practice. However, other factors can also intervene and motivate business players to develop procedures and solutions which go far beyond compliance. Good legislation should complement market forces in bringing values and welfare to both consumers and organizations.
Is European data protection law keeping its promise of protecting users’ information privacy while contributing to the flourishing of the digital economy or not? Will the proposed General Data Protection Regulation (GDPR) be able to achieve this goal? What would you suggest to do to motivate organizations to invest in information security and take information privacy seriously?
Let’s consider for a second some basic ideas such as the eight fundamental data protection principles: notice, consent, purpose specification and limitation, data quality, respect of data subjects’ rights, information security and accountability. Many of these ideas are present in the EU 1995 Data Protection Directive, the U.S. Fair Information Practice Principles (FIPPs) and the 1980 OECD Guidelines. The fundamental question now is, should all these ideas be brought into the future, as suggested in the proposed new GDPR, or should we reconsider our approach and revise some of them, as recommended in the 21st century version of the 1980 OECD Guidelines?
As you may know, notice and consent are often taken as examples of how very good intentions can be transformed into actions of limited importance. Rather than increase people’s awareness of the growing data economy, notice and consent have produced a tick-box tendency accompanied by long and unintelligible privacy policies. Besides, consent is rarely freely granted. Individuals give their consent in exchange for some product or service or as part of a job relationship. The imbalance between the two goods traded—think about how youngsters perceive not having access to some social media as a form of social exclusion—and the lack of feasible alternatives often make an instrument, such as the current use made of consent, meaningless.
On the other hand, a principle such as data quality, which has received very limited attention, could offer opportunities to policy-makers and businesses to reopen the debate on users’ control of their personal data. Having updated, accurate data is something very valuable for organizations. Data quality is also key to the success of many business models. New partnerships between users and organizations could be envisioned under this principle.
Finally, data collection limitation and purpose specification could be other examples of the divide between theory and practice: The tendency we see is that people and businesses want to share, merge and reuse data over time and to do new and unexpected things. Of course, we all want to avoid function creep and prevent any detrimental use of our personal data. We probably need new, stronger mechanisms to ensure data are used for good purposes.
Digital data have become economic assets these days. We need good legislation to stop the black market for personal data and open the debate on how each of us wants to contribute to, and benefit from, the data economy.”

Think Like a Commoner: A Short Introduction to the Life of the Commons


New book by David Bollier: “In our age of predatory markets and make-believe democracy, our troubled political institutions have lost sight of real people and practical realities. But if you look to the edges, ordinary people are reinventing governance and provisioning on their own terms. The commons is arising as a serious, practical alternative to the corrupt Market/State.

The beauty of commons is that we can build them ourselves, right now. But the bigger challenge is, Can we learn to see the commons and, more importantly, to think like a commoner?…

The biggest “tragedy of the commons” is the misconception that commons are failures — relics from another era rendered unnecessary by the Market and State. Think Like a Commoner dispels such prejudices by explaining the rich history and promising future of the commons — an ageless paradigm of cooperation and fairness that is re-making our world.
With graceful prose and dozens of fascinating examples, David Bollier describes the quiet revolution that is pioneering practical new forms of self-governance and production controlled by people themselves. Think Like a Commoner explains how the commons:

  • Is an exploding field of DIY innovation ranging from Wikipedia and seed-sharing to community forests and collaborative consumption, and beyond;
  • Challenges the standard narrative of market economics by explaining how cooperation generates significant value and human fulfillment; and
  • Provides a framework of law and social action that can help us move beyond the pathologies of neoliberal capitalism.

We have a choice: Ignore the commons and suffer the ongoing private plunder of our common wealth. Or Think Like a Commoner and learn how to rebuild our society and reclaim our shared inheritance. This accessible, comprehensive introduction to the commons will surprise and enlighten you, and provoke you to action.”

Choosing Not to Choose


New paper by Cass Sunstein: “Choice can be an extraordinary benefit or an immense burden. In some contexts, people choose not to choose, or would do so if they were asked. For example, many people prefer not to make choices about their health or retirement plans; they want to delegate those choices to a private or public institution that they trust (and may well be willing to pay a considerable amount for such delegations). This point suggests that however well-accepted, the line between active choosing and paternalism is often illusory. When private or public institutions override people’s desire not to choose, and insist on active choosing, they may well be behaving paternalistically, through a form of choice-requiring paternalism. Active choosing can be seen as a form of libertarian paternalism, and a frequently attractive one, if people are permitted to opt out of choosing in favor of a default (and in that sense not to choose); it is a form of nonlibertarian paternalism insofar as people are required to choose. For both ordinary people and private or public institutions, the ultimate judgment in favor of active choosing, or in favor of choosing not to choose, depends largely on the costs of decisions and the costs of errors. But the value of learning, and of developing one’s own preferences and values, is also important, and may argue on behalf of active choosing, and against the choice not to choose. For law and policy, these points raise intriguing puzzles about the idea of “predictive shopping,” which is increasingly feasible with the rise of large data sets containing information about people’s previous choices. Some empirical results are presented about people’s reactions to predictive shopping; the central message is that most (but not all) people reject predictive shopping in favor of active choosing.”