How Maps Drive Decisions at EPA


Joseph Marks at NextGov: “The Environmental Protection Agency office charged with taking civil and criminal actions against water and air polluters used to organize its enforcement-targeting meetings and conference calls around spreadsheets and graphs.

[Image caption: The USA National Wetlands Inventory is one of the interactive maps produced by the Geoplatform.gov tool.]

Those spreadsheets detailed places with large oil and gas production and other possible pollutants where EPA might want to focus its own inspection efforts or reach out to state-level enforcement agencies.
During the past two years, the agency has largely replaced those spreadsheets and tables with digital maps, which make it easier for participants to visualize precisely where the top polluting areas are and how those areas correspond to population centers, said Harvey Simon, EPA’s geospatial information officer. That, in turn, has made it easier for the agency to focus inspections and enforcement efforts where they will do the most good.
“Rather than verbally going through tables and spreadsheets you have a lot of people who are not [geographic information systems] practitioners who are able to share map information,” Simon said. “That’s allowed them to take a more targeted and data-driven approach to deciding what to do where.”
The change is a result of the EPA Geoplatform, a tool built on Esri’s ArcGIS Online product, which allows companies and government agencies to build custom Web maps by mashing up their own data with base maps provided by Esri.
When the EPA Geoplatform launched in May 2012, there were about 250 people registered to create and share mapping data within the agency. That number has grown to more than 1,000 during the past 20 months, Simon said.
“The whole idea of the platform effort is to democratize the use of geospatial information within the agency,” he said. “It’s relatively simple now to make a Web map and mash up data that’s useful for your work, so many users are creating Web maps themselves without any support from a consultant or from a GIS expert in their office.”
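Today, a workflow like the one Simon describes (uploading a dataset and mashing it up with an Esri base map) can be sketched with Esri’s ArcGIS API for Python. This is a minimal sketch; the credentials, CSV file name, and item titles are hypothetical placeholders, not EPA’s actual configuration.

```python
# Minimal sketch, assuming Esri's ArcGIS API for Python (`arcgis` package)
# and an ArcGIS Online account. Credentials, file name, and item titles
# are hypothetical placeholders, not EPA's actual configuration.
from arcgis.gis import GIS

gis = GIS("https://www.arcgis.com", "your_username", "your_password")

# Add a CSV of inspection sites as an item, then publish it as a hosted layer.
csv_item = gis.content.add(
    {"title": "Facility inspections (example)",
     "type": "CSV",
     "tags": "enforcement, example"},
    data="facility_inspections.csv",
)
layer_item = csv_item.publish()

# Mash the new layer up with an Esri base map in a web map widget.
webmap = gis.map("United States")
webmap.add_layer(layer_item)
```
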
A governmentwide Geoplatform launched in 2012, spurred largely by agencies’ frustrations with the difficulty of sharing mapping data after the 2010 explosion of the Deepwater Horizon oil rig in the Gulf of Mexico. The platform’s goal was twofold. First, officials wanted to share mapping data more widely between agencies so they could avoid duplicating each other’s work and could share data more easily during an emergency.
Second, the government wanted to simplify the process for viewing and creating Web maps so they could be used more easily by nonspecialists.
EPA’s geoplatform has essentially the same goals. The majority of the maps the agency builds using the platform aren’t publicly accessible, so the EPA doesn’t have to worry about scrubbing maps of data that could reveal personal information about citizens or proprietary data about companies. It publishes some maps that don’t pose any privacy concerns on EPA websites as well as on the national geoplatform and on Data.gov, the government data repository.
Once ArcGIS Online is judged compliant with the Federal Information Security Management Act, or FISMA, which is expected this month, EPA will be able to share significantly more nonpublic maps through the national geoplatform and rely on more maps produced by other agencies, Simon said.
EPA’s geoplatform has also made it easier for the agency’s environmental justice office to share common data….”

Open Data is a Civil Right


Yo Yoshida, Founder & CEO, Appallicious in GovTech: “As Americans, we expect a certain standardization of basic services, infrastructure and laws — no matter where we call home. When you live in Seattle and take a business trip to New York, the electric outlet in the hotel you’re staying in is always compatible with your computer charger. When you drive from San Francisco to Los Angeles, I-5 doesn’t all-of-a-sudden turn into a dirt country road because some cities won’t cover maintenance costs. If you take a 10-minute bus ride from Boston to the city of Cambridge, you know the money in your wallet is still considered legal tender.

But what if these expectations of consistency were not always a given? What if cities, counties and states had absolutely zero coordination when it came to basic services? This is what it is like for us in the open data movement. There are so many important applications and products that have been built by civic startups and concerned citizens. However, all too often these efforts are confined to city limits, and unavailable to anyone outside of them. It’s time to start reimagining the way cities function and how local governments operate. There is a wealth of information housed in local governments that should be public by default to help fuel a new wave of civic participation.
Appallicious’ Neighborhood Score provides an overall health and sustainability score, block-by-block, for every neighborhood in the city of San Francisco. It is the first time metrics have been applied to neighborhoods so we can judge how government allocates our resources and better plan how to move forward. But if you’re thinking about moving to Oakland, just a subway stop away from San Francisco, and want to see the score for a neighborhood, our app can’t help you, because that city has yet to release the data sets we need.
In Contra Costa County, there is the lifesaving PulsePoint app, which notifies smartphone users who are trained in CPR when someone nearby may be in need of help. This is an amazing app—for residents of Contra Costa County. But if someone in neighboring Alameda County needs CPR, the app, unfortunately, is completely useless.
Buildingeye visualizes planning and building permit data to allow users to see what projects are being proposed in their area or city. However, buildingeye is only available in a handful of places, simply because most cities have yet to make permits publicly available. Think about what this could do for the construction sector — an industry that has millions of jobs for Americans. Buildingeye also gives concerned citizens access to public documents like never before, so they can see what might be built in their cities or on their streets.
Along with other open data advocates, I have been going from city-to-city, county-to-county and state-to-state, trying to get governments and departments to open up their massive amounts of valuable data. Each time one city, or one county, agrees to make their data publicly accessible, I can’t help but think it’s only a drop in the bucket. We need to think bigger.
Every government, every agency and every department in the country that has already released this information to the public is a case study that points to the success of open data — and why every public entity should follow their lead. There needs to be a national referendum instructing that all government data be open and accessible to the public.
Last May, President Obama issued an executive order requiring that going forward, any data generated by the federal government must be made available to the public in open, machine-readable formats. In the executive order, Obama stated that, “openness in government strengthens our democracy, promotes the delivery of efficient and effective services to the public, and contributes to economic growth.”
If this is truly the case, Washington has an obligation to compel local and state governments to release their data as well. Many have tried to spur this effort. California Lt. Gov. Gavin Newsom created the Citizenville Challenge to speed up adoption on the local level. The U.S. Conference of Mayors has also been vocal in promoting open data efforts. But none of these initiatives could have the same effect as a federal mandate.
What I am proposing is no small feat, and it won’t happen overnight. But there should be a concerted effort by those in the technology industry, specifically civic startups, to call on Congress to draft legislation that would require every city in the country to make its data open, free and machine readable. Passing federal legislation will not be an easy task — but creating a “universal open data” law is possible. It would require little to no funding, and it is completely nonpartisan. It’s actually not a political issue at all; it is, for lack of a better word, an administrative issue.
Often good legislation is blocked because lawmakers and citizens are concerned about project funding. While there should be support to help cities and towns build the capability to open their data, a lot of the time they don’t need it. In 2009, the city and county of San Francisco opened up its data with zero dollars. Many other cities have done the same. There will be cities and municipalities that need financial assistance to accomplish this. But it is worth it, and it will not require a significant investment for a substantial return. There are free online open data portals, like CKAN, DKAN and a new effort from Accela, CivicData.com, to centralize open data efforts.
When the UK Government recently announced a £1.5 million investment to support open data initiatives, its Cabinet Office Minister said, “We know that it creates a more accountable, efficient and effective government. Open Data is a raw material for economic growth, supporting the creation of new markets, business and jobs and helping us compete in the global race.”
We should not fall behind these efforts. There is too much at stake for our citizens, not to mention our economy. A recent McKinsey report found that open data has the potential to create $3 trillion in value worldwide.
Former Speaker Tip O’Neill famously said, “all politics is local.” But we in the civic startup space believe all data is local. Data is reporting potholes in your neighborhood and identifying high-crime areas in your communities. It’s seeing how many farmers’ markets there are in your town compared to liquor stores. Data helps predict which areas of a city are most at risk during a heat wave or other natural disaster. A federal open data law would give the raw material needed to create tools to improve the lives of all Americans, not just those who are lucky enough to live in a city that has released this information on its own.
It’s a different way of thinking about how a government operates and the relationship it has with its citizens. Open data gives the public an amazing opportunity to be more involved with governmental decisions. We can increase accountability and transparency, but most importantly we can revolutionize the way local residents communicate and work with their government.
Access to this data is a civil right. If this is truly a government by, of and for the people, then its data needs to be available to all of us. By opening up this wealth of information, we will design a better government that takes advantage of the technology and skills of civic startups and innovative citizens….”

Social Media as Government Watchdog


Gordon Crovitz in the Wall Street Journal: “Two new data points for the debate on whether greater access to the Internet leads to more freedom and fewer authoritarian regimes:

According to reports last week, Facebook plans to buy a company that makes solar-powered drones that can hover for years at high altitudes without refueling, which it would use to bring the Internet to parts of the world not yet on the grid. In contrast to this futuristic vision, Russia evoked land grabs of the analog Soviet era by invading Crimea after Ukrainians forced out Vladimir Putin’s ally as president.
Internet idealists can point to another triumph in helping bring down Ukraine’s authoritarian government. Ukrainian citizens ignored intimidation including officious text messages: “Dear subscriber, you are registered as a participant in a mass disturbance.” Protesters made the most of social media to plan demonstrations and avoid attacks by security forces.
But Mr. Putin quickly delivered the message that social media only goes so far against a fully committed authoritarian. His claim that he had to invade to protect ethnic Russians in Crimea was especially brazen because there had been no loud outcry, on social media or otherwise, among Russian speakers in the region.
A new book reports the state of play on the Internet as a force for freedom. For a decade, Emily Parker, a former Wall Street Journal editorial-page writer and State Department staffer, has researched the role of the Internet in China, Cuba and Russia. The title of her book, “Now I Know Who My Comrades Are,” comes from a blogger in China who explained to Ms. Parker how the Internet helps people discover they are not alone in their views and aspirations for liberty.
Officials in these countries work hard to keep critics isolated and in fear. In Russia, Ms. Parker notes, there is also apathy because the Putin regime seems so entrenched. “Revolutions need a spark, often in the form of a political or economic crisis,” she observes. “Social media alone will not light that spark. What the Internet does create is a new kind of citizen: networked, unafraid, and ready for action.”
Asked about lessons from the invasion of Crimea, Ms. Parker noted that the Internet “chips away at Russia’s control over information.” She added: “Even as Russian state media tries to shape the narrative about Ukraine, ordinary Russians can go online to seek the truth.”
But this same shared awareness may also be accelerating a decline in U.S. influence. In the digital era, U.S. failure to make good on its promises reduces the stature of Washington faster than similar inaction did in the past.
Consider the Hungarian uprising of 1956, the first significant rebellion against Soviet control. The U.S. secretary of state, John Foster Dulles, said: “To all those suffering under communist slavery, let us say you can count on us.” Yet no help came as Soviet tanks rolled into Budapest, tens of thousands were killed, and the leader who tried to secede from the Warsaw Pact, Imre Nagy, was executed.
There were no Facebook posts or YouTube videos instantly showing the result of U.S. fecklessness. In the digital era, scenes of Russian occupation of Crimea are available 24/7. People can watch Mr. Putin’s brazen press conferences and see for themselves what he gets away with.
The U.S. stood by as Syrian civilians were massacred and gassed. There was instant global awareness when President Obama last year backed down from enforcing his “red line” when the Syrian regime used chemical weapons. American inaction in Syria gave a green light to Mr. Putin and others around the world to act with impunity.
Just in recent weeks, Iran tried to ship Syrian rockets to Gaza to attack Israel; Moscow announced it would use bases in Cuba, Venezuela and Nicaragua for its navy and bombers; and China budgeted a double-digit increase in military spending as President Obama cut back the U.S. military.
All institutions are more at risk in this era of instant communication and awareness. Reputations get lost quickly, whether it’s a misstep by a company, a gaffe by a politician, or a lack of resolve by an American president.
Over time, the power of the Internet to bring people together will help undermine authoritarian governments. But as Mr. Putin reminds us, in the short term a peaceful world depends more on a U.S. resolute in using its power and influence to deter aggression.”

Doctors’ #1 Source for Healthcare Information: Wikipedia


Article By Julie Beck in the Atlantic: “In spite of all of our teachers’ and bosses’ warnings that it’s not a trustworthy source of information, we all rely on Wikipedia. Not only when we can’t remember the name of that guy from that movie, which is a fairly low-risk use, but also when we find a weird rash or are just feeling a little off and we’re not sure why. One in three Americans have tried to diagnose a medical condition with the help of the Internet, and a new report says doctors are just as drawn to Wikipedia’s flickering flame.
According to the IMS Institute for Healthcare Informatics’ “Engaging patients through social media” report, Wikipedia is the top source of healthcare information for both doctors and patients. Fifty percent of physicians use Wikipedia for information, especially for specific conditions.
Generally, more people turn to Wikipedia for rare diseases than common conditions. The top five conditions looked up on the site over the past year were: tuberculosis, Crohn’s disease, pneumonia, multiple sclerosis, and diabetes. Patients tend to use Wikipedia as a “starting point for their online self education,” the report says. It also found a “direct correlation between Wikipedia page visits and prescription volumes.”
We already knew that more and more people were turning to the Internet in general and Wikipedia specifically for health information, and we could hardly stop them if we tried.

Being crowd-sourced, the information may well be neutral, but is it accurate? Knowing that doctors, too, are using these resources raises old concerns about the quality of information that comes up when you type your condition into Google.
But doctors are aware of this, and an effort called Wikiproject Medicine is dedicated to improving the quality of medical information on Wikipedia. The IMS report looked at changes to five articles—diabetes, multiple sclerosis, rheumatoid arthritis, breast cancer and prostate cancer—and found them to be in a state of constant flux. Those articles were changed, on average, between 16 and 46 times a month. But one of the major contributors to those articles was Dr. James Heilman, the founder of Wikiproject Medicine’s Medicine Translation task force.
“This task force’s goal is getting 200 medical articles to a good or featured status (only 0.1 percent of articles on Wikipedia have this status), simplifying the English and then translating this content to as many languages as possible,” the report says. “The aim is to improve the quality of the most read medical articles on Wikipedia and ensure that this quality will reach non-English speakers.”…”
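The monthly revision counts the IMS report cites (16 to 46 edits per article per month) can be approximated directly from Wikipedia’s public MediaWiki API. The sketch below, offered as a minimal illustration, counts edits to one of the five articles over a single month; the article title and date window are illustrative choices, and pagination beyond 500 revisions is ignored.

```python
# Minimal sketch: count revisions to one Wikipedia article in one month
# via the public MediaWiki API. The article title and date range are
# illustrative; pagination beyond 500 revisions is ignored for brevity.
import requests

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Multiple sclerosis",
    "rvprop": "timestamp",
    "rvlimit": "500",                    # API maximum per request
    "rvstart": "2014-01-31T23:59:59Z",   # newer bound (listing runs newer to older)
    "rvend": "2014-01-01T00:00:00Z",     # older bound
    "format": "json",
}

data = requests.get(API, params=params).json()
page = next(iter(data["query"]["pages"].values()))
print(len(page.get("revisions", [])), "revisions in January 2014")
```
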

New Research Network to Study and Design Innovative Ways of Solving Public Problems


MacArthur Foundation Research Network on Opening Governance formed to gather evidence and develop new designs for governing 

NEW YORK, NY, March 4, 2014: The Governance Lab (The GovLab) at New York University today announced the formation of a Research Network on Opening Governance, which will seek to develop blueprints for more effective and legitimate democratic institutions to help improve people’s lives.
Convened and organized by the GovLab, the MacArthur Foundation Research Network on Opening Governance is made possible by a three-year grant of $5 million from the John D. and Catherine T. MacArthur Foundation as well as a gift from Google.org, which will allow the Network to tap the latest technological advances to further its work.
Combining empirical research with real-world experiments, the Research Network will study what happens when governments and institutions open themselves to diverse participation, pursue collaborative problem-solving, and seek input and expertise from a range of people. Network members include twelve experts (see below) in computer science, political science, policy informatics, social psychology and philosophy, law, and communications. This core group is supported by an advisory network of academics, technologists, and current and former government officials. Together, they will assess existing innovations in governing and experiment with new practices and how institutions make decisions at the local, national, and international levels.
Support for the Network from Google.org will be used to build technology platforms to solve problems more openly and to run agile, real-world, empirical experiments with institutional partners such as governments and NGOs to discover what can enhance collaboration and decision-making in the public interest.
The Network’s research will be complemented by theoretical writing and compelling storytelling designed to articulate and demonstrate clearly and concretely how governing agencies might work better than they do today. “We want to arm policymakers and practitioners with evidence of what works and what does not,” says Professor Beth Simone Noveck, Network Chair and author of Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, “which is vital to drive innovation, re-establish legitimacy and more effectively target scarce resources to solve today’s problems.”
“From prize-backed challenges to spur creative thinking to the use of expert networks to get the smartest people focused on a problem no matter where they work, this shift from top-down, closed, and professional government to decentralized, open, and smarter governance may be the major social innovation of the 21st century,” says Noveck. “The MacArthur Research Network on Opening Governance is the ideal crucible for helping the transition from closed and centralized to open and collaborative institutions of governance in a way that is scientifically sound and yields new insights to inform future efforts, always with an eye toward real-world impacts.”
MacArthur Foundation President Robert Gallucci added, “Recognizing that we cannot solve today’s challenges with yesterday’s tools, this interdisciplinary group will bring fresh thinking to questions about how our governing institutions operate, and how they can develop better ways to help address seemingly intractable social problems for the common good.”
Members
The MacArthur Research Network on Opening Governance comprises:
Chair: Beth Simone Noveck
Network Coordinator: Andrew Young
Chief of Research: Stefaan Verhulst
Faculty Members:

  • Sir Tim Berners-Lee (Massachusetts Institute of Technology (MIT)/University of Southampton, UK)
  • Deborah Estrin (Cornell Tech/Weill Cornell Medical College)
  • Erik Johnston (Arizona State University)
  • Henry Farrell (George Washington University)
  • Sheena S. Iyengar (Columbia Business School/Jerome A. Chazen Institute of International Business)
  • Karim Lakhani (Harvard Business School)
  • Anita McGahan (University of Toronto)
  • Cosma Shalizi (Carnegie Mellon/Santa Fe Institute)

Institutional Members:

  • Christian Bason and Jesper Christiansen (MindLab, Denmark)
  • Geoff Mulgan (National Endowment for Science, Technology and the Arts – NESTA, United Kingdom)
  • Lee Rainie (Pew Research Center)

The Network is eager to hear from and engage with the public as it undertakes its work. Please contact Stefaan Verhulst to share your ideas or identify opportunities to collaborate.”

State


New social media platform called “State”: The simplest way to get your opinions heard. Just state about whatever matters to you, get counted and instantly see where you stand. When everyone’s opinion counts, the full picture emerges. This could make good things happen…
We set up State because, at the moment, most people never get heard. So we’re levelling the playing field for everyone by allowing them to express their opinions quickly and delivering them to the people who most need to hear them.
State lets people communicate in a lucid, non-competitive way. It’s a place where you don’t need hashtags, followers, or fame, just an opinion. The solution we lit upon was at the convergence of design simplicity and semantic intelligence. It allows people to express opinions in a quick and fun way that also provides enough information to interpret, count, and connect them.
For those in positions of leadership or influence, State offers the first many-to-one capability that can precisely map the prevailing sentiment on key issues. These are opinions shared spontaneously, not extracted from a survey.
We believe that everyone deserves a powerful voice online, no one should be left out, and when everyone’s opinions count, a more complete picture emerges. We firmly believe that this could make good things happen.

The Economics of Access to Information


Article by Mariano Mosquera at Edmond J. Safra Research Lab: “There has been an important development in the study of the right of access to public information and the so-called economics of information: by combining these two premises, it is possible to outline an economic theory of access to public information.


Moral Hazard
The legal development of the right of access to public information has been remarkable. Many international conventions, laws and national regulations have been passed on this matter. In this regard, access to information has consolidated within the framework of international human rights law.
The Inter-American Court of Human Rights was the first international court to acknowledge that access to information is a human right that is part of the right to freedom of speech. The Court recognized this right in two parts, as the individual right of any person to search for information and as a positive obligation of the state to ensure the individual’s right to receive the requested information.
This right and obligation can also be seen as the demand and supply of information.
The so-called economics of information has focused on the issue of information asymmetry between the principal and the agent. The principal (society) and the agent (state) enter into a contract. This contract is based on the idea that the agent’s specialization and professionalism (or the politician’s, according to Weber) enables him to attend to the principal’s affairs, such as public affairs in this case. This representation contract does not provide for a complete delegation, but rather it involves the principal’s commitment to monitoring the agent.
When we study corruption, it is important to note that monitoring aims to ensure that the agent adjusts its behavior to comply with the contract, in order to pursue public goals, and not to serve private interests. Stiglitz describes moral hazard as a situation arising from information asymmetry between the principal and the agent. The principal takes a risk when acting without comprehensive information about the agent’s actions. Moral hazard means that the agent’s handling of closed, privileged information could bring about negative consequences for the principal.
In this case, it is a risk related to corrupt practices, since a public official could use the state’s power and information to achieve private benefits, and not to resolve public issues in accordance with the principal-agent contract. This creates negative social consequences.
In this model, there are a number of safeguards against moral hazard, such as monitoring institutions (with members of the opposition) and rewards for efficient and effective administration, among others. Access to public information could also serve as an effective means of monitoring the agent, so that the agent adjusts its behavior to comply with the contract.
The Economic Principle of Public Information
According to this principal-agent model, public information should be defined as:
information whose social interpretation enables the state to act in the best interests of society. This definition is based on the idea of information for monitoring purposes and uses a systematic approach to feedback. This definition also implies that the state is not entirely effective at adjusting its behavior by itself.
Technically, as an economic principle of public information, public information is:
information whose interpretation by the principal is useful for the agent, so that the latter adjusts its behavior to comply with the principal-agent contract. It should be noted that this is very different from the legal definition of public information, such as “any information produced or held by the state.” This type of legal definition is focused only on supply, but not on demand.
In this principal-agent model, public information stems from two different rationales: the principal’s interpretation and the usefulness for the agent. The measure of the principal’s interpretation is the likelihood of being useful for the agent. The measure of usefulness for the agent is the likelihood of adjusting the principal-agent contract.
Another totally different situation is the development of institutions that ensure the application of this principle. For example, the channels of supplied and demanded information, and the channels of feedback, could be strengthened so that the social interpretation that is useful for the state actually reaches the public authorities that are able to adjust policies….”

This algorithm can predict a revolution


Russell Brandom at the Verge: “For students of international conflict, 2013 provided plenty to examine. There was civil war in Syria, ethnic violence in China, and riots to the point of revolution in Ukraine. For those working at Duke University’s Ward Lab, all specialists in predicting conflict, the year looks like a betting sheet, full of predictions that worked and others that didn’t pan out.

When the lab put out their semiannual predictions in July, they gave Paraguay a 97 percent chance of insurgency, largely based on reports of Marxist rebels. The next month, guerrilla campaigns intensified, proving out the prediction. In the case of China’s armed clashes between Uighurs and Hans, the models showed a 33 percent chance of violence, even as the cause of each individual flare-up was concealed by the country’s state-run media. On the other hand, the unrest in Ukraine didn’t start raising alarms until the action had already started, so the country was left off the report entirely.

According to Ward Lab’s staff, the purpose of the project isn’t to make predictions but to test theories. If a certain theory of geopolitics can predict an uprising in Ukraine, then maybe that theory is onto something. And even if these specialists could predict every conflict, it would only be half the battle. “It’s a success only if it doesn’t come at the cost of predicting a lot of incidents that don’t occur,” says Michael D. Ward, the lab’s founder and chief investigator, who also runs the blog Predictive Heuristics. “But it suggests that we might be on the right track.”

Forecasting the future of a country wasn’t always done this way. Traditionally, predicting revolution or war has been a secretive project, for the simple reason that any reliable prediction would be too valuable to share. But as predictions lean more on data, they’ve actually become harder to keep secret, ushering in a new generation of open-source prediction models that butt against the siloed status quo.

The story of automated conflict prediction starts at the Defense Advanced Research Projects Agency, known as the Pentagon’s R&D wing. In the 1990s, DARPA wanted to try out software-based approaches to anticipating which governments might collapse in the near future. The CIA was already on the case, with section chiefs from every region filing regular forecasts, but DARPA wanted to see if a computerized approach could do better. They looked at a simple question: will this country’s government face an acute existential threat in the next six months? When CIA analysts were put to the test, they averaged roughly 60 percent accuracy, so DARPA’s new system set the bar at 80 percent, looking at 29 different countries in Asia with populations over half a million. It was dubbed ICEWS, the Integrated Conflict Early Warning System, and it succeeded almost immediately, clearing 80 percent with algorithms built on simple regression analysis….
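As a toy illustration of what regression-based forecasting of this kind involves, the sketch below fits a logistic regression that maps country-month features to the probability of an acute threat within six months. The features, coefficients, and data are synthetic placeholders, not the actual ICEWS model or inputs.

```python
# Toy illustration of regression-based conflict forecasting in the spirit
# of ICEWS. Features, coefficients, and labels are synthetic placeholders,
# not the real model or data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Hypothetical country-month features.
X = np.column_stack([
    rng.poisson(3, n),          # protest event counts
    rng.normal(2.0, 3.0, n),    # GDP growth, percent
    rng.normal(40, 15, n),      # infant mortality per 1,000 births
    rng.exponential(60, n),     # months since last leadership change
])

# Synthetic label: acute existential threat within six months.
logits = 0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * X[:, 2] - 0.01 * X[:, 3] - 2.5
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Out-of-sample accuracy, the kind of benchmark DARPA set at 80 percent.
print("accuracy:", round(model.score(X_test, y_test), 2))
```
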

On the data side, researchers at Georgetown University are cataloging every significant political event of the past century into a single database called GDELT, and leaving the whole thing open for public research. Already, projects have used it to map the Syrian civil war and diplomatic gestures between Japan and South Korea, looking at dynamics that had never been mapped before. And then, of course, there’s Ward Lab, releasing a new sheet of predictions every six months and tweaking its algorithms with every development. It’s a mirror of the same open-vs.-closed debate in software — only now, instead of fighting over source code and security audits, it’s a fight over who can see the future the best.”
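Because GDELT’s event files are openly downloadable, a first pass at this kind of analysis (for example, counting protest events geolocated to Syria on a given day) takes only a few lines. The sketch below follows the public GDELT 1.0 event codebook, but treat the exact file URL pattern and column positions as assumptions to verify against the current documentation.

```python
# Minimal sketch: download one day of GDELT 1.0 events and count protest
# events (CAMEO root code 14) geolocated to Syria. The URL pattern and
# column positions follow the public codebook; verify before relying on them.
import io
import zipfile

import pandas as pd
import requests

DAY = "20140301"  # illustrative date
URL = f"http://data.gdeltproject.org/events/{DAY}.export.CSV.zip"

resp = requests.get(URL, timeout=60)
with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
    raw = zf.read(zf.namelist()[0])

# GDELT 1.0 daily files are tab-separated with no header row; only the
# columns used here are picked out by position, per the event codebook.
cols = {0: "GlobalEventID", 1: "SQLDATE", 28: "EventRootCode",
        34: "AvgTone", 51: "ActionGeo_CountryCode"}
df = pd.read_csv(io.BytesIO(raw), sep="\t", header=None, dtype=str,
                 usecols=list(cols)).rename(columns=cols)

protests_in_syria = df[(df["EventRootCode"] == "14") &
                       (df["ActionGeo_CountryCode"] == "SY")]
print(len(protests_in_syria), "protest events geolocated to Syria on", DAY)
```
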

OpenRFPs


Clay Johnson: “Tomorrow at CodeAcross we’ll be launching our first community-based project, OpenRFPs. The goal is to liberate the data inside of every state RFP listing website in the country. We hope you’ll find your own state’s RFP site, and contribute a parser.
The Department of Better Technology’s goal is to improve the way government works by making it easier for small, innovative businesses to provide great technology to government. But those businesses can barely make it through the front door when the RFPs themselves are stored in archaic systems, with sloppy user interfaces and disparate data formats, or locked behind paywalls.
What are a business’s current choices? They can pay for subscription-based services that are either shady vendors disguised as officious-sounding associations, or that have less-than-stellar user interfaces.
The sum of these experiences? The barrier to entry is so high that the small companies who are great at technology are effectively turned away at the door. The only ones who get through are those that can afford to spend employee time scouring for the right RFPs.
But the harm to business isn’t the big harm. Government contracting becomes an area overlooked by the transparency community (especially at the state and local level), because the data isn’t available en masse unless it’s paid for commercially, which comes with a restrictive usage license.”
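A state RFP parser in the spirit of OpenRFPs can be quite small. The sketch below, offered as a hedged example, scrapes a hypothetical state procurement listing into uniform records; the URL and CSS selectors are placeholders, since every real state site needs its own.

```python
# Hedged sketch of an OpenRFPs-style parser: scrape one state's RFP listing
# page into uniform records. The URL and CSS selectors are hypothetical;
# every real state site needs its own versions of them.
import csv

import requests
from bs4 import BeautifulSoup

LISTING_URL = "https://procurement.example-state.gov/rfps"  # placeholder

def parse_rfps(url):
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rfps = []
    for row in soup.select("table.rfp-listing tbody tr"):  # assumed markup
        cells = row.find_all("td")
        if len(cells) < 4:
            continue
        link = cells[1].find("a")
        rfps.append({
            "id": cells[0].get_text(strip=True),
            "title": cells[1].get_text(strip=True),
            "agency": cells[2].get_text(strip=True),
            "due_date": cells[3].get_text(strip=True),
            "url": link["href"] if link else "",
        })
    return rfps

if __name__ == "__main__":
    records = parse_rfps(LISTING_URL)
    with open("rfps.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "title", "agency", "due_date", "url"])
        writer.writeheader()
        writer.writerows(records)
```
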

Think Like a Commoner: A Short Introduction to the Life of the Commons


New book by David Bollier: “In our age of predatory markets and make-believe democracy, our troubled political institutions have lost sight of real people and practical realities. But if you look to the edges, ordinary people are reinventing governance and provisioning on their own terms. The commons is arising as a serious, practical alternative to the corrupt Market/State.

The beauty of the commons is that we can build them ourselves, right now. But the bigger challenge is: can we learn to see the commons and, more importantly, to think like a commoner?…

The biggest “tragedy of the commons” is the misconception that commons are failures — relics from another era rendered unnecessary by the Market and State. Think Like a Commoner dispels such prejudices by explaining the rich history and promising future of the commons — an ageless paradigm of cooperation and fairness that is re-making our world.
With graceful prose and dozens of fascinating examples, David Bollier describes the quiet revolution that is pioneering practical new forms of self-governance and production controlled by people themselves. Think Like a Commoner explains how the commons:

  • Is an exploding field of DIY innovation ranging from Wikipedia and seed-sharing to community forests and collaborative consumption, and beyond;
  • Challenges the standard narrative of market economics by explaining how cooperation generates significant value and human fulfillment; and
  • Provides a framework of law and social action that can help us move beyond the pathologies of neoliberal capitalism.

We have a choice: Ignore the commons and suffer the ongoing private plunder of our common wealth. Or Think Like a Commoner and learn how to rebuild our society and reclaim our shared inheritance. This accessible, comprehensive introduction to the commons will surprise and enlighten you, and provoke you to action.”