The solutions to all our problems may be buried in PDFs that nobody reads


Christopher Ingraham at the Washington Post: “What if someone had already figured out the answers to the world’s most pressing policy problems, but those solutions were buried deep in a PDF, somewhere nobody will ever read them?
According to a recent report by the World Bank, that scenario is not so far-fetched. The bank is one of those high-minded organizations — Washington is full of them — that release hundreds, maybe thousands, of reports a year on policy issues big and small. Many of these reports are long and highly technical, and just about all of them get released to the world as a PDF report posted to the organization’s Web site.
The World Bank recently decided to ask an important question: Is anyone actually reading these things? They dug into their Web site traffic data and came to the following conclusions: Nearly one-third of their PDF reports had never been downloaded, not even once. Another 40 percent of their reports had been downloaded fewer than 100 times. Only 13 percent had seen more than 250 downloads in their lifetimes. Since most World Bank reports have a stated objective of informing public debate or government policy, this seems like a pretty lousy track record.
Now, granted, the bank isn’t Buzzfeed. It wouldn’t be reasonable to expect thousands of downloads for reports with titles like “Detecting Urban Expansion and Land Tenure Security Assessment: The Case of Bahir Dar and Debre Markos Peri-Urban Areas of Ethiopia.” Moreover, downloads aren’t the be-all and end-all of information dissemination; many of these reports probably get some distribution by e-mail, or are printed and handed out at conferences. Still, it’s fair to assume that many big-idea reports with lofty goals to elevate the public discourse never get read by anyone other than the report writer and maybe an editor or two. Maybe the author’s spouse. Or mom.
I’m not picking on the World Bank here. In fact, they’re to be commended, strongly, for not only taking a serious look at the question but making their findings public for the rest of us to learn from. And don’t think for a second that this is just a World Bank problem. PDF reports are basically the bread and butter of Washington’s huge think tank industry, for instance. Every single one of these groups should be taking a serious look at their own PDF analytics the way the bank has.
Government agencies are also addicted to the PDF. As The Washington Post’s David Fahrenthold reported this week, federal agencies spend thousands of dollars and employee-hours each year producing Congressionally-mandated reports that nobody reads. And let’s not even get started on the situation in academia, where the country’s best and brightest compete for the honor of seeing their life’s work locked away behind some publisher’s paywall.”
Not every policy report is going to be a game-changer, of course. But the sheer numbers dictate that there are probably a lot of really, really good ideas out there that never see the light of day. This seems like an inefficient way for the policy community to do business, but what’s the alternative?
One final irony to ponder: You know that World Bank report, about how nobody reads its PDFs? It’s only available as a PDF. Given the attention it’s receiving, it may also be one of their most-downloaded reports ever.

Continued Progress and Plans for Open Government Data


Steve VanRoekel and Todd Park at the White House: “One year ago today, President Obama signed an executive order that made open and machine-readable data the new default for government information. This historic step is helping to make government-held data more accessible to the public and to entrepreneurs while appropriately safeguarding sensitive information and rigorously protecting privacy.
Freely available data from the U.S. government is an important national resource, serving as fuel for entrepreneurship, innovation, scientific discovery, and economic growth. Making information about government operations more readily available and useful is also core to the promise of a more efficient and transparent government. This initiative is a key component of the President’s Management Agenda and our efforts to ensure the government is acting as an engine to expand economic growth and opportunity for all Americans. The Administration is committed to driving further progress in this area, including by designating Open Data as one of our key Cross-Agency Priority Goals.
Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the Health, Energy, Climate, Education, Finance, Public Safety, and Global Development sectors. The White House has also launched Project Open Data, designed to share best practices, examples, and software code to assist federal agencies with opening data. These efforts have helped unlock troves of valuable data—that taxpayers have already paid for—and are making these resources more open and accessible to innovators and the public.
Other countries are also opening up their data. In June 2013, President Obama and other G7 leaders endorsed the Open Data Charter, in which the United States committed to publish a roadmap for our nation’s approach to releasing and improving government data for the public.
Building upon the Administration’s Open Data progress, and in fulfillment of the Open Data Charter, today we are excited to release the U.S. Open Data Action Plan. The plan includes a number of exciting enhancements and new data releases planned in 2014 and 2015, including:

  • Small Business Data: The Small Business Administration’s (SBA) database of small business suppliers will be enhanced so that software developers can create tools to help manufacturers more easily find qualified U.S. suppliers, ultimately reducing the transaction costs to source products and manufacture domestically.
  • Smithsonian American Art Museum Collection: The Smithsonian American Art Museum’s entire digitized collection will be opened to software developers to make educational apps and tools. Today, even museum curators do not have easily accessible information about their art collections. This information will soon be available to everyone.
  • FDA Adverse Drug Event Data: Each year, healthcare professionals and consumers submit millions of individual reports on drug safety to the Food and Drug Administration (FDA). These anonymous reports are a critical tool to support drug safety surveillance. Today, this data is only available through limited quarterly reports. But the Administration will soon be making these reports available in their entirety so that software developers can build tools to help pull potentially dangerous drugs off shelves faster than ever before.

We look forward to implementing the U.S. Open Data Action Plan, and to continuing to work with our partner countries in the G7 to take the open data movement global.”
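The FDA adverse-event release described above lends itself to exactly the kind of tooling the plan envisions. A minimal sketch of how a developer might flag drugs accumulating serious reports; the report records, field names, and threshold here are all invented for illustration, not the FDA's actual data format:

```python
from collections import Counter

# Hypothetical, simplified adverse-event reports; real FDA records carry
# far more fields (drug names, reactions, outcomes, dates, and more).
reports = [
    {"drug": "DrugA", "reaction": "nausea", "serious": False},
    {"drug": "DrugB", "reaction": "arrhythmia", "serious": True},
    {"drug": "DrugB", "reaction": "arrhythmia", "serious": True},
    {"drug": "DrugB", "reaction": "dizziness", "serious": True},
    {"drug": "DrugC", "reaction": "headache", "serious": False},
]

def flag_drugs(reports, min_serious=2):
    """Flag drugs whose count of serious reports meets a threshold."""
    serious = Counter(r["drug"] for r in reports if r["serious"])
    return sorted(drug for drug, n in serious.items() if n >= min_serious)

print(flag_drugs(reports))  # ['DrugB']
```

With the full report stream open, a watchlist like this could be recomputed continuously rather than once a quarter.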

United States federal government use of crowdsourcing grows six-fold since 2011


at E Pluribus Unum: “Citizensourcing and open innovation can work in the public sector, just as crowdsourcing can in the private sector. Around the world, the use of prizes to spur innovation has been booming for years. The United States of America has been significantly scaling up its use of prizes and challenges to solve grand national challenges since January 2011, when President Obama signed an updated version of the America COMPETES Act into law.
According to the third congressionally mandated report released by the Obama administration today (PDF/Text), the number of prizes and challenges conducted under the America COMPETES Act has increased by 50% since 2012 and nearly six-fold overall since 2011. 25 different federal agencies offered prizes under COMPETES in fiscal year 2013, with 87 prize competitions in total. The size of the prize purses has grown as well, with 11 challenges over $100,000 in 2013. Nearly half of the prizes conducted in FY 2013 were focused on software, including applications, data visualization tools, and predictive algorithms. Challenge.gov, the award-winning online platform for crowdsourcing national challenges, now has tens of thousands of users who have participated in more than 300 public-sector prize competitions. Beyond the growth in prize numbers and amounts, the Obama administration highlighted 4 trends in public-sector prize competitions:

  • New models for public engagement and community building during competitions
  • Growth in software and information technology challenges, with nearly 50% of the total prizes in this category
  • More emphasis on sustainability and “creating a post-competition path to success”
  • Increased focus on identifying novel approaches to solving problems

The growth of open innovation in and by the public sector was directly enabled by Congress and the White House, working together for the common good. Congress reauthorized COMPETES in 2010 with an amendment to Section 105 of the act that added a Section 24 on “Prize Competitions,” providing all agencies with the authority to conduct prizes and challenges that only NASA and DARPA had previously enjoyed. The White House Office of Science and Technology Policy (OSTP) has guided the act’s implementation, providing guidance on the use of challenges and prizes to promote open government.
“This progress is due to important steps that the Obama Administration has taken to make prizes a standard tool in every agency’s toolbox,” wrote Cristin Dorgelo, assistant director for grand challenges in OSTP, in a WhiteHouse.gov blog post on engaging citizen solvers with prizes:

In his September 2009 Strategy for American Innovation, President Obama called on all Federal agencies to increase their use of prizes to address some of our Nation’s most pressing challenges. Those efforts have expanded since the signing of the America COMPETES Reauthorization Act of 2010, which provided all agencies with expanded authority to pursue ambitious prizes with robust incentives.
To support these ongoing efforts, OSTP and the General Services Administration have trained over 1,200 agency staff through workshops, online resources, and an active community of practice. And NASA’s Center of Excellence for Collaborative Innovation (COECI) provides a full suite of prize implementation services, allowing agencies to experiment with these new methods before standing up their own capabilities.

Sun Microsystems co-founder Bill Joy famously once said that “No matter who you are, most of the smartest people work for someone else.” This rings true, in and outside of government. The idea of governments using prizes like this to inspire technological innovation, however, did not spring from the fertile mind of a Silicon Valley entrepreneur, nor does it rely on Web services and social media. As the introduction to the third White House prize report notes:

“One of the most famous scientific achievements in nautical history was spurred by a grand challenge issued in the 18th Century. The issue of safe, long distance sea travel in the Age of Sail was of such great importance that the British government offered a cash award of £20,000 to anyone who could invent a way of precisely determining a ship’s longitude. The Longitude Prize, enacted by the British Parliament in 1714, would be worth some £30 million today, but even by that measure the value of the marine chronometer invented by British clockmaker John Harrison might have been a bargain.”

Centuries later, the Internet, World Wide Web, mobile devices and social media offer the best platforms in history for this kind of approach to solving grand challenges and catalyzing civic innovation, helping public officials and businesses find new ways to solve old problems. When a new idea, technology or methodology challenges and improves upon existing processes and systems, it can improve the lives of citizens or the function of the society that they live within….”

Open Data Could Unlock $230 Billion In Energy-Efficiency Savings


Jeff McMahon at Forbes: “Energy-efficiency startups just need access to existing data—on electricity usage, housing characteristics, renovations and financing—to unlock hundreds of billions of dollars in savings, two founders of  startups said in Chicago Tuesday.
“One of the big barriers to scaling energy efficiency is the lack of data in the market,” said Andy Frank of Sealed, a startup that encourages efficiency improvements by guaranteeing homeowners a lower bill than they’re paying now.
In a forum hosted by the Energy Policy Institute at Chicago, Frank and Matt Gee, founder of Effortless Energy, advocated an open-energy-data warehouse that would collect anonymized data from utilities, cities, contractors, and financiers, to make the data available for research, government, and industry.
“There needs to be some sort of entity that organizes all this information and has it in some sort of standard format,” said Gee, whose startup pays for home improvements up front and then splits the savings with investors and the homeowner.
According to Gee, the current $9.5 billion energy-efficiency market operates without data on the actual savings it produces for homeowners. He outlined the current market like this:

  1. A regulatory body, usually a public utility commission, mandates that a utility spend money on efficiency.
  2. The utility passes on the cost to customers through an efficiency surcharge (this is how the $9.5 billion is raised).
  3. The utility hires a program implementer.
  4. The program implementer sends auditors to customer homes.
  5. Potential savings from improvements like new insulation or new appliances are estimated based on models.
  6. Those modeled estimates determine what the contractor can do in the home.
  7. The modeled estimates determine what financing is available.

In some cases, utilities will hire consultants to estimate the savings generated from these improvements. California utilities spend $40 million a year estimating savings, Gee said, but actual savings are neither verified nor integrated in the process.
“Nowhere in this process do actual savings enter,” Gee said. “They don’t drive anyone’s incentives, which is just absolutely astounding, right? The opportunity here is that energy efficiency actually pays for itself. It should be something that’s self-financed.”
For that to happen, the market needs reliable information on how much energy is currently being wasted and how much is saved by improvements….”
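The feedback loop Gee says is missing, comparing modeled estimates against measured results, is simple arithmetic once anonymized meter data is available. A rough sketch, with all figures invented for illustration (the "realization rate" framing is one common way evaluators express this, not a claim about Effortless Energy's method):

```python
# Compare an engineering-model savings estimate against actual metered
# usage before and after a retrofit. All numbers below are made up.

def actual_savings(pre_kwh, post_kwh):
    """Measured savings: average monthly usage before minus after."""
    return sum(pre_kwh) / len(pre_kwh) - sum(post_kwh) / len(post_kwh)

def realization_rate(modeled_kwh, pre_kwh, post_kwh):
    """Fraction of the modeled monthly savings actually realized."""
    return actual_savings(pre_kwh, post_kwh) / modeled_kwh

pre = [900, 950, 1000]   # monthly kWh before an insulation upgrade
post = [780, 820, 800]   # monthly kWh after
modeled = 200            # model's estimate of monthly savings

print(round(actual_savings(pre, post)))                # 150
print(round(realization_rate(modeled, pre, post), 2))  # 0.75
```

A shared data warehouse would let such checks run across thousands of homes instead of one, so that financing could price actual rather than modeled performance.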

Sharing in a Changing Climate


Helen Goulden in the Huffington Post: “Every month, a social research agency conducts a public opinion survey of 30,000 UK households. As part of this, households are asked about what issues they think are the most important; things such as crime, unemployment, inequality, public health etc. Climate change has ranked so consistently low on these surveys that they don’t bother asking any more.
On first glance, it would appear that most people don’t care about a changing climate.
Yet, that’s simply not true. Many people care deeply, but fleetingly – in the same way they may consider their own mortality before getting back to thinking about what to have for tea. And others care, but fail to change their behaviour in a way that’s proportionate to their concerns. Certainly that’s my unhappy stomping ground.
Besides, what choices do we really have? Even the most progressive, large organisations have been glacial in moving towards any real form of sustainability. For many years we have struggled with the Frankenstein-like task of stitching ‘sustainability’ onto existing business and economic models and the results, I think, speak for themselves.
That the Collaborative Economy presents us with an opportunity – in Napster-like ways – to disrupt and evolve toward something more sustainable is a compelling idea. It looks out to a future filled with opportunities to reconfigure how we produce, consume and dispose of the things we want and need to live, work and play.
Whether the journey toward sustainability is short or long, it will be punctuated with a good degree of turbulence, disruption and some largely unpredictable events. How we deal with those events and what role communities, collaboration and technology play may set the framework and tone for how that future evolves. Crises and disruption to our entrenched living patterns present ripe opportunities for innovation and space for adopting new behaviours and practices.
No-one is immune from the impact of erratic and extreme weather events. And if we accept that these events are going to increase in frequency, we must draw the conclusion that emergency state and government resources may be drawn more thinly over time.
Across the world, there is a fairly well organised state and international infrastructure for dealing with emergencies, involving everyone from the Disaster Emergency Committee, the UN, central and local government and municipalities, not for profit organisations and of course, the military. There is a clear reason why we need this kind of state emergency response; I’m not suggesting that we don’t.
But through the rise of open data and mass participation in platforms that share location, identity and inventory, we are creating a new kind of mesh; a social and technological infrastructure that could considerably strengthen our ability to respond to unpredictable events.
In the last few years we have seen a sharp rise in the number of tools and crowdsourcing platforms and open source sensor networks that are focused on observing, predicting or responding to extreme events:
• Apps like Shake Alert, which gives up to a minute’s warning that an earthquake is coming
• Rio’s sensor network, which measures rainfall outside the city and can predict flooding
• The open-source electronics platform Arduino, which is being used to crowd-source weather and pollution data
• Propeller Health, which is using asthma inhaler sensors to crowd-source pollution hotspots
• Safecast, which was developed for crowdsourcing radiation levels in Japan
Increasingly we have the ability to deploy open source, distributed and networked sensors and devices for capturing and aggregating data that can help us manage our responses to extreme weather (and indeed, other kinds of) events.
Look at platforms like LocalMind and Foursquare. Today, I might be using them to find out whether there’s a free table at a bar or finding out what restaurant my friends are in. But these kinds of social locative platforms present an infrastructure that could be life-saving in any kind of situation where you need to know where to go quickly to get out of trouble. We know that in the wake of disruptive events and disasters, like bombings, riots, etc., people now intuitively and instinctively take to technology to find out what’s happening, where to go and how to co-ordinate response efforts.
During the 2013 BART strike in San Francisco, ventures like Liquid Space and SideCar enabled people to quickly find alternative places to work, or alternatives to public transport, to mitigate the inconvenience of the strike. The strike was a minor inconvenience compared to the impact of a hurricane or flood but nevertheless, in both those instances, ventures decided to waive their fees; as did AirBnB when 1,400 New York AirBnB hosts opened their doors to people who had been left homeless by Hurricane Sandy in 2012.
The impulse to help is not new. The matching of people’s offers of help and resources to on-the-ground need, in real time, is.”

Sammies finalists are harnessing technology to help the public


Lisa Rein in the Washington Post: “One team of federal agents led Medicare investigations that resulted in more than 600 convictions in South Florida, recovering hundreds of millions of dollars. Another official boosted access to burial sites for veterans across the country. And one guided an initiative to provide safe drinking water to 5 million people in Uganda and Kenya. These are some of the 33 individuals and teams of federal employees nominated for the 13th annual Samuel J. Heyman Service to America Medals, among the highest honors in government. The 2014 finalists reflect the achievements of public servants in fields from housing to climate change, their work conducted in Washington and locations as far-flung as Antarctica and Alabama…
Many of them have excelled in harnessing new technology in ways that are pushing the limits of what government thought was possible even a few years ago. Michael Byrne of the Federal Communications Commission, for example, put detailed data about broadband availability in the hands of citizens and policymakers using interactive online maps and other visualizations. At the Environmental Protection Agency, Douglas James Norton made water quality data, which had never before been public, available on the Web for citizens, scientists and state agencies.”

The Surprising Accuracy Of Crowdsourced Predictions About The Future


Adele Peters in FastCo-Exist: “If you have a question about what’s going to happen next in Syria or North Korea, you might get more accurate predictions by asking a group of ordinary people than from foreign policy experts or even, possibly, CIA agents with classified information. Over the last few years, the Good Judgment Project has proven that crowdsourcing predictions is a surprisingly accurate way to forecast the future.

The project, sponsored by the U.S. Director of National Intelligence office, is currently working with 3,000 people to test their ability to predict outcomes in everything from world politics to the economy. They aren’t experts, just people who are interested in the news.

“We just needed lots of people; we had very few restrictions,” says Don Moore, an associate professor at University of California-Berkeley, who co-led the project. “We wanted people who were interested, and curious, who were moderately well-educated and at least aware enough of the world around them that they listened to the news.”
The group has tackled 250 questions in the experiment so far. None of them have been simple; current questions include whether Turkey will get a new constitution and whether the U.S. and the E.U. will reach a trade deal. But the group consistently got answers right more often than individual experts, just through some simple online research and, in some cases, discussions with each other.
The crowdsourced predictions are even reportedly more accurate than those from intelligence agents. One report says that when “superpredictors,” the people who are right most often, are grouped together in teams, they can outperform agents with classified information by as much as 30%. (The researchers can’t confirm this fact, since the accuracy of spies is, unsurprisingly, classified).
…Crowdsourcing could be useful for any type of prediction, Moore says, not only what’s happening in world politics. “Every major decision depends on a forecast of the future,” he explains. “A company deciding to launch a new product has to figure out what sales might be like. A candidate trying to decide whether to run for office has to forecast how they’ll do in the election. In trying to decide whom to marry, you have to decide what your future looks like together.”
“The way corporations do forecasting now is an embarrassment,” he adds. “Many of the tools we’re developing would be enormously helpful.”
The project is currently recruiting new citizen predictors here.”
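The aggregation effect behind results like these can be shown with a toy example: average several forecasters' probabilities, then score everyone with the Brier score (mean squared error against what happened; lower is better). The forecasters and outcomes below are invented, and the Good Judgment Project's actual pooling methods are more sophisticated than a plain average:

```python
# Toy demonstration of crowd forecast aggregation and Brier scoring.
# Forecasters' probabilities and the 0/1 outcomes are made up.

def brier(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

outcomes = [1, 0, 1, 1]          # what actually happened
forecasters = [
    [0.9, 0.4, 0.5, 0.6],
    [0.6, 0.1, 0.9, 0.5],
    [0.7, 0.3, 0.6, 0.7],
]

# The "crowd" forecast: average probability per question.
crowd = [sum(f[i] for f in forecasters) / len(forecasters)
         for i in range(len(outcomes))]

individual_scores = [brier(f, outcomes) for f in forecasters]
print(round(brier(crowd, outcomes), 3))   # crowd score
print(round(min(individual_scores), 3))   # best individual score
```

In this toy run the averaged crowd scores better (lower) than even the best individual, because individual errors partially cancel; that cancellation is the basic statistical intuition behind crowdsourced forecasting.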


Out in the Open: An Open Source Website That Gives Voters a Platform to Influence Politicians


Klint Finley in Wired: “This is the decade of the protest. The Arab Spring. The Occupy Movement. And now the student demonstrations in Taiwan.
Argentine political scientist Pia Mancini says we’re caught in a “crisis of representation.” Most of these protests have popped up in countries that are at least nominally democratic, but so many people are still unhappy with their elected leaders. The problem, Mancini says, is that elected officials have drifted so far from the people they represent that it’s too hard for the average person to be heard.
“If you want to participate in the political system as it is, it’s really costly,” she says. “You need to study politics in university, and become a party member and work your way up. But not every citizen can devote their lives to politics.”
That’s why Mancini started the Net Democracy foundation, a not-for-profit that explores ways of improving civic engagement through technology. The foundation’s first project is something called Democracy OS, an online platform for debating and voting on political issues, and it’s already finding a place in the world. The federal government in Mexico is using this open-source tool to gather feedback on a proposed public data policy, and in Tunisia, a non-government organization called iWatch has adopted it in an effort to give the people a stronger voice.
Mancini’s dissatisfaction with electoral politics stems from her experience working for the Argentine political party Unión Celeste y Blanco from 2010 until 2012. “I saw some practices that I thought were harmful to societies,” she says. Parties were too interested in the appearances of the candidates, and not interested enough in their ideas. Worse, citizens were only consulted for their opinions once every two to four years, meaning politicians could get away with quite a bit in the meantime.
Democracy OS is designed to address that problem by getting citizens directly involved in debating specific proposals when their representatives are actually voting on them. It operates on three levels: one for gathering information about political issues, one for public debate about those issues, and one for actually voting on specific proposals.
Various communities now use a tool called Madison to discuss policy documents, and many activists and community organizations have adopted Loomio to make decisions internally. But Democracy OS aims higher: to provide a common platform for any city, state, or government to actually put proposals to a vote. “We’re able to actually overthrow governments, but we’re not using technology to decide what to do next,” Mancini says. “So the risk is that we create power vacuums that get filled with groups that are already very well organized. So now we need to take it a bit further. We need to decide what democracy for the internet era looks like.”

Software Shop as Political Party

Today Net Democracy is more than just a software development shop. It’s also a local political party based in Buenos Aires. Two years ago, the foundation started pitching the first prototype of the software to existing political parties as a way for them to gather feedback from constituents, but it didn’t go over well. “They said: ‘Thank you, this is cool, but we’re not interested,’” Mancini remembers. “So we decided to start our own political party.”
The Net Democracy Party hasn’t won any seats yet, but it promises that if it does, it will use Democracy OS to enable any local registered voter to tell party representatives how to vote. Mancini says the party representatives will always vote the way constituents tell them to vote through the software.
She also uses the term “net democracy” to refer to the type of democracy that the party advocates, a form of delegative democracy that attempts to strike a balance between representative democracy and direct democracy. “We’re not saying everyone should vote on every issue all the time,” Mancini explains. “What we’re saying is that issues should be open for everyone to participate.”
Individuals will also be able to delegate their votes to other people. “So, if you’re not comfortable voting on health issues, you can delegate to someone else to vote for you in that area,” she says. “That way people with a lot of experience in an issue, like a community leader who doesn’t have lobbyist access to the system, can build more political capital.”
She envisions a future where decisions are made on two levels. Decisions that involve specific knowledge — macroeconomics, tax reforms, judiciary regulations, penal code, etc. — or that affect human rights are delegated “upwards” to representatives. But then decisions related to local issues — transport, urban development, city codes, etc. — can be delegated “downwards” to the citizens.
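The per-topic delegation Mancini describes (often called liquid or delegative democracy) is easy to sketch as a data structure. The names, topics, and abstain-on-cycle rule below are illustrative assumptions, not Democracy OS's actual implementation:

```python
# Resolve a voter's effective vote on a topic by following their
# delegation chain until someone who voted directly is found.

def resolve_vote(voter, topic, delegations, direct_votes):
    """Follow per-topic delegations; treat a missing delegate or a
    delegation cycle as an abstention (returns None)."""
    seen = set()
    current = voter
    while current not in direct_votes:
        seen.add(current)
        current = delegations.get((current, topic))
        if current is None or current in seen:
            return None
    return direct_votes[current]

delegations = {("alice", "health"): "bob"}   # alice trusts bob on health
direct_votes = {"bob": "yes", "carol": "no"}

print(resolve_vote("alice", "health", delegations, direct_votes))  # yes
print(resolve_vote("dave", "health", delegations, direct_votes))   # None
```

The cycle guard matters in practice: without it, two people delegating to each other on the same topic would loop forever instead of abstaining.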

The Secret Ballot Conundrum

Ensuring the integrity of the votes gathered via Democracy OS will be a real challenge. The U.S. non-profit organization Black Box Voting has long criticized electronic voting schemes as inherently flawed. “Our criticism of internet voting is that it is not transparent and cannot be made publicly transparent,” says Black Box Voting founder Bev Harris. “With transparency for election integrity defined as public ability to see and authenticate four things: who can vote, who did vote, vote count, and chain of custody.”
In short, there’s no known way to do a secret ballot online because any system for verifying that the votes were counted properly will inevitably reveal who voted for what.
Democracy OS deals with that by simply doing away with secret ballots. For now, the Net Democracy party will have people sign up for Democracy OS accounts in person with their government-issued ID cards. “There is a lot to be said about how anonymity allows you to speak more freely,” Mancini says. “But in the end, we decided to prioritize the reliability, accountability and transparency of the system. We believe that by making our arguments and decisions public we are fostering a civic culture. We will be more responsible for what we say and do if it’s public.”
But making binding decisions based on these online discussions would be problematic, since they would skew not just towards those tech savvy enough to use the software, but also towards those willing to have their names attached to their votes publicly. Fortunately, the software isn’t yet being used to gather real votes, just to gather public feedback….”
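The trade-off Mancini accepts, public votes in exchange for verifiability, can be made concrete with a toy tally: because every (voter, choice) pair is published, any observer can recount the result and inspect the roll, which is exactly the auditability a secret online ballot cannot offer. Voters and choices below are invented:

```python
from collections import Counter

# A fully public vote record: anyone holding this list can redo the count.
public_votes = [
    ("alice", "yes"),
    ("bob", "no"),
    ("carol", "yes"),
]

def tally(votes):
    """Count choices; any observer can reproduce this independently."""
    return Counter(choice for _, choice in votes)

def audit(votes):
    """Return the voter roll, so anyone can check who participated."""
    return sorted(voter for voter, _ in votes)

print(tally(public_votes))   # Counter({'yes': 2, 'no': 1})
print(audit(public_votes))   # ['alice', 'bob', 'carol']
```

This covers two of Black Box Voting's four transparency requirements (who did vote, and the vote count) precisely by giving up ballot secrecy.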

Winds of Change: The Progress of Open Government Policymaking in Latin America and the Caribbean


Inter-American Development Bank paper by Ramírez Alujas, Álvaro V.; and Dassen, Nicolás: “The year 2013 has become known as the year of Open Government. The continuing progress of the Open Government Partnership represents the consolidation of a process that, in less than two years, has strengthened the promotion and implementation of public policies. These policies are founded on the principles of transparency and access to public information, citizen participation, integrity, and the harnessing of technology on behalf of openness and accountability in 63 participating countries. The Latin American and Caribbean region, in particular, stands out with the most widespread participation, including 15 borrowing member countries of the Inter-American Development Bank (IDB). Fourteen of these have action plans in process for the implementation and/or evaluation of these policies, reinforcing their commitment to open government. Trinidad and Tobago, one of the 15 member countries, will soon present its own action plan. To date, various countries are developing public consultation processes and opportunities for participation for a new two-year period of commitments relating to open government. It is, therefore, worthwhile to review, country-by-country, the commitments that have been carried out and to consider the views expressed by relevant stakeholders. This analysis will further contribute to this emerging domain, a new paradigm for public policy and management reform in the 21st century.”

The Universe Is Programmable. We Need an API for Everything


Keith Axline in Wired: “Think about it like this: In the Book of Genesis, God is the ultimate programmer, creating all of existence in a monster six-day hackathon.
Or, if you don’t like Biblical metaphors, you can think about it in simpler terms. Robert Moses was a programmer, shaping and re-shaping the layout of New York City for more than 50 years. Drug developers are programmers, twiddling enzymes to cure what ails us. Even pickup artists and conmen are programmers, running social scripts on people to elicit certain emotional results.

The point is that, much like the computer on your desk or the iPhone in your hand, the entire Universe is programmable. Just as you can build apps for your smartphones and new services for the internet, so can you shape and re-shape almost anything in this world, from landscapes and buildings to medicines and surgeries to, well, ideas — as long as you know the code.
That may sound like little more than an exercise in semantics. But it’s actually a meaningful shift in thinking. If we look at the Universe as programmable, we can start treating it like software. In short, we can improve almost everything we do with the same simple techniques that have remade the creation of software in recent years, things like APIs, open source code, and the massively popular code-sharing service GitHub.
The great thing about the modern software world is that you don’t have to build everything from scratch. Apple provides APIs, or application programming interfaces, that can help you build apps on their devices. And though Tim Cook and company only give you part of what you need, you can find all sorts of other helpful tools elsewhere, thanks to the open source software community.
The same is true if you’re building, say, an online social network. There are countless open source software tools you can use as the basic building blocks — many of them open sourced by Facebook. If you’re creating almost any piece of software, you can find tools and documentation that will help you fashion at least a small part of it. Chances are, someone has been there before, and they’ve left some instructions for you.
Now we need to discover and document the APIs for the Universe. We need a standard way of organizing our knowledge and sharing it with the world at large, a problem for which programmers already have good solutions. We need to give everyone a way of handling tasks the way we build software. Such a system, if it can ever exist, is still years away — decades at the very least — and the average Joe is hardly ready for it. But this is changing. Nowadays, programming skills and the DIY ethos are slowly spreading throughout the population. Everyone is becoming a programmer. The next step is to realize that everything is a program.

What Is an API?

The API may sound like just another arcane computer acronym. But it’s really one of the most profound metaphors of our time, an idea hiding beneath the surface of each piece of tech we use everyday, from iPhone apps to Facebook. To understand what APIs are and why they’re useful, let’s look at how programmers operate.
If I’m building a smartphone app, I’m gonna need — among so many other things — a way of validating a signup form on a webpage to make sure a user doesn’t, say, mistype their email address. That validation has nothing to do with the guts of my app, and it’s surprisingly complicated, so I don’t really want to build it from scratch. Apple doesn’t help me with that, so I start looking on the web for software frameworks, plugins, Software Development Kits (SDKs) — anything that will help me build my signup tool.
Hopefully, I’ll find one. And if I do, chances are it will include some sort of documentation or “Readme file” explaining how this piece of code is supposed to be used so that I can tailor it to my app. This Readme file should contain installation instructions as well as the API for the code. Basically, an API lays out the code’s inputs and outputs. It shows me what I have to send the code and what it will spit back out. It shows how I bolt it onto my signup form. So the name is actually quite explanatory: Application Programming Interface. An API is essentially an instruction manual for a piece of software.
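To make the inputs-and-outputs idea concrete, here is a minimal sketch of that signup validator in Python. The function name and the regex are illustrative choices of mine, not code from any real SDK, and the pattern is a deliberately simple stand-in rather than a full RFC 5322 email validator. The docstring plays the role of the Readme/API: it states what you send the code and what it spits back out.

```python
import re

def validate_email(address: str) -> bool:
    """API: send in one string; get back True if it looks like an
    email address (something@domain.tld), otherwise False."""
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", address) is not None

# Bolting it onto a signup form means nothing more than calling it:
assert validate_email("user@example.com") is True
assert validate_email("not-an-email") is False
```

A programmer who finds this function never needs to read its body; the documented interface is enough to reuse it.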
Now, let’s combine this with the idea that everything is an application: molecules, galaxies, dogs, people, emotional states, abstract concepts like chaos. If you do something to any of these things, they’ll respond in some way. Like software, they have inputs and outputs. What we need to do is discover and document their APIs.
We aren’t dealing with software code here. Inputs and outputs can themselves be anything. But we can closely document these inputs and their outputs — take what we know about how we interface with something and record it in a standard way that it can be used over and over again. We can create a Readme file for everything.
We can start by doing this in small, relatively easy ways. How about APIs for our cities? New Zealand just open sourced aerial images of about 95 percent of its land. We could write APIs for what we know about building in those areas, from properties of the soil to seasonal weather patterns to zoning laws. All this knowledge exists but it hasn’t been organized and packaged for use by anyone who is interested. And we could go still further — much further.
For example, between the science community, the medical industry and the billions of human experiences, we could probably have a pretty extensive API mapped out of the human stomach — one that I’d love to access when I’m up at 3am with abdominal pains. Maybe my microbiome is out of whack and there’s something I have on hand that I could ingest to make it better. Or what if we cracked the API for the signals between our eyes and our brain? We wouldn’t need to worry about looking like Glassholes to get access to always-on augmented reality. We could just get an implant. Yes, these APIs will be slightly different for everyone, but that brings me to the next thing we need.

A GitHub for Everything

We don’t just need a Readme for the Universe. We need a way of sharing this Readme and changing it as need be. In short, we need a system like GitHub, the popular online service that lets people share and collaborate on software code.
Let’s go back to the form validator I found earlier. Say I made some modifications to it that I think other programmers would find useful. If the validator is on GitHub, I can create a separate but related version — a fork — that people can find and contribute to, in the same way I first did with the original software.


GitHub not only enables this collaboration, but every change is logged into separate versions. If someone were so inclined, they could go back and replay the building of the validator, from the very first save all the way up to my changes and whoever changes it after me. This creates a tree of knowledge, with giant groups of people creating and merging branches, working on their small section and then giving it back to the whole.
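The fork-and-replay workflow described above can be sketched in a few lines of Python. This is a toy model, not GitHub’s actual data structures, and all the names are invented: a repository is just an ordered log of changes, a fork copies that log into a separate but related version, and replaying the log rebuilds the work from the very first save onward.

```python
from dataclasses import dataclass, field

@dataclass
class Repo:
    # ordered log of (author, change) pairs — every change is recorded
    history: list = field(default_factory=list)

    def commit(self, author: str, change: str) -> None:
        self.history.append((author, change))

    def fork(self) -> "Repo":
        # a separate but related version that keeps the shared past
        return Repo(history=list(self.history))

    def replay(self) -> list:
        # rebuild the work change by change, from the very first save
        return [change for _author, change in self.history]

original = Repo()
original.commit("author", "validate email addresses")
mine = original.fork()                      # my branch of the tree
mine.commit("me", "also validate phone numbers")

assert original.replay() == ["validate email addresses"]
assert mine.replay() == ["validate email addresses",
                         "also validate phone numbers"]
```

The original and the fork share their early history, so anyone can trace exactly where a branch of the tree split off and what each contributor added.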
We should be able to funnel all existing knowledge of how things work — not just software code — into a similar system. That way, if my brain-eye interface needs to be different, I (or my personal eye technician) can “fork” the API. In a way, this sort of thing is already starting to happen. People are using GitHub to share government laws, policy documents, Gregorian chants, and the list goes on. The ultimate goal should be to share everything.
Yes, this idea is similar to what you see on sites like Wikipedia, but the stuff that’s shared on Wikipedia doesn’t let you build much more than another piece of text. We don’t just need to know what things are. We need to know how they work in ways that let us operate on them.

The Open Source Epiphany

If you’ve never programmed, all this can sound a bit, well, abstract. But once you enter the coding world, getting a loose grasp on the fundamentals of programming, you instantly see the utility of open source software. “Oooohhh, I don’t have to build this all myself,” you say. “Thank God for the open source community.” Because so many smart people contribute to open source, it helps get the less knowledgeable up to speed quickly. Those acolytes then pay it forward with their own contributions once they’ve learned enough.
Today, more and more people are jumping on this train. More and more people are becoming programmers of some shape or form. It wasn’t so long ago that basic knowledge of HTML was considered specialized geek speak. But now, it’s a common requirement for almost any desk job. Gone are the days when kids made fun of their parents for not being able to set the clock on the VCR. Now they get mocked for mis-cropping their Facebook profile photos.
These changes are all part of the tech takeover of our lives that is trickling down to the masses. It’s like how the widespread use of cars brought a general mechanical understanding of engines to dads everywhere. And this general increase in aptitude is accelerating along with the technology itself.
Steps are being taken to make programming a skill that most kids get early in school along with general reading, writing, and math. In the not too distant future, people will need to program in some form for their daily lives. Imagine the world before the average person knew how to write a letter, or divide two numbers, compared to now. A similar leap is around the corner…”