The Universe Is Programmable. We Need an API for Everything


Keith Axline in Wired: “Think about it like this: In the Book of Genesis, God is the ultimate programmer, creating all of existence in a monster six-day hackathon.
Or, if you don’t like Biblical metaphors, you can think about it in simpler terms. Robert Moses was a programmer, shaping and re-shaping the layout of New York City for more than 50 years. Drug developers are programmers, twiddling enzymes to cure what ails us. Even pickup artists and conmen are programmers, running social scripts on people to elicit certain emotional results.

The point is that, much like the computer on your desk or the iPhone in your hand, the entire Universe is programmable. Just as you can build apps for your smartphones and new services for the internet, so can you shape and re-shape almost anything in this world, from landscapes and buildings to medicines and surgeries to, well, ideas — as long as you know the code.
That may sound like little more than an exercise in semantics. But it’s actually a meaningful shift in thinking. If we look at the Universe as programmable, we can start treating it like software. In short, we can improve almost everything we do with the same simple techniques that have remade the creation of software in recent years, things like APIs, open source code, and the massively popular code-sharing service GitHub.
The great thing about the modern software world is that you don’t have to build everything from scratch. Apple provides APIs, or application programming interfaces, that can help you build apps on their devices. And though Tim Cook and company only give you part of what you need, you can find all sorts of other helpful tools elsewhere, thanks to the open source software community.
The same is true if you’re building, say, an online social network. There are countless open source software tools you can use as the basic building blocks — many of them open sourced by Facebook. If you’re creating almost any piece of software, you can find tools and documentation that will help you fashion at least a small part of it. Chances are, someone has been there before, and they’ve left some instructions for you.
Now we need to discover and document the APIs for the Universe. We need a standard way of organizing our knowledge and sharing it with the world at large, a problem for which programmers already have good solutions. We need to give everyone a way of handling tasks the way we build software. Such a system, if it can ever exist, is still years away — decades at the very least — and the average Joe is hardly ready for it. But this is changing. Nowadays, programming skills and the DIY ethos are slowly spreading throughout the population. Everyone is becoming a programmer. The next step is to realize that everything is a program.

What Is an API?

The API may sound like just another arcane computer acronym. But it’s really one of the most profound metaphors of our time, an idea hiding beneath the surface of each piece of tech we use every day, from iPhone apps to Facebook. To understand what APIs are and why they’re useful, let’s look at how programmers operate.
If I’m building a smartphone app, I’m gonna need — among so many other things — a way of validating a signup form on a webpage to make sure a user doesn’t, say, mistype their email address. That validation has nothing to do with the guts of my app, and it’s surprisingly complicated, so I don’t really want to build it from scratch. Apple doesn’t help me with that, so I start looking on the web for software frameworks, plugins, Software Development Kits (SDKs) — anything that will help me build my signup tool.
Hopefully, I’ll find one. And if I do, chances are it will include some sort of documentation or “Readme file” explaining how this piece of code is supposed to be used so that I can tailor it to my app. This Readme file should contain installation instructions as well as the API for the code. Basically, an API lays out the code’s inputs and outputs. It shows me what I have to send the code and what it will spit back out. It shows how I bolt it onto my signup form. So the name is actually quite explanatory: Application Programming Interface. An API is essentially an instruction manual for a piece of software.
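To make the input/output contract concrete, here is a minimal sketch of such a signup validator; the function name, fields, and rules are all invented for this example, and a real validator would be more thorough:

```python
import re

def validate_signup(email: str, password: str) -> dict:
    """Validate a signup form.

    Inputs:  email (str), password (str)
    Output:  dict with 'ok' (bool) and 'errors' (list of str)

    This docstring is the API: it tells the caller exactly what to
    send in and what comes back out.
    """
    errors = []
    # A deliberately simple pattern: something@something.tld
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("email looks mistyped")
    if len(password) < 8:
        errors.append("password must be at least 8 characters")
    return {"ok": not errors, "errors": errors}
```

Calling `validate_signup("user@example.com", "hunter2!")` returns `{"ok": True, "errors": []}`; the Readme for this code would document exactly that contract.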
Now, let’s combine this with the idea that everything is an application: molecules, galaxies, dogs, people, emotional states, abstract concepts like chaos. If you do something to any of these things, they’ll respond in some way. Like software, they have inputs and outputs. What we need to do is discover and document their APIs.
We aren’t dealing with software code here. Inputs and outputs can themselves be anything. But we can closely document these inputs and their outputs — take what we know about how we interface with something and record it in a standard way so that it can be used over and over again. We can create a Readme file for everything.
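One way to picture such a “Readme for a thing” is as structured data. The sketch below is purely hypothetical; every field name and value is invented for illustration, and no real standard or dataset is implied:

```python
# A hypothetical machine-readable "Readme" entry for a non-software thing.
soil_api = {
    "thing": "clay-heavy soil, coastal building zone",
    "inputs": {
        "water_mm_per_week": "float: rainfall plus irrigation",
        "load_kPa": "float: sustained pressure from a structure",
    },
    "outputs": {
        "drainage": "poor when water input outpaces evaporation",
        "settlement_risk": "rises sharply under heavy sustained load",
    },
    "sources": ["regional soil survey", "local zoning records"],
}

# Anyone (or any program) can now ask the same questions of the record.
print(sorted(soil_api["inputs"].keys()))
```

The point is not the particular fields but the habit: once inputs and outputs are written down in a shared shape, they can be queried, compared, and reused like any other documented interface.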
We can start by doing this in small, relatively easy ways. How about APIs for our cities? New Zealand just open sourced aerial images of about 95 percent of its land. We could write APIs for what we know about building in those areas, from properties of the soil to seasonal weather patterns to zoning laws. All this knowledge exists but it hasn’t been organized and packaged for use by anyone who is interested. And we could go still further — much further.
For example, between the science community, the medical industry and the billions of human experiences, we could probably have a pretty extensive API of the human stomach mapped out — one that I’d love to access when I’m up at 3am with abdominal pains. Maybe my microbiome is out of whack and there’s something I have on-hand that I could ingest to make it better. Or what if we cracked the API for the signals between our eyes and our brain? We wouldn’t need to worry about looking like Glassholes to get access to always-on augmented reality. We could just get an implant. Yes, these APIs will be slightly different for everyone, but that brings me to the next thing we need.

A GitHub for Everything

We don’t just need a Readme for the Universe. We need a way of sharing this Readme and changing it as need be. In short, we need a system like GitHub, the popular online service that lets people share and collaborate on software code.
Let’s go back to the form validator I found earlier. Say I made some modifications to it that I think other programmers would find useful. If the validator is on GitHub, I can create a separate but related version — a fork — that people can find and contribute to, in the same way I first did with the original software.

GitHub not only enables this collaboration, it logs every change as a separate version. If someone were so inclined, they could go back and replay the building of the validator, from the very first save all the way up to my changes and whoever changes it after me. This creates a tree of knowledge, with giant groups of people creating and merging branches, working on their small section and then giving it back to the whole.
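The replay idea can be sketched in a few lines. The “commits” below are whole-text snapshots with invented messages, standing in for real Git diffs:

```python
# A toy replay of a version history, in the spirit of stepping through
# a Git log from the first save to the latest.
history = [
    ("first save",
     "def validate(email): return '@' in email"),
    ("require a dot in the domain",
     "def validate(email): return '@' in email and '.' in email"),
    ("my fork: reject multiple @ signs",
     "def validate(email): return email.count('@') == 1 and '.' in email"),
]

def replay(history):
    """Print every saved version in order and return the current state."""
    for message, snapshot in history:
        print(f"-- {message}")
        print(snapshot)
    return history[-1][1]

replay(history)
```

Each entry preserves both the change and the reasoning behind it, which is exactly what a tree of knowledge needs: not just the current state, but the path that produced it.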
We should be able to funnel all existing knowledge of how things work — not just software code — into a similar system. That way, if my brain-eye interface needs to be different, I (or my personal eye technician) can “fork” the API. In a way, this sort of thing is already starting to happen. People are using GitHub to share government laws, policy documents, Gregorian chants, and the list goes on. The ultimate goal should be to share everything.
Yes, this idea is similar to what you see on sites like Wikipedia, but the stuff that’s shared on Wikipedia doesn’t let you build much more than another piece of text. We don’t just need to know what things are. We need to know how they work in ways that let us operate on them.

The Open Source Epiphany

If you’ve never programmed, all this can sound a bit, well, abstract. But once you enter the coding world, getting a loose grasp on the fundamentals of programming, you instantly see the utility of open source software. “Oooohhh, I don’t have to build this all myself,” you say. “Thank God for the open source community.” Because so many smart people contribute to open source, it helps get the less knowledgeable up to speed quickly. Those acolytes then pay it forward with their own contributions once they’ve learned enough.
Today, more and more people are jumping on this train. More and more people are becoming programmers of some shape or form. It wasn’t so long ago that basic knowledge of HTML was considered specialized geek speak. But now, it’s a common requirement for almost any desk job. Gone are the days when kids made fun of their parents for not being able to set the clock on the VCR. Now they get mocked for mis-cropping their Facebook profile photos.
These changes are all part of the tech takeover of our lives that is trickling down to the masses. It’s like how the widespread use of cars brought a general mechanical understanding of engines to dads everywhere. And this general increase in aptitude is accelerating along with the technology itself.
Steps are being taken to make programming a skill that most kids get early in school along with general reading, writing, and math. In the not too distant future, people will need to program in some form for their daily lives. Imagine the world before the average person knew how to write a letter, or divide two numbers, compared to now. A similar leap is around the corner…”

Thanks-for-Ungluing launches!


Blog from Unglue.it: “Great books deserve to be read by all of us, and we ought to be supporting the people who create these books. “Thanks for Ungluing” gives readers, authors, libraries and publishers a new way to build, sustain, and nourish the books we love.
“Thanks for Ungluing” books are Creative Commons licensed and free to download. You don’t need to register or anything. But when you download, the creators can ask for your support. You can pay what you want. You can just scroll down and download the book. But when that book has become your friend, your advisor, your confidante, you’ll probably want to show your support and tell all your friends.
We have some amazing creators participating in this launch….”

To the Cloud: Big Data in a Turbulent World


Book by Vincent Mosco: “In the wake of revelations about National Security Agency activities—many of which occur “in the cloud”—this book offers both enlightenment and a critical view. Cloud computing and big data are arguably the most significant forces in information technology today. In clear prose, To the Cloud explores where the cloud originated, what it means, and how important it is for business, government, and citizens. It describes the intense competition among cloud companies like Amazon and Google, the spread of the cloud to government agencies like the controversial NSA, and the astounding growth of entire cloud cities in China. From advertising to trade shows, the cloud and big data are furiously marketed to the world, even as dark clouds loom over environmental, privacy, and employment issues that arise from the cloud. Is the cloud the long-promised information utility that will solve many of the world’s economic and social problems? Or is it just marketing hype? To the Cloud provides the first thorough analysis of the potential and the problems of a technology that may very well disrupt the world.”

Innovation Contests


Paper by David Pérez Castrillo and David Wettstein: “We study innovation contests with asymmetric information and identical contestants, where contestants’ efforts and innate abilities generate inventions of varying qualities. The designer offers a reward to the contestant achieving the highest quality and receives the revenue generated by the innovation. We characterize the equilibrium behavior, outcomes and payoffs for both nondiscriminatory and discriminatory (where the reward is contestant-dependent) contests. We derive conditions under which the designer obtains a larger payoff when using a discriminatory contest and describe settings where these conditions are satisfied.”

EU: Have your say on Future and Emerging Technologies!


European Commission: “Do you have a great idea for a new technology that is not possible yet? Do you think it can become realistic by putting Europe’s best minds on the task? Share your view and the European Commission – via the Future and Emerging Technologies (FET) programme (@fet_eu, #FET_eu) – can make it happen. The consultation is open till 15 June 2014.

The aim of the public consultation launched today is to identify promising and potentially game-changing directions for future research in any technological domain.

Vice-President of the European Commission Neelie Kroes (@NeelieKroesEU), responsible for the Digital Agenda, said: “From protecting the environment to curing disease – the choices and investments we make today will make a difference to the jobs and lives we enjoy tomorrow. Researchers and entrepreneurs, innovators, creators or interested bystanders – whoever you are, I hope you will take this opportunity to take part in determining Europe’s future”.

The consultation is organised as a series of discussions, in which contributors can suggest ideas for a new FET Proactive initiative or discuss the 9 research topics identified in the previous consultation to determine whether they are still relevant today.

The ideas collected via the public consultation will contribute to future FET work programmes, notably the next one (2016-17). This participative process has already been used to draft the current work programme (2014-15).

Background

€2.7 billion will be invested in Future and Emerging Technologies (FET) under the new research programme Horizon 2020 (2014-2020, #H2020). This represents a nearly threefold increase in budget compared to the previous research programme, FP7. FET actions are part of the Excellent Science pillar of Horizon 2020.

The objective of FET is to foster radical new technologies by exploring novel and high-risk ideas building on scientific foundations. By providing flexible support to goal-oriented and interdisciplinary collaborative research, and by adopting innovative research practices, FET research seizes the opportunities that will deliver long-term benefit for our society and economy.

FET Proactive initiatives aim to mobilise interdisciplinary communities around promising long-term technological visions. They build up the necessary base of knowledge and know-how for kick-starting a future technology line that will benefit Europe’s future industries and citizens in the decades to come. FET Proactive initiatives complement the FET Open scheme, which funds small-scale projects on future technology, and FET Flagships, which are large-scale initiatives to tackle ambitious interdisciplinary science and technology goals.

FET previously launched an online consultation (2012-13) to identify research topics for the current work programme. Around 160 ideas were submitted. The European Commission did an exhaustive analysis and produced an informal clustering of these ideas into broad topics. 9 topics were identified as candidates for a FET Proactive initiative. Three are included in the current programme, namely Global Systems Science; Knowing, Doing, Being; and Quantum Simulation.”

Project leverages Instagram to clean up abandoned bikes on NY streets


Springwise: “We’ve already seen Canada’s Trashswag help document the useable goods that are left on the sidewalk. Now the Dead Pedal NY project is getting residents to document the city’s abandoned bikes via Instagram so authorities can do something about it.
Whether a bike has been damaged while parked or simply abandoned, bike racks are plagued by broken frames that remain locked. This means less space for active cyclists to park their own bikes. Created by art director Pat Gamble, Dead Pedal NY encourages those annoyed by the problem to take a photo and upload it with a geolocation tag onto Instagram, using the hashtag #deadpedalNY. Participants can identify abandoned bikes if they’re missing important parts, have a crushed or bent frame, or are mostly rusted. The collected images and locations then provide a resource for local authorities to remove the bikes and make them aware of how big the problem is.
The initiative helps those having trouble finding a free bike rack to vent their frustration in a positive way, and encourages local authorities to do more to make cycling a positive experience for city dwellers. Are there other ways Instagram can be leveraged to get citizens to report on issues in their neighborhood?
Website: www.deadpedalny.com

The false promise of the digital humanities


Adam Kirsch in the New Republic: “The humanities are in crisis again, or still. But there is one big exception: digital humanities, which is a growth industry. In 2009, the nascent field was the talk of the Modern Language Association (MLA) convention: “among all the contending subfields,” a reporter wrote about that year’s gathering, “the digital humanities seem like the first ‘next big thing’ in a long time.” Even earlier, the National Endowment for the Humanities created its Office of Digital Humanities to help fund projects. And digital humanities continues to go from strength to strength, thanks in part to the Mellon Foundation, which has seeded programs at a number of universities with large grants, most recently $1 million to the University of Rochester to create a graduate fellowship.

Despite all this enthusiasm, the question of what the digital humanities is has yet to be given a satisfactory answer. Indeed, no one asks it more often than the digital humanists themselves. The recent proliferation of books on the subject, from sourcebooks and anthologies to critical manifestos, is a sign of a field suffering an identity crisis, trying to determine what, if anything, unites the disparate activities carried on under its banner. “Nowadays,” writes Stephen Ramsay in Defining Digital Humanities, “the term can mean anything from media studies to electronic art, from data mining to edutech, from scholarly editing to anarchic blogging, while inviting code junkies, digital artists, standards wonks, transhumanists, game theorists, free culture advocates, archivists, librarians, and edupunks under its capacious canvas.”

Within this range of approaches, we can distinguish a minimalist and a maximalist understanding of digital humanities. On the one hand, it can be simply the application of computer technology to traditional scholarly functions, such as the editing of texts. An exemplary project of this kind is the Rossetti Archive created by Jerome McGann, an online repository of texts and images related to the career of Dante Gabriel Rossetti: this is essentially an open-ended, universally accessible scholarly edition. To others, however, digital humanities represents a paradigm shift in the way we think about culture itself, spurring a change not just in the medium of humanistic work but also in its very substance. At their most starry-eyed, some digital humanists, such as the authors of the jargon-laden manifesto and handbook Digital_Humanities, want to suggest that the addition of the high-powered adjective to the long-suffering noun signals nothing less than an epoch in human history: “We live in one of those rare moments of opportunity for the humanities, not unlike other great eras of cultural-historical transformation such as the shift from the scroll to the codex, the invention of movable type, the encounter with the New World, and the Industrial Revolution.”

The language here is the language of scholarship, but the spirit is the spirit of salesmanship: the very same kind of hyperbolic, hard-sell approach we are so accustomed to hearing about the Internet, or about Apple’s latest utterly revolutionary product. Fundamental to this kind of persuasion is the undertone of menace, the threat of historical illegitimacy and obsolescence. Here is the future, we are made to understand: we can either get on board or stand athwart it and get run over. The same kind of revolutionary rhetoric appears again and again in the new books on the digital humanities, from writers with very different degrees of scholarly commitment and intellectual sophistication.

In Uncharted, Erez Aiden and Jean-Baptiste Michel, the creators of the Google Ngram Viewer, an online tool that allows you to map the frequency of words in all the printed matter digitized by Google, talk up the “big data revolution”: “Its consequences will transform how we look at ourselves…. Big data is going to change the humanities, transform the social sciences, and renegotiate the relationship between the world of commerce and the ivory tower.” These breathless prophecies are just hype. But at the other end of the spectrum, even McGann, one of the pioneers of what used to be called “humanities computing,” uses the high language of inevitability: “Here is surely a truth now universally acknowledged: that the whole of our cultural inheritance has to be recurated and reedited in digital forms and institutional structures.”

If ever there were a chance to see the ideological construction of reality at work, digital humanities is it. Right before our eyes, options are foreclosed and demands enforced; a future is constructed as though it were being discovered. By now we are used to this process, since over the last twenty years the proliferation of new technologies has totally discredited the idea of opting out of “the future.”…

The promise and perils of giving the public a policy ‘nudge’


Nicholas Biddle and Katherine Curchin in The Conversation: “…These behavioural insights are more than just intellectual curiosities. They are increasingly being used by policymakers inspired by Richard Thaler and Cass Sunstein’s bestselling manifesto for libertarian paternalism, Nudge.
The British and New South Wales governments have set up behavioural insights units. Many other governments around Australia are following their lead.
Most of the attention so far has been on how behavioural insights could be employed to make people slimmer, greener, more altruistic or better savers. However, it’s time we started thinking and talking about the impact these ideas could have on social policy – programs and payments that aim to reduce disadvantage and narrow divergence in opportunity.
While applying behavioural insights can potentially improve the efficiency and effectiveness of social policy, unscrupulous or poorly thought through applications could be disturbing and damaging. It would appear behavioural insights inspired the UK government’s so-called “Nudge Unit” to force job seekers to undergo bogus personality tests – on pain of losing benefits if they refused.
The idea seemed to be that because people readily believe that any vaguely worded combination of character traits applies to them – which is why people connect with their star sign – the results of a fake psychometric test can dupe them into believing they have a go-getting personality.
In our view, this is not how behavioural insights should be applied. This UK example seems to be a particularly troubling case of the use of “nudges” in conjunction with, rather than instead of, coercion. This is the worst of both worlds: not libertarian paternalism, but authoritarian paternalism.
Ironically, this instance betrays a questionable understanding of behavioural insights or at the very least a very short-term focus. Research tells us that co-operative behaviour depends on the perception of fairness and successful framing requires trust.
Dishonest interventions, which make the government seem both unfair and untrustworthy, should have the longer-term effect of undermining its ability to elicit cooperation and successfully frame information.
Some critics have assumed nudge is inherently conservative or neoliberal. Yet these insights could inform progressive reform in many ways.
For example, taking behavioural insights seriously would encourage a redesign of employment services. There is plenty of scope for thinking more rigorously about how job seekers’ interactions with employment services unintentionally inhibit their motivation to search for work.

Beware accidental nudges

More than just a nudge here or there, behavioural insights can be used to reflect on almost all government decisions. Too often governments accidentally nudge citizens in the opposite direction to where they want them to go.
Take the disappointing take-up of the Matched Savings Scheme, which is part of New Income Management in the Northern Territory. It matches welfare recipients’ savings dollar-for-dollar up to a maximum of A$500 and is meant to get people into the habit of saving regularly.
No doubt saving is extremely hard for people on very low incomes. But another reason so few people embraced the savings program may be a quirk in its design: people had to save money out of their non-income-managed funds, but the $500 reward they received from the government went into their income-managed account.
To some people this appears to have signalled the government’s bad faith. It said to them: even if you demonstrate your responsibility with money, we still won’t trust you.
The Matched Savings Scheme was intended to be a carrot, not a stick. It was supposed to complement the coercive element of income management by giving welfare recipients an incentive to improve their budgeting. Instead it was perceived as an invitation to welfare recipients to be complicit in their own humiliation.
The promise of an extra $500 would have been a strong lure for Homo economicus, but it wasn’t for Homo sapiens. People out of work or on income support are no more or less rational than merchant bankers or economics professors. Their circumstances and choices are different though.
The idiosyncrasies of human decision-making don’t mean that the human brain is fundamentally flawed. Most of the biases that we mentioned earlier are adaptive. But they do mean that policy makers need to appreciate how we differ from rational utility maximisers.
Real humans are not worse than economic man. We’re just different and we deserve policies made for Homo sapiens, not Homo economicus.”

Open Government Data Gains Global Momentum


Wyatt Kash in Information Week: “Governments across the globe are deepening their strategic commitments and working more closely to make government data openly available for public use, according to public and private sector leaders who met this week at the inaugural Open Government Data Forum in Abu Dhabi, hosted by the United Nations and the United Arab Emirates, April 28-29.

Data experts from Europe, the Middle East, the US, Canada, Korea, and the World Bank highlighted how one country after another has set into motion initiatives to expand the release of government data and broaden its use. Those efforts are gaining traction due to multinational organizations, such as the Open Government Partnership, the Open Data Institute, The World Bank, and the UN’s e-government division, that are trying to share practices and standardize open data tools.
In the latest example, the French government announced April 24 that it is joining the Open Government Partnership, a group of 64 countries working jointly to make their governments more open, accountable, and responsive to citizens. The announcement caps a string of policy shifts, which began with the formal release of France’s Open Data Strategy in May 2011 and which parallel similar moves by the US.
The strategy committed France to providing “free access and reuse of public data… using machine-readable formats and open standards,” said Romain Lacombe, head of innovation for the French prime minister’s open government task force, Etalab. The French government is taking steps to end the practice of selling datasets, such as civil and case-law data, and is making them freely reusable. France launched a public data portal, Data.gouv.fr, in December 2011 and joined a G8 initiative to engage with open data innovators worldwide.
For South Korea, open data is not just about achieving greater transparency and efficiency, but is seen as digital fuel for a nation that by 2020 expects to achieve “ambient intelligence… when all humans and things are connected together,” said Dr. YoungSun Lee, who heads South Korea’s National Information Society Agency.
He foresees open data leading to a shift in the ways government will function: from an era of e-government, where information is delivered to citizens, to one where predictive analysis will foster a “creative government,” in which “government provides customized services for each individual.”
The open data movement is also propelling innovative programs in the United Arab Emirates. “The role of open data in directing economic and social decisions pertaining to investments… is of paramount importance” to the UAE, said Dr. Ali M. Al Khouri, director general of the Emirates Identity Authority. It also plays a key role in building public trust and fighting corruption, he said….”

Findings of the Big Data and Privacy Working Group Review


John Podesta at the White House Blog: “Over the past several days, severe storms have battered Arkansas, Oklahoma, Mississippi and other states. Dozens of people have been killed and entire neighborhoods turned to rubble and debris as tornadoes have touched down across the region. Natural disasters like these present a host of challenges for first responders. How many people are affected, injured, or dead? Where can they find food, shelter, and medical attention? What critical infrastructure might have been damaged?
Drawing on open government data sources, including Census demographics and NOAA weather data, along with their own demographic databases, Esri, a geospatial technology company, has created a real-time map showing where the twisters have been spotted and how the storm systems are moving. They have also used these data to show how many people live in the affected area, and summarize potential impacts from the storms. It’s a powerful tool for emergency services and communities. And it’s driven by big data technology.
In January, President Obama asked me to lead a wide-ranging review of “big data” and privacy—to explore how these technologies are changing our economy, our government, and our society, and to consider their implications for our personal privacy. Together with Secretary of Commerce Penny Pritzker, Secretary of Energy Ernest Moniz, the President’s Science Advisor John Holdren, the President’s Economic Advisor Jeff Zients, and other senior officials, our review sought to understand what is genuinely new and different about big data and to consider how best to encourage the potential of these technologies while minimizing risks to privacy and core American values.
Over the course of 90 days, we met with academic researchers and privacy advocates, with regulators and the technology industry, with advertisers and civil rights groups. The President’s Council of Advisors for Science and Technology conducted a parallel study of the technological trends underpinning big data. The White House Office of Science and Technology Policy jointly organized three university conferences at MIT, NYU, and U.C. Berkeley. We issued a formal Request for Information seeking public comment, and hosted a survey to generate even more public input.
Today, we presented our findings to the President. We knew better than to try to answer every question about big data in three months. But we are able to draw important conclusions and make concrete recommendations for Administration attention and policy development in a few key areas.
There are a few technological trends that bear drawing out. The declining cost of collecting, storing, and processing data, combined with new sources of data like sensors, cameras, and geospatial technologies, means that we live in a world of near-ubiquitous data collection. All this data is being crunched at a speed that is increasingly approaching real time, meaning that big data algorithms could soon have immediate effects on decisions being made about our lives.
The big data revolution presents incredible opportunities in virtually every sector of the economy and every corner of society.
Big data is saving lives. Infections are dangerous—even deadly—for many babies born prematurely. By collecting and analyzing millions of data points from a neonatal intensive care unit (NICU), one study was able to identify factors, like slight increases in body temperature and heart rate, that serve as early warning signs an infection may be taking root—subtle changes that even the most experienced doctors wouldn't have noticed on their own.
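The intuition behind that kind of early-warning system can be sketched in a few lines: compare each new vital-sign reading against a rolling baseline of recent readings and flag deviations too subtle to catch by eye. The window size, threshold, and sample data below are illustrative assumptions, not the method used in the study described above.

```python
# Hypothetical sketch: flag subtle vital-sign drift against a rolling baseline.
# Window size, z-score threshold, and the sample data are invented for illustration.
from statistics import mean, stdev

def flag_anomalies(readings, window=12, z_threshold=2.0):
    """Return indices where a reading deviates from its recent rolling baseline.

    readings: list of floats (e.g. heart rate samples taken at fixed intervals).
    window:   how many prior samples form the baseline.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# A steady heart rate with a subtle elevation in the final reading.
hr = [120.0, 121.0, 119.0, 120.5, 120.0, 121.5, 119.5, 120.0,
      120.5, 119.0, 121.0, 120.0, 128.0]
print(flag_anomalies(hr))  # -> [12]: only the final elevated reading is flagged
```

Real clinical systems combine many channels and far more sophisticated models, but the principle is the same: the signal is a small, consistent deviation from a patient's own baseline rather than any single alarming number.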
Big data is making the economy work better. Jet engines and delivery trucks now come outfitted with sensors that continuously monitor hundreds of data points and send automatic alerts when maintenance is needed. Utility companies are starting to use big data to predict periods of peak electric demand, adjusting the grid to be more efficient and potentially averting brown-outs.
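The sensor-alerting pattern described above can be sketched simply: each telemetry reading is checked against per-channel operating limits, and any out-of-range channel triggers a maintenance alert. The channel names and limits below are invented for illustration; real jet-engine and fleet telemetry covers hundreds of channels with model-driven thresholds.

```python
# Hypothetical sketch of sensor-driven maintenance alerting.
# Channel names and operating limits are illustrative assumptions.

OPERATING_LIMITS = {            # channel: (min, max)
    "oil_pressure_psi": (25.0, 65.0),
    "exhaust_temp_c":   (0.0, 620.0),
    "vibration_mm_s":   (0.0, 4.5),
}

def check_telemetry(reading):
    """Return alert strings for every channel outside its operating limits."""
    alerts = []
    for channel, (lo, hi) in OPERATING_LIMITS.items():
        value = reading.get(channel)
        if value is not None and not (lo <= value <= hi):
            alerts.append(f"{channel}={value} outside [{lo}, {hi}]")
    return alerts

# Low oil pressure and high vibration both trigger alerts; temperature is fine.
reading = {"oil_pressure_psi": 22.0, "exhaust_temp_c": 540.0, "vibration_mm_s": 5.1}
for alert in check_telemetry(reading):
    print("MAINTENANCE ALERT:", alert)
```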
Big data is making government work better and saving taxpayer dollars. The Centers for Medicare and Medicaid Services have begun using predictive analytics—a big data technique—to flag likely instances of reimbursement fraud before claims are paid. The Fraud Prevention System helps identify the highest-risk health care providers for waste, fraud, and abuse in real time and has already stopped, prevented, or identified $115 million in fraudulent payments.
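The predictive, pre-payment approach can be illustrated with a toy risk-scoring sketch: each incoming claim is scored on weighted risk factors, and high-scoring claims are held for review before payment rather than investigated after the money is gone. The features, weights, and threshold below are invented for illustration; the actual Fraud Prevention System models are far more elaborate.

```python
# Hypothetical sketch of pre-payment claim triage by risk score.
# Features, weights, threshold, and sample claims are illustrative assumptions.

RISK_WEIGHTS = {
    "billed_amount_zscore": 0.5,  # amount unusually high for this procedure
    "claims_per_day": 0.3,        # provider's recent claim volume
    "new_provider": 1.0,          # little billing history on file
}

def risk_score(claim):
    """Weighted sum of a claim's risk factors (missing factors count as 0)."""
    return sum(RISK_WEIGHTS[k] * claim.get(k, 0.0) for k in RISK_WEIGHTS)

def triage(claims, threshold=2.0):
    """Split claims into (pay_now, hold_for_review) before any payment is made."""
    pay, hold = [], []
    for c in claims:
        (hold if risk_score(c) >= threshold else pay).append(c)
    return pay, hold

claims = [
    {"id": "A1", "billed_amount_zscore": 0.2, "claims_per_day": 1.0},
    {"id": "B2", "billed_amount_zscore": 3.0, "claims_per_day": 2.0,
     "new_provider": 1.0},
]
pay, hold = triage(claims)
print([c["id"] for c in hold])  # -> ['B2']: the anomalous claim is held
```

The design point is the ordering: scoring happens before the claim is paid, which is what makes it prevention rather than recovery.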
But big data raises serious questions, too, about how we protect our privacy and other values in a world where data collection is increasingly ubiquitous and where analysis is conducted at speeds approaching real time. In particular, our review raised the question of whether the “notice and consent” framework, in which a user grants permission for a service to collect and use information about them, still allows us to meaningfully control our privacy as data about us is increasingly used and reused in ways that could not have been anticipated when it was collected.
Big data raises other concerns, as well. One significant finding of our review was the potential for big data analytics to lead to discriminatory outcomes and to circumvent longstanding civil rights protections in housing, employment, credit, and the consumer marketplace.
No matter how quickly technology advances, it remains within our power to ensure that we both encourage innovation and protect our values through law, policy, and the practices we encourage in the public and private sectors. To that end, we make six actionable policy recommendations in our report to the President:
Advance the Consumer Privacy Bill of Rights. Consumers deserve clear, understandable, reasonable standards for how their personal information is used in the big data era. We recommend the Department of Commerce take appropriate consultative steps to seek stakeholder and public comment on what changes, if any, are needed to the Consumer Privacy Bill of Rights, first proposed by the President in 2012, and to prepare draft legislative text for consideration by stakeholders and submission by the President to Congress.
Pass National Data Breach Legislation. Big data technologies make it possible to store significantly more data and to derive more intimate insights into a person's character, habits, preferences, and activities. That makes the potential impacts of data breaches at businesses or other organizations even more serious. A patchwork of state laws currently governs requirements for reporting data breaches. Congress should pass legislation that provides for a single national data breach standard, along the lines of the Administration's 2011 Cybersecurity legislative proposal.
Extend Privacy Protections to non-U.S. Persons. Privacy is a worldwide value that should be reflected in how the federal government handles personally identifiable information about non-U.S. citizens. The Office of Management and Budget should work with departments and agencies to apply the Privacy Act of 1974 to non-U.S. persons where practicable, or to establish alternative privacy policies that apply appropriate and meaningful protections to personal information regardless of a person’s nationality.
Ensure Data Collected on Students in School Is Used for Educational Purposes. Big data and other technological innovations, including new online course platforms that provide students real-time feedback, promise to transform education by personalizing learning. At the same time, the federal government must ensure educational data linked to individual students gathered in school is used for educational purposes, and protect students against their data being shared or used inappropriately.
Expand Technical Expertise to Stop Discrimination. The detailed personal profiles held about many consumers, combined with automated, algorithm-driven decision-making, could lead—intentionally or inadvertently—to discriminatory outcomes, or what some are already calling “digital redlining.” The federal government’s lead civil rights and consumer protection agencies should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law.
Amend the Electronic Communications Privacy Act. The laws that govern protections afforded to our communications were written before email, the internet, and cloud computing came into wide use. Congress should amend ECPA to ensure the standard of protection for online, digital content is consistent with that afforded in the physical world—including by removing archaic distinctions between email left unread or over a certain age.
We also identify several broader areas ripe for further study, debate, and public engagement that, collectively, we hope will spark a national conversation about how to harness big data for the public good. We conclude that we must find a way to preserve our privacy values in both the domestic and international marketplace. We urgently need to build capacity in the federal government to identify and prevent new modes of discrimination that could be enabled by big data. We must ensure that law enforcement agencies using big data technologies do so responsibly, and that our fundamental privacy rights remain protected. Finally, we recognize that data is a valuable public resource, and call for continuing the Administration’s efforts to open more government data sources and make investments in research and technology.
While big data presents new challenges, it also presents immense opportunities to improve lives, and the United States is perhaps better suited to lead this conversation than any other nation on earth. Our innovative spirit, technological know-how, and deep commitment to values of privacy, fairness, non-discrimination, and self-determination will help us harness the benefits of the big data revolution and encourage the free flow of information while working with our international partners to protect personal privacy. This review is but one piece of that effort, and we hope it spurs a conversation about big data across the country and around the world.
Read the Big Data Report.
See the fact sheet from today’s announcement.