Evgeny Morozov in MIT Technology Review: “Intellectually, at least, it’s clear what needs to be done: we must confront the question not only in the economic and legal dimensions but also in a political one, linking the future of privacy with the future of democracy in a way that refuses to reduce privacy either to markets or to laws. What does this philosophical insight mean in practice?
First, we must politicize the debate about privacy and information sharing. Articulating the existence—and the profound political consequences—of the invisible barbed wire would be a good start. We must scrutinize data-intensive problem solving and expose its occasionally antidemocratic character. At times we should accept more risk, imperfection, improvisation, and inefficiency in the name of keeping the democratic spirit alive.
Second, we must learn how to sabotage the system—perhaps by refusing to self-track at all. If refusing to record our calorie intake or our whereabouts is the only way to get policy makers to address the structural causes of problems like obesity or climate change—and not just tinker with their symptoms through nudging—information boycotts might be justifiable. Refusing to make money off your own data might be as political an act as refusing to drive a car or eat meat. Privacy can then reëmerge as a political instrument for keeping the spirit of democracy alive: we want private spaces because we still believe in our ability to reflect on what ails the world and find a way to fix it, and we’d rather not surrender this capacity to algorithms and feedback loops.
Third, we need more provocative digital services. It’s not enough for a website to prompt us to decide who should see our data. Instead it should reawaken our own imaginations. Designed right, sites would not nudge citizens to either guard or share their private information but would reveal the hidden political dimensions to various acts of information sharing. We don’t want an electronic butler—we want an electronic provocateur. Instead of yet another app that could tell us how much money we can save by monitoring our exercise routine, we need an app that can tell us how many people are likely to lose health insurance if the insurance industry has as much data as the NSA, most of it contributed by consumers like us. Eventually we might discern such dimensions on our own, without any technological prompts.
Finally, we have to abandon fixed preconceptions about how our digital services work and interconnect. Otherwise, we’ll fall victim to the same logic that has constrained the imagination of so many well-meaning privacy advocates who think that defending the “right to privacy”—not fighting to preserve democracy—is what should drive public policy. While many Internet activists would surely argue otherwise, what happens to the Internet is of only secondary importance. Just as with privacy, it’s the fate of democracy itself that should be our primary goal.”
New and forthcoming book by Cass Sunstein: “Based on a series of pathbreaking lectures given at Yale University in 2012, this powerful, thought-provoking work by national best-selling author Cass R. Sunstein combines legal theory with behavioral economics to make a fresh argument about the legitimate scope of government, bearing on obesity, smoking, distracted driving, health care, food safety, and other highly volatile, high-profile public issues. Behavioral economists have established that people often make decisions that run counter to their best interests—producing what Sunstein describes as “behavioral market failures.” Sometimes we disregard the long term; sometimes we are unrealistically optimistic; sometimes we do not see what is in front of us. With this evidence in mind, Sunstein argues for a new form of paternalism, one that protects people against serious errors but also recognizes the risk of government overreaching and usually preserves freedom of choice.
Against those who reject paternalism of any kind, Sunstein shows that “choice architecture”—government-imposed structures that affect our choices—is inevitable, and hence that a form of paternalism cannot be avoided. He urges that there are profoundly moral reasons to ensure that choice architecture is helpful rather than harmful—and that it makes people’s lives better and longer.”
New NBER working paper by George J. Borjas and Kirk B. Doran: “Knowledge generation is key to economic growth, and scientific prizes are designed to encourage it. But how does winning a prestigious prize affect future output? We compare the productivity of Fields medalists (winners of the top mathematics prize) to that of similarly brilliant contenders. The two groups have similar publication rates until the award year, after which the winners’ productivity declines. The medalists begin to “play the field,” studying unfamiliar topics at the expense of writing papers. It appears that tournaments can have large post-prize effects on the effort allocation of knowledge producers.”
New book by Clive Thompson: “It’s undeniable—technology is changing the way we think. But is it for the better? Amid a chorus of doomsayers, Clive Thompson delivers a resounding “yes.” The Internet age has produced a radical new style of human intelligence, worthy of both celebration and analysis. We learn more and retain it longer, write and think with global audiences, and even gain an ESP-like awareness of the world around us. Modern technology is making us smarter, better connected, and often deeper—both as individuals and as a society.
In Smarter Than You Think Thompson shows that every technological innovation—from the written word to the printing press to the telegraph—has provoked the very same anxieties that plague us today. We panic that life will never be the same, that our attentions are eroding, that culture is being trivialized. But as in the past, we adapt—learning to use the new and retaining what’s good of the old.”
New article: “Thanks in part to Thaler and Sunstein’s work, the power of nudges has become well-established—including on many college campuses, where students around the country are beginning the fall semester. While online education and software-driven pedagogy on college campuses have received a good deal of attention, a less visible set of technology-driven initiatives also has gained a foothold: behavioral nudges designed to keep students on track to succeed. Just as e-commerce entrepreneurs have drawn on massive troves of consumer data to create algorithms for firms such as Netflix and Amazon, which unbundle the traditional storefront consumer experience through customized, online delivery, architects of campus technology nudges also rely on data analytics or data mining to improve the student experience.
By giving students information-driven suggestions that lead to smarter actions, technology nudges are intended to tackle a range of problems surrounding the process by which students begin college and make their way to graduation.
New approaches are certainly needed….
There are many reasons for low rates of persistence and graduation, including financial problems, the difficulty of juggling non-academic responsibilities such as work and family, and, for some first-generation students, culture shock. But academic engagement and success are major contributors. That’s why colleges are using behavioral nudges, drawing on data analytics and behavioral psychology, to focus on problems that occur along the academic pipeline:
• Poor student organization around the logistics of going to college
• Unwise course selections that increase the risk of failure and extend time to degree
• Inadequate information about academic progress and the need for academic help
• Unfocused support systems that identify struggling students but don’t directly engage with them
• Difficulty tapping into counseling services
These new ventures, whether originating within colleges or created by outside entrepreneurs, are doing things with data that just couldn’t be done in the past—creating giant databases of student course records, for example, to find patterns of success and failure that result when certain kinds of students take certain kinds of courses.”
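The pattern-finding the excerpt describes, mining course records for combinations of student and course that predict failure, can be sketched in a few lines. Everything below is invented for illustration: the GPA bands, course names, and the tiny `records` list stand in for the "giant databases" a real system would query.

```python
from collections import defaultdict

# Hypothetical student course records: (prior-GPA band, course, passed?).
# A real system would pull millions of rows from a registrar database.
records = [
    ("low", "calc101", False), ("low", "calc101", False),
    ("low", "stats101", True), ("low", "stats101", True),
    ("high", "calc101", True), ("high", "calc101", True),
    ("high", "stats101", True), ("high", "stats101", False),
]

def pass_rates(rows):
    """Compute the historical pass rate for each (GPA band, course) pairing."""
    counts = defaultdict(lambda: [0, 0])   # (band, course) -> [passes, attempts]
    for band, course, passed in rows:
        tally = counts[(band, course)]
        tally[0] += int(passed)
        tally[1] += 1
    return {key: p / n for key, (p, n) in counts.items()}

rates = pass_rates(records)
# A nudge system could flag pairings with low historical pass rates,
# e.g. steering "low"-band students toward a different first course.
```

The point of the sketch is only the shape of the computation: aggregate outcomes by student-and-course features, then surface the risky pairings as advising nudges.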
New paper by Gabriele Camera, Marco Casari and Maria Bigoni in PNAS: “What makes money essential for the functioning of modern society? Through an experiment, we present evidence for the existence of a relevant behavioral dimension in addition to the standard theoretical arguments. Subjects faced repeated opportunities to help an anonymous counterpart who changed over time. Cooperation required trusting that help given to a stranger today would be returned by a stranger in the future. Cooperation levels declined when going from small to large groups of strangers, even if monitoring and payoffs from cooperation were invariant to group size. We then introduced intrinsically worthless tokens. Tokens endogenously became money: subjects took to reward help with a token and to demand a token in exchange for help. Subjects trusted that strangers would return help for a token. Cooperation levels remained stable as the groups grew larger. In all conditions, full cooperation was possible through a social norm of decentralized enforcement, without using tokens. This turned out to be especially demanding in large groups. Lack of trust among strangers thus made money behaviorally essential. To explain these results, we developed an evolutionary model. When behavior in society is heterogeneous, cooperation collapses without tokens. In contrast, the use of tokens makes cooperation evolutionarily stable.”
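The abstract's core mechanism can be illustrated with a toy simulation; this is not the authors' experimental design or their evolutionary model, just a minimal sketch of why tokens help when some players defect. All parameter names and the strategy rules below are assumptions made for the illustration.

```python
import random

def simulate(n_agents, rounds, use_tokens, defector_share=0.25, seed=0):
    """Toy helping game among strangers, loosely inspired by the experiment.

    Each round one random agent (a) may help another (b). Without tokens, a
    community-wide norm sustains cooperation only until the first defection
    is observed. With tokens, help is exchanged for a token, so cooperators
    keep cooperating regardless of defections elsewhere.
    """
    rng = random.Random(seed)
    defector = [i < int(n_agents * defector_share) for i in range(n_agents)]
    tokens = [1] * n_agents          # everyone starts with one worthless token
    norm_broken = False              # has anyone observed a defection yet?
    helped = 0
    for _ in range(rounds):
        a, b = rng.sample(range(n_agents), 2)   # a decides whether to help b
        if use_tokens:
            # a helps only if b can pay; the token changes hands
            if not defector[a] and tokens[b] > 0:
                tokens[b] -= 1
                tokens[a] += 1
                helped += 1
        else:
            # decentralized norm: cooperate until any defection is observed,
            # then community-wide punishment ends all cooperation
            if not norm_broken and not defector[a]:
                helped += 1
            if defector[a]:
                norm_broken = True
    return helped / rounds

# With heterogeneous behavior, the norm collapses but token exchange persists:
rate_tokens = simulate(20, 2000, use_tokens=True)
rate_norm = simulate(20, 2000, use_tokens=False)
```

The design choice mirrors the abstract's claim: decentralized norm enforcement is fragile once behavior is heterogeneous, while token-for-help exchange localizes trust to the token itself.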
Richard Thaler in the New York Times: “I HAVE written here before about the potential gains to government from involving social and behavioral scientists in designing public policies. My enthusiasm comes in part from my experiences as an academic adviser to the Behavioral Insights Team created in Britain by Prime Minister David Cameron.
Thus I was pleased to hear reports that the White House is building a similar initiative here in the United States. Maya Shankar, a cognitive scientist and senior policy adviser at the White House Office of Science and Technology Policy, is coordinating this cross-agency group, called the Social and Behavioral Science Team; it is part of a larger effort to use evidence and innovation to promote government performance and efficiency. I am among a number of academics who have shared ideas with the administration about how research findings in social and behavioral science can improve policy.
It makes sense for social scientists to become more involved in policy, because many of society’s most challenging problems are, in essence, behavioral. Using social scientists’ findings to create plausible interventions, then testing their efficacy with randomized controlled trials, can improve — and sometimes save — people’s lives, all while reducing the need for more government spending to fix problems later.
Here are three examples of social science issues that have attracted the team’s attention…
THE 30-MILLION-WORD GAP One of society’s thorniest problems is that children from poor families start school lagging badly behind their more affluent classmates in readiness. By the age of 3, children from affluent families have vocabularies that are roughly double those of children from poor families, according to research published in 1995….
DOMESTIC VIOLENCE The team will primarily lend support and expertise to federal agency initiatives. One example concerns the effort to reduce domestic violence, a problem for which there is no quick fix….
HEALTH COMPLIANCE One reason for high health care costs is that patients fail to follow their treatment regimen….”
David Brooks in the New York Times: “We’re entering the age of what’s been called “libertarian paternalism.” Government doesn’t tell you what to do, but it gently biases the context so that you find it easier to do things you think are in your own self-interest.
Government could design forms where the default option is to donate organs or save more for retirement. Individuals would have to actively opt out to avoid doing these things. Government could tell air-conditioner makers to build in a little red light to announce when the filter needs changing. That would make homes more energy efficient, since people are too lazy to change the filters promptly otherwise. Government could crack down on companies that exploit common cognitive errors to induce you to pay more for your mortgage, bank account, credit card or car warranty. Or, most notoriously, government could make it harder for you to buy big, sugary sodas.
But this raises a philosophic question. Do we want government stepping in to protect us from our own mistakes? Many people argue no…
I’d call it social paternalism. Most of us behave somewhat decently because we are surrounded by social norms and judgments that make it simpler for us to be good. To some gentle extent, government policy should embody those norms, a preference for saving over consumption, a preference for fitness over obesity, a preference for seat belts and motorcycle helmets even though some people think it’s cooler not to wear them. In some cases, there could be opt-out provisions.
These days, we have more to fear from a tattered social fabric than from a suffocatingly tight one. Some modest paternalism might be just what we need.”
Paper by Cass Sunstein: “In recent years, social scientists have been incorporating empirical findings about human behavior into economic models. These findings offer important insights for thinking about regulation and its likely consequences. They also offer some suggestions about the appropriate design of effective, low-cost, choice-preserving approaches to regulatory problems, including disclosure requirements, default rules, and simplification. A general lesson is that small, inexpensive policy initiatives can have large and highly beneficial effects. In the United States, a large number of recent practices and reforms reflect an appreciation of this lesson. They also reflect an understanding of the need to ensure that regulations have strong empirical foundations, both through careful analysis of costs and benefits in advance and through retrospective review of what works and what does not.”
William H. Simon in the Boston Review: “Cass Sunstein went to Washington with the aim of putting some theory into practice. As administrator of the Office of Information and Regulatory Affairs (OIRA) during President Obama’s first term, he drew on the behavioral economics he helped develop as an academic. In his new book, Simpler, he reports on these efforts and elaborates a larger vision in which they exemplify “the future of government.”
…Simpler reports some notable achievements, but it exaggerates the practical value of the behaviorist toolkit. The Obama administration’s most important policy initiatives make only minor use of it. Despite its upbeat tone, the book implies an oddly constrained conception of the means and ends of government. It sometimes calls to mind a doctor putting on a cheerful face to say that, while there is little he can do to arrest the disease, he will try to make the patient as comfortable as possible.
…The obverse of Sunstein’s preoccupation with choice architecture is his relative indifference to other approaches to making administration less rigid. Recall that among the problems Sunstein sees with conventional regulation are, first, that it mandates conduct in situations where the regulator doesn’t know with confidence what is the right thing to do, and second, that it is insufficiently sensitive to relevant local variations in taste or circumstances.
The most common way to deal with the first problem—insufficient information—is to build learning into the process of intervention: the regulator intervenes provisionally, studies the effects of her intervention, and adapts as she learns. It is commonplace for statutes to mandate or fund demonstration or pilot projects. More importantly, statutes often demand that both top administrators and frontline workers reassess and adjust their practices continuously. This approach is the central and explicit thrust of Race to the Top’s “instructional improvement systems,” and it recurs prominently in all the statutes mentioned so far.”