Are We Puppets in a Wired World?
Sue Halpern in The New York Review of Books: “Also not obvious was how the Web would evolve, though its open architecture virtually assured that it would. The original Web, the Web of static homepages, documents laden with “hot links,” and electronic storefronts, segued into Web 2.0, which, by providing the means for people without technical knowledge to easily share information, recast the Internet as a global social forum with sites like Facebook, Twitter, Foursquare, and Instagram.
Once that happened, people began to make aspects of their private lives public, letting others know, for example, when they were shopping at H&M and dining at Olive Garden, letting others know what they thought of the selection at that particular branch of H&M and the waitstaff at that Olive Garden, then modeling their new jeans for all to see and sharing pictures of their antipasti and lobster ravioli—to say nothing of sharing pictures of their girlfriends, babies, and drunken classmates, or chronicling life as a high-paid escort, or worrying about skin lesions or seeking a cure for insomnia or rating professors, and on and on.
The social Web celebrated, rewarded, routinized, and normalized this kind of living out loud, all the while anesthetizing many of its participants. Although they likely knew that these disclosures were funding the new information economy, they didn’t especially care…
The assumption that decisions made by machines that have assessed reams of real-world information are more accurate than those made by people, with their foibles and prejudices, may be correct generally and wrong in the particular; and for those unfortunate souls who might never commit another crime even if the algorithm says they will, there is little recourse. In any case, computers are not “neutral”; algorithms reflect the biases of their creators, which is to say that prediction cedes an awful lot of power to the algorithm creators, who are human after all. Some of the time, too, proprietary algorithms, like the ones used by Google and Twitter and Facebook, are intentionally biased to produce results that benefit the company, not the user, and some of the time algorithms can be gamed. (There is an entire industry devoted to “optimizing” Google searches, for example.)
But the real bias inherent in algorithms is that they are, by nature, reductive. They are intended to sift through complicated, seemingly discrete information and make some sort of sense of it, which is the definition of reductive.”
Books reviewed:
To Save Everything, Click Here: The Folly of Technological Solutionism
Hacking the Future: Privacy, Identity and Anonymity on the Web
From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet
Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die
Big Data: A Revolution That Will Transform How We Live, Work, and Think
Status Update: Celebrity, Publicity, and Branding in the Social Media Age
Privacy and Big Data: The Players, Regulators and Stakeholders
The End of Hypocrisy
New paper by Henry Farrell and Martha Finnemore in Foreign Affairs: “The U.S. government seems outraged that people are leaking classified materials about its less attractive behavior. It certainly acts that way: three years ago, after Chelsea Manning, an army private then known as Bradley Manning, turned over hundreds of thousands of classified cables to the anti-secrecy group WikiLeaks, U.S. authorities imprisoned the soldier under conditions that the UN special rapporteur on torture deemed cruel and inhumane. The Senate’s top Republican, Mitch McConnell, appearing on Meet the Press shortly thereafter, called WikiLeaks’ founder, Julian Assange, “a high-tech terrorist.”
More recently, following the disclosures about U.S. spying programs by Edward Snowden, a former National Security Agency analyst, U.S. officials spent a great deal of diplomatic capital trying to convince other countries to deny Snowden refuge. And U.S. President Barack Obama canceled a long-anticipated summit with Russian President Vladimir Putin when [Putin] refused to comply.
Despite such efforts, however, the U.S. establishment has often struggled to explain exactly why these leakers pose such an enormous threat. Indeed, nothing in the Manning and Snowden leaks should have shocked those who were paying attention…
The deeper threat that leakers such as Manning and Snowden pose is more subtle than a direct assault on U.S. national security: they undermine Washington’s ability to act hypocritically and get away with it. Their danger lies not in the new information that they reveal but in the documented confirmation they provide of what the United States is actually doing and why…”
What the Government Does with Americans’ Data
New paper from the Brennan Center for Justice: “After the attacks of September 11, 2001, the government’s authority to collect, keep, and share information about Americans with little or no basis to suspect wrongdoing dramatically expanded. While the risks and benefits of this approach are the subject of intense debate, one thing is certain: it results in the accumulation of large amounts of innocuous information about law-abiding citizens. But what happens to this data? In the search to find the needle, what happens to the rest of the haystack? For the first time in one report, the Brennan Center takes a comprehensive look at the multiple ways U.S. intelligence agencies collect, share, and store data on average Americans. The report, which surveys five intelligence agencies, finds that non-terrorism-related data can be kept for up to 75 years or more, clogging national security databases and creating opportunities for abuse, and recommends multiple reforms that seek to tighten control over the government’s handling of Americans’ information.”
Open Data and Open Government: Rethinking Telecommunications Policy and Regulation
New paper by Ewan Sutherland: “While attention has been given to the uses of big data by network operators and to the provision of open data by governments, there has been no systematic attempt to re-examine the regulatory systems for telecommunications. The power of public authorities to access the big data held by operators could transform regulation by simplifying proof of bias or discrimination, making operators more susceptible to behavioural remedies, while it could also be used to deliver much finer granularity of decision making. By opening up data held by government and its agencies to enterprises, think tanks, and research groups, it should be possible to transform market regulation.”
The small-world effect is a modern phenomenon
New paper by Seth A. Marvel, Travis Martin, Charles R. Doering, David Lusseau, M. E. J. Newman: “The “small-world effect” is the observation that one can find a short chain of acquaintances, often of no more than a handful of individuals, connecting almost any two people on the planet. It is often expressed in the language of networks, where it is equivalent to the statement that most pairs of individuals are connected by a short path through the acquaintance network. Although the small-world effect is well-established empirically for contemporary social networks, we argue here that it is a relatively recent phenomenon, arising only in the last few hundred years: for most of mankind’s tenure on Earth the social world was large, with most pairs of individuals connected by relatively long chains of acquaintances, if at all. Our conclusions are based on observations about the spread of diseases, which travel over contact networks between individuals and whose dynamics can give us clues to the structure of those networks even when direct network measurements are not available. As an example we consider the spread of the Black Death in 14th-century Europe, which is known to have traveled across the continent in well-defined waves of infection over the course of several years. Using established epidemiological models, we show that such wave-like behavior can occur only if contacts between individuals living far apart are exponentially rare. We further show that if long-distance contacts are exponentially rare, then the shortest chain of contacts between distant individuals is on average a long one. The observation of the wave-like spread of a disease like the Black Death thus implies a network without the small-world effect.”
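The paper argues from disease waves back to network structure, but its underlying chain-length claim is easy to see in a toy model. Below is a minimal sketch (an editorial illustration, not the authors' epidemiological model), assuming Python with the networkx library: when people know only their nearest neighbours, average acquaintance chains between distant individuals are long, while rewiring even a small fraction of ties to random far-away nodes produces the familiar small-world effect.

```python
# Toy illustration of the path-length claim using Watts-Strogatz graphs.
# Assumed parameters (n, k, p) are illustrative, not taken from the paper.
import networkx as nx

n, k = 1000, 6  # population size and ties per person

local_only = nx.watts_strogatz_graph(n, k, p=0.0)             # short-range ties only
some_long = nx.connected_watts_strogatz_graph(n, k, p=0.05)   # ~5% of ties rewired to distant nodes

# Average length of the shortest acquaintance chain between pairs of people.
print("local ties only:      ", nx.average_shortest_path_length(local_only))
print("a few long-range ties:", nx.average_shortest_path_length(some_long))
```

On the pure lattice the average chain grows roughly in proportion to the population, whereas a handful of long-range ties brings it down to a handful of steps, which is the contrast the authors argue separates pre-modern from contemporary social networks.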
The New Eye of Government: Citizen Sentiment Analysis in Social Media
Interview with Richard Thaler
Interview with Richard Thaler, University of Chicago behavioral economist, by Douglas Clement, editor of The Region: “…Region: One thing we haven’t talked about yet is your work on reciprocity and cooperation. And let’s use another British example, Golden Balls. You did some fascinating research on this British game show. Can you tell that story and what it illustrated?
Thaler: You know, it’s funny, this goes back to Gary’s line [about behavior in real markets as opposed to labs]. As you know, this game show ends in a prisoner’s dilemma. And there have been thousands of experiments run on one-shot prisoner’s dilemmas. We know that economic theory says that the rational strategy is to defect; theory says everyone will defect. It’s the dominant strategy.
In experiments, about 40 to 50 percent of the people cooperate, but it involves small stakes. In this paper we write about the actual game show, and there’s one trial, a round of the show—you may have seen the clip of it—where it’s not small stakes at all; it’s around 100,000 pounds. And that’s one of the things we were interested in: What happens when you raise the stakes?
This is what happens: You get a plot like this (see hand-drawn plot and actual plot). I just happened to have drawn this for another visitor, a grad student.
So, yes, the economists were right. If you raise the stakes, cooperation falls. But it falls to the same level you see in the lab. The interesting behavioral thing is, when the stakes are small, compared to what other people are playing for in the game show, then cooperation gets even higher.
This goes to bounded self-interest. Economists assume people are unboundedly unscrupulous—or I’ll say self-interested, a more polite term. But there have been lots of experiments where you leave a wallet out and depending on the place—I don’t remember the exact data—but a large percentage get returned. Now, some wallets also get picked clean first, but … so I wrote about this too. (He displays a photo of a roadside rhubarb stand.)
Region: What is this?
Thaler: This is significant. Notice the features of this. It’s a roadside stand; they’re selling rhubarb. And it’s got an honor box with a lock on it.
I think this is exactly the right model of human nature, that if you put this stuff out there, enough people will leave money that it’s worth the farmer’s time to put it out. But if you left the money in a box that was unlocked, somebody would take it.
Region: It takes just one dishonest person to “undo” the honesty of many others …
Thaler: Right. If you ask somebody directions, most people will tell you. It’s very fortunate that we don’t live in a society where everybody is out to take advantage of us. For instance, if you have work done in your house or on your car, there’s absolutely no way for you to monitor what they’re doing, unless you’re willing to spend the time watching them and you happen to know a lot about the work, materials and methods being used.
So it has to involve trust. Trust is really important in society, and anything we can do to increase trust is worthwhile. There’s probably nothing you could do to help an economy grow faster than to increase the amount of trust in society….
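The split-or-steal round Thaler describes is what makes the show a one-shot prisoner's dilemma: stealing pays at least as much as splitting no matter what the opponent does. A minimal sketch of that payoff logic (an editorial illustration, not from the paper he mentions; the jackpot figure is the one he cites):

```python
# Golden Balls split-or-steal payoffs for a 100,000-pound jackpot (illustrative).
JACKPOT = 100_000

# (my_choice, their_choice) -> (my_payoff, their_payoff)
payoffs = {
    ("split", "split"): (JACKPOT / 2, JACKPOT / 2),
    ("split", "steal"): (0, JACKPOT),
    ("steal", "split"): (JACKPOT, 0),
    ("steal", "steal"): (0, 0),
}

# "Steal" (defect) weakly dominates "split": against either opponent choice,
# stealing pays at least as much as splitting, hence the textbook prediction
# that everyone defects.
for theirs in ("split", "steal"):
    mine_if_split = payoffs[("split", theirs)][0]
    mine_if_steal = payoffs[("steal", theirs)][0]
    print(f"opponent plays {theirs}: split -> {mine_if_split:,.0f}, steal -> {mine_if_steal:,.0f}")
```

The behavioral finding Thaler highlights is that real contestants cooperate far more than this analysis predicts, especially when the stakes feel small relative to what others on the show are playing for.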
Platform Strategies for Open Government Innovation
New paper by B. Cleland, B. Galbraith, B. Quinn, and P. Humphreys: “The concept of Open Innovation, that inflows and outflows of knowledge can accelerate innovation, has attracted a great deal of research in recent years (Dahlander and Gann, 2010; Fredberg et al, 2008). At the same time there has been a growing policy interest in Open Government, based in part on the assumption that open processes in the public sector can enable private sector innovation (Yu and Robinson, 2012). However, as pointed out by Huizingh (2011), there is a lack of practical guidance for managers. Furthermore, the specific challenges of implementing Open Innovation in the public sector have not been adequately addressed (Lee et al., 2012). Recent literature on technology platforms suggests a potentially useful framework for understanding the processes that underpin Open Innovation (Janssen and Estevez, 2013; O’Reilly, 2011). The paper reviews the literature on Open Innovation, e-Government and Platforms in order to shed light on the challenges of Open Government. It has been proposed that re-thinking government as a platform provider offers significant opportunities for value creation (Orszag, 2009), but a deeper understanding of platform architecture will be required to properly exploit those opportunities. Based on an examination of the literature we identify the core issues that are likely to characterise this new phenomenon.”
And Data for All: On the Validity and Usefulness of Open Government Data
Paper presented at the 13th International Conference on Knowledge Management and Knowledge Technologies: “Open Government Data (OGD) stands for a relatively young trend to make data that is collected and maintained by state authorities available for the public. Although various Austrian OGD initiatives have been started in the last few years, little is known about the validity and the usefulness of the data offered. Based on the data-set on Vienna’s stock of trees, we address two questions in this paper. First of all, we examine the quality of the data by validating it according to knowledge from a related discipline. It shows that the data-set we used correlates with findings from meteorology. Then, we explore the usefulness and exploitability of OGD by describing a concrete scenario in which this data-set can be supportive for citizens in their everyday life and by discussing further application areas in which OGD can be beneficial for different stakeholders and even commercially used.”