Book Description (Edited By Emmanuelle Avril and Christine Zumello): “The advent of globalisation and the continued development of new information technology has created an environment in which the one certainty for organisations is that they cannot cling to archaic, centralised and hierarchical models. The increased fluidity and speed of the global environment call for horizontal networked structures, where decisions are achieved through collaborative mechanisms, rather than pyramidal models. New processes have been emerging, in particular the practices of deliberative and participatory governance, with increased stakeholder and citizen inclusion and participation, greater use and reliance on networks of organisations, and efforts to resolve conflict through dialogue. New forms of organizations, networks, coalitions and partnerships, as well as the promises of open sourcing and the collaborative horizontal model point towards a new governance apparatus in which relationship-based patterns can project and protect a human dimension in this digital world. This book will prove invaluable to all those who are interested in participatory governance and organisational change.”
The Big Data Debate: Correlation vs. Causation
Gil Press: “In the first quarter of 2013, the stock of big data has experienced sudden declines followed by sporadic bouts of enthusiasm. The volatility—a new big data “V”—continues and Ted Cuzzillo summed up the recent negative sentiment in “Big data, big hype, big danger” on SmartDataCollective:
“A remarkable thing happened in Big Data last week. One of Big Data’s best friends poked fun at one of its cornerstones: the Three V’s. The well-networked and alert observer Shawn Rogers, vice president of research at Enterprise Management Associates, tweeted his eight V’s: ‘…Vast, Volumes of Vigorously, Verified, Vexingly Variable Verbose yet Valuable Visualized high Velocity Data.’ He was quick to explain to me that this is no comment on Gartner analyst Doug Laney’s three-V definition. Shawn’s just tired of people getting stuck on V’s.”…
Cuzzillo is joined by a growing chorus of critics who challenge some of the breathless pronouncements of big data enthusiasts. Specifically, it looks like the backlash theme-of-the-month is correlation vs. causation, possibly in reaction to the success of Viktor Mayer-Schönberger and Kenneth Cukier’s recent big data book in which they argued for dispensing “with a reliance on causation in favor of correlation”…
In “Steamrolled by Big Data,” The New Yorker’s Gary Marcus declares that “Big Data isn’t nearly the boundless miracle that many people seem to think it is.”…
Matti Keltanen at The Guardian agrees, explaining “Why ‘lean data’ beats big data.” Writes Keltanen: “…the lightest, simplest way to achieve your data analysis goals is the best one…The dirty secret of big data is that no algorithm can tell you what’s significant, or what it means. Data then becomes another problem for you to solve. A lean data approach suggests starting with questions relevant to your business and finding ways to answer them through data, rather than sifting through countless data sets. Furthermore, purely algorithmic extraction of rules from data is prone to creating spurious connections, such as false correlations… today’s big data hype seems more concerned with indiscriminate hoarding than helping businesses make the right decisions.”
In “Data Skepticism,” O’Reilly Radar’s Mike Loukides adds this gem to the discussion: “The idea that there are limitations to data, even very big data, doesn’t contradict Google’s mantra that more data is better than smarter algorithms; it does mean that even when you have unlimited data, you have to be very careful about the conclusions you draw from that data. It is in conflict with the all-too-common idea that, if you have lots and lots of data, correlation is as good as causation.”
Isn’t more-data-is-better the same as correlation-is-as-good-as-causation? Or, in the words of Chris Anderson, “with enough data, the numbers speak for themselves.”
“Can numbers actually speak for themselves?” non-believer Kate Crawford asks in “The Hidden Biases in Big Data” on the Harvard Business Review blog and answers: “Sadly, they can’t. Data and data sets are not objective; they are creations of human design…”
And David Brooks in The New York Times, while probing the limits of “the big data revolution,” takes the discussion to yet another level: “One limit is that correlations are actually not all that clear. A zillion things can correlate with each other, depending on how you structure the data and what you compare. To discern meaningful correlations from meaningless ones, you often have to rely on some causal hypothesis about what is leading to what. You wind up back in the land of human theorizing…”
New book: Disasters and the Networked Economy
Book description: “Mainstream quantitative analysis and simulations are fraught with difficulties and are intrinsically unable to deal appropriately with long-term macroeconomic effects of disasters. In this new book, J.M. Albala-Bertrand develops the themes introduced in his past book, The Political Economy of Large Natural Disasters (Clarendon Press, 1993), to show that societal networking and disaster localization constitute part of an essential framework to understand disaster effects and responses.
The author’s last book argued that disasters were a problem of development, rather than a problem for development. This volume takes the argument forward both in terms of the macroeconomic effects of disaster and development policy, arguing that economy and society are not inert objects, but living organisms. Using a framework based on societal networking and the economic localization of disasters, the author shows that societal functionality (defined as the capacity of a system to survive, reproduce and develop) is unlikely to be impaired by natural disasters.”
Department of Better Technology
Next City reports: “…opening up government can get expensive. That’s why two developers this week launched the Department of Better Technology, an effort to make open government tools cheaper, more efficient and easier to engage with.
As founder Clay Johnson explains in a post on the site’s blog, a federal website that catalogues databases on government contracts, which launched last year, cost $181 million to build — $81 million more than a recent research initiative to map the human brain.
“I’d like to say that this is just a one-off anomaly, but government regularly pays millions of dollars for websites,” writes Johnson, the former director of Sunlight Labs at the Sunlight Foundation and author of the 2012 book The Information Diet.
The first undertaking of Johnson and his partner, GovHub co-founder Adam Becker, is a tool meant to make it simpler for businesses to find government projects to bid on, as well as help officials streamline the process of managing procurements. In a pilot experiment, Johnson writes, the pair found that not only were bids coming in faster and at a reduced price, but more people were doing the bidding.
Per Johnson, “many of the bids that came in were from businesses that had not ordinarily contracted with the federal government before.”
The Department of Better Technology will accept five cities to test a beta version of this tool, called Procure.io, in 2013.”
The transformation of democratic taxation states into post-democratic banking states
John Keane, Professor of Politics, in The Conversation: “The extraordinary bounce-back reveals the most disturbing, but least obvious, largely invisible, feature of the unfinished European crisis: the transformation of democratic taxation states into post-democratic banking states.
What is meant by this mouthful? The Austrian economist Joseph Schumpeter long ago pointed out how modern European states (at first they were monarchies, later most became republics) fed upon taxes extracted from their subject populations. The point is still emphasised by government and politics textbooks. Usually this is done by noting that under democratic conditions elected governments are expected to satisfy the needs and respond to the demands of citizens by providing various goods and services paid for through taxation granted by their consent. Behind this observation stands the presumption that the creation and circulation of money is the prerogative of the state. ‘Money is a creature of the legal order’, wrote Georg Friedrich Knapp in his classic State Theory of Money (1905)….
Slowly but surely, in most European democracies, the power to create and regulate money has effectively been privatised. Without much public commentary or public resistance, governments of recent decades have surrendered their control over a vital resource, with the result that commercial banks and credit institutions now have much more ‘spending power’ than elected governments. In a most interesting new book, the acclaimed historian Harold James has described how this out-flanking of European states by banks and credit institutions was reinforced at the supra-national level, disastrously it turns out, by the formation of the independent European Central Bank….
The principle of no taxation without representation was one of the most important of these innovations. Born of deep tensions between citizen creditors and monarchs in the prosperous Low Countries, it proved to be revolutionary. In late 16th-century cities such as Amsterdam and Bruges, influential men with money to invest demanded, as citizens, that they should only agree to lend money to governments, and to pay their taxes, if in return they were granted the power to decide who governs them. The principle was first formulated in the name of democracy (democratie) in a remarkable Dutch-language pamphlet called The Discourse (it’s analysed in detail in The Life and Death of Democracy). Its author is unknown….
Sure, these political proposals and reforms are better than nothing, but if my short history of banks and democracy is plausible then it suggests that a much tougher and more innovative program of democratisation is needed. If the aim is to ‘throw as many wrenches as possible into the works of haute finance’ (Wolfgang Streeck), then organised pressures from below, from both voters and civil society networks, will be vital.”
Churnalism
The Sunlight Foundation and the Media Standards Trust launched Churnalism US, “a new web tool and browser extension that allows anyone to compare the news you read against existing content to uncover possible instances of plagiarism” (churned from their blog post).
The new tool is inspired by the UK site “churnalism.com” (a project of the Media Standards Trust). According to the FAQ of Churnalism.com:
‘Churnalism’ is a news article that is published as journalism, but is essentially a press release without much added. In his landmark book, Flat Earth News, Nick Davies wrote how ‘churnalism’ is produced by “Journalists who are no longer gathering news but are reduced instead to passive processors of whatever material comes their way, churning out stories, whether real event or PR artifice, important or trivial, true or false” (p.59).
According to the Cardiff University research that informed Davies’ book, 54% of news articles have some form of PR in them. The word ‘churnalism’ has been attributed to BBC journalist Waseem Zakir.
“Of course not all churnalism is bad. Some press releases are clearly in the public interest (medical breakthroughs, government announcements, school closures and so on). But even in these cases, it is better that people should know what press release the article is based on than for the source of the article to remain hidden.”
In a detailed blog post, Drew Vogel, a developer on Churnalism US, explains the nuts and bolts behind the site, which is fueled by a full-text search database named SuperFastMatch.
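As a rough illustration of the general idea behind this kind of text matching (this is a simplified sketch, not SuperFastMatch’s actual algorithm, and the sample texts are invented), one can measure what fraction of an article’s word n-grams also appear in a press release:

```python
# Minimal sketch of churnalism-style text matching via word n-gram
# ("shingle") overlap. NOT SuperFastMatch's actual algorithm.

def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
    """Return the set of n-word sequences occurring in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(article: str, press_release: str, n: int = 3) -> float:
    """Fraction of the article's n-grams also found in the press release."""
    a, b = shingles(article, n), shingles(press_release, n)
    return len(a & b) / len(a) if a else 0.0

# Hypothetical example texts:
release = "Acme Corp today announced record quarterly profits of ten million dollars"
article = "Acme Corp today announced record quarterly profits, pleasing investors"
print(f"{overlap(article, release):.0%} of the article's 3-grams match the release")
```

A high overlap score flags an article as a likely “churned” press release; a production system like SuperFastMatch does this at scale across a full-text index rather than pair by pair.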
Kaitlin Devine, another developer on Churnalism, provides a two-minute video tutorial on how Churnalism US works.
The GovLab
Steven Johnson, author of Future Perfect: “Peer-to-Patent stands as one of my favorite examples of peer progressive thinking at work. It brings in outside minds not directly affiliated with the government to help the government solve the problems it faces, effectively making a more porous boundary between citizen and state….I say all this to explain why I’m excited to be flying to NY tonight to help Noveck with her latest project, the Governance Lab at NYU, an extended, multidisciplinary investigation into new forms of participatory governance, backed by the Knight Foundation and the MacArthur Foundation…
I wrote Future Perfect in large part to capture all the thrilling new experiments and research into peer collaboration that I saw flourishing all around me, and to give those diverse projects the umbrella name of peer progressivism so that they could be more easily conceived as a unified movement. But I also wrote the book with the explicit assumption that we had a lot to learn about these systems. For starters, peer networks take a number of different forms: crowdfunding projects like Kickstarter are quite different from crowd-authored projects like open source software or Wikipedia; prize-backed challenges are a completely different beast altogether. For movement-building, it’s important to stress the commonalities between these different networks, but for practical application, we need to study the distinctions. And we need to avoid the easy assumption that decentralized, peer-based approaches will always outperform centralized ones.
The New Digital Age: Reshaping the Future of People, Nations and Business
The New Digital Age: Reshaping the Future of People, Nations and Business by Eric Schmidt and Jared Cohen, Knopf, 2013
Scientific American: “Schmidt, executive chairman of Google, and Cohen, director of Google Ideas and a foreign policy wonk who has advised Hillary Clinton, deliver their vision of the future in this ambitious, fascinating account. For gadget geeks, the book is filled with tantalizing examples of futuristic goods and services: robotic plumbers; automated haircuts; computers that read body language; and 3-D holographs of weddings projected into the living rooms of relatives who couldn’t attend. Not surprisingly, the authors are bullish on how connectivity—access to the Internet that will soon be nearly universal—will transform education, terrorism, journalism, government, privacy and war. The result, they argue, though not perfect, will be “more egalitarian, more transparent and more interesting than we can even imagine.”
Seeing is believing
Christopher Caldwell in the Financial Times, reviewing film academic Stephen Apkon’s new book The Age of the Image, argues that “The written word is becoming the language of a scholarly establishment”:
“Until recently, it was the essence of statesmanship, scholarship and justice to purge strong emotion from our deliberations. Images today, though, are so plentiful and sharp that they dominate our thought processes. Although Mr Apkon relishes the immediacy of YouTube, he fears that political advertisers will soon be able to craft stories around “hidden mental hungers”, easily manipulating voters.
Citizens tend to think about voting in one of two ways. First, you base your vote on your identity. You are a farmer, so you choose the candidate best disposed towards farmers. The second theory is that you vote on arguments, independent of identity. You believe a sales tax should replace income tax, so you vote for the candidate who shares that opinion. But today’s image-based communication has little to do with identity or arguments. It has to do with the lowest-common-denominator traits that mark you as a human animal.”