The Art of Data Visualization (video)


PBS Off Book: “Humans have a powerful capacity to process visual information, skills that date far back in our evolutionary lineage. And since the advent of science, we have employed intricate visual strategies to communicate data, often utilizing design principles that draw on these basic cognitive skills. In a modern world where we have far more data than we can process, the practice of data visualization has gained even more importance. From scientific visualization to pop infographics, designers are increasingly tasked with incorporating data into the media experience. Data has emerged as such a critical part of modern life that it has entered into the realm of art, where data-driven visual experiences challenge viewers to find personal meaning from a sea of information, a task that is increasingly present in every aspect of our information-infused lives.”

Technologies Of Choice?


New MIT book by Dorothea Kleine:  “Information and communication technologies (ICTs)–especially the Internet and the mobile phone–have changed the lives of people all over the world. These changes affect not just the affluent populations of income-rich countries but also disadvantaged people in both global North and South, who may use free Internet access in telecenters and public libraries, chat in cybercafes with distant family members, and receive information by text message or email on their mobile phones. Drawing on Amartya Sen’s capabilities approach to development–which shifts the focus from economic growth to a more holistic, freedom-based idea of human development–Dorothea Kleine in Technologies of Choice? examines the relationship between ICTs, choice, and development.
Kleine proposes a conceptual framework, the Choice Framework, that can be used to analyze the role of technologies in development processes. She applies the Choice Framework to a case study of microentrepreneurs in a rural community in Chile. Kleine combines ethnographic research at the local level with interviews with national policy makers, to contrast the high ambitions of Chile’s pioneering ICT policies with the country’s complex social and economic realities. She examines three key policies of Chile’s groundbreaking Agenda Digital: public access, digital literacy, and an online procurement system. The policy lesson we can learn from Chile’s experience, Kleine concludes, is the necessity of measuring ICT policies against a people-centered understanding of development that has individual and collective choice at its heart.”

The open parliament in the age of the internet


The World Bank’s Tiago Peixoto, reviewing Cristiano Faria’s book on Open Parliaments: “There are many reasons to read Cristiano’s piece, one of them being the scarcity of literature dealing with the usage of ICT by the legislative branch. I was honoured to be invited to write the preface to this book, in which I list a few other reasons why I think this book is very worthwhile reading. I have reproduced the preface below, with the addition of some hyperlinks.
***
Towards the end of the 18th Century, not long after the French Revolution, engineer Claude Chappe invented the optical telegraph. Also known as the Napoleonic Telegraph, this technological innovation enabled the transmission of messages over great distances at unprecedented speeds for its time. This novelty did not go unnoticed by the intellectuals of the period: the possibility of establishing a telegraph network that could connect individuals at high speed and lowered costs was seen as a unique opportunity for direct democracy to flourish.
The difficulties associated with direct democracy, so eloquently expressed by Rousseau just a few years earlier, no longer seemed relevant: simply opening the code used by the telegraph operators would suffice for a whirlpool of ideas to flow between citizens and government, bringing a new era of participatory decision-making. Events, however, took a different turn, and as time went by the enthusiasm for a democratic renewal faded away.
In the course of the centuries that followed, similar stories abounded. The emergence of each new ICT gave rise to a period of enthusiasm surrounding a renewal in politics and government, only to be followed by bitter disillusionment. While the causes of these historical experiences are multiple, it is safe to say that the failure of these technologies to deliver their much-heralded potential is underscored by a lack of understanding of the role of political institutions. These institutions are, inexorably, sources of obstacles and challenges that go beyond the reach of technological solutions.
Indeed, one could argue that despite the historical evidence, even today a certain amount of ingenuity permeates the majority of academic works in the domain of electronic democracy and open government, overestimating technological innovation and neglecting the role of institutions, actors, and their respective strategies.
Not falling prey to the techno-determinist temptation but rather carrying out an analysis grounded in institutions, organizational processes and actors’ strategies, is one of the many strengths of Cristiano Faria’s work…”

New Book: New Technology, Organizational Change and Governance


Book Description (Edited By Emmanuelle Avril and Christine Zumello): “The advent of globalisation and the continued development of new information technology has created an environment in which the one certainty for organisations is that they cannot cling to archaic, centralised and hierarchical models. The increased fluidity and speed of the global environment call for horizontal networked structures, where decisions are achieved through collaborative mechanisms, rather than pyramidal models. New processes have been emerging, in particular the practices of deliberative and participatory governance, with increased stakeholder and citizen inclusion and participation, greater use and reliance on networks of organisations, and efforts to resolve conflict through dialogue. New forms of organizations, networks, coalitions and partnerships, as well as the promises of open sourcing and the collaborative horizontal model point towards a new governance apparatus in which relationship-based patterns can project and protect a human dimension in this digital world. This book will prove invaluable to all those who are interested in participatory governance and organisational change.”

The Big Data Debate: Correlation vs. Causation


Gil Press: “In the first quarter of 2013, the stock of big data has experienced sudden declines followed by sporadic bouts of enthusiasm. The volatility—a new big data “V”—continues and Ted Cuzzillo summed up the recent negative sentiment in “Big data, big hype, big danger” on SmartDataCollective:
“A remarkable thing happened in Big Data last week. One of Big Data’s best friends poked fun at one of its cornerstones: the Three V’s. The well-networked and alert observer Shawn Rogers, vice president of research at Enterprise Management Associates, tweeted his eight V’s: ‘…Vast, Volumes of Vigorously, Verified, Vexingly Variable Verbose yet Valuable Visualized high Velocity Data.’ He was quick to explain to me that this is no comment on Gartner analyst Doug Laney’s three-V definition. Shawn’s just tired of people getting stuck on V’s.”…
Cuzzillo is joined by a growing chorus of critics that challenge some of the breathless pronouncements of big data enthusiasts. Specifically, it looks like the backlash theme-of-the-month is correlation vs. causation, possibly in reaction to the success of Viktor Mayer-Schönberger and Kenneth Cukier’s recent big data book in which they argued for dispensing “with a reliance on causation in favor of correlation”…
In “Steamrolled by Big Data,” The New Yorker’s Gary Marcus declares that “Big Data isn’t nearly the boundless miracle that many people seem to think it is.”…
Matti Keltanen at The Guardian agrees, explaining “Why ‘lean data’ beats big data.” Writes Keltanen: “…the lightest, simplest way to achieve your data analysis goals is the best one…The dirty secret of big data is that no algorithm can tell you what’s significant, or what it means. Data then becomes another problem for you to solve. A lean data approach suggests starting with questions relevant to your business and finding ways to answer them through data, rather than sifting through countless data sets. Furthermore, purely algorithmic extraction of rules from data is prone to creating spurious connections, such as false correlations… today’s big data hype seems more concerned with indiscriminate hoarding than helping businesses make the right decisions.”
In “Data Skepticism,” O’Reilly Radar’s Mike Loukides adds this gem to the discussion: “The idea that there are limitations to data, even very big data, doesn’t contradict Google’s mantra that more data is better than smarter algorithms; it does mean that even when you have unlimited data, you have to be very careful about the conclusions you draw from that data. It is in conflict with the all-too-common idea that, if you have lots and lots of data, correlation is as good as causation.”
Isn’t more-data-is-better the same as correlation-is-as-good-as-causation? Or, in the words of Chris Anderson, “with enough data, the numbers speak for themselves.”
“Can numbers actually speak for themselves?” non-believer Kate Crawford asks in “The Hidden Biases in Big Data” on the Harvard Business Review blog and answers: “Sadly, they can’t. Data and data sets are not objective; they are creations of human design…
And David Brooks in The New York Times, while probing the limits of “the big data revolution,” takes the discussion to yet another level: “One limit is that correlations are actually not all that clear. A zillion things can correlate with each other, depending on how you structure the data and what you compare. To discern meaningful correlations from meaningless ones, you often have to rely on some causal hypothesis about what is leading to what. You wind up back in the land of human theorizing…”
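Brooks’s point that “a zillion things can correlate with each other” is easy to demonstrate empirically. The sketch below (an illustration added here, not drawn from any of the quoted authors) generates a few hundred completely unrelated random-walk series and counts how many pairs nevertheless correlate strongly, purely by chance:

```python
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# 200 mutually independent random-walk series, 30 observations each
series = []
for _ in range(200):
    walk, level = [], 0.0
    for _ in range(30):
        level += random.gauss(0, 1)
        walk.append(level)
    series.append(walk)

# Count pairs that correlate "strongly" despite having no causal link at all
total_pairs = 200 * 199 // 2
strong = sum(
    1
    for i in range(len(series))
    for j in range(i + 1, len(series))
    if abs(pearson(series[i], series[j])) > 0.8
)
print(f"{strong} of {total_pairs} random pairs exceed |r| > 0.8")
```

Trending series (random walks) are notorious for this: with enough of them, many pairs will show high correlations that mean nothing, which is exactly why a causal hypothesis is needed to separate meaningful correlations from noise.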

New book: Disasters and the Networked Economy


Book description: “Mainstream quantitative analysis and simulations are fraught with difficulties and are intrinsically unable to deal appropriately with long-term macroeconomic effects of disasters. In this new book, J.M. Albala-Bertrand develops the themes introduced in his past book, The Political Economy of Large Natural Disasters (Clarendon Press, 1993), to show that societal networking and disaster localization constitute part of an essential framework to understand disaster effects and responses.
The author’s last book argued that disasters were a problem of development, rather than a problem for development. This volume takes the argument forward both in terms of the macroeconomic effects of disaster and development policy, arguing that economy and society are not inert objects, but living organisms. Using a framework based on societal networking and the economic localization of disasters, the author shows that societal functionality (defined as the capacity of a system to survive, reproduce and develop) is unlikely to be impaired by natural disasters.”

Department of Better Technology


Next City reports: “…opening up government can get expensive. That’s why two developers this week launched the Department of Better Technology, an effort to make open government tools cheaper, more efficient and easier to engage with.

As founder Clay Johnson explains in a post on the site’s blog, a federal website that catalogues databases on government contracts, which launched last year, cost $181 million to build — $81 million more than a recent research initiative to map the human brain.

“I’d like to say that this is just a one-off anomaly, but government regularly pays millions of dollars for websites,” writes Johnson, the former director of Sunlight Labs at the Sunlight Foundation and author of the 2012 book The Information Diet.

The first undertaking of Johnson and his partner, GovHub co-founder Adam Becker, is a tool meant to make it simpler for businesses to find government projects to bid on, as well as help officials streamline the process of managing procurements. In a pilot experiment, Johnson writes, the pair found that not only were bids coming in faster and at a reduced price, but more people were doing the bidding.

Per Johnson, “many of the bids that came in were from businesses that had not ordinarily contracted with the federal government before.”
The Department of Better Technology will accept five cities to test a beta version of this tool, called Procure.io, in 2013.”

The transformation of democratic taxation states into post-democratic banking states


John Keane, Professor of Politics, in The Conversation: “The extraordinary bounce-back reveals the most disturbing, but least obvious, largely invisible, feature of the unfinished European crisis: the transformation of democratic taxation states into post-democratic banking states.
What is meant by this mouthful? The Austrian economist Joseph Schumpeter long ago pointed out how modern European states (at first they were monarchies, later most became republics) fed upon taxes extracted from their subject populations. The point is still emphasised by government and politics textbooks. Usually this is done by noting that under democratic conditions elected governments are expected to satisfy the needs and respond to the demands of citizens by providing various goods and services paid for through taxation granted by their consent. Behind this observation stands the presumption that the creation and circulation of money is the prerogative of the state. ‘Money is a creature of the legal order’, wrote Georg Friedrich Knapp in his classic State Theory of Money (1905)….
Slowly but surely, in most European democracies, the power to create and regulate money has effectively been privatised. Without much public commentary or public resistance, governments of recent decades have surrendered their control over a vital resource, with the result that commercial banks and credit institutions now have much more ‘spending power’ than elected governments. In a most interesting new book, the acclaimed historian Harold James has described how this out-flanking of European states by banks and credit institutions was reinforced at the supra-national level, disastrously it turns out, by the formation of the independent European Central Bank….
The principle of no taxation without representation was one of the most important of these innovations. Born of deep tensions between citizen creditors and monarchs in the prosperous Low Countries, it proved to be revolutionary. In late 16th-century cities such as Amsterdam and Bruges, influential men with money to invest demanded, as citizens, that they should only agree to lend money to governments, and to pay their taxes, if in return they were granted the power to decide who governs them. The principle was first formulated in the name of democracy (democratie) in a remarkable Dutch-language pamphlet called The Discourse (it’s analysed in detail in The Life and Death of Democracy). Its author is unknown….
Sure, these political proposals and reforms are better than nothing, but if my short history of banks and democracy is plausible then it suggests that a much tougher and more innovative program of democratisation is needed. If the aim is to ‘throw as many wrenches as possible into the works of haute finance’ (Wolfgang Streeck), then organised pressures from below, from both voters and civil society networks, will be vital.”

Churnalism


‘Churnalism’ is a news article that is published as journalism, but is essentially a press release without much added.

The Sunlight Foundation and the Media Standards Trust launched Churnalism US, “a new web tool and browser extension that allows anyone to compare the news you read against existing content to uncover possible instances of plagiarism” (churned from their blog post).

The new tool is inspired by the UK site “churnalism.com” (a project of the Media Standards Trust). According to the FAQ of Churnalism.com:

‘Churnalism’ is a news article that is published as journalism, but is essentially a press release without much added. In his landmark book, Flat Earth News, Nick Davies wrote how ‘churnalism’ is produced by “Journalists who are no longer gathering news but are reduced instead to passive processors of whatever material comes their way, churning out stories, whether real event or PR artifice, important or trivial, true or false” (p.59).

According to the Cardiff University research that informed Davies’ book, 54% of news articles have some form of PR in them. The word ‘churnalism’ has been attributed to BBC journalist Waseem Zakir.

“Of course not all churnalism is bad. Some press releases are clearly in the public interest (medical breakthroughs, government announcements, school closures and so on). But even in these cases, it is better that people should know what press release the article is based on than for the source of the article to remain hidden.”

In a detailed blog post, Drew Vogel, a developer on Churnalism US, explains the nuts and bolts behind the site, which is fueled by a full-text search database named SuperFastMatch.
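The core idea behind this kind of churn detection (comparing an article against press releases to measure copied text) can be sketched with word n-gram “shingles.” This toy example is an illustration of the general technique only, not of SuperFastMatch’s actual implementation, and the sample texts are invented:

```python
def shingles(text, n=4):
    """Return the set of word n-grams ('shingles') in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(article, press_release, n=4):
    """Fraction of the article's n-grams that also appear in the press release."""
    a, p = shingles(article, n), shingles(press_release, n)
    if not a:
        return 0.0
    return len(a & p) / len(a)

# Hypothetical press release and a lightly edited "churned" article
release = ("Acme Corp today announced record quarterly earnings, "
           "driven by strong demand for its flagship widget line.")
article = ("Acme Corp today announced record quarterly earnings, "
           "driven by strong demand for its flagship widget line, "
           "the company said in a statement on Tuesday.")

print(f"overlap: {overlap(article, release):.0%}")
```

A high overlap score flags an article as likely churn; a real system like SuperFastMatch must also handle punctuation, reordering, and scale (matching each article against millions of stored documents), which is where the specialized full-text matching engine comes in.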

Kaitlin Devine, another developer on Churnalism, provides a two-minute tutorial on how Churnalism US works:

The GovLab


Steven Johnson, author of Future Perfect: “Peer-to-Patent stands as one of my favorite examples of peer progressive thinking at work. It brings in outside minds not directly affiliated with the government to help the government solve the problems it faces, effectively making a more porous boundary between citizen and state….I say all this to explain why I’m excited to be flying to NY tonight to help Noveck with her latest project, the Governance Lab at NYU, an extended, multidisciplinary investigation in new forms of participatory governance, backed by the Knight Foundation and the MacArthur Foundation…
I wrote Future Perfect in large part to capture all the thrilling new experiments and research into peer collaboration that I saw flourishing all around me, and to give those diverse projects the umbrella name of peer progressivism so that they could be more easily conceived as a unified movement. But I also wrote the book with the explicit assumption that we had a lot to learn about these systems. For starters, peer networks take a number of different forms: crowdfunding projects like Kickstarter are quite different from crowd-authored projects like open source software or Wikipedia; prize-backed challenges are a completely different beast altogether. For movement-building, it’s important to stress the commonalities between these different networks, but for practical application, we need to study the distinctions. And we need to avoid the easy assumption that decentralized, peer-based approaches will always outperform centralized ones.