Steven Weber, professor in the School of Information and the Department of Political Science at UC Berkeley, writing in Policy by the Numbers: “It’s commonly said that most people overestimate the impact of technology in the short term, and underestimate its impact over the longer term.
Where is Big Data in 2013? Starting to get very real, in our view, and right on the cusp of underestimation in the long term. The short term hype cycle is (thankfully) burning itself out, and the profound changes that data science can and will bring to human life are just now coming into focus. It may be that Data Science is right now about where the Internet itself was in 1993 or so. That’s roughly when it became clear that the World Wide Web was a wind that would blow across just about every sector of the modern economy while transforming foundational things we thought were locked in about human relationships, politics, and social change. It’s becoming a reasonable bet that Data Science is set to do the same—again, and perhaps even more profoundly—over the next decade. Just possibly, more quickly than that….
Can data, no matter how big, change the world for the better? It may be the case that in some fields of human endeavor and behavior, the scientific analysis of big data by itself will create such powerful insights that change will simply have to happen, that businesses will deftly re-organize, that health care will remake itself for efficiency and better outcomes, that people will adopt new behaviors that make them happier, healthier, more prosperous and peaceful. Maybe. But almost everything we know about technology and society across human history argues that it won’t be so straightforward. …join senior industry and academic leaders at DataEDGE at UC Berkeley on May 30-31 to engage in what will be a lively and important conversation aimed at answering today’s questions about the data science revolution—and formulating tomorrow’s.
New paper by Juliet E. Carlisle and Robert C. Patton, analyzing Facebook and the 2008 presidential election, in Political Research Quarterly: “This research conceptualizes political engagement in Facebook and examines the political activity of Facebook users during the 2008 presidential primary (T1) and general election (T2). Using a resource model, we test whether factors helpful in understanding offline political participation also explain political participation in Facebook. We consider resources (socioeconomic status [SES]) and political interest and also test whether network size works to increase political activity. We find that individual political activity in Facebook is not as extensive as popular accounts suggest. Moreover, the predictors associated with the resource model and Putnam’s theory of social capital do not hold true in Facebook.”
Mariano Blejman and Miguel Paz @ IJNet Blog: “We need a central repository where you can share the data that you have proved to be reliable. Our answer to this need: OpenData Latinoamérica, which we are leading as ICFJ Knight International Journalism Fellows.
Inspired by the open data portal created by ICFJ Knight International Journalism Fellow Justin Arenstein in Africa, OpenData Latinoamérica aims to improve the use of data in this region where data sets too often fail to show up where they should, and when they do, are scattered about the web at governmental repositories and multiple independent repositories where the data is removed too quickly.
The portal will be used at two big upcoming events: Bolivia’s first DataBootCamp and the Conferencia de Datos Abiertos (Open Data Conference) in Montevideo, Uruguay. Then, we’ll hold a series of hackathons and scrape-athons in Chile, which is in a period of presidential elections in which citizens increasingly demand greater transparency. Releasing data and developing applications for accountability will be the key.”
New Paper by Prof. Archon Fung in Politics and Society: “In Infotopia, citizens enjoy a wide range of information about the organizations upon which they rely for the satisfaction of their vital interests. The provision of that information is governed by principles of democratic transparency. Democratic transparency both extends and critiques current enthusiasms about transparency. It urges us to conceptualize information politically, as a resource to turn the behavior of large organizations in socially beneficial ways. Transparency efforts have targets, and we should think of those targets as large organizations: public and civic, but especially private and corporate. Democratic transparency consists of four principles. First, information about the operations and actions of large organizations that affect citizens’ interests should be rich, deep, and readily available to the public. Second, the amount of available information should be proportionate to the extent to which those organizations jeopardize citizens’ interests. Third, information should be organized and provided in ways that are accessible to individuals and groups that use that information. Finally, the social, political, and economic structures of society should be organized in ways that allow individuals and groups to take action based on Infotopia’s public disclosures.”
“The framers designed a constitutional system in which the government would play a vigorous role in securing the liberty and well-being of a large and diverse population. They built a political system around a number of key elements, including debate and deliberation, divided powers competing with one another, regular order in the legislative process, and avenues to limit and punish corruption. America in recent years has struggled to adhere to each of these principles, leading to a crisis of governability and legitimacy. The roots of this problem are twofold. The first is a serious mismatch between our political parties, which have become as polarized and vehemently adversarial as parliamentary parties, and a separation-of-powers governing system that makes it extremely difficult for majorities to act. The second is the asymmetric character of the polarization. The Republican Party has become a radical insurgency—ideologically extreme, scornful of facts and compromise, and dismissive of the legitimacy of its political opposition. Securing the common good in the face of these developments will require structural changes but also an informed and strategically focused citizenry.”
National Academies of Sciences: “Over the course of several decades, copyright protection has been expanded and extended through legislative changes occasioned by national and international developments. The content and technology industries affected by copyright and its exceptions, and in some cases balancing the two, have become increasingly important as sources of economic growth, relatively high-paying jobs, and exports. Since the expansion of digital technology in the mid-1990s, they have undergone a technological revolution that has disrupted long-established modes of creating, distributing, and using works ranging from literature and news to film and music to scientific publications and computer software.
In the United States and internationally, these disruptive changes have given rise to a strident debate over copyright’s proper scope and terms and means of its enforcement–a debate between those who believe the digital revolution is progressively undermining the copyright protection essential to encourage the funding, creation, and distribution of new works and those who believe that enhancements to copyright are inhibiting technological innovation and free expression.
Copyright in the Digital Era: Building Evidence for Policy examines a range of questions regarding copyright policy by using a variety of methods, such as case studies, international and sectoral comparisons, and experiments and surveys. This report is especially critical in light of digital age developments that may, for example, change the incentive calculus for various actors in the copyright system, impact the costs of voluntary copyright transactions, pose new enforcement challenges, and change the optimal balance between copyright protection and exceptions.”
The Guardian: “Since 2010 David Cameron’s pet project has been tasked with finding ways to improve society’s behaviour – and now the ‘nudge unit’ is going into business by itself. But have its initiatives really worked?….
The idea behind the unit is simpler than you might believe. People don’t always act in their own interests – by filing their taxes late, for instance, overeating, or not paying fines until the bailiffs call. As a result, they don’t just harm themselves, they cost the state a lot of money. By looking closely at how they make their choices and then testing small changes in the way the choices are presented, the unit tries to nudge people into leading better lives, and save the rest of us a fortune. It is politics done like science, effectively – with Ben Goldacre’s approval – and, in many cases, it appears to work….”
The White House Blog: “We can’t talk about We the People without getting into the numbers — more than 8 million users, more than 200,000 petitions, more than 13 million signatures. The sheer volume of participation is, to us, a sign of success.
And there’s a lot we can learn from a set of data that rich and complex, but we shouldn’t be the only people drawing from its lessons.
So starting today, we’re making it easier for anyone to do their own analysis or build their own apps on top of the We the People platform. We’re introducing the first version of our API, and we’re inviting you to use it.
Get started here: petitions.whitehouse.gov/developers
This API provides read-only access to data on all petitions that passed the 150-signature threshold required to become publicly available on the We the People site. For those who don’t need real-time data, we plan to add the option of a bulk data download in the near future. Until that’s ready, an incomplete sample data set is available for download here.”
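As a minimal sketch of what working with the read-only API might look like: the snippet below builds a query URL for a page of petitions and parses the JSON response. The endpoint URL and the `limit`/`offset` parameter names are assumptions based on the developer page linked above, not details stated in this excerpt.

```python
# Hypothetical sketch of querying the We the People read-only API.
# The API_BASE endpoint and the "limit"/"offset" parameters are assumptions;
# consult petitions.whitehouse.gov/developers for the actual v1 details.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.whitehouse.gov/v1/petitions.json"

def build_petitions_url(limit=10, offset=0):
    """Build a query URL for one page of publicly available petitions."""
    params = urllib.parse.urlencode({"limit": limit, "offset": offset})
    return f"{API_BASE}?{params}"

def fetch_petitions(limit=10, offset=0):
    """Fetch one page of petitions and return the list of petition records."""
    with urllib.request.urlopen(build_petitions_url(limit, offset)) as resp:
        payload = json.load(resp)
    # Assumes the response wraps petition records in a "results" array.
    return payload.get("results", [])

if __name__ == "__main__":
    for petition in fetch_petitions(limit=5):
        print(petition.get("title"), petition.get("signatureCount"))
```

Because the API is read-only, pagination via repeated requests (or the planned bulk download) would be the natural way to assemble the full data set for offline analysis.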
Next City reports: “…opening up government can get expensive. That’s why two developers this week launched the Department of Better Technology, an effort to make open government tools cheaper, more efficient and easier to engage with.
As founder Clay Johnson explains in a post on the site’s blog, a federal website that catalogues databases on government contracts, which launched last year, cost $181 million to build — $81 million more than a recent research initiative to map the human brain.
The first undertaking of Johnson and his partner, GovHub co-founder Adam Becker, is a tool meant to make it simpler for businesses to find government projects to bid on, as well as help officials streamline the process of managing procurements. In a pilot experiment, Johnson writes, the pair found that not only were bids coming in faster and at a reduced price, but more people were doing the bidding.
Per Johnson, “many of the bids that came in were from businesses that had not ordinarily contracted with the federal government before.”
The Department of Better Technology will accept five cities to test a beta version of this tool, called Procure.io, in 2013.”
Over the last few years, we have seen a variety of experiments with new ways to engage citizens in the decision-making process, especially at the local or community level. Little is known, however, about what works and why. The National League of Cities, working with the John S. and James L. Knight Foundation, released a report today reviewing the impact of experimentation within 14 communities in the US, highlighting several “bright spots”. The so-called scans focus on four aspects of community engagement:
The use of new tools and strategies
The ability to reach a broad spectrum of people, including those not typically “engaged”
Notable successes and outcomes
Sustainable efforts to use a range of strategies
A slide deck summarizing the findings of the report: