From the Digital Economy “Communities and Culture” Network: “The term ‘Big Data’ carries a great deal of currency in business and academic spheres. Data and their subsequent analysis are obviously not new. ‘Bigness’ in this context often refers to three characteristics that differentiate it from so-called ‘small’ data: volume, variety, and velocity. These three attributes of ‘bigness’, promising to open novel, macro-level perspectives on complex issues (Boyd and Crawford 2011), led enthusiasts like Chris Anderson to claim that ‘with enough data, the numbers speak for themselves’. But is this actually the case? Critical voices like Manovich (2011) argue that data never exist in ‘raw’ forms but are rather influenced by humans who—whether intentionally or not—select and construct them in certain ways.
These debates about data are relevant to wider discussions about digital change in society because they point to a more general concern about the potential of all sizes of data to selectively reveal dimensions of social phenomena on which decisions or policies are based. Crucially, if data generation and analysis are not entirely neutral but rather carry assumptions about what is ‘worthwhile’ or ‘acceptable’ to measure in the first place, then critical questions arise about whether preferences for certain types of research—particularly work conducted under the auspices of a Big Data ‘brand’—reflect coherent sets of values and worldviews. What assumptions underpin preferences for ‘evidence-based’ research based on data? What qualities does such a phrase signify or confer on research? Which ‘sizes’ of data qualify as ‘evidence’ in the first place, or, to play on Anderson’s words, what kinds of data are allowed to speak for themselves in the realms of policy, media, and advocacy?
Hosted at the ESRC Centre on Migration, Policy, and Society (COMPAS) and The Migration Observatory at the University of Oxford, this project critically interrogates the values that inform demands by civil society organisations for research that is ‘data-driven’ or ‘evidence-based’. Specifically, it aims to document the extent to which perceived advantages of data ‘bigness’ (volume, variety, and velocity) influence these demands.
Read the report.”
Francis Fukuyama’s ‘Political Order and Political Decay’
Book Review by David Runciman of “Political Order and Political Decay: From the Industrial Revolution to the Globalisation of Democracy”, by Francis Fukuyama in the Financial Times: “It is not often that a 600-page work of political science ends with a cliffhanger. But the first volume of Francis Fukuyama’s epic two-part account of what makes political societies work, published three years ago, left the big question unanswered. That book took the story of political order from prehistoric times to the dawn of modern democracy in the aftermath of the French Revolution. Fukuyama is still best known as the man who announced in 1989 that the birth of liberal democracy represented the end of history: there were simply no better ideas available. But here he hinted that liberal democracies were not immune to the pattern of stagnation and decay that afflicted all other political societies. They too might need to be replaced by something better. So which was it: are our current political arrangements part of the solution, or part of the problem?
Political Order and Political Decay is his answer. He squares the circle by insisting that democratic institutions are only ever one component of political stability. In the wrong circumstances they can be a destabilising force as well. His core argument is that three building blocks are required for a well-ordered society: you need a strong state, the rule of law and democratic accountability. And you need them all together. The arrival of democracy at the end of the 18th century opened up that possibility but by no means guaranteed it. The mere fact of modernity does not solve anything in the domain of politics (which is why Fukuyama is disdainful of the easy mantra that failing states just need to “modernise”).
The explosive growth in industrial capacity and wealth that the world has experienced in the past 200 years has vastly expanded the range of political possibilities available, for better and for worse (just look at the terrifying gap between the world’s best functioning societies – such as Denmark – and the worst – such as the Democratic Republic of Congo). There are now multiple different ways state capacity, legal systems and forms of government can interact with each other, and in an age of globalisation multiple different ways states can interact with each other as well. Modernity has speeded up the process of political development and it has complicated it. It has just not made it any easier. What matters most of all is getting the sequence right. Democracy doesn’t come first. A strong state does. …”
It’s All for Your Own Good
Book Review by Jeremy Waldron of Why Nudge? The Politics of Libertarian Paternalism by Cass R. Sunstein and Conspiracy Theories and Other Dangerous Ideas by Cass R. Sunstein: “…Nudging is about the self-conscious design of choice architecture. Put a certain choice architecture together with a certain heuristic and you will get a certain outcome. That’s the basic equation. So, if you want a person to reach a desirable outcome and you can’t change the heuristic she’s following, then you have to meddle with the choice architecture, setting up one that when matched with the given heuristic delivers the desirable outcome. That’s what we do when we nudge.
All of this sounds like a marketer’s dream, and I will say something about its abusive possibilities later. But Sunstein and Thaler have in mind that governments might do this in a way that promotes the interests of their citizens. Governments might also encourage businesses and employers to use it in the interests of their customers and employees. The result would be a sort of soft paternalism: paternalism without the constraint; a nudge rather than a shove; doing for people what they would do for themselves if they had more time or greater ability to pick out the better choice….
…allowing dignity to just drop out of the picture is offensive. For by this stage, dignity is not being mentioned at all. Sunstein does acknowledge that people might feel infantilized by being nudged. He says that “people should not be regarded as children; they should be treated with respect.” But saying that is not enough. We actually have to reconcile nudging with a steadfast commitment to self-respect.
Consider the earlier point about heuristics—the rules for behavior that we habitually follow. Nudging doesn’t teach me not to use inappropriate heuristics or to abandon irrational intuitions or outdated rules of thumb. It does not try to educate my choosing, for maybe I am unteachable. Instead it builds on my foibles. It manipulates my sense of the situation so that some heuristic—for example, a lazy feeling that I don’t need to think about saving for retirement—which is in principle inappropriate for the choice that I face, will still, thanks to a nudge, yield the answer that rational reflection would yield. Instead of teaching me to think actively about retirement, it takes advantage of my inertia. Instead of teaching me not to automatically choose the first item on the menu, it moves the objectively desirable items up to first place.
I still use the same defective strategies but now things have been arranged to make that work out better. Nudging takes advantage of my deficiencies in the way one indulges a child. The people doing this (up in Government House) are not exactly using me as a mere means in violation of some Kantian imperative. They are supposed to be doing it for my own good. Still, my choosing is being made a mere means to my ends by somebody else—and I think this is what the concern about dignity is all about….”
The Impact of Open Government on Innovation: Does Government Transparency Drive Innovation?
Paper by Deogirikar, Anjelika: “This study adds to the body of research on open government by empirically measuring the association of government transparency and innovation. The study uses Transparency International’s Corruption Perceptions Index (CPI) as a proxy measure of government transparency. It assumes that an increase in government transparency increases applied innovation activity, which is measured as the number of annual patents by country residents. The study also tests whether the association is different for countries participating in the Open Government Partnership (OGP), a voluntary multi-stakeholder international collaboration of 63 countries who have committed to make their governments more transparent. The analysis uses fixed effects regression on panel data from 1996 to 2011 for 95 countries, including 54 OGP members. Although the empirical results do not support the hypothesis that transparency and innovation are positively correlated for countries participating in the OGP, this finding contributes to the literature on open government by making an initial attempt to quantify the association of transparency and innovation. Additional future research demonstrating a positive relationship between transparency and innovation could help to justify implementation of open government policies and participation in the Open Government Partnership.”
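The abstract's method, fixed effects regression on panel data, can be sketched in miniature. The snippet below is a hedged illustration, not the study's actual analysis: country names, coefficients, and all values are synthetic. It shows the core idea of the "within" estimator: demeaning each variable inside each country removes that country's unobserved baseline (the fixed effect) before ordinary least squares estimates the transparency coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic panel: 5 countries observed over 10 years (all values invented).
n_countries, n_years = 5, 10
country = np.repeat(np.arange(n_countries), n_years)

# Each country has its own unobserved baseline (the "fixed effect").
alpha = rng.normal(0.0, 2.0, n_countries)

transparency = rng.normal(50.0, 10.0, n_countries * n_years)  # CPI-like proxy
true_beta = 0.3  # assumed true effect, for illustration only
patents = alpha[country] + true_beta * transparency \
    + rng.normal(0.0, 0.5, country.size)

def demean_by_group(x, g):
    # Subtract each group's mean from its observations
    # (the "within transformation" that absorbs fixed effects).
    means = np.bincount(g, weights=x) / np.bincount(g)
    return x - means[g]

x_w = demean_by_group(transparency, country)
y_w = demean_by_group(patents, country)

# One-regressor OLS on the demeaned data recovers the slope.
beta_hat = (x_w @ y_w) / (x_w @ x_w)
print(round(beta_hat, 3))
```

In practice a study like this would use a panel-regression package with clustered standard errors rather than hand-rolled demeaning, but the estimator is the same.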
Forget GMOs. The Future of Food Is Data—Mountains of It
Cade Metz at Wired: “… Led by Dan Zigmond—who previously served as chief data scientist for YouTube, then Google Maps—this ambitious project aims to accelerate the work of all the biochemists, food scientists, and chefs on the first floor, providing a computer-generated shortcut to what Hampton Creek sees as the future of food. “We’re looking at the whole process,” Zigmond says of his data team, “trying to figure out what it all means and make better predictions about what is going to happen next.”
Zigmond’s project is the first major effort to apply “big data” to the development of food, and though it’s only just getting started—with some experts questioning how effective it will be—it could spur additional research in the field. The company may license its database to others, and Hampton Creek founder and CEO Josh Tetrick says it may even open source the data, so to speak, freely sharing it with everyone. “We’ll see,” says Tetrick, a former college football linebacker who founded Hampton Creek after working on economic and social campaigns in Liberia and Kenya. “That would be in line with who we are as a company.”…
Initially, Zigmond and his team will model protein interactions on individual machines, using tools like the R programming language (a common means of crunching data) and machine learning algorithms much like those that recommend products on Amazon.com. As the database expands, they plan to arrange for much larger and more complex models that run across enormous clusters of computer servers, using the sort of sweeping data-analysis software systems employed by the likes of Google. “Even as we start to get into the tens and hundreds of thousands and millions of proteins,” Zigmond says, “it starts to be more than you can handle with traditional database techniques.”
In particular, Zigmond is exploring the use of deep learning, a form of artificial intelligence that goes beyond ordinary machine learning. Google is using deep learning to drive the speech recognition system in Android phones. Microsoft is using it to translate Skype calls from one language to another. Zigmond believes it can help model the creation of new foods….”
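The article compares Hampton Creek's approach to the machine learning algorithms behind product recommenders. A common technique in that family is item-to-item cosine similarity; the sketch below applies it to a toy protein-property matrix. Everything here is hypothetical (the protein names, the features, and the values are invented for illustration, not Hampton Creek's data or method).

```python
import numpy as np

# Hypothetical feature matrix: rows are proteins, columns are measured
# properties (e.g. gelation, solubility, emulsification). Values invented.
proteins = ["pea", "canola", "sorghum", "lentil"]
features = np.array([
    [0.90, 0.20, 0.70],
    [0.80, 0.30, 0.60],
    [0.10, 0.90, 0.20],
    [0.85, 0.25, 0.65],
])

# Cosine similarity, the same idea behind item-to-item recommenders:
# normalize each row to unit length, then take pairwise dot products.
unit = features / np.linalg.norm(features, axis=1, keepdims=True)
similarity = unit @ unit.T

# Find the protein most similar to "pea" (excluding itself) --
# a recommender would suggest it as a functional substitute.
i = proteins.index("pea")
order = np.argsort(similarity[i])[::-1]
best = next(j for j in order if j != i)
print(proteins[best])
```

At the scale Zigmond describes, the same computation would be distributed across a cluster rather than done on one machine, but the similarity logic is unchanged.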
Redesigning that first encounter with online government
Nancy Scola in the Washington Post: “Teardowns,” Samuel Hulick calls them, and by that he means his step-by-step dissections of how some of the world’s most popular digital services — Gmail, Evernote, Instagram — welcome new users. But the term might give an overly negative sense of what Hulick is up to. The Portland, Ore., user-experience designer highlights both the good and bad in his critiques, and his annotated slideshows, under the banner of UserOnboard, have gained a following among design aficionados.
Using the original UserOnboard is like taking a tour through some of the digital sites you know best — but with an especially design-savvy friend by your side pointing out the kinks. “The user experience,” or UX on these sites, “is often tacked on haphazardly,” says Hulick, who launched UserOnboard in December 2013 and who is also the author of the recent book “The Elements of User Onboarding.” What he’s looking for in a good UX, he says, is something non-designers can spot, too. “If you were the Web site, what tone would you take? How would you guide people through your process?”
Hulick reviews what’s working and what’s not, and adds a bit of sass: Gmail pre-populates its inbox with a few welcome messages: “Preloading some emails is a nice way to deal with the ‘cold start’ problem,” Hulick notes. Evernote nudges new users to check out its blog and other apps: “It’s like a restaurant rolling out the dessert cart while I’m still trying to decide if I even want to eat there.” Instagram’s first backdrop is a photo of someone taking a picture: “I’m learning how to Instagram by osmosis!”….
CitizenOnboard’s pitch is to get the public to do that same work. They suggest starting with state food stamp programs. Hulick tackled his. The onboarding for Oregon’s SNAP service is 118 slides long, but that’s because there is much to address. In one step, applicants must, using a drop-down menu, identify how those in their family are related to one another. “It took a while to figure out who should be the relation ‘of’ the other,” Hulick notes in his teardown. “In fact, I’m still not 100% sure I got it right.”…”
Proof: How Crowdsourced Election Monitoring Makes a Difference
Patrick Meier at iRevolution: “My colleagues Catie Bailard & Steven Livingston have just published the results of their empirical study on the impact of citizen-based crowdsourced election monitoring. Readers of iRevolution may recall that my doctoral dissertation analyzed the use of crowdsourcing in repressive environments and specifically during contested elections. This explains my keen interest in the results of my colleagues’ new data-driven study, which suggests that crowdsourcing does have a measurable and positive impact on voter turnout.
Catie and Steven are “interested in digitally enabled collective action initiatives” spearheaded by “nonstate actors, especially in places where the state is incapable of meeting the expectations of democratic governance.” They are particularly interested in measuring the impact of said initiatives. “By leveraging the efficiencies found in small, incremental, digitally enabled contributions (an SMS text, phone call, email or tweet) to a public good (a more transparent election process), crowdsourced elections monitoring constitutes [an] important example of digitally-enabled collective action.” To be sure, “the successful deployment of a crowdsourced elections monitoring initiative can generate information about a specific political process—information that would otherwise be impossible to generate in nations and geographic spaces with limited organizational and administrative capacity.”
To this end, their new study tests for the effects of citizen-based crowdsourced election monitoring efforts on the 2011 Nigerian presidential elections. More specifically, they analyzed close to 30,000 citizen-generated reports of failures, abuses and successes which were publicly crowdsourced and mapped as part of the Reclaim Naija project. Controlling for a number of factors, Catie and Steven find that the number and nature of crowdsourced reports is “significantly correlated with increased voter turnout.”
In conclusion, the authors argue that “digital technologies fundamentally change information environments and, by doing so, alter the opportunities and constraints that the political actors face.” This new study is an important contribution to the literature and should be required reading for anyone interested in digitally-enabled, crowdsourced collective action. Of course, the analysis focuses on “just” one case study, which means that the effects identified in Nigeria may not occur in other crowdsourced, election monitoring efforts. But that’s another reason why this study is important—it will no doubt catalyze future research to determine just how generalizable these initial findings are.”
Announcing New U.S. Open Government Commitments on the Third Anniversary of the Open Government Partnership
US White House Fact Sheet: “Three years ago, President Obama joined with the leaders of seven other nations to launch the Open Government Partnership (OGP), an international partnership between governments and civil society to promote transparency, fight corruption, energize civic engagement, and leverage new technologies to open up governments worldwide. The United States and other founding countries pledged to transform the way that governments serve their citizens in the 21st century. Today, as heads of state of OGP participating countries gather at the UN General Assembly, this partnership has grown from 8 to 65 nations and hundreds of civil society organizations around the world. These countries are embracing the challenge by taking steps in partnership with civil society to increase the ability of citizens to engage their governments, access government data to fuel entrepreneurship and innovation, and promote accountability….
The United States is committed to continuing to lead by example in OGP. Since assuming office, President Obama has prioritized making government more open and accountable and has taken substantial steps to increase citizen participation, collaboration with civil society, and transparency in government. The United States will remain a global leader of international efforts to promote transparency, stem corruption and hold to account those who exploit the public’s trust for private gain. Yesterday, President Obama announced several steps the United States is taking to deepen our support for civil society globally.
Today, to mark the third anniversary of OGP, President Obama is announcing four new and expanded open government initiatives that will advance our efforts through the end of 2015.
1. Promote Open Education to Increase Awareness and Engagement
Open education is the open sharing of digital learning materials, tools, and practices that ensures free access to and legal adoption of learning resources. The United States is committed to open education and will:
- Raise open education awareness and identify new partnerships. The U.S. Department of State, the U.S. Department of Education, and the Office of Science and Technology Policy will jointly host a workshop on challenges and opportunities in open education internationally with stakeholders from academia, industry, and government.
- Pilot new models for using open educational resources to support learning. The State Department will conduct three pilots overseas by December 2015 that use open educational resources to support learning in formal and informal learning contexts. The pilots’ results, including best practices, will be made publicly available for interested educators.
- Launch an online skills academy. The Department of Labor (DOL), with cooperation from the Department of Education, will award $25 million through competitive grants to launch an online skills academy in 2015 that will offer open online courses of study, using technology to create high-quality, free, or low-cost pathways to degrees, certificates, and other employer-recognized credentials.
2. Deliver Government Services More Effectively Through Information Technology
The Administration is committed to serving the American people more effectively and efficiently through smarter IT delivery. The newly launched U.S. Digital Service will work to remove barriers to digital service delivery and remake the experience that people and businesses have with their government. To improve delivery of Federal services, information, and benefits, the Administration will:
- Expand digital service delivery expertise in government. Throughout 2015, the Administration will continue recruiting top digital talent from the private and public sectors to expand services across the government. These individuals—who have expertise in technology, procurement, human resources, and financing—will serve as digital professionals in a number of capacities in the Federal government, including the new U.S. Digital Service and 18F digital delivery team within the U.S. General Services Administration, as well as within Federal agencies. These teams will take best practices from the public and private sectors and scale them across agencies with a focus on the customer experience.
- Build digital services in the open. The Administration will expand its efforts to build digital services in the open. This includes using open and transparent processes intended to better understand user needs, testing pilot digital projects, and designing and developing digital services at scale. In addition, building on the recently published Digital Services Playbook, the Administration will continue to openly publish best practices on collaborative websites that enable the public to suggest improvements.
- Adopt an open source software policy. Using and contributing back to open source software can fuel innovation, lower costs, and benefit the public. No later than December 31, 2015, the Administration will work through the Federal agencies to develop an open source software policy that, together with the Digital Services Playbook, will support improved access to custom software code developed for the Federal government.
3. Increase Transparency in Spending
The Administration has made an increasing amount of Federal spending data publicly available and searchable, allowing nationwide stakeholders to perform analysis of Federal spending. The Administration will build on these efforts by committing to:
- Improve USAspending.gov. In 2015, the Administration will launch a refreshed USAspending.gov website that will improve the site’s design and user experience, including better enabling users to explore the data using interactive maps and improving the search functionality and application programming interface.
- Improve accessibility and reusability of Federal financial data. In 2015, as part of implementation of the DATA Act, the Administration will work to improve the accessibility and reusability of Federal financial data by issuing data element definition standards and standards for exchanging financial data. The Administration, through the Office of Management and Budget, will leverage industry data exchange standards to the extent practicable to maximize the sharing and utilization of Federal financial data.
- Explore options for visualization and publication of additional Federal financial data. The Administration, through the Treasury Department, will use small-scale pilots to help explore options for visualizing and publishing Federal financial data from across the government as required by the DATA Act.
- Continue to engage stakeholders. The Administration will continue to engage with a broad group of stakeholders to seek input on Federal financial transparency initiatives including DATA Act implementation, by hosting town hall meetings, conducting interactive workshops, and seeking input via open innovation collaboration tools.
4. Use Big Data to Support Greater Openness and Accountability
President Obama has recognized the growing importance of “big data” technologies for our economy and the advancement of public good in areas such as education, energy conservation, and healthcare. The Administration is taking action to ensure responsible uses of big data to promote greater openness and accountability across a range of areas and sectors. As part of the work it is doing in this area, the Administration has committed to:
- Enhance sharing of best practices on data privacy for state and local law enforcement. Federal agencies with expertise in law enforcement, privacy, and data practices will seek to enhance collaboration and information sharing about privacy best practices among state and local law enforcement agencies receiving Federal grants.
- Ensure privacy protection for big data analyses in health. Big data introduces new opportunities to advance medicine and science, improve health care, and support better public health. To ensure that individual privacy is protected while capitalizing on new technologies and data, the Administration, led by the Department of Health and Human Services, will: (1) consult with stakeholders to assess how Federal laws and regulations can best accommodate big data analyses that promise to advance medical science and reduce health care costs; and (2) develop recommendations for ways to promote and facilitate research through access to data while safeguarding patient privacy and autonomy.
- Expand technical expertise in government to stop discrimination. U.S. Government departments and agencies will work to expand their technical expertise to identify outcomes facilitated by big data analytics that may have a discriminatory impact on protected classes. …”
Data Visualization: Principles and Practice (Second Edition)
Book by Alexandru C. Telea: “This book explores the study of processing and visually representing data sets. Data visualization is closely related to information graphics, information visualization, scientific visualization, and statistical graphics. This second edition presents a better treatment of the relationship between traditional scientific visualization and information visualization, a description of the emerging field of visual analytics, and updated techniques using the GPU and new generations of software tools and packages. This edition is also enhanced with exercises and downloadable code and data sets. See also Supplemental Material.
Plenario
About Plenario: “Plenario makes it possible to rethink the way we use open data. Instead of being constrained by the data that is accessible and usable, let’s start by formulating our questions and then find the data to answer them. Plenario makes this easy by tying together all datasets on one map and one timeline—because in the real world, everything affects everything else…
The problem
Over the past few years, levels of government from the federal administration to individual municipalities like the City of Chicago have begun embracing open data, releasing datasets publicly for free. This movement has vastly increased the amount of data available, but existing platforms and technologies are designed mainly to view and access individual datasets one at a time. This restriction contradicts decades of research contending that no aspect of the urban landscape is truly isolated; in today’s cities, everything is connected to everything else.
Furthermore, researchers are often limited in the questions they can ask by the data available to answer them. It is not uncommon to spend 75% of one’s time locating, downloading, cleaning, and standardizing the relevant datasets—leaving precious little resources for the important work.
What we do
Plenario is designed to take us from “spreadsheets on the web” to truly smart open data. This rests on two fundamental breakthroughs:
2) Unite all datasets along a single spatial and temporal index, making it possible to do complex aggregations with one query.
With these advances, Plenario allows users to study regions over specified time periods using all relevant data, regardless of original source, and represent the data as a single time series. By providing a single, centralized hub for open data, the Plenario platform enables urban scientists to ask the right questions with as few constraints as possible….
Plenario is being implemented by the Urban Center for Computation and Data and DataMade.”
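The "single spatial and temporal index" idea above can be sketched simply: snap every event from every dataset onto a shared space-time grid, and cross-dataset aggregation becomes one lookup. The dataset names, coordinates, and grid size below are invented for illustration; Plenario's actual implementation is a database-backed platform, not this toy.

```python
from collections import Counter

# Two hypothetical open datasets, each a list of (lat, lon, date) events.
# All names and values are invented for illustration.
crime = [(41.88, -87.63, "2014-09-01"), (41.88, -87.63, "2014-09-02"),
         (41.79, -87.60, "2014-09-01")]
potholes = [(41.881, -87.629, "2014-09-01"), (41.792, -87.601, "2014-09-02")]

def key(lat, lon, date, cell=0.01):
    # Snap coordinates to an integer grid cell so both datasets share
    # one spatial-temporal index (one map, one timeline).
    return (round(lat / cell), round(lon / cell), date)

index = Counter()
for name, events in (("crime", crime), ("potholes", potholes)):
    for lat, lon, date in events:
        index[(name, key(lat, lon, date))] += 1

# One "query": all events of any type in one grid cell on one day.
cell_key = key(41.88, -87.63, "2014-09-01")
total = sum(n for (name, k), n in index.items() if k == cell_key)
print(total)
```

The design choice is that the join key is computed from the data itself (location and time), so new datasets can be added without any schema coordination between their publishers.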