Experiments in Democracy


Jeremy Rozansky, assistant editor of National Affairs, in The New Atlantis: “In his debut book Uncontrolled, entrepreneur and policy analyst Jim Manzi argues that social scientists and policymakers should instead adopt the “experimental method.” The essential tool of this method is the randomized field trial (RFT), a technique that already informs many of our successful private enterprises. Perhaps the best known example of RFTs — one that Manzi uses to illustrate the concept — is the kind of clinical trial performed to test new medicines, wherein researchers “undertake a painstaking series of replicated controlled experiments to measure the effects of various interventions under various conditions,” as he puts it.
 
The central argument of Uncontrolled is that RFTs should be adopted more widely by businesses as well as government. The book is helpful and holds much wisdom — although the approach he recommends is ultimately just another streetlamp in the night, casting a pale light that tapers off after a few yards. Much still lies beyond its glow….
The econometric method now dominates the social sciences because it helps to cope with the problem of high causal density. It begins with a large data set: economic records, election results, surveys, and other similar big pools of data. Then the social scientist uses statistical techniques to model the interactions of sundry independent variables (causes) and a dependent variable (the effect). But for this method to work properly, social scientists must know all the causally important variables beforehand, because a hidden conditional could easily yield a false positive.
The experimental method, which Manzi prefers, offers a different way of coping with high causal density: sidestepping the problem of isolating exact causes. To sort out whether a given treatment or policy works, a scientist or social scientist can try it out on a random section of a population, and compare the results to a different section of the population where the treatment or policy was not implemented. So while econometric models aim to identify which particular variables are responsible for different results, RFTs have more modest aims, as they do not seek to identify every hidden conditional. By using the RFT approach, we may not know precisely why we achieved a desired effect, since we do not model all possible variables. But we can gain some ability to know that we will achieve a desired effect, at least under certain conditions.
Strictly speaking, even a randomized field trial only tells us with certainty that some exact technique worked with some specific population on some specific date in the past when conducted by some specific experimenters. We cannot know whether a given treatment or policy will work again under the same conditions at a later date, much less on a different population, much less still on the population as a whole. But scientists must always be cautious about moving from particular results to general conclusions; this is why experiments need to be replicated. And the more we do replicate them, the more information we can gain from those particular results, and the more reliably they can build toward teaching us which treatments or policies might work or (more often) which probably won’t. The result is that the RFT approach is very well suited to the business of government, since policymakers usually only need to know whether a given policy will work — whether it will produce a desired outcome.”
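The logic Manzi describes can be sketched in a few lines: randomly assign a population to treatment and control groups, apply the intervention to one, and compare average outcomes. The sketch below is a minimal illustration of that idea, not code from the book; the function names and numbers are hypothetical:

```python
import random
import statistics

def run_rft(population, apply_treatment, measure_outcome, seed=0):
    """Randomly split a population in two, treat one half, and
    estimate the effect as the difference in mean outcomes."""
    rng = random.Random(seed)
    shuffled = population[:]
    rng.shuffle(shuffled)          # random assignment is the crux of the method
    half = len(shuffled) // 2
    treatment, control = shuffled[:half], shuffled[half:]
    treated = [measure_outcome(apply_treatment(p)) for p in treatment]
    untreated = [measure_outcome(p) for p in control]
    # No attempt to model hidden variables: we only estimate the net effect.
    return statistics.mean(treated) - statistics.mean(untreated)
```

Replication, in this picture, just means re-running the trial with a different seed and a different population and checking whether the estimated effect persists.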
 

New certificates launched to help everyone discover, understand, and use open data


Press Release from the Open Data Institute: “The ODI is today launching Open Data Certificates to help everyone find, understand and use open data that is being released. The new certificates are being announced by CEO Gavin Starks at a G8 Summit event: Open for Growth. The certificates have been created in response to business, government, and citizen needs to bring rigour to the publication, dissemination and usage of open data. Over the last six months, ODI has been collaborating with dozens of organisations around the world to define the certificates. Today sees their first Beta release.
The certificate is made up of two components:
1) a visual mark that shows the quality level of the data
2) a human- and machine-readable description of the data being released
There are four levels of certificates:
Raw: A great start at the basics of publishing open data.
Pilot: Data users receive extra support from, and provide feedback to the publisher.
Standard: Regularly published open data with robust support that people can rely on.
Expert: An exceptional example of information infrastructure.
Benefits of the certificates include helping:

  • publishers of data understand how they can better connect with their users;
  • users of data understand its quality, licensing, structure, and usability;
  • businesses, entrepreneurs, and innovators gain confidence that the data has value to them;
  • policy-makers benchmark and compare the progress and quality of the data released.

Commercial and public sector organisations have already committed to the certificates including:
– Open Corporates: corporate information for over 50 million companies worldwide
– OpenStreetMap: the free wiki world map offering worldwide open geodata
– legislation.gov.uk: 500 years of UK legislation information
– amee: an environmental score for each of the 2.7 million companies in Britain
– MastodonC: energy monitoring data analysis from Retrofit for the Future projects
– Placr: transport data covering all 360,000 stops and stations nationwide
Certificates are created online, for free, at http://certificates.theodi.org/. The process involves publishers answering a series of questions, each of which affects the certificate generated at the end. Read Minister for the Cabinet Office Francis Maude’s opening remarks at the conference, with the emphasis firmly on open data and transparency.”

Mozilla Science Lab


Mark Surman in Mozilla Blog: “We’re excited to announce the launch of the Mozilla Science Lab, a new initiative that will help researchers around the world use the open web to shape science’s future.
Scientists created the web — but the open web still hasn’t transformed scientific practice to the same extent we’ve seen in other areas like media, education and business. For all of the incredible discoveries of the last century, science is still largely rooted in the “analog” age. Credit systems in science are still largely based around “papers,” for example, and as a result researchers are often discouraged from sharing, learning, reusing, and adopting the type of open and collaborative learning that the web makes possible.
The Science Lab will foster dialog between the open web community and researchers to tackle this challenge. Together they’ll share ideas, tools, and best practices for using next-generation web solutions to solve real problems in science, and explore ways to make research more agile and collaborative….
With support from the Alfred P. Sloan Foundation, Mozilla Science Lab will start by convening a broad conversation about open web approaches and skills training, working with existing tool developers and supporting a global community of researchers.
Get involved
Stay tuned for more about how you can join the conversation. In the meantime, you can:

UK launches Information Economy Strategy


Open Data Institute: “The Information Economy Strategy sets out a range of key actions, including:

  • Digitally transforming 25 of the top 50 UK public services over the next 300 days, including plans to give businesses a single, online view of their tax records.
  • Launching a new programme to help 1.6 million SMEs scale up their business online over the next five years.
  • Publishing a data capability strategy in October 2013, developed in partnership with government, industry and academia. The strategy will build on the recommendations in Stephan Shakespeare’s review of Public Sector Information and the Prime Minister’s Council for Science and Technology’s report on algorithms, and will be published alongside the Open Government Partnership National Action Plan.
  • Establishing the world’s first facility for testing state of the art 5G mobile technology, working with industry and the University of Surrey.”

ResearchGate Tackles Social Networking for Scientists


Meredith Salisbury from Techonomy: “Social networking for scientists has been tried before, but not until recently have we seen investors placing big bets in this area. Earlier this year, the academic networking site Mendeley was acquired by scientific publisher Elsevier for somewhere in the ballpark of $70 million. And today brings a new data point: Berlin-based ResearchGate, a site designed to facilitate collaborations and data sharing among scientists around the world, has raised $35 million in a series C round from investors including Bill Gates…. While social networking has upended how business happens in other industries, the centuries-old traditions of the scientific field have largely blocked this kind of change. Sure, scientists sign up for Facebook and LinkedIn like anybody else. But use a social networking tool to facilitate research, find partners, and share data that hasn’t yet been published? That’s been a tough sell in the hyper-competitive, highly specialized scientific community…. But ResearchGate’s Madisch believes he is making inroads—and that his latest round of funding, along with its big-name investors, is proof of that. The site boasts 2.8 million users, and features a number of tools and capabilities designed to lure scientists. Madisch knows that scientists are unlikely to share data that could be included in a valuable peer-reviewed publication, so instead he encourages users to share data from failed experiments that will never be submitted for publication anyway.
There’s a lot less possessiveness around that data, and Madisch contends that failures are just as important as successes in helping people understand what works under certain circumstances. Another widget calculates a scientist’s reputation score based on interactions within ResearchGate; this number offers an alternative way to look at any scientist’s impact within the field beyond the current gold standard, which simply associates a person’s value with the reputation of the journals he or she gets published in.
What’s most important to Madisch, though, is the site’s ability to connect scientists around the world and allow better research to happen faster. He cites the example of a young child who died from an unknown cause in Nigeria; a local doctor sent samples to a scientist in Italy he found on ResearchGate, and together they identified a new pathogen responsible for the child’s death. Further analysis was conducted by a Dutch researcher, also found through the networking site.
For Madisch, this anecdote embodies the ResearchGate mentality: The more you collaborate, the more successful you’ll be.”

It’s Time to Rewrite the Internet to Give Us Better Privacy, and Security


Larry Lessig in The Daily Beast: “Almost 15 years ago, as I was just finishing a book about the relationship between the Net (we called it “cyberspace” then) and civil liberties, a few ideas seemed so obvious as to be banal: First, life would move to the Net. Second, the Net would change as it did so. Gone would be simple privacy, the relatively anonymous default infrastructure for unmonitored communication; in its place would be a perpetually monitored, perfectly traceable system supporting both commerce and the government. That, at least, was the future that then seemed most likely, as business raced to make commerce possible and government scrambled to protect us (or our kids) from pornographers, and then pirates, and now terrorists.

But another future was also possible, and this was my third, and only important point: Recognizing these obvious trends, we just might get smart about how code (my shorthand for the technology of the Internet) regulates us, and just possibly might begin thinking smartly about how we could embed in that code the protections that the Constitution guarantees us. Because—and here was the punchline, the single slogan that all 724 people who read that book remember—code is law. And if code is law, then we need to be as smart about how code regulates us as we are about how the law does so….
But what astonishes me is that today, more than a decade into the 21st century, the world has remained mostly oblivious to these obvious points about the relationship between law and code….
The fact is that there is technology that could be deployed that would give many the confidence that none of us now have. “Trust us” does not compute. But trust and verify, with high-quality encryption, could. And there are companies, such as Palantir, developing technologies that could give us, and more importantly, reviewing courts, a very high level of confidence that data collected or surveilled was not collected or used in an improper way. Think of it as a massive audit log, recording how and who used what data for what purpose. We could code the Net in a string of obvious ways to give us even better privacy, while also enabling better security.
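The “massive audit log” Lessig describes could, in minimal form, be an append-only record whose entries are hash-chained, so that any after-the-fact edit is detectable on review. The sketch below is a hypothetical illustration of that idea, not a description of Palantir’s or anyone else’s actual system:

```python
import hashlib
import json

class AuditLog:
    """Append-only log of data accesses; each entry's hash chains to the
    previous entry, so tampering with any record breaks verification."""

    def __init__(self):
        self.entries = []

    def record(self, who, data_id, purpose):
        # Link each entry to the hash of the one before it.
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"who": who, "data": data_id, "purpose": purpose, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        # Recompute every hash; any edited entry breaks the chain.
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("who", "data", "purpose", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

A reviewing court, in this picture, would run the equivalent of `verify()` before trusting the record of who used what data for what purpose.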

The Use of Data Visualization in Government


Report by Genie Stowers for The IBM Center for The Business of Government: “The purpose of this report is to help public sector managers understand one of the more important areas of data analysis today—data visualization. Data visualizations are more sophisticated, fuller graphic designs than the traditional spreadsheet charts, usually with more than two variables and, typically, incorporating interactive features. Data are here to stay, growing exponentially, and data analysis is taking off, pushed forward as a result of the convergence of:
• New technologies
• Open data and big data movements
• The drive to more effectively engage citizens
• The creation and distribution of more and more data…
This report contains numerous examples of visualizations that include geographical and health data, or population and time data, or financial data represented in both absolute and relative terms—and each communicates more than simply the data that underpin it. In addition to these many examples of visualizations, the report discusses the history of this technique, and describes tools that can be used to create visualizations from many different kinds of data sets. Government managers can use these tools—including Many Eyes, Tableau, and HighCharts—to create their own visualizations from their agency’s data.
The report presents case studies on how visualization techniques are now being used by two local governments, one state government, and three federal government agencies. Each case study discusses the audience for visualization. Understanding audience is important, as government organizations provide useful visualizations to different audiences, including the media, political oversight organizations, constituents, and internal program teams. To assist in effectively communicating to these audiences, the report details attributes of meaningful visualizations: relevance, meaning, beauty, ease of use, legibility, truthfulness, accuracy, and consistency among them.”
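The report’s working definition, a graphic encoding more than two variables at once, can be illustrated with a short matplotlib script (matplotlib is my choice for the sketch; the report’s own examples use tools such as Many Eyes, Tableau, and HighCharts, and the data below is invented for illustration). Three variables are encoded simultaneously: x position, y position, and color:

```python
import matplotlib
matplotlib.use("Agg")  # render to a file, no display needed
import matplotlib.pyplot as plt

# Hypothetical agency data: caseload, resolution time, and a performance score.
caseload = [120, 340, 210, 400, 95]
days_to_resolve = [14, 32, 21, 45, 9]
performance_score = [0.9, 0.4, 0.7, 0.2, 0.95]

fig, ax = plt.subplots()
points = ax.scatter(caseload, days_to_resolve,
                    c=performance_score, cmap="viridis")
ax.set_xlabel("Monthly caseload")
ax.set_ylabel("Days to resolve")
fig.colorbar(points, label="Performance score")  # the third variable
fig.savefig("caseload_visualization.png")
```

Interactive features of the kind the report describes (tooltips, filtering) would come from web-oriented libraries rather than a static image, but the multi-variable encoding is the same idea.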

Big Data Is Not Our Master. Humans create technology. Humans can control it.


Chris Hughes in New Republic: “We’ve known for a long time that big companies can stalk our every digital move and customize our every Web interaction. Our movements are tracked by credit cards, Gmail, and tollbooths, and we haven’t seemed to care all that much.
That is, until this week’s news of government eavesdropping, with the help of these very same big companies—Verizon, Facebook, and Google, among others. For the first time, America is waking up to the realities of what all this information—known in the business as “big data”—enables governments and corporations to do….
We are suddenly wondering, Can the rise of enormous data systems that enable this surveillance be stopped or controlled? Is it possible to turn back the clock?
Technologists see the rise of big data as the inevitable march of history, impossible to prevent or alter. Viktor Mayer-Schönberger and Kenneth Cukier’s recent book Big Data is emblematic of this argument: They say that we must cope with the consequences of these changes, but they never really consider the role we play in creating and supporting these technologies themselves….
But these well-meaning technological advocates have forgotten that as a society, we determine our own future and set our own standards, norms, and policy. Talking about technological advancements as if they are pre-ordained science erases the role of human autonomy and decision-making in inventing our own future. Big data is not a Leviathan that must be coped with, but a technological trend that we have made possible and support through social and political policy.”

A Citizen’s Guide to Open Government, E-Government, and Government 2.0


Inside the MPA@UNC Blog: “Engaged citizens want clear, credible information from the government about how it’s carrying on its business. They don’t want to thumb through thousands of files or wait month after month or go through the rigors of filing claims through FOIA (Freedom of Information Act). They want government information, services, and communication to be forthcoming and swift. The Open Government, Government 2.0, and E-Governance movements fill the need of connecting citizens with the government and each other to foster a more open, collaborative, and efficient public sector through the use of new technology and public data.
Open Government is defined by the OECD (Organisation for Economic Cooperation and Development) as “the transparency of government actions, the accessibility of government services and information, and the responsiveness of government to new ideas, demands and needs.”
E-Government is defined by the World Bank as “the use by government agencies of information technologies that have the ability to transform relations with citizens, businesses, and other arms of government. These technologies can serve a variety of different ends: better delivery of government services to citizens, improved interactions with business and industry, citizen empowerment through access to information, or more efficient government management. The resulting benefits can be less corruption, increased transparency, greater convenience, revenue growth, and/or cost reductions.”
Government 2.0 is defined by Gartner Research as “the use of Web 2.0 technologies, both internally and externally, to increase collaboration and transparency and potentially transform the way government agencies relate to citizens and operate.”
Open Government and E-Government paved the way for Government 2.0, a collaborative approach whose mission is to improve government transparency and efficiency. How? Gov 2.0 has been called the next generation of government because it not only utilizes new technologies such as social media, cloud computing, and other apps, but also serves as a means to increase citizen participation….
We have compiled a list of organizations, blogs, guides, and tools to help citizens and public service leaders better understand the Open Government, E-Government, and Government 2.0 movement….”

The "audience" as participative, idea generating, decision making citizens: will they transform government?


Paper by Teresa Harrison in the latest issue of the journal “Participations“: “From a purely technical perspective, no one, including Tim Berners-Lee, has ever been able to pinpoint exactly what makes Web 2.0 unique. What may be most accurate to say is that the enormous popularity of social networking and other social media technologies hinges on a radical reconceptualisation of the audience, now routinely incorporated into ICT applications. Once treated as passive consumers of content created by others, designers of these applications now appreciate and exploit the fact that new media users (formerly known as the “audience”) actively create content online to serve their own goals, frequently as they interact with others. Users of Web 2.0 applications display and tinker with their identities, express themselves on all kinds of topics, invent new products and ideas, and, as Don Tapscott, Tim O’Reilly and legions of other business gurus hasten to remind us, are willing, so far at least, to lend their problem solving, creative efforts and intellectual products to businesses seeking to innovate, or just looking for free marketing. Hoping to harness this largely uncompensated labour, organisations of all types (both commercial and non-profit) have been quick to find ways to attract these “producers” to their projects.
Governments have not been the swiftest in this regard; however, they may present the most ambitious and optimistic agenda for involving internet users. Nations around the world now hope to use new media to engage their citizens in some variation of participatory governance. Where once the prospects for “town hall style democracy” were doomed by the limitations and inefficiencies of one-way media transactions, the networked interactivity of social media now makes it technically feasible to invite citizen participation on a routine basis. What is not yet clear is how citizens will react over the long term to these invitations and what kinds of social issues and software applications will best attract and immerse them into new citizenship practices.”