Jeremy Rozansky, assistant editor of National Affairs, in The New Atlantis: “In his debut book Uncontrolled, entrepreneur and policy analyst Jim Manzi argues that social scientists and policymakers should instead adopt the “experimental method.” The essential tool of this method is the randomized field trial (RFT), a technique that already informs many of our successful private enterprises. Perhaps the best known example of RFTs — one that Manzi uses to illustrate the concept — is the kind of clinical trial performed to test new medicines, wherein researchers “undertake a painstaking series of replicated controlled experiments to measure the effects of various interventions under various conditions,” as he puts it.
The central argument of Uncontrolled is that RFTs should be adopted more widely by businesses as well as government. The book is helpful and holds much wisdom — although the approach he recommends is ultimately just another streetlamp in the night, casting a pale light that tapers off after a few yards. Much still lies beyond its glow….
The econometric method now dominates the social sciences because it helps to cope with the problem of high causal density. It begins with a large data set: economic records, election results, surveys, and other similar big pools of data. Then the social scientist uses statistical techniques to model the interactions of sundry independent variables (causes) and a dependent variable (the effect). But for this method to work properly, social scientists must know all the causally important variables beforehand, because a hidden conditional could easily yield a false positive.
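The false-positive risk from a hidden variable can be made concrete with a small simulation (a hypothetical illustration, not an example from the book): here the observed "cause" has no real effect on the outcome, yet a naive one-variable regression attributes a large effect to it, because both are driven by an omitted confounder.

```python
import random

random.seed(42)
n = 10_000

# Hidden confounder (say, household income) that the modeler omits.
z = [random.gauss(0, 1) for _ in range(n)]
# Observed "cause" x is driven by z but has NO real effect on the outcome.
x = [zi + random.gauss(0, 1) for zi in z]
# Outcome y depends only on the hidden confounder z.
y = [2 * zi + random.gauss(0, 1) for zi in z]

def ols_slope(xs, ys):
    """Slope of a simple one-variable regression of ys on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    return cov / var

# The naive model credits x with a substantial effect (slope near 1.0):
# a false positive produced entirely by the omitted variable z.
print(round(ols_slope(x, y), 2))
```

Adding z to the model would drive the estimated effect of x to zero, but that requires already knowing that z matters, which is exactly the problem with high causal density.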
The experimental method, which Manzi prefers, offers a different way of coping with high causal density: sidestepping the problem of isolating exact causes. To sort out whether a given treatment or policy works, a scientist or social scientist can try it out on a random section of a population, and compare the results to a different section of the population where the treatment or policy was not implemented. So while econometric models aim to identify which particular variables are responsible for different results, RFTs have more modest aims, as they do not seek to identify every hidden conditional. By using the RFT approach, we may not know precisely why we achieved a desired effect, since we do not model all possible variables. But we can gain some ability to know that we will achieve a desired effect, at least under certain conditions.
Strictly speaking, even a randomized field trial only tells us with certainty that some exact technique worked with some specific population on some specific date in the past when conducted by some specific experimenters. We cannot know whether a given treatment or policy will work again under the same conditions at a later date, much less on a different population, much less still on the population as a whole. But scientists must always be cautious about moving from particular results to general conclusions; this is why experiments need to be replicated. And the more we do replicate them, the more information we can gain from those particular results, and the more reliably they can build toward teaching us which treatments or policies might work or (more often) which probably won’t. The result is that the RFT approach is very well suited to the business of government, since policymakers usually only need to know whether a given policy will work — whether it will produce a desired outcome.”
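The logic of the RFT approach can be sketched in a few lines (again a hypothetical illustration, not from the book): because treatment is assigned by coin flip, it is independent of every hidden conditional, so a simple difference in means between the treated and untreated groups recovers the true effect without modeling the confounder at all.

```python
import random

random.seed(0)
n = 10_000

# The same kind of hidden conditional as before, never modeled.
z = [random.gauss(0, 1) for _ in range(n)]
# Treatment is assigned at random, independently of z.
treated = [random.random() < 0.5 for _ in range(n)]
# Outcome depends heavily on z, plus a true treatment effect of 0.5.
y = [2 * zi + (0.5 if t else 0.0) + random.gauss(0, 1)
     for zi, t in zip(z, treated)]

n_treat = sum(treated)
treat_mean = sum(yi for yi, t in zip(y, treated) if t) / n_treat
control_mean = sum(yi for yi, t in zip(y, treated) if not t) / (n - n_treat)

# The difference in means lands close to the true effect of 0.5,
# even though the hidden conditional z was never identified.
print(round(treat_mean - control_mean, 2))
```

This is the modesty Manzi describes: the trial says *that* the treatment worked for this population, not *why*, and only replication builds confidence that the result generalizes.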
New certificates launched to help everyone discover, understand, and use open data
The certificate is made up of two components:
1) a visual mark that shows the quality level of the data
2) a human- and machine-readable description of the data being released
There are four levels of certificates:
Raw: A great start at the basics of publishing open data.
Pilot: Data users receive extra support from, and provide feedback to, the publisher.
Standard: Regularly published open data with robust support that people can rely on.
Expert: An exceptional example of information infrastructure.
Benefits of the certificates include helping:
- publishers of data understand how they can better connect with their users;
- users of data understand its quality, licensing, structure, and usability;
- businesses, entrepreneurs and innovators have confidence that the data has value to them;
- policy-makers benchmark and compare the progress and quality of the data released.
Commercial and public sector organisations have already committed to the certificates, including:
– Open Corporates: corporate information for over 50 million companies worldwide
– OpenStreetMap: the free wiki world map offering worldwide open geodata
– legislation.gov.uk: 500 years of UK legislation information
– amee: an environmental score for each of the 2.7 million companies in Britain
– MastodonC: energy monitoring data analysis from Retrofit for the Future projects
– Placr: transport data covering all 360,000 stops and stations nationwide
Certificates are created online, for free, at http://certificates.theodi.org/. The process involves publishers answering a series of questions, each of which affects the certificate generated at the end.
Read Minister for the Cabinet Office Francis Maude’s opening remarks at the conference, with the emphasis firmly on open data and transparency.”
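The machine-readable half of a certificate might look something like the following. This is an illustrative sketch only — a made-up schema, not the ODI's actual certificate format — showing the kind of fields a description of a published dataset could carry.

```python
import json

# Hypothetical certificate description (invented field names, not the
# ODI's real schema) for one of the committed publishers listed above.
certificate = {
    "dataset": "UK legislation",
    "publisher": "legislation.gov.uk",
    "level": "Standard",  # one of: Raw, Pilot, Standard, Expert
    "licence": "Open Government Licence",
    "support": "regularly published, with robust support users can rely on",
}

# Machine-readable on the wire, human-readable when pretty-printed.
print(json.dumps(certificate, indent=2))
```

Because the description is structured data rather than prose, tools can filter or benchmark datasets by level or licence automatically, which is what makes the policymaker and business benefits above practical.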
Mozilla Science Lab
Mark Surman in Mozilla Blog: “We’re excited to announce the launch of the Mozilla Science Lab, a new initiative that will help researchers around the world use the open web to shape science’s future.
Scientists created the web — but the open web still hasn’t transformed scientific practice to the same extent we’ve seen in other areas like media, education and business. For all of the incredible discoveries of the last century, science is still largely rooted in the “analog” age. Credit systems in science are still largely based around “papers,” for example, and as a result researchers are often discouraged from sharing, learning, reusing, and adopting the type of open and collaborative learning that the web makes possible.
The Science Lab will foster dialog between the open web community and researchers to tackle this challenge. Together they’ll share ideas, tools, and best practices for using next-generation web solutions to solve real problems in science, and explore ways to make research more agile and collaborative….
With support from the Alfred P. Sloan Foundation, Mozilla Science Lab will start by convening a broad conversation about open web approaches and skills training, working with existing tool developers and supporting a global community of researchers.
Get involved
Stay tuned for more about how you can join the conversation. In the meantime, you can:
- Learn more about the project at the Mozilla Science Wiki.
- Follow @MozillaScience and @kaythaney on Twitter.
- Follow the project’s progress on Kaitlin’s blog.”
UK launches Information Economy Strategy
Open Data Institute: “The Information Economy Strategy sets out a range of key actions, including:
- Digitally transforming 25 of the top 50 UK public services over the next 300 days, including plans to give businesses a single, online view of their tax records.
- Launching a new programme to help 1.6 million SMEs scale up their business online over the next five years.
- Publishing a data capability strategy in October 2013, developed in partnership with government, industry and academia. The strategy will build on the recommendations in Stephan Shakespeare’s review of Public Sector Information and the Prime Minister’s Council for Science and Technology’s report on algorithms, and will be published alongside the Open Government Partnership National Action Plan.
- Establishing the world’s first facility for testing state of the art 5G mobile technology, working with industry and the University of Surrey.”
ResearchGate Tackles Social Networking for Scientists
What’s most important to ResearchGate founder Ijad Madisch, though, is the site’s ability to connect scientists around the world and allow better research to happen faster. He cites the example of a young child who died from an unknown cause in Nigeria; a local doctor sent samples to a scientist in Italy he found on ResearchGate, and together they identified a new pathogen responsible for the child’s death. Further analysis was conducted by a Dutch researcher, also found through the networking site.
For Madisch, this anecdote embodies the ResearchGate mentality: The more you collaborate, the more successful you’ll be.”
It’s Time to Rewrite the Internet to Give Us Better Privacy, and Security
Larry Lessig in The Daily Beast: “Almost 15 years ago, as I was just finishing a book about the relationship between the Net (we called it “cyberspace” then) and civil liberties, a few ideas seemed so obvious as to be banal: First, life would move to the Net. Second, the Net would change as it did so. Gone would be simple privacy, the relatively anonymous default infrastructure for unmonitored communication; in its place would be a perpetually monitored, perfectly traceable system supporting both commerce and the government. That, at least, was the future that then seemed most likely, as business raced to make commerce possible and government scrambled to protect us (or our kids) from pornographers, and then pirates, and now terrorists.
But what astonishes me is that today, more than a decade into the 21st century, the world has remained mostly oblivious to these obvious points about the relationship between law and code….
The fact is that there is technology that could be deployed that would give many the confidence that none of us now have. “Trust us” does not compute. But trust and verify, with high-quality encryption, could. And there are companies, such as Palantir, developing technologies that could give us, and more importantly, reviewing courts, a very high level of confidence that data collected or surveilled was not collected or used in an improper way. Think of it as a massive audit log, recording who used what data, how, and for what purpose. We could code the Net in a string of obvious ways to give us even better privacy, while also enabling better security.
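The "massive audit log" idea can be sketched concretely. The following is a minimal sketch assuming a hash-chained design — an assumption for illustration, not any real product's implementation: each entry records who used what data and for what purpose, and chains to the previous entry's hash so that later tampering is detectable by a reviewing court.

```python
import hashlib
import json

class AuditLog:
    """Append-only, tamper-evident log of data-access events (sketch)."""
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []
        self.head = self.GENESIS

    @staticmethod
    def _digest(entry):
        # Canonical JSON so the same entry always hashes identically.
        return hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()

    def record(self, who, what, purpose):
        entry = {"who": who, "what": what,
                 "purpose": purpose, "prev": self.head}
        self.head = self._digest(entry)
        self.entries.append(entry)

    def verify(self):
        # Re-walk the chain; any altered or removed entry breaks a link.
        prev = self.GENESIS
        for entry in self.entries:
            if entry["prev"] != prev:
                return False
            prev = self._digest(entry)
        return prev == self.head

log = AuditLog()
log.record("analyst-7", "call-records/2013-06", "court-authorized query")
log.record("analyst-7", "email-metadata/2013-06", "follow-up to same order")
print(log.verify())  # True: the log is internally consistent
```

If anyone edits an earlier entry after the fact, every later hash in the chain stops matching and `verify()` returns False — the "trust and verify" property Lessig describes, achieved with code rather than promises.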
The Use of Data Visualization in Government
“The report presents case studies on how visualization techniques are now being used by two local governments, one state government, and three federal government agencies. Each case study discusses the audience for visualization. Understanding audience is important, as government organizations provide useful visualizations to different audiences, including the media, political oversight organizations, constituents, and internal program teams. To assist in effectively communicating to these audiences, the report details attributes of meaningful visualizations: relevance, meaning, beauty, ease of use, legibility, truthfulness, accuracy, and consistency among them.”
Big Data Is Not Our Master. Humans create technology. Humans can control it.
Chris Hughes in New Republic: “We’ve known for a long time that big companies can stalk our every digital move and customize our every Web interaction. Our movements are tracked by credit cards, Gmail, and tollbooths, and we haven’t seemed to care all that much.
That is, until this week’s news of government eavesdropping, with the help of these very same big companies—Verizon, Facebook, and Google, among others. For the first time, America is waking up to the realities of what all this information—known in the business as “big data”—enables governments and corporations to do….
We are suddenly wondering, Can the rise of enormous data systems that enable this surveillance be stopped or controlled? Is it possible to turn back the clock?
Technologists see the rise of big data as the inevitable march of history, impossible to prevent or alter. Viktor Mayer-Schönberger and Kenneth Cukier’s recent book Big Data is emblematic of this argument: They say that we must cope with the consequences of these changes, but they never really consider the role we play in creating and supporting these technologies themselves….
But these well-meaning technological advocates have forgotten that as a society, we determine our own future and set our own standards, norms, and policy. Talking about technological advancements as if they are pre-ordained science erases the role of human autonomy and decision-making in inventing our own future. Big data is not a Leviathan that must be coped with, but a technological trend that we have made possible and support through social and political policy.”
A Citizen’s Guide to Open Government, E-Government, and Government 2.0
Inside the MPA@UNC Blog: “Engaged citizens want clear, credible information from the government about how it’s carrying on its business. They don’t want to thumb through thousands of files or wait month after month or go through the rigors of filing claims through FOIA (Freedom of Information Act). They want government information, services, and communication to be forthcoming and swift. The Open Government, Government 2.0, and E-Governance movements fill the need of connecting citizens with the government and each other to foster a more open, collaborative, and efficient public sector through the use of new technology and public data.
Open Government is defined by the OECD (Organisation for Economic Cooperation and Development) as “the transparency of government actions, the accessibility of government services and information, and the responsiveness of government to new ideas, demands and needs.”
E-Government is defined by the World Bank as “the use by government agencies of information technologies that have the ability to transform relations with citizens, businesses, and other arms of government. These technologies can serve a variety of different ends: better delivery of government services to citizens, improved interactions with business and industry, citizen empowerment through access to information, or more efficient government management. The resulting benefits can be less corruption, increased transparency, greater convenience, revenue growth, and/or cost reductions.”
Government 2.0 is defined by Gartner Research as “the use of Web 2.0 technologies, both internally and externally, to increase collaboration and transparency and potentially transform the way government agencies relate to citizens and operate.”
Open Government and E-Government paved the way for Government 2.0, a collaborative technology whose mission is to improve government transparency and efficiency. How? Gov 2.0 has been called the next generation of government because it not only utilizes new technologies such as social media, cloud computing, and other apps, but also serves as a means to increase citizen participation….
We have compiled a list of organizations, blogs, guides, and tools to help citizens and public service leaders better understand the Open Government, E-Government, and Government 2.0 movement….”