But the vast majority of data collected by governments never sees the light of day. It sits squirreled away on servers, and is only rarely cross-referenced the way private sector companies routinely cross-reference data to gain insight into what is actually going on across the country, and into emerging problems and opportunities. Yet as governments all around the world have realized, if shared safely, with due precautions to protect individual privacy, all of this data in the hands of citizens could be a national civic monument of tremendous economic and social value.”
Why the world’s governments are interested in creating hubs for open data
Katie Fehrenbacher in Gigaom: “Amid the tech giants and eager startups that have camped out in East London’s trendy Shoreditch neighborhood, the Open Data Institute is the rare nonprofit on the block that talks about feel-good sorts of things like “triple-bottom line” and “social and environmental value.” …Governments everywhere are embracing the idea that open data is the right way to manage services for citizens. The U.K. has been a leader on this — just check out the simplicity of gov.uk — which is one of the reasons why ODI is U.K. born….“Open data” is open access to the data that has exploded on the scene in recent years, some of it generated by our connected, digital lifestyles via the internet, sensors, GPS, and cell phones, to name a few sources. But ODI is particularly interested in working with data sets that can have big global and societal impacts, like health, financial, environmental and government data. For example, in conjunction with startup OpenCorporates, ODI recently helped launch a data visualization about Goldman Sachs’s insanely complex corporate structure.”
The World is a Natural Laboratory, and Social Media is the New Petri Dish
Perspective by Jean-Loup Rault et al in Ethology: “Many high-priority and high-interest species are challenging to study due to the difficulty in accessing animals and/or obtaining sufficient sample sizes. The recent explosion in technology, particularly social media and live webcams available on the Internet, provides new opportunities for behavioral scientists to collect data not just on our own species but on many others, as well as new resources for teaching and outreach. We discuss here the possibility of exploiting online media as a new source of behavioral data, which we term ‘video mining’. This article proposes epidemiological and ethological field techniques to gather and screen online media as a data source on diverse taxa. This novel method provides access to a rich source of untapped knowledge, particularly for studying the behavior of understudied species or sporadic behaviors, but also for teaching or monitoring animals in challenging settings.”
A Videogame That Recruits Players to Map the Brain
Wired: “I’m no neuroscientist, and yet, here I am at my computer attempting to reconstruct a neural circuit of a mouse’s retina. It’s not quite as difficult and definitely not as boring as it sounds. In fact, it’s actually pretty fun, which is a good thing considering I’m playing a videogame.
Called EyeWire, the browser-based game asks players to map the connections between retinal neurons by coloring in 3-D slices of the brain. Much like any other game out there, being good at EyeWire earns you points, but the difference is that the data you produce during gameplay doesn’t just get you on a leader board—it’s actually used by scientists to build a better picture of the human brain.
Created by neuroscientist Sebastian Seung’s lab at MIT, EyeWire basically gamifies the professional research Seung and his collaborators do on a daily basis. Seung is studying the connectome, the hyper-complex tangle of connections among neurons in the brain.”
Big data, crowdsourcing and machine learning tackle Parkinson’s
Successful Workingplace: “Parkinson’s is a very tough disease to fight. People suffering from the disease often have significant tremors that keep them from being able to create accurate records of their daily challenges. Without this information, doctors are unable to fine tune drug dosages and other treatment regimens that can significantly improve the lives of sufferers.
It was a perfect catch-22 situation until recently, when the Michael J. Fox Foundation announced that LIONsolver, a company specializing in machine learning software, was able to differentiate Parkinson’s patients from healthy individuals and to also show the trend in symptoms of the disease over time.…
To set up the competition, the Foundation worked with Kaggle, an organization that specializes in crowdsourced big data analysis competitions. The use of crowdsourcing as a way to get to the heart of very difficult Big Data problems works by allowing people the world over, from a myriad of backgrounds and with diverse experiences, to devote time to personally chosen challenges where they can bring the most value. It’s a genius idea for bringing some of the scarcest resources together with the most intractable problems.”
International Principles on the Application of Human Rights to Communications Surveillance
Final version, 10 July 2013: “As technologies that facilitate State surveillance of communications advance, States are failing to ensure that laws and regulations related to communications surveillance adhere to international human rights and adequately protect the rights to privacy and freedom of expression. This document attempts to explain how international human rights law applies in the current digital environment, particularly in light of the increase in and changes to communications surveillance technologies and techniques. These principles can provide civil society groups, industry, States and others with a framework to evaluate whether current or proposed surveillance laws and practices are consistent with human rights.
These principles are the outcome of a global consultation with civil society groups, industry and international experts in communications surveillance law, policy and technology.”
New Report Finds Cost-Benefit Analyses Improve Budget Choices & Taxpayer Results
Press Release: “A new report shows cost-benefit analyses have helped states make better investments of public dollars by identifying programs and policies that deliver high returns. However, the majority of states are not yet consistently using this approach when making critical decisions. This 50-state look at cost-benefit analysis, a method that compares the expense of public programs to the returns they deliver, was released today by the Pew-MacArthur Results First Initiative, a project of The Pew Charitable Trusts and the John D. and Catherine T. MacArthur Foundation.
The study, “States’ Use of Cost-Benefit Analysis: Improving Results for Taxpayers”, comes at a time when states are under continuing pressure to direct limited dollars toward the most cost-effective programs and policies while curbing spending on those that do not deliver. The report is the first comprehensive study of how all 50 states and the District of Columbia analyze the costs and benefits of programs and policies, report findings, and incorporate the assessments into decision-making. It identifies key challenges states face in conducting and using the analyses and offers strategies to overcome those obstacles. The study includes a review of state statutes, a search for cost-benefit analyses released between 2008 and 2011, and interviews with legislators, legislative and program evaluation staff, executive officials, report authors, and agency officials.”
New Book: Untangling the Web
By Aleks Krotoski: “The World Wide Web is the most revolutionary innovation of our time. In the last decade, it has utterly transformed our lives. But what real effects is it having on our social world? What does it mean to be a modern family when dinner table conversations take place over smartphones? What happens to privacy when we readily share our personal lives with friends and corporations? Are our Facebook updates and Twitterings inspiring revolution or are they just a symptom of our global narcissism? What counts as celebrity, when everyone can have a following or be a paparazzo? And what happens to relationships when love, sex and hate can be mediated by a computer? Social psychologist Aleks Krotoski has spent a decade untangling the effects of the Web on how we work, live and play. In this groundbreaking book, she uncovers how much humanity has – and hasn’t – changed because of our increasingly co-dependent relationship with the computer. In Untangling the Web, she tells the story of how the network became woven in our lives, and what it means to be alive in the age of the Internet.” Blog: http://untanglingtheweb.tumblr.com/
The Recent Rise of Government Open Data APIs
Janet Wagner in ProgrammableWeb: “In recent months, the number of government open data APIs has been increasing rapidly due to a variety of factors including the development of open data technology platforms, the launch of Project Open Data and a recent White House executive order regarding government data.
ProgrammableWeb writer Mark Boyd has recently written three articles related to open data APIs: one on the latest release of the CKAN API, one on the UK Open Data Institute, and one on the CivOmega Open Data Search Engine. This post is a brief overview of several recent factors that have led to the rise of government open data APIs.”
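To make “government open data API” concrete, here is a minimal sketch of what talking to a CKAN-backed portal looks like. The `/api/3/action/...` path and the `{"success": ..., "result": ...}` response envelope are standard CKAN Action API conventions; the base URL is just an example (data.gov.uk runs CKAN), and the dataset names in the sample response are made up for illustration.

```python
import json

# CKAN portals all expose the same "Action API" under /api/3/action/.
# The base URL below is only an example; any CKAN-backed government
# portal works the same way.
BASE = "https://data.gov.uk"

def action_url(base: str, action: str) -> str:
    """Build a CKAN Action API URL, e.g. <base>/api/3/action/package_list."""
    return f"{base.rstrip('/')}/api/3/action/{action}"

def parse_action_response(raw: str):
    """CKAN wraps every response as {"success": bool, "result": ...}."""
    body = json.loads(raw)
    if not body.get("success"):
        raise RuntimeError(f"CKAN call failed: {body.get('error')}")
    return body["result"]

# An illustrative (made-up) response in CKAN's documented envelope format:
sample = '{"success": true, "result": ["air-quality", "spend-over-25k"]}'
print(action_url(BASE, "package_list"))   # the URL a client would fetch
print(parse_action_response(sample))      # the list of dataset names
```

To query a live portal, pass the URL from `action_url` to an HTTP client (for example `urllib.request.urlopen`) and feed the response body to `parse_action_response`.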
Orwell is drowning in data: the volume problem
Dom Shaw in OpenDemocracy: “During World War II, whilst Bletchley Park laboured in the front line of code breaking, the British Government was employing vast numbers of female operatives to monitor and report on telephone, mail and telegraph communications in and out of the country.
The biggest problem, of course, was volume. Lacking even the most primitive algorithm to detect key phrases — the kind that later caused such paranoia among the sixties and seventies counterculture that a whole generation of drug users adopted a wholly unnecessary set of telephone synonyms for their desired substances — the army of women stationed in exchanges around the country was driven to report everything and then pass it on up to those whose job it was to analyse such content for significance.
Orwell’s vision of Big Brother’s omniscience was based upon the same model – vast armies of Winston Smiths monitoring data to ensure discipline and control. He saw a culture of betrayal where every citizen was held accountable for their fellow citizens’ political and moral conformity.
Up until the US Government’s Big Data Research and Development Initiative [12] and the NSA development of the Prism programme [13], the fault lines always lay in the technology used to collate or collect and the inefficiency or competing interests of the corporate systems and processes that interpreted the information. Not for the first time, the bureaucracy was the citizen’s best bulwark against intrusion.
Now that the algorithms have become more complex and the technology tilted towards passive surveillance through automation, the volume problem becomes less of an obstacle….
The technology for obtaining this information, and indeed the administration of it, is handled by corporations. The Government, driven by the creed that suggests private companies are better administrators than civil servants, has auctioned off the job to a dozen or more favoured corporate giants who are, as always, beholden not only to their shareholders, but to their patrons within the government itself….
The only problem the state had was managing the scale of the information gleaned from so many people in so many forms. Not any more. The volume problem has been overcome.”