Blog by Steven Hodas at The Center on Reinventing Public Education (CRPE): “Michael Horn’s recent piece on the failure of inBloom captures why it was the very opposite of a disruptive innovation from a markets perspective, as well as the fatal blind spots and judgment errors present from its inception.
inBloom was a textbook example of what I call “Innovation 1.0”, which thinks of innovation as a noun, a thing with transformative transitive properties that magically make its recipient “innovative.” It’s the cargo-cult theory of innovation: I give you this innovative thing (a tablet, a data warehouse, an LMS) and you thereby become innovative yourself. This Innovation 1.0 approach to both product and policy has characterized a great deal of foundation and Federal efforts over the past ten years.
But as Michael points out (and as real innovators and entrepreneurs understand viscerally), “innovation” is not a noun but a verb. It is not a thing but a process, a frame of mind, a set of reflexes. He correctly notes the essential iterative approach that characterizes innovation-as-a-verb, its make-something-big-by-making-something-small theory of action (this is fundamentally different from piloting or focus-grouping, but that’s another topic).
But it’s important to go deeper and understand why iteration is important. Simply, it is a means to bake into the process, the product, or the policy a respect for users’ subjectivity and autonomy. In short, functional iteration requires that you listen.
True, durable innovation, “Innovation 2.0” is not some thing I can give to you, do to you, or even do for you: it must be a process I do with you. Lean Startup theory—with its emphasis on iteration, an assumption of the innovator’s fallibility and limited perspective, and the importance of low-cost, low-stakes discovery of product-market fit that Michael describes—is essentially a cookbook for baking empathy into the development of products, services, or policies…
That doesn’t mean inBloom was a bad idea. But the failure to anticipate its vehement visceral rejection—however misinformed and however cynically exploited by those with larger agendas—was a profound failure of imagination, of empathy, of the respect for user subjectivity that characterizes Innovation 2.0…(More).”
New Implementation Guide for Local Government Innovation
Living Cities Blog and Press Release: “Living Cities, with support from the Citi Foundation, today released a toolkit to help local governments adopt cutting-edge approaches to innovation as part of the City Accelerator program. The implementation guide offers practical guidance to local government officials on how to build a durable culture and practice of innovation that draws from leading practices with promising results from cities around the United States, as well as from the private sector. The guide was developed as part of the City Accelerator, a $3 million program of Living Cities with the Citi Foundation to speed the spread of innovation with the potential to benefit low-income people in local governments. The implementation guide – authored by Nigel Jacob, co-founder of the Mayor’s Office of New Urban Mechanics in Boston and Urban Technologist-in-Residence at Living Cities – addresses some of the key barriers that local governments face when looking to incorporate innovation in their cities, and introduces fresh ideas as well…(More)”
‘Frontier methods’ offer a powerful but accessible approach for measuring the efficiency of public sector organisations
EUROPP Blog of the LSE: “How can the efficiency of public sector organisations best be measured? Jesse Stroobants and Geert Bouckaert write that while the efficiency of an organisation is typically measured using performance indicators, there are some notable problems with this approach, such as the tendency for different indicators to produce conflicting conclusions on organisational performance. As an alternative, they outline so-called ‘frontier methods’, which use direct comparisons between different organisations to create a benchmark or standard for performance. They argue that the frontier approach not only alleviates some of the problems associated with performance indicators, but is also broadly accessible for those employed in public administration….
However, despite their merits, there are some drawbacks to using performance indicators. First, they provide only an indirect or partial indication of performance. For instance, with respect to efficiency, indicators will be single-input/single-output measures. Second, they may provide conflicting results: an organisation that appears to do well on one indicator may perform less successfully when considered using another.
In this context, ‘frontier methods’ offer alternative techniques for measuring and evaluating the performance of a group of comparable entities. Unlike single factor measures that reflect only partial aspects of performance, frontier techniques can be applied to assess overall performance by handling multiple inputs and outputs at the same time. Specifically, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH) have proven to be useful tools for assessing the relative efficiency of entities….
At this point you may be thinking that the term ‘frontier methods’ sounds overly complex or that these techniques are only likely to be of any use to academic specialists. Yet there are a number of reasons why this interpretation would be incorrect. It is indeed true that DEA and FDH have been used predominantly by economists and econometricians, and only rarely by those employed in public administration. We should re-establish this bridge. Therefore, in a recent article, we have provided a step-by-step application of DEA/FDH to benchmark the efficiency of comparable public sector organisations (in the article’s case: public libraries in Flanders). With this gradual approach, we want to offer both academics and practitioners a basic grounding in more advanced efficiency measurement techniques….(More)”.
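The FDH technique mentioned in the excerpt rests on a simple dominance idea: an organisation is inefficient if some peer produces at least as much of every output using proportionally fewer inputs. As a rough illustration only (the entity names and figures below are hypothetical, not from the article), an input-oriented FDH score can be computed like this:

```python
# Minimal sketch of Free Disposal Hull (FDH) input-efficiency scoring.
# All data below are made-up illustrative values, not from the cited study.
def fdh_input_efficiency(units):
    """units: dict mapping name -> (inputs tuple, outputs tuple).

    A unit's FDH input-efficiency score is the smallest proportional
    input level at which some peer producing at least as much of every
    output could still be matched. A score of 1.0 means no peer dominates.
    """
    scores = {}
    for name, (x, y) in units.items():
        best = 1.0  # comparing a unit with itself always yields 1.0
        for peer, (px, py) in units.items():
            # Peer is comparable only if it produces >= every output.
            if all(po >= yo for po, yo in zip(py, y)):
                # Proportional input level needed to match this peer.
                ratio = max(pi / xi for pi, xi in zip(px, x))
                best = min(best, ratio)
        scores[name] = best
    return scores

# Hypothetical libraries: inputs = (staff, budget), outputs = (loans,).
libraries = {
    "A": ((10, 5), (100,)),
    "B": ((8, 4), (100,)),   # same output as A with 20% fewer inputs
    "C": ((12, 6), (150,)),
}
print(fdh_input_efficiency(libraries))  # A scores 0.8; B and C are efficient
```

Full DEA additionally allows convex combinations of peers (solved via linear programming), which generally yields scores no higher than FDH; the dominance comparison above is the accessible core of both.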
The 18F Hub
18F Blog: “Clear, organized, and easy access to information is critical to supporting team growth and promoting the kind of culture change that 18F, the U.S. Digital Service, and fellow Innovators and Early Adopters aim to produce throughout federal IT development. As a small step towards that goal, we’d like to announce the 18F Hub, a Jekyll-based documentation platform that aims to help development teams organize and easily share their information, and to enable easy exploration of the connections between team members, projects, and skill sets. It’s still very much alpha-stage software, but over time, we’ll incrementally improve it until it serves as the go-to place for all our team’s working information, whether that information is integrated into the Hub directly or provided as links to other sources. It also serves as a lightweight tool that other teams can experiment with and deploy with a minimum of setup.
While we at 18F strongly believe in the value of transparency and collaboration across government, the details of our team, our projects, and our activities aren’t the real reason we’re launching the Hub: the exposure of our domain knowledge, working models and processes through tangible artifacts that people can adapt to their own environments is… (More).
Open policy making in action: Empowering divorcing couples and separating families to create sustainable solutions
Dr Lucy Kimbell at Open Policy Making Blog (UK Cabinet Office): “Set up in April 2014, Policy Lab brings new tools and techniques, new insights and practical experimentation to policy-making. This second demonstrator project has over the past two months resulted in learning about how policy professionals can work in a more open, user-centred way to engage with others and generate novel solutions to policy issues.
The project, with the Ministry of Justice (MoJ), is concerned with family mediation during divorce and separation….
The main findings from the Lab’s perspective are in three areas.
Clarifying what user perspectives bring to policy-making.
The project gave us some insights into the potential value of ethnography in policy-making. It was centred around people’s whole experience of divorce or separation, not just their interactions with mediators or lawyers. The research explored what it was like for people now, and the creative activities in the workshop proposed what it could be like for people in the future. Unexpected insights included that some people going through separation and divorce lacked confidence in their ability to make decisions about their futures.
Using person-centred techniques in the workshop made participants accountable to the users. Their stories were read, interpreted and discussed at the start. Throughout the workshop, participants repeatedly raised questions about what a proposed new solution might be like for these personas. It was as if these participants were now accountable to these individuals.
Reconstituting the issue of family mediation.
Another result of this project was to shift from seeing policy-making primarily as the province of the MoJ towards a collective activity in which many actors and different kinds of expertise needed to be involved. The project constituted policy-making as a complex configuration of socio-cultural, organizational and technological actors, processes, data and resources – more of a living system than a mechanical object with inputs, outputs and policy “levers”.
Starting and ending with people’s lives, not government-funded or delivered services, as the driver to innovate.
Finally, this Lab project looked broadly at people’s lives, not just as users of mediation or court services…. (More)”
Launching Disasters.Data.Gov
OSTP Blog: “Strengthening our Nation’s resilience to disasters is a shared responsibility, with all community members contributing their unique skills and perspectives. Whether you’re a data steward who can unlock information and foster a culture of open data, an innovator who can help address disaster preparedness challenges, or a volunteer ready to join the “Innovation for Disasters” movement, we are excited for you to visit the new disasters.data.gov site, launching today.
First previewed at the White House Innovation for Disaster Response and Recovery Initiative Demo Day, disasters.data.gov is designed to be a public resource to foster collaboration and the continual improvement of disaster-related open data, free tools, and new ways to empower first responders, survivors, and government officials with the information needed in the wake of a disaster.
A screenshot from the new disasters.data.gov web portal.
Today, the Administration is unveiling the first in a series of Innovator Challenges that highlight pressing needs from the disaster preparedness community. The inaugural Innovator Challenge focuses on a need identified from firsthand experience of local emergency management, responders, survivors, and Federal departments and agencies. The challenge asks innovators across the nation: “How might we leverage real-time sensors, open data, social media, and other tools to help reduce the number of fatalities from flooding?”
In addition to this first Innovator Challenge, here are some highlights from disasters.data.gov:….(More)”
The Free 'Big Data' Sources Everyone Should Know
Bernard Marr at Linkedin Pulse: “…The moves by companies and governments to put large amounts of information into the public domain have made large volumes of data accessible to everyone…here’s my rundown of some of the best free big data sources available today.
Data.gov
The US Government pledged last year to make all government data available freely online. This site is the first stage and acts as a portal to all sorts of amazing information on everything from climate to crime. To check it out, click here.
US Census Bureau
A wealth of information on the lives of US citizens covering population data, geographic data and education. To check it out, click here.
European Union Open Data Portal
As the above, but based on data from European Union institutions. To check it out, click here.
Data.gov.uk
Data from the UK Government, including the British National Bibliography – metadata on all UK books and publications since 1950. To check it out, click here.
The CIA World Factbook
Information on history, population, economy, government, infrastructure and military of 267 countries. To check it out, click here.
Healthdata.gov
125 years of US healthcare data including claim-level Medicare data, epidemiology and population statistics. To check it out, click here.
NHS Health and Social Care Information Centre
Health data sets from the UK National Health Service. To check it out, click here.
Amazon Web Services public datasets
Huge resource of public data, including the 1000 Genomes Project, an attempt to build the most comprehensive database of human genetic information, and NASA’s database of satellite imagery of Earth. To check it out, click here.
Facebook Graph
Although much of the information on users’ Facebook profile is private, a lot isn’t – Facebook provide the Graph API as a way of querying the huge amount of information that its users are happy to share with the world (or can’t hide because they haven’t worked out how the privacy settings work). To check it out, click here.
Gapminder
Compilation of data from sources including the World Health Organization and World Bank covering economic, medical and social statistics from around the world. To check it out, click here.
Google Trends
Statistics on search volume (as a proportion of total search) for any given term, since 2004. To check it out, click here.
Google Finance
40 years’ worth of stock market data, updated in real time. To check it out, click here.
Google Books Ngrams
Search and analyze the full text of any of the millions of books digitised as part of the Google Books project. To check it out, click here.
National Climatic Data Center
Huge collection of environmental, meteorological and climate data sets from the US National Climatic Data Center. The world’s largest archive of weather data. To check it out, click here.
DBPedia
Wikipedia comprises millions of pieces of data, structured and unstructured, on every subject under the sun. DBPedia is an ambitious project to catalogue and create a public, freely distributable database allowing anyone to analyze this data. To check it out, click here.
Topsy
Free, comprehensive social media data is hard to come by – after all, their data is what generates profits for the big players (Facebook, Twitter etc), so they don’t want to give it away. However, Topsy provides a searchable database of public tweets going back to 2006 as well as several tools to analyze the conversations. To check it out, click here.
Likebutton
Mines Facebook’s public data – globally and from your own network – to give an overview of what people “Like” at the moment. To check it out, click here.
New York Times
Searchable, indexed archive of news articles going back to 1851. To check it out, click here.
Freebase
A community-compiled database of structured data about people, places and things, with over 45 million entries. To check it out, click here.
Million Song Data Set
Metadata on over a million songs and pieces of music. Part of Amazon Web Services. To check it out, click here.”
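Several of the sources above, notably the Facebook Graph API described earlier, are queried programmatically rather than browsed. As a hypothetical sketch (the node, fields, API version, and token below are placeholders, not working credentials), a Graph API GET URL is typically composed from a node identifier plus query parameters:

```python
# Sketch of composing a Facebook Graph API query URL; no request is sent.
# "me", the field names, and the token are placeholder values.
from urllib.parse import urlencode

def graph_api_url(node, fields, access_token, version="v2.2"):
    """Build a Graph API GET URL selecting specific fields of a node."""
    params = urlencode({
        "fields": ",".join(fields),       # comma-separated field list
        "access_token": access_token,     # OAuth token issued by Facebook
    })
    return f"https://graph.facebook.com/{version}/{node}?{params}"

url = graph_api_url("me", ["id", "name"], "PLACEHOLDER_TOKEN")
print(url)
```

In practice the URL would be fetched over HTTPS and the response parsed as JSON; which fields are actually returned depends on the user’s privacy settings, as the Facebook Graph entry above notes.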
See also Bernard Marr’s blog at Big Data Guru
Lab Rats
Clare Dwyer Hogg at the Long and Short: “Do you remember how you were feeling between 11 and 18 January, 2012? If you’re a Facebook user, you can scroll back and have a look. Your status updates might show you feeling a little bit down, or cheery. All perfectly natural, maybe. But if you were one of 689,003 unwitting users selected for an experiment to determine whether emotions are contagious, then maybe not. The report on its findings was published in March this year: “Experimental evidence of massive-scale emotional contagion through social networks”. How did Facebook do it? Very subtly, by adjusting the algorithm of selected users’ news feeds. One half had a reduced chance of being exposed to positive updates, the other had a more upbeat newsfeed. Would users be more inclined to feel positive or negative themselves, depending on which group they were in? Yes. The authors of the report found – by extracting the posts of the people they were experimenting on – that, indeed, emotional states can be transferred to others, “leading people to experience the same emotions without their awareness”.
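The feed manipulation described above can be sketched in a few lines. This is an illustrative model only, not Facebook’s actual code: the study’s design, as reported, withheld posts of one emotional valence from a user’s News Feed with some probability, and the function and data below are invented to show that mechanism.

```python
import random

# Illustrative model of the experiment's feed conditioning, not real code:
# posts of the targeted valence are withheld with probability omit_prob.
def condition_feed(posts, omit_valence, omit_prob, seed=0):
    """posts: list of (text, valence) pairs, valence being 'pos' or 'neg'.

    Returns the texts a user in this experimental condition would see.
    A seeded RNG makes the sketch reproducible.
    """
    rng = random.Random(seed)
    shown = []
    for text, valence in posts:
        if valence == omit_valence and rng.random() < omit_prob:
            continue  # withheld from this user's feed in this condition
        shown.append(text)
    return shown

feed = [("great day!", "pos"), ("so tired", "neg"), ("love this", "pos")]
print(condition_feed(feed, "pos", 1.0))  # prints ['so tired']
```

The study then compared the emotional tone of what the two groups themselves posted afterwards, which is how it concluded that the manipulated exposure shifted users’ own expressed emotions.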
It was legal (see Facebook’s Data Use Policy). Ethical? The answer to that lies in the shadows. A one-off? Not likely. When revealed last summer, the Facebook example created headlines around the world – and another story quickly followed. On 28 July, Christian Rudder, a Harvard math graduate and one of the founders of the internet dating site OkCupid, wrote a blog post titled “We Experiment on Human Beings!”. In it, he outlined a number of experiments they performed on their users, one of which was to tell people who were “bad matches” (only 30 per cent compatible, according to their algorithm) that they were actually “exceptionally good for each other” (which usually requires a 90 per cent match). OkCupid wanted to see if mere suggestion would inspire people to like each other (answer: yes). It was a technological placebo. The experiment found that the power of suggestion works – but so does the bona fide OkCupid algorithm. Outraged debates ensued, with Rudder defensive. “This is the only way to find this stuff out,” he said, in one heated radio interview. “If you guys have an alternative to the scientific method, I’m all ears.”…
The debate, says Mark Earls, should primarily be about civic responsibility, even before the ethical concerns. Earls is a towering figure in the world of advertising and communication, and his book Herd: How to Change Mass Behaviour By Harnessing our True Nature, was a gamechanger in how people in the industry thought about what drives us to make decisions. That was a decade ago, before Facebook, and it’s increasingly clear that his theories were prescient.
He kept an eye on the Facebook experiment furore, and was, he says, heavily against the whole concept. “They’re supporting the private space between people, their contacts and their social media life,” he says. “And then they abused it.”…”
Transparency is not just television
Beth Noveck at the SunLight Foundation Blog: “In 2009, Larry Lessig published a headline-grabbing piece in the New Republic entitled “Against Transparency,” arguing that the “naked transparency movement” might inspire disgust in, rather than reform of, our political system. In their recent Brookings Institution paper, “Why Critics of Transparency Are Wrong,” Gary Bass, Danielle Brian and Norm Eisen rightly critique the latest generation of naysayers who contend that transparency has contributed to our political ills, and that efforts to reduce transparency will cause government to work better. The problem with suggesting that transparency is the root of extreme partisan gridlock, the absence of dealmaking, and the lowest rates of trust by the American people in Congress, however, is that there is no real transparency in that institution.
If there is any shortcoming in Bass, Brian and Eisen’s robust defense of transparency, it is that they should be even tougher in rapping these other writers across the knuckles, including for some critics’ unsophisticated equation of television cameras in the chamber with transparency.
Even in our media-savvy age, televising congressional deliberations (if you can call them that) – what we might call political transparency – surely contributes too little to policy transparency. It lays bare the spectacle but does not provide access to the kinds of information disclosures about the workings of Congress in a way that also affords people an opportunity to participate in changing those workings and that Bass, Brian and Eisen also support. Done right, transparency provides an empirical foundation for developing solutions together. If the Brits can have a 21st Century parliament initiative, we are long overdue for a 21st century Congress….”
Designing a Citizen Science and Crowdsourcing Toolkit for the Federal Government
In the 2013 Second Open Government National Action Plan, President Obama called on Federal agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods, such as citizen science and crowdsourcing, to help address a wide range of scientific and societal problems.
Citizen science is a form of open collaboration in which members of the public participate in the scientific process, including identifying research questions, collecting and analyzing data, interpreting results, and solving problems. Crowdsourcing is a process in which individuals or organizations submit an open call for voluntary contributions from a large group of unknown individuals (“the crowd”) or, in some cases, a bounded group of trusted individuals or experts.
Citizen science and crowdsourcing are powerful tools that can help Federal agencies:
- Advance and accelerate scientific research through group discovery and co-creation of knowledge. For instance, engaging the public in data collection can provide information at resolutions that would be difficult for Federal agencies to obtain due to time, geographic, or resource constraints.
- Increase science literacy and provide students with skills needed to excel in science, technology, engineering, and math (STEM). Volunteers in citizen science or crowdsourcing projects gain hands-on experience doing real science, and take that learning outside of the classroom setting.
- Improve delivery of government services with significantly lower resource investments.
- Connect citizens to the missions of Federal agencies by promoting a spirit of open government and volunteerism.
To enable effective and appropriate use of these new approaches, the Open Government National Action Plan specifically commits the Federal government to “convene an interagency group to develop an Open Innovation Toolkit for Federal agencies that will include best practices, training, policies, and guidance on authorities related to open innovation, including approaches such as incentive prizes, crowdsourcing, and citizen science.”
On November 21, 2014, the Office of Science and Technology Policy (OSTP) kicked off development of the Toolkit with a human-centered design workshop. Human-centered design is a multi-stage process that requires product designers to engage with different stakeholders in creating, iteratively testing, and refining their product designs. The workshop was planned and executed in partnership with the Office of Personnel Management’s human-centered design practice known as “The Lab” and the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS), a growing network of more than 100 employees from more than 20 Federal agencies….
The Toolkit will help further the culture of innovation, learning, sharing, and doing in the Federal citizen science and crowdsourcing community: indeed, the development of the Toolkit is a collaborative and community-building activity in and of itself.
The following successful Federal projects illustrate the variety of possible citizen science and crowdsourcing applications:
- The Citizen Archivist Dashboard (NARA) coordinates crowdsourced archival record tagging and document transcription. Recently, more than 170,000 volunteers indexed 132 million names of the 1940 Census in only five months, which NARA could not have done alone.
- Through Measuring Broadband America (FCC), 2 million volunteers collected and provided the FCC with data on their Internet speeds, data that FCC used to create a National Broadband Map revealing digital divides.
- In 2014, Nature’s Notebook (USGS, NSF) volunteers recorded more than 1 million observations on plants and animals that scientists use to analyze environmental change.
- Did You Feel It? (USGS) has enabled more than 3 million people worldwide to share their experiences during and immediately after earthquakes. This information facilitates rapid damage assessments and scientific research, particularly in areas without dense sensor networks.
- The mPING (NOAA) mobile app has collected more than 600,000 ground-based observations that help verify weather models.
- USAID anonymized and opened its loan guarantee data to volunteer mappers. Volunteers mapped 10,000 data points in only 16 hours, compared to the 60 hours officials expected.
- The Air Sensor Toolbox (EPA), together with training workshops, scientific partners, technology evaluations, and a scientific instrumentation loan program, empowers communities to monitor and report local air pollution.
In early 2015, OSTP, in partnership with the Challenges and Prizes Community of Practice, will convene Federal practitioners to develop the other half of the Open Innovation Toolkit for prizes and challenges. Stay tuned!”