The Alberta CoLab Story: Redesigning the policy development process in government


Alex Ryan at Medium: “Alberta CoLab is an evolving experiment built on three counter-intuitive ideas:

1. Culture shifts faster through collaborative project work than through a culture change initiative.

2. The way to accelerate policy development is to engage more perspectives and more complexity.

3. The best place to put a cross-ministry design team is in a line ministry.

I want to explain what CoLab is and why it has evolved the way it has. We don’t view CoLab as a best practice to be replicated, since our model is tailored to the specific culture and context of Alberta. Perhaps you are also trying to catalyze innovation inside a large bureaucratic organization. I hope you can learn something from our journey so far….

….Both the successes and frustrations of Alberta CoLab are consequences of the way that we have mediated some key tensions and tradeoffs involved with setting up a public sector innovation lab. Practitioners in other labs will likely recognize these tensions and tradeoffs, although your successes and frustrations will be different depending on how your business model reconciles them.

  1. Where should the lab be? Public innovation labs can exist inside, outside, or on the edge of government. Dubai The Model Centre and Alberta CoLab operate inside government. Inside labs have the best access to senior decision makers and the authority to convene whole-of-government collaborations, but may find it harder to engage openly with citizens and stakeholders. UNICEF Innovation Labs and NouLab exist outside of government. Outside labs have more freedom in who they convene, the kind of container they can create, and timelines to impact, but find it harder to connect with and affect policy change. MindLab and MaRS Solutions Lab are examples of labs on the edge of government. This positioning can offer the best of both worlds. However, edge labs are vulnerable to fluctuations in their relationship with government. Surviving and thriving on the edge means continually walking a tightrope between autonomy and integration. Labs can change their positioning. Alberta CoLab began as an external consulting project. The Behavioural Insights Team is a social purpose company that was spun off from a lab inside the U.K. government. The location of the lab is unlikely to change often, so it is an important strategic choice.
  2. How deep should the lab go? Here the tension is between taking on small, tactical improvement projects that deliver tangible results, or tackling the big, strategic systems changes that will take years to manifest. Public sector innovation labs are a reaction to the almost total failure of traditional approaches to move the needle on systems change. Therefore, most labs have aspirations to the strategic and the systemic. Yet most labs are also operating in a dominant culture that demands quick wins and measures success by linear progress against a simple logic model theory of change. We believe that operating at either extreme of this spectrum is equally misguided. We use a portfolio approach and a barbell strategy to mediate this tension. Having a portfolio of projects allows us to invest energy in systems change and generate immediate value. It allows us to balance our projects across three horizons of innovation: sustaining innovations, disruptive innovations, and transformative innovations. A barbell strategy means avoiding the middle of the bell curve. We maintain a small number of long-term, flagship initiatives, combined with a rapid turnover of quick-win projects. This allows us to remind the organization of our immediate value without sacrificing long-term commitment to systems change.
  3. What relationship should the lab have with government? Even an inside lab must create some distance between itself and the broader government culture if it is to provide a safe space for innovation. There is a tension between being separate and being integrated. Developing novel ideas that get implemented requires the lab to be both separate and integrated at the same time. You need to decouple from regular policy cycles to enable divergence and creativity, yet provide input into key decisions at the right time. Sometimes these decision points are known in advance, but more often this means sensing and responding to a dynamic decision landscape. Underneath any effective lab is a powerful social network, which needs to cut across government silos and stratas and draw in external perspectives. I think of a lab as having a kind of respiratory rhythm. It starts by bringing fresh ideas into the organization, like a deep breath that provides the oxygen for new thinking. But new ideas are rarely welcome in old organizations. When the lab communicates outwards, these new ideas should be translated into familiar language and concepts, and then given a subtle twist. Often labs believe they have to differentiate their innovations — to emphasize novelty — to justify their existence as an innovation lab. But the more the output of the lab resembles the institutional culture, the more it appears obvious and familiar, the more likely it will be accepted and integrated into the mainstream.
  4. What relationship should the lab have with clients? Alberta CoLab is a kind of in-house consultancy that provides services to clients across all ministries. There is a tension in the nature of the relationship, which can span from consulting problem-solver to co-design facilitator to teacher. The main problem with a consulting model is that it often builds dependency rather than capacity. The challenge with an educational relationship is that clients struggle to apply theory that is disconnected from practice. We often use facilitation as a ‘cover’ for our practice, because it allows us to design a process that enables both reflective practice and situated learning. By teaching systemic design and strategic foresight approaches through taking on live projects, we build capacity while doing the work our clients need to do anyway. This helps to break down barriers between theory and practice, learning and doing. Another tension is between doing what the client says she wants and what she needs but does not articulate. Unlike a customer, who is always right, the designer has a duty of care to their client. This involves pushing back when the client’s demands are unreasonable, reframing the challenge when the problem received is a symptom of a deeper issue, and clearly communicating the risks and potential side effects of policy options. As Denys Lasdun has said about designers: “Our job is to give the client, on time and on cost, not what he wants, but what he never dreamed he wanted; and when he gets it, he recognizes it as something he wanted all the time.”
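The barbell strategy described in the second tension above can be sketched as a simple portfolio check: many quick wins, a few flagships, and as little as possible in between. The project names, durations, and thresholds below are invented for illustration and are not CoLab’s actual selection criteria.

```python
# A hypothetical sketch of a "barbell" project portfolio: classify projects
# by duration and keep the middle of the distribution small. All names and
# thresholds here are illustrative assumptions, not CoLab's real criteria.
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    months: int  # expected duration in months

def barbell_mix(projects, quick=3, flagship=24):
    """Split a portfolio into quick wins, flagship initiatives, and the middle."""
    quick_wins = [p for p in projects if p.months <= quick]
    flagships = [p for p in projects if p.months >= flagship]
    middle = [p for p in projects if quick < p.months < flagship]
    return quick_wins, flagships, middle

portfolio = [
    Project("service journey map", 2),
    Project("licensing quick fix", 1),
    Project("energy futures initiative", 36),
    Project("mid-size pilot", 10),
]
quick_wins, flagships, middle = barbell_mix(portfolio)
print(len(quick_wins), len(flagships), len(middle))  # → 2 1 1
```

A healthy barbell, in this sketch, is one where the `middle` list stays near empty while both ends carry weight.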

Lessons Learned

These are our top lessons learned from our journey to date that may have broader applicability.

  1. Recruit outsiders and insiders. Bringing in outside experts elevates the lab’s status. Outsiders are essential to question and challenge organizational patterns that insiders take as given. Insiders bring an understanding of organizational culture. They know how to move files through the bureaucracy and they know where the landmines are.
  2. Show don’t tell. As lab practitioners, we tend to be process geeks with a strong belief in the superiority of our own methods. There is a temptation to cast oneself in the role of the missionary bringing the good word to the unwashed masses. Not only is this arrogant, it’s counter-productive. It’s much more effective to show your clients how your approach adds value by starting with a small collaborative project. If your approach really is as good as you believe it is, the results will speak for themselves. Once people are envious of the results you have achieved, they will be curious and open to learning how you did it, and they will demand more of it.
  3. Be a catalyst, not a bottleneck. Jess McMullin gave us this advice when we founded CoLab. It’s why we developed a six-day training course to train over 80 systemic designers across the government. It’s why we run communities of practice on systemic design and strategic foresight. And it’s why we publish about our experiences and share the toolkits we develop. If the innovation lab is an ivory tower, it will not change the way government works. Think instead of the lab as the headquarters of a democratic grassroots movement.
  4. Select projects based on the potential for reframing. There are many criteria we apply when we decide whether to take on a new project. Is it a strategic priority? Is there commitment to implement? Are the client expectations realistic? Can our contribution have a positive impact? These are useful but apply to almost any service offering. The unique value a social innovation lab offers is discontinuous improvement. The source of discontinuous improvement is reframing — seeing a familiar challenge with new eyes, from a different perspective that opens up new potential for positive change. If a project ticks all the boxes, except that the client is certain they already know what the problem is, then that already limits the kind of solutions they will consider. Unless they are open to reframing, they will likely be frustrated by a lab approach, and would be better served by traditional facilitation or good project management.
  5. Prototyping is just the end of the beginning. After one year, we went around and interviewed the first 40 clients of Alberta CoLab. We wanted to know what they had achieved since our co-design sessions. Unfortunately, for most of them, the answer was “not much.” They were very happy with the quality of the ideas and prototypes generated while working with CoLab and were hopeful that the ideas would eventually see the light of day. But they also noted that once participants left the lab and went back to their desks, they found it difficult to sustain the momentum and excitement of the lab, and easy to snap back to business as usual. We had to pivot our strategy to take on fewer projects, but take on a greater stewardship role through to implementation.
  6. Find a rhythm. It’s not useful to create a traditional project plan with phases and milestones for a non-linear and open-ended discovery process like a lab. Yet without some kind of structure, it’s easy to lose momentum or become lost. The best projects I have participated in create a rhythm: an alternating movement between open collaboration and focused delivery. The lab opens up every few months to engage widely on what needs to be done and why. A core team then works between collaborative workshops on how to make it happen. Each cycle allows the group to frame key challenges, make progress, and receive feedback, which builds momentum and commitment.
  7. Be a good gardener. Most of the participants of our workshops arrive with a full plate. They are already 100% committed in their day jobs. Even when they are enthusiastic to ideate, they will be reluctant to take on any additional work. If we want our organizations to innovate, first we have to create the space for new work. We need to prune those projects that we have kept on life support — not yet declared dead but not priorities. This often means making difficult decisions. The flip side of pruning is to actively search for positive deviance and help it to grow. When you find something that’s already working, you just need to turn up the good…..(More)”

Innovation and Its Enemies: Why People Resist New Technologies


Book by Calestous Juma: “The rise of artificial intelligence has rekindled a long-standing debate regarding the impact of technology on employment. This is just one of many areas where exponential advances in technology signal both hope and fear, leading to public controversy. This book shows that many debates over new technologies are framed in the context of risks to moral values, human health, and environmental safety. But it argues that behind these legitimate concerns often lie deeper, but unacknowledged, socioeconomic considerations. Technological tensions are often heightened by perceptions that the benefits of new technologies will accrue only to small sections of society while the risks will be more widely distributed. Similarly, innovations that threaten to alter cultural identities tend to generate intense social concern. As such, societies that exhibit great economic and political inequities are likely to experience heightened technological controversies.

Drawing from nearly 600 years of technology history, Innovation and Its Enemies identifies the tension between the need for innovation and the pressure to maintain continuity, social order, and stability as one of today’s biggest policy challenges. It reveals the extent to which modern technological controversies grow out of distrust in public and private institutions. Using detailed case studies of coffee, the printing press, margarine, farm mechanization, electricity, mechanical refrigeration, recorded music, transgenic crops, and transgenic animals, it shows how new technologies emerge, take root, and create new institutional ecologies that favor their establishment in the marketplace. The book uses these lessons from history to contextualize contemporary debates surrounding technologies such as artificial intelligence, online learning, 3D printing, gene editing, robotics, drones, and renewable energy. It ultimately makes the case for shifting greater responsibility to public leaders to work with scientists, engineers, and entrepreneurs to manage technological change, make associated institutional adjustments, and expand public engagement on scientific and technological matters….(More)”

First, design for data sharing


John Wilbanks & Stephen H Friend in Nature Biotechnology: “To upend current barriers to sharing clinical data and insights, we need a framework that not only accounts for choices made by trial participants but also qualifies researchers wishing to access and analyze the data.

This March, Sage Bionetworks (Seattle) began sharing curated data collected from >9,000 participants of mPower, a smartphone-enabled health research study for Parkinson’s disease. The mPower study is notable as one of the first observational assessments of human health to rapidly achieve scale as a result of its design and execution purely through a smartphone interface. To support this unique study design, we developed a novel electronic informed consent process that includes participant-determined data-sharing preferences. It is through these preferences that the new data—including self-reported outcomes and quantitative sensor data—are shared broadly for secondary analysis. Our hope is that by sharing these data immediately, prior even to our own complete analysis, we will shorten the time to harnessing any utility that this study’s data may hold to improve the condition of patients who suffer from this disease.
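The participant-determined sharing preferences described above can be pictured as a simple filter applied before any data release. The tier names and record fields below are assumptions made for illustration; they are not the actual mPower consent schema or data model.

```python
# A minimal sketch of participant-determined data sharing: each record
# carries a consent preference, and only records whose preference permits
# the requested scope are released. Tier names ("broad", "study_only") and
# fields are illustrative assumptions, not mPower's real schema.
records = [
    {"participant": "p1", "sharing": "broad", "tap_speed": 4.2},
    {"participant": "p2", "sharing": "study_only", "tap_speed": 3.8},
    {"participant": "p3", "sharing": "broad", "tap_speed": 5.1},
]

def shareable(records, scope="broad"):
    """Release only records whose consent preference matches this scope."""
    return [r for r in records if r["sharing"] == scope]

released = shareable(records)
print(len(released))  # only broadly-consented records leave the study
```

The design point is that the consent choice travels with the data itself, so secondary release can be automated without revisiting each participant.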

Turbulent times for data sharing

Our release of mPower comes at a turbulent time in data sharing. The power of data for secondary research is top of mind for many these days. Vice President Joe Biden, in heading President Barack Obama’s ambitious cancer ‘moonshot’, describes data sharing as second in importance only to funding for the success of the effort. However, this powerful support for data sharing stands in opposition to the opinions of many within the research establishment. Witness the august New England Journal of Medicine (NEJM)’s recent editorial suggesting that those who wish to reuse clinical trial data without the direct participation and approval of the original study team are “research parasites”4. In the wake of colliding perspectives on data sharing, we must not lose sight of the scientific and societal ends served by such efforts.

It is important to acknowledge that meaningful data sharing is a nontrivial process that can require substantial investment to ensure that data are shared with sufficient context to guide data users. When data analysis is narrowly targeted to answer a specific and straightforward question—as with many clinical trials—this added effort might not result in improved insights. However, many areas of science, such as genomics, astronomy and high-energy physics, have moved to data collection methods in which large amounts of raw data are potentially of relevance to a wide variety of research questions, but the methodology of moving from raw data to interpretation is itself a subject of active research….(More)”

Website Seeks to Make Government Data Easier to Sift Through


Steve Lohr at the New York Times: “For years, the federal government, states and some cities have enthusiastically made vast troves of data open to the public. Acres of paper records on demographics, public health, traffic patterns, energy consumption, family incomes and many other topics have been digitized and posted on the web.

This abundance of data can be a gold mine for discovery and insights, but finding the nuggets can be arduous, requiring special skills.

A project coming out of the M.I.T. Media Lab on Monday seeks to ease that challenge and to make the value of government data available to a wider audience. The project, called Data USA, bills itself as “the most comprehensive visualization of U.S. public data.” It is free, and its software code is open source, meaning that developers can build custom applications by adding other data.

Cesar A. Hidalgo, an assistant professor of media arts and sciences at the M.I.T. Media Lab who led the development of Data USA, said the website was devised to “transform data into stories.” Those stories are typically presented as graphics, charts and written summaries….Type “New York” into the Data USA search box, and a drop-down menu presents choices — the city, the metropolitan area, the state and other options. Select the city, and the page displays an aerial shot of Manhattan with three basic statistics: population (8.49 million), median household income ($52,996) and median age (35.8).

Lower on the page are six icons for related subject categories, including economy, demographics and education. If you click on demographics, one of the so-called data stories appears, based largely on data from the American Community Survey of the United States Census Bureau.

Using colorful graphics and short sentences, it shows the median age of foreign-born residents of New York (44.7) and of residents born in the United States (28.6); the most common countries of origin for immigrants (the Dominican Republic, China and Mexico); and the percentage of residents who are American citizens (82.8 percent, compared with a national average of 93 percent).
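The kind of “data story” summary described above boils down to simple aggregate statistics over census-style records. The sketch below illustrates that computation; the records are made up for illustration and are not American Community Survey figures, and the field names are assumptions rather than Data USA’s actual data model.

```python
# An illustrative sketch of computing Data USA-style summary statistics
# (median ages, citizenship share) from census-like records. The sample
# records and field names below are invented, not Census Bureau data.
from statistics import median

residents = [
    {"age": 31, "foreign_born": True, "citizen": True},
    {"age": 45, "foreign_born": True, "citizen": False},
    {"age": 27, "foreign_born": False, "citizen": True},
    {"age": 29, "foreign_born": False, "citizen": True},
]

# Median age, split by place of birth.
median_age_foreign = median(r["age"] for r in residents if r["foreign_born"])
median_age_native = median(r["age"] for r in residents if not r["foreign_born"])

# Share of residents who are citizens.
citizen_share = sum(r["citizen"] for r in residents) / len(residents)

print(median_age_foreign, median_age_native, round(citizen_share, 2))
```

Presenting each number next to a comparable baseline (city versus national average, foreign-born versus native-born) is what turns the raw statistic into a short written story.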

Data USA features a selection of data results on its home page. They include the gender wage gap in Connecticut; the racial breakdown of poverty in Flint, Mich.; the wages of physicians and surgeons across the United States; and the institutions that award the most computer science degrees….(More)

Smart Cities Readiness Guide


SmartCitiesCouncil: “Welcome to the Readiness Guide. This document was assembled with input from many of the world’s leading smart city practitioners – the members and advisors of the Smart Cities Council. It will help you create a vision for the future of your own city. Equally important, it will help you build an action plan to get to that better future.

The first goal of the Readiness Guide is to give you a “vision” of a smart city, to help you understand how technology will transform the cities of tomorrow.

The second goal is to help you construct your own roadmap to that future. It suggests the goals to which you should aspire, the features and functions you should specify, the best practices that will gain you the maximum benefits for the minimum cost, at reduced risk.

The Readiness Guide is intended for mayors, city managers, city planners and their staffs. It helps cities help themselves by providing objective, vendor-neutral information to make confident, educated choices about the technologies that can transform a city.

Cities around the world are already making tremendous progress in achieving economic, environmental and social sustainability, in export-based initiatives and in the creation of 21st century jobs. All of these are excellent ways to improve city living standards and economies. The concept of smart cities doesn’t compete with these efforts. Instead, smart city technologies can support and enhance work already underway….(More)”

Crowdsourcing On-street Parking Space Detection


Paper by Ruizhi Liao et al.: “As the number of vehicles continues to grow, parking spaces are at a premium in city streets. Additionally, due to the lack of knowledge about street parking spaces, heuristically circling the block not only costs drivers’ time and fuel, but also increases city congestion. In the wake of the recent trend to build convenient, green and energy-efficient smart cities, we rethink common techniques adopted by high-profile smart parking systems, and present a user-engaged (crowdsourcing) and sonar-based prototype to identify urban on-street parking spaces. The prototype includes an ultrasonic sensor, a GPS receiver and associated Arduino micro-controllers. It is mounted on the passenger side of a car to measure the distance from the vehicle to the nearest roadside obstacle. Multiple road tests are conducted around Wheatley, Oxford to gather results and emulate the crowdsourcing approach. By extracting parked vehicles’ features from the collected trace, a supervised learning algorithm is developed to estimate roadside parking occupancy and spot illegal parking vehicles. A quantity estimation model is derived to calculate the required number of sensing units to cover urban streets. The estimation is quantitatively compared to a fixed sensing solution. The results show that the crowdsourcing approach would need substantially fewer sensors compared to the fixed sensing system…(More)”
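The core inference in the abstract above is turning side-facing ultrasonic distance readings into occupancy labels. The toy sketch below uses a fixed distance cutoff to convey the idea; the paper itself trains a supervised classifier on extracted vehicle features, and the threshold and readings here are invented for illustration.

```python
# A hedged sketch of inferring roadside parking occupancy from ultrasonic
# distance readings taken while driving past parked cars. A short echo
# (obstacle close to the sensor) suggests a parked vehicle; a long one
# suggests empty curb. The 2.0 m cutoff and the trace are illustrative
# assumptions; the paper uses a trained classifier, not a fixed threshold.

def classify_readings(distances_m, curb_threshold=2.0):
    """Label each distance reading as an occupied or vacant stretch of curb."""
    return ["occupied" if d < curb_threshold else "vacant" for d in distances_m]

# Simulated pass along one block: near echoes where cars are parked.
trace = [0.9, 1.1, 3.5, 3.6, 1.0, 3.4]
labels = classify_readings(trace)
print(labels.count("occupied"))  # → 3
```

Crowdsourcing enters when many instrumented cars contribute such traces: each pass is noisy, but overlapping passes over the same street let the system average out errors that a single fixed sensor could not.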

A machine intelligence commission for the UK


Geoff Mulgan at NESTA: ” This paper makes the case for creating a Machine Intelligence Commission – a new public institution to help the development of new generations of algorithms, machine learning tools and uses of big data, ensuring that the public interest is protected.

I argue that new institutions of this kind – which can interrogate, inspect and influence technological development – are a precondition for growing informed public trust. That trust will, in turn, be essential if we are to reap the full potential public and economic benefits from new technologies. The proposal draws on lessons from fields such as human fertilisation, biotech and energy, which have shown how trust can be earned, and how new industries can be grown.  It also draws on lessons from the mistakes made in fields like GM crops and personal health data, where lack of trust has impeded progress….(More)”

Technology and the Future of Cities


Mark Gorenberg, Craig Mundie, Eric Schmidt and Marjory Blumenthal at PCAST: “Growing urbanization presents the United States with an opportunity to showcase its innovation strength, grow its exports, and help to improve citizens’ lives – all at once. Seizing this triple opportunity will involve a concerted effort to develop and apply new technologies to enhance the way cities work for the people who live there.

A new report released today by the President’s Council of Advisors on Science and Technology (PCAST), Technology and the Future of Cities, lays out why now is a good time to promote technologies for cities: more (and more diverse) people are living in cities; people are increasingly open to different ways of using space, living, working, and traveling across town; physical infrastructures for transportation, energy, and water are aging; and a wide range of innovations are in reach that can yield better infrastructures and help in the design and operation of city services.

There are also new ways to collect and use information to design and operate systems and services. Better use of information can help make the most of limited resources – whether city budgets or citizens’ time – and help make sure that the neediest as well as the affluent benefit from new technology.

Although the vision of technology’s promise applies city-wide, PCAST suggests that a practical way for cities to adopt infrastructural and other innovation is by starting in a discrete area  – a district, the dimensions of which depend on the innovation in question. Experiences in districts can help inform decisions elsewhere in a given city – and in other cities. PCAST urges broader sharing of information about, and tools for, innovation in cities.

Such sharing is already happening in isolated pockets focused on either specific kinds of information or recipients of specific kinds of funding. A more comprehensive City Web, achieved through broader interconnection, could inform and impel urban innovation. A systematic approach to developing open-data resources for cities is recommended, too.

PCAST recommends a variety of steps to make the most of the Federal Government’s engagement with cities. To begin, it calls for more – and more effective – coordination among Federal agencies that are key to infrastructural investments in cities.  Coordination across agencies, of course, is the key to place-based policy. Building on the White House Smart Cities Initiative, which promotes not only R&D but also deployment of IT-based approaches to help cities solve challenges, PCAST also calls for expanding research and development coordination to include the physical, infrastructural technologies that are so fundamental to city services.

A new era of city design and city life is emerging. If the United States steers Federal investments in cities in ways that foster innovation, the impacts can be substantial. The rest of the world has also seen the potential, with numerous cities showcasing different approaches to innovation. The time to aim for leadership in urban technologies and urban science is now….(More)”

Cities want to get smarter, so why is it taking so long?


Kevin Ebi at Smart Cities Council: “Most cities and utilities want to get smarter. They see the smart cities movement as delivering more than some incremental improvement. They see it as a meaningful transformation — one that delivers far more than just some cost savings.

Despite all that, the latest Black & Veatch Strategic Directions: U.S. Smart City/Smart Utility Report finds they plan to move slower — not faster — to become smarter. But understanding the obstacles can help you overcome them.

First, the good news
Cities don’t need to be sold on the idea of becoming smarter. More than 90% see the smart cities movement as being transformational with long-term lasting impacts.

Nearly 80% believe it should start with initiatives that have lasting benefits — even if that work is largely behind the scenes (and therefore less likely for the public to notice). A similar number also believe that data analytics will significantly improve decision making. And nearly all believe it’s a comprehensive effort; it’s more than just buying some new technology.

The smart cities revolution is also inclusive. More than three-quarters say that energy, water and telecommunications providers should play a leadership role in smart cities initiatives — they shouldn’t be relegated to a supporting role.

And growing numbers see smart cities initiatives as something more than just a vehicle to cut costs. This year, more respondents — cities leaders and utilities alike — see the potential to become more sustainable, better manage community resources and to attract business investment.

But there’s also room for improvement
Despite clearly understanding the value of smart cities initiatives, the survey finds respondents are losing faith that the transition can happen quickly. Last year, the study found that nearly 1 in 5 thought the smart cities model would be widespread in American cities within the next five years. This year, not even 1 in 10 believe that timeline is achievable.

Instead, more than a third now believe the implementation could take a decade. Nearly a quarter believe it could take 15 years. More than 80% believe the U.S. is lagging the world in the smart cities revolution.

What’s holding them back
Part of the problem may be a big knowledge gap. While people responding to the survey say they understand the potential, more than half say their city still doesn’t understand what it means to be a “smart city.”

And while half the cities and utilities are assessing their readiness — a third are even working on roadmaps — nearly two-thirds still don’t understand where the payoff point is. That may be adding to the money woes….(More)”

How Citizen Science Changed the Way Fukushima Radiation is Reported


Ari Beser at National Geographic: “It appears the world-changing event didn’t change anything, and it’s disappointing,” said Pieter Franken, a researcher at Keio University in Japan (Wide Project), the MIT Media Lab (Civic Media Centre), and co-founder of Safecast, a citizen-science network dedicated to the measurement and distribution of accurate levels of radiation around the world, especially in Fukushima. “There was a chance after the disaster for humanity to innovate our thinking about energy, and that doesn’t seem like it’s happened. But what we can change is the way we measure the environment around us.”

Franken and his founding partners found a way to turn their email chain, spurred by the tsunami, into Safecast, an open-source network that allows everyday people to contribute to radiation monitoring.

“We literally started the day after the earthquake happened,” revealed Pieter. “A friend of mine, Joi Ito, the director of MIT Media Lab, and I were basically talking about what Geiger counter to get. He was in Boston at the time and I was here in Tokyo, and like the rest of the world, we were worried, but we couldn’t get our hands on anything. There’s something happening here, we thought. Very quickly as the disaster developed, we wondered how to get the information out. People were looking for information, so we saw that there was a need. Our plan became: get information, put it together and disseminate it.”

An e-mail thread between Franken, Ito, and Sean Bonner (co-founder of CRASH Space, a group that bills itself as Los Angeles’ first hackerspace) evolved into a network of minds, including members of Tokyo Hackerspace, Dan Sythe, who produced high-quality Geiger counters, and Ray Ozzie, Microsoft’s former Chief Technical Officer. On April 15, the group that was to become Safecast sat down together for the first time. Ozzie conceived the plan to strap a Geiger counter to a car and somehow log measurements in motion. This would become the bGeigie, Safecast’s future model of the do-it-yourself Geiger counter kit.

Armed with a few Geiger counters donated by Sythe, the newly formed team retrofitted their radiation-measuring devices to the outside of a car.  Safecast’s first volunteers drove up to the city of Koriyama in Fukushima Prefecture, and took their own readings around all of the schools. Franken explained, “If we measured all of the schools, we covered all the communities; because communities surround schools. It was very granular, the readings changed a lot, and the levels were far from academic, but it was our start. This was April 24, 6 weeks after the disaster. Our thinking changed quite a bit through this process.”

With the DIY kit available online, all anyone needs to make their own Geiger counter is a soldering iron and the suggested directions.

Since their first tour of Koriyama, with the help of a successful Kickstarter campaign, Safecast’s team of volunteers have developed the bGeigie handheld radiation monitor, that anyone can buy on Amazon.com and construct with suggested instructions available online. So far over 350 users have contributed 41 million readings, using around a thousand fixed, mobile, and crowd-sourced devices….(More)
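The essence of a bGeigie-style mobile logger is pairing each Geiger count with a GPS fix and writing the result to a trace file. The sketch below illustrates that idea; the field names, CSV layout, conversion factor, and sample values are assumptions made for illustration, not Safecast’s actual log format or calibration.

```python
# A rough sketch of a mobile radiation logger: geotagged counts-per-minute
# readings converted to microsieverts/hour and written as a CSV trace.
# The conversion factor is tube-dependent and the one used here, like the
# field names and sample values, is an illustrative assumption rather than
# Safecast's real calibration or log format.
import csv
import io

CPM_PER_USV_H = 334  # assumed counts-per-minute per microsievert/hour

samples = [
    {"lat": 37.40, "lon": 140.36, "cpm": 120},
    {"lat": 37.41, "lon": 140.37, "cpm": 410},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["lat", "lon", "cpm", "usv_h"])
writer.writeheader()
for s in samples:
    writer.writerow({**s, "usv_h": round(s["cpm"] / CPM_PER_USV_H, 3)})

print(buf.getvalue().splitlines()[1])  # first logged data row
```

Because every row is geotagged, many independent volunteers’ traces can be merged into a single map, which is what lets a thousand devices produce tens of millions of comparable readings.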