Using Data to Help People in Distress Get Help Faster


Nicole Wallace in The Chronicle of Philanthropy: “Answering text messages to a crisis hotline is different from handling customer-service calls: You don’t want counselors to answer folks in the order their messages were received. You want them to take the people in greatest distress first.

Crisis Text Line, a charity that provides counseling by text message, uses sophisticated data analysis to predict how serious the conversations are likely to be and ranks them by severity. Using an algorithm to automate triage ensures that people in crisis get help fast — with an unexpected side benefit for other texters contacting the hotline: shorter wait times.

When the nonprofit started in 2013, deciding which messages to take first was much more old-school. Counselors had to read all the messages in the queue and make a gut-level decision on which person was most in need of help.

“It was slow,” says Bob Filbin, the organization’s chief data scientist.

To solve the problem, Mr. Filbin and his colleagues used past messages to the hotline to create an algorithm that analyzes the language used in incoming messages and ranks them in order of predicted severity.

And it’s working. Since the algorithm went live on the platform, messages it marked as severe — code orange — led to conversations that were six times more likely to include thoughts of suicide or self-harm than conversations started by texts not marked code orange, and nine times more likely to result in the counselor contacting emergency services to intervene in a suicide attempt.

Counselors don’t even see the queue of waiting texts anymore. They just click a button marked “Help Another Texter,” and the system connects them to the person whose message has been marked most urgent….(More)”
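The triage described above pairs a severity model with a priority queue: incoming messages are scored, and “Help Another Texter” pops whichever is currently most urgent. Here is a minimal sketch of that mechanic. The keyword weights are hypothetical placeholders; Crisis Text Line’s actual model is trained on historical conversations and is far more sophisticated than word counting.

```python
import heapq
import itertools

# Hypothetical severity cues; the real model is learned from past conversations.
SEVERITY_WEIGHTS = {"suicide": 10, "pills": 8, "hurt": 6, "alone": 2, "sad": 1}

class TriageQueue:
    """Max-severity queue: 'Help Another Texter' pops the most urgent message."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break for equal severity

    def score(self, message):
        words = message.lower().split()
        return sum(SEVERITY_WEIGHTS.get(w, 0) for w in words)

    def add(self, message):
        # Negate the score so heapq's min-heap behaves as a max-heap.
        heapq.heappush(self._heap, (-self.score(message), next(self._counter), message))

    def help_another_texter(self):
        _, _, message = heapq.heappop(self._heap)
        return message

queue = TriageQueue()
queue.add("feeling a bit sad today")
queue.add("i want to hurt myself i took pills")
print(queue.help_another_texter())  # the higher-severity message comes out first
```

The point of the design is exactly what the article notes: counselors never see the queue itself, only the next most urgent person.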

UN-Habitat Urban Data Portal


Data Driven Journalism: UN-Habitat has launched a new web portal featuring a wealth of city data based on its repository of research on urban trends.

Launched during the 25th Governing Council, the Urban Data Portal allows users to explore data from 741 cities in 220 countries, and compare these for 103 indicators such as slum prevalence and city prosperity.

Image: A comparison of share in national urban population and average annual rate of urban population change for San Salvador, El Salvador, and Asuncion, Paraguay.

The urban indicators data available are analyzed, compiled and published by UN-Habitat’s Global Urban Observatory, which supports governments, local authorities and civil society organizations in developing urban indicators, data and statistics.

Leveraging GIS technology, the Observatory collects data by taking aerial photographs, zooming into particular areas, and then sending in survey teams to answer any remaining questions about the area’s urban development.

The Portal also contains data collected by national statistics authorities, via household surveys and censuses, with analysis conducted by leading urbanists in UN-HABITAT’s State of the World’s Cities and the Global Report on Human Settlements report series.

For the first time, these datasets are available for use under an open licence agreement, and can be downloaded in straightforward database formats like CSV and JSON….(More)
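Since the Portal’s datasets come down in plain CSV or JSON, comparing indicators across cities needs nothing more than a standard parser. A sketch of that workflow, using the standard library only; the column names and values below are illustrative stand-ins, not the Portal’s actual schema:

```python
import csv
import io

# Hypothetical extract of a downloaded Portal CSV; real columns may differ.
RAW = """city,country,indicator,value
San Salvador,El Salvador,share_national_urban_population,27.6
Asuncion,Paraguay,share_national_urban_population,48.4
"""

def compare(raw_csv, indicator, cities):
    """Return {city: value} for one indicator across the requested cities."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return {r["city"]: float(r["value"])
            for r in rows
            if r["indicator"] == indicator and r["city"] in cities}

print(compare(RAW, "share_national_urban_population", {"San Salvador", "Asuncion"}))
```

The same function would work unchanged on a file downloaded from the Portal, which is the practical payoff of publishing under an open licence in straightforward formats.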

Ethical Reasoning in Big Data


Book edited by Jeff Collmann and Sorin Adam Matei: “This book springs from a multidisciplinary, multi-organizational, and multi-sector conversation about the privacy and ethical implications of research in human affairs using big data. The need to cultivate and enlist the public’s trust in the abilities of particular scientists and scientific institutions constitutes one of this book’s major themes. The advent of the Internet, the mass digitization of research information, and social media brought about, among many other things, the ability to harvest – sometimes implicitly – a wealth of human genomic, biological, behavioral, economic, political, and social data for the purposes of scientific research as well as commerce, government affairs, and social interaction. What type of ethical dilemmas did such changes generate? How should scientists collect, manipulate, and disseminate this information? The effects of this revolution and its ethical implications are wide-ranging.

This book includes the opinions of myriad investigators, practitioners, and stakeholders in big data on human beings who also routinely reflect on the privacy and ethical issues of this phenomenon. Dedicated to the practice of ethical reasoning and reflection in action, the book offers a range of observations, lessons learned, reasoning tools, and suggestions for institutional practice to promote responsible big data research on human affairs. It caters to a broad audience of educators, researchers, and practitioners. Educators can use the volume in courses related to big data handling and processing. Researchers can use it for designing new methods of collecting, processing, and disseminating big data, whether in raw form or as analysis results. Lastly, practitioners can use it to steer future tools or procedures for handling big data. As this topic represents an area of great interest that still remains largely undeveloped, this book is sure to attract significant interest by filling an obvious gap in currently available literature. …(More)”

The “Social Side” of Public Policy: Monitoring Online Public Opinion and Its Mobilization During the Policy Cycle


Andrea Ceron and Fedra Negri in Policy & Internet: “This article addresses the potential role played by social media analysis in promoting interaction between politicians, bureaucrats, and citizens. We show that in a “Big Data” world, the comments posted online by social media users can profitably be used to extract meaningful information, which can support the action of policymakers along the policy cycle. We analyze Twitter data through the technique of Supervised Aggregated Sentiment Analysis. We develop two case studies related to the “jobs act” labor market reform and the “#labuonascuola” school reform, both formulated and implemented by the Italian Renzi cabinet in 2014–15. Our results demonstrate that social media data can help policymakers to rate the available policy alternatives according to citizens’ preferences during the formulation phase of a public policy; can help them to monitor citizens’ opinions during the implementation phase; and capture stakeholders’ mobilization and de-mobilization processes. We argue that, although social media analysis cannot replace other research methods, it provides a fast and cheap stream of information that can supplement traditional analyses, enhancing responsiveness and institutional learning….(More)”
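The aggregation step the authors rely on can be illustrated in miniature. Supervised Aggregated Sentiment Analysis (in the Hopkins and King tradition) estimates opinion-category proportions for the whole stream directly rather than classifying tweets one by one; the sketch below substitutes a naive classify-and-count stand-in with hypothetical cue words, purely to show the shape of the output policymakers would monitor:

```python
from collections import Counter

def classify(tweet, positive_cues, negative_cues):
    """Naive cue-matching stand-in for a trained supervised classifier."""
    text = tweet.lower()
    pos = sum(cue in text for cue in positive_cues)
    neg = sum(cue in text for cue in negative_cues)
    return "positive" if pos > neg else "negative" if neg > pos else "neutral"

def aggregate(tweets, positive_cues, negative_cues):
    """Share of each opinion category across the whole stream of comments."""
    counts = Counter(classify(t, positive_cues, negative_cues) for t in tweets)
    total = sum(counts.values())
    return {label: round(n / total, 2) for label, n in counts.items()}

# Toy comments about a reform; cue words are invented for illustration.
tweets = [
    "the jobs act will finally fix hiring",
    "this reform is a disaster for workers",
    "reading about the jobs act today",
]
print(aggregate(tweets, positive_cues=("finally", "fix"), negative_cues=("disaster",)))
```

Tracking these proportions over time during formulation and implementation is the “fast and cheap stream of information” the article describes.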

Science to the People


David Lang on how citizen science bridges the gap between science and society: “It’s hard to find a silver lining in the water crisis in Flint, Michigan. The striking images of jugs of brown water being held high in protest are a symbol of institutional failure on a grand scale. It’s a disaster. But even as questions of accountability and remedy remain unanswered, there is already one lesson we can take away: Citizen science can be used as a powerful tool to build (or rebuild) the public’s trust in science.

Because the other striking image from Flint is this: Citizen-scientists sampling and testing their own water, from their homes and neighborhoods, and reporting the results as scientific data. Dr. Marc Edwards is the Virginia Tech civil engineering professor who led the investigation into the lead levels in Flint’s water supply, and in a February 2016 interview with The Chronicle of Higher Education, he gave an important answer about the methods his team used to obtain the data: “Normal people really appreciate good science that’s done in their interest. They stepped forward as citizen-scientists to explore what was happening to them and to their community, we provided some funding and the technical and analytical expertise, and they did all the work. I think that work speaks for itself.”

It’s a subtle but important message: The community is rising up and rallying by using science, not by reacting to it. Other scientists trying to highlight important issues and influence public opinion would do well to take note, because there’s a disconnect between what science reports and what the general public chooses to believe. For instance, 97 percent of scientists agree that the world’s climate is warming, likely due to human activities. Yet only 70 percent of Americans believe that global warming is real. Many of the most important issues of our time show the same growing gap between scientific and societal consensus: genetically modified foods, evolution, and vaccines are often widely distrusted or disputed despite strong, positive scientific evidence…..

The good news is that we’re learning. Citizen science — the growing trend of involving non-professional scientists in the process of discovery — is proving to be a supremely effective tool. It now includes far more than birders and backyard astronomers, its first amateur champions. Over the past few years, the discipline has been gaining traction and popularity in academic circles too. Involving groups of amateur volunteers is now a proven strategy for collecting data over large geographic areas or over long periods of time. Online platforms like Zooniverse have shown that even an untrained human eye can spot anomalies in everything from wildebeest migrations to Martian surfaces. For certain types of research, citizen science just works.

While a long list of peer-reviewed papers now backs up the efficacy of citizen science, and a series of papers has shown its positive impact on students’ views of science, we’re just beginning to understand the impact of that participation on the wider perception of science. Truthfully, for now, most of what we know so far about its public impact is anecdotal, as in the work in Flint, or even on our online platform for explorers, OpenExplorer….

It makes sense that citizen science should affect public perception of science. The difference between “here are the results of a study” and “please help us in the process of discovery” is profound. It’s the difference between a rote learning moment and an immersive experience. And even if not everyone is getting involved, the fact that this is possible and that some members of a community are engaging makes science instantly more relatable. It creates what Tim O’Reilly calls an “architecture of participation.” Citizen scientists create the best interface for convincing the rest of the populace.

A recent article in Nature argued that the DIY biology community was, in fact, ahead of the scientific establishment in terms of proactively thinking about the safety and ethics of rapidly advancing biotechnology tools. They had to be. For those people opening up community labs so that anyone can come and participate, public health issues can’t be pushed aside or dealt with later. After all, they are the public that will be affected….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptably detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

Foreign Policy has lost its creativity. Design thinking is the answer.


Elizabeth Radziszewski at The Wilson Quarterly: “Although the landscape of threats has changed in recent years, U.S. strategies bear striking resemblance to the ways policymakers dealt with crises in the past. Whether it involves diplomatic overtures, sanctions, bombing campaigns, or the use of special ops and covert operations, the range of responses suffers from innovation deficit. Even the use of drones, while a new tool of warfare, is still part of the limited categories of responses that focus mainly on whether or not to kill, cooperate, or do nothing. To meet the evolving nature of threats posed by nonstate actors such as ISIS, the United States needs a strategy makeover — a creative lift, so to speak.

Sanctions, diplomacy, bombing campaigns, special ops, covert operations — the range of our foreign policy responses suffers from an innovation deficit.

Enter the business world. Today’s top companies face an increasingly competitive marketplace where innovative approaches to product and service development are a necessity. Just as the market has changed for companies since the forces of globalization and the digital economy took over, so has the security landscape evolved for the world’s leading hegemon. Yet the responses of top businesses to these changes stand in stark contrast to the United States’ stagnant approaches to current national security threats. Many of today’s thriving businesses have embraced design thinking (DT), an innovative process that identifies consumer needs through immersive ethnographic experiences that are melded with creative brainstorming and quick prototyping.

What would happen if U.S. policymakers took cues from the business world and applied DT in policy development? Could the United States prevent the threats from metastasizing with more proactive rather than reactive strategies — by discovering, for example, how ideas from biology, engineering, and other fields could help analysts inject fresh perspective into tired solutions? Put simply, if U.S. policymakers want to succeed in managing future threats, then they need to start thinking more like business innovators who integrate human needs with technology and economic feasibility.

In his 1969 book The Sciences of the Artificial, Herbert Simon made the first connection between design and a way of thinking. But it was not until the 1980s and 1990s that Stanford scientists began to see the benefits of design practices used by industrial designers as a method for creative thinking. At the core of DT is the idea that solving a challenge requires a deeper understanding of the problem’s true nature and the processes and people involved. This approach contrasts greatly with more standard innovation styles, where a policy solution is developed and then resources are used to fit the solution to the problem. DT reverses the order.

DT encourages divergent thinking, the process of generating many ideas before converging to select the most feasible ones, including making connections between different-yet-related worlds. Finally, the top ideas are quickly prototyped and tested so that early solutions can be modified without investing many resources and risking the biggest obstacle to real innovation: the impulse to try fitting an idea, product, or policy to the people, rather than the other way around…

If DT has reenergized the innovative process in the business and nonprofit sector, a systematic application of its methodology could just as well revitalize U.S. national security policies. Innovation in security and foreign policy is often framed around the idea of technological breakthroughs. Thanks to the Defense Advanced Research Projects Agency (DARPA), the Department of Defense has been credited with such groundbreaking inventions as GPS, the Internet, and stealth fighters — all of which have created rich opportunities to explore new military strategies. Reflecting this infatuation with technology, but with a new edge, is Defense Secretary Ashton Carter’s unveiling of the Defense Innovation Unit Experimental, an initiative to scout for new technologies, improve outreach to startups, and form deeper relationships between the Pentagon and Silicon Valley. The new DIUE effort signals what businesses have already noticed: the need to be more flexible in establishing linkages with people outside of the government in search for new ideas.

Yet because the primary objective of DIUE remains technological prowess, the effort alone is unlikely to drastically improve the management of national security. Technology is not a substitute for an innovative process. When new invention is prized as the sole focus of innovation, it can, paradoxically, paralyze innovation. Once an invention is adopted, it is all too tempting to mold subsequent policy development around emergent technology, even if other solutions could be more appropriate….(More)”

E-Regulation and the Rule of Law: Smart Government, Institutional Information Infrastructures, and Fundamental Values


Rónán Kennedy in Information Polity: “Information and communications technology (ICT) is increasingly used in bureaucratic and regulatory processes. With the development of the ‘Internet of Things’, some researchers speak enthusiastically of the birth of the ‘Smart State’. However, there are few theoretical or critical perspectives on the role of ICT in these routine decision-making processes and the mundane work of government regulation of economic and social activity. This paper therefore makes an important contribution by putting forward a theoretical perspective on smartness in government and developing a values-based framework for the use of ICT as a tool in the internal machinery of government.

It critically reviews the protection of the rule of law in digitized government. As an addition to work on e-government, a new field of study, ‘e-regulation’ is proposed, defined, and critiqued, with particular attention to the difficulties raised by the use of models and simulation. The increasing development of e-regulation could compromise fundamental values by embedding biases, software errors, and mistaken assumptions deeply into government procedures. The article therefore discusses the connections between the ‘Internet of Things’, the development of ‘Ambient Law’, and how the use of ICT in e-regulation can be a support for or an impediment to the operation of the rule of law. It concludes that e-government research should give more attention to the processes of regulation, and that law should be a more central discipline for those engaged in this activity….(More)

The Alberta CoLab Story: Redesigning the policy development process in government


Alex Ryan at Medium: “Alberta CoLab is an evolving experiment built on three counter-intuitive ideas:

1. Culture shifts faster through collaborative project work than through a culture change initiative.

2. The way to accelerate policy development is to engage more perspectives and more complexity.

3. The best place to put a cross-ministry design team is in a line ministry.

I want to explain what CoLab is and why it has evolved the way it has. We don’t view CoLab as a best practice to be replicated, since our model is tailored to the specific culture and context of Alberta. Perhaps you are also trying to catalyze innovation inside a large bureaucratic organization. I hope you can learn something from our journey so far….

….Both the successes and frustrations of Alberta CoLab are consequences of the way that we have mediated some key tensions and tradeoffs involved with setting up a public sector innovation lab. Practitioners in other labs will likely recognize these tensions and tradeoffs, although your successes and frustrations will be different depending on how your business model reconciles them.

  1. Where should the lab be? Public innovation labs can exist inside, outside, or on the edge of government. Dubai The Model Centre and Alberta CoLab operate inside government. Inside labs have the best access to senior decision makers and the authority to convene whole of government collaborations, but may find it harder to engage openly with citizens and stakeholders. Unicef Innovation Labs and NouLab exist outside of government. Outside labs have more freedom in who they convene, the kind of container they can create, and timelines to impact, but find it harder to connect with and affect policy change. MindLab and MaRS Solutions Lab are examples of labs on the edge of government. This positioning can offer the best of both worlds. However, edge labs are vulnerable to fluctuations in their relationship with government. Surviving and thriving on the edge means continually walking a tightrope between autonomy and integration. Labs can change their positioning. Alberta CoLab began as an external consulting project. The Behavioural Insights Team is a social purpose company that was spun-off from a lab inside the U.K. government. The location of the lab is unlikely to change often, so it is an important strategic choice.
  2. How deep should the lab go? Here the tension is between taking on small, tactical improvement projects that deliver tangible results, or tackling the big, strategic systems changes that will take years to manifest. Public sector innovation labs are a reaction to the almost total failure of traditional approaches to move the needle on systems change. Therefore, most labs have aspirations to the strategic and the systemic. Yet most labs are also operating in a dominant culture that demands quick wins and measures success by linear progress against a simple logic model theory of change. We believe that operating at either extreme of this spectrum is equally misguided. We use a portfolio approach and a barbell strategy to mediate this tension. Having a portfolio of projects allows us to invest energy in systems change and generate immediate value. It allows us to balance our projects across three horizons of innovation: sustaining innovations; disruptive innovations; and transformative innovations. A barbell strategy means avoiding the middle of the bell curve. We maintain a small number of long-term, flagship initiatives, combined with a rapid turnover of quick-win projects. This allows us to remind the organization of our immediate value without sacrificing long-term commitment to systems change.
  3. What relationship should the lab have with government? Even an inside lab must create some distance between itself and the broader government culture if it is to provide a safe space for innovation. There is a tension between being separate and being integrated. Developing novel ideas that get implemented requires the lab to be both separate and integrated at the same time. You need to decouple from regular policy cycles to enable divergence and creativity, yet provide input into key decisions at the right time. Sometimes these decision points are known in advance, but more often this means sensing and responding to a dynamic decision landscape. Underneath any effective lab is a powerful social network, which needs to cut across government silos and stratas and draw in external perspectives. I think of a lab as having a kind of respiratory rhythm. It starts by bringing fresh ideas into the organization, like a deep breath that provides the oxygen for new thinking. But new ideas are rarely welcome in old organizations. When the lab communicates outwards, these new ideas should be translated into familiar language and concepts, and then given a subtle twist. Often labs believe they have to differentiate their innovations — to emphasize novelty — to justify their existence as an innovation lab. But the more the output of the lab resembles the institutional culture, the more it appears obvious and familiar, the more likely it will be accepted and integrated into the mainstream.
  4. What relationship should the lab have with clients? Alberta CoLab is a kind of in-house consultancy that provides services to clients across all ministries. There is a tension in the nature of the relationship, which can span from consulting problem-solver to co-design facilitator to teacher. The main problem with a consulting model is it often builds dependency rather than capacity. The challenge with an educational relationship is that clients struggle to apply theory that is disconnected from practice. We often use facilitation as a ‘cover’ for our practice, because it allows us to design a process that enables both reflective practice and situated learning. By teaching systemic design and strategic foresight approaches through taking on live projects, we build capacity while doing the work our clients need to do anyway. This helps to break down barriers between theory and practice, learning and doing. Another tension is between doing what the client says she wants and what she needs but does not articulate. Unlike a customer, who is always right, the designer has a duty of care to their client. This involves pushing back when the client demands are unreasonable, reframing the challenge when the problem received is a symptom of a deeper issue, and clearly communicating the risks and potential side effects of policy options. As Denys Lasdun has said about designers: “Our job is to give the client, on time and on cost, not what he wants, but what he never dreamed he wanted; and when he gets it, he recognizes it as something he wanted all the time.”

Lessons Learned

These are our top lessons learned from our journey to date that may have broader applicability.

  1. Recruit outsiders and insiders. Bringing in outside experts elevates the lab’s status. Outsiders are essential to question and challenge organizational patterns that insiders take as given. Insiders bring an understanding of organizational culture. They know how to move files through the bureaucracy and they know where the landmines are.
  2. Show don’t tell. As lab practitioners, we tend to be process geeks with a strong belief in the superiority of our own methods. There is a temptation to cast oneself in the role of the missionary bringing the good word to the unwashed masses. Not only is this arrogant, it’s counter-productive. It’s much more effective to show your clients how your approach adds value by starting with a small collaborative project. If your approach really is as good as you believe it is, the results will speak for themselves. Once people are envious of the results you have achieved, they will be curious and open to learning how you did it, and they will demand more of it.
  3. Be a catalyst, not a bottleneck. Jess McMullin gave us this advice when we founded CoLab. It’s why we developed a six-day training course to train over 80 systemic designers across the government. It’s why we run communities of practice on systemic design and strategic foresight. And it’s why we publish about our experiences and share the toolkits we develop. If the innovation lab is an ivory tower, it will not change the way government works. Think instead of the lab as the headquarters of a democratic grassroots movement.
  4. Select projects based on the potential for reframing. There are many criteria we apply when deciding whether to take on a new project. Is it a strategic priority? Is there commitment to implement? Are the client’s expectations realistic? Can our contribution have a positive impact? These are useful but apply to almost any service offering. The unique value a social innovation lab offers is discontinuous improvement. The source of discontinuous improvement is reframing: seeing a familiar challenge with new eyes, from a different perspective that opens up new potential for positive change. If a project ticks every other box but the client is certain they already know what the problem is, that certainty limits the kinds of solutions they will consider. Unless they are open to reframing, they will likely be frustrated by a lab approach and would be better served by traditional facilitation or good project management.
  5. Prototyping is just the end of the beginning. After one year, we interviewed the first 40 clients of Alberta CoLab. We wanted to know what they had achieved since our co-design sessions. Unfortunately, for most of them, the answer was “not much.” They were very happy with the quality of the ideas and prototypes generated while working with CoLab and were hopeful that the ideas would eventually see the light of day. But they also noted that once participants left the lab and returned to their desks, it was difficult to sustain the momentum and excitement of the lab, and easy to snap back to business as usual. We had to pivot our strategy: take on fewer projects, but play a greater stewardship role through to implementation.
  6. Find a rhythm. It’s not useful to create a traditional project plan with phases and milestones for a non-linear and open-ended discovery process like a lab. Yet without some kind of structure, it’s easy to lose momentum or become lost. The best projects I have participated in create a rhythm: an alternating movement between open collaboration and focused delivery. The lab opens up every few months to engage widely on what needs to be done and why. A core team then works between collaborative workshops on how to make it happen. Each cycle allows the group to frame key challenges, make progress, and receive feedback, which builds momentum and commitment.
  7. Be a good gardener. Most participants in our workshops arrive with a full plate: they are already 100% committed in their day jobs. Even when they are enthusiastic to ideate, they will be reluctant to take on any additional work. If we want our organizations to innovate, we first have to create the space for new work. We need to prune the projects we have kept on life support: not yet declared dead, but no longer priorities. This often means making difficult decisions. The flip side of pruning is to actively search for positive deviance and help it grow. When you find something that’s already working, you just need to turn up the good….(More)”

Innovation and Its Enemies: Why People Resist New Technologies


Book by Calestous Juma: “The rise of artificial intelligence has rekindled a long-standing debate regarding the impact of technology on employment. This is just one of many areas where exponential advances in technology signal both hope and fear, leading to public controversy. This book shows that many debates over new technologies are framed in the context of risks to moral values, human health, and environmental safety. But it argues that behind these legitimate concerns often lie deeper, but unacknowledged, socioeconomic considerations. Technological tensions are often heightened by perceptions that the benefits of new technologies will accrue only to small sections of society while the risks will be more widely distributed. Similarly, innovations that threaten to alter cultural identities tend to generate intense social concern. As such, societies that exhibit great economic and political inequities are likely to experience heightened technological controversies.

Drawing from nearly 600 years of technology history, Innovation and Its Enemies identifies the tension between the need for innovation and the pressure to maintain continuity, social order, and stability as one of today’s biggest policy challenges. It reveals the extent to which modern technological controversies grow out of distrust in public and private institutions. Using detailed case studies of coffee, the printing press, margarine, farm mechanization, electricity, mechanical refrigeration, recorded music, transgenic crops, and transgenic animals, it shows how new technologies emerge, take root, and create new institutional ecologies that favor their establishment in the marketplace. The book uses these lessons from history to contextualize contemporary debates surrounding technologies such as artificial intelligence, online learning, 3D printing, gene editing, robotics, drones, and renewable energy. It ultimately makes the case for shifting greater responsibility to public leaders to work with scientists, engineers, and entrepreneurs to manage technological change, make associated institutional adjustments, and expand public engagement on scientific and technological matters….(More)”