Big Data for public policy: the quadruple helix


Julia Lane in the Journal of Policy Analysis and Management: “Data from the federal statistical system, particularly the Census Bureau, have long been a key resource for public policy. Although most of those data have been collected through purposive surveys, there have been enormous strides in the use of administrative records on business (Jarmin & Miranda, 2002), jobs (Abowd, Haltiwanger, & Lane, 2004), and individuals (Wagner & Layne, 2014). Those strides are now becoming institutionalized. The President has allocated $10 million to an Administrative Records Clearing House in his FY2016 budget. Congress is considering a bill to use administrative records, entitled the Evidence-Based Policymaking Commission Act, sponsored by Patty Murray and Paul Ryan. In addition, the Census Bureau has established a Center for “Big Data.” In my view, these steps represent important strides for public policy, but they are only part of the story. Public policy researchers must look beyond the federal statistical system and make use of the vast resources now available for research and evaluation.

All politics is local; “Big Data” now mean that policy analysis can increasingly be local. Modern empirical policy should be grounded in data provided by a network of city/university data centers. Public policy schools should partner with scholars in the emerging field of data science to train the next generation of policy researchers in the thoughtful use of the new types of data; the apparent secular decline in the applications to public policy schools is coincident with the emergence of data science as a field of study in its own right. The role of national statistical agencies should be fundamentally rethought—and reformulated to one of four necessary strands in the data infrastructure; that of providing benchmarks, confidentiality protections, and national statistics….(More)”

Using Tweets and Posts to Speed Up Organ Donation


David Bornstein in the New York Times: “…But there is a problem: Demand for organ transplants vastly outstrips supply, as my colleague Tina Rosenberg has reported. In 2015 in the United States, there were only about 9,000 deceased donors (each of whom can save up to eight lives) and 6,000 living donors (who most often donate a kidney or liver lobe). Today, more than 121,000 people are on waiting lists, roughly 100,000 for kidney transplants, 15,000 for livers, and 4,000 for hearts. And the lists keep getting longer — 3,000 people are added to the kidney list each month. Last year, more than 4,000 people died while waiting for a new kidney; 3,600 dropped off the waiting list because they became too sick to qualify for a transplant.

Although 95 percent of Americans support organ donation, fewer than half of American adults are registered as donors. Research suggests that the number who donate organs after death could be increased greatly. Moreover, surveys indicate untapped support for living donation, too; nearly one in four people have told pollsters they would be willing to donate a kidney to save the life of a friend, community member or stranger. “If one in 10,000 Americans decided to donate each year, there wouldn’t be a shortage,” said Josh Morrison, who donated a kidney to a stranger and founded WaitList Zero, an organization that works to increase living kidney donation.

What could be done to harness people’s generous impulses more effectively to save lives?

One group attacking the question is Organize, which was founded in 2014 by Rick Segal’s son Greg, and Jenna Arnold, a media producer and educator who has worked with MTV and the United Nations in engaging audiences in social issues. Organize uses technology, open data and insights from behavioral economics to simplify becoming an organ donor.

This approach is shaking up longstanding assumptions.

For example, in the last four decades, people have most often been asked to register as an organ donor as part of renewing or obtaining a driver’s license. This made sense in the 1970s, when the nation’s organ procurement system was being set up, says Blair Sadler, the former president and chief executive of Rady Children’s Hospital in San Diego. He helped draft the Uniform Anatomical Gift Act in 1967, which established a national legal framework for organ donation. “Health care leaders were asking, ‘How do we make this more routine?’” he recalled. “It’s hard to get people to put it in their wills. Oh, there’s a place where people have to go every five years” — their state Department of Motor Vehicles.

Today, governments allow individuals to initiate registrations online, but the process can be cumbersome. For example, New York State required me to fill out a digital form on my computer, then print it out and mail it to Albany. Donate Life America, by contrast, allows individuals to register online as an organ donor just by logging in with email or a Facebook or Google account — much easier.

In practice, legal registration may be overemphasized. It may be just as important to simply make your wishes known to your loved ones. When people tell relatives, “If something happens to me, I want to be an organ donor,” families almost always respect their wishes. This is particularly important for minors, who cannot legally register as donors.

Using that insight, Organize is making it easier to conduct social media campaigns to both prompt and collect sentiments about organ donation from Facebook, Twitter and Instagram.

If you post or tweet about organ donation, or include a hashtag like #iwanttobeanorgandonor, #organdonor, #donatemyparts, or any of a number of other relevant terms, Organize captures the information and logs it in a registry. In a year, it has gathered the names of nearly 600,000 people who declare support for organ donation. Now the big question is: Will it actually increase organ donation rates?
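As a rough sketch of that capture-and-log step (the post structure, hashtag list, and source of posts are hypothetical here, since the article does not describe Organize's actual pipeline), the core logic is to filter incoming posts for donation-related hashtags and record their authors in a registry:

```python
# Hypothetical sketch of capturing organ-donation declarations from social posts.
# Real collection would go through each platform's own API and terms of use.
from dataclasses import dataclass
from datetime import datetime, timezone

DONOR_HASHTAGS = {"#iwanttobeanorgandonor", "#organdonor", "#donatemyparts"}

@dataclass
class Post:
    author: str
    text: str
    platform: str

def declares_support(post: Post) -> bool:
    """True if the post carries one of the organ-donation hashtags."""
    words = {w.lower() for w in post.text.split()}
    return bool(words & DONOR_HASHTAGS)

def log_declarations(posts, registry):
    """Append matching posts to a registry of declared supporters."""
    for post in posts:
        if declares_support(post):
            registry.append({
                "name": post.author,
                "platform": post.platform,
                "recorded_at": datetime.now(timezone.utc).isoformat(),
            })
    return registry

# Example: log_declarations([Post("jane", "Proud #organdonor", "twitter")], [])
```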

We should begin getting an idea pretty soon. Organize has been working with the Nevada Donor Network to test its registry. And in the coming months, several other states will begin using it….(More)”

Yelp, Google Hold Pointers to Fix Governments


Christopher Mims at the Wall Street Journal: “When Kaspar Korjus was born, he was given a number before he was given a name, as are all babies in Estonia. “My name is 38712012796, which I got before my name of Kaspar,” says Mr. Korjus.

In Estonia, much of life—voting, digital signatures, prescriptions, taxes, bank transactions—is conducted with this number. The resulting services aren’t just more convenient, they are demonstrably better. It takes an Estonian three minutes to file his or her taxes.

Americans are unlikely to accept a unified national ID system. But Estonia offers an example of the kind of innovation possible around government services, a competitive factor for modern nations.

The former Soviet republic—with a population of 1.3 million, roughly the size of San Diego—is regularly cited as a world leader in e-governance. At base, e-governance is about making government function as well as private enterprise, mostly by adopting the same information-technology infrastructure and management techniques as the world’s most technologically savvy corporations.

It isn’t that Estonia devotes more people to the problem—it took only 60 to build the identity system. It is that the country’s leaders are willing to empower those engineers. “There is a need for politicians to not only show leadership but also there is a need to take risks,” says Estonia’s prime minister, Taavi Rõivas.

In the U.S., Matt Lira, senior adviser for House Majority Leader Kevin McCarthy, says the gap between the government’s information technology and the private sector’s has grown larger than ever. Americans want to access government services—paying property taxes or renewing a driver’s license—as easily as they look up a restaurant on Yelp or a business on Alphabet’s Google, says Neil Kleiman, a professor of policy at New York University who collaborates with cities in this subject area.

The government is unlikely to catch up soon. The Government Accountability Office last year estimated that about 25% of the federal government’s 738 major IT investments—projected to cost a total of $42 billion—were in danger of significant delays or cost overruns.

One reason for such overruns is the government’s reliance on big, monolithic projects based on proposal documents that can run to hundreds of pages. It is an approach to software development that is at least 20 years out of date. Modern development emphasizes small chunks of code accomplished in sprints and delivered to end users quickly so that problems can be identified and corrected.

Two years ago, the Obama administration devised a novel way to address these issues: assembling a crack team of coders and project managers from the likes of Google, Amazon.com, and Microsoft and assigning them to big government boondoggles to help existing IT staff run more like the private sector. Known as 18F, this organization and its sister group, the U.S. Digital Service, are set to hit 500 staffers by the end of 2016….(More)”

‘Big data’ was supposed to fix education. It didn’t. It’s time for ‘small data’


Pasi Sahlberg and Jonathan Hasak in the Washington Post: “One thing that distinguishes schools in the United States from schools around the world is how data walls, which typically reflect standardized test results, decorate hallways and teacher lounges. Green, yellow, and red colors indicate levels of performance of students and classrooms. For serious reformers, this is the type of transparency that reveals more data about schools and is seen as part of the solution to effective school improvement. These data sets, however, often don’t spark insight about teaching and learning in classrooms; they are based on analytics and statistics, not on the emotions and relationships that drive learning in schools. They also report outputs and outcomes, not the impacts of learning on the lives and minds of learners….

If you are a leader of any modern education system, you probably care a lot about collecting, analyzing, storing, and communicating massive amounts of information about your schools, teachers, and students based on these data sets. This information is “big data,” a term that first appeared around 2000, which refers to data sets that are so large and complex that processing them with conventional data processing applications isn’t possible. Two decades ago, the type of data education management systems processed were input factors of the education system, such as student enrollments, teacher characteristics, or education expenditures, handled by an education department’s statistical officer. Today, however, big data covers a range of indicators about teaching and learning processes, and increasingly reports on student achievement trends over time.

With the outpouring of data, international organizations continue to build regional and global data banks. Whether it’s the United Nations, the World Bank, the European Commission, or the Organization for Economic Cooperation and Development, today’s international reformers are collecting and handling more data about human development than before. Beyond government agencies, there are global education and consulting enterprises like Pearson and McKinsey that see business opportunities in big data markets.

Among the best known today is the OECD’s Program for International Student Assessment (PISA), which measures reading, mathematical, and scientific literacy of 15-year-olds around the world. OECD now also administers an Education GPS, or a global positioning system, that aims to tell policymakers where their education systems place in a global grid and how to move to desired destinations. OECD has clearly become a world leader in the big data movement in education.

Despite all this new information and the benefits that come with it, there are clear handicaps in how big data has been used in education reforms. In fact, pundits and policymakers often forget that big data, at best, only reveals correlations between variables in education, not causality. As any introduction to statistics course will tell you, correlation does not imply causation….
We believe that it is becoming evident that big data alone won’t be able to fix education systems. Decision-makers need to gain a better understanding of what good teaching is and how it leads to better learning in schools. This is where information about details, relationships and narratives in schools become important. These are what Martin Lindstrom calls “small data”: small clues that uncover huge trends. In education, these small clues are often hidden in the invisible fabric of schools. Understanding this fabric must become a priority for improving education.

To be sure, there is not one right way to gather small data in education. Perhaps the most important next step is to realize the limitations of current big-data-driven policies and practices. Relying too heavily on externally collected data may be misleading in policy-making. Here is what small data looks like in practice:

  • It reduces census-based national student assessments to the necessary minimum and transfers the saved resources to enhance the quality of formative assessments in schools and teacher education in alternative assessment methods. Evidence shows that formative and other school-based assessments are much more likely to improve the quality of education than conventional standardized tests.
  • It strengthens the collective autonomy of schools by giving teachers more independence from bureaucracy and by investing in teamwork in schools. This would enhance social capital, which has proved to be a critical aspect of building trust within education and enhancing student learning.
  • It empowers students by involving them in assessing and reflecting on their own learning and then incorporating that information into collective human judgment about teaching and learning (supported by national big data). Because there are different ways students can be smart in schools, no single way of measuring student achievement will reveal success. Students’ voices about their own growth may be those tiny clues that can uncover important trends for improving learning.

W. Edwards Deming once said that “without data you are another person with an opinion.” But Deming couldn’t have imagined the size and speed of data systems we have today….(More)”

What’s Wrong with Open-Data Sites–and How We Can Fix Them


César A. Hidalgo at Scientific American: “Imagine shopping in a supermarket where every item is stored in boxes that look exactly the same. Some are filled with cereal, others with apples, and others with shampoo. Shopping would be an absolute nightmare! The design of most open data sites—the (usually government) sites that distribute census, economic and other data to be used and redistributed freely—is not exactly equivalent to this nightmarish supermarket. But it’s pretty close.

During the last decade, such sites—data.gov, data.gov.uk, data.gob.cl, data.gouv.fr, and many others—have been created throughout the world. Most of them, however, still deliver data as sets of links to tables, or links to other sites that are also hard to comprehend. In the best cases, data is delivered through APIs, or application program interfaces, which are simple data query languages that require a user to have a basic knowledge of programming. So understanding what is inside each dataset requires downloading, opening, and exploring the set in ways that are extremely taxing for users. The analogy of the nightmarish supermarket is not that far off.
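To give a concrete sense of what working with such an API entails, here is a minimal sketch that searches an open data catalogue, assuming the portal runs CKAN (the software behind data.gov); the query string and the fields read from the response are illustrative only:

```python
# Minimal sketch: searching an open data catalogue through its API.
# Assumes a CKAN-based portal (data.gov is one); treat the endpoint and
# response fields as illustrative rather than a recommended workflow.
import requests

CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def find_datasets(query, rows=5):
    """Return (title, formats) for the top datasets matching a search term."""
    resp = requests.get(CKAN_SEARCH, params={"q": query, "rows": rows}, timeout=30)
    resp.raise_for_status()
    datasets = resp.json()["result"]["results"]
    return [
        (ds["title"], sorted({r.get("format", "?") for r in ds.get("resources", [])}))
        for ds in datasets
    ]

if __name__ == "__main__":
    for title, formats in find_datasets("median household income"):
        print(f"{title}  [{', '.join(formats)}]")
```

Even this small example presumes the user knows what an HTTP request, a JSON payload, and a resource “format” are, which is precisely the barrier the author describes.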

Image: The U.S. government’s open data site.

The consensus among those who have participated in the creation of open data sites is that current efforts have failed and we need new options. Pointing your browser to these sites should show you why. Most open data sites are badly designed, and here I am not talking about their aesthetics—which are also subpar—but about the conceptual model used to organize and deliver data to users. The design of most open data sites follows a throwing-spaghetti-against-the-wall strategy, where opening more data, instead of opening data better, has been the driving force.

Some of the design flaws of current open data sites are pretty obvious. The datasets that are more important, or that could potentially be more useful, are not brought to the surface of these sites or properly organized. In our supermarket analogy, not only do all the boxes look the same, but they are also sorted in the order they arrived. This cannot be the best we can do.

There are other design problems that are important, even though they are less obvious. The first is that most sites deliver data in the way in which it is collected, rather than the way in which it is used. People are often looking for data about a particular place, occupation, or industry, or about an indicator (such as income or population). Whether the data they need comes from the national survey of X or the bureau of Y is secondary and often—although not always—irrelevant to the user. Yet, even though this is not the way we should be giving data back to users, this is often what open data sites do.

The second non-obvious design problem, which is probably the most important, is that most open data sites bury data in what is known as the deep web. The deep web is the fraction of the Internet that is not accessible to search engines, or that cannot be indexed properly. The surface of the web is made of text, pictures, and video, which search engines know how to index. But search engines are not good at knowing that the number that you are searching for is hidden in row 17,354 of a comma separated file that is inside a zip file linked in a poorly described page of an open data site. In some cases, pressing a radio button and selecting options from a number of dropdown menus can get you the desired number, but this does not help search engines either, because crawlers cannot explore dropdown menus. To make open data really open, we need to make it searchable, and for that we need to bring data to the surface of the web.
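A sketch of the burial the author describes may make it tangible: the value a user wants is reachable only by downloading an archive, opening the CSV inside it, and reading one specific cell, none of which a search-engine crawler will do. The URL, file name, row number, and column name below are hypothetical:

```python
# Hypothetical illustration of data buried in the "deep web": a single number
# hidden inside a CSV that is itself inside a zip file on an open data site.
import csv
import io
import urllib.request
import zipfile

ARCHIVE_URL = "https://example.gov/downloads/annual_release.zip"  # hypothetical
CSV_MEMBER = "county_indicators.csv"                              # hypothetical

def value_buried_in_archive(url, member, row_number, column):
    """Fetch a zip archive, open one CSV inside it, and return a single cell."""
    with urllib.request.urlopen(url) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    with archive.open(member) as raw:
        reader = csv.DictReader(io.TextIOWrapper(raw, encoding="utf-8"))
        for i, row in enumerate(reader, start=1):
            if i == row_number:
                return row[column]
    raise LookupError(f"row {row_number} not found in {member}")

# e.g. value_buried_in_archive(ARCHIVE_URL, CSV_MEMBER, 17354, "median_income")
```

Nothing in that chain of steps produces a page a crawler can index, which is why bringing individual facts to the surface of the web, as pages of text, matters so much.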

So how do we do that? The solution may not be simple, but it starts by taking design seriously. This is something that I’ve been doing for more than half a decade while creating data visualization engines at MIT. The latest iteration of our design principles is now embodied in DataUSA, a site we created in a collaboration between Deloitte, Datawheel, and my group at MIT.

So what is design, and how do we use it to improve open data sites? My definition of design is simple. Design is discovering the forms that best fulfill a function….(More)”

mySidewalk


Springwise: “With vast amounts of data now publicly available, the answers to many questions lie buried in the numbers, and we already saw a publishing platform helping entrepreneurs visualize government data. For an organization as passionate about civic engagement as mySidewalk, this open data is a treasure trove of compelling stories.

mySidewalk was founded by city planners who recognized the potential force for change contained in local communities. Yet without a compelling reason to get involved, many individuals remain ‘interested bystanders’ — something mySidewalk is determined to change.

Using the latest available data, mySidewalk creates dashboards that are customized for every project to help local public officials make the most informed decisions possible. The dashboards present visualizations of a wide range of socioeconomic and demographic datasets, as well as provide local, regional and national comparisons, all of which help to tell the stories behind the numbers.

It is those stories that mySidewalk believes will provide enough motivation for the ‘interested bystanders’ to get involved. As it says on the mySidewalk website, “Share your ideas. Shape your community.” Organizations of all types have taken notice of the power of data, with businesses using geo-tagging to analyze social media content, and real-time information sharing helping humanitarians in crises….(More)”

From Stalemate to Solutions


Karen Abrams Gerber & Andrea Jacobs at Stanford Social Innovation Review: “….We waste time asking, “How can we change the way people think?” when we should be asking, “How do we change the way we do things?”

Changing how we do things isn’t just about reworking laws, policies, and systems; it means rethinking the very act of problem-solving. We believe there are five basic tenets to successful collaboration:

  1. Engaging unlikely bedfellows
  2. Creating a resonant vision
  3. Cultivating relationships
  4. Communicating across worldviews
  5. Committing to ongoing learning

Over the past two years, we’ve researched an organization that embodies all of these: Convergence Center for Policy Resolution, which “convenes people and groups with conflicting views to build trust, identify solutions, and form alliances for action on critical national issues.” Its projects include reimagining K-12 education, addressing economic mobility and poverty, reforming the federal budget process, financing long-term care, and improving the dietary choices and wellness of Americans.

The organization’s unique approach to collaboration enables adversaries to work together and develop breakthrough solutions. It starts with targeting and framing an issue, and then enrolling a wide spectrum of stakeholders. Over an extended period of time, these stakeholders attend a series of expertly facilitated meetings to explore the issue and identify solutions, and finally take joint action….

Foundational to Convergence’s success is the principle of engaging unlikely bedfellows. Stakeholder diversity helps eliminate the “echo chamber” effect (also described by Witter and Mikulsky) created when like-minded groups talk only with one another. The organization vets potential stakeholders to determine their capacity for working with the tensions and complexities of diverse perspectives and their willingness to participate in an emergent process, believing that each ideological camp holds a crucial piece of the puzzle and that the tension of differing views actually creates better solutions.

Convergence exemplifies the power of creating a resonant vision in its approach to tackling big social issues. Framing the issue in a way that galvanizes all stakeholders takes tremendous time, energy, and skill. For example, when the organization decided to focus on addressing K-12 education in the United States, it engaged in hundreds of interviews to identify the best way to frame the project. While everyone agreed the system did not serve the needs of many students, they had difficulty finding consensus about how to move forward. One stakeholder commented that the current system was based on a 19th-century factory model that could never meet the needs of 21st-century students. This comment sparked a new narrative that excited stakeholders across the ideological spectrum: “reimagining education for the 21st century!”

It’s important to note that Convergence focuses on framing the problem, not formulating the solution(s). Rather, it believes the solution emerges through the process of authentic collaboration. This differs significantly from an advocacy-based approach, in which a group agrees on a solution and then mobilizes as much support for that solution as possible. As a result, solutions created through Convergence’s collaborative approach are better able to weather the resistance that all change efforts face, because some of that resistance is built into the process.

Change takes time, and so does cultivating relationships. In an article last year, Jane Wei-Skillern, David Ehrlichman, and David Sawyer wrote, “The single most important factor behind all successful collaborations is trust-based relationships among participants.”…..

Change is complex and certainly not linear. Convergence’s approach “lives” this complexity and uncertainty. In its own words, the organization is “building the ship while sailing it.” Its success is due in part to actively and simultaneously engaging each of the five tenets of authentic collaboration, and its work demonstrates the powerful possibilities of authentic collaboration at a time when partisan rancor and stalemate feel inevitable. It proves we can change the world—collaboratively—without anyone relinquishing their core values….(More)”

UN-Habitat Urban Data Portal


Data Driven Journalism: “UN-Habitat has launched a new web portal featuring a wealth of city data based on its repository of research on urban trends.

Launched during the 25th Governing Council, the Urban Data Portal allows users to explore data from 741 cities in 220 countries, and compare these for 103 indicators such as slum prevalence and city prosperity.

Image: A comparison of share in national urban population and average annual rate of urban population change for San Salvador, El Salvador, and Asuncion, Paraguay.

The urban indicators data available are analyzed, compiled and published by UN-Habitat’s Global Urban Observatory, which supports governments, local authorities and civil society organizations to develop urban indicators, data and statistics.

Leveraging GIS technology, the Observatory collects data by taking aerial photographs, zooming into particular areas, and then sending in survey teams to answer any remaining questions about the area’s urban development.

The Portal also contains data collected by national statistics authorities, via household surveys and censuses, with analysis conducted by leading urbanists in UN-Habitat’s State of the World’s Cities and the Global Report on Human Settlements report series.

For the first time, these datasets are available for use under an open licence agreement, and can be downloaded in straightforward database formats like CSV and JSON….(More)

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptable detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

Foreign Policy has lost its creativity. Design thinking is the answer.


Elizabeth Radziszewski at The Wilson Quarterly: “Although the landscape of threats has changed in recent years, U.S. strategies bear a striking resemblance to the ways policymakers dealt with crises in the past. Whether it involves diplomatic overtures, sanctions, bombing campaigns, or the use of special ops and covert operations, the range of responses suffers from an innovation deficit. Even the use of drones, while a new tool of warfare, is still part of the limited categories of responses that focus mainly on whether or not to kill, cooperate, or do nothing. To meet the evolving nature of threats posed by nonstate actors such as ISIS, the United States needs a strategy makeover — a creative lift, so to speak.

Sanctions, diplomacy, bombing campaigns, special ops, covert operations — the range of our foreign policy responses suffers from an innovation deficit.

Enter the business world. Today’s top companies face an increasingly competitive marketplace where innovative approaches to product and service development are a necessity. Just as the market has changed for companies since the forces of globalization and the digital economy took over, so has the security landscape evolved for the world’s leading hegemon. Yet the responses of top businesses to these changes stand in stark contrast to the United States’ stagnant approaches to current national security threats. Many of today’s thriving businesses have embraced design thinking (DT), an innovative process that identifies consumer needs through immersive ethnographic experiences that are melded with creative brainstorming and quick prototyping.

What would happen if U.S. policymakers took cues from the business world and applied DT in policy development? Could the United States prevent the threats from metastasizing with more proactive rather than reactive strategies — by discovering, for example, how ideas from biology, engineering, and other fields could help analysts inject fresh perspective into tired solutions? Put simply, if U.S. policymakers want to succeed in managing future threats, then they need to start thinking more like business innovators who integrate human needs with technology and economic feasibility.

In his 1969 book The Sciences of the Artificial, Herbert Simon made the first connection between design and a way of thinking. But it was not until the 1980s and 1990s that Stanford scientists began to see the benefits of design practices used by industrial designers as a method for creative thinking. At the core of DT is the idea that solving a challenge requires a deeper understanding of the problem’s true nature and the processes and people involved. This approach contrasts greatly with more standard innovation styles, where a policy solution is developed and then resources are used to fit the solution to the problem. DT reverses the order.

DT encourages divergent thinking, the process of generating many ideas before converging to select the most feasible ones, including making connections between different-yet-related worlds. Finally, the top ideas are quickly prototyped and tested so that early solutions can be modified without investing many resources and risking the biggest obstacle to real innovation: the impulse to try fitting an idea, product, or policy to the people, rather than the other way around…

If DT has reenergized the innovative process in the business and nonprofit sector, a systematic application of its methodology could just as well revitalize U.S. national security policies. Innovation in security and foreign policy is often framed around the idea of technological breakthroughs. Thanks to the Defense Advanced Research Projects Agency (DARPA), the Department of Defense has been credited with such groundbreaking inventions as GPS, the Internet, and stealth fighters — all of which have created rich opportunities to explore new military strategies. Reflecting this infatuation with technology, but with a new edge, is Defense Secretary Ashton Carter’s unveiling of the Defense Innovation Unit Experimental (DIUx), an initiative to scout for new technologies, improve outreach to startups, and form deeper relationships between the Pentagon and Silicon Valley. The new DIUx effort signals what businesses have already noticed: the need to be more flexible in establishing linkages with people outside of the government in search of new ideas.

Yet because the primary objective of DIUx remains technological prowess, the effort alone is unlikely to drastically improve the management of national security. Technology is not a substitute for an innovative process. When new invention is prized as the sole focus of innovation, it can, paradoxically, paralyze innovation. Once an invention is adopted, it is all too tempting to mold subsequent policy development around emergent technology, even if other solutions could be more appropriate….(More)”