Ideas to help civil servants understand the opportunities of data


At Gov.UK: “Back in April we set out our plan for the discovery phase for what we are now calling “data science literacy”. We explained that we were going to undertake user research with civil servants to understand how they use data. The discovery phase has helped clarify the focus of this work, and we have now begun to develop options for a data science literacy service for government.

Discovery has helped us understand what we really mean when we say ‘data literacy’. For one person it can be a basic understanding of statistics, but to someone else it might mean knowledge of new data science approaches. But on the basis of our exploration, we have started to use the term “data science literacy” to mean the ability to understand how new data science techniques and approaches can be applied in real world contexts in the civil service, and to distinguish it from a broader definition of ‘data literacy’….

In the spirit of openness and transparency we are making this long list of ideas available here:

Data science driven apps

One way in which civil servants could come to understand the opportunities of data science would be to experience products and services driven by data science in their everyday roles. This could be something like a recommendation engine that suggests actions on the basis of information already held about the customer.
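
To make this concrete, here is a minimal sketch of such a recommender. The case attributes, actions, and history below are invented for illustration; a real service would draw on much richer departmental data.

```python
# Minimal sketch of a "next action" recommender for caseworkers.
# All field names, actions, and records here are hypothetical.
from collections import Counter

# Hypothetical history: (customer attributes, action the caseworker took)
PAST_CASES = [
    ({"benefit": "jobseeker", "missed_appointments": 2}, "schedule_adviser_call"),
    ({"benefit": "jobseeker", "missed_appointments": 0}, "send_vacancy_matches"),
    ({"benefit": "housing", "missed_appointments": 1}, "request_tenancy_docs"),
    ({"benefit": "jobseeker", "missed_appointments": 3}, "schedule_adviser_call"),
]

def recommend_actions(customer: dict, top_n: int = 2) -> list:
    """Rank actions by how often they were taken for similar past cases."""
    scores = Counter()
    for attributes, action in PAST_CASES:
        # Similarity = number of attributes shared with the current customer.
        overlap = sum(1 for k, v in attributes.items() if customer.get(k) == v)
        scores[action] += overlap
    return [action for action, _ in scores.most_common(top_n)]

print(recommend_actions({"benefit": "jobseeker", "missed_appointments": 2}))
# -> ['schedule_adviser_call', 'send_vacancy_matches']
```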

Sharing knowledge across government

A key user need from our user research was to understand how others had undertaken data science projects in government. This could be supported by something like a series of videos / podcasts created by civil servants, setting out case studies and approaches to data science in government. Alternatively, we could have a regularly organised speaker series where data science projects across government are presented alongside outside speakers.

Support for using data science in departments

Users in departments need to understand and experience data science projects in government so that they can undertake their own. Potentially this could be achieved through policy, analytical and data science colleagues working in multidisciplinary teams. Colleagues could also be supported by tools of differing levels of complexity ranging from a simple infographic showing at a high level the types of data available in a department to an online tool which diagnoses which approach people should take for a data science project on the basis of their aims and the data available to them.
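
As a sketch of what the diagnostic end of that spectrum might look like, here is a minimal rule-based version; the aim categories, data fields, and thresholds are invented for illustration:

```python
# Hypothetical sketch of a tool that suggests a data science approach
# from a project's aim and the data available. Rules are illustrative only.
def suggest_approach(aim: str, data: dict) -> str:
    """aim: 'predict', 'group' or 'explain'; data: e.g. {'labelled': True, 'rows': 50000}."""
    if aim == "predict":
        if not data.get("labelled"):
            return "Collect labelled outcomes first, or reframe the question as clustering."
        if data.get("rows", 0) < 1000:
            return "Start with simple regression; the dataset is small for complex models."
        return "Supervised learning with cross-validation."
    if aim == "group":
        return "Unsupervised clustering, followed by manual review of the clusters."
    if aim == "explain":
        return "Exploratory analysis with interpretable models such as logistic regression."
    return "Talk to an analyst to refine the question."

print(suggest_approach("predict", {"labelled": True, "rows": 200}))
```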

In practice training

Users could learn more about how to use data science in their jobs by attending more formal training courses. These could take the form of something like an off-site, week-long training course where they experience the stages of undertaking a data science project (similar to the DWP Digital Academy). An alternative model could be to allocate one day a week to work on a project of departmental importance with a data scientist (similar to the Data Science Accelerator Programme for analysts).


Cross-government support for collaboration

For those users who have responsibility for leading on data science transformation in their departments there is also a need to collaborate with others in similar roles. This could be achieved through interventions such as a day-long unconference to discuss anything related to data science, and using online tools such as Google Groups, Slack, Yammer, Trello etc. We also tested the idea of a collaborative online resource where data science leads and others can contribute content and learning materials / approaches.

This is by no means an exhaustive list of potential ways to encourage data science thinking by policy and delivery colleagues across government. We hope this list is of interest to others in the field and we will update in the next six months about the transition of this project to Alpha….(More)”

Citizen Scientist


Book by Mary Ellen Hannibal: “…Here is a wide-ranging adventure in becoming a citizen scientist by an award-winning writer and environmental thought leader. As Mary Ellen Hannibal wades into tide pools, follows hawks, and scours mountains to collect data on threatened species, she discovers the power of a heroic cast of volunteers—and the makings of what may be our last, best hope in slowing an unprecedented mass extinction.

Digging deeply, Hannibal traces today’s tech-enabled citizen science movement to its roots: the centuries-long tradition of amateur observation by writers and naturalists. Prompted by her novelist father’s sudden death, she also examines her own past—and discovers a family legacy of looking closely at the world. With unbending zeal for protecting the planet, she then turns her gaze to the wealth of species left to fight for.

Combining original reporting, meticulous research, and memoir in impassioned prose, Citizen Scientist is a literary event, a blueprint for action, and the story of how one woman rescued herself from an odyssey of loss—with a new kind of science….(More)”

Against transparency


At Vox: “…Digital storage is pretty cheap and easy, so maybe the next step in open government is ubiquitous surveillance of public servants paired with open access to the recordings.

As a journalist and an all-around curious person, I can’t deny there’s something appealing about this.

Historians, too, would surely love to know everything that President Obama and his top aides said to one another regarding budget negotiations with John Boehner rather than needing to rely on secondhand news accounts influenced by the inevitable demands of spin. By the same token, historians surely would wish that there were a complete and accurate record of what was said at the Constitutional Convention in 1787, which, instead, famously operated under a policy of anonymous discussions.

But we should be cautioned by James Madison’s opinion that “no Constitution would ever have been adopted by the convention if the debates had been public.”

His view, which seems sensible, is that public or recorded debates would have been simply exercises in position-taking rather than deliberation, with each delegate playing to his base back home rather than working toward a deal.

“Had the members committed themselves publicly at first, they would have afterwards supposed consistency required them to maintain their ground,” Madison wrote, “whereas by secret discussion no man felt himself obliged to retain his opinions any longer than he was satisfied of their propriety and truth, and was open to the force of argument.”

The example comes to me by way of Cass Sunstein, who formerly served as the top regulatory official in Obama’s White House, and who delivered a fascinating talk on the subject of government transparency at a June 2016 Columbia symposium on the occasion of the anniversary of the Freedom of Information Act.

Sunstein asks us to distinguish between disclosure of the government’s outputs and disclosure of the government’s inputs. Output disclosure is something like the text of the Constitution or when the Obama administration had Medicare change decades of practice and begin publishing information about what Medicare pays to hospitals and other health providers.

Input disclosure would be something like the transcript of the debates at the Constitutional Convention or a detailed record of the arguments inside the Obama administration over whether to release the Medicare data. Sunstein’s argument is that it is a mistake to simply conflate the two ideas of disclosure under one broad heading of “transparency” when considerations around the two are very different.

Public officials need to have frank discussions

The fundamental problem with input disclosure is that in addition to serving as a deterrent to misconduct, it serves as a deterrent to frankness and honesty.

There are a lot of things that colleagues might have good reason to say to one another in private that would nonetheless be very damaging if they went viral on Facebook:

  • Healthy brainstorming processes often involve tossing out bad or half-baked ideas in order to stimulate thought and elevate better ones.
  • A realistic survey of options may require a blunt assessment of the strengths and weaknesses of different members of the team or of outside groups that would be insulting if publicized.
  • Policy decisions need to be made with political sustainability in mind, but part of making a politically sustainable policy decision is you don’t come out and say you made the decision with politics in mind.
  • Someone may want to describe an actual or potential problem in vivid terms to spur action, without wanting to provoke public panic or hysteria through public discussion.
  • If a previously embarked-upon course of action isn’t working, you may want to quietly change course rather than publicly admit failure.

Journalists are, of course, interested in learning about all such matters. But it’s precisely because such things are genuinely interesting that making disclosure inevitable is risky.

Ex post facto disclosure of discussions whose participants didn’t realize they would be disclosed would be fascinating and useful. But after a round or two of disclosure, the atmosphere would change. Instead of peeking in on a real decision-making process, you would have every meeting dominated by the question “what will this look like on the home page of Politico?”…(More)”

For Quick Housing Data, Hit Craigslist


Tanvi Misra at CityLab: “…housing researchers can use the Internet bulletin board for a more worthy purpose: as a source of fairly accurate, real-time data on the U.S. rental housing market.

A new paper in the Journal of Planning Education and Research analyzed 11 million Craigslist rental listings posted between May and July 2014 across the U.S. and found a treasure trove of information on regional and local housing trends. “Being able to track rental listings data from Craigslist is really useful for urban planners to take the pulse of [changing neighborhoods] much more quickly,” says Geoff Boeing, a researcher at University of California at Berkeley’s Urban Analytics Lab, who co-authored the paper with Paul Waddell, a Berkeley professor of planning and design.

Here are a couple of big takeaways from their deep dive down the CL rabbit hole:

Overall, Craigslist listings track with HUD data (except when they don’t)

The researchers compared median rents in different Craigslist domains (metropolitan areas, essentially) to the corresponding Housing and Urban Development median rents. In New Orleans and Oklahoma City, the posted and the official rents were very similar. But in other metros, they diverged significantly. In Las Vegas, for example, the Craigslist median rent was lower than the HUD median rent, but in New York, it was much, much higher.
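
That comparison is straightforward to reproduce in principle. Here is a sketch in pandas, assuming hypothetical CSV extracts of listings and HUD medians keyed by metro area (the paper's actual pipeline differs):

```python
# Sketch: compare median Craigslist asking rents with HUD median rents by metro.
# File names and columns are hypothetical stand-ins for the study's data.
import pandas as pd

listings = pd.read_csv("craigslist_listings.csv")  # columns: metro, rent
hud = pd.read_csv("hud_median_rents.csv")          # columns: metro, hud_median

# Median asking rent per Craigslist "domain" (metro area)
cl_medians = (listings.groupby("metro")["rent"]
              .median()
              .rename("cl_median")
              .reset_index())

comparison = hud.merge(cl_medians, on="metro")
comparison["ratio"] = comparison["cl_median"] / comparison["hud_median"]

# Metros where posted rents diverge most from the official figures
print(comparison.sort_values("ratio", ascending=False).head(10))
```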

“That’s important for local planners to be careful with because there are totally different cultures and ways that Craigslist is used in different cities,” Boeing explains. “The economies of the cities could very much affect how rentals are being posted. If they’re posting it higher [on Craigslist], they may negotiate down eventually. Or, if they’re posting it low, they could be expecting a bidding war with a bunch of tenants coming in.” …(More)”

Situation vacant: technology triathletes wanted


Anne-Marie Slaughter in the Financial Times: “It is time to celebrate a new breed of triathletes, who work in technology. When I was dean of the public affairs school at Princeton, I would tell students to aim to work in the public, private and civic sectors over the course of their careers.

Solving public problems requires collaboration among government, business and civil society. Aspiring problem solvers need the culture and language of all three sectors and to develop a network of contacts in each.

The public problems we face, in the US and globally, require lawyers, economists and issue experts but also technologists. A lack of technologists capable of setting up HealthCare.gov, a website designed to implement the Affordable Care Act, led President Barack Obama to create the US Digital Service, which deploys Swat tech teams to address specific problems in government agencies.

But functioning websites that deliver government services effectively are only the most obvious technological need for the public sector.

Government can reinvent how it engages with citizens entirely, for example by personalising public education with digital feedback or training jobseekers. But where to find the talent? The market for engineers, designers and project managers sees big tech companies competing for graduates from the world’s best universities.

Governments can offer only a fraction of those salaries, combined with a rigid work environment, ingrained resistance to innovation and none of the amenities and perks so dear to Silicon Valley.

Government’s comparative advantage, however, is mission and impact, which is precisely what Todd Park sells… Still, demand outstrips supply…. The goal is to create an ecosystem for public interest technology comparable to that in public interest law. In the latter, a number of American philanthropists created role models, educational opportunities and career paths for aspiring lawyers who want to change the world.

That process began in the 1960s, and today every great law school has a public interest programme with scholarships for the most promising students. Many branches of government take on top law school graduates. Public interest lawyers coming out of government find jobs with think-tanks and advocacy organisations and take up research fellowships, often at the law schools that educated them. When they need to pay the mortgage or send their kids to college, they can work at large law firms with pro bono programmes…. We need much more. Every public policy school at a university with a computer science, data science or technology design programme should follow suit. Every think-tank should also become a tech tank. Every non-governmental organisation should have at least one technologist on staff. Every tech company should have a pro bono scheme rewarding public interest work….(More)”

‘Homo sapiens is an obsolete algorithm’


Extract from Homo Deus: A Brief History of Tomorrow by Yuval Noah Harari: “There’s an emerging religion called Dataism, which venerates neither gods nor man – it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system, through four basic methods:

1. Increasing the number of processors. A city of 100,000 people has more computing power than a village of 1,000 people.

2. Increasing the variety of processors. Different processors may use diverse ways to calculate and analyse data. Using several kinds of processors in a single system may therefore increase its dynamism and creativity. A conversation between a peasant, a priest and a physician may produce novel ideas that would never emerge from a conversation between three hunter-gatherers.

3. Increasing the number of connections between processors. There is little point in increasing the mere number and variety of processors if they are poorly connected. A trade network linking ten cities is likely to result in many more economic, technological and social innovations than ten isolated cities.

4. Increasing the freedom of movement along existing connections. Connecting processors is hardly useful if data cannot flow freely. Just building roads between ten cities won’t be very useful if they are plagued by robbers, or if some autocratic despot doesn’t allow merchants and travellers to move as they wish.

These four methods often contradict one another. The greater the number and variety of processors, the harder it is to freely connect them. The construction of the sapiens data-processing system accordingly passed through four main stages, each of which was characterised by an emphasis on different methods.

The first stage began with the cognitive revolution, which made it possible to connect unlimited sapiens into a single data-processing network. This gave sapiens an advantage over all other human and animal species. Although there is a limit to the number of Neanderthals, chimpanzees or elephants you can connect to the same net, there is no limit to the number of sapiens.

Sapiens used their advantage in data processing to overrun the entire world. However, as they spread into different lands and climates they lost touch with one another, and underwent diverse cultural transformations. The result was an immense variety of human cultures, each with its own lifestyle, behaviour patterns and world view. Hence the first phase of history involved an increase in the number and variety of human processors, at the expense of connectivity: 20,000 years ago there were many more sapiens than 70,000 years ago, and sapiens in Europe processed information differently from sapiens in China. However, there were no connections between people in Europe and China, and it would have seemed utterly impossible that all sapiens may one day be part of a single data-processing web.

The second stage began with agriculture and continued until the invention of writing and money. Agriculture accelerated demographic growth, so the number of human processors rose sharply, while simultaneously enabling many more people to live together in the same place, thereby generating dense local networks that contained an unprecedented number of processors. In addition, agriculture created new incentives and opportunities for different networks to trade and communicate.

Nevertheless, during the second phase, centrifugal forces remained predominant. In the absence of writing and money, humans could not establish cities, kingdoms or empires. Humankind was still divided into innumerable little tribes, each with its own lifestyle and world view. Uniting the whole of humankind was not even a fantasy.

The third stage kicked off with the appearance of writing and money about 5,000 years ago, and lasted until the beginning of the scientific revolution. Thanks to writing and money, the gravitational field of human co-operation finally overpowered the centrifugal forces. Human groups bonded and merged to form cities and kingdoms. Political and commercial links between different cities and kingdoms also tightened. At least since the first millennium BC – when coinage, empires, and universal religions appeared – humans began to consciously dream about forging a single network that would encompass the entire globe.

This dream became a reality during the fourth and last stage of history, which began around 1492. Early modern explorers, conquerors and traders wove the first thin threads that encompassed the whole world. In the late modern period, these threads were made stronger and denser, so that the spider’s web of Columbus’s days became the steel and asphalt grid of the 21st century. Even more importantly, information was allowed to flow increasingly freely along this global grid. When Columbus first hooked up the Eurasian net to the American net, only a few bits of data could cross the ocean each year, running the gauntlet of cultural prejudices, strict censorship and political repression.

But as the years went by, the free market, the scientific community, the rule of law and the spread of democracy all helped to lift the barriers. We often imagine that democracy and the free market won because they were “good”. In truth, they won because they improved the global data-processing system.

So over the last 70,000 years humankind first spread out, then separated into distinct groups and finally merged again. Yet the process of unification did not take us back to the beginning. When the different human groups fused into the global village of today, each brought along its unique legacy of thoughts, tools and behaviours, which it collected and developed along the way. Our modern larders are now stuffed with Middle Eastern wheat, Andean potatoes, New Guinean sugar and Ethiopian coffee. Similarly, our language, religion, music and politics are replete with heirlooms from across the planet.

If humankind is indeed a single data-processing system, what is its output? Dataists would say that its output will be the creation of a new and even more efficient data-processing system, called the Internet-of-All-Things. Once this mission is accomplished, Homo sapiens will vanish….(More)”

25 Years Later, What Happened to ‘Reinventing Government’?


At Governing: “…A generation ago, governments across the United States embarked on ambitious efforts to use performance measures to “reinvent” how government worked. Much of the inspiration for this effort came from the bestselling 1992 book Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector by veteran city manager Ted Gaebler and journalist David Osborne. Gaebler and Osborne challenged one of the most common complaints about public administration — that government agencies were irredeemably bureaucratic and resistant to change. The authors argued that that need not be the case. Government managers and employees could and should, the authors wrote, be as entrepreneurial as their private-sector counterparts. This meant embracing competition; measuring outcomes rather than inputs or processes; and insisting on accountability.

For public-sector leaders, Gaebler and Osborne’s book was a revelation. “I would say it has been the most influential book of the past 25 years,” says Robert J. O’Neill Jr., the executive director of the International City/County Management Association (ICMA). At the federal level, Reinventing Government inspired Vice President Al Gore’s National Performance Review. But it had its greatest impact on state and local governments. Public-sector officials across the country read Reinventing Government and ingested its ideas. Osborne joined the consulting firm Public Strategies Group and began hiring himself out as an adviser to governments.

There’s no question states and localities function differently today than they did 25 years ago. Performance management systems, though not universally beloved, have become widespread. Departments and agencies routinely measure customer satisfaction. Advances in information technology have allowed governments to develop and share outcomes more easily than ever before. Some watchdog groups consider linking outcomes to budgets — also known as performance-based budgeting — to be a best practice. Government executives in many places talk about “innovation” as if they were Silicon Valley executives. This represents real, undeniable change.

Yet despite a generation of reinvention, government is less trusted than ever before. Performance management systems are sometimes seen not as an instrument of reform but as an obstacle to it. Performance-based budgeting has had successes, but they have rarely been sustained. Some of the most innovative efforts to improve government today are pursuing quite different approaches, emphasizing grassroots employee initiatives rather than strict managerial accountability. All of this raises a question: Has the reinventing government movement left a legacy of greater effectiveness, or have the systems it generated become roadblocks that today’s reformers must work around?  Or is the answer somehow “yes” to both of those questions?

Reinventing Government presented dozens of examples of “entrepreneurial” problem-solving, organized into 10 chapters. Each chapter illustrated a theme, such as results-oriented government or enterprising government. This structure — concrete examples grouped around larger themes — reflected the distinctive sensibilities of each author. Gaebler, as a city manager, had made a name for himself by treating constraints such as funding shortfalls or bureaucratic rules as opportunities. His was a bottom-up, let-a-hundred-flowers-bloom sensibility. He wanted his fellow managers to create cultures where risks could be taken and initiative could be rewarded.

Osborne, a journalist, was more of a systematizer, drawn to sweeping ideas. In his previous book, Laboratories of Democracy, he had profiled six governors who he believed were developing new approaches for delivering services that constituted a “third way” between big government liberalism and anti-government conservatism. Reinventing Government suggested how that would work in practice. It also offered readers a daring and novel vision of what government’s core mission should be. Government, the book argued, should focus less on operating programs and more on overseeing them. Instead of “rowing” (stressing administrative detail), senior public officials should do more “steering” (concentrating on overall strategy). They should contract out more, embrace competition and insist on accountability. This aspect of Osborne’s thinking became more pronounced as time went by.

“Today we are well beyond the experimental approach,” Osborne and Peter Hutchinson, a former Minnesota finance commissioner, wrote in their 2004 book, The Price of Government: Getting the Results We Need in an Age of Permanent Fiscal Crisis. A decade of experience had produced a proven set of strategies, the book continued. The foremost should be to turn the budget process “on its head, so that it starts with the results we demand and the price we are willing to pay rather than the programs we have and the costs they incur.” In other words, performance-based budgeting. Then, they continued, “we must cut government down to its most effective size and shape, through strategic reviews, consolidation and reorganization.”

Assessing the influence and efficacy of these ideas is difficult. According to the U.S. Census, the United States has 90,106 state and local governments. Tens of thousands of public employees read Reinventing Government and the books that followed. Surveys have shown that the use of performance measurement systems is widespread across state, county and municipal government. Yet only a handful of studies have sought to evaluate systematically the impact of Reinventing Government’s core ideas. Most have focused on just one, the idea highlighted in The Price of Government: budgeting for outcomes.

To evaluate the reinventing government movement primarily by assessing performance-based budgeting might seem a bit narrow. But paying close attention to the budgeting process is the key to understanding the impact of the entire enterprise. It reveals the difficulty of sustaining even successful innovations….

“Reinventing government was relatively blind to the role of legislatures in general,” says University of Maryland public policy professor and Governing columnist Donald F. Kettl. “There was this sense that the real problem was that good people were trapped in a bad system and that freeing administrators to do what they knew how to do best would yield vast improvements. What was not part of the debate was the role that legislatures might have played in creating those constraints to begin with.”

Over time, a pattern emerged. During periods of crisis, chief executives were able to implement performance-based budgeting. Often, it worked. But eventually legislatures pushed back….

There was another problem. Measuring results, insisting on accountability — these were supposed to spur creative problem-solving. But in practice, says Blauer, “whenever the budget was invoked in performance conversations, it automatically chilled innovative thinking; it chilled engagement.” Agencies got defensive. Rather than focusing on solving hard problems, they focused on justifying past performance….

The fact that reinventing government never sparked a revolution puzzles Gaebler to this day. “Why didn’t more of my colleagues pick it up and run with it?” he asks. He thinks the answer may be that many public managers were simply too risk-averse….(More)”.

“Data-Driven Policy”: San Francisco just showed us how it should work.


abhi nemani at Medium: “…Auto collisions with bikes (and also pedestrians) pose a real threat to the safety and wellbeing of residents. But more than causing temporary injuries, auto collisions with bikes and pedestrians can kill people. And they do, at an alarming rate. According to the city, “Every year in San Francisco, about 30 people lose their lives and over 200 more are seriously injured while traveling on city streets.”…

Problem -> Data Analysis

The city government, in good fashion, made a commitment to do something about it. But in better fashion, they decided to do so in a data-driven way. And they tasked the Department of Public Health, in collaboration with the Department of Transportation, to develop policy. What’s impressive is that instead of some blanket policy or mandate, they opted to study the problem, take a nuanced approach, and put data first.

SF High Injury Network

The SF team ran a series of data-driven analyses to determine the causes of these collisions. They developed TransBase to continuously map and visualize traffic incidents throughout the city. Using this platform, they then developed the “high injury network” — the key places where most problems happen; or as they put it, “to identify where the most investments in engineering, education and enforcement should be focused to have the biggest impact in reducing fatalities and severe injuries.” It turns out that just 12 percent of intersections result in 70 percent of major injuries. This is using data to make what might seem like an intractable problem tractable….
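
The core of that analysis can be sketched in a few lines; the collision file and column names below are hypothetical stand-ins for the TransBase data:

```python
# Sketch of the "high injury network" idea: rank intersections by severe
# injuries and find what share of intersections covers 70% of the total.
# Input file and columns are hypothetical.
import pandas as pd

collisions = pd.read_csv("collisions.csv")  # columns: intersection_id, severe_injuries

by_intersection = (collisions.groupby("intersection_id")["severe_injuries"]
                   .sum()
                   .sort_values(ascending=False))

cumulative_share = by_intersection.cumsum() / by_intersection.sum()
n_needed = int((cumulative_share < 0.70).sum()) + 1  # intersections to reach 70%

print(f"{n_needed / len(by_intersection):.0%} of intersections "
      f"account for 70% of severe injuries")
```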

Data Analysis -> Policy

So now what? Well, this month, Mayor Ed Lee signed an executive directive to challenge the city to implement these findings under the banner of “Vision Zero”: a goal of reducing auto/pedestrian/bike collision deaths to zero by 2024….

Fortunately, San Francisco took the next step: they put their data to work.

Policy -> Implementation

This week, the city of San Francisco announced plans to build its first “Protected Intersection”:

“Protected intersections use a simple design concept to make everyone safer. Under this configuration, features like concrete islands placed at the corners slow turning cars and physically separate people biking and driving. They also position turning drivers at an angle that makes it easier for them to see and yield to people walking and biking crossing their path.”

That’s apparently just the start: plans are underway for other intersections, protected bike lanes, and more. Biking and walking in San Francisco is about to become much safer. (Though maybe not easier: the hills — they’re the worst.)

***

There is ample talk of “Data-Driven Policy” — indeed, I’ve written about it myself — but too often we get lost in the abstract or theoretical….(More)”

Driving government transformation through design thinking


Michael McHugh at Federal Times: “According to Gartner, “Design thinking is a multidisciplinary process that builds solutions for complex, intractable problems in a technically feasible, commercially sustainable and emotionally meaningful way.”

Design thinking as an approach puts the focus on people — their likes, dislikes, desires and experience — for designing new services and products. It encourages a free flow of ideas within a team to build and test prototypes by setting a high tolerance for failure. The approach is more holistic, as it considers both human and technological aspects to cater to mission-critical needs. Due to its innovative and agile problem-solving technique, design thinking inspires teams to collaborate and contribute towards driving mission goals.

How Can Design Thinking Help Agencies?

Whether it is problem solving, streamlining a process or increasing the adoption rate of a new service, design thinking calls for agencies to be empathetic towards people’s needs while being open to continuous learning and a willingness to fail — fast. A fail-fast model enables agencies to detect errors early in the course of finding a solution, learn from those mistakes, and then proceed to develop a more suitable solution that is likely to add value to the user.

Consider an example of a federal agency whose legacy inspection application was affecting the productivity of its inspectors. By leveraging an agile approach, the agency built a mobile inspection solution to streamline and automate the inspection process. The methodology involved multiple iterations based on observations and findings from inspector actions. Here is a step-by-step synopsis of this methodology:

  • Problem presentation: Identifying the problems faced by inspectors.
  • Empathize with users: Understanding the needs and challenges of inspectors.
  • Define the problem: Redefining the problem based on input from inspectors.
  • Team collaboration: Brainstorming and discussing multiple solutions.
  • Prototype creation: Determining and building viable design solutions.
  • Testing with constituents: Releasing the prototype and testing it with inspectors.
  • Collection of feedback: Incorporating feedback from pilot testing and making required changes.

The insights drawn from each step helped the agency to design a secure platform in the form of a mobile inspection tool, optimized for tablets with a smartphone companion app for enhanced mobility. Packed with features like rich media capture with video, speech-to-text and photographs, the mobile inspection tool dramatically reduces manual labor and speeds up the on-site inspection process. It delivers significant efficiencies by improving processes, increasing productivity and enhancing the visibility of information. Additionally, its integration with legacy systems helps leverage existing investments, thereby justifying the innovation, which is based on a tightly defined test-and-learn cycle….(More)”