Designing a toolkit for policy makers


at UK’s Open Policy Making Blog: “At the end of the last parliament, the Cabinet Office Open Policy Making team launched the Open Policy Making toolkit. This was about giving policy makers the actual tools that will enable them to develop policy that is well informed, creative, tested, and works. The starting point was addressing their needs and giving them what they had told us they needed to develop policy in an ever-changing, fast-paced and digital world. In a way, it was the culmination of the open policy journey we have been on with departments for the past 2 years. In the first couple of months we saw thousands of unique visits….

Our first-version toolkit has been used by 20,000 policy makers. This gave us a huge audience to talk to, helping us make sure that we continue to meet the needs of policy makers and keep the toolkit relevant and useful. Although people have really enjoyed using the toolkit, user testing quickly showed us a few problems…

We knew what we needed to do: help people understand what Open Policy Making was and how it affected their policy making, and then make it as simple as possible for them to know exactly what to do next.

So we came up with some quick ideas on pen and paper and tested them with people. We quickly discovered what not to do. People didn’t want a philosophy – they wanted practical answers: exactly what to do and when to do it. They wanted a sort of design manual for policy….

How do we make user-centered design and open policy making as understood as agile?

We decided to organise the tools around the journey of a policy maker. What might a policy maker need to understand their users? How could they co-design ideas? How could they test policy? We looked at what tools and techniques they could use at the beginning, middle and end of a project, and organised tools accordingly.

We also added sections to remove confusion and hesitation. Our opening section ‘Getting started with Open Policy Making’ provides people with a clear understanding of what open policy making might mean to them, but also some practical considerations. Sections for limited timeframes and budgets help people realise that open policy can be done in almost any situation.

And finally, we’ve created a much cleaner and simpler design that lets people show as much or as little of the information as they need….

So go and check out the new toolkit and make more open policy yourselves….(More)”

Open Data Is Changing the World in Four Ways…


 at The GovLab Blog: “New repository of case studies documents the impact of open data globally: odimpact.org.


Despite global commitments to and increasing enthusiasm for open data, little is actually known about its use and impact. What kinds of social and economic transformation has open data brought about, and what is its future potential? How—and under what circumstances—has it been most effective? How have open data practitioners mitigated risks and maximized social good?

Even as proponents of open data extol its virtues, the field continues to suffer from a paucity of empirical evidence. This limits our understanding of open data and its impact.

Over the last few months, The GovLab (@thegovlab), in collaboration with Omidyar Network (@OmidyarNetwork), has worked to address these shortcomings by developing 19 detailed open data case studies from around the world. The case studies have been selected for their sectoral and geographic representativeness. They are built in part from secondary sources (“desk research”), and also from more than 60 first-hand interviews with important players and key stakeholders. In a related collaboration with Omidyar Network, Becky Hogge (@barefoot_techie), an independent researcher, has developed an additional six open data case studies, all focused on the United Kingdom. Together, these case studies seek to provide a more nuanced understanding of the various processes and factors underlying the demand, supply, release, use and impact of open data.

Today, after receiving and integrating comments from dozens of peer reviewers through a unique open process, we are delighted to share an initial batch of 10 case studies, as well as three of Hogge’s UK-based stories. These are being made available at a new custom-built repository, Open Data’s Impact (http://odimpact.org), which will eventually house all the case studies, key findings across the studies, and additional resources related to the impact of open data. All this information will be stored in machine-readable HTML and PDF formats, and will be searchable by area of impact, sector and region….(More)

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
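The article does not describe HealthIntell’s internal algorithms, but the core idea it relies on — predicting demand for a given weekday and hour from historical attendance records — can be sketched in a few lines. All names and figures below are hypothetical illustrations, not the product’s actual method:

```python
from collections import defaultdict

def forecast_attendance(history, weekday, hour):
    """Predict A&E attendances for a (weekday, hour) slot by averaging
    the counts historically observed in that same slot.

    history: list of (weekday, hour, attendances) tuples,
             weekday 0=Monday .. 6=Sunday.
    """
    totals = defaultdict(lambda: [0, 0])  # (weekday, hour) -> [sum, count]
    for wd, hr, n in history:
        slot = totals[(wd, hr)]
        slot[0] += n
        slot[1] += 1
    s, c = totals[(weekday, hour)]
    if c == 0:
        raise ValueError("no history for this slot")
    return s / c

# Invented history: Monday (0) 10:00 over three weeks, plus one Tuesday slot
history = [(0, 10, 40), (0, 10, 50), (0, 10, 60), (1, 10, 20)]
print(forecast_attendance(history, 0, 10))  # → 50.0
```

A production system would layer on the external drivers the article mentions — weather, sporting events — as additional features, but the seasonal-average baseline above is the usual starting point for this kind of forecast.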

Rikke Duus, a senior teaching fellow at University College London’s School of Management, strongly agrees that an evidence-based approach to providing services, using data that is already available, is key to efficiency gains. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – are raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”
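The LFB’s actual model and weightings are not given in the excerpt, but the general technique — combining many data inputs into a single risk score used to rank households for prevention visits — can be sketched as follows. The input names and weights here are invented for illustration only:

```python
def fire_risk_score(profile, weights):
    """Combine a household's data inputs into one risk score.

    profile: dict of normalised (0..1) input values, e.g. deprivation level.
    weights: dict giving each input's contribution (illustrative values;
             the real model combines ~60 inputs with fitted weights).
    """
    return sum(weights[k] * profile.get(k, 0.0) for k in weights)

weights = {"deprivation": 0.4, "past_incidents": 0.4, "building_age": 0.2}
households = {
    "A": {"deprivation": 0.9, "past_incidents": 0.2, "building_age": 0.5},
    "B": {"deprivation": 0.3, "past_incidents": 0.8, "building_age": 0.1},
}

# Rank households so the limited number of home visits goes to the riskiest first
ranked = sorted(households,
                key=lambda h: fire_risk_score(households[h], weights),
                reverse=True)
print(ranked)  # → ['A', 'B']
```

The point of such a score is purely prioritisation: with roughly 90,000 visits a year against millions of households, even a crude ranking beats visiting at random.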

 

Understanding Participatory Governance


An analysis of “Participants’ Motives for Participation” by Per Gustafson and Nils Hertting: “Despite the growing body of literature on participatory and collaborative governance, little is known about citizens’ motives for participation in such new governance arrangements. The present article argues that knowledge about these motives is essential for understanding the quality and nature of participatory governance and its potential contribution to the overall political and administrative system.

Survey data were used to explore participants’ motives for participating in a large-scale urban renewal program in Stockholm, Sweden. The program was neighborhood-based, characterized by self-selected and repeated participation, and designed to influence local decisions on the use of public resources.

Three types of motives were identified among the participants: (a) Common good motives concerned improving the neighborhood in general and contributing knowledge and competence. (b) Self-interest motives reflected a desire to improve one’s own political efficacy and to promote the interest of one’s own group or family. (c) Professional competence motives represented a largely apolitical type of motive, often based on a professional role. Different motives were expressed by different categories of participants and were also associated with different perceptions concerning program outcomes.

Further analysis suggested that participatory governance may represent both an opportunity for marginalized groups to empower themselves and an opportunity for more privileged groups to act as local “citizen representatives” and articulate the interests of their neighborhoods. These findings call for a more complex understanding of the role and potential benefits of participatory governance…(More).”

 

Core Concepts: Computational social science


Adam Mann at PNAS: “Cell phone tower data predicts which parts of London can expect a spike in crime (1). Google searches for polling place information on the day of an election reveal the consequences of different voter registration laws (2). Mathematical models explain how interactions among financial investors produce better yields, and even how they generate economic bubbles (3).

Figure: Using cell-phone and taxi GPS data, researchers classified people in San Francisco into “tribal networks,” clustering them according to their behavioral patterns. Students, tourists, and businesspeople all travel through the city in various ways, congregating and socializing in different neighborhoods. Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).

Figure: Where people hail from in the Mexico City area, here indicated by different colors, feeds into a crime-prediction model devised by Alex Pentland and colleagues (6). Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).

These are just a few examples of how a suite of technologies is helping bring sociology, political science, and economics into the digital age. Such social science fields have historically relied on interviews and survey data, as well as censuses and other government databases, to answer important questions about human behavior. These tools often produce results based on individuals—showing, for example, that a wealthy, well-educated, white person is statistically more likely to vote (4)—but struggle to deal with complex situations involving the interactions of many different people.


A growing field called “computational social science” is now using digital tools to analyze the rich and interactive lives we lead. The discipline uses powerful computer simulations of networks, data collected from cell phones and online social networks, and online experiments involving hundreds of thousands of individuals to answer questions that were previously impossible to investigate. Humans are fundamentally social creatures, and these new tools and huge datasets are giving social scientists insights into exactly how connections among people create societal trends or reveal heretofore undetected patterns related to everything from crime to economic fortunes to political persuasions. Although the field provides powerful ways to study the world, it’s an ongoing challenge to ensure that researchers collect and store the requisite information safely, and that they and others use that information ethically….(More)”
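The San Francisco “tribal networks” study works by clustering people according to behavioral feature vectors. A toy version of that clustering step — plain k-means over invented visit-count data, nothing like the richer features real studies use — looks like this:

```python
import math

def kmeans(points, k, iters=20):
    """Tiny k-means: group behavioral vectors (here, visit counts per
    neighborhood) into k clusters. Seeds centroids deterministically
    from the first k points for reproducibility."""
    centroids = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

# Invented data: daily visits to (downtown, campus) for six people —
# three "businesspeople" and three "students" by travel pattern
points = [(9, 1), (8, 2), (10, 0), (1, 9), (2, 8), (0, 10)]
clusters = kmeans(points, 2)
print([len(c) for c in clusters])  # → [3, 3]
```

Real analyses add many more dimensions (time of day, transport mode, call patterns) and more robust algorithms, but the principle is the same: people with similar movement profiles end up in the same “tribe.”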

Can We Use Data to Stop Deadly Car Crashes?


Allison Shapiro in Pacific Standard Magazine: “In 2014, New York City Mayor Bill de Blasio decided to adopt Vision Zero, a multi-national initiative dedicated to eliminating traffic-related deaths. Under Vision Zero, city services, including the Department of Transportation, began an engineering and public relations plan to make the streets safer for drivers, pedestrians, and cyclists. The plan included street re-designs, improved accessibility measures, and media campaigns on safer driving.

The goal may be an old one, but the approach is innovative: When New York City officials wanted to reduce traffic deaths, they crowdsourced and used data.

Many cities in the United States—from Washington, D.C., all the way to Los Angeles—have adopted some version of Vision Zero, which began in Sweden in 1997. It’s part of a growing trend to make cities “smart” by integrating data collection into things like infrastructure and policing.

Map of high crash corridors in Portland, Oregon. (Map: Portland Bureau of Transportation)

Cities have access to an unprecedented amount of data about traffic patterns, driving violations, and pedestrian concerns. Although advocacy groups say Vision Zero is moving too slowly, de Blasio has invested another $115 million in this data-driven approach.

Interactive safety map. (Map: District Department of Transportation)

De Blasio may have been vindicated. A 2015 year-end report released by the city last week analyzes the successes and shortfalls of data-driven city life, and the early results look promising. In 2015, fewer New Yorkers lost their lives in traffic accidents than in any year since 1910, according to the report, despite the fact that the population has almost doubled in those 105 years.

Below are some of the project highlights.

New Yorkers were invited to add to this public dialogue map, where they could list information ranging from “not enough time to cross” to “red light running.” The Department of Transportation ended up with over 10,000 comments, which led to 80 safety projects in 2015, including the creation of protected bike lanes, the introduction of leading pedestrian intervals, and the simplification of complex intersections….

Data collected from the public dialogue map, town hall meetings, and past traffic accidents led to “changes to signals, street geometry and markings and regulations that govern actions like turning and parking. These projects simplify driving, walking and bicycling, increase predictability, improve visibility and reduce conflicts,” according to Vision Zero in NYC….(More)”

Crowdfunded Journalism: A Small but Growing Addition to Publicly Driven Journalism


Nancy Vogt and Amy Mitchell at PewResearchCenter: “Projects funded through Kickstarter cut across more than 60 countries

Over the past several years, crowdfunding via the internet has become a popular way to engage public support – and financial backing – for all kinds of projects, from the Coolest Cooler to a virtual reality gaming headset to a prototype of a sailing spacecraft and a bailout fund for Greece.

From April 28, 2009 to September 15, 2015, 658 journalism-related projects proposed on Kickstarter, one of the largest single hubs for crowdfunding journalism, received full – or more than full – funding, to the tune of nearly $6.3 million.

These totals – both in terms of number of projects and funds raised – trail nearly all of Kickstarter’s other funding categories, from music, theater and film to technology and games. Nevertheless, the number of funded journalism projects has seen an ongoing increase over time and includes a growing number of proposals from established media organizations….(More)

Distributed ledger technology: beyond block chain


UK Government Office for Science: “In a major report on distributed ledgers published today (19 January 2016), the Government Chief Scientist, Sir Mark Walport, sets out how this technology could transform the delivery of public services and boost productivity.

A distributed ledger is a database that can securely record financial, physical or electronic assets for sharing across a network through entirely transparent updates of information.

Its first incarnation was ‘Blockchain’ in 2008, which underpinned digital cash systems such as Bitcoin. The technology has now evolved into a variety of models that can be applied to different business problems and dramatically improve the sharing of information.
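The report’s definition can be made concrete with a toy hash chain: each block commits to the previous block’s hash, so a shared ledger’s history cannot be rewritten without detection. This sketch (plain Python, no networking, consensus, or signatures — only the tamper-evidence property) is an illustration of the idea, not of any production system:

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record to a hash-chained ledger. Each block stores the
    previous block's hash, so altering any earlier record invalidates
    every hash that follows it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash and check the links; False means tampering."""
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"asset": "aid payment", "amount": 100})
add_block(chain, {"asset": "aid payment", "amount": 50})
print(verify(chain))                     # → True
chain[0]["record"]["amount"] = 999       # tamper with an old record
print(verify(chain))                     # → False
```

What makes a real distributed ledger distinct from this single-file toy is replication: many parties hold copies and agree on updates via a consensus mechanism, which is where the transparency and trust benefits the report describes come from.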

Distributed ledger technology could provide government with new tools to reduce fraud, error and the cost of paper intensive processes. It also has the potential to provide new ways of assuring ownership and provenance for goods and intellectual property.

Distributed ledgers are already being used in the diamond markets and in the disbursing of international aid payments.

Sir Mark Walport said:

Distributed ledger technology has the potential to transform the delivery of public and private services. It has the potential to redefine the relationship between government and the citizen in terms of data sharing, transparency and trust and make a leading contribution to the government’s digital transformation plan.

Any new technology creates challenges, but with the right mix of leadership, collaboration and sound governance, distributed ledgers could yield significant benefits for the UK.

The report makes a number of recommendations which focus on ministerial leadership, research, standards and the need for proof of concept trials.

They include:

  • government should provide ministerial leadership to ensure that it provides the vision, leadership and the platform for distributed ledger technology within government; this group should consider governance, privacy, security and standards
  • government should establish trials of distributed ledgers in order to assess the technology’s usability within the public sector
  • government could support the creation of distributed ledger demonstrators for local government that will bring together all the elements necessary to test the technology and its application
  • the UK research community should invest in the research required to ensure that distributed ledgers are scalable, secure and provide proof of correctness of their contents…

View the report ‘Distributed ledger technology: beyond block chain’.”

Smart Devolution


New report by Eddie Copeland and Cameron Scott at Policy Exchange: “Elected mayors should be required to set up an Office of Data Analytics comprising small, expert teams tasked with using public and privately held data to create smarter and more productive cities.

A new paper, Smart Devolution, by leading think tank Policy Exchange says that most cities have vast quantities of data that, if accessed and used effectively, could help improve public services, optimise transport routes, support the growth of small businesses and even prevent cycling accidents.

The report highlights how every UK city should use the additional powers they receive from Whitehall to replicate New York by employing a small team of data experts to collect and collate information from a range of sources, including councils, emergency services, voluntary organisations, mobile phone networks and payment systems.

The data teams will provide city mayors with a great opportunity to break down the silos that exist between local authorities and public sector bodies when it comes to unlocking information that could save money and improve the standard of living for the public.

Examples of how a better use of data could make our cities smarter include:

  • Preventing cycling accidents: HGVs travelling through city centres should be required to share their GPS data with the city mayor’s Office for Data Analytics. Combining HGV routes with data from cyclists obtained by their mobile phone signals could provide real-time information showing the most common routes shared by large lorries and cyclists. City leaders could then put in place evidence-based policy responses, for example, prioritising spending on new bike lanes or updating cyclists, via an app, on the city’s most dangerous routes.
  • Spending smarter: cities could save money, and residents could benefit, from the analysis of anonymised spending and travel information to understand where investment and services are needed based on real consumer decisions – locating schools, transport links and housing when and where they are needed. This also applies to business investment, with data being harnessed to identify fruitful locations….(More)”

The Future of Behavioural Change: Balancing Public Nudging vs Private Nudging


2nd AIM Lecture by Alberto Alemanno: “Public authorities, including the European Union and its Member States, are increasingly interested in exploiting behavioral insights through public action. They increasingly do so through choice architecture, i.e. the alteration of the environment of choice surrounding a particular decision-making context in areas as diverse as energy consumption, tax collection and public health. In this regard, it is useful to distinguish between two situations. The first is that of a public authority which seeks to steer behaviour in the public interest, taking into account one or more mental shortcuts. Thus, a default enrollment for organ donation leverages the power of inertia to enhance the overall prevalence of organ donors. Placing an emoticon (sad face) or a set of information about average consumption on a prohibitive energy bill has the potential to nudge consumers towards less energy consumption. I call this pure public nudging. The second perspective is when public authorities react to exploitative uses of mental shortcuts by market forces by regulating private nudging. I call this ‘counter-nudging’. Pure public nudging helps people correct mental shortcuts so as to achieve legitimate objectives (e.g. increased availability of organs, environmental protection, etc.), regardless of their exploitative use by market forces.
It is against this proposed taxonomy that the 2nd AIM Lecture examines whether also private companies may nudge for good. Are corporations well-placed to nudge their customers towards societal objectives, such as the protection of the environment or the promotion of public health? This is what I call benign corporate nudging.
Their record is far from being the most credible. Companies have used behaviourally inspired interventions to maximize profits, which led them to sell more and, in turn, induce citizens to consume more. Yet corporate marketing need not always be self-interested. A small but growing number of companies are using their brand, generally through their packaging and marketing efforts, to ‘nudge for good’. By illustrating some actual examples, this lecture defines the conditions under which companies may genuinely and credibly nudge for good. It argues that benign corporate nudging may have – unlike dominant CSR efforts – a positive long-term, habit-forming effect that influences consumers’ future behaviour ‘for good’….(More)”