How to use research evidence to improve your work


Nesta: “We’re pleased to announce the launch of the latest publication in our series of practice guides – Using Research Evidence. Created by the Alliance for Useful Evidence and Nesta, the guide has been designed to help you improve the way you work by using evidence effectively.

Evidence can help you make better decisions. Whether it’s in a police station, a school classroom or the boardroom of a charity, using research-based evidence can help improve outcomes. It is helpful not only in frontline service-delivery, but also in creating smarter organisations – charities, local authorities, government departments – and in developing national policies or charity campaigns.

It is also useful not only to you as a decision-maker, but to the citizens, voters, donors and wider public you are trying to support. Evidence can show if your services are working (or failing), save money, and align services with public needs.

The guide is aimed at those working in government, charities, voluntary organisations, professional membership bodies and local authorities. It will help you to:

  • Learn about evidence-informed decision-making, and why research is an essential element of it.

  • Understand the different scenarios in which using evidence can help you, as well as the types of evidence you might need at different stages of development.

  • Explore different types of evidence, how to choose the most appropriate and how to judge its quality.

  • Get advice on finding the right evidence to support your case, and how to get your message across once you have it….

Download the report here.”

Can We Use Data to Stop Deadly Car Crashes?


Allison Shapiro in Pacific Standard Magazine: “In 2014, New York City Mayor Bill de Blasio decided to adopt Vision Zero, a multi-national initiative dedicated to eliminating traffic-related deaths. Under Vision Zero, city services, including the Department of Transportation, began an engineering and public relations plan to make the streets safer for drivers, pedestrians, and cyclists. The plan included street re-designs, improved accessibility measures, and media campaigns on safer driving.

The goal may be an old one, but the approach is innovative: When New York City officials wanted to reduce traffic deaths, they crowdsourced and used data.

Many cities in the United States—from Washington, D.C., all the way to Los Angeles—have adopted some version of Vision Zero, which began in Sweden in 1997. It’s part of a growing trend to make cities “smart” by integrating data collection into things like infrastructure and policing.

Map of high crash corridors in Portland, Oregon. (Map: Portland Bureau of Transportation)

Cities have access to an unprecedented amount of data about traffic patterns, driving violations, and pedestrian concerns. Although advocacy groups say Vision Zero is moving too slowly, de Blasio has invested another $115 million in this data-driven approach.

Interactive safety map. (Map: District Department of Transportation)

De Blasio may have been vindicated. A 2015 year-end report released by the city last week analyzes the successes and shortfalls of data-driven city life, and the early results look promising. In 2015, fewer New Yorkers lost their lives in traffic accidents than in any year since 1910, according to the report, despite the fact that the population has almost doubled in those 105 years.

Below are some of the project highlights.

New Yorkers were invited to add to this public dialogue map, where they could list information ranging from “not enough time to cross” to “red light running.” The Department of Transportation ended up with over 10,000 comments, which led to 80 safety projects in 2015, including the creation of protected bike lanes, the introduction of leading pedestrian intervals, and the simplification of complex intersections….

Data collected from the public dialogue map, town hall meetings, and past traffic accidents led to “changes to signals, street geometry and markings and regulations that govern actions like turning and parking. These projects simplify driving, walking and bicycling, increase predictability, improve visibility and reduce conflicts,” according to Vision Zero in NYC….(More)”

The impact of open access scientific knowledge


Jack Karsten and Darrell M. West at Brookings: “In spite of technological advancements like the Internet, academic publishing has operated in much the same way for centuries. Scientists voluntarily review their peers’ papers for little or no compensation; the paper’s author likewise does not receive payment from academic publishers. Though most of the costs of publishing a journal are administrative, the cost of subscribing to scientific journals nevertheless increased 600 percent between 1984 and 2002. The funding for the research libraries that form the bulk of journal subscribers has not kept pace, leading to campaigns at universities including Harvard to boycott for-profit publishers.

Though the Internet has not yet brought down the price of academic journal subscriptions, it has led to some interesting alternatives. In 2015, the Twitter hashtag #icanhazPDF was created to request copies of papers located behind paywalls. Anyone with access to a specific paper can download it and then e-mail it to the requester. The practice violates the copyright of publishers, but puts papers in reach of researchers who would otherwise not be able to read them. If a researcher cannot read a journal article in the first place, they cannot go on to cite it; citations raise the profile of both the cited article and the journal that published it. The publisher is thus caught between two conflicting goals: increasing the number of citations for its articles and earning revenue to stay in business.

Thinking outside the journal

A trio of University of Chicago researchers examines this issue through the lens of Wikipedia in a paper titled “Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science.” Wikipedia makes a compelling subject for scientific diffusion given its status as one of the most visited websites in the world, attracting 374 million unique visitors monthly as of September 2015. The study found that in English-language articles, Wikipedia editors are 47 percent more likely to cite an article from an open access journal. Anyone using Wikipedia as a first source of information on a subject is therefore more likely to encounter research from open access journals, and readers who click through the links to cited articles can read the full text of those open access articles.

Given how much the federal government spends on scientific research ($66 billion on nondefense R&D in 2015), it has a large role to play in the diffusion of scientific knowledge. Since 2008, the National Institutes of Health (NIH) has required researchers it funds who publish in academic journals to also deposit their papers in PubMed Central, an online open access repository. Expanding provisions like the NIH Public Access Policy to other agencies and to recipients of federal grants at universities would give the public and other researchers a wealth of scientific information. Scientific literacy, even on cutting-edge research, is increasingly important when science informs policy on major issues such as climate change and health care….(More)”

Met Office warns of big data floods on the horizon


From V3: “The amount of data being collected by departments and agencies means government services will not be able to implement truly open data strategies, according to Met Office CIO Charles Ewen.

Ewen said the rapidly increasing amount of data being stored by companies and government departments means it will not be technologically possible to share all their data in the near future.

During a talk at the Cloud World Forum on Wednesday, he said: “The future will be bigger and bigger data. Right now we’re talking about petabytes, in the near future it will be tens of petabytes, then soon after it’ll be hundreds of petabytes and then we’ll be off into imaginary figure titles.

“We see a future where data has gotten so big the notion of open data and the idea ‘let’s share our data with everybody and anybody’ just won’t work. We’re struggling to make it work already and by 2020 the national infrastructure will not exist to shift this stuff [data] around in the way anybody could access and make use of it.”

Ewen added that to deal with the shift he expects many departments and agencies will adapt their processes to become digital curators that are more selective about the data they share, to try and ensure it is useful.

“This isn’t us wrapping our arms around our data and saying you can’t see it. We just don’t see how we can share all this big data in the way you would want it,” he said.

“We see a future where a select number of high-capacity nodes become information brokers and are used to curate and manage data. These curators will be where people bring their problems. That’s the future we see.”

Ewen added that the current expectations around open data are based on misguided views about the capabilities of cloud technology to host and provide access to huge amounts of data.

“The trendy stuff out there claims to be great at everything, but don’t get carried away. We don’t see cloud as anything but capability. We’ve been using appropriate IT and what’s available to deliver our mission services for over 50 to 60 years, and cloud is playing an increasing part of that, but purely for increased capability,” he said.

“It’s just another tool. The important thing is having the skill and knowledge to not just believe vendors but to look and identify the problem and say ‘we have to solve this’.”

The Met Office CIO’s comments follow reports from other government service providers that people’s desire for open data is growing exponentially….(More)”

Open Prescribing


“Every month, the NHS in England publishes anonymised data about the drugs prescribed by GPs. But the raw data files are large and unwieldy, with more than 600 million rows. We’re making it easier for GPs, managers and everyone to explore – supporting safer, more efficient prescribing.
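
To make that scale concrete, below is a minimal sketch of the kind of question OpenPrescribing answers – total prescription items for one drug family per practice, from a single monthly file. The file name, BNF code prefix and column names here are assumptions about the published practice-level format, not OpenPrescribing’s actual pipeline; check the data dictionary that accompanies each release.

```python
# A minimal sketch of the kind of question OpenPrescribing answers:
# total items of one drug family per practice, from a monthly file.
# Assumptions: a local copy of a monthly practice-level prescribing CSV,
# with hypothetical file name, BNF code prefix and column names
# (PRACTICE, BNF CODE, ITEMS) -- verify against the release's data dictionary.
import pandas as pd

CSV_PATH = "T201512PDPI_BNFT.csv"  # hypothetical monthly file name
BNF_PREFIX = "0212000Y0"           # hypothetical BNF chemical code

totals = {}
# The raw files run to hundreds of millions of rows, so stream the CSV
# in chunks rather than loading it all into memory at once.
for chunk in pd.read_csv(CSV_PATH, chunksize=1_000_000,
                         usecols=["PRACTICE", "BNF CODE", "ITEMS"]):
    match = chunk[chunk["BNF CODE"].str.startswith(BNF_PREFIX, na=False)]
    for practice, items in match.groupby("PRACTICE")["ITEMS"].sum().items():
        totals[practice] = totals.get(practice, 0) + items

# Ten practices prescribing the most of this chemical in the month
print(sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10])
```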

OpenPrescribing is one of a range of projects built by Ben Goldacre and Anna Powell-Smith at the EBM Data Lab to help make complex medical and scientific data more accessible and more impactful in the real world….

Data sources

Please read our guide to using the data.

Prescribing data is from the monthly files published by the Health and Social Care Information Centre (HSCIC), used under the terms of the Open Government Licence.

Practice list sizes are from the NHS Business Service Authority’s Information Portal, used under the terms of the Open Government Licence. ASTRO-PU and STAR-PUs are calculated from list sizes, based on standard formulas.

BNF codes and names are also from the NHS Business Service Authority’s Information Portal, used under the terms of the Open Government Licence.

CCG to practice relations, and practice prescribing settings, are from the HSCIC’s data downloads (epraccur.csv), used under the terms of the Open Government Licence.

CCG names and codes and CCG geographic boundaries are from the Office for National Statistics, used under the terms of the Open Government Licence.

Practice locations are approximate, geocoded using OpenCageData. If you know a better source of practice locations (not including Code-Point Open), please get in touch!…(More)”

Open data set to reshape charity and activism in 2016


The Guardian: “In 2015 the EU launched the world’s first international data portal, the Chinese government pledged to make state data public, and the UK lost its open data crown to Taiwan. Troves of data were unlocked by governments around the world last year, but the usefulness of much of that data is still to be determined by the civic groups, businesses and governments who use it. So what’s in the pipeline? And how will the open data ecosystem grow in 2016? We asked the experts.

1. Data will be seen as infrastructure (Heather Savory, director general for data capability, Office for National Statistics)….

2. Journalists, charities and civil society bodies will embrace open data (Hetan Shah, executive director, the Royal Statistical Society)….

3. Activists will take it upon themselves to create data (Pavel Richter, chief executive, Open Knowledge International)….

4. Data illiteracy will come at a heavy price (Sir Nigel Shadbolt, principal, Jesus College, Oxford, professorial research fellow in computer science, University of Oxford and chairman and co-founder of the Open Data Institute…)

5. We’ll create better tools to build a web of data (Dr Elena Simperl, associate professor, electronics and computer science, University of Southampton) …(More)”

Playing ‘serious games,’ adults learn to solve thorny real-world problems


Lawrence Susskind and Ella Kim in The Conversation: “…We have been testing the use of role-playing games to promote collaborative decision-making by nations, states and communities. Unlike in online computer games, players in role-playing games interact face-to-face in small groups of six to eight. The games place them in a hypothetical setting that simulates a real-life problem-solving situation. People are often assigned roles that are very different from their real-life roles. This helps them appreciate how their political adversaries view the problem.

Players receive briefing materials to read ahead of time so they can perform their assigned roles realistically. The idea is to reenact the tensions that actual stakeholders will feel when they are making real-life decisions. In the game itself, participants are asked to reach agreement in their roles in 60-90 minutes. (Other games, like the Mercury Game or the Chlorine Game, take longer to play.) If multiple small groups play the game at the same time, the entire room – which may include 100 tables of game players or more – can discuss the results together. In these debriefings, the most potent learning often occurs when players hear about creative moves that others have used to reach agreement.

It can take up to several months to design a game. Designers start by interviewing real-life decision makers to understand how they view the problem. Game designers must also synthesize a great deal of scientific and technical information to present it in the game in a form that anyone can understand. After the design phase, games have to be tested and refined before they are ready for play.

Research shows that this immersive approach to learning is particularly effective for adults. Our own research shows that elected and appointed officials, citizen advocates and corporate leaders can absorb a surprising amount of new scientific information when it is embedded in a carefully crafted role-playing game. In one study of more than 500 people in four New England coastal communities, we found that a significant portion of game players (1) changed their minds about how urgent a threat climate change is; (2) became more optimistic about their local government’s ability to reduce climate change risks; and (3) became more confident that conflicting groups would be able to reach agreement on how to proceed with climate adaptation….

Our conclusion is that “serious games” can prepare citizens and officials to participate successfully in science-based problem-solving. In related research in Ghana and Vietnam, we found that role-playing games had similarly valuable effects. While the agreements reached in games do not necessarily indicate what actual agreements may be reached, they can help officials and stakeholder representatives get a much clearer sense of what might be possible.

We believe that role-playing games can be used in a wide range of situations. We have designed games that have been used in different parts of the world to help all kinds of interest groups work together to draft new environmental regulations. We have brought together adversaries in energy facility siting and waste cleanup disputes to play a game prior to facing off against each other in real life. This approach has also facilitated decisions in regional economic development disputes, water allocation disputes in an international river basin and disputes among aboriginal communities, national governments and private industry….(More)”

Toward WSIS 3.0: Adopting Next-Gen Governance Solutions for Tomorrow’s Information Society


Lea Kaspar & Stefaan G. Verhulst at CircleID: “… Collectively, this process has been known as the “World Summit on the Information Society” (WSIS). During December 2015 in New York, twelve years after that first meeting in Geneva and with more than 3 billion people now online, member states of the United Nations unanimously adopted the final outcome document of the WSIS ten-year Review process.

The document (known as the WSIS+10 document) reflects on the progress made over the past decade and outlines a set of recommendations for shaping the information society in coming years. Among other things, it acknowledges the role of different stakeholders in achieving the WSIS vision, reaffirms the centrality of human rights, and calls for a number of measures to ensure effective follow-up.

For many, these represent significant achievements, leading observers to proclaim the outcome a diplomatic victory. However, as is the case with most non-binding international agreements, the WSIS+10 document will remain little more than a hollow guidepost until it is translated into practice. Ultimately, it is up to the national policy-makers, relevant international agencies, and the WSIS community as a whole to deliver meaningful progress towards achieving the WSIS vision.

Unfortunately, the WSIS+10 document provides little actual guidance for practitioners. Even more striking, it reveals little progress in its understanding of emerging governance trends and methods since Geneva and Tunis, or of how these could be leveraged in our efforts to harness the benefits of information and communication technologies (ICT).

As such, the WSIS remains a 20th-century approach to 21st-century challenges. In particular, the document fails to seek ways to make the WSIS post-2015:

  • evidence-based in how to make decisions;
  • collaborative in how to measure progress; and
  • innovative in how to solve challenges.

Three approaches toward WSIS 3.0

Drawing on lessons in the field of governance innovation, we suggest in what follows three approaches, accompanied by practical recommendations, that could allow the WSIS to address the challenges raised by the information society in a more evidence-based, innovative and participatory way:

1. Adopt an evidence-based approach to WSIS policy making and implementation.

Since 2003, we have had massive experimentation in both developed and developing countries in a number of efforts to increase access to the Internet. We have seen some failures and some successes; above all, we have gained insight into what works, what doesn’t, and why. Unfortunately, much of the evidence remains scattered and ad-hoc, poorly translated into actionable guidance that would be effective across regions; nor is there any reflection on what we don’t know, and how we can galvanize the research and funding community to address information gaps. A few practical steps we could take to address this:….

2. Measure progress towards WSIS goals in a more open, collaborative way, founded on metrics and data developed through a bottom-up approach

The current WSIS+10 document has many lofty goals, many of which will remain effectively meaningless unless we are able to measure progress in concrete and specific terms. This requires the development of clear metrics, a process which is inevitably subjective and value-laden. Metrics and indicators must therefore be chosen with great care, particularly as they become points of reference for important decisions and policies. Having legitimate, widely-accepted indicators is critical. The best way to do this is to develop a participatory process that engages those actors who will be affected by WSIS-related actions and decisions. …These could include:…

3. Experiment with governance innovations to achieve WSIS objectives.

Over the last few years, we have seen a variety of innovations in governance that have provided new and often improved ways to solve problems and make decisions. They include, for instance:

  • The use of open and big data to generate new insights in both the problem and the solution space. We live in the age of abundant data — why aren’t we using it to inform our decision making? Data on the current landscape and the potential implications of policies could make our predictions and correlations more accurate.
  • The adoption of design thinking, agile development and user-focused research in developing more targeted and effective interventions. A linear approach to policy making with a fixed set of objectives and milestones allows little room for dealing with unforeseen or changing circumstances, making it difficult to adapt and change course. Applying lessons from software engineering — including the importance of feedback loops, continuous learning, and agile approach to project design — would allow policies to become more flexible and solutions more robust.
  • The application of behavioral sciences — for example, the concept of ‘nudging’ individuals to act in their own best interest or adopt behaviors that benefit society. How choices (e.g. to use new technologies) are presented and designed can be more powerful in informing adoption than laws, rules or technical standards.
  • The use of prizes and challenges to tap into the wisdom of the crowd to solve complex problems and identify new ideas. Resource constraints can be addressed by creating avenues for people and volunteers to act as a resource in creating solutions, rather than being only their passive beneficiaries….(More)

Swipe right to fix the world: can Tinder-like tech match solutions to problems?


Beth Noveck in The Guardian: “Increasingly, these technologies of expertise are making it possible for the individual to make lived experience searchable. The New York police department, for example, maintains a database of employee skills. As the social service agency of last resort, the department needs to be able to pinpoint quickly who within the organization has the know-how to wrangle a runaway beehive in Brooklyn or sing the national anthem in Queens in Chinese.

In public institutions, especially, it is all too common for individual know-how to be masked by vague titles like “manager” and “director”. Using software to give organizations insights into the aptitudes of their employees has the potential to improve effectiveness and efficiency for the public good.

Already an accelerating practice in the private sector, where managers want granular evidence of hard skills not readily apparent from transcripts, this year the World Bank created its own expert network called SkillFinder to index the talents of its 27,000 employees, consultants and alumni. With the launch of SkillFinder, the bank is just beginning to explore how to use the tool to better organize its human capital to achieve the bank’s mission of eradicating poverty.
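
The underlying mechanic of tools like the NYPD skills database and SkillFinder can be pictured as an inverted index from skills to people. The sketch below is purely illustrative, with hypothetical employees and fields rather than either organization’s actual schema:

```python
# A toy illustration of "searchable lived experience": an inverted index
# from each skill to the people who hold it. Names, titles and skills are
# entirely made up -- this is not the NYPD's or SkillFinder's schema.
from collections import defaultdict

employees = [
    {"name": "A. Rivera", "title": "manager",  "skills": {"beekeeping", "cantonese"}},
    {"name": "B. Chen",   "title": "director", "skills": {"cpr", "cantonese"}},
    {"name": "C. Okafor", "title": "manager",  "skills": {"cpr", "grant writing"}},
]

index = defaultdict(list)
for person in employees:
    for skill in person["skills"]:
        index[skill].append(person["name"])

# Vague titles like "manager" hide this know-how; the index surfaces it.
print(index["cantonese"])  # ['A. Rivera', 'B. Chen']
print(index["cpr"])        # ['B. Chen', 'C. Okafor']
```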

Giving people outside as well as inside institutions opportunities to share their knowledge could save time, financial resources and even lives. Take the example of PulsePoint, a smartphone app created by the fire department of San Ramon, California. Now used by 1,400 communities across the United States, PulsePoint matches those with a specific skill – CPR training – to nearby emergencies, with dramatic results.

By tapping into a feed of 911 calls, PulsePoint sends a “CPR Needed!” text message to registered members of the public – off-duty doctors, nurses, police and trained amateurs – near the victim. Effective bystander CPR administered immediately can double or triple the victim’s chance of survival. By augmenting traditional government first response, PulsePoint’s matching has already helped over 7,000 victims.
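
PulsePoint’s matching step – pairing an incident from the 911 feed with registered responders nearby – amounts to a radius query over locations. Here is a rough sketch under assumed data, field names and alert radius; it is not PulsePoint’s actual dispatch logic:

```python
# A rough sketch of the matching step a service like PulsePoint performs:
# given an incident location from the 911 feed, alert registered
# CPR-trained responders within a small radius. Coordinates, radius and
# field names are made up -- not PulsePoint's actual dispatch logic.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

responders = [  # registered members of the public, with last-known location
    {"name": "off-duty nurse",  "lat": 37.7793, "lon": -121.9780},
    {"name": "trained amateur", "lat": 37.7650, "lon": -121.9540},
]

incident = {"lat": 37.7799, "lon": -121.9780}  # parsed from the 911 feed
RADIUS_KM = 0.4  # assumed: only people close enough to arrive before EMS

for r in responders:
    if haversine_km(incident["lat"], incident["lon"], r["lat"], r["lon"]) <= RADIUS_KM:
        print(f"CPR Needed! -> alerting {r['name']}")
```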

Employers can accelerate this process by going beyond merely asking employees for HR information and instead systematically cataloging the unique skills of the individuals within their organization. In any case, many employers are turning to new technology to match employees (and would-be employees) with the right skills to available jobs. How easily they could develop and share databases with public information about who has what experience, while at the same time protecting the privacy of personal information….(More)”

50 states, 50 public records stories


From Poynter: “I try to feature journalists who are telling important stories using public records. For my final column of 2015, I wanted to do something big and decided to find public records stories from all 50 states (plus a bonus: Washington, D.C.).

This is not meant to be a “best of” list. It’s simply a collection of public records stories from the past year that intrigued me. I found many of the stories by searching the National Freedom of Information Coalition’s website, as well as Investigative Reporters & Editors…. Check out my list of public records stories from around the country and see what records journalists are requesting. It’s full of great story ideas:

Alabama

Auburn spent $1.67 million on Outback Bowl trip

(Montgomery Advertiser)

Auburn spent more than $1.6 million on its Outback Bowl trip, according to the Institutional Bowl Expense report summary submitted to the NCAA and released in response to a Freedom of Information Act request.

Alaska

KMXT sues Kodiak City for documents in police brutality case

(KMXT)

The public radio station filed suit against the City of Kodiak to get records from police after three officers handcuffed and pepper-sprayed a man with autism.

Arizona

Legislature redacts, delays and denies access to messages

(Arizona Capitol Times)

The newspaper requested electronic messages sent among top state elected officials of both parties and their top staff. But getting access to those messages was difficult.

Arkansas

Some question email deletion policies

(Arkansas News)

After the state treasurer’s office instituted a policy requiring employees to delete all emails after 30 days, critics questioned whether it was necessary and whether it was consistent with the spirit of open government.

California

Collapsed I-10 bridge given an A rating just last year

(The Desert Sun)

After a bridge collapsed on Interstate 10, the newspaper reviewed Federal Highway Administration data and found that the bridge had been given an “A” rating and one of the highest possible flood safety ratings.

Colorado

Students accuse CU-Boulder of delaying release of debate documents

(Daily Camera)

University of Colorado students accused administrators of dragging their feet on an open records request the students filed to get letters, emails and documents related to the Republican presidential debate held on campus….(More)”