Sticky-note strategy: How federal innovation labs borrow from Silicon Valley


Carten Cordell in the Federal Times: “The framework for an integrated security solution in the Philippines is built on a bedrock of sticky notes. So is the strategy for combating piracy in East Africa and a handful of other plans that Zvika Krieger is crafting in a cauldron of collaboration within the State Department.

More specifically, Krieger, a senior adviser for strategy within the department’s Bureau of Political-Military Affairs, is working in the bureau’s Strategy Lab, just one pocket of federal government where a Silicon Valley playbook for innovation is being used to develop policy solutions….

Krieger and a host of other policy thinkers learned a new way to channel innovation for policy solutions called human-centered design, or design thinking. While arguably new in government, the framework has long been in use by the tech sector to design products that will serve the needs of their customers. The strategy of group thinking towards a policy — which is more what these innovation labs seek to achieve — has been used before as well….Where the government has started to use HCD is in developing new policy solutions within a multifaceted group of stakeholders that can contribute a well-rounded slate of expertise. The product is a strategy that is developed from the creative thoughts of a team of experts, rather than a single specialized source….

The core tenet of HCD is to establish a meritocracy of ideas that is both empathetic of thought and immune to hierarchy. In order to get innovative solutions for a complex problem, Krieger forms a team of experts and stakeholders. He then mixes in outside thought leaders he calls “wild cards” to give the group outside perspective.

The delicate balance opens discussion, and the mix of ideas ultimately forms a strategy for handling the problem. That strategy might involve a technology; but it could also be a new partnership, a new function within an office, or a new acquisition program. Because the team is composed of multiple experts, it can navigate the complexity more thoroughly, and the wild cards can offer their expertise to provide solutions the stakeholders may not have considered….

Human-centered design has been working its way through pockets of the federal government for a few years now. The Office of Personnel Management opened its Innovation Lab in 2012 and was tasked with improving the USAJobs website. The Department of Health and Human Services opened the IDEA Lab in 2013 to address innovation in its mission. The Department of Veterans Affairs has a Center of Innovation to identify new approaches to meet the current and future needs of veterans, and the departments of Defense and State both have innovation labs tackling policy solutions.

The concept is gaining momentum. This fall, the Obama administration released a strategy report calling for a network of innovation labs throughout federal agencies to develop new policy solutions through HCD.

“I think the word is spreading. It’s kind of like a whisper campaign, in the most positive way,” said an administration official with knowledge of innovation labs and HCD strategies, who was not authorized to speak to the press. “I think, again, the only constraint here is that we don’t have enough of them to be able to imbue this knowledge across government. We need many more people.”

A March 2014 GAO report said that the OPM Innovation Lab had not developed consistent performance targets that would allow it to assess the success of its projects. The report recommended more consistent milestones to assess progress, which the agency addressed through a series of pilot programs….

In the State Department’s Bureau of Educational and Cultural Affairs, an innovation lab called the Collaboratory is in its second year of existence, using HCD strategies to improve projects like the Fulbright program and other educational diplomacy efforts.

The Education Diplomacy initiative, for example, used HCD to devise ways to increase education access abroad using State resources. Defining U.S. embassies as the end user, the Collaboratory then analyzed the areas of need at the installations and began crafting policies.

“We identified a couple of areas where we thought we could make substantial gains quite quickly and in a budget neutral way,” Collaboratory Deputy Director Paul Kruchoski said. The process allowed multiple stakeholders like the U.S. Agency for International Development, Peace Corps and the Department of Education to help craft the policy and create what Kruchoski called “feedback loops” to refine throughout the embassies…(More)”

 

Improving government effectiveness: lessons from Germany


Tom Gash at Global Government Forum: “All countries face their own unique challenges but advanced democracies also have much in common: the global economic downturn, aging populations, increasingly expensive health and pension spending, and citizens who remain as hard to please as ever.

At an event last week in Bavaria, attended by representatives of Bavaria’s governing party, the Christian Social Union (CSU) and their guests, it also became clear that there is a growing consensus that governments face another common problem. They have relied for too long on traditional legislation and regulation to drive change. The consensus was that simply prescribing in law what citizens and companies can and can’t do will not solve the complex problems governments are facing, that governments cannot legislate their way to improved citizen health, wealth and wellbeing….

…a number of developments …from which both UK and international policymakers and practitioners can learn to improve government effectiveness.

  1. Behavioural economics: The Behavioural Insights Team (BIT), which spun out of government in 2013 and is the subject of a new book by one of its founders and former IfG Director of Research, David Halpern, is being watched carefully by many countries abroad. Some are using its services, while others – including the New South Wales Government in Australia – are building their own skills in this area. BIT and others using similar principles have shown that using insights from social psychology – alongside an experimental approach – can help save money and improve outcomes. Well known successes include increasing the tax take through changing the wording of reminder letters (work led by another IfG alumnus, Mike Hallsworth) and increasing pension take-up through auto-enrolment.
  2. Market design: There is an emerging field of study which is examining how algorithms can be used to match people better with services they need – particularly in cases where it is unfair or morally repugnant to allow a free market to operate. Alvin Roth, the Harvard professor and Nobel prize winner, writes about these ‘matching markets’ in his book Who Gets What and Why – in which he also explains how the approach can ensure that more kidneys reach compatible recipients, and children find the right education.
  3. Big data: Large datasets can now be mined far more effectively, whether it is to analyse crime patterns to spot where police patrols might be useful or to understand crowd flows on public transport. The use of real-time information allows far more sophisticated deployment of public sector resources, better targeted at demand and need, and better tailored to individual preferences.
  4. Transparency: Transparency has the potential to enhance both the accountability and effectiveness of governments across the world – as shown in our latest Whitehall Monitor Annual Report. The UK government is considered a world-leader for its transparency – but there are still areas where progress has stalled, including in transparency over the costs and performance of privately provided public services.
  5. New management models: There is a growing realisation that new methods are best harnessed when supported by effective management. The Institute’s work on civil service reform highlights a range of success factors from past reforms in the UK – and the benefits of clear mechanisms for setting priorities and sticking to them, as is being attempted by the government’s new(ish) Implementation Taskforces and the Departmental Implementation Units currently cropping up across Whitehall. I looked overseas for a different model that clearly aligns government activities behind citizens’ concerns – in this case the example of the single non-emergency number system operating in New York City and elsewhere. This system supports a powerful, highly responsive, data-driven performance management regime. But like many performance management regimes it can risk a narrow and excessively short-term focus – so such tools must be combined with the mind-set of system stewardship that the Institute has long championed in its policymaking work.
  6. Investment in new capability: It is striking that all of these developments are supported by technological change and research insights developed outside government. But to embed new approaches in government, there appear to be benefits to incubating new capacity, either in specialist departmental teams or at the centre of government….(More)”
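The matching-market approach in point 2 rests on deferred-acceptance algorithms of the kind Roth and Shapley were recognised for. A minimal sketch in Python is below; the structure (students proposing to schools) and all names and preference lists are invented for illustration, not taken from any real programme:

```python
from collections import deque

def stable_match(proposer_prefs, reviewer_prefs):
    """Gale-Shapley deferred acceptance (equal numbers of proposers/reviewers).

    proposer_prefs: {proposer: [reviewers, most preferred first]}
    reviewer_prefs: {reviewer: [proposers, most preferred first]}
    Returns a stable matching {proposer: reviewer}.
    """
    # Precompute each reviewer's ranking of proposers for O(1) comparisons.
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = deque(proposer_prefs)                  # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}  # next reviewer to propose to
    engaged = {}                                  # reviewer -> current proposer

    while free:
        p = free.popleft()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                        # reviewer accepts tentatively
        elif rank[r][p] < rank[r][engaged[r]]:    # reviewer prefers newcomer
            free.append(engaged[r])               # old match is bumped
            engaged[r] = p
        else:
            free.append(p)                        # rejected; tries next choice
    return {p: r for r, p in engaged.items()}
```

On a toy instance the result is stable in Roth's sense: no student and school both prefer each other to their assigned partners, which is the property that makes such mechanisms usable where prices cannot do the allocating.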

Open government data and why it matters


Australian Government: “This was a key focus of the Prime Minister’s $1.1 billion innovation package announced this month.

The Bureau of Communications Research (BCR) today released analysis of the impact of open government data, revealing its potential to generate up to $25 billion per year, or 1.5 per cent of Australia’s GDP.

‘In Australia, users can already access and re-use more than 7000 government data sets published on data.gov.au,’ said Dr Paul Paterson, Chief Economist and Head of the Bureau of Communications Research (BCR).

‘Some of the high-value data sets include geospatial/mapping data, health data, transport data, mining data, environmental data, demographics data, and real-time emergency data.

‘Many Australians are unaware of the flow-on benefits from open government data as a result of the increased innovation and informed choice it creates. For example open data has the power to generate new careers, more efficient government revenues, improved business practices, and drive better public engagement.’…(More)”

Give Up Your Data to Cure Disease


David B. Agus in The New York Times: “How far would you go to protect your health records? Your privacy matters, of course, but consider this: Mass data can inform medicine like nothing else and save countless lives, including, perhaps, your own.

Over the past several years, using some $30 billion in federal stimulus money, doctors and hospitals have been installing electronic health record systems. ….Yet neither doctors nor patients are happy. Doctors complain about the time it takes to update digital records, while patients worry about confidentiality…

We need to get over it. These digital databases offer an incredible opportunity to examine trends that will fundamentally change how doctors treat patients. They will help develop cures, discover new uses for drugs and better track the spread of scary new illnesses like the Zika virus….

Case in point: Last year, a team led by researchers at the MD Anderson Cancer Center and Washington University found that a common class of heart drugs called beta blockers, which block the effects of adrenaline, may prolong ovarian cancer patients’ survival. This discovery came after the researchers reviewed more than 1,400 patient records, and identified an obvious pattern among those with ovarian cancer who were using beta blockers, most often to control their blood pressure. Women taking earlier versions of this class of drug typically lived for almost eight years after their cancer diagnosis, compared with just three and a half years for the women not taking any beta blocker….

We need to move past that. For one thing, more debate over data sharing is already leading to more data security. Last month a bill was signed into law calling for the Department of Health and Human Services to create a health care industry cybersecurity task force, whose members would hammer out new voluntary standards.

New technologies — and opportunities — come with unprecedented risks and the need for new policies and strategies. We must continue to improve our encryption capabilities and other methods of data security and, most important, mandate that they are used. The hack of the Anthem database last year, for instance, which allowed 80 million personal records to be accessed, was shocking not only for the break-in, but for the lack of encryption….

Medical research is making progress every day, but the next step depends less on scientists and doctors than it does on the public. Each of us has the potential to be part of tomorrow’s cures. (More)”

The Promise and Perils of Open Medical Data


Sharona Hoffman at the Hastings Center: “Not long ago I visited the Personal Genome Project’s website. The PGP describes its mission as “creating public genome, health, and trait data.” In the “Participant Profiles” section, I found several entries that disclosed the names of individuals along with their date of birth, sex, weight, height, blood type, race, health conditions, medications, allergies, medical procedures, and more. Other profiles did not feature names but provided all of the other details. I had no special access to this information. It is available to absolutely anyone with Internet access. The PGP is part of a trend known as “open data.” Many government and private entities have launched initiatives to compile very large data resources (also known as “big data”) and to make them available to the public. President Obama himself has endorsed open data by issuing a May 2013 executive order directing that, to the extent permitted by law, the federal government must release its data to the public in forms that make it easy to locate, access, and use.

Read more: http://www.thehastingscenter.org/Publications/HCR/Detail.aspx?id=7731

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
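HealthIntell’s internal models are not public, but the simplest version of the daily-and-hourly prediction it describes is a seasonal profile: average past attendances for each weekday-and-hour slot and use that as the baseline forecast for the same slot in future weeks. A minimal sketch, with an invented data layout:

```python
from collections import defaultdict

def hourly_profile(history):
    """Average A&E attendances per (weekday, hour) slot.

    history: iterable of (weekday, hour, attendances) tuples drawn from
    past records (the field layout here is assumed for illustration).
    Returns {(weekday, hour): mean attendances} -- a naive seasonal
    forecast for the matching slot in a future week.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for weekday, hour, n in history:
        totals[(weekday, hour)] += n
        counts[(weekday, hour)] += 1
    return {slot: totals[slot] / counts[slot] for slot in totals}
```

Production systems layer weather, public events and trend adjustments on top of a baseline like this, but even the bare profile surfaces the peaks and troughs the trust needs for staffing decisions.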

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services is key to efficiency gains, using data that is already available. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – are raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.
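The LFB’s actual model and its roughly 60 inputs are not published. As a rough illustration of the general technique, a logistic-style score can collapse household attributes into one risk number used to rank homes for prevention visits; every feature name and weight below is invented:

```python
import math

def fire_risk_score(features, weights, bias=-3.0):
    """Toy household fire-risk score in (0, 1).

    features: {attribute: value} for one household.
    weights: {attribute: coefficient}. Both are purely illustrative --
    the real LFB model's inputs and coefficients are not public.
    """
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash into (0, 1)

def rank_households(households, weights):
    """Order households (feature dicts) from highest to lowest risk."""
    return sorted(households,
                  key=lambda h: fire_risk_score(h, weights),
                  reverse=True)
```

The targeting described in the article then amounts to working down the ranked list: visits go first to the households at the top.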

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”

 

7 Ways Local Governments Are Getting Creative with Data Mapping


Ben Miller at GovTech:  “As government data collection expands, and as more of that data becomes publicly available, more people are looking to maps as a means of expressing the information.

And depending on the type of application, a map can be useful for both the government and its constituents. Many maps help government servants operate more efficiently and save money, while others will answer residents’ questions so they don’t have to call a government worker for the answer…..

Here are seven examples of state and local governments using maps to help themselves and the people they serve.

1. DISTRICT OF COLUMBIA, IOWA GET LOCAL AND CURRENT WITH THE WEATHER

Washington, D.C. snow plow map

As Winter Storm Jonas was busy dropping nearly 30 inches of snow on the nation’s capital, officials in D.C. were working to clear it. And thanks to a mapping application they launched, citizens could see exactly how the city was going about that business.

The District of Columbia’s snow map lets users enter an address, and then shows what snow plows did near that address within a given range of days. The map also shows where the city received 311 requests for snow removal and gives users a chance to look at recent photos from road cameras showing driving conditions…..

2. LOS ANGELES MAPS EL NIÑO RESOURCES, TRENDS

El Niño Watch map

Throughout the winter, weather monitoring experts warned the public time and again that an El Niño system was brewing in the Pacific Ocean that looked to be one of the largest, if not the largest, ever. That would mean torrents of rain for a parched state that’s seen mudslides and flooding during storms in the past.

So to prepare its residents, the city of Los Angeles published a map in January that lets users see both decision-informing trends and the location of resources. Using the application, one can toggle layers that let them know what the weather is doing around the city, where traffic is backed up, where the power is out, where they can find sand bags to prevent flood damage and more….

3. CALIFORNIA DIVES DEEP INTO AIR POLLUTION RISKS

CalEnviroScreen

….So, faced with a legislative mandate to identify disadvantaged communities, the California Office of Environmental Health Hazard Assessment decided that it wouldn’t just examine smog levels — it would also take a look at the prevalence of at-risk people across the state.

The result is a series of three maps, the first two examining the two factors separately and the third combining them. That allows the state and its residents to see the places where air pollution poses the greatest risk to the people most vulnerable to it….

4. STREAMLINING RESIDENT SERVICE INFORMATION

Manassas curbside pickup map

The city of Manassas, Va., relied on an outdated paper map and a long-time, well-versed staffer to answer questions about municipal curbside pickup services until it launched this map in 2014. The map allows users to enter their address, and then gives them easy-to-read information about when to put out various things on their curb for pickup.

That’s useful because the city’s fall leaf collection schedule changes every year. So the map not only acts as a benefit to residents who want information, but to city staff who don’t have to deal with as many calls.

The map also shows users the locations of resources they can use and gives them city phone numbers in case they still have questions, and displays it all in a popup pane at the bottom of the map.

5. PLACING TOOLS IN THE HANDS OF THE PUBLIC

A lot of cities and counties have started publishing online maps showing city services and releasing government data.

But Chicago, Boston and Philadelphia stand out as examples of maps that take the idea one step further — because each one offers a staggering amount of choices for users.

Chicago’s new OpenGrid map, just launched in January, is a versatile map that lets users search for certain data like food inspection reports, street closures, potholes and more. That’s enough to answer a lot of questions, but what adds even more utility is the map’s various narrowing tools. Users can narrow searches to a zip code, or they can draw a shape on the map and only see results within that shape. They can perform sub-searches within results and they can choose how they’d like to see the data displayed.
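OpenGrid’s draw-a-shape filter comes down to a point-in-polygon test: keep only the records whose coordinates fall inside the user’s sketch. The standard ray-casting version fits in a dozen lines; this is a sketch of the general technique, not OpenGrid’s actual code:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) strictly inside the polygon?

    polygon: list of (x, y) vertices in order (clockwise or not).
    Casts a horizontal ray to the right and counts edge crossings;
    an odd count means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the ray's y-level?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that y-level.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Filtering a dataset to a drawn shape is then one comprehension, e.g. `[r for r in reports if point_in_polygon(r['x'], r['y'], shape)]` (field names assumed).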

Philadelphia’s platform makes use of buttons, icons and categories to help users sift through the spatially-enabled data available to them. Options include future lane closures, bicycle paths, flu shots, city resources, parks and more.

Boston’s platform is open for users to submit their own maps. And submit they have. The city portal offers everything from maps of bus stops to traffic data pulled from the Waze app.

6. HOUSTON TRANSFORMS SERVICE REQUEST DATA

Houston 311 service request map

A 311 service functions as a means of bringing problems to city staff’s attention. But the data itself only goes so far — it needs interpretation.

Houston’s 311 service request map helps users easily analyze the data so as to spot trends. The tool offers lots of ways to narrow data down, and can isolate many different kinds of request so users can see whether one problem is reported more often in certain areas.

7. GUIDING BUSINESS GROWTH

For the last several years, the city of Rancho Cucamonga, Calif., has been designing all sorts of maps through its Rancho Enterprise Geographic Information Systems (REGIS) project. Many of them have served specific city purposes, such as tracking code enforcement violations and offering police a command system tool for special events.

The utilitarian foundation of REGIS extends to its public-facing applications as well. One example is INsideRancho, a map built with economic development efforts in mind. The map lets users search and browse available buildings to suit business needs, narrowing results by square footage, zoning and building type. Users can also find businesses by name or address, and look at property exteriors via an embedded connection with Google Street View….(More)”

The Crusade Against Multiple Regression Analysis


Richard Nisbett at the Edge: (VIDEO) “…The thing I’m most interested in right now has become a kind of crusade against correlational statistical analysis—in particular, what’s called multiple regression analysis. Say you want to find out whether taking Vitamin E is associated with lower prostate cancer risk. You look at the correlational evidence and indeed it turns out that men who take Vitamin E have lower risk for prostate cancer. Then someone says, “Well, let’s see if we do the actual experiment, what happens.” And what happens when you do the experiment is that Vitamin E contributes to the likelihood of prostate cancer. How could there be differences? These happen a lot. The correlational—the observational—evidence tells you one thing, the experimental evidence tells you something completely different.

In the case of health data, the big problem is something that’s come to be called the healthy user bias, because the guy who’s taking Vitamin E is also doing everything else right. A doctor or an article has told him to take Vitamin E, so he does that, but he’s also the guy who’s watching his weight and his cholesterol, gets plenty of exercise, drinks alcohol in moderation, doesn’t smoke, has a high level of education, and a high income. All of these things are likely to make you live longer, to make you less subject to morbidity and mortality risks of all kinds. You pull one thing out of that correlate and it’s going to look like Vitamin E is terrific because it’s dragging all these other good things along with it.
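The healthy-user bias Nisbett describes is easy to reproduce in a simulation: give a hidden “healthy lifestyle” variable a positive effect on taking the vitamin and a negative effect on disease, make the vitamin itself slightly harmful, and the naive comparison still flatters the vitamin. All probabilities below are invented for illustration:

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Simulate a hidden 'healthy lifestyle' confounder.

    Healthy people are more likely to take the vitamin and less likely
    to get the disease; the vitamin itself slightly RAISES risk.
    """
    rows = []
    for _ in range(n):
        healthy = random.random() < 0.5
        takes_vitamin = random.random() < (0.8 if healthy else 0.2)
        p_disease = (0.05 if healthy else 0.20) + (0.02 if takes_vitamin else 0.0)
        rows.append((healthy, takes_vitamin, random.random() < p_disease))
    return rows

def disease_rate(rows, keep):
    subset = [sick for healthy, vit, sick in rows if keep(healthy, vit)]
    return sum(subset) / len(subset)

rows = simulate()
# Naive (confounded) comparison: vitamin takers look protected...
naive_taker = disease_rate(rows, lambda h, v: v)
naive_nontaker = disease_rate(rows, lambda h, v: not v)
# ...but stratifying on the confounder shows takers doing worse.
healthy_taker = disease_rate(rows, lambda h, v: h and v)
healthy_nontaker = disease_rate(rows, lambda h, v: h and not v)
```

The naive comparison shows takers with a lower disease rate; holding the hidden lifestyle variable fixed reverses the verdict — which is exactly why the randomised experiments disagree with the observational data.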

This is not, by any means, limited to health issues. A while back, I read a government report in The New York Times on the safety of automobiles. The measure that they used was the deaths per million drivers of each of these autos. It turns out that, for example, there are enormously more deaths per million drivers who drive Ford F150 pickups than for people who drive Volvo station wagons. Most people’s reaction, and certainly my initial reaction to it was, “Well, it sort of figures—everybody knows that Volvos are safe.”

Let’s describe two people and you tell me who you think is more likely to be driving the Volvo and who is more likely to be driving the pickup: a suburban matron in the New York area and a twenty-five-year-old cowboy in Oklahoma. It’s obvious that people are not assigned their cars. We don’t say, “Billy, you’ll be driving a powder blue Volvo station wagon.” Because of this self-selection problem, you simply can’t interpret data like that. You know virtually nothing about the relative safety of cars based on that study.

I saw in The New York Times recently an article by a respected writer reporting that people who have elaborate weddings tend to have marriages that last longer. How would that be? Maybe it’s just all the darned expense and bother—you don’t want to get divorced. It’s a cognitive dissonance thing.

Let’s think about who makes elaborate plans for expensive weddings: people who are better off financially, which is by itself a good prognosis for marriage; people who are more educated, also a better prognosis; people who are richer; people who are older—the later you get married, the more likelihood that the marriage will last, and so on.

The truth is you’ve learned nothing. It’s like saying men who are a Somebody III or IV have longer-lasting marriages. Is it because of the suffix there? No, it’s because those people are the types who have a good prognosis for a lengthy marriage.

A huge range of science projects are done with multiple regression analysis. The results are often somewhere between meaningless and quite damaging….(More)

What Is Citizen Science? – A Scientometric Meta-Analysis


Christopher Kullenberg and Dick Kasperowski at PLOS One: “The concept of citizen science (CS) is currently referred to by many actors inside and outside science and research. Several descriptions of this purportedly new approach of science are often heard in connection with large datasets and the possibilities of mobilizing crowds outside science to assist with observations and classifications. However, other accounts refer to CS as a way of democratizing science, aiding concerned communities in creating data to influence policy and as a way of promoting political decision processes involving environment and health.

Objective

In this study we analyse two datasets (N = 1935, N = 633) retrieved from the Web of Science (WoS) with the aim of giving a scientometric description of what the concept of CS entails. We account for its development over time and for which strands of research have adopted CS, and give an assessment of what scientific output has been achieved in CS-related projects. To attain this, scientometric methods have been combined with qualitative approaches to render more precise search terms.

Results

Results indicate that there are three main focal points of CS. The largest is composed of research on biology, conservation and ecology, and utilizes CS mainly as a methodology of collecting and classifying data. A second strand of research has emerged through geographic information research, where citizens participate in the collection of geographic data. Thirdly, there is a line of research relating to the social sciences and epidemiology, which studies and facilitates public participation in relation to environmental issues and health. In terms of scientific output, the largest body of articles is to be found in biology and conservation research. In absolute numbers, the number of publications generated by CS is low (N = 1935), but over the past decade a new and very productive line of CS based on digital platforms has emerged for the collection and classification of data….(More)”

How Measurement Fails Doctors and Teachers


Robert M. Wachter at the New York Times: “Two of our most vital industries, health care and education, have become increasingly subjected to metrics and measurements. Of course, we need to hold professionals accountable. But the focus on numbers has gone too far. We’re hitting the targets, but missing the point.

Through the 20th century, we adopted a hands-off approach, assuming that the pros knew best. Most experts believed that the ideal “products” — healthy patients and well-educated kids — were too strongly influenced by uncontrollable variables (the sickness of the patient, the intellectual capacity of the student) and were too complex to be judged by the measures we use for other industries.

By the early 2000s, as evidence mounted that both fields were producing mediocre outcomes at unsustainable costs, the pressure for measurement became irresistible. In health care, we saw hundreds of thousands of deaths from medical errors, poor coordination of care and backbreaking costs. In education, it became clear that our schools were lagging behind those in other countries.

So in came the consultants and out came the yardsticks. In health care, we applied metrics to outcomes and processes. Did the doctor document that she gave the patient a flu shot? That she counseled the patient about smoking? In education, of course, the preoccupation became student test scores.

All of this began innocently enough. But the measurement fad has spun out of control. There are so many different hospital ratings that more than 1,600 medical centers can now lay claim to being included on a “top 100,” “honor roll,” grade “A” or “best” hospitals list. Burnout rates for doctors top 50 percent, far higher than in other professions. A 2013 study found that the electronic health record was a dominant culprit. Another 2013 study found that emergency room doctors clicked a mouse 4,000 times during a 10-hour shift. The computer systems have become the dark force behind quality measures.

Education is experiencing its own version of measurement fatigue. Educators complain that the focus on student test performance comes at the expense of learning. Art, music and physical education have withered because, really, why bother if they’re not on the test?…

Thoughtful and limited assessment can be effective in motivating improvements and innovations, and in weeding out the rare but disproportionately destructive bad apples.

But in creating a measurement and accountability system, we need to tone down the fervor and think harder about the unanticipated consequences….(More)”