How Measurement Fails Doctors and Teachers


Robert M. Wachter at the New York Times: “Two of our most vital industries, health care and education, have become increasingly subjected to metrics and measurements. Of course, we need to hold professionals accountable. But the focus on numbers has gone too far. We’re hitting the targets, but missing the point.

Through the 20th century, we adopted a hands-off approach, assuming that the pros knew best. Most experts believed that the ideal “products” — healthy patients and well-educated kids — were too strongly influenced by uncontrollable variables (the sickness of the patient, the intellectual capacity of the student) and were too complex to be judged by the measures we use for other industries.

By the early 2000s, as evidence mounted that both fields were producing mediocre outcomes at unsustainable costs, the pressure for measurement became irresistible. In health care, we saw hundreds of thousands of deaths from medical errors, poor coordination of care and backbreaking costs. In education, it became clear that our schools were lagging behind those in other countries.

So in came the consultants and out came the yardsticks. In health care, we applied metrics to outcomes and processes. Did the doctor document that she gave the patient a flu shot? That she counseled the patient about smoking? In education, of course, the preoccupation became student test scores.

All of this began innocently enough. But the measurement fad has spun out of control. There are so many different hospital ratings that more than 1,600 medical centers can now lay claim to being included on a “top 100,” “honor roll,” grade “A” or “best” hospitals list. Burnout rates for doctors top 50 percent, far higher than in other professions. A 2013 study found that the electronic health record was a dominant culprit. Another 2013 study found that emergency room doctors clicked a mouse 4,000 times during a 10-hour shift. The computer systems have become the dark force behind quality measures.

Education is experiencing its own version of measurement fatigue. Educators complain that the focus on student test performance comes at the expense of learning. Art, music and physical education have withered, because, really, why bother if they’re not on the test?…

Thoughtful and limited assessment can be effective in motivating improvements and innovations, and in weeding out the rare but disproportionately destructive bad apples.

But in creating a measurement and accountability system, we need to tone down the fervor and think harder about the unanticipated consequences….(More)”


Distributed ledger technology: beyond block chain


UK Government Office for Science: “In a major report on distributed ledgers published today (19 January 2016), the Government Chief Scientist, Sir Mark Walport, sets out how this technology could transform the delivery of public services and boost productivity.

A distributed ledger is a database that can securely record financial, physical or electronic assets for sharing across a network through entirely transparent updates of information.

Its first incarnation was ‘Blockchain’ in 2008, which underpinned digital cash systems such as Bitcoin. The technology has now evolved into a variety of models that can be applied to different business problems and dramatically improve the sharing of information.
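To make the underlying idea concrete, here is a minimal Python sketch of the data structure that gives a blockchain-style ledger its tamper-evidence: each record embeds a cryptographic hash of its predecessor, so any retroactive edit breaks the chain. This is an illustrative toy, not the design of Bitcoin or of any system discussed in the report, and it omits the distributed consensus that real ledgers add on top.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """A toy append-only ledger: every entry is chained to the one before it."""

    def __init__(self):
        self.chain = [{"index": 0, "timestamp": 0.0,
                       "data": "genesis", "prev_hash": "0" * 64}]

    def append(self, data) -> dict:
        block = {"index": len(self.chain),
                 "timestamp": time.time(),
                 "data": data,
                 "prev_hash": block_hash(self.chain[-1])}  # link to predecessor
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        """Recompute every link; a retroactive edit anywhere breaks the chain."""
        return all(self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.append({"asset": "parcel-42", "owner": "alice"})
ledger.append({"asset": "parcel-42", "owner": "bob"})
assert ledger.verify()
ledger.chain[1]["data"]["owner"] = "mallory"   # tamper with history
assert not ledger.verify()                     # the tampering is detectable
```

The “entirely transparent updates” in the definition above come from every participant holding a copy of such a chain and being able to run the same verification.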

Distributed ledger technology could provide government with new tools to reduce fraud, error and the cost of paper intensive processes. It also has the potential to provide new ways of assuring ownership and provenance for goods and intellectual property.

Distributed ledgers are already being used in the diamond markets and in the disbursing of international aid payments.

Sir Mark Walport said:

Distributed ledger technology has the potential to transform the delivery of public and private services. It has the potential to redefine the relationship between government and the citizen in terms of data sharing, transparency and trust and make a leading contribution to the government’s digital transformation plan.

Any new technology creates challenges, but with the right mix of leadership, collaboration and sound governance, distributed ledgers could yield significant benefits for the UK.

The report makes a number of recommendations which focus on ministerial leadership, research, standards and the need for proof of concept trials.

They include:

  • government should provide ministerial leadership to ensure that it provides the vision, leadership and the platform for distributed ledger technology within government; this group should consider governance, privacy, security and standards
  • government should establish trials of distributed ledgers in order to assess the technology’s usability within the public sector
  • government could support the creation of distributed ledger demonstrators for local government that will bring together all the elements necessary to test the technology and its application
  • the UK research community should invest in the research required to ensure that distributed ledgers are scalable, secure and provide proof of correctness of their contents….View the report ‘Distributed ledger technology: beyond block chain’.”

The impact of open access scientific knowledge


Jack Karsten and Darrell M. West at Brookings: “In spite of technological advancements like the Internet, academic publishing has operated in much the same way for centuries. Scientists voluntarily review their peers’ papers for little or no compensation; the paper’s author likewise does not receive payment from academic publishers. Though most of the costs of publishing a journal are administrative, the cost of subscribing to scientific journals nevertheless increased 600 percent between 1984 and 2002. The funding for the research libraries that form the bulk of journal subscribers has not kept pace, leading to campaigns at universities including Harvard to boycott for-profit publishers.

Though the Internet has not yet brought down the price of academic journal subscriptions, it has led to some interesting alternatives. In 2015, the Twitter hashtag #icanhazPDF was created to request copies of papers located behind paywalls. Anyone with access to a specific paper can download it and then e-mail it to the requester. The practice violates the copyright of publishers, but puts papers within reach of researchers who would otherwise not be able to read them. If a researcher cannot read a journal article in the first place, they cannot go on to cite it, and it is citations that raise the profile of an article and of the journal that published it. The publisher is caught between two conflicting goals: increasing the number of citations for its articles and earning revenue to stay in business.

Thinking outside the journal

A trio of University of Chicago researchers examines this issue through the lens of Wikipedia in a paper titled “Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science.” Wikipedia makes a compelling subject for scientific diffusion given its status as one of the most visited websites in the world, attracting 374 million unique visitors monthly as of September 2015. The study found that on English language articles, Wikipedia editors are 47 percent more likely to cite an article from an open access journal. Anyone using Wikipedia as a first source for information on a subject is more likely to read information from open access journals. If readers click through the links to cited articles, they can read the actual text of these open access journal articles.

Given how much the federal government spends on scientific research ($66 billion on nondefense R&D in 2015), it has a large role to play in the diffusion of scientific knowledge. Since 2008, the National Institutes of Health (NIH) has required researchers who publish in academic journals to also deposit their papers in PubMed Central, an online open access repository. Expanding provisions like the NIH Public Access Policy to other agencies and to recipients of federal grants at universities would give the public and other researchers a wealth of scientific information. Scientific literacy, even on cutting-edge research, is increasingly important when science informs policy on major issues such as climate change and health care….(More)”

Systematic Thinking for Social Action


Re-issued book by Alice M. Rivlin: “In January 1970 Alice M. Rivlin spoke to an audience at the University of California–Berkeley. The topic was developing a more rational approach to decision-making in government. If digital video, YouTube, and TED Talks had been inventions of the 1960s, Rivlin’s talk would have been a viral hit. As it was, the resulting book, Systematic Thinking for Social Action, spent years on the Brookings Press bestseller list. It is a very personal and conversational volume about the dawn of new ways of thinking about government.

As a deputy assistant secretary for program coordination, and later as assistant secretary for planning and evaluation, at the Department of Health, Education and Welfare from 1966 to 1969, Rivlin was an early advocate of systems analysis, which had been introduced by Robert McNamara at the Department of Defense as PPBS (planning-programming-budgeting system).

While Rivlin brushes aside the jargon, she digs into the substance of systematic analysis and a “quiet revolution in government.” In an evaluation of the evaluators, she issues mixed grades, pointing out where analysts had been helpful in finding solutions and where—because of inadequate data or methods—they had been no help at all.

Systematic Thinking for Social Action offers important insights for anyone interested in working to find the smartest ways to allocate scarce funds to promote the maximum well-being of all citizens.

This reissue is part of Brookings Classics, a series of republished books that allows readers to revisit or discover notable earlier works from the Brookings Institution Press.

Chicago Is Predicting Food Safety Violations. Why Aren’t Other Cities?


Julian Spector at CityLab: “The three dozen inspectors at the Chicago Department of Public Health scrutinize 16,000 eating establishments to protect diners from gut-bombing food sickness. Some of those pose more of a health risk than others; approximately 15 percent of inspections catch a critical violation.

For years, Chicago, like most every city in the U.S., scheduled these inspections by going down the complete list of food vendors and making sure they all had a visit in the mandated timeframe. That process ensured that everyone got inspected, but not that the most likely health code violators got inspected first. And speed matters in this case. Every day that unsanitary vendors serve food is a new chance for diners to get violently ill, paying in time, pain, and medical expenses.

That’s why, in 2014, Chicago’s Department of Innovation and Technology started sifting through publicly available city data and built an algorithm to predict which restaurants were most likely to be in violation of health codes, based on the characteristics of previously recorded violations. The program generated a ranked list of which establishments the inspectors should look at first. The project is notable not just because it worked—the algorithm identified violations significantly earlier than business as usual did—but because the team made it as easy as possible for other cities to replicate the approach.
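The approach Chicago took is, at its core, standard supervised learning: train a classifier on the outcomes of past inspections, then rank not-yet-inspected establishments by predicted risk so inspectors visit the likeliest violators first. Chicago’s open-sourced code is written in R; the Python sketch below is only a hedged illustration of the general technique, and the feature names are hypothetical stand-ins, not the city’s actual predictors.

```python
# A minimal sketch of risk-ranked inspection scheduling, assuming a table of
# past inspections with a binary outcome column "critical_violation".
# Feature names are hypothetical, not Chicago's actual ones.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

FEATURES = ["days_since_last_inspection", "past_critical_violations",
            "nearby_sanitation_complaints", "license_age_days"]

def train_risk_model(history: pd.DataFrame) -> LogisticRegression:
    """Fit a classifier on historical inspections (1 = critical violation)."""
    X_train, X_test, y_train, y_test = train_test_split(
        history[FEATURES], history["critical_violation"], random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
    return model

def rank_inspections(model: LogisticRegression,
                     pending: pd.DataFrame) -> pd.DataFrame:
    """Order uninspected establishments by predicted violation risk."""
    ranked = pending.copy()
    ranked["risk"] = model.predict_proba(ranked[FEATURES])[:, 1]
    return ranked.sort_values("risk", ascending=False)
```

The ranked list is the whole intervention: the same inspectors make the same number of visits, just in a smarter order.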

And yet, more than a year after Chicago published its code, only one local government, in metro D.C., has tried to do the same thing. All cities face the challenge of keeping their food safe and therefore have much to gain from this data program. The challenge, then, isn’t just to design data solutions that work, but to do so in a way that facilitates sharing them with other cities. The Chicago example reveals the obstacles that might prevent a good urban solution from spreading to other cities, but also how to overcome them….(More)”

Met Office warns of big data floods on the horizon


At V3: “The amount of data being collected by departments and agencies means government services will not be able to implement truly open data strategies, according to Met Office CIO Charles Ewen.

Ewen said the rapidly increasing amount of data being stored by companies and government departments means it will not be technologically possible to share all their data in the near future.

During a talk at the Cloud World Forum on Wednesday, he said: “The future will be bigger and bigger data. Right now we’re talking about petabytes, in the near future it will be tens of petabytes, then soon after it’ll be hundreds of petabytes and then we’ll be off into imaginary figure titles.

“We see a future where data has gotten so big the notion of open data and the idea ‘let’s share our data with everybody and anybody’ just won’t work. We’re struggling to make it work already and by 2020 the national infrastructure will not exist to shift this stuff [data] around in the way anybody could access and make use of it.”
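Ewen’s infrastructure claim is easy to sanity-check with rough arithmetic. The sketch below is a back-of-the-envelope in Python, assuming an idealised, fully saturated 10 Gbit/s link (real-world throughput would be lower); the figures are illustrative, not the Met Office’s.

```python
# Rough transfer-time arithmetic behind the "can't shift this stuff around"
# point. Assumes a perfectly saturated 10 Gbit/s link, which is optimistic.
PETABYTE_BITS = 8 * 10**15      # one petabyte, expressed in bits
LINK_BPS = 10 * 10**9           # 10 Gbit/s
SECONDS_PER_DAY = 86_400

for petabytes in (1, 10, 100):
    days = petabytes * PETABYTE_BITS / LINK_BPS / SECONDS_PER_DAY
    print(f"{petabytes:>4} PB  ->  {days:7.1f} days ({days / 365:.2f} years)")
```

Even at 100 PB, a single dedicated 10 Gbit/s pipe would be busy for roughly two and a half years, which is the scale of problem behind the curation model he describes next.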

Ewen added that to deal with the shift he expects many departments and agencies will adapt their processes to become digital curators that are more selective about the data they share, to try to ensure it is useful.

“This isn’t us wrapping our arms around our data and saying you can’t see it. We just don’t see how we can share all this big data in the way you would want it,” he said.

“We see a future where a select number of high-capacity nodes become information brokers and are used to curate and manage data. These curators will be where people bring their problems. That’s the future we see.”

Ewen added that the current expectations around open data are based on misguided views about the capabilities of cloud technology to host and provide access to huge amounts of data.

“The trendy stuff out there claims to be great at everything, but don’t get carried away. We don’t see cloud as anything but capability. We’ve been using appropriate IT and what’s available to deliver our mission services for over 50 to 60 years, and cloud is playing an increasing part of that, but purely for increased capability,” he said.

“It’s just another tool. The important thing is having the skill and knowledge to not just believe vendors but to look and identify the problem and say ‘we have to solve this’.”

The Met Office CIO’s comments follow reports from other government service providers that people’s desire for open data is growing exponentially….(More)”

Crowdsourcing Diagnosis for Patients With Undiagnosed Illnesses: An Evaluation of CrowdMed


Paper by Ashley N. D. Meyer et al. in the Journal of Medical Internet Research: “Background: Despite visits to multiple physicians, many patients remain undiagnosed. A new online program, CrowdMed, aims to leverage the “wisdom of the crowd” by giving patients an opportunity to submit their cases and interact with case solvers to obtain diagnostic possibilities.

Objective: To describe CrowdMed and provide an independent assessment of its impact.

Methods: Patients submit their cases online to CrowdMed and case solvers sign up to help diagnose patients. Case solvers attempt to solve patients’ diagnostic dilemmas and often have an interactive online discussion with patients, including an exchange of additional diagnostic details. At the end, patients receive detailed reports containing diagnostic suggestions to discuss with their physicians and fill out surveys about their outcomes. We independently analyzed data collected from cases between May 2013 and April 2015 to determine patient and case solver characteristics and case outcomes.

Results: During the study period, 397 cases were completed. These patients previously visited a median of 5 physicians, incurred a median of US $10,000 in medical expenses, spent a median of 50 hours researching their illnesses online, and had symptoms for a median of 2.6 years. During this period, 357 active case solvers participated, of which 37.9% (132/348) were male and 58.3% (208/357) worked or studied in the medical industry. About half (50.9%, 202/397) of patients were likely to recommend CrowdMed to a friend, 59.6% (233/391) reported that the process gave insights that led them closer to the correct diagnoses, 57% (52/92) reported estimated decreases in medical expenses, and 38% (29/77) reported estimated improvement in school or work productivity.

Conclusions: Some patients with undiagnosed illnesses reported receiving helpful guidance from crowdsourcing their diagnoses during their difficult diagnostic journeys. However, further development and use of crowdsourcing methods to facilitate diagnosis requires long-term evaluation as well as validation to account for patients’ ultimate correct diagnoses….(More)”

Hacking the streets: ‘Smart’ writing in the smart city


Spencer Jordan at FirstMonday: “Cities have always been intimately bound up with technology. As important nodes within commercial and communication networks, cities became centres of sweeping industrialisation that affected all facets of life (Mumford, 1973). Alienation and estrangement became key characteristics of modernity, Mumford famously noting the “destruction and disorder within great cities” during the long nineteenth century. The increasing use of digital technology is yet another chapter in this process, exemplified by the rise of the ‘smart city’. Although there is no agreed definition, smart cities are understood to be those in which digital technology helps regulate, run and manage the city (Caragliu et al., 2009). This article argues that McQuire’s definition of ‘relational space’, what he understands as the reconfiguration of urban space by digital technology, is critical here. Although some see the impact of digital technology on the urban environment as deepening social exclusion and isolation (Virilio, 1991), others, such as de Waal, perceive digital technology in a more positive light. What is certainly clear, however, is that the city is once again undergoing rapid change. As Varnelis and Friedberg note, “place … is in a process of a deep and contested transformation”.

If the potential benefits from digital technology are to be maximised it is necessary that the relationship between the individual and the city is understood. This paper examines how digital technology can support and augment what de Certeau calls spatial practice, specifically in terms of constructions of ‘home’ and ‘belonging’ (de Certeau, 1984). The very act of walking is itself an act of enunciation, a process by which the city is instantiated; yet, as de Certeau and Bachelard remind us, the city is also wrought from the stories we tell, the narratives we construct about that space (de Certeau, 1984; Bachelard, 1994). The city is thus envisioned both through physical exploration but also language. As Turchi has shown, the creative stories we make on these voyages can be understood as maps of that world and those we meet (Turchi, 2004). If, as the situationists Kotányi and Vaneigem stated, “Urbanism is comparable to the advertising propagated around Coca-Cola — pure spectacular ideology”, there needs to be a way by which the hegemony of the market, Benjamin’s phantasmagoria, can be challenged. This would wrestle control from the market forces that are seen to have overwhelmed the high street, and allow a refocusing on the needs of both the individual and the community.

This article argues that, though anachronistic, some of the situationists’ ideas persist within hacking, what Himanen (2001) identified as the ‘hacker ethic’. As Taylor argues, although hacking is intimately connected to the world of computers, it can refer to the unorthodox use of any ‘artefact’, including social ‘systems’. In this way, de Certeau’s urban itineraries, the spatial practice of each citizen through the city, can be understood as a form of hacking. As Wark states, “We do not lack communication. On the contrary, we have too much of it. We lack creation. We lack resistance to the present.” If the city itself is called into being through our physical journeys, in what de Certeau called ‘spaces of enunciation’, then new configurations and possibilities abound. The walker becomes hacker, Wark’s “abstractors of new worlds”, and the itinerary a deliberate subversion of an urban system, the dream houses of Benjamin’s arcades. This paper examines one small research project, Waterways and Walkways, in its investigation of a digitally mediated exploration across Cardiff, the Welsh capital. The article concludes by showing just one small way in which digital technology can play a role in facilitating the re-conceptualisation of our cities….(More)”

Algorithmic Life: Calculative Devices in the Age of Big Data


Book edited by Louise Amoore and Volha Piotukh: “This book critically explores forms and techniques of calculation that emerge with digital computation, and their implications. The contributors demonstrate that digital calculative devices matter beyond their specific functions as they progressively shape, transform and govern all areas of our life. In particular, it addresses such questions as:

  • How does the drive to make sense of, and productively use, large amounts of diverse data, inform the development of new calculative devices, logics and techniques?
  • How do these devices, logics and techniques affect our capacity to decide and to act?
  • How do mundane elements of our physical and virtual existence become data to be analysed and rearranged in complex ensembles of people and things?
  • In what ways are conventional notions of public and private, individual and population, certainty and probability, rule and exception transformed and what are the consequences?
  • How does the search for ‘hidden’ connections and patterns change our understanding of social relations and associative life?
  • Do contemporary modes of calculation produce new thresholds of calculability and computability, allowing for the improbable or the merely possible to be embraced and acted upon?
  • As contemporary approaches to governing uncertain futures seek to anticipate future events, how are calculation and decision engaged anew?

Drawing together different strands of cutting-edge research that is both theoretically sophisticated and empirically rich, this book makes an important contribution to several areas of scholarship, including the emerging social science field of software studies, and will be a vital resource for students and scholars alike….(More)”

Innovation in the Public and Nonprofit Sectors


A Public Solutions Handbook edited by Patria De Lancer Julnes and Ed Gibson: “In the organizational context, the word “innovation” is often associated with private sector organizations, which are often perceived as more agile, adaptable, and able to withstand change than government agencies and nonprofit organizations. But the reality is that, while they may struggle, public and nonprofit organizations do innovate. These organizations must find ways to use shrinking resources effectively, improve their performance, and achieve desirable societal outcomes. Innovation in the Public and Nonprofit Sectors provides alternative frameworks for defining, categorizing, and studying innovation in government and in the nonprofit sector.

Through a diverse collection of international case studies, this book broadens the discussion of innovation in public and nonprofit organizations, demonstrating the hurdles organizations face and examining the technological advances and managerial ingenuity innovators use to achieve their goals, both within and beyond the boundaries of the innovating organization. The chapters shed light on key issues including:

  • how to conceptualize innovation;
  • how organizations decide between competing good ideas;
  • how to implement innovation;
  • how to contend with challenges to innovation;
  • how to judge success in innovation.

This book provides current and future public managers with the understanding and skills required to manage change and innovation, and is essential reading for all those studying public management, public administration, and public policy….(More)”