A Doctor’s Prescription: Data May Finally Be Good for Your Health


Interview by Art Kleiner: “In 2015, Robert Wachter published The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, a skeptical account of digitization in hospitals. Despite the promise offered by the digital transformation of healthcare, electronic health records had not delivered better care and greater efficiency. The cumbersome design, legacy procedures, and resistance from staff were frustrating everyone — administrators, nurses, consultants, and patients. Costs continued to rise, and preventable medical mistakes were not spotted. One patient at Wachter’s own hospital, one of the nation’s finest, was given 39 times the correct dose of antibiotics by an automated system that nobody questioned. The teenager survived, but it was clear that there needed to be a new approach to the management and use of data.

Wachter has for decades considered the delivery of healthcare through a lens focused on patient safety and quality. In 1996, he coauthored a paper in the New England Journal of Medicine that coined the term hospitalist in describing and promoting a new way of managing patients in hospitals: having one doctor — the hospitalist — “own” the patient journey from admission to discharge. The primary goal was to improve outcomes and save lives. Wachter argued it would also reduce costs and increase efficiency, making the business case for better healthcare. And he was right. Today there are more than 50,000 hospitalists, and it took just two years from the article’s publication to have the first data proving his point. In 2016, Wachter was named chair of the Department of Medicine at the University of California, San Francisco (UCSF), where he has worked since 1990.

Today, Wachter is, to paraphrase the title of a recent talk, less grumpy than he used to be about health tech. The hope part of his book’s title has materialized in some areas faster than he predicted. AI’s advances in imaging are already making the detection of cancers more accurate. As data collection has become better systematized, big technology firms such as Google, Amazon, and Apple are entering (in Google’s case, reentering) the field and having more success focusing their problem-solving skills on healthcare issues. In his San Francisco office, Wachter sat down with strategy+business to discuss why the healthcare system may finally be about to change….

Systems for Fresh Thinking

S+B: The changes you appreciate seem to have less to do with technological design and more to do with people getting used to the new systems, building their own variations, and making them work.
WACHTER: The original electronic health record was just a platform play to get the data in digital form. It didn’t do anything particularly helpful in terms of helping the physicians make better decisions or helping to connect one kind of doctor with another kind of doctor. But it was a start.

I remember that when we were starting to develop our electronic health record at UCSF, 12 or 13 years ago, I hired a physician who is now in charge of our health computer system. I said to him, “We don’t have our electronic health record in yet, but I’m pretty sure we will in seven or eight years. What will your job be when that’s done?” I actually thought once the system was fully implemented, we’d be done with the need to innovate and evolve in health IT. That, of course, was asinine.

S+B: That’s like saying to an auto mechanic, “What will your job be when we have automatic transmissions?”
WACHTER: Right, but even more so, because many of us saw electronic health records as the be-all and end-all of digitally facilitated medicine. But putting in the electronic health record is just step one of 10. Then you need to start connecting all the pieces, and then you add analytics that make sense of the data and make predictions. Then you build tools and apps to fit into the workflow and change the way you work.

One of my biggest epiphanies was this: When you digitize, in any industry, nobody is clever enough to actually change anything. All they know how to do is digitize the old practice. You only start seeing real progress when smart people come in, begin using the new system, and say, “Why the hell do we do it that way?” And then you start thinking freshly about the work. That’s when you have a chance to reimagine the work in a digital environment…(More)”.

Text Analysis Systems Mine Workplace Emails to Measure Staff Sentiments


Alan Rothman at LLRX: “…For all of these good, bad or indifferent workplaces, a key question is whether any of management’s actions to engage the staff and listen to their concerns have ever resulted in improved working conditions and higher levels of job satisfaction.

The answer is most often “yes”. Just having a say in, and some sense of control over, our jobs and workflows can indeed have a demonstrable impact on morale, camaraderie and the bottom line. This phenomenon, known as the Hawthorne Effect and also termed the “Observer Effect”, was first documented during studies in the 1920s and 1930s, when the management of a factory made improvements to the lighting and work schedules. In turn, worker satisfaction and productivity temporarily increased. This was not so much because there was more light, but rather because the workers sensed that management was paying attention to, and then acting upon, their concerns. The workers perceived they were no longer just cogs in a machine.

Perhaps, too, the Hawthorne Effect is in some ways the workplace equivalent of Heisenberg’s Uncertainty Principle in physics. To vastly oversimplify this slippery concept, the mere act of observing a subatomic particle can change its position.¹

Giving the processes of observation, analysis and change at the enterprise level a modern (but non-quantum) spin is a fascinating new article in the September 2018 issue of The Atlantic entitled What Your Boss Could Learn by Reading the Whole Company’s Emails, by Frank Partnoy. I highly recommend a click-through and full read if you have an opportunity. I will summarize and annotate it, and then, considering my own thorough lack of understanding of the basics of y=f(x), pose some of my own physics-free questions….

Today the text analytics business, which includes the work done by KeenCorp, is thriving. Text analysis has long been established as the processing behind email spam filters. Now it is finding other applications, including monitoring corporate reputations on social media and other sites.²

The finance industry is another growth sector, as investment banks and hedge funds scan a wide variety of information sources to locate “slight changes in language” that may point towards pending increases or decreases in share prices. Financial research providers are using artificial intelligence to mine “insights” from their own selections of news and analytical sources.

But is this technology effective?

In a paper entitled Lazy Prices (draft dated February 22, 2018), Lauren Cohen (Harvard Business School and NBER), Christopher Malloy (Harvard Business School and NBER), and Quoc Nguyen (University of Illinois at Chicago) found that the share price of a company (in this case NetApp, following its 2010 annual report) measurably went down after the firm “subtly changes” its reporting “descriptions of certain risks”. Algorithms can detect such changes more quickly and effectively than humans. The company subsequently clarified in its 2011 annual report its “failure to comply” with reporting requirements in 2010. A highly skilled stock analyst “might have missed that phrase”, but once again it was captured by the “researchers’ algorithms”.

In the hands of a “skeptical investor”, this information might well have prompted questions about the differences between the 2010 and 2011 annual reports and, in turn, saved that investor a great deal of money. This detection was an early signal of a looming decline in NetApp’s stock. Half a year after the 2011 report’s publication, it was reported that the Syrian government had bought the company’s equipment and “used that equipment to spy on its citizens”, causing further declines.
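The underlying idea is easy to sketch in code. The snippet below is not the Lazy Prices methodology itself; it is a minimal illustration, with invented filing excerpts and an arbitrary threshold, of how cosine similarity over word counts can flag a year-over-year wording shift in a “Risk Factors” section that a human skimming hundreds of pages might miss.

```python
import math
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    """Lower-case word tokens; punctuation and numbers are ignored."""
    return re.findall(r"[a-z']+", text.lower())

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between two documents, using raw word counts as vectors."""
    vec_a, vec_b = Counter(tokenize(text_a)), Counter(tokenize(text_b))
    dot = sum(vec_a[word] * vec_b[word] for word in vec_a.keys() & vec_b.keys())
    norm = math.sqrt(sum(n * n for n in vec_a.values())) * \
           math.sqrt(sum(n * n for n in vec_b.values()))
    return dot / norm if norm else 0.0

# Invented "Risk Factors" excerpts from two consecutive annual reports.
risk_2010 = ("We believe we are in compliance with applicable export regulations "
             "and do not expect material penalties.")
risk_2011 = ("We may have failed to comply with certain export regulations "
             "and could face material penalties as a result.")

similarity = cosine_similarity(risk_2010, risk_2011)
if similarity < 0.90:  # illustrative threshold for flagging wording shifts
    print(f"Risk language changed (similarity = {similarity:.2f}); flag for review.")
```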

Now text analytics is being aimed at a new target: the content of employees’ communications. Although it has been found that workers have no expectation of privacy in their workplaces, some companies remain reluctant to mine their employees’ messages because of privacy concerns. Even so, companies are finding it increasingly hard to resist the “urge to mine employee information”, especially as text analysis systems continue to improve.

Among the evolving enterprise applications is the use of such systems by human resources departments to assess overall employee morale. Vibe, for example, is an app that scans communications on Slack, a widely used enterprise messaging platform. Vibe’s algorithm measures the positive and negative emotions of a work team and reports them in real time….(More)”.
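Vibe’s actual model is proprietary, so the sketch below is only a toy illustration of the general approach: score each message against small positive and negative word lists (all invented here) and average the scores across a channel. A production system would rely on far richer language models, but the shape of the computation is similar.

```python
import re

# Toy word lists, invented for illustration; not Vibe's actual lexicon.
POSITIVE = {"great", "thanks", "love", "awesome", "helpful", "excited"}
NEGATIVE = {"blocked", "frustrated", "late", "broken", "worried", "angry"}

def message_score(message: str) -> int:
    """+1 for each positive word in the message, -1 for each negative word."""
    words = re.findall(r"[a-z]+", message.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def team_sentiment(messages: list[str]) -> float:
    """Average score per message: above 0 leans positive, below 0 leans negative."""
    return sum(message_score(m) for m in messages) / len(messages) if messages else 0.0

channel = [
    "Thanks, the new build is awesome!",
    "Still blocked on the release and getting frustrated.",
    "Great demo today, everyone.",
]
print(f"Team sentiment: {team_sentiment(channel):+.2f}")
```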

Open Government Data Report: Enhancing Policy Maturity for Sustainable Impact


Report by the OECD: “This report provides an overview of the state of open data policies across OECD member and partner countries, based on data collected through the OECD Open Government Data survey (2013, 2014, 2016/17), country reviews and comparative analysis. The report analyses open data policies using an analytical framework that is in line with the OECD OUR data Index and the International Open Data Charter. It assesses governments’ efforts to enhance the availability, accessibility and re-use of open government data. It makes the case that beyond countries’ commitment to open up good quality government data, the creation of public value requires engaging user communities from the entire ecosystem, such as journalists, civil society organisations, entrepreneurs, major tech private companies and academia. The report also underlines how open data policies are elements of broader digital transformations, and how public sector data policies require interaction with other public sector agendas such as open government, innovation, employment, integrity, public budgeting, sustainable development, urban mobility and transport. It stresses the relevance of measuring open data impacts in order to support the business case for open government data….(More)”.

Whither large International Non-Governmental Organisations?


Working Paper by Penny Lawrence: “Large international non-governmental organisations (INGOs) seem to be in an existential crisis over their role in the fight for social justice. Many, such as Save the Children or Oxfam, have become big, well-known brands with compliance expectations similar to those of big businesses. Yet the public still imagine them to be run by volunteers. Their context is changing so fast, and so unpredictably, that they are struggling to keep up. It is a time of extraordinary disruptive change, including digital transformation, changing societal norms and engagement expectations, and political upheaval and challenge. Fifteen years ago the political centre-ground in the UK seemed firm, with expanding space for civil society organisations to operate. Space for civil society voice now seems more threatened and challenged (Kenny 2015).

There has been a decline in trust in large charities in particular: partly as a result of their own complacency, acting as if the argument for aid has been won; partly as a result of questioned practices, e.g. the fundraising scandal of 2016/17 (where repeated mail drops to individuals requesting funds caused a public backlash) and the safeguarding scandal of 2018 (where historic cases of sexual abuse by INGO staff, including at Oxfam, were revisited by the media in the wake of the #MeToo movement); and partly as a result of political challenge to INGOs’ advocacy and influencing role, their bias and their voice:

‘Some government ministers regard the charity sector with suspicion because it largely employs senior people with a left-wing perspective on life and because of other unfair criticisms of government it means there is regularly a tension between big charities and the conservative party’ Richard Wilson (Former Minister for Civil Society) 2018

On the other hand, many feel that charities that have taken significant contracts to deliver services for the state have forfeited their independent voice and lost their way:

‘The voluntary sector risks declining over the next ten years into a mere instrument of a shrunken state, voiceless and toothless, unless it seizes the agenda and creates its own vision.’ Professor Nicholas Deakin 2014

It’s a tough context to be leading an INGO through, but INGOs have appeared ill-prepared and slow to respond to the threats and opportunities, not realising how much they may need to change to respond to the fast-evolving context and expectations. Large INGOs spend most of their energy exploiting present grant and contract business models, rather than exploring the opportunities to overcome poverty offered by such disruptive change. Their size and structures do not enable agility. They are too internally focused and self-referencing at a time when the world around them is changing so fast, and when political sands have shifted. Focussing on the internationalisation of structures and decision-making means large INGOs are ‘defeated by our own complexity’, as one INGO interviewee put it.

The purpose of this paper is to stimulate thinking amongst large INGOs at a time of such extraordinary disruptive change. The paper explores options for large INGOs, in terms of function and structure. After outlining large INGOs’ history, changing context, value and current thinking, it explores learning from others outside the development sector before suggesting the emerging options. It reflects on what’s encouraging and what’s stopping change and offers possible choices and pathways forwards….(More)”.

Why Is Behavioral Economics So Popular?


David Gal at the New York Times: “Behavioral economics seems to have captured the popular imagination. Authors like Michael Lewis write about it in best sellers like “The Undoing Project,” while pioneers of the field like Daniel Kahneman popularize it in books like “Thinking, Fast and Slow.” Its lexicon of “nudging,” “framing bias” and “the endowment effect” has become part of the vernacular of business, finance and policymaking. Even “Crazy Rich Asians,” the summer’s blockbuster romantic comedy, features an explicit nod to “loss aversion,” a key concept in the field.

What is behavioral economics, and why has it become so popular? The field has been described by Richard Thaler, one of its founders, as “economics done with strong injections of good psychology.” Proponents view it as a way to make economics more accurate by incorporating more realistic assumptions about how humans behave.

In practice, much of behavioral economics consists in using psychological insights to influence behavior. These interventions tend to be small, often involving subtle changes in how choices are presented: for example, whether you have to “opt in” to a 401(k) savings plan versus having to “opt out.” In this respect, behavioral economics can be thought of as endorsing the outsize benefits of psychological “tricks,” rather than as calling for more fundamental behavioral or policy change.

The popularity of such low-cost psychological interventions, or “nudges,” under the label of behavioral economics is in part a triumph of marketing. It reflects the widespread perception that behavioral economics combines the cleverness and fun of pop psychology with the rigor and relevance of economics.

Yet this triumph has come at a cost. In order to appeal to other economists, behavioral economists are too often concerned with describing how human behavior deviates from the assumptions of standard economic models, rather than with understanding why people behave the way they do.

Consider loss aversion. This is the notion that losses have a bigger psychological impact than gains do — that losing $5, for example, feels worse than gaining $5 feels good. Behavioral economists point to loss aversion as a psychological glitch that explains a lot of puzzling human conduct. But in an article published this year, the psychologist Derek D. Rucker and I contend that the behaviors most commonly attributed to loss aversion are a result of other causes….

There is nothing wrong with achieving small victories with minor interventions. The worry, however, is that the perceived simplicity and efficacy of such tactics will distract decision makers from more substantive efforts — for example, reducing electricity consumption by taxing it more heavily or investing in renewable energy resources.

It is great that behavioral economics is receiving its due; the field has contributed significantly to our understanding of ourselves. But in all the excitement, it’s important to keep an eye on its limits….(More)”.

The law and ethics of big data analytics: A new role for international human rights in the search for global standards


David Nersessian at Business Horizons: “The Economist recently declared that digital information has overtaken oil as the world’s most valuable commodity. Big data technology is inherently global and borderless, yet little international consensus exists over what standards should govern its use. One source of global standards benefitting from considerable international consensus might be used to fill the gap: international human rights law.

This article considers the extent to which international human rights law operates as a legal or ethical constraint on global commercial use of big data technologies. By providing clear baseline standards that apply worldwide, human rights can help shape cultural norms—implemented as ethical practices and global policies and procedures—about what businesses should do with their information technologies. In this way, human rights could play a broad and important role in shaping business thinking about the proper handling of this increasingly valuable commodity in the modern global society…(More)”.

Translating science into business innovation: The case of open food and nutrition data hackathons


Paper by Christopher Tucci, Gianluigi Viscusi and Heidi Gautschi: “In this article, we explore the use of hackathons and open data in corporations’ open innovation portfolios, addressing a new way for companies to tap into the creativity and innovation of early-stage startup culture, in this case applied to the food and nutrition sector. We study the first Open Food Data Hackdays, held on 10-11 February 2017 in Lausanne and Zurich. The aim of the overall project that the Hackdays event was part of was to use open food and nutrition data as a driver for business innovation. We see hackathons as a new tool in the innovation manager’s toolkit, a kind of live crowdsourcing exercise that goes beyond traditional ideation and develops a variety of prototypes and new ideas for business innovation. Companies then have the option of working with entrepreneurs and taking some of the ideas forward….(More)”.

The secret data collected by dockless bikes is helping cities map your movement


Lime is able to collect this information because its bikes, like all those in dockless bike-share programs, are built to operate without fixed stations or corrals. …In the 18 months or so since dockless bike-share arrived in the US, the service has spread to at least 88 American cities. (On the provider side, at least 10 companies have jumped into the business; Lime is one of the largest.) Some of those cities now have more than a year of data related to the programs, and they’ve started gleaning insights and catering to the increased number of cyclists on their streets.

South Bend is one of those leaders. It asked Lime to share data when operations kicked off in June 2017. At first, Lime provided the information in spreadsheets, but in early 2018 the startup launched a browser-based dashboard where cities could see aggregate statistics for their residents, such as how many of them rented bikes, how many trips they took, and how far and long they rode. Lime also added heat maps that reveal where most rides occur within a city and a tool for downloading data that shows individual trips without identifying the riders. Corcoran can glance at his dashboard and see, for example, that people in South Bend have taken 340,000 rides, traveled 158,000 miles, and spent more than 7 million minutes on Lime bikes since the company started service. He can also see there are 700 Lime bikes active in the city, down from an all-time high of 1,200 during the University of Notre Dame’s 2017 football season….(More)”.
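Dashboard figures of this kind are essentially roll-ups of anonymized trip records. The sketch below uses an invented record format; the field names and numbers are hypothetical and do not reflect Lime’s actual data feed.

```python
from dataclasses import dataclass

@dataclass
class Trip:
    bike_id: str      # identifies the bike, not the rider
    miles: float
    minutes: float

# Invented sample of anonymized trip records.
trips = [
    Trip("bike-001", 1.2, 11.0),
    Trip("bike-002", 0.6, 6.5),
    Trip("bike-001", 2.3, 19.0),
]

total_rides = len(trips)
total_miles = sum(t.miles for t in trips)
total_minutes = sum(t.minutes for t in trips)
active_bikes = len({t.bike_id for t in trips})  # distinct bikes seen in the data

print(f"{total_rides} rides, {total_miles:.1f} miles, "
      f"{total_minutes:.0f} minutes, {active_bikes} bikes active")
```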

Emerging Labour Market Data Sources towards Digital Technical and Vocational Education and Training (TVET)


Paper by Nikos Askitas, Rafik Mahjoubi, Pedro S. Martins, Koffi Zougbede for Paris21/OECD: “Experience from both technology and policy making shows that solutions for labour market improvements are simply choices of new, more tolerable problems. All data solutions supporting digital Technical and Vocational Education and Training (TVET) will have to incorporate a roadmap of changes rather than an unrealistic super-solution. The ideal situation is a world in which labour market participants engage in intelligent strategic behavior in an informed, fair and sophisticated manner.

Labour market data captures transactions within labour market processes. In order to successfully capture such data, we need to understand the specifics of these market processes. Designing an ecosystem of labour market matching facilitators and rules of engagement for contributing to a lean and streamlined Labour Market Information System (LMIS) is the best way to create Big Data with context relevance. This is in contrast with pre-existing Big Data captured by global job boards or social media, for which relevance is limited by the technology access gap and its variations across the developing world.

Network effects occur in technology and job facilitation, as seen in the developed world. Managing and instigating the right network effects might be crucial to avoid fragmented stagnation and inefficiency. This is key to avoid throwing money behind wrong choices that do not gain traction.

A mixed mode approach is possibly the ideal approach for developing countries. Mixing offline and online elements correctly will be crucial in bridging the technology access gap and reaping the benefits of digitisation at the same time.

Properly incentivising the various entities is critical for progress, and this applies most of all to the private sector, which is significantly more agile and inventive, has “skin in the game” and a long-term commitment to the conditions in the field, has intimate knowledge of how to bridge the technology gap, and brings a better understanding of the particular ambient context it is operating in. To summarise: Big Data starts small.

Managing expectations and creating incentives for the various stakeholders will be crucial in establishing digitally supported TVET. Developing the right business models will be crucial in the short term and beyond, and it will be the result of creating the right mix of technological and policy expertise with good knowledge of the situation on the ground….(More)”.

The New York City Business Atlas: Leveling the Playing Field for Small Businesses with Open Data


Chapter by Stefaan Verhulst and Andrew Young in Smarter New York City:How City Agencies Innovate. Edited by André Corrêa d’Almeida: “While retail entrepreneurs, particularly those operating in the small-business space, are experts in their respective trades, they often lack access to high-quality information about social, environmental, and economic conditions in the neighborhoods where they operate or are considering operating.

The New York City Business Atlas, conceived by the Mayor’s Office of Data Analytics (MODA) and the Department of Small Business Services, is designed to alleviate that information gap by providing a public web-based tool that gives small businesses access to high-quality data to help them decide where to establish a new business or expand an existing one. The tool brings together a diversity of data, including business-filing data from the Department of Consumer Affairs, sales-tax data from the Department of Finance, demographic data from the census, and traffic data from Placemeter, a New York City startup focusing on real-time traffic information.
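Conceptually, the Atlas joins several neighborhood-keyed datasets into a single view. The sketch below shows that kind of combination; the column names and figures are invented for illustration and are not the city’s actual schema.

```python
import pandas as pd

# Invented, neighborhood-keyed datasets standing in for the real sources.
filings = pd.DataFrame({
    "neighborhood": ["Astoria", "Harlem"],
    "new_business_filings": [120, 95],
})
sales_tax = pd.DataFrame({
    "neighborhood": ["Astoria", "Harlem"],
    "retail_sales_tax_usd": [2_400_000, 1_900_000],
})
foot_traffic = pd.DataFrame({
    "neighborhood": ["Astoria", "Harlem"],
    "avg_daily_pedestrians": [8_500, 11_200],
})

# One row per neighborhood, combining all three sources.
atlas = (filings
         .merge(sales_tax, on="neighborhood")
         .merge(foot_traffic, on="neighborhood"))
print(atlas)
```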

The initial iteration of the Business Atlas made useful and previously inaccessible data available to small-business owners and entrepreneurs in an innovative manner. After a few years, however, it became clear that the tool was not experiencing the level of use or creating the level of demonstrable impact anticipated. Rather than continuing down the same path or abandoning the effort entirely, MODA pivoted to a new approach, moving from the Business Atlas as a single information-providing tool to the Business Atlas as a suite of capabilities aimed at bolstering New York’s small-business community.

Through problem- and user-centered efforts, the Business Atlas is now making important insights available to stakeholders who can put them to meaningful use, from how long it takes to open a restaurant in the city to which areas are most in need of education and outreach to improve their code compliance. This chapter considers the open data environment from which the Business Atlas was launched, details the initial version of the Business Atlas and the lessons it generated, and describes the pivot to this new approach….(More)”.