Real-time flu tracking. By monitoring social media, scientists can track outbreaks as they happen.


Charles Schmidt at Nature: “Conventional influenza surveillance describes outbreaks of flu that have already happened. It is based on reports from doctors, and produces data that take weeks to process — often leaving the health authorities to chase the virus around, rather than get on top of it.

But every day, thousands of unwell people pour details of their symptoms and, perhaps unknowingly, locations into search engines and social media, creating a trove of real-time flu data. If such data could be used to monitor flu outbreaks as they happen and to make accurate predictions about its spread, that could transform public-health surveillance.

Powerful computational tools such as machine learning and a growing diversity of data streams — not just search queries and social media, but also cloud-based electronic health records and human mobility patterns inferred from census information — are making it increasingly possible to monitor the spread of flu through the population by following its digital signal. Now, models that track flu in real time and forecast flu trends are making inroads into public-health practice.
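
To make "following its digital signal" concrete, here is a minimal sketch of the simplest kind of nowcasting model: a linear fit between flu-related search volume and officially reported illness rates, used to estimate the current week before official figures arrive. The numbers and the single-predictor model are illustrative assumptions only; the systems described in the article blend many data streams with far richer machine-learning methods.

```python
# Illustrative sketch only: nowcasting a weekly influenza-like-illness (ILI)
# rate from a single digital signal with ordinary least squares. Data are
# made up; real surveillance models are far more sophisticated.

# Past weeks: (normalized flu-related search volume, reported ILI rate in %)
history = [(0.8, 1.1), (1.0, 1.4), (1.5, 2.0), (2.1, 2.9), (2.6, 3.4)]

n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n

# Slope and intercept of a one-predictor least-squares fit.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
intercept = mean_y - slope * mean_x

# Official ILI reports arrive weeks late; the search signal is available now.
current_search_volume = 3.0
nowcast = intercept + slope * current_search_volume
print(f"Estimated current ILI rate: {nowcast:.2f}%")  # ~4.0% for these numbers
```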

“We’re becoming much more comfortable with how these models perform,” says Matthew Biggerstaff, an epidemiologist who works on flu preparedness at the US Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia.

In the 2013–14 flu season, the CDC launched FluSight, a website informed by digital modelling that predicts the timing, peak and short-term intensity of the flu season in ten regions of the United States and across the whole country. According to Biggerstaff, flu forecasting helps responders to plan ahead, so they can be ready with vaccinations and communication strategies to limit the effects of the virus. Encouraged by progress in the field, the CDC announced in January 2019 that it would spend US$17.5 million to create a network of influenza-forecasting centres of excellence, each tasked with improving the accuracy and communication of real-time forecasts.

The CDC is leading the way on digital flu surveillance, but health agencies elsewhere are following suit. “We’ve been working to develop and apply these models with collaborators using a range of data sources,” says Richard Pebody, a consultant epidemiologist at Public Health England in London. The capacity to predict flu trajectories two to three weeks in advance, Pebody says, “will be very valuable for health-service planning.”…(More)”.

The Art of Values-Based Innovation for Humanitarian Action


Chris Earney & Aarathi Krishnan at SSIR: “Contrary to popular belief, innovation isn’t new to the humanitarian sector. Organizations like the Red Cross and Red Crescent have a long history of innovating in communities around the world. Humanitarians have worked both on a global scale—for example, to innovate financing and develop the Humanitarian Code of Conduct—and on a local level—to reduce urban fire risks in informal settlements in Kenya, for instance, and improve waste management to reduce flood risks in Indonesia.

Even in its more-bureaucratic days more than 50 years ago, the United Nations commissioned a report to better understand the role that innovation, science, and technology could play in advancing human rights and development. Titled the “Sussex Manifesto,” the report outlined how to reshape and reorganize the role of innovation and technology so that it was more relevant, equitable, and accessible to the humanitarian and development sectors. Although those who commissioned the manifesto ultimately deemed it too ambitious for its era, the effort nevertheless reflects the UN’s longstanding interest in understanding how far-reaching ideas can elicit fundamental and needed progress. It challenged the humanitarian system to be explicit about its values and understand how those values could lead to radical actions for the betterment of humanity.

Since then, 27 UN organizations have formed teams dedicated to supporting innovation. Today, the aspiration to innovate extends to NGOs and donor communities, and has led to myriad approaches to brainstorming, design thinking, co-creation, and other activities developed to support novelty.

However, in the face of a more globalized, connected, and complex world, we need more than ever to position innovation as a bold and courageous way of doing things. It’s common for people to dismiss innovation as a process that tinkers around the edges of organizations, but we need to think of innovation as a tool for changing the way systems work, and our practices, so that they better serve communities. This matters because humanitarian needs are only going to grow, and the resources available to us likely won’t match that need. When the values that underpin our attitudes and behaviors as humanitarians drive innovation, we can better focus our efforts and create more impact with less—and we’re going to have to…(More)”.

When Ostrom Meets Blockchain: Exploring the Potentials of Blockchain for Commons Governance


Paper by David Rozas, Antonio Tenorio-Fornés, Silvia Díaz-Molina, and Samer Hassan: “Blockchain technologies have generated excitement, yet their potential to enable new forms of governance remains largely unexplored. Two confronting standpoints dominate the emergent debate around blockchain-based governance: discourses characterised by the presence of techno-determinist and market-driven values, which tend to ignore the complexity of social organisation; and critical accounts of such discourses which, whilst contributing to identifying limitations, consider the role of traditional centralised institutions as inherently necessary to enable democratic forms of governance. The question therefore arises: can we build perspectives of blockchain-based governance that go beyond markets and states?

In this article we draw on the Nobel laureate economist Elinor Ostrom’s principles for self-governance of communities to explore the transformative potential of blockchain. We approach blockchain through the identification and conceptualisation of affordances that this technology may provide to communities. For each affordance, we carry out a detailed analysis situating each in the context of Ostrom’s principles, considering both the potentials of algorithmic governance and the importance of incorporating communities’ social practices. The relationships found between these affordances and Ostrom’s principles allow us to provide a perspective focussed on blockchain-based commons governance. By carrying out this analysis, we aim to expand the debate from one dominated by a culture of competition to one that promotes a culture of cooperation…(More)”.

Accountability in the Age of the Artificial


2019 Solomon Lecture by Fiona McLeod: “Our aspiration for open and accountable government faces innumerable challenges, not least the natural reluctance of all governments to expose themselves to criticism and accept responsibility for failure.

Time and again, corporate and political goals take priority over just outcomes, and the human rights of individuals and communities are undervalued and ignored.

Numerous examples of bad behaviour shock us for a while, some even receiving the focused attention of high quality investigative journalism and Royal Commissions, but we are left unsatisfied, cynical and disengaged, more jaded than before, accepting the inevitability of existential threats, the comfort of algorithmic news feeds and vague promises to ‘drain the swamp’.

In this context, are big data and artificial intelligence the enemies of the people, the ultimate tools of the oligarch, or the vital tools needed to eliminate bias, improve scrutiny and just outcomes for the visionary?  Is there a future in which humanity evolves alongside an enhanced hive-mind in time to avert global catastrophe and create a new vision for humanity?…(More)”

The Internet Relies on People Working for Free


Owen Williams at OneZero: “When you buy a product like Philips Hue’s smart lights or an iPhone, you probably assume the people who wrote their code are being paid. While that’s true for those who directly author a product’s software, virtually every tech company also relies on thousands of bits of free code, made available through “open-source” projects on sites like GitHub and GitLab.

Often these developers are happy to work for free. Writing open-source software allows them to sharpen their skills, gain perspectives from the community, or simply help the industry by making innovations available at no cost. According to Google, which maintains hundreds of open-source projects, open source “enables and encourages collaboration and the development of technology, solving real-world problems.”

But when software used by millions of people is maintained by a community of people, or a single person, all on a volunteer basis, sometimes things can go horribly wrong. The catastrophic Heartbleed bug of 2014, which compromised the security of hundreds of millions of sites, was caused by a problem in an open-source library called OpenSSL, which relied on a single full-time developer not making a mistake as they updated and changed that code, used by millions. Other times, developers grow bored and abandon their projects, which can be breached while they aren’t paying attention.

It’s hard to demand that programmers who are working for free troubleshoot problems or continue to maintain software that they’ve lost interest in for whatever reason — though some companies certainly try. Not adequately maintaining these projects, on the other hand, makes the entire tech ecosystem weaker. So some open-source programmers are asking companies to pay, not for their code, but for their support services….(More)”.

Agora: Towards An Open Ecosystem for Democratizing Data Science & Artificial Intelligence


Paper by Jonas Traub et al: “Data science and artificial intelligence are driven by a plethora of diverse data-related assets including datasets, data streams, algorithms, processing software, compute resources, and domain knowledge. As providing all these assets requires a huge investment, data science and artificial intelligence are currently dominated by a small number of providers who can afford such investments. In this paper, we present a vision of a data ecosystem to democratize data science and artificial intelligence. In particular, we envision a data infrastructure for fine-grained asset exchange in combination with scalable systems operation. This will overcome lock-in effects and remove entry barriers for new asset providers. Our goal is to enable companies, research organizations, and individuals to have equal access to data, data science, and artificial intelligence. Such an open ecosystem has recently been put on the agenda of several governments and industrial associations. We point out the requirements and the research challenges as well as outline an initial data infrastructure architecture for building such a data ecosystem…(More)”.

Citizens need to know numbers


David Spiegelhalter at Aeon: “…Many criticised the Leave campaign for its claim that Britain sends the EU £350 million a week. When Boris Johnson repeated it in 2017 – by which time he was Foreign Secretary – the chair of the UK Statistics Authority (the official statistical watchdog) rebuked him, noting it was a ‘clear misuse of official statistics’. A private criminal prosecution was even brought against Johnson for ‘misconduct in a public office’, but it was halted by the High Court.

The message on the bus had a strong emotional resonance with millions of people, even though it was essentially misinformation. The episode demonstrates both the power and weakness of statistics: they can be used to amplify an entire worldview, and yet they often do not stand up to scrutiny. This is why statistical literacy is so important – in an age in which data plays an ever-more prominent role in society, the ability to spot ways in which numbers can be misused, and to be able to deconstruct claims based on statistics, should be a standard civic skill.

Statistics are not cold hard facts – as Nate Silver writes in The Signal and the Noise (2012): ‘The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning.’ Not only has someone used extensive judgment in choosing what to measure, how to define crucial ideas, and to analyse them, but the manner in which they are communicated can utterly change their emotional impact. Let’s assume that £350 million is the actual weekly contribution to the EU. I often ask audiences to suggest what they would put on the side of the bus if they were on the Remain side. A standard option for making an apparently big number look small is to consider it as a proportion of an even bigger number: for example, the UK’s GDP is currently around £2.3 trillion, and so this contribution would comprise less than 1 per cent of GDP, around six months’ typical growth. An alternative device is to break down expenditure into smaller, more easily grasped units: for example, as there are 66 million people in the UK, £350 million a week is equivalent to around 75p a day, less than $1, say about the cost of a small packet of crisps (potato chips). If the bus had said: We each send the EU the price of a packet of crisps each day, the campaign might not have been so successful.
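
The reframings above are simple arithmetic. A short sketch using the figures quoted in the passage (£350 million a week, a £2.3 trillion GDP, 66 million people) reproduces both the "less than 1 per cent of GDP" and the "about 75p a day" framings:

```python
# Reproducing the two reframings of the £350-million figure from the passage.
weekly_contribution = 350e6   # £350 million per week (the claim on the bus)
uk_gdp = 2.3e12               # roughly £2.3 trillion
uk_population = 66e6          # roughly 66 million people

annual_contribution = weekly_contribution * 52
print(f"Share of GDP: {annual_contribution / uk_gdp:.2%}")  # ~0.79%, under 1%
print(f"Per person per day: £{weekly_contribution / uk_population / 7:.2f}")  # ~£0.76
```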

Numbers are often used to persuade rather than inform, statistical literacy needs to be improved, and so surely we need more statistics courses in schools and universities? Well, yes, but this should not mean more of the same. After years of researching and teaching statistical methods, I am not alone in concluding that the way in which we teach statistics can be counterproductive, with an overemphasis on mathematical foundations through probability theory, long lists of tests and formulae to apply, and toy problems involving, say, calculating the standard deviation of the weights of cod. The American Statistical Association’s Guidelines for Assessment and Instruction in Statistics Education (2016) strongly recommended changing the pedagogy of statistics into one based on problem-solving and real-world examples, with an emphasis on communication….(More)”.

Experimental Innovation Policy


Paper by Albert Bravo-Biosca: “Experimental approaches are increasingly being adopted across many policy fields, but innovation policy has been lagging. This paper reviews the case for policy experimentation in this field, describes the different types of experiments that can be undertaken, discusses some of the unique challenges to the use of experimental approaches in innovation policy, and summarizes some of the emerging lessons, with a focus on randomized trials. The paper concludes by describing how at the Innovation Growth Lab we have been working with governments across the OECD to help them overcome the barriers to policy experimentation in order to make their policies more impactful….(More)”.

The promise and peril of a digital ecosystem for the planet


Blog post by Jillian Campbell and David E Jensen: “A range of frontier and digital technologies have dramatically boosted the ways in which we can monitor the health of our planet and sustain our future on it (Figure 1).

Figure 1. A range of frontier and digital technologies can be combined to monitor our planet and the sustainable use of natural resources (1)

If we can leverage this technology effectively, we will be able to assess and predict risks, increase transparency and accountability in the management of natural resources and inform markets as well as consumer choice. These actions are all required if we are to stand a better chance of achieving the Sustainable Development Goals (SDGs).

However, for this vision to become a reality, public and private sector actors must take deliberate action and collaborate to build a global digital ecosystem for the planet — one consisting of data, infrastructure, rapid analytics, and real-time insights. We are now at a pivotal moment in the history of our stewardship of this planet, a “tipping point” of sorts. To guide the political action required to counter the speed, scope and severity of the environmental and climate crises, we must acquire and deploy these data sets and frontier technologies. Doing so can fundamentally change our economic trajectory and underpin a sustainable future.

This article shows how such a global digital ecosystem for the planet can be achieved — as well as what we risk if we do not take decisive action within the next 12 months….(More)”.

The business case for integrating claims and clinical data


Claudia Williams at MedCityNews: “The path to value-based care is arduous. For health plans, the ability to manage care, assess quality, lower costs, and streamline reporting is directly impacted by access to clinical data. For providers, the same can be said of access to claims data. 

Providers and health plans are increasingly demanding integrated claims and clinical data to drive and support value-based care programs. These organizations know that clinical and claims information from more than a single organization is the only way to get a true picture of patient care. From avoiding medication errors to enabling an evidence-based approach to treatment or identifying at-risk patients, the value of integrated claims and clinical data is immense — and will have far-reaching influence on both health outcomes and costs of care over time.

On July 30, Medicare announced the Data at the Point of Care pilot to share valuable claims data with Medicare providers in order to “fill in information gaps for clinicians, giving them a more structured and complete patient history with information like previous diagnoses, past procedures, and medication lists.” But that’s not the only example. To transition from fee-for-service to value-based care, providers and health plans have begun to partner with health data networks to access integrated clinical and claims data: 

Health plan adoption of integrated data strategy

A California health plan is partnering with one of the largest nonprofit health data networks in California to better integrate clinical and claims data. …

Providers leveraging claims data to understand patient medication patterns 

Doctors using advanced health data networks typically see a full list of patients’ medications, derived from claims, when they treat them. With this information available, doctors can avoid dangerous drug-to-drug interactions when they prescribe new medications. After a visit, they can also follow up and see if a patient actually filled a prescription and is still taking it….(More)”.
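
The mechanics behind that follow-up are essentially a join between two data sets. The sketch below assumes hypothetical record layouts (the field names and data are invented for illustration, not any real network's schema): prescriptions from the clinical record are matched against pharmacy claims to flag the ones never filled.

```python
# Hypothetical sketch: matching clinical prescriptions against pharmacy
# claims to spot unfilled prescriptions. Field names and data are invented
# for illustration.

prescriptions = [  # from the clinical record (EHR)
    {"patient": "p1", "drug": "lisinopril", "prescribed": "2019-08-01"},
    {"patient": "p1", "drug": "warfarin", "prescribed": "2019-08-01"},
]
pharmacy_claims = [  # from claims data: fills that actually happened
    {"patient": "p1", "drug": "lisinopril", "filled": "2019-08-03"},
]

filled = {(c["patient"], c["drug"]) for c in pharmacy_claims}
for rx in prescriptions:
    status = "filled" if (rx["patient"], rx["drug"]) in filled else "never filled"
    print(f"{rx['patient']}: {rx['drug']} ({status})")
# The claims join flags warfarin as never filled: a follow-up opportunity
# the clinical record alone would not reveal.
```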