The Social Afterlife


Paper by Andrew Gilden: “Death is not what it used to be. With the rise of social media and advances in digital technology, postmortem decision-making increasingly involves difficult questions about the ongoing social presence of the deceased. Should a Twitter account keep tweeting? Should a YouTube singer keep singing? Should Tinder photos be swiped left for the very last time? The traditional touchstones of effective estate planning — reducing transaction costs and maximizing estate value — do little to guide this new social afterlife. Managing a person’s legacy has shifted away from questions of financial investment and asset management to questions of emotional and cultural stewardship. This Article brings together the diverse areas of law that shape a person’s legacy and develops a new framework for addressing the evolving challenges of legacy stewardship.

This Article makes two main contributions. First, it identifies and critically examines the four models of stewardship that currently structure the laws of legacy: (1) the “freedom of disposition” model dominant in the laws of wills and trusts, (2) the “family inheritance” model dominant in copyright law, (3) the “public domain” model dominant in many states’ publicity rights laws, and (4) the “consumer contract” model dominant in over forty states’ new digital assets laws. Second, this Article develops a new stewardship model, which it calls the “decentered decedent.” The decentered decedent model recognizes that individuals occupy heterogeneous social contexts, and it channels postmortem decision-making into each of those contexts. Unlike existing stewardship models, this new model does not try to centralize stewardship decisions in any one stakeholder — the family, the public, the market, or even the decedent themselves. Instead, the decentered decedent model distributes stewardship across the diverse, dispersed communities that we all leave behind….(More)”.

AI Global Surveillance Technology


Carnegie Endowment: “Artificial intelligence (AI) technology is rapidly proliferating around the world. A growing number of states are deploying advanced AI surveillance tools to monitor, track, and surveil citizens to accomplish a range of policy objectives—some lawful, others that violate human rights, and many of which fall into a murky middle ground.

In order to appropriately address the effects of this technology, it is important to first understand where these tools are being deployed and how they are being used.

To provide greater clarity, Carnegie presents an AI Global Surveillance (AIGS) Index—representing one of the first research efforts of its kind. The index compiles empirical data on AI surveillance use for 176 countries around the world. It does not distinguish between legitimate and unlawful uses of AI surveillance. Rather, the purpose of the research is to show how new surveillance capabilities are transforming the ability of governments to monitor and track individuals or systems. It specifically asks:

  • Which countries are adopting AI surveillance technology?
  • What specific types of AI surveillance are governments deploying?
  • Which countries and companies are supplying this technology?

Learn more about our findings and how AI surveillance technology is spreading rapidly around the globe….(More)”.

Real-time flu tracking. By monitoring social media, scientists can track outbreaks as they happen.


Charles Schmidt at Nature: “Conventional influenza surveillance describes outbreaks of flu that have already happened. It is based on reports from doctors, and produces data that take weeks to process — often leaving the health authorities to chase the virus around, rather than get on top of it.

But every day, thousands of unwell people pour details of their symptoms and, perhaps unknowingly, locations into search engines and social media, creating a trove of real-time flu data. If such data could be used to monitor flu outbreaks as they happen and to make accurate predictions about its spread, that could transform public-health surveillance.

Powerful computational tools such as machine learning and a growing diversity of data streams — not just search queries and social media, but also cloud-based electronic health records and human mobility patterns inferred from census information — are making it increasingly possible to monitor the spread of flu through the population by following its digital signal. Now, models that track flu in real time and forecast flu trends are making inroads into public-health practice.
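
To make the idea concrete, here is a minimal nowcasting sketch: it regresses officially reported influenza-like-illness (ILI) rates on flu-related search-query volume, then uses the fitted model to estimate the current week’s rate before official reports arrive. This is an illustration only, not the CDC’s or FluSight’s actual methodology, and the data and variable names are hypothetical.

```python
# Illustrative nowcasting sketch (not the CDC's actual system): fit a simple
# linear model linking flu-related search-query volume to officially reported
# influenza-like-illness (ILI) rates, then use this week's query volume to
# estimate the ILI rate before official surveillance data are published.
# All numbers below are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=0)

# Hypothetical history: 52 weeks of query volume (arbitrary units) and the
# ILI rate (% of doctor visits) later reported by conventional surveillance.
query_volume = rng.uniform(20, 100, size=(52, 1))
ili_rate = 0.05 * query_volume[:, 0] + rng.normal(0, 0.3, size=52)

# Fit on past weeks where both signals are known.
model = LinearRegression().fit(query_volume, ili_rate)

# "Nowcast" the current week: official figures are not yet in,
# but search-query volume already is.
this_week = np.array([[85.0]])
print(f"Estimated ILI rate this week: {model.predict(this_week)[0]:.2f}%")
```

Real systems combine many such signals (search queries, social media, electronic health records, mobility data) and far richer models, but the underlying nowcasting logic is the same.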

“We’re becoming much more comfortable with how these models perform,” says Matthew Biggerstaff, an epidemiologist who works on flu preparedness at the US Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia.

In 2013–14, the CDC launched the FluSight Network, a website informed by digital modelling that predicts the timing, peak and short-term intensity of the flu season in ten regions of the United States and across the whole country. According to Biggerstaff, flu forecasting helps responders to plan ahead, so they can be ready with vaccinations and communication strategies to limit the effects of the virus. Encouraged by progress in the field, the CDC announced in January 2019 that it will spend US$17.5 million to create a network of influenza-forecasting centres of excellence, each tasked with improving the accuracy and communication of real-time forecasts.

The CDC is leading the way on digital flu surveillance, but health agencies elsewhere are following suit. “We’ve been working to develop and apply these models with collaborators using a range of data sources,” says Richard Pebody, a consultant epidemiologist at Public Health England in London. The capacity to predict flu trajectories two to three weeks in advance, Pebody says, “will be very valuable for health-service planning.”…(More)”.

The Art of Values-Based Innovation for Humanitarian Action


Chris Earney & Aarathi Krishnan at SSIR: “Contrary to popular belief, innovation isn’t new to the humanitarian sector. Organizations like the Red Cross and Red Crescent have a long history of innovating in communities around the world. Humanitarians have worked both on a global scale—for example, to innovate financing and develop the Humanitarian Code of Conduct—and on a local level—to reduce urban fire risks in informal settlements in Kenya, for instance, and improve waste management to reduce flood risks in Indonesia.

Even in its more-bureaucratic days more than 50 years ago, the United Nations commissioned a report to better understand the role that innovation, science, and technology could play in advancing human rights and development. Titled the “Sussex Manifesto,” the report outlined how to reshape and reorganize the role of innovation and technology so that it was more relevant, equitable, and accessible to the humanitarian and development sectors. Although those who commissioned the manifesto ultimately deemed it too ambitious for its era, the effort nevertheless reflects the UN’s longstanding interest in understanding how far-reaching ideas can elicit fundamental and needed progress. It challenged the humanitarian system to be explicit about its values and understand how those values could lead to radical actions for the betterment of humanity.

Since then, 27 UN organizations have formed teams dedicated to supporting innovation. Today, the aspiration to innovate extends to NGOs and donor communities, and has led to myriad approaches to brainstorming, design thinking, co-creation, and other activities developed to support novelty.

However, in the face of a more-globalized, -connected, and -complex world, we need, more than ever, to position innovation as a bold and courageous way of doing things. It’s common for people to dismiss innovation as a process that merely tinkers around the edges of organizations, but we need to think about innovation as a tool for changing the way systems work and our practices so that they better serve communities. This matters, because humanitarian needs are only going to grow, and the resources available to us likely won’t match that need. When the values that underpin our attitudes and behaviors as humanitarians drive innovation, we can better focus our efforts and create more impact with less—and we’re going to have to…(More)”.

When Ostrom Meets Blockchain: Exploring the Potentials of Blockchain for Commons Governance


Paper by David Rozas, Antonio Tenorio-Fornés, Silvia Díaz-Molina, and Samer Hassan: “Blockchain technologies have generated excitement, yet their potential to enable new forms of governance remains largely unexplored. Two confronting standpoints dominate the emergent debate around blockchain-based governance: discourses characterised by the presence of techno-determinist and market-driven values, which tend to ignore the complexity of social organisation; and critical accounts of such discourses which, whilst contributing to identifying limitations, consider the role of traditional centralised institutions as inherently necessary to enable democratic forms of governance. Therefore the question arises, can we build perspectives of blockchain-based governance that go beyond markets and states?

In this article we draw on the Nobel laureate economist Elinor Ostrom’s principles for self-governance of communities to explore the transformative potential of blockchain. We approach blockchain through the identification and conceptualisation of affordances that this technology may provide to communities. For each affordance, we carry out a detailed analysis situating each in the context of Ostrom’s principles, considering both the potentials of algorithmic governance and the importance of incorporating communities’ social practices. The relationships found between these affordances and Ostrom’s principles allow us to provide a perspective focussed on blockchain-based commons governance. By carrying out this analysis, we aim to expand the debate from one dominated by a culture of competition to one that promotes a culture of cooperation…(More)”.

Accountability in the Age of the Artificial


2019 Solomon Lecture by Fiona McLeod: “Our aspiration for open and accountable government faces innumerable challenges, not least the natural reluctance of all governments to expose themselves to criticism and accept responsibility for failure.

Time and again, corporate and political goals take priority over just outcomes, and the human rights of individuals and communities are undervalued and ignored.

Numerous examples of bad behaviour shock us for a while, some even receiving the focused attention of high-quality investigative journalism and Royal Commissions, but we are left unsatisfied, cynical and disengaged, more jaded than before, accepting the inevitability of existential threats, the comfort of algorithmic news feeds and vague promises to ‘drain the swamp’.

In this context, are big data and artificial intelligence the enemies of the people, the ultimate tools of the oligarch, or the vital tools needed to eliminate bias, improve scrutiny and just outcomes for the visionary?  Is there a future in which humanity evolves alongside an enhanced hive-mind in time to avert global catastrophe and create a new vision for humanity?…(More)”

The Internet Relies on People Working for Free


Owen Williams at OneZero: “When you buy a product like Philips Hue’s smart lights or an iPhone, you probably assume the people who wrote their code are being paid. While that’s true for those who directly author a product’s software, virtually every tech company also relies on thousands of bits of free code, made available through “open-source” projects on sites like GitHub and GitLab.

Often these developers are happy to work for free. Writing open-source software allows them to sharpen their skills, gain perspectives from the community, or simply help the industry by making innovations available at no cost. According to Google, which maintains hundreds of open-source projects, open source “enables and encourages collaboration and the development of technology, solving real-world problems.”

But when software used by millions of people is maintained by a community of people, or a single person, all on a volunteer basis, sometimes things can go horribly wrong. The catastrophic Heartbleed bug of 2014, which compromised the security of hundreds of millions of sites, was caused by a problem in an open-source library called OpenSSL, which relied on a single full-time developer not making a mistake as they updated and changed code used by millions. Other times, developers grow bored and abandon their projects, which can be breached while they aren’t paying attention.

It’s hard to demand that programmers who are working for free troubleshoot problems or continue to maintain software that they’ve lost interest in for whatever reason — though some companies certainly try. Not adequately maintaining these projects, on the other hand, makes the entire tech ecosystem weaker. So some open-source programmers are asking companies to pay, not for their code, but for their support services….(More)”.

Agora: Towards An Open Ecosystem for Democratizing Data Science & Artificial Intelligence


Paper by Jonas Traub et al: “Data science and artificial intelligence are driven by a plethora of diverse data-related assets including datasets, data streams, algorithms, processing software, compute resources, and domain knowledge. As providing all these assets requires a huge investment, data science and artificial intelligence are currently dominated by a small number of providers who can afford these investments. In this paper, we present a vision of a data ecosystem to democratize data science and artificial intelligence. In particular, we envision a data infrastructure for fine-grained asset exchange in combination with scalable systems operation. This will overcome lock-in effects and remove entry barriers for new asset providers. Our goal is to enable companies, research organizations, and individuals to have equal access to data, data science, and artificial intelligence. Such an open ecosystem has recently been put on the agenda of several governments and industrial associations. We point out the requirements and the research challenges as well as outline an initial data infrastructure architecture for building such a data ecosystem…(More)”.

Citizens need to know numbers


David Spiegelhalter at Aeon: “…Many criticised the Leave campaign for its claim that Britain sends the EU £350 million a week. When Boris Johnson repeated it in 2017 – by which time he was Foreign Secretary – the chair of the UK Statistics Authority (the official statistical watchdog) rebuked him, noting it was a ‘clear misuse of official statistics’. A private criminal prosecution was even made against Johnson for ‘misconduct in a public office’, but it was halted by the High Court.

The message on the bus had a strong emotional resonance with millions of people, even though it was essentially misinformation. The episode demonstrates both the power and weakness of statistics: they can be used to amplify an entire worldview, and yet they often do not stand up to scrutiny. This is why statistical literacy is so important – in an age in which data plays an ever-more prominent role in society, the ability to spot ways in which numbers can be misused, and to be able to deconstruct claims based on statistics, should be a standard civic skill.

Statistics are not cold hard facts – as Nate Silver writes in The Signal and the Noise (2012): ‘The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning.’ Not only has someone used extensive judgment in choosing what to measure, how to define crucial ideas, and to analyse them, but the manner in which they are communicated can utterly change their emotional impact.

Let’s assume that £350 million is the actual weekly contribution to the EU. I often ask audiences to suggest what they would put on the side of the bus if they were on the Remain side. A standard option for making an apparently big number look small is to consider it as a proportion of an even bigger number: for example, the UK’s GDP is currently around £2.3 trillion, and so this contribution would comprise less than 1 per cent of GDP, around six months’ typical growth. An alternative device is to break down expenditure into smaller, more easily grasped units: for example, as there are 66 million people in the UK, £350 million a week is equivalent to around 75p a day, less than $1, say about the cost of a small packet of crisps (potato chips). If the bus had said: ‘We each send the EU the price of a packet of crisps each day’, the campaign might not have been so successful.
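
Both reframings are straightforward arithmetic. The short sketch below works through them using the article’s round figures, taking the contested £350 million a week at face value purely for illustration.

```python
# The two reframings of £350 million a week, using the article's round figures.
weekly_contribution = 350e6   # £350 million per week (the contested claim, taken at face value)
uk_gdp = 2.3e12               # UK GDP: roughly £2.3 trillion per year
uk_population = 66e6          # roughly 66 million people

# 1. Make a big number look small: express the annual total as a share of GDP.
share_of_gdp = weekly_contribution * 52 / uk_gdp
print(f"Share of GDP: {share_of_gdp:.1%}")              # ~0.8%, i.e. under 1 per cent

# 2. Break it into easily grasped units: cost per person per day.
per_person_per_day = weekly_contribution / uk_population / 7
print(f"Per person per day: £{per_person_per_day:.2f}") # ~£0.76, i.e. about 75p
```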

Numbers are often used to persuade rather than inform, statistical literacy needs to be improved, and so surely we need more statistics courses in schools and universities? Well, yes, but this should not mean more of the same. After years of researching and teaching statistical methods, I am not alone in concluding that the way in which we teach statistics can be counterproductive, with an overemphasis on mathematical foundations through probability theory, long lists of tests and formulae to apply, and toy problems involving, say, calculating the standard deviation of the weights of cod. The American Statistical Association’s Guidelines for Assessment and Instruction in Statistics Education (2016) strongly recommended changing the pedagogy of statistics into one based on problem-solving and real-world examples, with an emphasis on communication….(More)”.

Experimental Innovation Policy


Paper by Albert Bravo-Biosca: “Experimental approaches are increasingly being adopted across many policy fields, but innovation policy has been lagging. This paper reviews the case for policy experimentation in this field, describes the different types of experiments that can be undertaken, discusses some of the unique challenges to the use of experimental approaches in innovation policy, and summarizes some of the emerging lessons, with a focus on randomized trials. The paper concludes by describing how, at the Innovation Growth Lab, we have been working with governments across the OECD to help them overcome the barriers to policy experimentation in order to make their policies more impactful….(More)”.