One year into pandemic, federal digital government is largely business as usual


Article by Amanda Clarke: “It’s been a year since the Government of Canada, like every other organization, household and individual, was forced to move its work to the web in response to the COVID-19 pandemic. When this shift first took hold, many predicted that the digital demands of the crisis would provide the push the government needed to finally give its workforce access to modern digital tools (Slack, Google Drive, etc.), to design online services that actually work, and to effectively harness data for public good. By this logic, the pandemic would step in to close the deal on the elusive goal of “digital government transformation,” where digital strategies, chief information officers and high-level political commitments had failed.

Of course, this was a ridiculous prediction. This early enthusiasm was rightfully checked by a series of thoughtful analyses that reminded us that a COVID-induced digital government transformation would not arise simply because the public service faced immediate pressures to shift its workforce online and to expand its digital services. Existing research underscores that digital government transformation requires significant structural and cultural reforms within the public service and a slate of legislative and policy changes. Without this groundwork, any apparent advances ushered in by the pandemic will at best be ephemeral wins, and at worst, shiny distractions that obscure the reality of a federal public service that has been cycling through failed renewal exercises for decades.

With this in mind, now that we are at the one-year anniversary of the pandemic, I asked a group of federal public servants leading digital government efforts if COVID-19 is triggering the kinds of administrative reforms needed to meaningfully update the Government of Canada for the realities of the digital age.

The answer, universally, without even a moment of hesitation: No….(More)”.

Runaway Technology: Can Law Keep Up?


Book by Joshua A. T. Fairfield: “In an era of corporate surveillance, artificial intelligence, deep fakes, genetic modification, automation, and more, law often seems to take a back seat to rampant technological change. To listen to Silicon Valley barons, there’s nothing any of us can do about it. In this riveting work, Joshua A. T. Fairfield calls their bluff. He provides a fresh look at law, at what it actually is, how it works, and how we can create the kind of laws that help humans thrive in the face of technological change. He shows that law can keep up with technology because law is a kind of technology – a social technology built by humans out of cooperative fictions like firms, nations, and money. However, to secure the benefits of changing technology for all of us, we need a new kind of law, one that reflects our evolving understanding of how humans use language to cooperate….(More)”.

How to Put Out Democracy’s Dumpster Fire


Anne Applebaum and Peter Pomerantsev in The Atlantic: “…With the wholesale transfer of so much entertainment, social interaction, education, commerce, and politics from the real world to the virtual world—a process recently accelerated by the coronavirus pandemic—many Americans have come to live in a nightmarish inversion of the Tocquevillian dream, a new sort of wilderness. Many modern Americans now seek camaraderie online, in a world defined not by friendship but by anomie and alienation. Instead of participating in civic organizations that give them a sense of community as well as practical experience in tolerance and consensus-building, Americans join internet mobs, in which they are submerged in the logic of the crowd, clicking Like or Share and then moving on. Instead of entering a real-life public square, they drift anonymously into digital spaces where they rarely meet opponents; when they do, it is only to vilify them.

Conversation in this new American public sphere is governed not by established customs and traditions in service of democracy but by rules set by a few for-profit companies in service of their needs and revenues. Instead of the procedural regulations that guide a real-life town meeting, conversation is ruled by algorithms that are designed to capture attention, harvest data, and sell advertising. The voices of the angriest, most emotional, most divisive—and often the most duplicitous—participants are amplified. Reasonable, rational, and nuanced voices are much harder to hear; radicalization spreads quickly. Americans feel powerless because they are.

In this new wilderness, democracy is becoming impossible. If one half of the country can’t hear the other, then Americans can no longer have shared institutions, apolitical courts, a professional civil service, or a bipartisan foreign policy. We can’t compromise. We can’t make collective decisions—we can’t even agree on what we’re deciding. No wonder millions of Americans refuse to accept the results of the most recent presidential election, despite the verdicts of state electoral committees, elected Republican officials, courts, and Congress. We no longer are the America Tocqueville admired, but have become the enfeebled democracy he feared, a place where each person,…(More)”.

Smart weather app helps Kenya’s herders brace for drought


Thomson Reuters Foundation: “Sitting under a low tree to escape the blazing Kenyan sun, Kaltuma Milkalkona and two young men hunch intently over the older woman’s smartphone – but they are not transfixed by the latest sports scores or a trending internet meme.

The men instead are looking at a weather alert for their village in the country’s north, sent through an app that uses weather station data to help pastoralists prepare for drought.

The myAnga app on Milkalkona’s phone showed that Merille would continue facing dry weather and that “pasture conditions (were) expected to be very poor with no grass and browse availability.”

One of the young men said he would warn his older brother, who had taken the family’s livestock to another area where there was water and pasture, not to come home yet.

Milkalkona, 42, who lives and sells clothing in the neighbouring town of Laisamis, said she often shared data from her phone with others who did not have smartphones.

“When I get the weather alerts, I usually show the people who are close to me,” she said, as well as calling others in more distant villages.

Extreme and erratic weather linked to a warming climate can be devastating for Kenya’s pastoralists, with prolonged droughts making it difficult to find enough pasture for their animals.

But armed with up-to-date weather information and advice, herders can plan ahead to ensure their livestock make it through the region’s frequent dry spells, said Frankline Agolla, co-founder of Amfratech, a Nairobi-based social enterprise that developed the myAnga app.

The app – its name means “my weather” – goes further than the weather reports anyone can get from the meteorological department by interpreting them and making recommendations to herders on the best way to protect their livelihoods.

“If there is an imminent drought, we advise them to sell their livestock early to reduce their losses,” said Agolla in an interview with the Thomson Reuters Foundation….

The app is part of Amfratech’s Climate Livestock and Markets (CLIMARK) project, which the company aims to roll out to more than 300,000 pastoralists in Kenya over the next five years, with funding and other help from partners including the Technical Centre for Agricultural and Rural Cooperation and the Kenya Livestock Marketing Council.

The app sends out weekly weather information in English, Swahili and other languages used in northern Kenya, and users can see forecasts for areas as small as a single village, Agolla said….(More)”.
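
To make the article’s description more concrete, here is a minimal, purely hypothetical sketch of the kind of rule-based advisory logic it describes: take a simple forecast summary and turn it into plain-language advice. The function name, field values and rules are illustrative assumptions, not the actual myAnga implementation.

```python
# Hypothetical sketch of rule-based advisory logic like that described above.
# Field values and rules are illustrative assumptions, not the actual myAnga code.

def advise(rain_outlook: str, pasture_condition: str) -> str:
    """Turn a simple forecast summary into plain-language advice for herders."""
    if rain_outlook == "below_normal" and pasture_condition == "very_poor":
        return ("Drought likely: consider selling some livestock early and keeping "
                "remaining herds in areas with water and pasture.")
    if rain_outlook == "below_normal":
        return "Dry spell expected: conserve water and monitor pasture closely."
    return "Conditions adequate: no immediate action needed."


# Example roughly matching the alert described for Merille
print(advise("below_normal", "very_poor"))
```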

How governments use evidence to make transport policy


Report by Alistair Baldwin and Kelly Shuttleworth: “The government’s ambitious transport plans will falter unless policy makers – ministers, civil servants and other public officials – improve the way they identify and use evidence to inform their decisions.

This report compares the use of evidence in the UK, the Netherlands, Sweden, Germany and New Zealand, and finds that England is an outlier in not having a coordinated transport strategy. This damages both scrutiny and coordination of transport policy.

The government has plans to reform bus services, support cycling, review rail franchising, and invest more than £60 billion in transport projects over the next five years. But these plans are not integrated. The Department for Transport should develop a new strategy that integrates different modes of transport, rather than planning mode by mode, to improve political understanding of trade-offs and scrutiny of policy decisions.

The DfT is a well-resourced department, with significant expertise, responsibilities and a wide array of analysts. But its reliance on economic evidence means other forms of evidence – including social research, evaluation and engineering – can appear neglected in transport decision making. Decision makers often attach too much importance to the Benefit-Cost Ratio at the expense of other forms of evidence.

The government needs to improve its attitude to the evaluation of past projects. There are successes – like the evaluation of the Cycle City Ambition Fund – but they are outnumbered by failures – like the evaluation of projects in the Local Growth Fund. Good practice from Highways England, for example, should become common across the transport sector, helped by dedicated funding for local authorities to evaluate projects properly….(More)”.
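
For readers unfamiliar with the metric the report says is over-weighted, a benefit-cost ratio is simply the discounted value of a scheme’s benefits divided by the discounted value of its costs. The sketch below uses purely illustrative figures and a made-up discount rate; it is not drawn from the report, but it shows the arithmetic behind the single number that, the report argues, tends to crowd out other evidence.

```python
# Minimal illustration of a benefit-cost ratio (standard appraisal arithmetic).
# All figures and the discount rate are illustrative assumptions, not from the report.

def present_value(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

benefits = [0, 20, 20, 20, 20, 20]  # hypothetical benefits per year
costs = [60, 5, 5, 5, 5, 5]         # hypothetical costs per year
rate = 0.035                        # hypothetical discount rate

bcr = present_value(benefits, rate) / present_value(costs, rate)
print(f"Benefit-cost ratio: {bcr:.2f}")  # a ratio above 1 is read as benefits exceeding costs
```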

Do conversations end when people want them to?


Paper by Adam M. Mastroianni et al: “Do conversations end when people want them to? Surprisingly, behavioral science provides no answer to this fundamental question about the most ubiquitous of all human social activities. In two studies of 932 conversations, we asked conversants to report when they had wanted a conversation to end and to estimate when their partner (who was an intimate in Study 1 and a stranger in Study 2) had wanted it to end. Results showed that conversations almost never ended when both conversants wanted them to, rarely ended when even one conversant wanted them to, and that the average discrepancy between desired and actual durations was roughly half the duration of the conversation. Conversants had little idea when their partners wanted to end the conversation and underestimated how discrepant their partners’ desires were from their own. These studies suggest that ending conversations is a classic “coordination problem” that humans are unable to solve because doing so requires information that they normally keep from each other. As a result, most conversations appear to end when no one wants them to….(More)”.

Theories of Choice: The Social Science and the Law of Decision Making


Book by Stefan Grundmann and Philipp Hacker: “Choice is a key concept of our time. It is a foundational mechanism for every legal order in societies that are, politically, constituted as democracies and, economically, built on the market mechanism. Thus, choice can be understood as an atomic structure that grounds core societal processes. In recent years, however, the debate over the right way to theorise choice—for example, as a rational or a behavioural type of decision making—has intensified. This collection therefore provides an in-depth discussion of the promises and perils of specific types of theories of choice. It shows how the selection of a specific theory of choice can make a difference for concrete legal questions, particularly in the regulation of the digital economy or in choosing between market, firm, or network.

In its first part, the volume provides an accessible overview of the current debates about rational versus behavioural approaches to theories of choice. The remainder of the book structures the vast landscape of theories of choice along three main types: individual, collective, and organisational decision making. As theories of choice proliferate and become ever more sophisticated, however, the process of choosing an adequate theory of choice becomes increasingly intricate, too. This volume addresses this selection problem for the various legal arenas in which individual, organisational, and collective decisions matter. By drawing on economic, technological, political, and legal points of view, the volume shows which theories of choice are at the disposal of the legally relevant decision maker, and how they can be implemented for the solution of concrete legal problems….(More)”.

Liability of online platforms


European Parliament Think Tank: “Given the central role that online platforms (OPs) play in the digital economy, questions arise about their responsibility in relation to illegal/harmful content or products hosted in the frame of their operation. Against this background, this study reviews the main legal/regulatory challenges associated with OP operations and analyses the incentives for OPs, their users and third parties to detect and remove illegal/harmful and dangerous material, content and/or products. To create a functional classification which can be used for regulatory purposes, it discusses the notion of OPs and attempts to categorise them under multiple criteria. The study then maps and critically assesses the whole range of OP liabilities, taking hard and soft law, self-regulation and national legislation into consideration, whenever relevant. Finally, the study puts forward policy options for an efficient EU liability regime: (i) maintaining the status quo; (ii) awareness-raising and media literacy; (iii) promoting self-regulation; (iv) establishing co-regulation mechanisms and tools; (v) adopting statutory legislation; (vi) modifying OPs’ secondary liability by employing two different models – (a) by clarifying the conditions for liability exemptions provided by the e-Commerce Directive or (b) by establishing a harmonised regime of liability….(More)”.

Lessons from a year of Covid


Essay by Yuval Noah Harari in the Financial Times: “…The Covid year has exposed an even more important limitation of our scientific and technological power. Science cannot replace politics. When we come to decide on policy, we have to take into account many interests and values, and since there is no scientific way to determine which interests and values are more important, there is no scientific way to decide what we should do.

For example, when deciding whether to impose a lockdown, it is not sufficient to ask: “How many people will fall sick with Covid-19 if we don’t impose the lockdown?”. We should also ask: “How many people will experience depression if we do impose a lockdown? How many people will suffer from bad nutrition? How many will miss school or lose their job? How many will be battered or murdered by their spouses?”

Even if all our data is accurate and reliable, we should always ask: “What do we count? Who decides what to count? How do we evaluate the numbers against each other?” This is a political rather than scientific task. It is politicians who should balance the medical, economic and social considerations and come up with a comprehensive policy.

Similarly, engineers are creating new digital platforms that help us function in lockdown, and new surveillance tools that help us break the chains of infection. But digitalisation and surveillance jeopardise our privacy and open the way for the emergence of unprecedented totalitarian regimes. In 2020, mass surveillance has become both more legitimate and more common. Fighting the epidemic is important, but is it worth destroying our freedom in the process? It is the job of politicians rather than engineers to find the right balance between useful surveillance and dystopian nightmares.

Three basic rules can go a long way in protecting us from digital dictatorships, even in a time of plague. First, whenever you collect data on people — especially on what is happening inside their own bodies — this data should be used to help these people rather than to manipulate, control or harm them. My personal physician knows many extremely private things about me. I am OK with it, because I trust my physician to use this data for my benefit. My physician shouldn’t sell this data to any corporation or political party. It should be the same with any kind of “pandemic surveillance authority” we might establish….(More)”.

Why Transparency Won’t Save Us


Essay by Sun-ha Hong: “In a society beset with black-boxed algorithms and vast surveillance systems, transparency is often hailed as liberal democracy’s superhero. It’s a familiar story: inject the public with information to digest, then await their rational deliberation and improved decision making. Whether in discussions of facial recognition software or platform moderation, we run into the argument that transparency will correct the harmful effects of algorithmic systems. The trouble is that in our movies and comic books, superheroes are themselves deus ex machina: black boxes designed to make complex problems disappear so that the good guys can win. Too often, transparency is asked to save the day on its own, under the assumption that disinformation or abuse of power can be shamed away with information.

Transparency without adequate support, however, can quickly become fuel for speculation and misunderstanding….

All this is part of a broader pattern in which the very groups who should be held accountable by the data tend to be its gatekeepers. Facebook is notorious for transparency-washing strategies, in which it dangles data access like a carrot but rarely follows through in actually delivering it. When researchers worked to create more independent means of holding Facebook accountable — as New York University’s Ad Observatory did last year, using volunteer researchers to build a public database of ads on the platform — Facebook threatened to sue them. Despite the lofty rhetoric around Facebook’s Oversight Board (often described as a “Supreme Court” for the platform), it falls into the same trap of transparency without power: the scope is limited to individual cases of content moderation, with no binding authority over the company’s business strategy, algorithmic design, or even similar moderation cases in the future.

Here, too, the real bottleneck is not information or technology, but power: the legal, political and economic pressure necessary to compel companies like Facebook to produce information and to act on it. We see this all too clearly when ordinary people do take up this labour of transparency, and attempt to hold technological systems accountable. In August 2020, Facebook users reported the Kenosha Guard group more than 400 times for incitement of violence. But Facebook declined to take any action until an armed shooter travelled to Kenosha, Wisconsin, and killed two protesters. When transparency is compromised by the concentration of power, it is often the vulnerable who are asked to make up the difference — and then to pay the price.

Transparency cannot solve our problems on its own. In his book The Rise of the Right to Know, journalism scholar Michael Schudson argues that transparency is better understood as a “secondary or procedural morality”: a tool that only becomes effective by other means. We must move beyond the pernicious myth of transparency as a universal solution, and address the distribution of economic and political power that is the root cause of technologically amplified irrationality and injustice….(More)”.