The Work of the Future: Shaping Technology and Institutions


Report by David Autor, David Mindell and Elisabeth Reynolds for the MIT Future of Work Task Force: “The world now stands on the cusp of a technological revolution in artificial intelligence and robotics that may prove as transformative for economic growth and human potential as were electrification, mass production, and electronic telecommunications in their eras. New and emerging technologies will raise aggregate economic output and boost the wealth of nations. Will these developments enable people to attain higher living standards, better working conditions, greater economic security, and improved health and longevity? The answers to these questions are not predetermined. They depend upon the institutions, investments, and policies that we deploy to harness the opportunities and confront the challenges posed by this new era.

How can we move beyond unhelpful prognostications about the supposed end of work and toward insights that will enable policymakers, businesses, and people to better navigate the disruptions that are coming and underway? What lessons should we take from previous epochs of rapid technological change? How is it different this time? And how can we strengthen institutions, make investments, and forge policies to ensure that the labor market of the 21st century enables workers to contribute and succeed?

To help answer these questions, and to provide a framework for the Task Force’s efforts over the next year, this report examines several aspects of the interaction between work and technology….(More)”.

Tackling misinformation during crisis


Paper by Elizabeth Seger and Mark Briers: “The current COVID-19 pandemic and the accompanying ‘infodemic’ clearly illustrate that access to reliable information is crucial to coordinating a timely crisis response in democratic societies. Inaccurate information and the muzzling of important information sources have degraded trust in health authorities and slowed public response to the crisis. Misinformation about ineffective cures, the origins and malicious spread of COVID-19, unverified treatment discoveries, and the efficacy of face coverings has increased the difficulty of coordinating a unified public response during the crisis.

In a recent report, researchers at the Cambridge Centre for the Study of Existential Risk (CSER) in collaboration with The Alan Turing Institute and the Defence Science and Technology Laboratory (Dstl) workshopped an array of hypothetical crisis scenarios to investigate social and technological factors that interfere with well-informed decision-making and timely collective action in democratic societies.

Crisis scenarios

Crisis scenarios are useful tools for appraising threats and vulnerabilities to systems of information production, dissemination, and evaluation. Factors influencing how robust a society is to such threats and vulnerabilities are not always obvious when life is relatively tranquil but are often highlighted under the stress of a crisis. 

CSER and Dstl workshop organisers, together with workshop participants (a diverse group of professionals interested in topics related to [mis/dis]information, information technology, and crisis response), co-developed and explored six hypothetical crisis scenarios and complex challenges:

  • Global health crisis
  • Character assassination
  • State fake news campaign
  • Economic collapse
  • Xenophobic ethnic cleansing
  • Epistemic babble, where the ability of the general population to tell the difference between truth and fiction (presented as truth) is lost

We analysed each scenario to identify various interest groups and actors, to pinpoint vulnerabilities in systems of information production and exchange, and to visualise how the system might be interfered with. We also considered interventions that could help bolster society against threats to informed decision-making.

The systems map below is an example from workshop scenario 1: Global health crisis. The map shows how adversarial actors (red) and groups working to mitigate the crisis (blue) interact, impact each other’s actions, and influence the general public and other interest groups (green) such as those affected by the health crisis. 

Systems maps help visualise vulnerabilities in both red and blue actor systems, which, in turn, helps identify areas where intervention (yellow) is possible to help mitigate the crisis….(More)”.
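The published report does not include the underlying data behind these maps, but the structure described above (adversarial actors in red, mitigating groups in blue, interest groups in green, and intervention points in yellow, linked by influence relationships) maps naturally onto a directed graph. The sketch below is purely illustrative: the actor names and edges are hypothetical, and the Python networkx library is assumed rather than any tooling the CSER/Dstl team actually used.

```python
# Illustrative only: a toy directed graph mirroring the red/blue/green/yellow
# actor structure described for the "Global health crisis" systems map.
# Actor names and edges are hypothetical, not taken from the CSER/Dstl workshop.
import networkx as nx

G = nx.DiGraph()

# Nodes carry a 'role' attribute corresponding to the map's colour coding.
G.add_node("State-backed disinformation network", role="adversarial")  # red
G.add_node("Public health authority", role="mitigating")               # blue
G.add_node("Independent fact-checkers", role="mitigating")             # blue
G.add_node("General public", role="interest")                          # green
G.add_node("Affected communities", role="interest")                    # green

# Directed edges describe who influences whom, and how.
G.add_edge("State-backed disinformation network", "General public",
           effect="spreads false cures and origin stories")
G.add_edge("Public health authority", "General public",
           effect="issues guidance and situation reports")
G.add_edge("Independent fact-checkers", "General public",
           effect="debunks viral claims")
G.add_edge("State-backed disinformation network", "Public health authority",
           effect="erodes trust in official messaging")

# A crude way to surface vulnerabilities: which nodes absorb the most influence
# from adversarial (red) actors? Those are candidate sites for yellow
# interventions aimed at mitigating the crisis.
def adversarial_pressure(graph):
    scores = {}
    for source, target in graph.edges():
        if graph.nodes[source]["role"] == "adversarial":
            scores[target] = scores.get(target, 0) + 1
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(adversarial_pressure(G))  # e.g. [('General public', 1), ('Public health authority', 1)]
```

Even a toy representation like this makes the workshop’s point concrete: once actors and their influence channels are written down explicitly, candidate intervention points can be read off the structure of the map rather than guessed at.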

Transparency in Local Governments: Patterns and Practices of Twenty-first Century


Paper by Redeemer Dornudo Yao Krah and Gerard Mertens: “The study is a systematic literature review that assembles scientific knowledge on local government transparency in the twenty-first century. The study finds remarkable growth in research on local government transparency over the century’s first nineteen years, particularly in Europe and North America. Social, economic, political, and institutional factors are found to account for this trend. In vogue among local governments is the use of information technology to enhance transparency. The pressure to become transparent largely comes from the passage of Freedom of Information laws and open data initiatives of governments….(More)”.

Putting Games to Work in the Battle Against COVID-19


Sara Frueh at the National Academies: “While video games often give us a way to explore other worlds, they can also help us learn more about our own — including how to navigate a pandemic. That was the premise underlying “Jamming the Curve,” a competition that enlisted over 400 independent video game developers around the world to develop concepts for games that reflect the real-world dynamics of COVID-19.
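The article does not describe how individual entries model transmission under the hood, so the following is only a hypothetical sketch of the kind of “real-world dynamics” such a game might simulate: a minimal discrete-time SIR (susceptible-infected-recovered) loop, with parameter values chosen purely for demonstration.

```python
# Hypothetical illustration: a minimal discrete-time SIR loop of the sort a
# pandemic game might advance once per in-game day. The parameters are
# arbitrary demonstration values, not drawn from any Jamming the Curve entry.
def simulate_sir(population=1000, initially_infected=5,
                 beta=0.30, gamma=0.10, days=120):
    s = population - initially_infected   # susceptible
    i = initially_infected                # currently infected
    r = 0.0                               # recovered
    history = []
    for day in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((day, round(s), round(i), round(r)))
    return history

# A player action such as mask-wearing or social distancing can be modelled as
# a lower transmission rate (beta); comparing the peak number of infections
# shows the player the consequence of that choice.
peak_baseline = max(infected for _, _, infected, _ in simulate_sir(beta=0.30))
peak_distanced = max(infected for _, _, infected, _ in simulate_sir(beta=0.15))
print(f"Peak infections: {peak_baseline} without distancing, {peak_distanced} with distancing")
```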

“Games can help connect our individual actions to larger-scale impact … and help translate data into engaging stories,” said Rick Thomas, associate program officer of LabX, a program of the National Academy of Sciences that supports creative approaches to public engagement.

Working with partners IndieCade and Georgia Tech, LabX brought Jamming the Curve to life over two weeks in September.

The “game jam” generated over 50 game concepts that drop players into a wide array of roles — from a subway rider trying to minimize the spread of infection among passengers, to a grocery store cashier trying to help customers while avoiding COVID-19, to a fox ninja tasked with dispensing masks to other forest creatures.

The five winning game concepts (see below) were announced at an award ceremony in late October, where each winning team was given a $1,000 prize and the chance to compete for a $20,000 grant to develop their game further.

The power of games

“Sometimes public health concepts can be a little dry,” said Carla Alvarado, a public health expert and program officer at the National Academies who served as a judge for the competition, during the awards ceremony. “Games package that information — it’s bite-sized, it’s digestible, and it’s palatable.”

And because games engage the senses and involve movement, they help people remember what they learn, she said. “That type of learning — experiential learning — helps retain a lot of the concepts.”

The idea of doing a game jam around COVID-19 began when Janet Murray of Georgia Tech reached out to Stephanie Barish and her colleagues at IndieCade about games’ potential to help express the complicated data around the disease. “Not everybody really knows how to look at all of that information, and games are so wonderful at reaching people in ways that people understand,” Barish said.

Rick Thomas and the LabX team heard about the idea for Jamming the Curve and saw how they could contribute. The program had experience organizing other game projects around role-playing and storytelling — along with access to a range of scientists and public health experts through the National Academies’ networks.

“Given the high stakes of the topic around COVID-19 and the amount of misinformation around the pandemic, we really needed to make sure that we were doing this right when it came to creating these games,” said Thomas. LabX helped to recruit public health professionals involved in the COVID-19 response, as well as experts in science communication and risk perception, to serve as mentors to the game developers.

Play the Winning Games!

Trailers and some playable prototypes for the five winning game concepts can be found online:

  • Everyday Hero, in which players work to stop the spread of COVID-19 through measures such as social distancing and mask use
  • PandeManager, which gives players the job of a town’s mayor who must slow the spread of disease among citizens
  • Lab Hero, in which users play a first responder who is working hard to find a vaccine while following proper health protocols
  • Cat Colony Crisis, in which a ship of space-faring cats must deal with a mysterious disease outbreak
  • Outbreak in Space, which challenges players to save friends and family from a spreading epidemic in an alien world

All of the games submitted to Jamming the Curve can be found at itch.io.

The games needed to be fun as well as scientifically accurate — and so IndieCade, Georgia Tech, and Seattle Indies recruited gaming experts who could advise participants on how to make their creations engaging and easy to understand….(More)”.

How to Use the Bureaucracy to Govern Well


Good Governance Paper by Rebecca Ingber: “…Below I offer four concrete recommendations for deploying Intentional Bureaucratic Architecture within the executive branch. But first, I will establish three key background considerations that provide context for these recommendations. The focus of this piece is primarily executive branch legal decisionmaking, but many of these recommendations apply equally to other areas of policymaking.

First, make room for the views and expertise of career officials. As a political appointee entering a new office, ask those career officials: What are the big issues on the horizon on which we will need to take policy or legal views?  What are the problems with the positions I am inheriting?  What is and is not working?  Where are the points of conflict with our allies abroad or with Congress?  Career officials are the institutional memory of the government and often the only real experts in the specific work of their agency.  They will know about the skeletons in the closet and where the bodies are buried and all the other metaphors for knowing things that other people do not. Turn to them early. Value them. They will have views informed by experience rather than partisan politics. But all bureaucratic actors, including civil servants, also bring to the table their own biases, and they may overvalue the priorities of their own office over others. Valuing their role does not mean handing the reins over to the civil service—good governance requires exercising judgement and balancing the benefits of experience and expertise with fresh eyes and leadership. A savvy bureaucratic actor might know how to “get around” the bureaucratic roadblocks, but the wise bureaucratic player also knows how much the career bureaucracy has to offer and exercises judgment based in clear values about when to defer and when to overrule.

Second, get ahead of decisions: choose vehicles for action carefully and early. The reality of government life is that much of the big decisionmaking happens in the face of a fire drill. As I’ve written elsewhere, the trigger or “interpretation catalyst” that compels the government to consider and assert a position—in other words, the cause of that fire drill—shapes the whole process of decisionmaking and the resulting decision. When an issue arises in defensive litigation, a litigation-driven process controls.  That means that career line attorneys shape the government’s legal posture, drawing from longstanding positions and often using language from old briefs. DOJ calls the shots in a context biased toward zealous defense of past action. That looks very different from a decisionmaking process that results from the president issuing an executive order or presidential memorandum, a White House official deciding to make a speech, the State Department filing a report with a treaty body, or DOD considering whether to engage in an operation involving force. Each of these interpretation catalysts triggers a different process for decisionmaking that will shape the resulting outcome.  But because of the stickiness of government decisions—and the urgent need to move on to the next fire drill—these positions become entrenched once taken. That means that the process and outcome are driven by the hazards of external events, unless officials find ways to take the reins and get ahead of them.

And finally, an incoming administration must put real effort into Intentional Bureaucratic Architecture by deliberately and deliberatively creating and managing the bureaucratic processes in which decisionmaking happens. Novel issues arise and fire drills will inevitably happen in even the best prepared administrations. The bureaucratic architecture will dictate how decisionmaking happens from the novel crises to the bread and butter of daily agency work. There are countless varieties of decisionmaking models inside the executive branch, which I have classified in other work. These include a unitary decider model, of which DOJ’s Office of Legal Counsel (OLC) is a prime example, an agency decider model, and a group lawyering model. All of these models will continue to co-exist. Most modern national security decisionmaking engages the interests and operations of multiple agencies. Therefore, in a functional government, most of these decisions will involve group lawyering in some format—from agency lawyers picking up the phone to coordinate with counterparts in other agencies to ad hoc meetings to formal regularized working groups with clear hierarchies all the way up to the cabinet. Often these processes evolve organically, as issues arise. Some are created from the top down by presidential administrations that want to impose order on the process. But all of these group lawyering dynamics often lack a well-defined process for determining the outcome in cases of conflict or deciding how to establish a clear output. This requires rule setting and organizing the process from the top down….(More)”.

Learning like a State: Statecraft in the Digital Age


Essay by Marion Fourcade and Jeff Gordon: “…Recent books have argued that we live in an age of “informational” or “surveillance” capitalism, a new form of market governance marked by the accumulation and assetization of information, and by the dominance of platforms as sites of value extraction. Over the last decade-plus, both actual and idealized governance have been transformed by a combination of neoliberal ideology, new technologies for tracking and ranking populations, and the normative model of the platform behemoths, which carry the banner of technological modernity. In concluding a review of Julie Cohen’s and Shoshana Zuboff’s books, Amy Kapczynski asks how we might build public power sufficient to govern the new private power. Answering that question, we believe, requires an honest reckoning with how public power has been warped by the same ideological, technological, and legal forces that brought about informational capitalism.

In our contribution to the inaugural JLPE issue, we argue that governments and their agents are starting to conceive of their role differently than in previous techno-social moments. Our jumping-off point is the observation that what may first appear as mere shifts in the state’s use of technology—from the “open data” movement to the NSA’s massive surveillance operation—actually herald a deeper transformation in the nature of statecraft itself. By “statecraft,” we mean the state’s mode of learning about society and intervening in it. We contrast what we call the “dataist” state with its high modernist predecessor, as portrayed memorably by the anthropologist James C. Scott, and with neoliberal governmentality, described by, among others, Michel Foucault and Wendy Brown.

The high modernist state expanded the scope of sovereignty by imposing borders, taking censuses, and coercing those on the outskirts of society into legibility through broad categorical lenses. It deployed its power to support large public projects, such as the reorganization of urban infrastructure. As the ideological zeitgeist evolved toward neoliberalism in the 1970s, however, the priority shifted to shoring up markets, and the imperative of legibility trickled down to the individual level. The poor and working class were left to fend for their rights and benefits in the name of market fitness and responsibility, while large corporations and the wealthy benefited handsomely.

As a political rationality, dataism builds on both of these threads by pursuing a project of total measurement in a neoliberal fashion—that is, by allocating rights and benefits to citizens and organizations according to (questionable) estimates of moral desert, and by re-assembling a legible society from the bottom up. Weakened by decades of anti-government ideology and concomitantly eroded capacity, privatization, and symbolic degradation, Western states have determined to manage social problems as they bubble up into crises rather than affirmatively seeking to intervene in their causes. The dataist state sets its sights on an expanse of emergent opportunities and threats. Its focus is not on control or competition, but on “readiness.” Its object is neither the population nor a putative homo economicus, but (as Gilles Deleuze put it) “dividuals,” that is, discrete slices of people and things (e.g. hospital visits, police stops, commuting trips). Under dataism, a well-governed society is one where events (not persons) are aligned to the state’s models and predictions, no matter how disorderly in high modernist terms or how irrational in neoliberal terms….(More)”.

Taming Complexity


Martin Reeves, Simon Levin, Thomas Fink and Ania Levina at Harvard Business Review: “…“Complexity” is one of the most frequently used terms in business but also one of the most ambiguous. Even in the sciences it has numerous definitions. For our purposes, we’ll define it as a large number of different elements (such as specific technologies, raw materials, products, people, and organizational units) that have many different connections to one another. Both qualities can be a source of advantage or disadvantage, depending on how they’re managed.

Let’s look at their strengths. To begin with, having many different elements increases the resilience of a system. A company that relies on just a few technologies, products, and processes—or that is staffed with people who have very similar backgrounds and perspectives—doesn’t have many ways to react to unforeseen opportunities and threats. What’s more, the redundancy and duplication that also characterize complex systems typically give them more buffering capacity and fallback options.

Ecosystems with a diversity of elements benefit from adaptability. In biology, genetic diversity is the grist for natural selection, nature’s learning mechanism. In business, as environments shift, sustained performance requires new offerings and capabilities—which can be created by recombining existing elements in fresh ways. For example, the fashion retailer Zara introduces styles (combinations of components) in excess of immediate needs, allowing it to identify the most popular products, create a tailored selection from them, and adapt to fast-changing fashion as a result.

Another advantage that complexity can confer on natural ecosystems is better coordination. That’s because the elements are often highly interconnected. Flocks of birds or herds of animals, for instance, share behavioral protocols that connect the members to one another and enable them to move and act as a group rather than as an uncoordinated collection of individuals. Thus they realize benefits such as collective security and more-effective foraging.

Finally, complexity can confer inimitability. Whereas individual elements may be easily copied, the interrelationships among multiple elements are hard to replicate. A case in point is Apple’s attempt in 2012 to compete with Google Maps. Apple underestimated the complexity of Google’s offering, leading to embarrassing glitches in the initial versions of its map app, which consequently struggled to gain acceptance with consumers. The same is true of a company’s strategy: If its complexity makes it hard to understand, rivals will struggle to imitate it, and the company will benefit….(More)”.

Cyber Republic


Book by George Zarkadakis: “Around the world, liberal democracies are in crisis. Citizens have lost faith in their government; right-wing nationalist movements frame the political debate. At the same time, economic inequality is increasing dramatically; digital technologies have created a new class of super-rich entrepreneurs. Automation threatens to transform the free economy into a zero-sum game in which capital wins and labor loses. But is this digital dystopia inevitable? In Cyber Republic, George Zarkadakis presents an alternative, outlining a plan for using technology to make liberal democracies more inclusive and the digital economy more equitable. Cyber Republic is no less than a guide for the coming Fourth Industrial Revolution and the post-pandemic world.

Zarkadakis, an expert on technology and management, explains how artificial intelligence, together with intelligent robotics, sophisticated sensors, communication networks, and big data, will fundamentally reshape the global economy; a new “intelligent machine age” will force us to adopt new forms of economic and political organization. He envisions a future liberal democracy in which intelligent machines facilitate citizen assemblies, helping to extend citizen rights, and blockchains and cryptoeconomics enable new forms of democratic governance and business collaboration. Moreover, the same technologies can be applied to scientific research and technological innovation. We need not fear automation, Zarkadakis argues; in a post-work future, intelligent machines can collaborate with humans to achieve the human goals of inclusivity and equality….(More)”.

Technology and Democracy: understanding the influence of online technologies on political behaviour and decision-making


Report by the Joint Research Centre (EU): “…The report analyses the cognitive challenges posed by four pressure points (the attention economy, platform choice architectures, algorithmic content curation, and disinformation) and makes policy recommendations to address them.

Specific actions could include banning microtargeting for political ads, introducing transparency rules so that users understand how an algorithm uses their data and to what effect, or requiring online platforms to provide reports to users showing when, how and which of their data is sold.

This report is the second output from the JRC’s Enlightenment 2.0 multi-annual research programme….(More)”.

Policy making in a digital world


Report by Lewis Lloyd: “…Policy makers across government lack the necessary skills and understanding to take advantage of digital technologies when tackling problems such as coronavirus and climate change. The report says that already-poor data management has been exacerbated by a lack of leadership, with the role of government chief data officer unfilled since 2017. These failings have been laid bare by the stuttering coronavirus Test and Trace programme.

Drawing on interviews with policy experts and digital specialists inside and outside government, the report argues that better use of data and new technologies, such as artificial intelligence, would improve policy makers’ understanding of problems like coronavirus and climate change, and aid collaboration with colleagues, external organisations and the public in seeking solutions to them. It urges government to trial innovative applications of data and technology to a wider range of policies, but warns that recent failures such as the A-level algorithm fiasco mean it must also do more to secure public trust in its use of such technologies. This means strengthening oversight, initiating a wider public debate about the appropriate use of digital technologies, and improving officials’ understanding of the limitations of data-driven analysis. The report recommends that the government:

  1. Appoints a chief data officer as soon as possible to drive work on improving data quality, tackle problems with legacy IT and make sure new data standards are applied and enforced across government.
  2. Places more emphasis on statistical and technological literacy when recruiting and training policy officials.
  3. Sets up a new independent body to lead on public engagement in policy making, with an initial focus on how and when government should use data and technology…(More)”.