Re-Thinking Think Tanks: Differentiating Knowledge-Based Policy Influence Organizations


Paper by Adam Wellstead and Michael P. Howlett: “The idea of “think tanks” is one of the oldest in the policy sciences. While the topic has been studied for decades, recent work on advocacy groups, policy and behavioural insights labs, and the activities of think tanks themselves has led to discontent with the definitions used in the field, and especially with the way the term may obfuscate rather than clarify important distinctions between different kinds of knowledge-based policy influence organizations (KBPIO). In this paper, we examine the traditional and current definitions of think tanks utilized in the discipline and point out their weaknesses. We then develop a new framework to better capture the variation in such organizations, which operate in many sectors….(More)”.

Innovation in Real Places: Strategies for Prosperity in an Unforgiving World


Book by Dan Breznitz: “Across the world, cities and regions have wasted trillions of dollars blindly copying the Silicon Valley model of growth creation. We have lived with this system for decades, and the result is clear: a small number of regions and cities are at the top of the high-tech industry, but many more are fighting a losing battle to retain economic dynamism. But, as this book details, there are other models for innovation-based growth that don’t rely on a flourishing high-tech industry. Breznitz argues that the purveyors of the dominant ideas on innovation have a feeble understanding of the big picture on global production and innovation.

They conflate innovation with invention and suffer from techno-fetishism. In their devotion to start-ups, they refuse to admit that the real obstacle to growth for most cities is the overwhelming power of the real hubs, which siphon up vast amounts of talent and money. Communities waste time, money, and energy pursuing this road to nowhere. Instead, Breznitz proposes that communities focus on where they fit within the four stages in the global production process. Success lies in understanding the changed structure of the global system of production and then using those insights to enable communities to recognize their own advantages, which in turn allows them to foster surprising forms of specialized innovation. All localities have certain advantages relative to at least one stage of the global production process, and the trick is in recognizing them….(More)”.

The Co-Creation Compass: From Research to Action.


Policy Brief by Jill Dixon et al: “Modern public administrations face a wider range of challenges than in the past, from designing effective social services that help vulnerable citizens, to regulating data sharing between banks and fintech startups to ensure competition and growth, to mainstreaming gender policies effectively across the departments of a large public administration.

These very different goals have one thing in common. To be solved, they require collaboration with other entities – citizens, companies and other public administrations and departments. The buy-in of these entities is the factor determining success or failure in achieving the goals. To help resolve this problem, social scientists, researchers and students of public administration have devised several novel tools, some of which draw heavily on the most advanced management thinking of the last decade.

First and foremost is co-creation – an awkward-sounding word for a relatively simple idea: the notion that better services can be designed and delivered by listening to users, by creating feedback loops where their success (or failure) can be studied, by frequently innovating and iterating incremental improvements through small-scale experimentation so they can deliver large-scale learnings, and by ultimately involving users themselves in designing the way these services can be made most effective and best be delivered.

Co-creation tools and methods provide a structured manner for involving users, thereby maximising the probability of satisfaction, buy-in and adoption. As such, co-creation is not a digital tool; it is a governance tool. There is little doubt that working with citizens in re-designing the online service for school registration will boost the usefulness and effectiveness of the service. And failing to do so will result in yet another digital service struggling to gain adoption….(More)”

Tracking Economic Activity in Response to COVID-19 Using Nighttime Lights


Paper by Mark Roberts: “Over the last decade, nighttime lights – artificial lighting at night that is associated with human activity and can be detected by satellite sensors – have become a proxy for monitoring economic activity. To examine how the COVID-19 crisis has affected economic activity in Morocco, we calculated monthly light estimates both for the country overall and at the sub-national level. By examining the intensity of Morocco’s lights in comparison with quarterly GDP data at the national level, we are also able to confirm that nighttime lights can track movements in real economic activity for Morocco….(More)”.
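The basic comparison lends itself to a short illustration. The sketch below is not the paper’s code and all numbers are fabricated: it simply aggregates a hypothetical monthly sum-of-lights series to quarters and correlates it with a hypothetical quarterly GDP index, the kind of check the abstract describes for Morocco.

```python
# Illustrative sketch only: fabricated lights and GDP series for one year.
import pandas as pd

# Hypothetical monthly sum-of-lights radiance (e.g., from satellite composites)
lights = pd.Series(
    [120, 118, 121, 95, 80, 85, 100, 105, 110, 112, 115, 117],
    index=pd.period_range("2020-01", periods=12, freq="M"),
    name="sum_of_lights",
)

# Hypothetical quarterly real GDP index for the same year
gdp = pd.Series(
    [100.0, 84.0, 93.0, 97.0],
    index=pd.period_range("2020Q1", periods=4, freq="Q"),
    name="gdp_index",
)

# Aggregate monthly lights to quarters and compare their movements with GDP
lights_q = lights.groupby(lights.index.asfreq("Q")).mean()
print(pd.concat([lights_q, gdp], axis=1))
print("Correlation:", lights_q.corr(gdp))
```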

Budget transparency and governance quality: a cross-country analysis


Paper by Marco Bisogno and Beatriz Cuadrado-Ballesteros: “The aim of this study is to assess whether there is a relationship between budget transparency and governance quality. The so-called openness movement and the global financial crises, which have put significant pressure on governments to cut expenditures and ensure balanced budgets, have motivated this research. The public choice and principal-agent theories have been used to investigate this relationship, implementing econometric models based on a sample of 96 countries over the period 2008–2019. The results show that higher levels of budget transparency positively affect the quality of governance, and vice versa, documenting simultaneous causality between the two….(More)”
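As a hedged illustration of what such a cross-country exercise can look like (not the authors’ specification), the sketch below runs the two naive regressions, one in each direction, on fabricated country-year data; the variable names and the control are invented, and unlike these toy OLS models the paper’s econometric models deal with the simultaneity directly.

```python
# Toy illustration on fabricated data, not the paper's model or dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 96 * 12  # hypothetical country-year observations (96 countries, 12 years)

transparency = rng.uniform(0, 100, n)                      # e.g., a budget transparency score
governance = 0.4 * transparency + rng.normal(0, 10, n)     # fabricated relationship
df = pd.DataFrame({
    "transparency": transparency,
    "governance": governance,
    "log_gdp_pc": np.log(rng.lognormal(9, 1, n)),           # a stand-in control variable
})

# Naive regressions in both directions; simultaneity would require IV/system methods.
m1 = smf.ols("governance ~ transparency + log_gdp_pc", data=df).fit()
m2 = smf.ols("transparency ~ governance + log_gdp_pc", data=df).fit()
print(m1.params["transparency"], m2.params["governance"])
```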

Foundations of complexity economics


Article by W. Brian Arthur: “Conventional, neoclassical economics assumes perfectly rational agents (firms, consumers, investors) who face well-defined problems and arrive at optimal behaviour consistent with — in equilibrium with — the overall outcome caused by this behaviour. This rational, equilibrium system produces an elegant economics, but is restrictive and often unrealistic. Complexity economics relaxes these assumptions. It assumes that agents differ, that they have imperfect information about other agents and must, therefore, try to make sense of the situation they face. Agents explore, react and constantly change their actions and strategies in response to the outcome they mutually create. The resulting outcome may not be in equilibrium and may display patterns and emergent phenomena not visible to equilibrium analysis. The economy becomes something not given and existing but constantly forming from a developing set of actions, strategies and beliefs — something not mechanistic, static, timeless and perfect but organic, always creating itself, alive and full of messy vitality….(More)”.
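Arthur’s canonical illustration of this style of reasoning is the El Farol bar problem, in which agents who forecast attendance with imperfect, heterogeneous rules jointly create the very attendance they are trying to forecast; in his analysis, attendance self-organizes around the bar’s capacity without ever settling into a classical equilibrium. The toy version below is a sketch under illustrative parameter choices, not Arthur’s original code.

```python
# Minimal El Farol-style agent model; parameters and rules are illustrative.
import random

N_AGENTS, CAPACITY, WEEKS, MEMORY, RULES_PER_AGENT = 100, 60, 100, 5, 4
random.seed(0)

def make_predictor():
    """One simple forecasting rule: a noisy weighted look-back over recent attendance."""
    weights = [random.uniform(-1, 1) for _ in range(MEMORY)]
    bias = random.uniform(0, N_AGENTS)
    def predict(history):
        recent = history[-MEMORY:]
        raw = bias + sum(w * a for w, a in zip(weights, recent))
        return max(0.0, min(float(N_AGENTS), raw))
    return predict

# Each agent holds a few rules and acts on whichever has predicted best so far.
agents = [[{"rule": make_predictor(), "error": 0.0} for _ in range(RULES_PER_AGENT)]
          for _ in range(N_AGENTS)]
history = [random.randint(0, N_AGENTS) for _ in range(MEMORY)]  # seed attendance history

for week in range(WEEKS):
    # Each agent forecasts attendance and goes only if the bar looks uncrowded.
    forecasts = [min(pool, key=lambda p: p["error"])["rule"](history) for pool in agents]
    attendance = sum(1 for f in forecasts if f < CAPACITY)
    # Every rule's track record is updated against the outcome the agents jointly created.
    for pool in agents:
        for p in pool:
            p["error"] += abs(p["rule"](history) - attendance)
    history.append(attendance)

print("Mean attendance over the final 20 weeks:", sum(history[-20:]) / 20)
```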

Data Is Power: Washington Needs to Craft New Rules for the Digital Age


Matthew Slaughter and David McCormick at Foreign Affairs: “…Working with all willing and like-minded nations, it should seek a structure for data that maximizes its immense economic potential without sacrificing privacy and individual liberty. This framework should take the form of a treaty that has two main parts.

First would be a set of binding principles that would foster the cross-border flow of data in the most data-intensive sectors—such as energy, transportation, and health care. One set of principles concerns how to value data and determine where it was generated. Just as traditional trade regimes require goods and services to be priced and their origins defined, so, too, must this framework create a taxonomy to classify data flows by value and source. Another set of principles would set forth the privacy standards that governments and companies would have to follow to use data. (Anonymizing data, made easier by advances in encryption and quantum computing, will be critical to this step.) A final principle, which would be conditional on achieving the other two, would be to promote as much cross-border and open flow of data as possible. Consistent with the long-established value of free trade, the parties should, for example, agree not to levy taxes on data flows—and diligently enforce that rule. And they would be wise to ensure that any negative impacts of open data flows, such as job losses or reduced wages, are offset through strong programs to help affected workers adapt to the digital economy.

Such standards would benefit every sector they applied to. Envision, for example, dozens of nations with data-sharing arrangements for autonomous vehicles, oncology treatments, and clean-tech batteries. Relative to their experience in today’s Balkanized world, researchers would be able to discover more data-driven innovations—and in more countries, rather than just in those that already have a large presence in these industries.

The second part of the framework would be free-trade agreements regulating the capital goods, intermediate inputs, and final goods and services of the targeted sectors, all in an effort to maximize the gains that might arise from data-driven innovations. Thus would the traditional forces of comparative advantage and global competition help bring new self-driving vehicles, new lifesaving chemotherapy compounds, and new sources of renewable energy to participating countries around the world. 

There is already a powerful example of such agreements. In 1996, dozens of countries accounting for nearly 95 percent of world trade in information technology ratified the Information Technology Agreement, a multilateral trade deal under the WTO. The agreement ultimately eliminated all tariffs for hundreds of IT-related capital goods, intermediate inputs, and final products—from machine tools to motherboards to personal computers. The agreement proved to be an important impetus for the subsequent wave of the IT revolution, a competitive spur that led to productivity gains for firms and price declines for consumers….(More)”.

Citizen science is booming during the pandemic


Sigal Samuel at Vox: “…The pandemic has driven a huge increase in participation in citizen science, where people without specialized training collect data out in the world or perform simple analyses of data online to help out scientists.

Stuck at home with time on their hands, millions of amateurs around the world are gathering information on everything from birds to plants to Covid-19 at the request of institutional researchers. And while quarantine is mostly a nightmare for us, it’s been a great accelerant for science.

Early in the pandemic, a firehose of data started gushing forth on citizen science platforms like Zooniverse and SciStarter, where scientists ask the public to analyze their data online. It’s a form of crowdsourcing that has the added bonus of giving volunteers a real sense of community; each project has a discussion forum where participants can pose questions to each other (and often to the scientists behind the projects) and forge friendly connections.

“There’s a wonderful project called Rainfall Rescue that’s transcribing historical weather records. It’s a climate change project to understand how weather has changed over the past few centuries,” Laura Trouille, vice president of citizen science at the Adler Planetarium in Chicago and co-lead of Zooniverse, told me. “They uploaded a dataset of 10,000 weather logs that needed transcribing — and that was completed in one day!”

Some Zooniverse projects, like Snapshot Safari, ask participants to classify animals in images from wildlife cameras. That project saw classifications go from 25,000 to 200,000 per day in the initial days of lockdown. And across all its projects, Zooniverse reported that 200,000 participants contributed more than 5 million classifications of images in one week alone — the equivalent of 48 years of research. Although participation has slowed a bit since the spring, it’s still four times what it was pre-pandemic.

Many people are particularly eager to help tackle Covid-19, and scientists have harnessed their energy. Carnegie Mellon University’s Roni Rosenfeld set up a platform where volunteers can help artificial intelligence predict the spread of the coronavirus, even if they know nothing about AI. Researchers at the University of Washington invited people to contribute to Covid-19 drug discovery using a computer game called Foldit; they experimented with designing proteins that could attach to the virus that causes Covid-19 and prevent it from entering cells….(More)”.

In AI We Trust: Power, Illusion and Control of Predictive Algorithms


Book by Helga Nowotny: “One of the most persistent concerns about the future is whether it will be dominated by the predictive algorithms of AI – and, if so, what this will mean for our behaviour, for our institutions and for what it means to be human. AI changes our experience of time and the future and challenges our identities, yet we are blinded by its efficiency and fail to understand how it affects us.

At the heart of our trust in AI lies a paradox: we leverage AI to increase control over the future and uncertainty, while at the same time the performativity of AI, the power it has to make us act in the ways it predicts, reduces our agency over the future. This happens when we forget that we humans have created the digital technologies to which we attribute agency. These developments also challenge the narrative of progress, which played such a central role in modernity and is based on the hubris of total control. We are now moving into an era where this control is limited as AI monitors our actions, posing the threat of surveillance, but also offering the opportunity to reappropriate control and transform it into care.

As we try to adjust to a world in which algorithms, robots and avatars play an ever-increasing role, we need to understand better the limitations of AI and how its predictions affect our agency, while at the same time having the courage to embrace the uncertainty of the future….(More)”.

Leave No Migrant Behind: The 2030 Agenda and Data Disaggregation


Guide by the International Organization for Migration (IOM): “To date, disaggregation of global development data by migratory status remains low. Migrants are largely invisible in official SDG data. As the global community approaches 2030, very little is known about the impact of the 2030 Agenda on migrants. Despite a growing focus worldwide on data disaggregation, namely the breaking down of data into smaller sub-categories, there is a lack of practical guidance on the topic that can be tailored to address individual needs and capacities of countries.
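In practice, disaggregation of this kind is conceptually simple. The sketch below is a minimal, hedged example (not taken from the IOM guide, with fabricated micro-data and illustrative column names) of breaking a single indicator down by migratory status and sex rather than reporting only a national average.

```python
# Illustrative only: fabricated survey micro-data and invented column names.
import pandas as pd

survey = pd.DataFrame({
    "migratory_status": ["migrant", "non-migrant", "migrant", "non-migrant",
                         "migrant", "non-migrant", "migrant", "non-migrant"],
    "sex":              ["F", "F", "M", "M", "F", "F", "M", "M"],
    "in_school":        [1, 1, 0, 1, 1, 1, 0, 1],   # e.g., an SDG 4-style indicator
})

# National average hides the gap; disaggregation by migratory status and sex reveals it.
national_rate = survey["in_school"].mean()
disaggregated = survey.groupby(["migratory_status", "sex"])["in_school"].mean()
print(f"National rate: {national_rate:.0%}")
print(disaggregated)
```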

Developed by IOM’s Global Migration Data Analysis Centre (GMDAC), the guide titled ‘Leave No Migrant Behind: The 2030 Agenda and Data Disaggregation‘ centres on nine SDGs, focusing on hunger, education and gender equality, among others. The document is the first of its kind, in that it seeks to address a range of different categorization interests and needs related to international migrants and suggests practical steps that practitioners can tailor to best fit their context…The guide also highlights the key role disaggregation plays in understanding the many positive links between migration and the SDGs, including migrants’ contributions to the 2030 Agenda.

The guide outlines key steps for actors to plan and implement initiatives by looking at sex, gender, age and disability, in addition to migratory status. These steps include undertaking awareness-raising, identifying priority indicators, conducting data mapping, and more…. Read more about the importance of data disaggregation for SDG indicators here….(More)”