Hypotheses devised by AI could find ‘blind spots’ in research


Article by Matthew Hutson: “One approach is to use AI to help scientists brainstorm. This is a task that large language models — AI systems trained on large amounts of text to produce new text — are well suited for, says Yolanda Gil, a computer scientist at the University of Southern California in Los Angeles who has worked on AI scientists. Language models can produce inaccurate information and present it as real, but this ‘hallucination’ isn’t necessarily bad, Mullainathan says. It signifies, he says, “‘here’s a kind of thing that looks true’. That’s exactly what a hypothesis is.”

Blind spots are where AI might prove most useful. James Evans, a sociologist at the University of Chicago, has pushed AI to make ‘alien’ hypotheses — those that a human would be unlikely to make. In a paper published earlier this year in Nature Human Behaviour, he and his colleague Jamshid Sourati built knowledge graphs containing not just materials and properties, but also researchers. Evans and Sourati’s algorithm traversed these networks, looking for hidden shortcuts between materials and properties. The aim was to maximize the plausibility of AI-devised hypotheses being true while minimizing the chances that researchers would hit on them naturally. For instance, if scientists who are studying a particular drug are only distantly connected to those studying a disease that it might cure, then the drug’s potential would ordinarily take much longer to discover.

When Evans and Sourati fed data published up to 2001 to their AI, they found that about 30% of its predictions about drug repurposing and the electrical properties of materials had been uncovered by researchers roughly six to ten years later. The system can be tuned to make predictions that are more likely to be correct but also less of a leap, on the basis of concurrent findings and collaborations, Evans says. But “if we’re predicting what people are going to do next year, that just feels like a scoop machine”, he adds. He’s more interested in how the technology can take science in entirely new directions…(More)”.
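The traversal described above can be sketched in miniature. The following is an illustrative toy, not Evans and Sourati’s published algorithm: the node names (`drug_A`, `researcher_1`, etc.) and the scoring heuristic (researcher distance minus content distance) are assumptions invented for this sketch, which simply rewards hypotheses that are well supported by the literature graph yet sit far from any single research community.

```python
from collections import deque

# Toy knowledge graph (adjacency list). Nodes are materials, properties,
# and researchers; an edge means the two co-occur in the literature.
EDGES = [
    ("drug_A", "protein_X"), ("protein_X", "disease_Y"),
    ("researcher_1", "drug_A"),
    ("researcher_2", "disease_Y"), ("researcher_2", "drug_B"),
    ("drug_B", "disease_Y"),
    ("researcher_1", "researcher_3"), ("researcher_3", "researcher_2"),
]

def build_graph(edges):
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    return graph

def distance(graph, src, dst):
    """Breadth-first shortest-path length; None if dst is unreachable."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None

def alien_score(graph, material, prop):
    """Score a 'material -> property' hypothesis: reward a short content
    path (plausibility) and a long path between the researchers attached
    to each endpoint (low chance of imminent human discovery)."""
    content = distance(graph, material, prop)
    if content is None:
        return None
    side = lambda node: [n for n in graph[node] if n.startswith("researcher")]
    dists = [distance(graph, a, b) for a in side(material) for b in side(prop)]
    dists = [d for d in dists if d is not None]
    if not dists:
        return None
    return min(dists) - content  # higher = plausible but 'alien'

G = build_graph(EDGES)
```

On this toy graph, the `drug_A → disease_Y` hypothesis outscores `drug_B → disease_Y`: both are plausible, but the researchers studying `drug_B` already overlap with the disease community, so a human would likely reach that finding first.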

Understanding AI jargon: Artificial intelligence vocabulary


Article by Kate Woodford: “Today, the Cambridge Dictionary announces its Word of the Year for 2023: hallucinate. You might already be familiar with this word, which we use to talk about seeing, hearing, or feeling things that don’t really exist. But did you know that it has a new meaning when it’s used in the context of artificial intelligence?

To celebrate the Word of the Year, this post is dedicated to AI terms that have recently come into the English language. AI, as you probably know, is short for artificial intelligence – the use of computer systems with qualities similar to the human brain that allow them to ‘learn’ and ‘think’. It’s a subject that arouses a great deal of interest and excitement and, it must be said, a degree of anxiety. Let’s have a look at some of these new words and phrases and see what they mean and how we’re using them to talk about AI…

As the field of AI continues to develop quickly, so does the language we use to talk about it. In a recent New Words post, we shared some words about AI that are being considered for addition to the Cambridge Dictionary…(More)”.

Policy primer on non-personal data 


Primer by the International Chamber of Commerce: “Non-personal data plays a critical role in providing solutions to global challenges. Unlocking its full potential requires policymakers, businesses, and all other stakeholders to collaborate to construct policy environments that can capitalise on its benefits.  

This report gives insights into the different ways that non-personal data has a positive impact on society, with benefits including, but not limited to: 

  1. Tracking disease outbreaks; 
  2. Facilitating international scientific cooperation; 
  3. Understanding climate-related trends; 
  4. Improving agricultural practices for increased efficiency; 
  5. Optimising energy consumption; 
  6. Developing evidence-based policy; 
  7. Enhancing cross-border cybersecurity cooperation. 

In addition, businesses of all sizes benefit from the transfer of data across borders, allowing companies to establish and maintain international supply chains and smaller businesses to enter new markets or reduce operating costs. 

Despite these benefits, international flows of non-personal data are frequently limited by restrictions and data localisation measures. A growing patchwork of regulations can also create barriers to realising the potential of non-personal data. This report explores the impact of data flow restrictions including: 

  • Hindering global supply chains; 
  • Limiting the use of AI reliant on large datasets; 
  • Disincentivising data sharing amongst companies; 
  • Preventing companies from analysing the data they hold…(More)”.

GovTech in Fragile and Conflict Situations Trends, Challenges, and Opportunities


Report by the World Bank: “This report takes stock of the development of GovTech solutions in Fragile and Conflict-Affected Situations (FCS), be they characterized by low institutional capacity and/or by active conflict, and provides insights on challenges and opportunities for implementing GovTech reforms in such contexts. It is aimed at practitioners and policy makers working in FCS but will also be useful for practitioners working in Fragility, Conflict, and Violence (FCV) contexts, at-risk countries, or low-income countries, as some similar challenges and opportunities can be present…(More)”.

Design Thinking Misses the Mark


Article by Anne-Laure Fayard & Sarah Fathallah: “Nonprofits, governments, and international agencies often turn to design thinking to tackle complex social challenges and develop innovative solutions with—rather than for—people. Design thinking was conceptualized by designer Nigel Cross more than four decades ago, notably in the 1982 Design Studies article “Designerly Ways of Knowing.” The approach was later packaged for popular consumption by global design and innovation consultancy IDEO. Design thinking quickly became the go-to innovation tool kit in the for-profit world—and, soon after, in the international development and social sectors—because of its commitment to center communities in the collaborative design process.

IDEO’s then-CEO Tim Brown and Jocelyn Wyatt, who was then lead of the IDEO social innovation group that became IDEO.org, championed design thinking for the social sector in their 2010 Stanford Social Innovation Review article, “Design Thinking for Social Innovation,” which has become an important reference for design thinking in the social sector. Embraced by high-profile philanthropists like Bill & Melinda Gates Foundation cofounder Melinda Gates and Acumen founder and CEO Jacqueline Novogratz, design thinking soared in popularity because it promised to deliver profound societal change. Brown even claimed, in a 2014 Harvard Business Review article, that design thinking could improve democratic capitalism.

However, design thinking has not lived up to such promises. In a 2023 MIT Technology Review article, writer and designer Rebecca Ackerman argued that while “design thinking was supposed to fix the world,” organizations rarely implement the ideas generated during the design-thinking process. The failure to implement these ideas resulted from an inadequate understanding of the problem, of the complexities of the institutional and cultural contexts, or both…(More)”.

Social Economy Science


Open Access Book edited by Gorgi Krlev, Dominika Wruk, Giulio Pasi, and Marika Bernhard: “Lack of progress in the area of global sustainable development and difficulties in crisis management highlight the need to transform the economy and find new ways of making society more resilient. The social economy is increasingly recognized as a driver of such transformations; it comprises traditional forms of cooperative or solidarity-based organizations alongside new phenomena such as impact investing or social tech ventures that aim to contribute to the public good. Social Economy Science provides the first comprehensive analysis of why and how social economy organizations create superior value for society. The book draws on organizational theory and transition studies to provide a systematic perspective on complex multi-stakeholder forms of action. It discusses the social economy’s role in promoting innovation for impact, as well as its role as an agent of societal change and as a partner to businesses, governments, and citizens…(More)”.

The public good of statistics – narratives from around the world


Blog by Ken Roy: “I have been looking at some of the narratives used by bodies producing Official Statistics – specifically those in a sample of recent strategies and business plans from different National Statistical Offices. Inevitably these documents focus on planned programmes of work – the key statistical outputs, the technical and methodological investments, etc. – and occasionally on interesting things like budgets.

When these documents touch on the rationale for (or purpose of) Official Statistics, one approach is to present Official Statistics as a ‘right’ of citizens or as essential national infrastructure. For example, Statistics Finland frames Official Statistics as “our shared national capital”. A further common approach is to reference the broad purpose of improved decision making – Statistics Canada has the aim that “Canadians have the key information they need to make evidence-based decisions.”

Looking beyond these high-level statements, I was keen to find any further, more specific, expressions of real-world impacts. The following sets out some initial groups of ideas and some representative quotes.

In terms of direct impacts for citizens, some strategies have a headline aim that citizens are knowledgeable about their world – Statistics Iceland aims to enable an “informed society”. A slightly different ambition is that different groups of citizens are represented or ‘seen’ by Official Statistics. The UK Statistics Authority aims to “reflect the experiences of everyone in our society so that everyone counts, and is counted, and no one is forgotten”. There are also references to the role of Official Statistics (and data more broadly) in empowering citizens – most commonly through giving them the means to hold government to account. One of the headline aims of New Zealand’s Data Investment Plan is that “government is held to account through a robust and transparent data system”.

Also relevant to citizens is the ambition for Official Statistics to enable healthy, informed public debate – one aim of the Australian Bureau of Statistics is that their work will “provide reliable information on a range of matters critical to public debate”.

Some narratives hint at the contribution of Official Statistics systems to national economic success. Stats NZ notes that “the integrity of official data can have wide-ranging implications … such as the interest charged on government borrowing.” The Papua New Guinea statistics office references a focus on “private sector investors who want to use data and statistics to aid investment decisions”.

Finally, we come to governments. Official Statistics are regularly presented as essential to a better, more effective, government process – through establishing understanding of the circumstances and needs of citizens, businesses and places and hence supporting the development and implementation of better policies, programmes and services in response. The National Bureau of Statistics (Tanzania) sees Official Statistics as enabling “evidence-based formulation, planning, monitoring and evaluation which are key in the realization of development aspirations.” A related theme is the contribution to good governance – the United Nations presents Official Statistics as “an essential element of the accountability of governments and public bodies to the public in a democratic society…(More)”.

The Time is Now: Establishing a Mutual Commitment Framework (MCF) to Accelerate Data Collaboratives


Article by Stefaan Verhulst, Andrew Schroeder and William Hoffman: “The key to unlocking the value of data lies in responsibly lowering the barriers and shared risks of data access, re-use, and collaboration in the public interest. Data collaboratives, which foster responsible access and re-use of data among diverse stakeholders, provide a solution to these challenges.

Today, however, setting up data collaboratives takes too much time and is prone to multiple delays, hindering our ability to understand and respond swiftly and effectively to urgent global crises. The readiness of data collaboratives during crises faces key obstacles in terms of data use agreements, technical infrastructure, vetted and reproducible methodologies, and a clear understanding of the questions which may be answered more effectively with additional data.

Organizations aiming to create data collaboratives face additional challenges, as they often lack established operational protocols and practices which can streamline implementation, reduce costs, and save time. New regulations are emerging that should help drive the adoption of standard protocols and processes. In particular, the EU Data Governance Act and the forthcoming Data Act aim to enable responsible data collaboration. Concepts like data spaces and rulebooks seek to build trust and strike a balance between regulation and technological innovation.

This working paper advances the case for creating a Mutual Commitment Framework (MCF) in advance of a crisis that can serve as a necessary and practical means to break through chronic choke points and shorten response times. By accelerating the establishment of operational (and legally cognizable) data collaboratives, duties of care can be defined and a stronger sense of trust, clarity, and purpose can be instilled among participating entities. This structured approach ensures that data sharing and processing are conducted within well-defined, pre-authorized boundaries, thereby lowering shared risks and promoting a conducive environment for collaboration…(More)”.

Open Government for Stronger Democracies


A Global Assessment by the OECD: “Open government is a powerful catalyst for driving democracy, public trust, and inclusive growth. In recognition of this, the OECD Council adopted the Recommendation on Open Government in 2017. To date, it remains the first – and only – internationally recognised legal instrument on open government and has guided many countries in designing and implementing their open government agendas. This report takes stock of countries’ implementation of the Recommendation, its dissemination, and its ongoing significance. It is based on an OECD survey carried out in 2020/2021 among all countries that adhered to the Recommendation and other partner countries, as well as on further data collected through a perception survey with delegates to the OECD Working Party on Open Government…(More)”.

Innovation in Anticipation for Migration: A Deep Dive into Methods, Tools, and Data Sources


Blog by Sara Marcucci and Stefaan Verhulst: “In the ever-evolving landscape of anticipatory methods for migration policy, innovation is a dynamic force propelling the field forward. This seems to be happening in two main ways: first, as we mentioned in our previous blog, one of the significant shifts lies in the blurring of boundaries between quantitative forecasting and qualitative foresight, as emerging mixed-method approaches challenge traditional paradigms. This transformation opens up new pathways for understanding complex phenomena, particularly in the context of human migration flows. 

Second, the innovation happening today is not necessarily rooted in the development of entirely new methodologies, but rather in how existing methods are adapted and enhanced. Indeed, innovation seems to extend to the utilization of diverse tools and data sources that bolster the effectiveness of existing methods, offering a more comprehensive and timely perspective on migration trends.

In the context of this blog series, methods refer to the various approaches and techniques used to anticipate and analyze migration trends, challenges, and opportunities. These methods are employed to make informed decisions and develop policies related to human migration. They can include a wide range of strategies to gather and interpret data and insights in the field of migration policy. 

Tools, on the other hand, refer to the specific instruments or technologies used to support and enhance the effectiveness of these methods. They encompass a diverse set of resources and technologies that facilitate data collection, analysis, and decision-making in the context of migration policy. These tools can include both quantitative and qualitative data collection and analysis tools, as well as innovative data sources, software, and techniques that help enhance anticipatory methods.

This blog takes a deep dive into the main anticipatory methods adopted in the field of migration, as well as some of the tools and data sources employed to enhance and experiment with them. First, the blog will provide a list of the methods considered; second, it will illustrate the main innovative tools employed; and finally, it will provide a set of new, non-traditional data sources that are increasingly being used to feed anticipatory methods…(More)”.