US Department of Commerce: “…This guidance provides actionable guidelines and best practices for publishing open data optimized for generative AI systems. While it is designed for use by the Department of Commerce and its bureaus, this guidance has been made publicly available to benefit open data publishers globally…(More)”. See also: A Fourth Wave of Open Data? Exploring the Spectrum of Scenarios for Open Data and Generative AI
The Future of Jobs Report 2025
Report by the World Economic Forum: “Technological change, geoeconomic fragmentation, economic uncertainty, demographic shifts and the green transition – individually and in combination – are among the major drivers expected to shape and transform the global labour market by 2030. The Future of Jobs Report 2025 brings together the perspectives of over 1,000 leading global employers—collectively representing more than 14 million workers across 22 industry clusters and 55 economies—to examine how these macrotrends impact jobs and skills, and the workforce transformation strategies employers plan to embark on in response, across the 2025 to 2030 timeframe…(More)”.
The Bridging Dictionary
About: “What if generative AI could help us understand people with opposing views better just by showing how they use common words and phrases differently? That’s the deceptively simple-sounding idea behind a new experiment from MIT’s Center for Constructive Communication (CCC).
It’s called the Bridging Dictionary (BD), a research prototype that’s still very much a work in progress – one we hope your feedback will help us improve.
The Bridging Dictionary identifies words and phrases that both reflect and contribute to sharply divergent views in our fractured public sphere. That’s the “dictionary” part. If that’s all it did, we could just call it the “Frictionary.” But the large language model (LLM) that undergirds the BD also suggests less polarized alternatives – hence “bridging.”
In this prototype, research scientist Doug Beeferman and a team at CCC led by Maya Detwiller and Dennis Jen used thousands of transcripts and opinion articles from foxnews.com and msnbc.com as proxies for the conversation on the right and the left. You’ll see the most polarized words and phrases when you sample the BD for yourself, but you can also plug any term of your choosing into the search box. (For a more complete explanation of the methodology behind the BD, see https://bridgingdictionary.org/info/.)…(More)”.
The People Say
About: “The People Say is an online research hub that features first-hand insights from older adults and caregivers on the issues most important to them, as well as feedback from experts on policies affecting older adults.
This project particularly focuses on the experiences of communities often under-consulted in policymaking, including older adults of color, those who are low income, and/or those who live in rural areas where healthcare isn’t easily accessible. The People Say is funded by The SCAN Foundation and developed by researchers and designers at the Public Policy Lab.
We believe that effective policymaking listens to most-affected communities—but policies and systems that serve older adults are typically formed with little to no input from older adults themselves. We hope The People Say will help policymakers hear the voices of older adults when shaping policy…(More)”.
The Circle of Sharing: How Open Datasets Power AI Innovation
A Sankey diagram developed by AI World and Hugging Face: “…illustrating the flow from top open-source datasets through AI organizations to their derivative models, showcasing the collaborative nature of AI development…(More)”.
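The underlying idea of such a diagram is to aggregate dataset → organization → model relationships into weighted flows. A minimal sketch of that preparation step, using hypothetical placeholder edges (not data from the AI World / Hugging Face visualization), might look like this:

```python
from collections import Counter

# Hypothetical (open dataset, AI organization, derivative model) triples.
# Illustrative placeholders only, not drawn from the actual diagram.
edges = [
    ("The Pile", "EleutherAI", "GPT-NeoX-20B"),
    ("The Pile", "EleutherAI", "Pythia"),
    ("Common Crawl", "EleutherAI", "Pythia"),
    ("Common Crawl", "BigScience", "BLOOM"),
]

def sankey_links(edges):
    """Aggregate edge counts into the (source, target, value) triples
    that a Sankey renderer (e.g. plotly's go.Sankey) expects."""
    links = Counter()
    for dataset, org, model in edges:
        links[(dataset, org)] += 1   # dataset -> organization flow
        links[(org, model)] += 1     # organization -> model flow
    return [(source, target, value) for (source, target), value in links.items()]

for source, target, value in sankey_links(edges):
    print(f"{source} -> {target}: {value}")
```

The resulting triples can be fed directly into any Sankey plotting library; the aggregation itself is independent of the renderer.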

Trust but Verify: A Guide to Conducting Due Diligence When Leveraging Non-Traditional Data in the Public Interest
New Report by Sara Marcucci, Andrew J. Zahuranec, and Stefaan Verhulst: “In an increasingly data-driven world, organizations across sectors are recognizing the potential of non-traditional data—data generated from sources outside conventional databases, such as social media, satellite imagery, and mobile usage—to provide insights into societal trends and challenges. When harnessed thoughtfully, this data can improve decision-making and bolster public interest projects in areas as varied as disaster response, healthcare, and environmental protection. However, with these new data streams come heightened ethical, legal, and operational risks that organizations need to manage responsibly. That’s where due diligence comes in, helping to ensure that data initiatives are beneficial and ethical.

The report, Trust but Verify: A Guide to Conducting Due Diligence When Leveraging Non-Traditional Data in the Public Interest, co-authored by Sara Marcucci, Andrew J. Zahuranec, and Stefaan Verhulst, offers a comprehensive framework to guide organizations in responsible data partnerships. Whether you’re a public agency or a private enterprise, this report provides a six-step process to ensure due diligence and maintain accountability, integrity, and trust in data initiatives…(More) (Blog)”.
Global Trends in Government Innovation 2024
OECD Report: “Governments worldwide are transforming public services through innovative approaches that place people at the center of design and delivery. This report analyses nearly 800 case studies from 83 countries and identifies five critical trends in government innovation that are reshaping public services. First, governments are working with users and stakeholders to co-design solutions and anticipate future needs to create flexible, responsive, resilient and sustainable public services. Second, governments are investing in scalable digital infrastructure, experimenting with emergent technologies (such as automation, AI and modular code), and expanding innovative and digital skills to make public services more efficient. Third, governments are making public services more personalised and proactive to better meet people’s needs and expectations and reduce psychological costs and administrative frictions, ensuring they are more accessible, inclusive and empowering, especially for persons and groups in vulnerable and disadvantaged circumstances. Fourth, governments are drawing on traditional and non-traditional data sources to guide public service design and execution. They are also increasingly using experimentation to navigate highly complex and unpredictable environments. Finally, governments are reframing public services as opportunities and channels for citizens to exercise their civic engagement and hold governments accountable for upholding democratic values such as openness and inclusion…(More)”.
Harnessing AI: How to develop and integrate automated prediction systems for humanitarian anticipatory action
CEPR Report: “Despite unprecedented access to data, resources, and wealth, the world faces an escalating wave of humanitarian crises. Armed conflict, climate-induced disasters, and political instability are displacing millions and devastating communities. Nearly one in five children is living in or fleeing conflict zones (OCHA, 2024). Often the impacts of conflict and climatic hazards – such as droughts and floods – exacerbate each other, leading to even greater suffering. As crises unfold and escalate, the need for timely and effective humanitarian action becomes paramount.
Sophisticated systems for forecasting and monitoring natural and man-made hazards have emerged as critical tools to help inform and prompt action. The full potential for the use of such automated forecasting systems to inform anticipatory action (AA) is immense but is still to be realised. By providing early warnings and predictive insights, these systems could help organisations allocate resources more efficiently, plan interventions more effectively, and ultimately save lives and prevent or reduce humanitarian impact.
This Policy Insight provides an account of the significant technical, ethical, and organisational difficulties involved in such systems, and the current solutions in place…(More)”.
The Recommendation on Information Integrity
OECD Recommendation: “…The digital transformation of societies has reshaped how people interact and engage with information. Advancements in digital technologies and novel forms of communication have changed the way information is produced, shared, and consumed, locally and globally and across all media. Technological changes and the critical importance of online information platforms offer unprecedented access to information, foster citizen engagement and connection, and allow for innovative news reporting. However, they can also provide a fertile ground for the rapid spread of false, altered, or misleading content. In addition, new generative AI tools have greatly reduced the barriers to creating and spreading content.
Promoting the availability and free flow of high-quality, evidence-based information is key to upholding individuals’ ability to seek and receive information and ideas of all kinds and to safeguarding freedom of opinion and expression.
The volume of content to which citizens are exposed can obscure and saturate public debates and help widen societal divisions. In this context, the quality of civic discourse declines as evidence-based information, which helps people make sense of their social environment, becomes harder to find. This reality has acted as a catalyst for governments to explore more closely the roles they can play, keeping as a priority in our democracies the necessity that governments should not exercise control of the information ecosystem and that, on the contrary, they support an environment where a plurality of information sources, views, and opinions can thrive…Building on the detailed policy framework outlined in the OECD report Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity, the Recommendation provides an ambitious and actionable international standard that will help governments develop a systemic approach to foster information integrity, relying on a multi-stakeholder approach…(More)”.
Must NLP be Extractive?
Paper by Steven Bird: “How do we roll out language technologies across a world with 7,000 languages? In one story, we scale the successes of NLP further into ‘low-resource’ languages, doing ever more with less. However, this approach does not recognise the fact that – beyond the 500 institutional languages – the remaining languages are oral vernaculars. These speech communities interact with the outside world using a ‘contact language’. I argue that contact languages are the appropriate target for technologies like speech recognition and machine translation, and that the 6,500 oral vernaculars should be approached differently. I share stories from an Indigenous community where local people reshaped an extractive agenda to align with their relational agenda. I describe the emerging paradigm of Relational NLP and explain how it opens the way to non-extractive methods and to solutions that enhance human agency…(More)”.