A Guide to Designing New Institutions


Guide by TIAL: “We have created this guide as part of TIAL’s broader programme of work to help with the design of new institutions needed in fields ranging from environmental change to data stewardship and AI to mental health. This toolkit offers a framework for thinking about the design of new public institutions — whether at the level of a region or city, a nation, or a transnational level. We welcome comments, critiques and additions.

This guide covers all the necessary steps of creating a new institution:

  • Preparation
  • Design (from structures and capabilities to processes and resources)
  • Socialisation (to ensure buy-in and legitimacy)
  • Implementation…(More)”.

Climate change may kill data sovereignty


Article by Trisha Ray: “Data centres are the linchpin of a nation’s technological progress, serving as the nerve centres that power the information age. The need for robust and reliable data centre infrastructure cuts across the UN Sustainable Development Goals (SDGs), forming an essential foundation for e-government, innovation and entrepreneurship, decent work, and economic growth. It comes as no surprise then that data sovereignty has gained traction over the past decade, particularly in the Global South. However, climate change threatens the very infrastructure that underpins the digital future. Its impact on data centres is a multifaceted challenge, with rising temperatures, extreme weather events, and changing environmental conditions posing significant threats to their reliability and sustainability, even as developing countries begin rolling out ambitious strategies and incentives to attract data centres…(More)”.

Data Science for Social Impact in Higher Education:  First Steps


Data.org playbook: “… was designed to help you expand opportunities for social impact data science learning. As you browse, you will see a range of these opportunities, including courses, modules for other courses, research and internship opportunities, and a variety of events and activities. The playbook also offers lessons learned to guide you through your process. Additionally, the playbook includes profiles of students who have engaged in data science for social impact, guidance for engaging partners, and additional resources relating to evaluation and courses. We hope that this playbook will inspire and support your efforts to bring social impact data science to your institutions…

As you look at the range of ways you might bring data science for social impact to your students, remember that the intention is not for you to replicate what is here, but rather to adapt it to your local contexts and conditions. You might draw pieces from several activities and combine them to create a customized strategy that works for you. Consider the assets you have around you and how you might be able to leverage them. At the same time, imagine how some of the lessons learned might also reflect barriers you could face. Most importantly, know that it is possible for you to create data science for social impact at your institution to bring benefit to your students and society…(More)”.

Medical AI could be ‘dangerous’ for poorer nations, WHO warns


Article by David Adam: “The introduction of health-care technologies based on artificial intelligence (AI) could be “dangerous” for people in lower-income countries, the World Health Organization (WHO) has warned.

The organization, which today issued a report describing new guidelines on large multi-modal models (LMMs), says it is essential that uses of the developing technology are not shaped only by technology companies and those in wealthy countries. If models aren’t trained on data from people in under-resourced places, those populations might be poorly served by the algorithms, the agency says.

“The very last thing that we want to see happen as part of this leap forward with technology is the propagation or amplification of inequities and biases in the social fabric of countries around the world,” Alain Labrique, the WHO’s director for digital health and innovation, said at a media briefing today.

The WHO issued its first guidelines on AI in health care in 2021. But the organization was prompted to update them less than three years later by the rise in the power and availability of LMMs. Also called generative AI, these models, including the one that powers the popular ChatGPT chatbot, process and produce text, videos and images…(More)”.

2024 Edelman Trust Barometer


Edelman: “The 2024 Edelman Trust Barometer reveals a new paradox at the heart of society. Rapid innovation offers the promise of a new era of prosperity, but instead risks exacerbating trust issues, leading to further societal instability and political polarization.

Innovation is accelerating – in regenerative agriculture, messenger RNA, renewable energy, and most of all in artificial intelligence. But society’s ability to process and accept rapid change is under pressure, with skepticism about science’s relationship with Government and the perception that the benefits skew towards the wealthy.

There is one issue on which the world stands united: innovation is being poorly managed – defined by lagging government regulation, uncertain impacts, lack of transparency, and an assumption that science is immutable. Our respondents cite this as a problem by nearly a two-to-one margin across most developed and developing countries, and across all age groups, income levels, educational levels, and genders. There is consensus among those who say innovation is poorly managed that society is changing too quickly and not in ways that benefit “people like me” (69%).

Many are concerned that Science is losing its independence: to Government, to the political process, and to the wealthy. In the U.S., two-thirds assert that science is too politicized. For the first time in China, we see a contrast to their high trust in government: three-quarters of respondents believe that Government and organizations that fund research have too much influence on science. There is concern about excessive influence of the elites, with 82% of those who say innovation is managed poorly believing that the system is biased in favor of the rich – this is 30 percentage points higher than among those who feel innovation is managed well…(More)”.

The Oxford Handbook of Digital Diplomacy


Book edited by Corneliu Bjola and Ilan Manor: “In recent years, digital technologies have substantially impacted the world of diplomacy. From social media platforms and artificial intelligence to smartphone applications and virtual meetings, digital technologies have proven disruptive, impacting the norms, practices and logics of diplomats, states, and diplomatic institutions. Although the term digital diplomacy is commonly used by academics and diplomats, few works to date have clearly defined this term or offered a comprehensive analysis of its evolution. This handbook investigates digital diplomacy as a practice, as a process and as a form of disruption. Written by leading experts in the field, this comprehensive volume delves into the ways in which digital technologies are being used to achieve foreign policy goals, and how diplomats are adapting to the digital age.

The Oxford Handbook of Digital Diplomacy explores the shifting power dynamics in diplomacy: the establishment of embassies in technology hubs, the challenges faced by foreign affairs departments in adapting to digital technologies, and the utilization of digital tools as a means of exerting influence. Drawing on a multidisciplinary approach, including theories from international relations, diplomacy studies, communications, sociology, internet studies, and psychology, the handbook examines the use of digital technologies for international development in the Global South, the efforts to combat digital disinformation in the Middle East, and the digital policies of countries in Europe and the Asia-Pacific. Through case studies and in-depth analysis, readers will gain a comprehensive understanding of the term “digital diplomacy” and the many ways in which diplomacy has evolved in the digital age…(More)”.

Integrating Participatory Budgeting and Institutionalized Citizens’ Assemblies: A Community-Driven Perspective


Article by Nick Vlahos: “There is a growing excitement in the democracy field about the potential of citizens’ assemblies (CAs), a practice that brings together groups of residents selected by lottery to deliberate on public policy issues. There is longitudinal evidence to suggest that deliberative mini-publics such as those convened in CAs can be transformative when it comes to adding more nuance to public opinion on complex and potentially polarizing issues.

But there are two common critiques of CAs. The first is that they are not connected to centers of power (with very few notable exceptions) and don’t have authority to make binding decisions. The second is that they are often disconnected from the broader public, and indeed often claim to be making their own, new “publics” instead of engaging with existing ones.

In this article I propose that proponents of CAs could benefit from the thirty-year history of another democratic innovation—participatory budgeting (PB). There are nearly 12,000 recorded instances of PB to draw lessons from. I see value in both innovations (and have advocated and written about both) and would be interested to see some sort of experimentation that combines PB and CAs through a decentralized, bottom-up, community-driven approach.

We can and should think about grassroots ways to scale and connect people across geography using combinations of democratic innovations, building up (local) civic infrastructure along the way by drawing on existing civic capital (resident-led groups, non-profits, service providers, social movements/mobilization, etc.)…(More)”.

Facial Recognition: Current Capabilities, Future Prospects, and Governance


A National Academies of Sciences, Engineering, and Medicine study: “Facial recognition technology is increasingly used for identity verification and identification, from aiding law enforcement investigations to identifying potential security threats at large venues. However, advances in this technology have outpaced laws and regulations, raising significant concerns related to equity, privacy, and civil liberties.

This report explores the current capabilities, future possibilities, and necessary governance for facial recognition technology. Facial Recognition Technology discusses legal, societal, and ethical implications of the technology, and recommends ways that federal agencies and others developing and deploying the technology can mitigate potential harms and enact more comprehensive safeguards…(More)”.

Introducing RegBox: using serious games in regulatory development


Toolkit by UK Policy Lab: “…enabling policymakers to convene stakeholders and work together to make decisions affecting regulation, using serious games. The toolkit will consist of game patterns for different use cases, a collection of case studies, guidance, and a set of tools to help policymakers decide which approach to take. Work on RegBox is still in progress, but in the spirit of being open and iterative we wanted to share and communicate it early. Our overarching challenge question is:

How can we provide engaging and participatory tools that help policymakers to develop and test regulations and make effective decisions? …  

Policy Lab has worked on a range of projects that intersect with regulation, and we’ve noticed a growing demand for more anticipatory and participatory approaches in this area. Regulators are having to respond to emerging technologies which are disrupting markets and posing new risks to individuals and institutions. Additionally, the government has just launched the Smarter Regulation programme, which is encouraging officials to use regulations only where necessary, and to ensure their use is proportionate and future-proof. Because a change in regulation can have significant effects on businesses, organisations, and individuals, it is important to understand the potential consequences before deciding. We hypothesise that serious games can be used to understand regulatory challenges and stress-test solutions at pace…(More)”.

People Have a Right to Climate Data


Article by Justin S. Mankin: “As a climate scientist documenting the multi-trillion-dollar price tag of the climate disasters shocking economies and destroying lives, I sometimes field requests from strategic consultants, financial investment analysts and reinsurers looking for climate data, analysis and computer code.

Often, they want to chat about my findings or have me draw out the implications for their businesses, like the time a risk analyst from BlackRock, the world’s largest asset manager, asked me to help with research on what the current El Niño, a cyclical climate pattern, means for financial markets.

These requests make sense: People and companies want to adapt to the climate risks they face from global warming. But these inquiries are also part of the wider commodification of climate science. Venture capitalists are injecting hundreds of millions of dollars into climate intelligence as they build out a rapidly growing business of climate analytics — the data, risk models, tailored analyses and insights people and institutions need to understand and respond to climate risks.

I point companies to our freely available data and code at the Dartmouth Climate Modeling and Impacts Group, which I run, but turn down additional requests for customized assessments. I regard climate information as a public good and fear contributing to a world in which information about the unfolding risks of droughts, floods, wildfires, extreme heat and rising seas is hidden behind paywalls. People and companies who can afford private risk assessments will rent, buy and establish homes and businesses in safer places than the billions of others who can’t, compounding disadvantage and leaving the most vulnerable among us exposed.

Despite this, global consultants, climate and agricultural technology start-ups, insurance companies and major financial firms are all racing to meet the ballooning demand for information about climate dangers and how to prepare for them. While a lot of this information is public, it is often voluminous, technical and not particularly useful for people trying to evaluate their personal exposure. Private risk assessments fill that gap — but at a premium. The climate risk analytics market is expected to grow to more than $4 billion globally by 2027.

I don’t mean to suggest that the private sector should not be involved in furnishing climate information. That’s not realistic. But I worry that an overreliance on the private sector to provide climate adaptation information will hollow out publicly provided climate risk science, and that means we all will pay: the well-off with money, the poor with lives…(More)”.