Inclusive by default: strategies for more inclusive participation


Article by Luiza Jardim and Maria Lucien: “…The systemic challenges that marginalised groups face are pressing and require action. The global average age of parliamentarians is 53, highlighting a gap in youth representation. Young people already face challenges like poverty, lack of education, unemployment and multiple forms of discrimination. Additionally, some participatory formats are often unappealing to young people and pose a challenge for engaging them. Gender equity research highlights the underrepresentation of women at all levels of decision-making and governance. Despite recent improvements, gender parity in governance worldwide is still decades or even centuries away. Meanwhile, ongoing global conflicts in Ukraine, Sudan, Gaza and elsewhere, as well as the impacts of a changing climate, have driven the recent increase in the number of forcibly displaced people to more than 100 million. The engagement of these individuals in decision-making can vary greatly depending on their specific circumstances and the nature of their displacement.

Participatory and deliberative democracy can have transformative impacts on historically marginalised communities but only if they are intentionally included in program design and implementation. To start with, it’s possible to reduce the barriers to participation, such as the cost and time of transport to the participation venue, or burdens imposed by social and cultural roles in society, like childcare. During the process, mindful and attentive facilitation can help balance power dynamics and encourage participation from traditionally excluded people. This is further strengthened if the facilitation team includes and trains members of priority communities in facilitation and session planning…(More)”.

Are We Ready for the Next Pandemic? Navigating the First and Last Mile Challenges in Data Utilization


Blog by Stefaan Verhulst, Daniela Paolotti, Ciro Cattuto and Alessandro Vespignani:

“Public health officials from around the world are gathering this week in Geneva for a weeklong meeting of the 77th World Health Assembly. A key question they are examining is: Are we ready for the next pandemic? As we have written elsewhere, regarding access to and re-use of data, particularly non-traditional data, for pandemic preparedness and response: we are not. Below, we list ten recommendations to advance access to and reuse of non-traditional data for pandemics, drawing on input from a high-level workshop, held in Brussels, within the context of the ESCAPE program…(More)”

Unmasking and Quantifying Power Structures: How Network Analysis Enhances Peace and State-Building Efforts


Blog by Issa Luna Pla: “Critiques of peace and state-building efforts have pointed out the inadequate grasp of the origins of conflict, political unrest, and the intricate dynamics of criminal and illicit networks (Holt and Bouch, 2009; Cockayne and Lupel, 2011). This limited understanding has failed to sufficiently weaken their economic and political influence or effectively curb their activities and objectives. A recent study highlights that although punitive approaches may have temporarily diminished the power of these networks, the absence of robust analytical tools has made it difficult to assess the enduring impact of these strategies.

1. Application of Network Analytics in State-Building

The importance of analytics in international peace and state-building operations is becoming increasingly recognized (O’Brien, 2010; Gnanguenon, 2021; Rød et al., 2023). Analytics, particularly network analysis, plays a crucial role in dissecting and dismantling complex power structures that often undermine peace initiatives and governance reforms. This analytical approach is essential for revealing and disrupting the entrenched networks that sustain ongoing conflicts or obstruct peace processes. From the experiences in Guatemala, three significant lessons have been learned regarding the need for analytics for regional and thematic priorities in such operations (Waxenecker, 2019). These insights are vital for understanding how to tailor analytical strategies to address specific challenges in conflict-affected areas.

  1. The effectiveness of the International Commission against Impunity in Guatemala (CICIG) in dismantling criminal networks was constrained by its lack of advanced analytical tools. This limitation prevented a deeper exploration of the conflicts’ roots and hindered the assessment of the long-term impacts of its strategies. While the CICIG had a systematic approach to understanding criminal networks from a contextual and legal perspective, its action plans lacked comprehensive statistical analytics methodologies, leading to missed opportunities in targeting key strategic players within these networks. High-level arrests were based on available evidence and charges that prosecutors could substantiate, rather than a strategic analysis of actors’ roles and influences within the networks’ dynamics (a minimal illustration of such analysis appears after this list).
  2. Furthermore, the extent of network dismantlement and the lasting effects of imprisonment and financial control of the illicit groups’ assets remain unclear, highlighting the need for predictive analytics to anticipate conflict and assess the sustainability of such interventions. Such tools could enable operations to forecast potential disruptions or stability, allowing for data-driven proactive measures to prevent violence or bolster peace.
  3. Lastly, insights derived from network analysis suggest that efforts should focus on enhancing diplomatic negotiations, promoting economic development and social capital, and balancing punitive measures with strategic interventions. By understanding the dynamics and modeling group behavior in conflict zones, negotiations can be better informed by a deep and holistic comprehension of the underlying power structures and motivations. This approach could also help in forecasting recidivism, assessing risks of network reorganization, and evaluating the potential for increased armament, workforce, or empowerment, thereby facilitating more effective and sustainable peacebuilding initiatives.
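
The “strategic analysis of actors’ roles and influences” referred to above can be made concrete with standard network metrics. Below is a minimal, hypothetical sketch (not drawn from the CICIG case or from the studies cited here) using the open-source networkx library; all actor names and ties are invented for illustration.

```python
# Hypothetical sketch: ranking invented actors in an illicit network by
# betweenness centrality, a common proxy for brokerage power.
import networkx as nx

# Invented ties (e.g., co-appearance in case files or financial links)
ties = [
    ("broker", "financier"), ("broker", "operator_1"), ("broker", "operator_2"),
    ("financier", "front_company"), ("operator_1", "operator_2"),
    ("front_company", "official"), ("broker", "official"),
]
G = nx.Graph(ties)

# Actors with high betweenness bridge otherwise separate parts of the network;
# targeting them tends to be more disruptive than targeting visible figureheads.
betweenness = nx.betweenness_centrality(G)
for actor, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(f"{actor:15s} betweenness={score:.2f}")
```

In practice, analysts would derive ties from case files, financial records or communications metadata, and would combine several centrality and community-detection measures rather than relying on a single score.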

2. Advancing Legal and Institutional Reforms

Utilizing data science in conflict-affected environments offers unique insights into the behavior of illicit networks and their interactions within the public and private sectors (Morselli et al., 2007; Leuprecht and Hall, 2014; Campedelli et al., 2019). This systematic approach, grounded in the analysis of years of illicit activities in Guatemala, highlights the necessity of rethinking traditional legal and institutional frameworks…(More)”.

The Future of Peacebuilding: Why Investing in PeaceTech is Essential in Today’s Geopolitics


Article by Artur Kluz and Stefaan Verhulst: “In today’s geopolitical landscape, marked by escalating tensions and technological advancements, there is a significant opportunity for technology to contribute to conflict prevention and peacebuilding: i.e. peacetech. Secretary of State Antony Blinken, in his recent speech on Technology and the Transformation of US Foreign Policy, emphasized the crucial role technology plays in geopolitical contests and its potential as an “engine of historic possibility — for our economies, for our democracies, for our people, for our planet.” His assertion that “security, stability, prosperity — they are no longer solely analog matters” underscores the necessity to urgently focus on and invest in technological innovations that can support peacebuilding in the digital age.

Peacetech is an emerging field that describes a range of technologies that can be used for peacebuilding. From satellite internet constellations and early warning systems to AI-driven conflict prediction models, peacetech has the potential to transform the landscape of peacekeeping and conflict prevention. With its diversity of applications, it can support institutions’ peacebuilding or conflict prevention activities by providing insights faster and at scale. It can empower local populations to promote their safety and security and help observers predict future conflict…(More)”.

Private Thought and Public Speech


Essay by David Bromwich: “The past decade has witnessed a notable rise in the deployment of outrageous speech and censorship: opposite tendencies, on the face of things, which actually strengthen each other’s claim. My aim in this essay is to defend the traditional civil libertarian argument against censorship, without defending outrageous speech. By outrageous, I should add, I don’t mean angry or indignant or accusing speech, of the sort its opponents call “extreme” (often because it expresses an opinion shared by a small minority). Spoken words of this sort may give an impetus to thought, and their existence is preferable to anything that could be done to silence them. Outrageous speech, by contrast, is speech that means only to enrage, and not to convey any information or argument, in however primitive a form. No intelligent person wishes there were more of it. But, for the survival of a free society, censorship is far more dangerous.  

Let me try for a closer description of these rival tendencies. On the one hand, there is the unembarrassed publication of the degrading epithet, the intemperate accusation, the outlandish verbal assault against a person thought to be an erring member of one’s own milieu; and on the other hand, the bureaucratized penalizing of inappropriate speech (often classified as such quite recently) which has become common in the academic, media, professional, and corporate workplace. …(More)”.

How the war on drunk driving was won


Blog by Nick Cowen: “…Viewed from the 1960s it might have seemed like ending drunk driving would be impossible. Even in the 1980s, the movement seemed unlikely to succeed and many researchers questioned whether it constituted a social problem at all.

Yet things did change: in 1980, 1,450 fatalities were attributed to drunk driving accidents in the UK. In 2020, there were 220. Road deaths in general declined much more slowly, from around 6,000 in 1980 to 1,500 in 2020. Drunk driving fatalities dropped overall and as a percentage of all road deaths.
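
Using the UK figures quoted above, the “as a percentage of all road deaths” point can be checked with rough arithmetic (the totals are approximate):

$$\frac{1{,}450}{6{,}000} \approx 24\% \text{ of road deaths in 1980} \qquad \frac{220}{1{,}500} \approx 15\% \text{ in 2020}$$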

The same thing happened in the United States, though not to quite the same extent. In 1980, there were around 28,000 drunk driving deaths there, while in 2020, there were 11,654. Despite this progress, drunk driving remains a substantial public threat, comparable in scale to homicide (of which in 2020 there were 594 in Britain and 21,570 in America).

Of course, many things have happened in the last 40 years that contributed to this reduction. Vehicles are better designed to prioritize life preservation in the event of a collision. Emergency hospital care has improved so that people are more likely to survive serious injuries from car accidents. But, above all, driving while drunk has become stigmatized.

This stigma didn’t come from nowhere. Governments across the Western world, along with many civil society organizations, engaged in hard-hitting education campaigns about the risks of drunk driving. And they didn’t just talk. Tens of thousands of people faced criminal sanctions, and many were even put in jail.

Two underappreciated ideas stick out from this experience. First, deterrence works: incentives matter to offenders much more than many scholars found initially plausible. Second, the long-run impact that successful criminal justice interventions have is not primarily in rehabilitation, incapacitation, or even deterrence, but in altering the social norms around acceptable behavior…(More)”.

On the Meaning of Community Consent in a Biorepository Context


Article by Astha Kapoor, Samuel Moore, and Megan Doerr: “Biorepositories, vital for medical research, collect and store human biological samples and associated data for future use. However, our reliance solely on the individual consent of data contributors for biorepository data governance is becoming inadequate. Big data analysis focuses on large-scale behaviors and patterns, shifting focus from singular data points to identifying data “journeys” relevant to a collective. The individual becomes a small part of the analysis, with the harms and benefits emanating from the data occurring at an aggregated level.

Community refers to a particular qualitative aspect of a group of people that is not well captured by quantitative measures in biorepositories. This is not an excuse to dodge the question of how to account for communities in a biorepository context; rather, it shows that a framework is needed for defining different types of community that may be approached from a biorepository perspective. 

Engaging with communities in biorepository governance presents several challenges. Moving away from a purely individualized understanding of governance towards a more collectivizing approach necessitates an appreciation of the messiness of group identity, its ephemerality, and the conflicts entailed therein. So while community implies a certain degree of homogeneity (i.e., that all members of a community share something in common), it is important to understand that people can simultaneously consider themselves a member of a community while disagreeing with many of its members, the values the community holds, or the positions for which it advocates. The complex nature of community participation therefore requires proper treatment for it to be useful in a biorepository governance context…(More)”.

Participatory mapping as a social digital tool


Blog by María de los Ángeles Briones: “…we will use 14 different examples from different continents and contexts to explore the goals and methods used for participatory mapping as a social digital tool. Although the case studies look very different and come from a range of cultural backgrounds, they share a number of similarities.

Although the examples have different goals, we have identified four main focus areas: activism, conviviality, networking and urban planning. More localised mapping projects often had a focus on activism. We also see that maps are not isolated tools; they complement other communication tools and platforms.

The internet has transformed communications and networks across the globe – allowing for interconnectivity and scalability of information among and between different groups of society. This allows voices, regardless of their location, to be amplified and heard by many others in pursuit of collective goals. This has great potential in a global world where it is evident that top-down initiatives are not enough to handle many of the social needs that local people experience. However, though the internet makes sharing and collaborating between people easier, offline maps are still valuable, as shown in some of our examples.

The similarity between the different maps that we explored is that they are social digital tools. They are social because they relate to projects that seek to address social needs; and they are digital because they are built on digital platforms that keep them alive and allow them to be spread, shared and used. These characteristics also refer to their function and design.

A tool can be defined as a device or implement, especially one held in the hand, used to carry out a particular function. So when we speak of a tool there are four things involved: an actor, an object, a function and a purpose. Just as a hammer is a tool that a carpenter (actor) uses to hammer nails (function) and thus build something (purpose), we understand that social tools are used by one or more people to take actions whose final objective is to meet a social need…(More)”.

Big data for everyone


Article by Henrietta Howells: “Raw neuroimaging data require further processing before they can be used for scientific or clinical research. Traditionally, this could be accomplished with a single powerful computer. However, much greater computing power is required to analyze the large open-access cohorts that are increasingly being released to the community. And processing pipelines are inconsistently scripted, which can hinder reproducibility efforts. This creates a barrier for labs lacking access to sufficient resources or technological support, potentially excluding them from neuroimaging research. A paper by Hayashi and colleagues in Nature Methods offers a solution. They present https://brainlife.io, a freely available, web-based platform for secure neuroimaging data access, processing, visualization and analysis. It leverages ‘opportunistic computing’, which pools processing power from commercial and academic clouds, making it accessible to scientists worldwide. This is a step towards lowering the barriers for entry into big data neuroimaging research…(More)”.

“Data Commons”: Under Threat by or The Solution for a Generative AI Era? Rethinking Data Access and Re-use


Article by Stefaan G. Verhulst, Hannah Chafetz and Andrew Zahuranec: “One of the great paradoxes of our datafied era is that we live amid both unprecedented abundance and scarcity. Even as data grows more central to our ability to promote the public good, so too does it remain deeply — and perhaps increasingly — inaccessible and privately controlled. In response, there have been growing calls for “data commons” — pools of data that would be (self-)managed by distinctive communities or entities operating in the public’s interest. These pools could then be made accessible and reused for the common good.

Data commons are typically the results of collaborative and participatory approaches to data governance [1]. They offer an alternative to the growing tendency toward privatized data silos or extractive re-use of open data sets, instead emphasizing the communal and shared value of data — for example, by making data resources accessible in an ethical and sustainable way for purposes in alignment with community values or interests such as scientific research, social good initiatives, environmental monitoring, public health, and other domains.

Data commons can today be considered (the missing) critical infrastructure for leveraging data to advance societal wellbeing. When designed responsibly, they offer potential solutions for a variety of wicked problems, from climate change to pandemics and economic and social inequities. However, the rapid ascent of generative artificial intelligence (AI) technologies is changing the rules of the game, leading both to new opportunities as well as significant challenges for these communal data repositories.

On the one hand, generative AI has the potential to unlock new insights from data for a broader audience (through conversational interfaces such as chats), fostering innovation, and streamlining decision-making to serve the public interest. Generative AI also stands out in the realm of data governance due to its ability to reuse data at a massive scale, which has been a persistent challenge in many open data initiatives. On the other hand, generative AI raises uncomfortable questions related to equitable access, sustainability, and the ethical re-use of shared data resources. Further, without the right guardrails, funding models and enabling governance frameworks, data commons risk becoming data graveyards — vast repositories of unused, and largely unusable, data.

Ten-part framework to rethink Data Commons

In what follows, we lay out some of the challenges and opportunities posed by generative AI for data commons. We then turn to a ten-part framework to set the stage for a broader exploration on how to reimagine and reinvigorate data commons for the generative AI era. This framework establishes a landscape for further investigation; our goal is not so much to define what an updated data commons would look like but to lay out pathways that would lead to a more meaningful assessment of the design requirements for resilient data commons in the age of generative AI…(More)”