Why PeaceTech must be the next frontier of innovation and investment


Article by Stefaan Verhulst and Artur Kluz: “…amidst this frenzy, a crucial question is being left unasked: Can technology be used not just to win wars, but to prevent them and save people’s lives?

There is an emerging field that dares to pose this question—PeaceTech. It is the use of technology to save human lives, prevent conflict, de-escalate violence, rebuild fractured communities, and secure fragile peace in post-conflict environments.

From early warning systems that predict outbreaks of violence, to platforms ensuring aid transparency, to mobile tools connecting refugees to services: PeaceTech is real, it works—and it is radically underfunded.

Unlike the vast sums pouring into defense startups, peacebuilding efforts, including PeaceTech organizations and ventures, struggle for scraps. In 2020, the United Nations Secretary-General announced an ambitious goal of raising $1.5 billion in peacebuilding support over seven years. In contrast, private investment in defense tech crossed $34 billion in 2023 alone.

Why is PeaceTech so neglected?

One reason PeaceTech is so neglected is cultural: in the tech world, “peace” can seem abstract or idealistic—soft power in a world of hard tech. In reality, peace is not soft; it is among the hardest, most complex challenges of our time. Peace requires systemic thinking, early intervention, global coordination, and a massive infrastructure of care, trust, and monitoring. Maintaining peace in a hyper-polarized, technologically complex world is a feat of engineering, diplomacy, and foresight.

And it’s a business opportunity. According to the Institute for Economics and Peace, violence costs the global economy over $17 trillion per year—about 13% of global GDP. Even modest improvements in peace would unlock billions in economic value.
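
A rough back-of-the-envelope sketch of what these figures imply (derived only from the $17 trillion and 13% numbers cited above; the improvement scenarios are hypothetical):

```python
# Back-of-the-envelope sketch of the "peace dividend" implied by the
# Institute for Economics and Peace figures cited above. Illustrative only.

cost_of_violence = 17e12   # ~$17 trillion per year (cited figure)
share_of_gdp = 0.13        # ~13% of global GDP (cited figure)

implied_global_gdp = cost_of_violence / share_of_gdp
print(f"Implied global GDP: about ${implied_global_gdp / 1e12:.0f} trillion")

# Hypothetical "modest improvements" and the annual value they would free up:
for improvement in (0.01, 0.05, 0.10):
    savings = cost_of_violence * improvement
    print(f"{improvement:.0%} reduction in the cost of violence: "
          f"~${savings / 1e9:,.0f} billion per year")
```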

Consider the peace dividend from predictive analytics that can help governments or international organizations intervene or mediate before conflict breaks out, or from AI-powered verification tools that enforce ceasefires and disinformation controls. PeaceTech, if scaled, could become a multibillion-dollar market—and a critical piece of the security architecture of the future…(More)”. See also the Kluz Prize for PeaceTech (Applications Open).

The war over the peace business


Article by Tekendra Parmar: “At the second annual AI+ Expo in Washington, DC, in early June, war is the word of the day.

As a mix of Beltway bureaucrats, military personnel, and Washington’s consultant class peruse the expansive Walter E. Washington Convention Center, a Palantir booth showcases its latest in data-collection suites for “warfighters.” Lockheed Martin touts the many ways it is implementing AI throughout its weaponry systems. On the soundstage, the defense tech darling Mach Industries is selling its newest uncrewed aerial vehicles. “We’re living in a world with great-power competition,” the presenter says. “We can’t rule out the possibility of war — but the best way to prevent a war is deterrence,” he says, flanked by videos of drones flying through what looked like the rugged mountains and valleys of Kandahar.

Hosted by the Special Competitive Studies Project, a think tank led by former Google CEO Eric Schmidt, the expo says it seeks to bridge the gap between Silicon Valley entrepreneurs and Washington policymakers to “strengthen” America and its allies’ “competitiveness in critical technologies.”

One floor below, a startup called Anadyr Horizon is making a very different sales pitch, for software that seeks to prevent war rather than fight it: “Peace tech,” as the company’s cofounder Arvid Bell calls it. Dressed in white khakis and a black pinstripe suit jacket with a dove and olive branch pinned to his lapel (a gift from his husband), the former Harvard political scientist begins by noting that Russia’s all-out invasion of Ukraine had come as a surprise to many political scientists. But his AI software, he says, could predict it.

Long the domain of fantasy and science fiction, the idea of forecasting conflict has now become a serious pursuit. In Isaac Asimov’s 1950s “Foundation” series, the main character develops an algorithm that allows him to predict the decline of the Galactic Empire, angering its rulers and forcing him into exile. During the coronavirus pandemic, the US State Department experimented with AI fed with Twitter data to predict “COVID cases” and “violent events.” In its AI audit two years ago, the State Department revealed that it had started training AI on “open-source political, social, and economic datasets” to predict “mass civilian killings.” The UN is also said to have experimented with AI to model the war in Gaza…(More)”. See also the Kluz Prize for PeaceTech (Applications Open).
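
To make the idea of AI-based conflict forecasting slightly more concrete, here is a minimal, purely hypothetical sketch of the kind of model such systems are typically built around: a classifier trained on country-month indicators that outputs a probability of violent conflict. The features and data are invented for illustration; this is not the State Department’s, Anadyr Horizon’s, or the UN’s actual system.

```python
# Hypothetical sketch of a conflict early-warning model: a classifier over
# country-month indicators that estimates the probability of violent conflict
# in the next period. Synthetic data, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented features, e.g. protest counts, food-price inflation,
# troop movements, volume of hostile rhetoric on social media.
n_samples, n_features = 5000, 4
X = rng.normal(size=(n_samples, n_features))
# Synthetic ground truth: conflict is more likely when indicators spike together.
y = (X.sum(axis=1) + rng.normal(scale=1.5, size=n_samples) > 2.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# A forecast for one hypothetical country-month:
risk = model.predict_proba(X_test[:1])[0, 1]
print(f"Estimated probability of violent conflict next month: {risk:.1%}")
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```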

The Reenchanted World: On finding mystery in the digital age


Essay by Karl Ove Knausgaard: “…When Karl Marx and Friedrich Engels wrote about alienation in the 1840s—that’s nearly two hundred years ago—they were describing workers’ relationship with their work, but the consequences of alienation spread into their analysis to include our relationship to nature and to existence as such. One term they used was “loss of reality.” Society at that time was incomparably more brutal, the machines incomparably coarser, but problems such as economic inequality and environmental destruction have continued into our own time. If anything, alienation as Marx and Engels defined it has only increased.

Or has it? The statement “people are more alienated now than ever before in history” sounds false, like applying an old concept to a new condition. That is not really what we are, is it? If there is something that characterizes our time, isn’t it the exact opposite, that nothing feels alien?

Alienation involves a distance from the world, a lack of connection between it and us. What technology does is compensate for the loss of reality with a substitute. Technology calibrates all differences, fills in every gap and crack with images and voices, bringing everything close to us in order to restore the connection between ourselves and the world. Even the past, which just a few generations ago was lost forever, can be retrieved and brought back…(More)”.

2025 State of the Digital Decade


Report by the European Commission: “…assessed the EU’s progress along the four target areas for the EU’s digital transformation by 2030, highlighting achievements and gaps in the areas of digital infrastructure, digitalisation of businesses, digital skills, and digitalisation of public services.

The report shows that although there have been some advances, the rollout of connectivity infrastructure, such as fibre and 5G stand-alone networks, is still lagging. More companies are adopting Artificial Intelligence (AI), cloud and big data, but adoption needs to accelerate. Just over half of Europeans (55.6%) have a basic level of digital skills, while the availability of ICT specialists with advanced skills remains low, with a stark gender divide, hindering progress in key sectors such as cybersecurity and AI. In 2024, the EU made steady progress in digitalising key public services, but a substantial portion of governmental digital infrastructure continues to depend on service providers outside the EU.

The data shows persisting challenges, such as fragmented markets, overly complex regulations, security and strategic dependence. Further public and private investment and easier access to venture capital for EU companies would accelerate innovation and scale up…(More)”.

The Hypocrisy Trap: How Changing What We Criticize Can Improve Our Lives


Book by Michael Hallsworth: “In our increasingly distrustful and polarized nations, accusations of hypocrisy are everywhere. But the strange truth is that our attempts to stamp out hypocrisy often backfire, creating what Michael Hallsworth calls The Hypocrisy Trap. In this groundbreaking book, he shows how our relentless drive to expose inconsistency between words and deeds can actually breed more hypocrisy or, worse, cynicism that corrodes democracy itself.

Through engaging stories and original research, Hallsworth shows that not all hypocrisy is equal. While some forms genuinely destroy trust and create harm, others reflect the inevitable compromises of human nature and complex societies. The Hypocrisy Trap offers practical solutions: ways to increase our own consistency, navigate accusations wisely, and change how we judge others’ actions. Hallsworth shows vividly that we can improve our politics, businesses, and personal relationships if we rethink hypocrisy—soon…(More)”.

The Loyalty Trap


Book by Jaime Lee Kucinskas: “…explores how civil servants navigated competing pressures and duties amid the chaos of the Trump administration, drawing on in-depth interviews with senior officials in the most contested agencies over the course of a tumultuous term. Jaime Lee Kucinskas argues that the professional culture and ethical obligations of the civil service stabilize the state in normal times but insufficiently prepare bureaucrats to cope with a president like Trump. Instead, federal employees became ensnared in intractable ethical traps, caught between their commitment to nonpartisan public service and the expectation of compliance with political directives. Kucinskas shares their quandaries, recounting attempts to preserve the integrity of government agencies, covert resistance, and a few bold acts of moral courage in the face of organizational decline and politicized leadership. A nuanced sociological account of the lessons of the Trump administration for democratic governance, The Loyalty Trap offers a timely and bracing portrait of the fragility of the American state…(More)”.

Manipulation: What It Is, Why It’s Bad, What to Do About It


Book by Cass Sunstein: “New technologies are offering companies, politicians, and others unprecedented opportunity to manipulate us. Sometimes we are given the illusion of power – of freedom – through choice, yet the game is rigged, pushing us in specific directions that lead to less wealth, worse health, and weaker democracy. In Manipulation, nudge theory pioneer and New York Times bestselling author Cass Sunstein offers a new definition of manipulation for the digital age, explains why it is wrong, and shows what we can do about it. He reveals how manipulation compromises freedom and personal agency, while threatening to reduce our well-being; he explains the difference between manipulation and unobjectionable forms of influence, including ‘nudges’; and he lifts the lid on online manipulation and manipulation by artificial intelligence, algorithms, and generative AI, as well as threats posed by deepfakes, social media, and ‘dark patterns,’ which can trick people into giving up time and money. Drawing on decades of groundbreaking research in behavioral science, this landmark book outlines steps we can take to counteract manipulation in our daily lives and offers guidance to protect consumers, investors, and workers…(More)”.

European project to make web search more open and ethical


Press release: “The OpenWebSearch.eu consortium, which includes CERN, has released a pilot of the first federated, pan-European Open Web Index, paving the way for a new generation of unbiased and ethical search engines.

On 6 June, the OpenWebSearch.eu consortium released a pilot of a new infrastructure that aims to make European web search fairer, more transparent and commercially unbiased. With strong participation by CERN, the European Open Web Index (OWI) is now open for use by academic, commercial and independent teams under a general research licence, with commercial options in development on a case-by-case basis.

The OpenWebSearch.eu initiative was launched in 2022, with a consortium made up of 14 leading research institutions from across Europe, including CERN…

The OWI offers a clear alternative based on European values. The project’s cross-disciplinary nature, with continuous dialogue between technical teams and legal, ethical and social experts, ensures that fairness and privacy are built into the OWI from the start. “Over thirty years since the World Wide Web was created at CERN and released to the public, our commitment to openness continues,” says Noor Afshan Fathima, IT research fellow at CERN. “Search is the next logical step in democratising digital access, especially as we enter the AI era.” The OWI also supports AI applications, allowing web search data to be used for training large language models (LLMs), generating embeddings and powering chatbots…(More)”.
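
As an illustration of what using web search data for embeddings and retrieval can look like in practice, here is a generic sketch with invented documents; it does not use the Open Web Index’s actual API or data format, and TF-IDF stands in for whatever embedding model a real system would use.

```python
# Illustrative sketch only: embed a handful of "crawled" documents and
# retrieve the closest match for a query. Documents are invented; TF-IDF is
# a stand-in for a real embedding model, not the Open Web Index's API.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "CERN releases open data from the Large Hadron Collider",
    "European consortium builds a federated open web index",
    "New study on urban planning and sustainable cities",
]

vectorizer = TfidfVectorizer()
doc_embeddings = vectorizer.fit_transform(documents)   # one vector per document

query = "open web search index in Europe"
query_embedding = vectorizer.transform([query])

scores = cosine_similarity(query_embedding, doc_embeddings)[0]
best = scores.argmax()
print(f"Best match (score {scores[best]:.2f}): {documents[best]}")
```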

Usability for the World: Building Better Cities and Communities


Book edited by Elizabeth Rosenzweig, and Amanda Davis: “Want to build cities that truly work for everyone? Usability for the World: Sustainable Cities and Communities reveals how human-centered design is key to thriving, equitable urban spaces. This isn’t just another urban planning book; it’s a practical guide to transforming cities, offering concrete strategies and real-world examples you can use today.

What if our cities could be both efficient and human-friendly? This book tackles the core challenge of modern urban development: balancing functionality with the well-being of residents. It explores the crucial connection between usability and sustainability, demonstrating how design principles, from Universal to life-centered, create truly livable cities.

Interested in sustainable urban development? Usability for the World offers a global perspective, showcasing diverse approaches to creating equitable and resilient cities. Through compelling case studies, discover how user-centered design addresses pressing urban challenges. See how these principles connect directly to achieving the UN Sustainable Development Goals, specifically SDG 11: Sustainable Cities and Communities…(More)”.

Scientific Publishing: Enough is Enough


Blog by Seemay Chou: “In Abundance, Ezra Klein and Derek Thompson make the case that the biggest barriers to progress today are institutional. They’re not because of physical limitations or intellectual scarcity. They’re the product of legacy systems — systems that were built with one logic in mind, but now operate under another. And until we go back and address them at the root, we won’t get the future we say we want.

I’m a scientist. Over the past five years, I’ve experimented with science outside traditional institutes. From this vantage point, one truth has become inescapable. The journal publishing system — the core of how science is currently shared, evaluated, and rewarded — is fundamentally broken. And I believe it’s one of the legacy systems that prevents science from meeting its true potential for society.

It’s an unpopular moment to critique the scientific enterprise given all the volatility around its funding. But we do have a public trust problem. The best way to increase trust and protect science’s future is for scientists to have the hard conversations about what needs improvement. And to do this transparently. In all my discussions with scientists across every sector, exactly zero think the journal system works well. Yet we all feel trapped in a system that is, by definition, us.

I no longer believe that incremental fixes are enough. Science publishing must be built anew. I help oversee billions of dollars in funding across several science and technology organizations. We are expanding our requirement that all scientific work we fund will not go towards traditional journal publications. Instead, research we support should be released and reviewed more openly, comprehensively, and frequently than the status quo.

This policy is already in effect at Arcadia Science and Astera Institute, and we’re actively funding efforts to build journal alternatives through both Astera and The Navigation Fund. We hope others cross this line with us, and below I explain why every scientist and science funder should strongly consider it…(More)”.