Culture and Democracy, the evidence


Report by the European Commission: “This report analyses the concrete link between democracy and culture. It maps out how citizens who participate in cultural activities are much more likely to engage in civic and democratic life. Inequalities persist throughout the EU when it comes to citizens’ participation in cultural activities, with a clear knock-on impact on democratic participation. And this is just another reason why it is crucial that cultural activities are inclusive and affordable. Even more so as we see that investing in cultural participation can also support a range of other societal objectives – for example, in fields such as health, education and social inclusion. This report, and addressing the issues identified within it, is part of the work the European Commission is doing to strengthen democracy, to promote an inclusive and engaged society and to support the sustainability of the cultural sector. In the Work Plan for Culture 2023-2026, we put a specific focus on the link between culture and democracy, and we want to bring policy makers and stakeholders together to jointly work towards the concept of cultural citizenship in the EU. This report is part of the process…(More)”.

Artificial Intelligence in Science: Challenges, Opportunities and the Future of Research


OECD Report: “The rapid advances of artificial intelligence (AI) in recent years have led to numerous creative applications in science. Accelerating the productivity of science could be the most economically and socially valuable of all the uses of AI. Utilising AI to accelerate scientific productivity will support the ability of OECD countries to grow, innovate and meet global challenges, from climate change to new contagions. This publication is aimed at a broad readership, including policy makers, the public, and stakeholders in all areas of science. It is written in non-technical language and gathers the perspectives of prominent researchers and practitioners. The book examines various topics, including the current, emerging, and potential future uses of AI in science, where progress is needed to better serve scientific advancements, and changes in scientific productivity. Additionally, it explores measures to expedite the integration of AI into research in developing countries. A distinctive contribution is the book’s examination of policies for AI in science. Policy makers and actors across research systems can do much to deepen AI’s use in science, magnifying its positive effects, while adapting to the fast-changing implications of AI for research governance…(More)”.

There Is Always An Alternative


Speech by Cory Doctorow: “…The human condition is…not good. We’re in the polycrisis, a widening gyre of climate emergency, inequality, infrastructure neglect, rising authoritarianism and zoonotic plagues.

But that’s not the bad part. Stuff breaks. The Second Law of Thermodynamics is not up for debate. Things fall apart. Assuming nothing will break doesn’t make you an optimist — it makes you a danger to yourself and others. “Nothing will go wrong” is how we get “let’s not put any lifeboats on the Titanic.”

Let me say, “to hell with optimism and pessimism.” Optimism and pessimism are just fatalism in respectable suits.

Optimism is the belief that things will get better, no matter what we do.

Pessimism is the belief that things will get worse, no matter what we do.

Both deny human agency, that we can intervene to change things.

The belief that nothing will change — that nothing can change — is the wrecker’s most powerful weapon. After all, if you can convince people that nothing can be done, they won’t try to do anything.

Thus: Margaret Thatcher’s dictum, “There is no alternative,” a polite way of saying “Resistance is futile,” or, “Abandon hope all ye who enter here.”

This is inevitabilism, the belief that nothing can change. It’s the opposite of science fiction. As a science fiction writer, my job is to imagine alternatives. “There is no alternative” is a demand pretending to be an observation: “stop trying to think of an alternative.”

At its best, science fiction demands that we look beyond what a gadget does and interrogate who it does it for and who it does it to. That’s an important exercise, maybe the important exercise.

It’s the method by which we seize the means of computation for the betterment of the human race, not the immortal, rapacious colony organisms we call “limited liability companies,” to whom we represent inconvenient gut-flora, and which are rendering the only planet in the universe capable of sustaining human life unfit for human habitation.

The Luddites practiced science fiction. Perhaps you’ve heard that the Luddites were technophobic thugs who smashed steam-looms because they feared progress. That’s an ahistorical libel. The Luddites weren’t technophobes, they were highly skilled tech workers. Textile guilds required seven years of apprenticeship — Luddites got the equivalent of a master’s from MIT.

Luddites didn’t hate looms. They smashed looms because their bosses wanted to fire skilled workers, ship kidnapped Napoleonic War orphans north from London, and lock them inside factories for a decade of indenture, to be starved, beaten, maimed and killed.

Designing industrial machinery that’s “so easy a child can use it,” isn’t necessarily a prelude to child-slavery, but it’s not not a prelude to child-slavery, either.

The Luddites weren’t mad about what the machines did — they were mad at who the machines did it for and whom they did it to. The child-kidnapping millionaires of the Industrial Revolution said, “There is no alternative,” and the Luddites roared, “The hell you say there isn’t!”

Today’s tech millionaires are no different. Mark Zuckerberg used to insist that there was no way to talk to your friends without being comprehensively spied upon, so every intimate and compromising fact of your life could be gathered, processed, and mobilised against you.

He said this was inevitable, as though some bearded prophet staggered down off a mountain, bearing two stone tablets, intoning, “Zuck, thou shalt stop rotating thine logfiles, and lo, thou shalt mine them for actionable market intelligence.”

When we demanded the right to talk to our friends without Zuckerberg spying on us, he looked at us like we’d just asked for water that wasn’t wet.

Today, Zuck has a new inevitabilist narrative: that we will spend the rest of our days as legless, sexless, heavily surveilled, low-polygon cartoon characters in “the metaverse,” a virtual world he lifted from a 20-year-old dystopian science-fiction novel…(More)”.

Using data to address equity challenges in local government


Report by the Mastercard Center for Inclusive Growth (CFIG): “…This report describes the Data for Equity cohort learning journey, case studies of how participating cities engaged with and learned from the program, and key takeaways about the potential for data to inform effective and innovative equitable development efforts. Alongside data tools, participants explored the value of qualitative data, the critical link between racial equity and economic inclusion, and how federal funds can advance ongoing equity initiatives. 

Cohort members gained and shared insights throughout their learning journey, including:

  • Resources that provided guidance on how to target funding were helpful in ensuring the viability of cities’ equity and economic development initiatives.
  • Tools and resources that helped practitioners move from diagnosing challenges to identifying solutions were especially valuable.
  • Peer-to-peer learning is an essential resource for leaders and staff working in equity roles, which are often structured differently than other city offices.
  • More data tools that explicitly measure racial equity indicators are needed…(More)”.

Fighting poverty with synthetic data


Article by Jack Gisby, Anna Kiknadze, Thomas Mitterling, and Isabell Roitner-Fransecky: “If you have ever used a smartwatch or other wearable tech to track your steps, heart rate, or sleep, you are part of the “quantified self” movement. You are voluntarily submitting millions of intimate data points for collection and analysis. The Economist highlighted the benefits of good quality personal health and wellness data—increased physical activity, more efficient healthcare, and constant monitoring of chronic conditions. However, not everyone is enthusiastic about this trend. Many fear corporations will use the data to discriminate against the poor and vulnerable. For example, insurance firms could exclude patients based on preconditions obtained from personal data sharing.

Can we strike a balance between protecting the privacy of individuals and gathering valuable information? This blog explores applying a synthetic populations approach in New York City, a city with an established reputation for using big data approaches to support urban management, including for welfare provisions and targeted policy interventions.

To better understand poverty rates at the census tract level, World Data Lab, with the support of the Sloan Foundation, generated a synthetic population based on the borough of Brooklyn. Synthetic populations rely on a combination of microdata and summary statistics:

  • Microdata consists of personal information at the individual level. In the U.S., such data is available at the Public Use Microdata Area (PUMA) level. PUMAs are geographic areas that partition a state, each containing no fewer than 100,000 people. However, due to privacy concerns, microdata is unavailable at the more granular census tract level. Microdata consists of both household and individual-level information, including last year’s household income, the household size, the number of rooms, and the age, sex, and educational attainment of each individual living in the household.
  • Summary statistics are based on populations rather than individuals and are available at the census tract level, given that there are fewer privacy concerns. Census tracts are small statistical subdivisions of a county, averaging about 4,000 inhabitants. In New York City, a census tract roughly equals a building block. Similar to microdata, summary statistics are available for individuals and households. On the census tract level, we know the total population, the corresponding demographic breakdown, the number of households within different income brackets, the number of households by number of rooms, and other similar variables…(More)”.
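The combination described above — PUMA-level microdata reweighted to match census-tract summary statistics — is commonly implemented with iterative proportional fitting (IPF). The sketch below illustrates the idea only; the records, categories, and marginal totals are invented for illustration and are not World Data Lab’s actual data or method.

```python
# Minimal sketch of synthetic-population reweighting via iterative
# proportional fitting (IPF). All records and marginals are hypothetical.
from collections import defaultdict

# PUMA-level microdata: household records with initial (uniform) weights
microdata = [
    {"income": "low",  "size": "1-2", "weight": 1.0},
    {"income": "low",  "size": "3+",  "weight": 1.0},
    {"income": "high", "size": "1-2", "weight": 1.0},
    {"income": "high", "size": "3+",  "weight": 1.0},
]

# Census-tract summary statistics: marginal totals for each attribute
tract_marginals = {
    "income": {"low": 120, "high": 80},
    "size":   {"1-2": 90,  "3+": 110},
}

def ipf(records, marginals, iterations=50):
    """Rescale record weights until weighted totals match each marginal."""
    for _ in range(iterations):
        for attr, targets in marginals.items():
            # Current weighted total per category of this attribute
            totals = defaultdict(float)
            for r in records:
                totals[r[attr]] += r["weight"]
            # Scale every record so category totals hit the tract targets
            for r in records:
                r["weight"] *= targets[r[attr]] / totals[r[attr]]
    return records

synthetic = ipf(microdata, tract_marginals)
print(sum(r["weight"] for r in synthetic))  # total tract households, here 200
```

The reweighted records can then be sampled (with probability proportional to weight) to draw a synthetic population for the tract that matches its published marginals without exposing any real individual’s microdata.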

Supporting Safer Digital Spaces


Report by Suzie Dunn, Tracy Vaillancourt and Heather Brittain: “Various forms of digital technology are being used to inflict significant harms online. This is a pervasive issue in online interactions, in particular with regard to technology-facilitated gender-based violence (TFGBV) and technology-facilitated violence (TFV) against LGBTQ+ people. This modern form of violence perpetuates gender inequality and discrimination against LGBTQ+ people and has significant impacts on its targets.

As part of a multi-year research project Supporting a Safer Internet (in partnership with the International Development Research Centre) exploring the prevalence and impacts of TFGBV experienced by women, transgender, gender non-conforming and gender-diverse people, as well as TFV against LGBTQ+ individuals, an international survey was conducted by Ipsos on behalf of the Centre for International Governance Innovation (CIGI). The survey examined the influence of gender and sexual orientation on people’s experiences with online harms, with a focus on countries in the Global South. Data was collected from 18,149 people of all genders in 18 countries.

The special report provides background information on TFGBV and TFV against LGBTQ+ people by summarizing some of the existing research on the topic. It then presents the quantitative data collected on people’s experiences with, and opinions on, online harms. A list of recommendations is provided for governments, technology companies, academics, researchers and civil society organizations on how they can contribute to addressing and ending TFV…(More)”

(Read the Supporting Safer Digital Spaces: Highlights here; read the French translation of the Highlights here.)

Rewiring The Web: The future of personal data


Paper by Jon Nash and Charlie Smith: “In this paper, we argue that the widespread use of personal information online represents a fundamental flaw in our digital infrastructure that enables staggeringly high levels of fraud, undermines our right to privacy, and limits competition.

To realise a web fit for the twenty-first century, we need to fundamentally rethink the ways in which we interact with organisations online.

If we are to preserve the founding values of an open, interoperable web in the face of such profound change, we must update the institutions, regulatory regimes, and technologies that make up this network of networks.

Many of the problems we face stem from the vast amounts of personal information that currently flow through the internet—and fixing this fundamental flaw would have a profound effect on the quality of our lives and the workings of the web…(More)”

Adopting AI Responsibly: Guidelines for Procurement of AI Solutions by the Private Sector


WEF Report: “In today’s rapidly evolving technological landscape, responsible and ethical adoption of artificial intelligence (AI) is paramount for commercial enterprises. The exponential growth of the global AI market highlights the need for establishing standards and frameworks to ensure responsible AI practices and procurement. To address this crucial gap, the World Economic Forum, in collaboration with GEP, presents a comprehensive guide for commercial organizations…(More)”.

Privacy-enhancing technologies (PETs)


Report by the Information Commissioner’s Office (UK): “This guidance discusses privacy-enhancing technologies (PETs) in detail. Read it if you have questions not answered in the Guide, or if you need a deeper understanding to help you apply PETs in practice.

The first part of the guidance is aimed at DPOs (data protection officers) and those with specific data protection responsibilities in larger organisations. It focuses on how PETs can help you achieve compliance with data protection law.

The second part is intended for a more technical audience, and for DPOs who want to understand more detail about the types of PETs that are currently available. It gives a brief introduction to eight types of PETs and explains their risks and benefits…(More)”.

Engaging citizens in innovation policy. Why, when and how?


OECD Report: “Innovation policies need to be socially embedded for them to effectively contribute to addressing major societal challenges. Engaging citizens in innovation policymaking can help define long-term policy priorities, enhance the quality and legitimacy of policy decisions, and increase the visibility of innovation in society. However, engaging all groups in society and effectively integrating citizens’ inputs in policy processes is challenging. This paper discusses why, when and how to engage citizens in innovation policy making. It also addresses practical considerations for organising these processes, such as reaching out to diverse publics and selecting the optimal mix of methods and tools…(More)”.