Evaluation in the Post-Truth World


Book edited by Mita Marra, Karol Olejniczak, and Arne Paulson: "…explores the relationship between the nature of evaluative knowledge, the increasing demand in decision-making for evaluation and other forms of research evidence, and the post-truth phenomena of antiscience sentiments combined with illiberal tendencies of the present day. Rather than offer a checklist on how to deal with post-truth, the experts found herein wish to raise awareness and reflection throughout policy circles on the factors that influence our assessment and policy-related work in such a challenging environment. Journeying alongside the editors and contributors, readers benefit from three guiding questions that help identify not only specific challenges but also tools to deal with them: How are policy problems conceptualized in the current political climate? What is the relationship between expertise and decision-making in today’s political circumstances? How complex has evaluation become as a social practice? Evaluation in the Post-Truth World will benefit evaluation practitioners at the program and project levels, as well as policy analysts and scholars interested in applications of evaluation in the public policy domain…(More)”.

Mark the good stuff: Content provenance and the fight against disinformation


BBC Blog: “BBC News’s Verify team is a dedicated group of 60 journalists who fact-check, verify video, counter disinformation, analyse data and – crucially – explain complex stories in the pursuit of truth. On Monday, March 4th, Verify published their first article using a new open media provenance technology called C2PA. The C2PA standard is a technology that records digitally signed information about the provenance of imagery, video and audio – information (or signals) that shows where a piece of media has come from and how it’s been edited. Like an audit trail or a history, these signals are called ‘content credentials’.

Content credentials can be used to help audiences distinguish between authentic, trustworthy media and content that has been faked. The digital signature attached to the provenance information ensures that when the media is “validated”, the person or computer reading the image can be sure that it came from the BBC (or any other source with its own X.509 certificate).

This is important for two reasons. First, it gives publishers like the BBC the ability to share transparently with our audiences what we do every day to deliver great journalism. Second, it allows us to mark content that is shared across third-party platforms (like Facebook), so audiences can trust that when they see a piece of BBC content it does in fact come from the BBC.

For the past three years, BBC R&D has been an active partner in the development of the C2PA standard. It has been developed in collaboration with major media and technology partners, including Microsoft, the New York Times and Adobe. Membership in C2PA is growing to include organisations from all over the world, from established hardware manufacturers like Canon to technology leaders like OpenAI, fellow media organisations like NHK, and even the Publicis Group representing the advertising industry. Google has now joined the C2PA steering committee and social media companies are leaning in too: Meta has recently announced that it is actively assessing how to implement C2PA across its platforms…(More)”.
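To make the idea of a digitally signed "content credential" concrete, here is a minimal, purely illustrative sketch in Python. It does not use the real C2PA SDK or manifest format; it only shows the underlying pattern the BBC post describes: provenance metadata is signed, and anyone holding the publisher's X.509 certificate can check that the metadata has not been altered. The function name and the RSA/SHA-256 choice are assumptions made for the example.

```python
# Conceptual sketch only: real C2PA content credentials are embedded in the
# media file and verified with the official C2PA SDKs, not with ad hoc code.
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding


def verify_content_credential(manifest_bytes: bytes,
                              signature: bytes,
                              signer_cert_pem: bytes) -> bool:
    """Return True if the provenance metadata was signed by the holder of the
    certificate (e.g. the BBC) and has not been modified since signing."""
    cert = x509.load_pem_x509_certificate(signer_cert_pem)
    public_key = cert.public_key()
    try:
        # Assumes an RSA signature over SHA-256; C2PA itself supports
        # several signature algorithms.
        public_key.verify(signature, manifest_bytes,
                          padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```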

Societal challenges and big qualitative data require a new era of methodological pragmatism


Blog by Alex Gillespie, Vlad Glăveanu, and Constance de Saint-Laurent: “The ‘classic’ methods we use today in psychology and the social sciences might seem relatively fixed, but they are the product of collective responses to concerns within a historical context. The 20th century methods of questionnaires and interviews made sense in a world where researchers did not have access to what people did or said, and even if they did, could not analyse it at scale. Questionnaires and interviews were suited to 20th century concerns (shaped by colonialism, capitalism, and the ideological battles of the Cold War) for understanding, classifying, and mapping opinions and beliefs.

However, what social scientists are faced with today is different due to the culmination of two historical trends. The first has to do with the nature of the problems we face. Inequalities, the climate emergency and current wars are compounded by a general rise in nationalism, populism, and especially post-truth discourses and ideologies. Nationalism and populism are not new, but the scale and sophistication of misinformation threatens to undermine collective responses to collective problems.

The second trend refers to technology and its accelerated development, especially the unprecedented accumulation of naturally occurring data (digital footprints) combined with increasingly powerful methods for data analysis (traditional and generative AI). It is often said that we live in the age of ‘big data’, but what is less often said is that this is in fact the age of ‘big qualitative data’. The biggest datasets are unstructured qualitative data (each minute adds 2.5 million Google text searches, 500 thousand photos on Snapchat, 500 hours of YouTube videos) and the most significant AI advances leverage this qualitative data and make it tractable for social research.

These two trends have been fuelling the rise in mixed methods research…(More)”. (See also their new book ‘Pragmatism and Methodology’, available open access.)
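As a rough illustration of what making "big qualitative data" tractable can look like in practice (our sketch, not an example from the blog or the book), the snippet below uses standard Python tooling to group a corpus of short texts into themes. The tiny document list and the cluster count are placeholders; in real work the corpus would run to millions of posts, comments or transcripts.

```python
# Illustrative sketch: unsupervised grouping of short texts, the kind of
# naturally occurring qualitative data discussed above. All names and
# parameters are arbitrary choices for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "the flooding has destroyed our street again",
    "another heatwave warning issued for the weekend",
    "council announces new housing policy consultation",
    "volunteers hand out sandbags ahead of the storm",
]

# Turn free text into a numerical matrix of word weights.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(documents)

# Group the texts into two provisional themes.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label, doc in zip(kmeans.labels_, documents):
    print(label, doc)
```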

Forced to Change: Tech Giants Bow to Global Onslaught of Rules


Article by Adam Satariano and David McCabe: “By Thursday, Google will have changed how it displays certain search results. Microsoft will no longer force Windows customers to use its Bing internet search tool. And Apple will give iPhone and iPad users access to rival app stores and payment systems for the first time.

The tech giants have been preparing ahead of a Wednesday deadline to comply with a new European Union law intended to increase competition in the digital economy. The law, called the Digital Markets Act, requires the biggest tech companies to overhaul how some of their products work so smaller rivals can gain more access to their users.

Those changes are some of the most visible shifts that Microsoft, Apple, Google, Meta and others are making in response to a wave of new regulations and laws around the world. In the United States, some of the tech behemoths have said they will abandon practices that are the subject of federal antitrust investigations. Apple, for one, is making it easier for Android users to interact with its iMessage product, a topic that the Justice Department has been investigating.

“This is a turning point,” said Margrethe Vestager, the European Commission executive vice president in Brussels, who spent much of the past decade battling with tech giants. “Self-regulation is over.”

For decades, Apple, Amazon, Google, Microsoft and Meta barreled forward with few rules and limits. As their power, riches and reach grew, a groundswell of regulatory activity, lawmaking and legal cases sprang up against them in Europe, the United States, China, India, Canada, South Korea and Australia. Now that global tipping point for reining in the largest tech companies has finally been reached.

The companies have been forced to alter the everyday technology they offer, including devices and features of their social media services, which have been especially noticeable to users in Europe. The firms are also making less visible but consequential shifts to their business models, deal-making and data-sharing practices.

The degree of change is evident at Apple. While the Silicon Valley company once offered its App Store as a unified marketplace around the world, it now has different rules for App Store developers in South Korea, the European Union and the United States because of new laws and court rulings. The company dropped the proprietary design of an iPhone charger because of another E.U. law, meaning future iPhones will have a charger that works with non-Apple devices…(More)”.

Why Do Universities Ignore Good Ideas?


Article by Jeffrey Funk: “Here is a recent assessment of 2023 Nobel Prize winner Katalin Karikó:

“Eight current and former colleagues of Karikó told The Daily Pennsylvanian that — over the course of three decades — the university repeatedly shunned Karikó and her research, despite its groundbreaking potential.”

Another article claims that this occurred because she could not get the financial support to continue her research.

Why couldn’t she get financial support? “You’re more likely to get grants if you’re a tenured faculty member, but you’re more likely to get promoted to tenure if you get grants,” said Eric Feigl-Ding, an epidemiologist at the New England Complex Systems Institute and a former faculty member and researcher at Harvard Medical School. “There is a vicious cycle,” he says.

Interesting. So, the idea doesn’t matter. What matters to funding agencies is that you have previously obtained funding or are a tenured professor. Really? Are funding agencies this narrow-minded?

Mr. Feigl-Ding also said, “Universities also tend to look at how much a researcher publishes, or how widely covered by the media their work is, as opposed to how innovative the research is.” But why couldn’t Karikó get published?

Science magazine tells the story of her main paper with Drew Weissman in 2005. After being rejected by Nature within 24 hours: “It was similarly rejected by Science and by Cell, and the word incremental kept cropping up in the editorial staff comments.”

Incremental? There are more than two million papers published each year, and this research, for which Karikó and Weissman won a Nobel Prize, was deemed incremental? If it had been rejected for its methods or because the findings seemed impossible to believe, I think most people could understand the rejection. But incremental?

Obviously, most of the two million papers published each year really are incremental. Yet one of the few papers that we can all agree was not incremental got rejected because it was deemed incremental.

Furthermore, this is happening in a system of science in which even Nature admits “disruptive science has declined,” few science-based technologies are being successfully commercialized, and Nature admits that it doesn’t understand why…(More)”.

Why Everyone Hates The Electronic Medical Record


Article by Dharushana Muthulingam: “Patient R was in a hurry. I signed into my computer—or tried to. Recently, IT had us update to a new 14-digit password. Once in, I signed (different password) into the electronic medical record. I had already ordered routine lab tests, but R had new info. I pulled up a menu to add an HIV viral load test to capture early infection, which the standard antibody test might miss. R went to the lab to get his blood drawn.

My last order did not print to the onsite laboratory. An observant nurse had seen the order but no tube. The patient had left without the viral load being drawn. I called the patient: could he come back?

Healthcare workers do not like the electronic health record (EHR), where they spend more time than with patients. Doctors hate it, as do nurse practitioners, nurses, pharmacists, and physical therapists. The National Academies of Sciences, Engineering, and Medicine reports that the EHR is a major contributor to clinician burnout. Patient experience is mixed, though the public is still concerned about privacy, errors, interoperability and access to their own records.

The EHR promised a lot: better accuracy, streamlined care, and patient-accessible records. In February 2009, the Obama administration signed the HITECH Act into law on this promise, investing $36 billion to scale up health information technology. No more deciphering bad handwriting for critical info. Efficiency and cost-savings could get more people into care. We imagined cancer and rare-disease registries to research treatments. We wanted portable records accessible in an emergency. We wanted to rapidly identify the spread of highly contagious respiratory illnesses and other public health crises.

Why had the lofty ambition of health information technology, backed by enormous resources, failed so spectacularly?…(More)”.
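One concrete expression of the "portable records" ambition described above is an interoperability standard such as HL7 FHIR, which exposes records over a REST API. The article itself does not discuss FHIR, so the sketch below is only an illustration of the general idea; the server URL and patient ID are hypothetical placeholders.

```python
# Illustrative sketch: reading one patient's laboratory results from a FHIR
# server. The base URL is a placeholder, not a real service.
import requests

FHIR_BASE = "https://example-fhir-server.org/fhir"  # hypothetical endpoint


def fetch_lab_results(patient_id: str) -> list:
    """Return laboratory Observation resources for a single patient."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "category": "laboratory"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]


for obs in fetch_lab_results("12345"):
    code = obs.get("code", {}).get("text", "unknown test")
    value = obs.get("valueQuantity", {})
    print(code, value.get("value"), value.get("unit"))
```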

Public sector capacity matters, but what is it?


Blog by Rainer Kattel, Mariana Mazzucato, Rosie Collington, Fernando Fernandez-Monge, Iacopo Gronchi, and Ruth Puttick: “As governments turn increasingly to public sector innovations, challenges, missions and transformative policy initiatives, the need to understand and develop public sector capacities is ever more important. In IIPP’s project with Bloomberg Philanthropies to develop a Public Sector Capabilities Index, we propose to define public sector capacities through three inter-connected layers: state capacities, organisational capabilities, and the dynamic capabilities of public organisations.

The idea that governments should be able to design and deliver effective policies has existed for as long as we have had governments. A quick search in Google’s Ngram viewer shows that the use of ‘state capacity’ in published books has grown exponentially since the late 1980s. It is, however, not a coincidence that this focus on state and, more broadly, public sector capacities emerges in the shadow of new public management and neoliberal governance and policy reforms. Rather than understanding governance as a collaborative effort between all sectors, these reforms gave normative preference to business practices. Increasing focus on public sector capacity as a concept should thus be understood as an attempt to rebalance our understanding of how change happens in societies — through cross-sectoral co-creation — and as an effort to build the muscles in public organisations to work together to tackle socio-economic challenges.

We propose to define public sector capacities through three inter-connected layers: state capacities, organizational routines, and dynamic capabilities of the public organisations…(More)”.
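For readers who want to reproduce the Ngram observation mentioned above, the Viewer's data can also be queried programmatically. The JSON endpoint used in this sketch is unofficial and undocumented, so the URL and its parameters are assumptions that may change without notice; the supported interface remains the web viewer itself.

```python
# Unofficial sketch: query the Ngram Viewer's JSON endpoint for the relative
# frequency of "state capacity" in English-language books. The endpoint and
# parameter names are assumptions and may break without warning.
import requests

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": "state capacity",
        "year_start": 1950,
        "year_end": 2019,
        "corpus": "en-2019",
        "smoothing": 3,
    },
    timeout=30,
)
resp.raise_for_status()
series = resp.json()[0]["timeseries"]  # one relative frequency per year
for year, freq in zip(range(1950, 2020), series):
    if year % 10 == 0:
        print(year, f"{freq:.2e}")
```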

Civic Trust: What’s In A Concept?


Article by Stefaan Verhulst, Andrew J. Zahuranec, Oscar Romero and Kim Ochilo: “We will only be able to improve civic trust once we know how to measure it…

[Image: A visualization of the ways to measure civic trust]

Recently, there’s been a noticeable decline in trust toward institutions across different sectors of society. This is a serious issue, as evidenced by surveys including the Edelman Trust Barometer, Gallup, and Pew Research.

Diminishing trust presents substantial obstacles. It threatens to weaken the foundation of a pluralistic democracy, adversely affects public health, and hinders the collaboration needed to tackle worldwide challenges such as climate change. Trust forms the cornerstone of democratic social contracts and is crucial for maintaining the civic agreements essential for the prosperity and cohesion of communities, cities, and countries alike.

Yet to increase civic trust, we need to know what we mean by it and how to measure it, which turns out to be a challenging exercise. Toward that end, The GovLab at New York University and the New York Civic Engagement Commission joined forces to catalog and identify methodologies to quantify and understand the nuances of civic trust.

“Building trust across New York is essential if we want to deepen civic engagement,” said Sarah Sayeed, Chair and Executive Director of the Civic Engagement Commission. “Trust is the cornerstone of a healthy community and robust democracy.”

This blog delves into various strategies for developing metrics to measure civic trust, informed by our own desk research, which categorizes civic trust metrics into descriptive, diagnostic, and evaluative measures…(More)”.

A complexity science approach to law and governance


Introduction to a Special Issue by Pierpaolo Vivo, Daniel M. Katz and J. B. Ruhl: “The premise of this Special Issue is that legal systems are complex adaptive systems, and thus complexity science can be usefully applied to improve understanding of how legal systems operate, perform and change over time. The articles that follow take this proposition as a given and act on it using a variety of methods applied to a broad array of legal system attributes and contexts. Yet not too long ago some prominent legal scholars expressed scepticism that this field of study would produce more than broad generalizations, if even that. To orient readers unfamiliar with this field and its history, here we offer a brief background on how using complexity science to study legal systems has advanced from claims of ‘pseudoscience’ status to a widely adopted mainstream method. We then situate and summarize the articles.

The focus of complexity science is complex adaptive systems (CAS), systems ‘in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing and adaptation via learning or evolution’. It is important to distinguish CAS from systems that are merely complicated, such as a combustion engine, or complex but non-adaptive, such as a hurricane. A forest or coastal ecosystem, for example, is a complicated network of diverse physical and biological components, which, under no central rules of control, is highly adaptive over time…(More)”.
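To make the quoted definition of a CAS tangible, here is a toy simulation of our own (not drawn from any article in the Special Issue): many agents, no central controller, and a single simple local rule each, namely copy the majority behaviour of your neighbours and occasionally experiment. An ordered collective pattern emerges from purely local interactions; all parameters are arbitrary.

```python
# Toy complex adaptive system: decentralised agents following one local rule.
# Not a model from the Special Issue; parameters are arbitrary illustrations.
import random

N_AGENTS = 200
N_NEIGHBOURS = 4
N_ROUNDS = 50
NOISE = 0.02  # chance an agent experiments with a new behaviour

# Random network: each agent observes a fixed set of peers (no central control).
neighbours = {
    i: random.sample([j for j in range(N_AGENTS) if j != i], N_NEIGHBOURS)
    for i in range(N_AGENTS)
}
state = [random.choice([0, 1]) for _ in range(N_AGENTS)]  # e.g. comply / defect

for round_no in range(N_ROUNDS):
    new_state = state[:]
    for i in range(N_AGENTS):
        if random.random() < NOISE:
            new_state[i] = random.choice([0, 1])  # occasional experimentation
        else:
            peers = [state[j] for j in neighbours[i]]
            new_state[i] = int(sum(peers) > len(peers) / 2)  # copy local majority
    state = new_state
    if round_no % 10 == 0:
        print(f"round {round_no:3d}: share complying = {sum(state) / N_AGENTS:.2f}")
```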

The Radical How


Report by Public Digital: “…We believe in the old adage about making the most of a crisis. We think the constraints facing the next government provide an unmissable opportunity to change how government works for the better.

Any mission-focused government should be well equipped to define, from day one, what outcomes it wants to bring about.

But radically changing what the government does is only part of the challenge. We also need to change how government does things. The usual methods, we argue in this paper, are too prone to failure and delay.

There’s a different approach to public service organisation, one based on multidisciplinary teams, starting with citizen needs, and scaling iteratively by testing assumptions. We’ve been arguing in favour of it for years now, and the more it gets used, the more we see success and timely delivery.

We think taking a new approach makes it possible to shift government from an organisation of programmes and projects, to one of missions and services. It offers even constrained administrations an opportunity to improve their chances of delivering outcomes, reducing risk, saving money, and rebuilding public trust…(More)”.