Establish Data Collaboratives To Foster Meaningful Public Involvement


Article by Gwen Ottinger: “Federal agencies are striving to expand the role of the public, including members of marginalized communities, in developing regulatory policy. At the same time, agencies are considering how to mobilize data of increasing size and complexity to ensure that policies are equitable and evidence-based. However, community engagement has rarely been extended to the process of examining and interpreting data. This is a missed opportunity: community members can offer critical context to quantitative data, ground-truth data analyses, and suggest ways of looking at data that could inform policy responses to pressing problems in their lives. Realizing this opportunity requires a structure for public participation in which community members can expect both support from agency staff in accessing and understanding data and genuine openness to new perspectives on quantitative analysis. 

To deepen community involvement in developing evidence-based policy, federal agencies should form Data Collaboratives in which staff and members of the public engage in mutual learning about available datasets and their affordances for clarifying policy problems…(More)”.

‘Positive deviance’ and the power of outliers


Bloomberg Cities Network: “Groundbreaking solutions in cities are often the result of visionary mayoral leadership. But sometimes certain communities achieve significantly better outcomes than their similarly resourced neighbors—and the underlying reasons may not be immediately obvious to local leaders. Ravi Gurumurthy, CEO of the global innovation foundation Nesta, believes that this variation in quality of life at a hyper-local level is something worth paying a lot more attention to. 

“The fastest way for us to improve people’s lives will be to mine that variation and really understand what is going on,” he says.    

This concept, known as “positive deviance,” describes individuals or communities that achieve remarkable success or exhibit highly effective behaviors despite facing the same constraints as their peers. With a long history of use in international development, positive deviance is now gaining traction among city leaders as a source of solutions to stubborn urban challenges.  

Here’s a closer look at what it’s about, and how it’s already being used to uplift promising approaches in cities. 

What is positive deviance? 

Positive deviance first gained widespread attention because of a remarkable success story in 1990s Vietnam. Much of the country was suffering from a malnutrition crisis, and efforts to design and implement new solutions were coming up short. But aid workers landed on a breakthrough by paying closer attention to children who already appeared larger and healthier than their peers.  

It turned out these children were being fed different diets—leaning more heavily on shrimp and crab, for example, which were widely accessible but less often fed to young people. These children also were being fed more frequently, in smaller meals, throughout the day—an intervention that, again, did not require parents to have more resources so much as to differently use what was universally available.  

When these practices—feeding kids shellfish and making meals smaller and more frequent—were replicated, malnutrition plummeted…(More)”
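
Gurumurthy's "mine that variation" idea can be made concrete. Below is a minimal Python sketch of one common screening approach: model outcomes as a function of resources, then flag the communities that most outperform the prediction. The data, column meanings, and the 2-standard-deviation cutoff are illustrative assumptions, not a method taken from Nesta or the article.

```python
# A minimal sketch of positive-deviance screening: fit a simple model of
# outcome vs. resources, then flag communities that beat the prediction
# by a wide margin. Data and column meanings here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
resources = rng.uniform(10, 100, size=200)          # e.g., spending per capita
outcomes = 0.5 * resources + rng.normal(0, 5, 200)  # baseline relationship
outcomes[17] += 20                                  # plant one positive deviant

# Fit outcome ~ resources with ordinary least squares.
slope, intercept = np.polyfit(resources, outcomes, 1)
residuals = outcomes - (slope * resources + intercept)

# Positive deviants: residuals more than 2 standard deviations above zero.
threshold = 2 * residuals.std()
deviants = np.where(residuals > threshold)[0]
print("Candidate positive deviants (indices):", deviants)
```

The flagged communities are hypotheses, not answers: as the Vietnam story shows, the real work is the qualitative follow-up that uncovers which replicable practices drive the outlier outcomes.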

How do you accidentally run for President of Iceland?



Blog by Anna Andersen: “Content design can have real consequences — for democracy, even…

To run for President of Iceland, you need to be an Icelandic citizen, at least 35 years old, and have 1,500 endorsements.

For the first time in Icelandic history, this endorsement process is digital. Instead of collecting all their signatures on paper the old-fashioned way, candidates can now send people to https://island.is/forsetaframbod to submit their endorsement.

This change has, also for the first time in Icelandic history, given the nation a clear window into who is trying to run — and it’s a remarkably large number. To date, 82 people are collecting endorsements, including a comedian, a model, the world’s first double-arm transplant recipient, and my aunt Helga.

Many of these people are seriously vying for president (yep, my aunt Helga), some of them have undoubtedly signed up as a joke (nope, not the comedian), and at least 11 of them accidentally registered and had no idea that they were collecting endorsements for their candidacy.

“I’m definitely not about to run for president, this was just an accident,” one person told a reporter after having a good laugh about it.

“That’s hilarious!” another person said, thanking the reporter for letting them know that they were in the running.

As a content designer, I was intrigued. How could so many people accidentally start a campaign for President of Iceland?

It turns out, the answer largely has to do with content design. Presidential hopefuls were sending people a link to a page where they could be endorsed, but instead of endorsing the candidate, some people accidentally registered to be a candidate…(More)”.

The Open Data Maturity Ranking is shoddy – it badly needs to be re-thought


Article by Olesya Grabova: “Digitalising government is essential for Europe’s future innovation and economic growth, and one of the keys to achieving this is open data – information that public entities gather, create, or fund, and that is accessible to all to use freely.

This includes everything from public budget details to transport schedules. Open data’s benefits are vast — it fuels research, boosts innovation, and can even save lives in wartime through the creation of chatbots with information about bomb shelter locations. It’s estimated that its economic value will reach a total of EUR 194 billion for EU countries and the UK by 2030.

This is why correctly measuring European countries’ progress in open data is so important. And that’s why the European Commission developed the Open Data Maturity (ODM) ranking, which annually measures open data quality, policies, online portals, and impact across 35 European countries.

Alas, it doesn’t work as well as it should, and this needs to be addressed.

A closer look at the report’s overall approach reveals that the ranking hardly reflects countries’ real progress on open data. This flawed system, rather than guiding countries towards genuine improvement, risks misrepresenting their actual progress and misleading citizens about their country’s advancements, which further stalls opportunities for innovation.

Take Slovakia. It’s apparently the biggest climber, leaping from 29th to 10th place in just over a year. One would expect that the country has made significant progress in making public sector information available and stimulating its reuse – one of the ODM assessment’s key elements.

A deeper examination reveals that this isn’t the case. Looking at the ODM’s methodology highlights where it falls short… and how it can be fixed…(More)”.
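
For context on how a rank jump like Slovakia’s can arise: the ODM score is a composite built from scored dimensions (policy, portal, quality, impact), so weighting and indicator choices can move countries sharply. Here is a toy Python sketch, with invented scores and an assumed equal weighting, purely to illustrate the mechanics of a composite ranking; it is not the Commission’s actual formula.

```python
# Toy illustration of a composite maturity ranking. Scores and weights
# are invented; the real ODM methodology has its own indicators.
countries = {
    "A": {"policy": 90, "portal": 60, "quality": 70, "impact": 50},
    "B": {"policy": 70, "portal": 80, "quality": 65, "impact": 85},
    "C": {"policy": 60, "portal": 75, "quality": 90, "impact": 70},
}
weights = {"policy": 0.25, "portal": 0.25, "quality": 0.25, "impact": 0.25}

def composite(scores):
    # Weighted sum across the assessment dimensions.
    return sum(weights[dim] * value for dim, value in scores.items())

ranking = sorted(countries, key=lambda c: composite(countries[c]), reverse=True)
print(ranking)  # small changes to the weights can reorder this list
```

Because the published rank is an aggregate like this, methodological choices can reorder countries without any change on the ground, which is the nub of the critique above.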

The economic research policymakers actually need


Blog by Jed Kolko: “…The structure of academia just isn’t set up to produce the kind of research many policymakers need. Instead, top academic journal editors and tenure committees reward research that pushes the boundaries of the discipline and makes new theoretical or empirical contributions. And most academic papers presume familiarity with the relevant academic literature, making it difficult for anyone outside of academia to make the best possible use of them.

The most useful research often came instead from regional Federal Reserve banks, non-partisan think-tanks, the corporate sector, and academics who had the support, freedom, or job security to prioritize policy relevance. It generally fell into three categories:

  1. New measures of the economy
  2. Broad literature reviews
  3. Analyses that directly quantify or simulate policy decisions

If you’re an economic researcher and you want to do work that is actually helpful for policymakers — and increases economists’ influence in government — aim for one of those three buckets.

The pandemic and its aftermath brought an urgent need for data at higher frequency, with greater geographic and sectoral detail, and about ways the economy suddenly changed. Some of the most useful research contributions during that period were new data and measures of the economy: they were valuable as ingredients rather than as recipes or finished meals. Here are some examples…(More)”.

Millions of gamers advance biomedical research


Article by McGill: “…4.5 million gamers around the world have advanced medical science by helping to reconstruct microbial evolutionary histories using a minigame included inside the critically and commercially successful video game, Borderlands 3. Their playing has led to a significantly refined estimate of the relationships of microbes in the human gut. The results of this collaboration will both substantially advance our knowledge of the microbiome and improve on the AI programs that will be used to carry out this work in the future.

By playing Borderlands Science, a mini-game within the looter-shooter video game Borderlands 3, these players have helped trace the evolutionary relationships of more than a million different kinds of bacteria that live in the human gut, some of which play a crucial role in our health. This information represents an exponential increase in what we have discovered about the microbiome until now. By aligning rows of tiles which represent the genetic building blocks of different microbes, humans have been able to take on tasks that even the best existing computer algorithms have so far been unable to solve…(More) (and More)”.
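
The tile-alignment puzzles players solved are, at bottom, instances of sequence alignment. As a point of reference, here is a minimal Python sketch of Needleman-Wunsch, the textbook dynamic-programming algorithm for globally aligning two sequences; it is illustrative only, not the Borderlands Science pipeline.

```python
# Minimal global alignment (Needleman-Wunsch) between two DNA fragments.
# Standard textbook algorithm; not the pipeline used by Borderlands Science.
def align(a, b, match=1, mismatch=-1, gap=-1):
    n, m = len(a), len(b)
    # score[i][j] = best score for aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap          # align a[:i] against gaps
    for j in range(1, m + 1):
        score[0][j] = j * gap          # align b[:j] against gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
    return score[n][m]

print(align("GATTACA", "GCATGCU"))  # best achievable global alignment score
```

Exact dynamic programming like this gets expensive fast as the number of sequences grows, which is one reason breaking a million-sequence problem into small, human-solvable puzzles was attractive.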

Evidence Ecosystems and the Challenge of Humanising and Normalising Evidence


Article by Geoff Mulgan: “It is reasonable to assume that the work of governments, businesses and civil society goes better if the people making decisions are well-informed, using reliable facts and strong evidence rather than only hunch and anecdote.  The term ‘evidence ecosystem’ is a useful shorthand for the results of systematic attempts to make this easier, enabling decision makers, particularly in governments, to access the best available evidence, in easily digestible forms and when it’s needed.  

…This sounds simple.  But these ecosystems are as varied as ecosystems in nature.  How they work depends on many factors, including how political or technical the issues are; the presence or absence of confident, well-organised professions; the availability of good quality evidence; whether there is a political culture that values research; and much more.

In particular, the paper argues that the next generation of evidence ecosystems need a sharper understanding of how the supply of evidence meets demand, and the human dimension of evidence.  That means cultivating lasting relationships rather than relying too much on a linear flow of evidence from researchers to decision-makers; it means using conversation as much as prose reports to ensure evidence is understood and acted on; and it means making use of stories as well as dry analysis.  It depends, in other words, on recognising that the users of evidence are humans.

In terms of prescription the paper emphasises:

  • Sustainability/normalisation: the best approaches are embedded, part of the daily life of decision-making rather than depending on one-off projects and programmes.  This applies both to evidence and to data.  Yet embeddedness is the exception rather than the rule.
  • Multiplicity: multiple types of knowledge, and logics, are relevant to decisions, which is why people and institutions that understand these different logics are so vital.  
  • Credibility and relationships: the intermediaries who connect the supply and demand of knowledge need to be credible, with both depth of knowledge and an ability to interpret it for diverse audiences, and they need to be able to create and maintain relationships, which will usually be either place or topic based, and will take time to develop, with the communication of evidence often done best in conversation.
  • Stories: influencing decision-makers depends on indirect as well as direct communication, since the media in all their forms play a crucial role in validating evidence and evidence travels best with stories, vignettes and anecdotes.

In short, while evidence is founded on rigorous analysis, good data and robust methods, it also needs to be humanised – embedded in relationships, brought alive in conversations and vivid, human stories – and normalised, becoming part of everyday work…(More)”.

How Copyright May Destroy Our Access To The World’s Academic Knowledge


Article by Glyn Moody: “The shift from analogue to digital has had a massive impact on most aspects of life. One area where that shift has the potential for huge benefits is in the world of academic publishing. Academic papers are costly to publish and distribute on paper, but in a digital format they can be shared globally for almost no cost. That’s one of the driving forces behind the open access movement. But as Walled Culture has reported, resistance from the traditional publishing world has slowed the shift to open access, and undercut the benefits that could flow from it.

That in itself is bad news, but new research from Martin Paul Eve (available as open access) shows that the way the shift to digital has been managed by publishers brings with it a new problem. For all their flaws, analogue publications have the great virtue that they are durable: once a library has a copy, it is likely to be available for decades, if not centuries. Digital scholarly articles come with no such guarantee. The Internet is constantly in flux, with many publishers and sites closing down each year, often without notice. That’s a problem when sites holding archival copies of scholarly articles vanish, making it harder, perhaps impossible, to access important papers. Eve explored whether publishers were placing copies of the articles they published in key archives. Ideally, digital papers would be available in multiple archives to ensure resilience, but the reality is that very few publishers did this. Ars Technica has a good summary of Eve’s results:

When Eve broke down the results by publisher, less than 1 percent of the 204 publishers had put the majority of their content into multiple archives. (The cutoff was 75 percent of their content in three or more archives.) Fewer than 10 percent had put more than half their content in at least two archives. And a full third seemed to be doing no organized archiving at all.

At the individual publication level, under 60 percent were present in at least one archive, and over a quarter didn’t appear to be in any of the archives at all. (Another 14 percent were published too recently to have been archived or had incomplete records.)…(More)”.
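
The coverage breakdown Eve reports is conceptually simple to compute once you have, for each article, the set of archives holding it. A hypothetical Python sketch of the per-publisher tally (the records, publisher names, and archive sets below are invented placeholders):

```python
# Hypothetical sketch of the coverage breakdown described above:
# for each publisher, what share of its articles sit in 3+ archives?
from collections import defaultdict

# (publisher, article DOI, set of archives holding it) -- invented records
records = [
    ("pub_a", "10.1/x1", {"CLOCKSS", "Portico", "PKP PN"}),
    ("pub_a", "10.1/x2", {"Portico"}),
    ("pub_b", "10.2/y1", set()),           # not archived anywhere
    ("pub_b", "10.2/y2", {"CLOCKSS"}),
]

by_publisher = defaultdict(list)
for publisher, _, archives in records:
    by_publisher[publisher].append(len(archives))

for publisher, counts in by_publisher.items():
    share = sum(c >= 3 for c in counts) / len(counts)
    print(f"{publisher}: {share:.0%} of articles in three or more archives")
```

Scaled up to millions of DOIs, tallies of exactly this shape are what yield the “less than 1 percent” and “a full third” figures quoted above.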

Monitoring global trade using data on vessel traffic


Article by Graham Pilgrim, Emmanuelle Guidetti and Annabelle Mourougane: “Rising uncertainties and geo-political tensions, together with more complex trade relations, have increased the demand for data and tools to monitor global trade in a timely manner. At the same time, advances in Big Data Analytics and access to a huge quantity of alternative data – outside the realm of official statistics – have opened new avenues to monitor trade. These data can help identify bottlenecks and disruptions in real time but need to be cleaned and validated.

One such alternative data source is the Automatic Identification System (AIS), developed by the International Maritime Organisation, facilitating the tracking of vessels across the globe. The system includes messages transmitted by ships to land or satellite receivers, available in quasi real time. While it was primarily designed to ensure vessel safety, this data is particularly well suited for providing insights on trade developments, as over 80% of international merchandise trade by volume is carried by sea (UNCTAD, 2022). Furthermore, AIS data holds granular vessel information and detailed location data, which combined with other data sources can enable the identification of activity at a port (or even berth) level, by vessel type or by the jurisdiction of vessel ownership.

For a number of years, the UN Global Platform has made AIS data available to those compiling official statistics, such as National Statistics Offices (NSOs) or International Organisations. This has facilitated the development of new methodologies, for instance the automated identification of port locations (Irish Central Statistics Office, 2022). The data has also been exploited by data scientists and research centres to monitor trade in specific commodities such as Liquefied Natural Gas (QuantCube Technology, 2022) or to analyse port and shipping operations in a specific country (Tsalamanis et al., 2018). Beyond trade, the dataset has been used to track CO2 emissions from the maritime sector (Clarke et al., 2023).

New work from the OECD Statistics and Data Directorate contributes to existing research in this field in two major ways. First, it proposes a new methodology to identify ports, at a higher level of precision than in past research. Second, it builds indicators to monitor port congestion and trends in maritime trade flows and provides a tool to get detailed information and better understand those flows…(More)”.
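
The port-identification step lends itself to a brief illustration: vessels loading or unloading transmit near-zero speeds, so their position reports pile up around berths. One plausible approach, sketched below in Python with invented data, is to keep only near-stationary AIS pings and cluster them with DBSCAN; the OECD paper’s actual methodology is more elaborate and is not reproduced here.

```python
# Sketch: find candidate port/berth zones by clustering near-stationary
# AIS position reports with DBSCAN. Data are invented; the OECD method
# described in the article is more elaborate.
import numpy as np
from sklearn.cluster import DBSCAN

# Columns: latitude, longitude, speed over ground (knots) -- hypothetical pings
pings = np.array([
    [51.950, 4.140, 0.1], [51.951, 4.141, 0.0], [51.950, 4.142, 0.2],  # berth 1
    [51.980, 4.060, 0.1], [51.981, 4.061, 0.0],                        # berth 2
    [52.100, 3.900, 14.5],                                             # underway
])

stationary = pings[pings[:, 2] < 0.5]   # keep only near-zero-speed reports
# eps is in degrees (~0.005 is roughly 500 m at this latitude); a real
# pipeline would project coordinates to metres before clustering.
labels = DBSCAN(eps=0.005, min_samples=2).fit_predict(stationary[:, :2])

for label in set(labels) - {-1}:        # -1 marks noise points
    centroid = stationary[labels == label, :2].mean(axis=0)
    print(f"candidate berth cluster {label}: centroid {centroid}")
```

Counting distinct vessels entering such clusters per day, broken down by vessel type, is then one plausible route to the congestion and trade-flow indicators the article describes.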

What Does Information Integrity Mean for Democracies?


Article by Kamya Yadav and Samantha Lai: “Democracies around the world are encountering unique challenges with the rise of new technologies. Experts continue to debate how social media has impacted democratic discourse, pointing to how algorithmic recommendations, influence operations, and cultural changes in norms of communication alter the way people consume information. Meanwhile, developments in artificial intelligence (AI) surface new concerns over how the technology might affect voters’ decision-making process. Already, we have seen its increased use in relation to political campaigning.

In the run-up to Pakistan’s 2024 general elections, former Prime Minister Imran Khan used an artificially generated speech to campaign while imprisoned. Meanwhile, in the United States, a private company used an AI-generated imitation of President Biden’s voice to discourage people from voting. In response, the Federal Communications Commission outlawed the use of AI-generated robocalls.

Evolving technologies present new threats. Disinformation, misinformation, and propaganda are all different faces of the same problem: Our information environment—the ecosystem in which we disseminate, create, receive, and process information—is not secure, and we lack coherent goals to direct policy actions. Formulating short-term, reactive policy to counter or mitigate the effects of disinformation or propaganda can only bring us so far. Beyond defending democracies from unending threats, we should also be looking at what it will take to strengthen them. This raises the question: How do we work toward building secure and resilient information ecosystems? How can policymakers and democratic governments identify policy areas that require further improvement and shape their actions accordingly?…(More)”.