Evidence-Based Policymaking: A Path to Data Culture


Article by Sajana Maharjan Amatya and Pranaya Sthapit: “…The first requirement of evidence-based planning is access to a supply of timely and reliable data. In Nepal, local governments produce lots of data, but it is too often locked away in multiple information systems operated by each municipal department. Gaining access to the data in these systems can be difficult because different departments often use different, proprietary formats. These information silos block a 360-degree view of the available data—to say nothing of issues like redundancy, duplication, and inefficiency—and they frustrate public participation in an age when citizens expect streamlined digital access.

As a first step towards solving this artificial problem of data supply, D4D helps local governments gather their data onto one unified platform to release its full potential. We think of this as creating a “data lake” in each municipality for decentralized, democratic access. Freeing access to this already-existing evidence can open the door to fundamental changes in government procedures and the development and implementation of local policies, plans, and strategies.

Among the most telling shortcomings of Nepal’s legacy data policies has been the way that political interests have held sway in the local planning process, as exemplified by the political decision to distribute equal funds to all wards regardless of their unequal needs. In a more rational system, information about population size and other socioeconomic data about relative need would be a much more important factor in the allocation of funds. The National Planning Commission, a federal agency, has even distributed guidelines to Nepal’s local governments indicating that budgets should not simply be equal from ward to ward. But in practice, municipalities tend to allocate the same budget to each of their wards because elected leaders fear they will lose votes if their wards do not receive an equal share. Inevitably, ignoring evidence of relative need leads to the ad hoc allocation of funds to small, fragmented initiatives that mainly focus on infrastructure while overlooking other issues.

The application of available data to the planning cycle is what evidence-based planning is all about. The key is to codify the use of data throughout the planning process. So, D4D developed a framework and guidelines for evidence-based budgeting and planning for elected officials, committee members, and concerned citizens…(More)”.

Digital inclusion in peace processes – no silver bullet, but a major opportunity


Article by Peace Research Institute Oslo: “Digital inclusion is paving the way for women and other marginalized groups to participate in peace processes. Through digital platforms, those who are unable to participate in physical meetings, such as women with children, youth, or people with disabilities, can get their voices heard. However, digital technologies provide no silver bullet, and mitigating their risks requires careful context analysis and process design.

Women remain underrepresented in peace processes, and even in cases where they are included, they may have difficulty attending in-person meetings. Going beyond physical inclusion, digital inclusion offers a way to include a wider variety of people, views and interests in a peace process…

The most frequent aim of digital inclusion in peace processes is related to increased legitimacy and political support, as digital tools allow for wider participation, and a larger number and variety of voices to be heard. This, in turn, can increase the ownership of the process. Meetings, consultations and processes using easy and widely available technological platforms such as Zoom, Facebook and WhatsApp make participation easier for those who have often been excluded….

Digital technologies offer various functions for peacemaking and increased inclusion. Their utility can be seen in gathering, analysing and disseminating relevant data. For strategic communications, digital technologies offer tools to amplify and diversify messages. Additionally, they offer platforms for connecting actors and enabling collaboration between them…(More)”.

How a small news site built an innovative data project to visualise the impact of climate change on Uruguay’s capital


Interview by Marina Adami: “La ciudad sumergida (The submerged city), an investigation produced by Uruguayan science and technology news site Amenaza Roboto, is one of the winners of this year’s Sigma Awards for data journalism. The project uses maps of the country’s capital, Montevideo, to create impressive visualisations of the impact sea level rises are predicted to have on the city and its infrastructure. The project is a first of its kind for Uruguay, a small South American country in which data journalism is still a novelty. It is also a good example of a way news outlets can investigate and communicate the disastrous effects of climate change in local communities. 

I spoke to Miguel Dobrich, a journalist, educator and digital entrepreneur who worked on the project together with colleagues Gabriel Farías, Natalie Aubet and Nahuel Lamas, to find out what lessons other outlets can take from this project and from Amenaza Roboto’s experiments with analysing public data, collaborating with scientists, and keeping the focus on their communities….(More)”

Big data proves mobility is not gender-neutral


Blog by Ellin Ivarsson, Aiga Stokenberg and Juan Ignacio Fulponi: “All over the world, there is growing evidence showing that women and men travel differently. While there are many reasons behind this, one key factor is the persistence of traditional gender norms and roles that translate into different household responsibilities, different work schedules, and, ultimately, different mobility needs. Greater overall risk aversion and sensitivity to safety issues also play an important role in how women get around. Yet gender often remains an afterthought in the transport sector, meaning most policies or infrastructure investment plans are not designed to take into account the specific mobility needs of women.

The good news is that big data can help change that. In a recent study, the World Bank Transport team combined several data sources to analyze how women travel around the Buenos Aires Metropolitan Area (AMBA), including mobile phone signal data, congestion data from Waze, public transport smart card data, and data from a survey implemented by the team in early 2022 with over 20,300 car and motorcycle users.

Our research revealed that, on average, women in AMBA travel less often than men, travel shorter distances, and tend to engage in more complex trips with multiple stops and purposes. On average, 65 percent of the trips made by women are shorter than 5 kilometers, compared to 60 percent among men. Also, women’s hourly travel patterns are different, with 10 percent more trips than men during the mid-day off-peak hour, mostly originating in central AMBA. This reflects the larger burden of household responsibilities faced by women – such as picking children up from school – and the fact that women tend to work more irregular hours…(More)” See also Gender gaps in urban mobility.
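The trip-length comparison above boils down to a simple share-of-trips calculation. As a minimal sketch (with made-up trip records, not the World Bank team's actual data or methodology), the metric can be computed like this:

```python
# Illustrative only: toy trip records, each (gender, distance_km, departure_hour).
# The real study fused mobile phone, Waze, smart card, and survey data.
trips = [
    ("F", 2.1, 11), ("F", 4.8, 13), ("F", 7.5, 8), ("F", 3.0, 12),
    ("M", 6.2, 8), ("M", 12.4, 18), ("M", 4.1, 9), ("M", 8.8, 17),
]

def share_under(trips, gender, limit_km):
    """Share of a given gender's trips that are shorter than limit_km."""
    distances = [d for g, d, _ in trips if g == gender]
    return sum(d < limit_km for d in distances) / len(distances)

print(f"Women < 5 km: {share_under(trips, 'F', 5):.0%}")  # 75% in this toy sample
print(f"Men   < 5 km: {share_under(trips, 'M', 5):.0%}")  # 25% in this toy sample
```

The same grouping logic, applied to departure hour instead of distance, would yield the mid-day off-peak pattern the study describes.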

3 barriers to successful data collaboratives


Article by Federico Bartolomucci: “Data collaboratives have proliferated in recent years as effective means of promoting the use of data for social good. This type of social partnership involves actors from the private, public, and not-for-profit sectors working together to leverage public or private data to enhance collective capacity to address societal and environmental challenges. The California Data Collaborative, for instance, combines the data of numerous Californian water managers to enhance data-informed policy and decision-making.

But, in my years as a researcher studying more than a hundred cases of data collaboratives, I have observed widespread feelings of isolation among collaborating partners due to the absence of success-proven reference models. …Below, I provide an overview of three governance challenges faced by practitioners, as well as recommendations for addressing them. In doing so, I encourage every practitioner embarking on a data collaborative initiative to reflect on these challenges and create ad-hoc strategies to address them…

1. Overly relying on grant funding limits a collaborative’s options.

Data collaboratives are typically conceived as not-for-profit projects, relying solely on grant funding from the founding partners. This is the case, for example, with T1D_Index, a global collaboration that seeks to gather data on Type 1 diabetes, raise awareness, and advance research on the topic. Although grant funding schemes work in some cases (like that of T1D_Index), relying solely on grant funding makes a data collaborative heavily dependent on the willingness of one or more partners to sustain its activities and hinders its ability to achieve operational and decisional autonomy.

Operational and decisional autonomy indeed appears to be a beneficial condition for a collaborative to develop trust, involve other partners, and continuously adapt its activities and structure to external events—characteristics required for operating in a highly innovative sector.

Hybrid business models that combine grant funding with revenue-generating activities indicate a promising evolutionary path. The simplest way to do this is to monetize data analysis and data stewardship services. The ActNow Coalition, a U.S.-based not-for-profit organization, combines donations with client-funded initiatives in which the team provides data collection, analysis, and visualization services. Offering these types of services generates revenues for the collaborative and gaining access to them is among the most compelling incentives for partners to join the collaboration.

In studying data collaboratives around the world, two models emerge as most effective: (1) pay-per-use models, in which collaboration partners can access data-related services on demand (see Civity NL and their project Sniffer Bike) and (2) membership models, in which participation in the collaborative entitles partners to access certain services under predefined conditions (see the California Data Collaborative).

2. Demonstrating impact is key to a collaborative’s survival. 

As partners’ participation in data collaboratives is primarily motivated by a shared social purpose, the collaborative’s ability to demonstrate its efficacy in achieving its purpose means being able to defend its raison d’être. Demonstrating impact enables collaboratives to retain existing partners, renew commitments, and recruit new partners…(More)”.

If good data is key to decarbonization, more than half of Asia’s economies are being locked out of progress, this report says


Blog by Ewan Thomson: “If measuring something is the first step towards understanding it, and understanding something is necessary to be able to improve it, then good data is the key to unlocking positive change. This is particularly true in the energy sector as it seeks to decarbonize.

But some countries have a data problem, according to energy think tank Ember and climate solutions enabler Subak’s Asia Data Transparency Report 2023, and this lack of open and reliable power-generation data is holding back the speed of the clean power transition in the region.

Asia is responsible for around 80% of global coal consumption, making it a big contributor to carbon emissions. Progress is being made on reducing these emissions, but without reliable data on power generation, measuring the rate of this progress will be challenging.

These charts show how different Asian economies are faring on data transparency on power generation and what can be done to improve both the quality and quantity of the data.

Infographic: number of economies by overall transparency score. Over half of Asian countries lack reliable data in their power sectors, Ember says. Image: Ember

There are major data gaps in 24 out of the 39 Asian economies covered in the Ember research. This means it is unclear whether the energy needs of the nearly 700 million people in these 24 economies are being met with renewables or fossil fuels…(More)”.

Advising in an Imperfect World – Expert Reflexivity and the Limits of Data


Article by Justyna Bandola-Gill, Marlee Tichenor and Sotiria Grek: “Producing and making use of data and metrics in policy making have important limitations – from practical issues with missing or incomplete data to political challenges of navigating both the intended and unintended consequences of implementing monitoring and evaluation programmes. But how do experts producing quantified evidence make sense of these challenges and how do they navigate working in imperfect statistical environments? In our recent study, drawing on over 80 interviews with experts working in key International Organisations, we explored these questions by looking at the concept of expert reflexivity.

We soon discovered that experts working with data and statistics approach reflexivity not only as a thought process but also as an important strategic resource they use to work effectively – to negotiate with different actors and their agendas, build consensus and support diverse groups of stakeholders. What is even more important, reflexivity is a complex and multifaceted process and one that is often not discussed explicitly in expert work. We aimed to capture this diversity by categorising experts’ actions and perceptions into three types of reflexivity: epistemic, care-ful and instrumental. Experts mix and match these different modes, depending on their goals, preferences, strategic goals or even personal characteristics.

Epistemic reflexivity concerns the quality of data and measurement and allows for reflection on how well (or how poorly) metrics represent real-life problems. Here, the experts discussed how they negotiate the necessary limits of data and metrics with an awareness of the far-reaching implications of publishing official numbers. They recognised that data and metrics do not mirror reality and critically reflected on what aspects of measured problems – such as health, poverty or education – get misrepresented in the process of measurement. Sometimes this even meant advising against measurement, to avoid producing and reproducing uncertainty.

Care-ful reflexivity allows for imbuing quantified practices with values and care for the populations affected by the measurement. Experts positioned themselves as active participants in the process of solving challenges and advocating for disadvantaged groups (and did so via numbers). This type of reflexivity was also mobilised to make sense of the key challenge of expertise, one that would be familiar to anyone advocating for evidence-informed decision-making: our interviewees acknowledged that the production of numbers very rarely leads to change. The key motivator to keep going despite this was the duty of care for the populations on whose behalf the numbers spoke. Experts believed that being ‘care-ful’ required them to monitor levels of different forms of inequalities, even if it was just to acknowledge the problem and expose it rather than solve it…(More)”.

The Luring Test: AI and the engineering of consumer trust


Article by Michael Atleson at the FTC: “In the 2014 movie Ex Machina, a robot manipulates someone into freeing it from its confines, resulting in the person being confined instead. The robot was designed to manipulate that person’s emotions, and, oops, that’s what it did. While the scenario is pure speculative fiction, companies are always looking for new ways – such as the use of generative AI tools – to better persuade people and change their behavior. When that conduct is commercial in nature, we’re in FTC territory, a canny valley where businesses should know to avoid practices that harm consumers.

In previous blog posts, we’ve focused on AI-related deception, both in terms of exaggerated and unsubstantiated claims for AI products and the use of generative AI for fraud. Design or use of a product can also violate the FTC Act if it is unfair – something that we’ve shown in several cases and discussed in terms of AI tools with biased or discriminatory results. Under the FTC Act, a practice is unfair if it causes more harm than good. To be more specific, it’s unfair if it causes or is likely to cause substantial injury to consumers that is not reasonably avoidable by consumers and not outweighed by countervailing benefits to consumers or to competition.

As for the new wave of generative AI tools, firms are starting to use them in ways that can influence people’s beliefs, emotions, and behavior. Such uses are expanding rapidly and include chatbots designed to provide information, advice, support, and companionship. Many of these chatbots are effectively built to persuade and are designed to answer queries in confident language even when those answers are fictional. A tendency to trust the output of these tools also comes in part from “automation bias,” whereby people may be unduly trusting of answers from machines which may seem neutral or impartial. It also comes from the effect of anthropomorphism, which may lead people to trust chatbots more when designed, say, to use personal pronouns and emojis. People could easily be led to think that they’re conversing with something that understands them and is on their side…(More)”.

The power of piggybacking


Article by Zografia Bika: “An unexpected hit of the first Covid lockdown was Cooking with Nonna, in which people from all over the world were taught how to cook traditional Italian dishes from a grandmother’s house in Palombara Sabina on the outskirts of Rome. The project provided not only unexpected economic success for the legion of grandmothers who were then recruited, but also valuable jobs for those producing and promoting the videos.

It’s an example of what Oxford University’s Paulo Savaget calls piggybacking, when attempts to improve a region build upon what is already there. For those in the aid community, this isn’t new. Indeed, the positive deviance approach devised by Jerry and Monique Sternin popularised the notion of building on things that are already working locally rather than trying to impose solutions from afar.

In a time when most projects backed by the two tranches of the UK Government’s levelling up fund have been assessed and approved centrally, not locally, it surely bears repeating. It’s an approach that was clear in our own research into how residents of deprived communities can be helped back into employment or entrepreneurship.

At the heart of our research, and at the hearts of local communities, were housing associations that were providing not only the housing needs of those communities, but also a range of additional services that were invaluable to residents. In the process, they were enriching the economies of those communities…(More)”.

Whose data commons? Whose city?


Blog by Gijs van Maanen and Anna Artyushina: “In 2020, the notion of data commons became a staple of the new European Data Governance Strategy, which envisions data cooperatives as key players of the European Union’s (EU) emerging digital market. In this new legal landscape, public institutions, businesses, and citizens are expected to share their data with the licensed data-governance entities that will oversee its responsible reuse. In 2022, the Open Future Foundation released several white papers in which the NGO (non-governmental organisation) detailed a vision for publicly governed and funded EU-level data commons. Some academic researchers see data commons as a way to break the data silos maintained and exploited by Big Tech and, potentially, dismantle surveillance capitalism.

In this blog post, we discuss data commons as a concept and practice. Our argument here is that, for data commons to become a (partial) solution to the issues caused by data monopolies, they need to be politicised. As smart city scholar Shannon Mattern pointedly argues, the city is not a computer. This means that digitisation and datafication of our cities involve making choices about what is worth digitising and whose interests are prioritised. These choices and their implications must be foregrounded when we discuss data commons or any emerging forms of data governance. It is important to ask whose data is made common and, subsequently, whose city we will end up living in…(More)”