Report by the World Economic Forum: “Digital transformation is becoming a crucial support mechanism for countries as they respond to the COVID-19 pandemic and undergo economic rebuilding and sustained development. For small and medium-sized cities (SMCs), digital transformation can disrupt traditional business models, break through geographical and spatial boundaries, and create new ways to live in the digital era. However, the digital transformation of SMCs presents challenges such as insufficient digital talent, funds, and resources, poor understanding and application of digital technologies, and a lack of intercity interaction and cooperation mechanisms. This report analyses the challenges, needs, and concerns of SMCs undergoing digital transformation in China, Japan, Brazil, and Singapore, proposes a methodological reference model, and suggests actions for various urban stakeholders…(More)”.
Modernizing Agriculture Data Infrastructure to Improve Economic and Ecological Outcomes
Paper by the AGree Initiative and the Data Foundation: “The paper highlights the necessity of data innovation to address a growing number of critical short- and long-term food and agricultural issues, including agricultural production, environmental sustainability, nutrition assistance, food waste, and food and farm labor. It concludes by offering four practical options modeled on effective case studies of data acquisition, management, and use in other sectors.
Given the increasingly dynamic conditions in which the sector operates, the modernization of agricultural data collection, storage, and analysis will equip farmers, ranchers, and the U.S. Department of Agriculture (USDA) with tools to adapt, innovate, and ensure a food-secure future.
While USDA has made strides over the years, to truly unlock the potential of data to improve farm productivity and the resilience of rural communities, the department must establish a more effective data infrastructure, which will require addressing gaps in USDA’s mandate and authorities across its agencies and programs.
The white paper explores four options modeled on effective case studies of data acquisition, management, and use in other sectors:
- Centralized Data Infrastructure Operated by USDA
- Centralized Data Infrastructure Operated by a Non-Governmental Intermediary
- Data Linkage Hub Operated by a Non-USDA Agency in the Federal Government
- Contractual Model with Relevant Partners
Each of the models considered offers opportunities for collaboration with farmers and other stakeholders to ensure there are clear benefits and to address shortfalls in the current system. Careful consideration of the trade-offs of each option is critical given the dynamic weather and economic challenges the agriculture sector faces and the potential new economic opportunities that may be unlocked by harnessing the power of data…(More)”.
Global Data Barometer
Report and Site by the Global Data Barometer: “This report provides an overview of the Global Data Barometer findings. The Barometer includes 39 primary indicators and over 500 sub-questions, covering 109 countries (delivering more than 60,000 data points in total). In this report, we select just a few of these to explore, providing a non-exhaustive overview of some of the topics that could be explored further using Barometer data.
- Section 1 provides a short overview of the key concepts used in the Barometer, and a short description of the methodology.
- Section 2 looks at the four key pillars of the Barometer (governance, capability, availability and use), and provides headlines from each.
- Section 3 provides a regional analysis, drawing on insights from Barometer regional hubs to understand the unique context of each region, and the relative strengths and weaknesses of countries.
- Section 4 provides a short summary of learning from the first edition, and highlights directions for future work.
The full methodology, and details of how to access and work further with Barometer data, are contained in Appendices…(More)”.
Lexota
Press Release: “Today, Global Partners Digital (GPD), the Centre for Human Rights at the University of Pretoria (CHR), Article 19 West Africa, the Collaboration on International ICT Policy in East and Southern Africa (CIPESA) and PROTEGE QV jointly launch LEXOTA—Laws on Expression Online: Tracker and Analysis, a new interactive tool to help human rights defenders track and analyse government responses to online disinformation across Sub-Saharan Africa.
Expanding on work started in 2020, LEXOTA offers a comprehensive overview of laws, policies and other government actions on disinformation in every country in Sub-Saharan Africa. The tool is powered by multilingual data and context-sensitive insight from civil society organisations and uses a detailed framework to assess whether government responses to disinformation are human rights-respecting. A dynamic comparison feature empowers users to examine the regulatory approaches of different countries and to compare how different policy responses measure up against human rights standards, providing them with insights into trends across the region as well as the option to examine country-specific analyses.
In recent years, governments in Sub-Saharan Africa have increasingly responded to disinformation through content-based restrictions and regulations, which often pose significant risks to individuals’ right to freedom of expression. LEXOTA was developed to support those working to defend internet freedom and freedom of expression across the region, by making data on these government actions accessible and comparable…(More)”.
A Consumer Price Index for the 21st Century
Press Release by the National Academies of Sciences, Engineering, and Medicine: “The Bureau of Labor Statistics (BLS) should undertake a new strategy to modernize the Consumer Price Index by accelerating its use of new data sources and developing price indexes based on different income levels, says a new report from the National Academies of Sciences, Engineering, and Medicine.
The Consumer Price Index is the most widely used measure of inflation in the U.S. It is used to determine cost-of-living allowances and, importantly, influences monetary policy, among many other private- and public-sector applications. The new report, Modernizing the Consumer Price Index for the 21st Century, says the index has traditionally relied on field-generated data, such as prices observed in person at grocery stores or major retailers. These data have become more challenging and expensive to collect, and the availability of vast digital sources of consumer price data presents an opportunity. BLS has begun tapping into these data and has said its objective is to switch a significant portion of its measurement to nontraditional and digital data sources by 2024.
“The enormous economic disruption of the COVID-19 pandemic presents a perfect case study for the need to rapidly employ new data sources for the Consumer Price Index,” said Daniel E. Sichel, professor of economics at Wellesley College, and chair of the committee that wrote the report. “Modernizing the Consumer Price Index can help our measurement of household costs and inflation be more accurate, timelier, and ultimately more useful for policymakers responding to rapidly changing economic conditions.”…
The report says BLS should embark on a strategy of accelerating and enhancing the use of scanner, web-scraped, and digital data directly from retailers in compiling the Consumer Price Index. Scanner data — recorded at the point of sale or by consumers in their homes — can expand the variety of products represented in the Consumer Price Index, and better detect shifts in buying patterns. Web-scraped data can more nimbly track the prices of online goods, and goods where one company dominates the market. Permanently automating web-scraping of price data should be a high priority for the Consumer Price Index program, especially for food, electronics, and apparel, the report says.
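By way of illustration only (this sketch is not from the report, and the URL and CSS selector are hypothetical placeholders), automated web-scraping of a posted price can be as simple as fetching a product page on a schedule and extracting the price element:

```python
# Minimal sketch of automated price collection via web scraping.
# Assumes the `requests` and `beautifulsoup4` packages; the URL and
# ".price" selector are hypothetical, and a production CPI pipeline
# would add scheduling, retries, and validation.
import requests
from bs4 import BeautifulSoup

def fetch_price(url: str, selector: str) -> float:
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.select_one(selector)
    if tag is None:
        raise ValueError(f"No element matched {selector!r}")
    # Normalize e.g. "$1,299.00" -> 1299.0
    return float(tag.get_text(strip=True).lstrip("$").replace(",", ""))

if __name__ == "__main__":
    print(fetch_price("https://example.com/products/123", ".price"))
```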
Embracing these alternative data sources now will ensure that the accuracy and timeliness of the Consumer Price Index will not be compromised in the future, the report adds. Moreover, accelerating this process will give BLS time to carefully assess new data sources and methodologies before taking the decision to incorporate them in the official index…(More)”.
Under Construction: Citizen Participation in the European Union
Paper by Dominik Hierlemann: “Four out of five European citizens want to have a bigger say in EU policymaking. Already now, they can participate in the European Union through elections, citizens’ initiatives, consultations, petitions, dialogues, and the Ombudsman. But how well do these participation instruments work? Do citizens know about them? What is their impact on EU policymaking? This study examines seven EU participation instruments in depth. It finds that the EU offers a patchwork of participation instruments that work well in some respects but remain largely unknown and create little impact. To strengthen the voice of European citizens, the EU should move from its participation patchwork to a coherent participation infrastructure. Voting every five years is not enough. A democratically accountable and legitimate EU depends on the ongoing and effective participation of citizens…(More)”.
Data sharing between humanitarian organisations and donors
Report by Larissa Fast: “This report investigates issues related to data sharing between humanitarian actors and donors, with a focus on two key questions:
- What formal or informal frameworks govern the collection and sharing of disaggregated humanitarian data between humanitarian actors and donors?
- How are these frameworks and the related requirements understood or perceived by humanitarian actors and donors?
Drawing on interviews with donors and humanitarians about data sharing practices and examination of formal documents, the research finds that, overall and perhaps most importantly, references to ‘data’ in the context of humanitarian operations are usually generic and lack a consistent definition or even a shared terminology. Complex regulatory frameworks, variability in donor expectations both among and within donor governments (e.g., at the country or field/headquarters levels), and humanitarians’ differing experiences of data sharing all complicate the nature and handling of data sharing requests. Both a lack of data literacy and differing perceptions of operational data management risks exacerbate many issues related to data sharing and create inconsistent practice (see full summary of findings in Table 3).
More specifically, while much formal documentation about data sharing between humanitarians and donors is available in the public domain, few documents contain explicit policies or clauses on data sharing, instead referring only to financial or compliance data and programme reporting requirements. Additionally, the justifications for sharing disaggregated humanitarian data are framed most often in terms of accountability, compliance, efficiency, and programme design. Most requests for data are linked to monitoring and compliance, as well as requests for data as ‘assurances’. Even so, donors indicated that although they request detailed/disaggregated data, they may not have the time or the human and/or technical capacity to deal with it properly. In general, donor interviewees insisted that no record-level data is shared within their governments, only aggregated data or data in low- or no-sensitivity formats…(More)”.
Can Algorithmic Recommendation Systems Be Good For Democracy? (Yes! & Chronological Feeds May Be Bad)
Article by Aviv Ovadya: “Algorithmic recommendation systems (also known as recommender systems and recommendation engines) are one of the primary ways that we navigate the deluge of information from products like YouTube, Facebook, Netflix, Amazon, and TikTok. We only have a finite amount of time and attention, and recommendation systems help allocate our attention across the zettabytes of data (trillions of gigabytes!) now produced each year.
The (simplistic) “evil recommendation system” story
Recommendation systems at prominent tech companies stereotypically use what has come to be called “engagement-based ranking.” They aim to predict which content will lead a user to engage the most—e.g., by interacting with the content or spending more time in the product. This content is ranked higher and is the most likely to be shown to the user. The idea is that this will lead to more time using the company’s product, and thus ultimately more time viewing ads.
While this may be good for business, and is relatively easy to implement, it is likely to be a rather harmful approach—it turns out that this leads people to produce more and more sensationalist and divisive content since that is what leads to the most engagement. This is potentially very dangerous for democratic stability—if things get too divisive, the social contract supporting a democracy can falter, potentially leading to internal warfare. (Caveat: for the sake of brevity, this is a heavily simplified account, and there may be evidence that in some countries this is less of a problem; and many non-ads based companies have similar incentives.)
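To make the contrast concrete, here is a minimal sketch of the two feed-ordering approaches discussed here and in the next section. It is illustrative only (not code from the article): real systems involve large machine-learned models, and the `predicted_engagement` field below is a hypothetical stand-in for such a model’s output.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float             # seconds since epoch
    predicted_engagement: float  # stand-in for a model's engagement estimate

def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
    # Engagement-based ranking: surface whatever the model predicts
    # the user will interact with most.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Chronological feed: newest first, no prediction involved.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```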
Is the chronological feed a fix?
The perils of engagement-based ranking have led some advocates, policymakers, and even former tech employees to want to replace recommendation systems with chronological feeds: no more recommendations, just a list of posts in order by time. This appears to make sense at first glance. If recommendation systems place business interests over democratic stability, then it seems important to eliminate them before our democracy collapses!
However, this is where the story gets a bit more complicated. Chronological feeds address some of the problems with engagement-based ranking systems, but they cause many more. To understand why, we need to consider what recommendation systems do to society…(More)”.
The Pragmatics of Democratic ‘Front-Sliding’
Article by Tom Ginsburg and Aziz Z. Huq: “The global crisis of democracy has reflected, in many cases, a gradual process sometimes characterized as “erosion” or “back-sliding.” This occurs across several fronts—political, legal, epistemic, and psychological—at the same time. As a result, any return to the democratic status quo ante must also be incremental, and confronts the challenge of where to start: How does a democracy that has survived a close call start to recreate conditions of meaningful political competition? What steps are to be taken, and in what order? There is likely to be local variance in the answers to these questions. But we think there are still lessons that can be gleaned from other countries’ experience. To that end, we start by reviewing the dynamic of backsliding. We next turn to the problematics of ‘front-sliding’—i.e., the process of rebuilding the necessary political, legal, epistemic, and sociological components of democracy. We then examine the distinctive and difficult question of punishing individuals who have been drivers of back-sliding. Finally, we turn, albeit briefly, to the question of how to sequence different elements of ‘front-sliding.’…(More)”.
Better, broader, safer: using health data for research and analysis
The Goldacre Review: “This review was tasked with finding ways to deliver better, broader, safer use of NHS data for analysis and research: more specifically, it was asked to identify the strategic or technical blockers to such work, and how they can be practically overcome. It was commissioned to inform, and sit alongside, the NHS Data Strategy. The recommendations are derived from extensive engagement with over 300 individuals, 8 focus groups, 100 written submissions, substantial desk research, and detailed discussion with our SSG….
In the past ‘data infrastructure’ meant beige boxes in large buildings. In the 21st century, data infrastructure is code, and people with skills. As noted in previous reviews, many shortcomings in the system have been driven by a ‘destructive impatience’: constantly chasing small, isolated, short-term projects at the expense of building a coherent system that can deliver faster, better, safer outputs for all users of data.
If we invest in platforms and curation – at less than the cost of digitising one hospital – and engage robustly with the technical challenges, then we can rapidly capitalise on our skills and data. New analysts, academics and innovators will arrive to find accessible platforms, with well curated data and accessible technical documentation. The start-up time for each new project will shrink, productivity will rocket, and lives will be saved.
Seventy-three years of complete NHS patient records contain all the noise from millions of lifetimes. Perfect, subtle signals can be coaxed from this data, and those signals go far beyond mere academic curiosity. They represent deeply buried treasure that can help prevent suffering and death around the planet on a biblical scale. It is our collective duty to make this work…(More)”.