Modernizing Agriculture Data Infrastructure to Improve Economic and Ecological Outcomes


Paper by the AGree Initiative and the Data Foundation: “The paper highlights the necessity of data innovation to address a growing number of critical short- and long-term food and agricultural issues, including agricultural production, environmental sustainability, nutrition assistance, food waste, and food and farm labor. It concludes by offering four practical options modeled on effective case studies of data acquisition, management, and use in other sectors.

Given the increasingly dynamic conditions in which the sector operates, the modernization of agricultural data collection, storage, and analysis will equip farmers, ranchers, and the U.S. Department of Agriculture (USDA) with tools to adapt, innovate, and ensure a food-secure future.

While USDA has made strides over the years, to truly unlock the potential of data to improve farm productivity and the resilience of rural communities, the department must establish a more effective data infrastructure, which will require addressing gaps in USDA’s mandate and authorities across its agencies and programs.

The white paper explores four options modeled on effective case studies of data acquisition, management, and use in other sectors:

  1. Centralized Data Infrastructure Operated by USDA
  2. Centralized Data Infrastructure Operated by a Non-Governmental Intermediary
  3. Data Linkage Hub Operated by a Non-USDA Agency in the Federal Government
  4. Contractual Model with Relevant Partners

Each of the models considered offers opportunities for collaboration with farmers and other stakeholders to ensure there are clear benefits and to address shortfalls in the current system. Careful consideration of the trade-offs of each option is critical given the dynamic weather and economic challenges the agriculture sector faces and the potential new economic opportunities that may be unlocked by harnessing the power of data…(More)”.

Global Data Barometer


Report and Site by the Global Data Barometer: “This report provides an overview of the Global Data Barometer findings. The Barometer includes 39 primary indicators and over 500 sub-questions, covering 109 countries (delivering more than 60,000 data points in total). In this report, we select just a few of these to explore, providing a non-exhaustive overview of some of the topics that could be explored further using Barometer data.

  • Section 1 provides a short overview of the key concepts used in the Barometer, and a short description of the methodology.
  • Section 2 looks at the four key pillars of the Barometer (governance, capability, availability and use), and provides headlines from each.
  • Section 3 provides a regional analysis, drawing on insights from Barometer regional hubs to understand the unique context of each region, and the relative strengths and weaknesses of countries.
  • Section 4 provides a short summary of learning from the first edition, and highlights directions for future work.

The full methodology, and details of how to access and work further with Barometer data, are contained in Appendices…(More)”.

A Movement That’s Quietly Reshaping Democracy For The Better


Essay by Claudia Chwalisz: “Imagine you receive an invitation one day from your mayor, inviting you to serve as a member of your city’s newly established permanent Citizens’ Assembly. You will be one of 100 others like you — people who are not politicians or even necessarily party members. All of you were drawn by lot through a fair and random process called a civic lottery. Together, you are broadly representative of the community — a mix of bakers, doctors, students, accountants, shopkeepers and more. You are young and old and from many backgrounds — everybody living in the city over age 16 is eligible, and anyone can take part regardless of citizenship status. Essentially, this group of 100 people is a microcosm of the wider public. Your mandate lasts for one year, after which a new group of people will be drawn by lot.

This is not just a thought experiment. Since the 1980s, a wave of such citizens’ assemblies has been building, and it has been gaining momentum since 2010. Over the past four decades, hundreds of thousands of people around the world have received invitations from heads of state, ministers, mayors and other public authorities to serve as members of over 500 citizens’ assemblies and other deliberative processes to inform policy making. Everyday people have shaped important decisions about 10-year, $5 billion strategic plans, 30-year infrastructure investment strategies, tackling online hate speech and harassment, taking preventative action against increased flood risks, improving air quality, reducing greenhouse gas emissions and many other issues.

As governance systems are failing to address some of society’s most pressing issues and trust between citizens and government is faltering, these new institutions embody the potential of democratic renewal. They create the democratic spaces for everyday people to grapple with the complexity of policy issues, listen to one another and find common ground. In doing so, they create the conditions to overcome polarization and strengthen societal cohesion. They bring out the collective intelligence of society — the principle that many diverse people will come to better decisions than more homogeneous groups…(More)”.

Technology of the Oppressed


Book by David Nemer: “Brazilian favelas are impoverished settlements usually located on hillsides or the outskirts of a city. In Technology of the Oppressed, David Nemer draws on extensive ethnographic fieldwork to provide a rich account of how favela residents engage with technology in community technology centers and in their everyday lives. Their stories reveal the structural violence of the information age. But they also show how those oppressed by technology don’t just reject it, but consciously resist and appropriate it, and how their experiences with digital technologies enable them to navigate both digital and nondigital sources of oppression—and even, at times, to flourish.

Nemer uses a decolonial and intersectional framework called Mundane Technology as an analytical tool to understand how digital technologies can simultaneously be sites of oppression and tools in the fight for freedom. Building on the work of the Brazilian educator and philosopher Paulo Freire, he shows how the favela residents appropriate everyday technologies—technological artifacts (cell phones, Facebook), operations (repair), and spaces (Telecenters and Lan Houses)—and use them to alleviate the oppression in their everyday lives. He also addresses the relationship of misinformation to radicalization and the rise of the new far right. Contrary to the simplistic techno-optimistic belief that technology will save the poor, even with access to technology these marginalized people face numerous sources of oppression, including technological biases, racism, classism, sexism, and censorship. Yet the spirit, love, community, resilience, and resistance of favela residents make possible their pursuit of freedom…(More)”.

Using mobile money data and call detail records to explore the risks of urban migration in Tanzania


Paper by Rosa Lavelle-Hill: “Understanding what factors predict whether an urban migrant will end up in a deprived neighbourhood or not could help prevent the exploitation of vulnerable individuals. This study leveraged pseudonymized mobile money interactions combined with cell phone data to shed light on urban migration patterns and deprivation in Tanzania. Call detail records were used to identify individuals who migrated to Dar es Salaam, Tanzania’s largest city. A street survey of the city’s subwards was used to determine which individuals moved to more deprived areas. t-tests showed that people who settled in poorer neighbourhoods had less money coming into their mobile money account after they moved, but not before. A machine learning approach was then used to predict which migrants would move to poorer areas of the city, making them arguably more vulnerable to poverty, unemployment and exploitation. Features indicating the strength and location of people’s social connections in Dar es Salaam before they moved (‘pull factors’) were found to be most predictive, more so than traditional ‘push factors’ such as proxies for poverty in the migrant’s source region…(More)”.
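The group comparison at the core of the study can be sketched in a few lines. Below is an illustrative Welch t-test on invented post-move mobile-money inflows; the numbers, group sizes, and units are assumptions for demonstration, not the paper's data or code:

```python
# Illustrative sketch: compare post-move mobile-money inflows between
# migrants who settled in deprived subwards and those who did not,
# using a Welch t-test (unequal variances). All figures are invented.
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a), variance(b)  # sample variances (n - 1 denominator)
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical monthly inflows (arbitrary units) after the move.
deprived = [12.0, 9.5, 11.2, 8.8, 10.1, 9.9]         # settled in deprived subwards
non_deprived = [15.3, 14.1, 16.8, 13.9, 15.0, 14.6]  # settled elsewhere

t = welch_t(deprived, non_deprived)
print(round(t, 2))  # negative: the deprived group has lower inflows
```

A negative statistic here mirrors the paper's finding that post-move inflows, not pre-move ones, differed between the groups; the actual study goes further and feeds such features into a machine learning model rather than stopping at the t-test.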

Lexota


Press Release: “Today, Global Partners Digital (GPD), the Centre for Human Rights at the University of Pretoria (CHR), Article 19 West Africa, the Collaboration on International ICT Policy in East and Southern Africa (CIPESA) and PROTEGE QV jointly launch LEXOTA—Laws on Expression Online: Tracker and Analysis, a new interactive tool to help human rights defenders track and analyse government responses to online disinformation across Sub-Saharan Africa. 

Expanding on work started in 2020, LEXOTA offers a comprehensive overview of laws, policies and other government actions on disinformation in every country in Sub-Saharan Africa. The tool is powered by multilingual data and context-sensitive insight from civil society organisations and uses a detailed framework to assess whether government responses to disinformation are human rights-respecting. A dynamic comparison feature empowers users to examine the regulatory approaches of different countries and to compare how different policy responses measure up against human rights standards, providing them with insights into trends across the region as well as the option to examine country-specific analyses. 

In recent years, governments in Sub-Saharan Africa have increasingly responded to disinformation through content-based restrictions and regulations, which often pose significant risks to individuals’ right to freedom of expression. LEXOTA was developed to support those working to defend internet freedom and freedom of expression across the region, by making data on these government actions accessible and comparable…(More)”.

A Consumer Price Index for the 21st Century


Press Release by the National Academies of Sciences, Engineering, and Medicine: “The Bureau of Labor Statistics (BLS) should undertake a new strategy to modernize the Consumer Price Index by accelerating its use of new data sources and developing price indexes based on different income levels, says a new report from the National Academies of Sciences, Engineering, and Medicine.

The Consumer Price Index is the most widely used measure of inflation in the U.S. It is used to determine cost-of-living allowances and, importantly, influences monetary policy, among many other private- and public-sector applications. The new report, Modernizing the Consumer Price Index for the 21st Century, says the index has traditionally relied on field-generated data, such as prices observed in person at grocery stores or major retailers. These data have become more challenging and expensive to collect, and the availability of vast digital sources of consumer price data presents an opportunity. BLS has begun tapping into these data and has said its objective is to switch a significant portion of its measurement to nontraditional and digital data sources by 2024.

“The enormous economic disruption of the COVID-19 pandemic presents a perfect case study for the need to rapidly employ new data sources for the Consumer Price Index,” said Daniel E. Sichel, professor of economics at Wellesley College and chair of the committee that wrote the report. “Modernizing the Consumer Price Index can help our measurement of household costs and inflation be more accurate, timelier, and ultimately more useful for policymakers responding to rapidly changing economic conditions.”…
The report says BLS should embark on a strategy of accelerating and enhancing the use of scanner, web-scraped, and digital data directly from retailers in compiling the Consumer Price Index. Scanner data — recorded at the point of sale or by consumers in their homes — can expand the variety of products represented in the Consumer Price Index, and better detect shifts in buying patterns. Web-scraped data can more nimbly track the prices of online goods, and goods where one company dominates the market. Permanently automating web-scraping of price data should be a high priority for the Consumer Price Index program, especially for food, electronics, and apparel, the report says.
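The price arithmetic behind such measurements can be illustrated with the Jevons formula, a geometric mean of price relatives widely used for CPI elementary aggregates. The items and prices below are invented, and the report does not prescribe this particular formula:

```python
# Illustrative sketch: a Jevons (geometric-mean) elementary price index
# over matched items, the kind of formula applied to web-scraped or
# scanner prices. Items and prices are invented for demonstration.
import math

def jevons_index(base_prices, current_prices):
    """Geometric mean of price relatives for matched items, base = 100."""
    relatives = [c / b for b, c in zip(base_prices, current_prices)]
    return 100 * math.prod(relatives) ** (1 / len(relatives))

# Hypothetical prices for the same four items scraped in two periods.
jan = [2.50, 4.00, 1.25, 10.00]
feb = [2.60, 4.10, 1.20, 10.50]

print(round(jevons_index(jan, feb), 2))
```

In practice the hard part is matching items across periods, since scraped products appear, disappear, and get relaunched; the index formula itself is the easy step once prices are linked.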

Embracing these alternative data sources now will ensure that the accuracy and timeliness of the Consumer Price Index will not be compromised in the future, the report adds. Moreover, accelerating this process will give BLS time to carefully assess new data sources and methodologies before deciding to incorporate them in the official index…(More)”.

A New Approach to Digital Public Goods Is Gaining Steam


Article by Susan Ariel Aaronson: “Data is different from other inputs. Researchers in the public and private sectors can reuse troves of data indefinitely without that data losing its value. Individuals can use the same data for multiple purposes. They can create new products or research complex problems. Hence, data is multidimensional. It can simultaneously be a commercial asset and a public good.

Firms have long relied on data to improve the efficiency and quality of goods and services. However, today market actors also utilize data to create entirely new services, such as personalized healthcare. Data-driven sectors such as social networks and artificial-intelligence services are the foundation of today’s global economy. These sectors also enabled much of the world to function during the pandemic.

However, some seven companies in the U.S. and China collect, control, protect, analyze and sell much of the world’s data. According to the U.N. agency UNCTAD, these data behemoths control much of data collection through their provision of services; data transmissions through submarine cables and satellites; data storage; and data analysis, processing, and use. These firms rely on trade agreements to protect their intellectual property, which in turn allows them control over the data analyzed by their algorithms. Such complete control over data is dangerous for market actors large and small. When fewer researchers can access or reuse data sets, these firms are essentially reducing the economic and social potential—the generativity of data…(More)”.

In potentially seismic shift, Government could release almost all advice to ministers


Article by Henry Cooke: (New Zealand) “The Government is considering proactively releasing almost all advice to ministers under a planned shakeup of transparency rules which, if adopted, would amount to a seismic shift in the way the public sector communicates.

Open government advocates have cautiously welcomed the planned move, but say the devil will be in the detail – as the proactive release regime could end up defanging the Official Information Act (OIA).

The Public Service Commission is consulting with government departments and agencies on a proposal to release to the public all briefings and other advice given to ministers – unless there is a compelling reason not to, such as national security or breaching a commercial agreement, according to a person with knowledge of the discussions.

Currently, the Government proactively releases all Cabinet papers within 30 working days of a decision being made, but it does not release the advice that underpins those decisions. The Cabinet papers can also be redacted entirely or in part if the Government believes there is a good reason to do so.

Some advice is proactively released by individual agencies, but there is no uniform rule requiring it, nor any centralised repository. In practice, much of it is released only after either the media or the opposition requests a copy under the OIA.

The new regime would see all ministerial advice released without its having to be requested, although it is not yet clear on what timeframe.

Ministers would also have to proactively release the titles of their briefings on a regular basis, meaning any advice that was not released could be requested under the OIA.

The Public Service Commission – which oversees the sprawling public sector – is also exploring options for a single point of access for these documents, instead of it being spread over many different websites….(More)”.

Roe draft raises concerns data could be used to identify abortion seekers, providers


Article by Chris Mills Rodrigo: “Concerns that data gathered from people’s interactions with their digital devices could potentially be used to identify individuals seeking or performing abortions have come into the spotlight with the news that pregnancy termination services could soon be severely restricted or banned in much of the United States.

Following the leak of a draft majority opinion indicating that the Supreme Court is poised to overturn Roe v. Wade, the landmark 1973 decision that established the federal right to abortion, privacy advocates are raising alarms about the ways law enforcement officials or anti-abortion activists could make such identifications using data available on the open market, obtained from companies or extracted from devices.

“The dangers of unfettered access to Americans’ personal information have never been more obvious. Researching birth control online, updating a period-tracking app or bringing a phone to the doctor’s office could be used to track and prosecute women across the U.S.,” Sen. Ron Wyden (D-Ore.) said in a statement to The Hill. 

Data from web searches, smartphone location pings and online purchases can all be easily obtained with little to no safeguards.

“Almost everything that you do … data can be captured about it and can be fed into a larger model that can help somebody or some entity infer whether or not you may be pregnant and whether or not you may be someone who’s planning to have an abortion or has had one,” Nathalie Maréchal, senior policy manager at Ranking Digital Rights, explained. 

There are three primary ways that data could travel from individuals’ devices to law enforcement or other groups, according to experts who spoke with The Hill.

The first is via third-party data brokers, which make up a shadowy, multibillion-dollar industry dedicated to collecting, aggregating and selling location data harvested from individuals’ mobile phones, data that has provided unprecedented access to the daily movements of Americans for advertisers, or virtually anyone willing to pay…(More)”.