When Farmland Becomes the Front Line, Satellite Data and Analysis Can Fight Hunger


Article by Inbal Becker-Reshef and Mary Mitkish: “When a shock to the global food system occurs—such as during the Russian invasion of Ukraine in 2022—collecting the usual ground-based data is all but impossible. The Russia–Ukraine war has turned farmland into the front lines of a war zone. In this situation, it is unreasonable to expect civilians to walk onto fields riddled with land mines and damaged by craters to collect information on what has been planted, where it was planted, and if it could be harvested. The inherent danger of ground-based data collection, especially in occupied territories of the conflict, has demanded a different way to assess planted and harvested areas and forecast crop production.

Satellite-based information can provide this evidence quickly and reliably. At NASA Harvest, NASA’s Global Food Security and Agriculture Consortium, one of our main aims is to use satellite-based information to fill gaps in the agriculture information ecosystem. Since the start of the Russia–Ukraine conflict, we have been using satellite imagery to estimate the impact of the war on Ukraine’s agricultural lands at the request of the Ministry of Agrarian Policy and Food of Ukraine. Our work demonstrates how effective this approach can be for delivering critical and timely insights for decisionmakers.

Prior to the war, Ukraine accounted for over 10% of the world’s wheat, corn, and barley trade and was the number one sunflower oil exporter, accounting for close to 50% of the global market. In other words, food produced in Ukraine is critical for its national economy, for global trade, and for feeding millions across the globe…(More)”.

The City as a License: Design, Rights and Civics in a Blockchain Society


Special Issue by Martijn de Waal et al.: “Building upon critical work on smart cities, platform urbanism and algorithmic governance, this special issue proposes the ‘generative metaphor’ of the City as a License as a lens to analyze the digitally enhanced management of urban resources and infrastructures from a perspective of rights and agency. Such a perspective has become especially urgent with the rise of new data practices around the emergence of distributed ledger technologies, as they may introduce additional layers of complexity to the ‘algorithmic governance’ of cities. This is particularly due to their tokenization of resources, identities, and rights, and their automatic administration of access to urban resources. Contributions in this special issue investigate the affordances of distributed ledger technologies with regard to civic agency in the governance of collective urban resources. Could these newly emerging management systems for energy production and consumption or property rights contribute to pro-social and sustainable ways of governing and managing communities and their resources, according to the logic of the commons? The lens of the City as a License not only allows for such an analysis of potentialities, but also for a critical view on these techno-social systems, such as the way in which they may repeat the inequities and obfuscations of existing systems, produce unintended consequences through complex processes, and complicate accountability…(More)”.

AI’s big rift is like a religious schism


Article by Henry Farrell: “…Henri de Saint-Simon, a French utopian, proposed a new religion, worshipping the godlike force of progress, with Isaac Newton as its chief saint. He believed that humanity’s sole uniting interest, “the progress of the sciences”, should be directed by the “elect of humanity”, a 21-member “Council of Newton”. Friedrich Hayek, a 20th-century economist, later gleefully described how this ludicrous “religion of the engineers” collapsed into a welter of feuding sects.

Today, the engineers of artificial intelligence (AI) are experiencing their own religious schism. One sect worships progress, canonising Hayek himself. The other is gripped by terror of godlike forces. Their battle has driven practical questions to the margins of debate…(More)”.

The biggest data protection fight you’ve never heard of


Article by Russell Brandom: “One of the biggest negotiations in tech has been happening almost entirely behind the scenes. Organized as a side letter to the World Trade Organization, the Joint Statement Initiative (JSI) on E-commerce has been developing quietly for more than six years, picking up particular momentum in the last six months. The goal is to codify a new set of rules for international online trade between the United States and 88 other countries throughout Eastern Europe, Latin America, and Southeast Asia.

But while the participants basically agree about the nuts and bolts of copyright and licensing, broader questions of data protection have taken center stage. The group brings together free-market diehards like Singapore with more protectionist countries like Brazil, so it’s no surprise that there are different ideas of privacy in play. But this kind of international bargaining can play a surprising role in shaping what’s possible. Countries can still set tougher privacy rules at a national level, but with the offending parties almost always based overseas, a contravening agreement might make those rules difficult to enforce…(More)”.

Do disappearing data repositories pose a threat to open science and the scholarly record?


Article by Dorothea Strecker, Heinz Pampel, Rouven Schabinger and Nina Leonie Weisweiler: “Research data repositories, such as Zenodo or the UK Data Archive, are specialised information infrastructures that focus on the curation and dissemination of research data. One of repositories’ main tasks is maintaining their collections long-term; see, for example, the TRUST Principles or the requirements of the certification organization CoreTrustSeal. Long-term preservation is also a prerequisite for several data practices that are getting increasing attention, such as data reuse and data citation.

For data to remain usable, the infrastructures that host them also have to be kept operational. However, the long-term operation of research data repositories is challenging, and sometimes, for varying reasons and despite best efforts, they are shut down….

In a recent study we therefore set out to take an infrastructure perspective on the long-term preservation of research data by investigating repositories across disciplines and types that were shut down. We also tried to estimate the impact of repository shutdown on data availability…

We found that repository shutdown was not rare: 6.2% of all repositories listed in re3data were shut down. Since the launch of the registry in 2012, at least one repository has been shut down each year (see Fig. 1). The median age of a repository when shutting down was 12 years…(More)”.
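
To illustrate the kind of calculation behind the headline figures (the share of repositories shut down and the median age at shutdown), here is a minimal, hypothetical sketch over a made-up registry extract. The column names and values are illustrative assumptions only; they do not reflect the actual re3data schema or its contents.

```python
import pandas as pd

# Hypothetical registry extract; column names and values are illustrative,
# not the actual re3data schema or data.
repos = pd.DataFrame({
    "repository": ["Repo A", "Repo B", "Repo C", "Repo D", "Repo E"],
    "year_started": [1998, 2003, 2010, 2012, 2015],
    "year_shut_down": [2012, 2015, None, 2020, None],  # None = still operating
})

# Flag shut-down repositories and compute the two summary figures.
shut_down = repos["year_shut_down"].notna()
share_shut_down = 100 * shut_down.mean()
age_at_shutdown = repos.loc[shut_down, "year_shut_down"] - repos.loc[shut_down, "year_started"]

print(f"Share of repositories shut down: {share_shut_down:.1f}%")
print(f"Median age at shutdown: {age_at_shutdown.median():.0f} years")
```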

The global reach of the EU’s approach to digital transformation


Report by the European Parliament’s Think Tank: “The EU’s approach to digital transformation is rooted in protecting fundamental rights, sustainability, ethics and fairness. With this human-centric vision of the digital economy and society, the EU seeks to empower citizens and businesses, regardless of their size. In the EU’s view, the internet should remain open, fair, inclusive and focused on people. Digital technologies should work for citizens and help them to engage in society. Companies should be able to compete on equal terms, and consumers should be confident that their rights are respected.

The European Commission has published a number of strategies and action plans recently that outline the EU’s vision for the digital future and set concrete targets for achieving it. The Commission has also proposed several digital regulations, including the artificial intelligence act, the Digital Services Act and the Digital Markets Act. These regulations are intended to ensure a safe online environment and fair and open digital markets, strengthen Europe’s competitiveness, improve algorithmic transparency and give citizens better control over how they share their personal data. Although some of these regulations have not yet been adopted, and others have been in force for only a short time, they are expected to have an impact not only in the EU but also beyond its borders. For instance, several regulations target businesses – regardless of where they are based – that offer services to EU citizens or businesses. In addition, through the phenomenon known as ‘the Brussels effect’, these rules may influence tech business practices and national legislation around the world.

The EU is an active participant in developing global digital cooperation and global governance frameworks for specific areas. Various international organisations are developing instruments to ensure that people and businesses can take advantage of artificial intelligence’s benefits and limit negative consequences. In these global negotiations, the EU promotes respect for various fundamental rights and freedoms, as well as compatibility with EU law…(More)”.

How Much of the World Is It Possible to Model?


Article by Dan Rockmore: “…Modelling, in general, is now routine. We model everything, from elections to economics, from the climate to the coronavirus. Like model cars, model airplanes, and model trains, mathematical models aren’t the real thing—they’re simplified representations that get the salient parts right. Like fashion models, model citizens, and model children, they’re also idealized versions of reality. But idealization and abstraction can be forms of strength. In an old mathematical-modelling joke, a group of experts is hired to improve milk production on a dairy farm. One of them, a physicist, suggests, “Consider a spherical cow.” Cows aren’t spheres any more than brains are jiggly sponges, but the point of modelling—in some ways, the joy of it—is to see how far you can get by using only general scientific principles, translated into mathematics, to describe messy reality.

To be successful, a model needs to replicate the known while generalizing into the unknown. This means that, as more becomes known, a model has to be improved to stay relevant. Sometimes new developments in math or computing enable progress. In other cases, modellers have to look at reality in a fresh way. For centuries, a predilection for perfect circles, mixed with a bit of religious dogma, produced models that described the motion of the sun, moon, and planets in an Earth-centered universe; these models worked, to some degree, but never perfectly. Eventually, more data, combined with more expansive thinking, ushered in a better model—a heliocentric solar system based on elliptical orbits. This model, in turn, helped kick-start the development of calculus, reveal the law of gravitational attraction, and fill out our map of the solar system. New knowledge pushes models forward, and better models help us learn.

Predictions about the universe are scientifically interesting. But it’s when models make predictions about worldly matters that people really pay attention. We anxiously await the outputs of models run by the Weather Channel, the Fed, and fivethirtyeight.com. Models of the stock market guide how our pension funds are invested; models of consumer demand drive production schedules; models of energy use determine when power is generated and where it flows. Insurers model our fates and charge us commensurately. Advertisers (and propagandists) rely on A.I. models that deliver targeted information (or disinformation) based on predictions of our reactions.

But it’s easy to get carried away…(More)”.

Missing Evidence: Tracking Academic Data Use around the World


World Bank Report: “Data-driven research on a country is key to producing evidence-based public policies. Yet little is known about where data-driven research is lacking and how it could be expanded. This paper proposes a method for tracking academic data use by country of subject, applying natural language processing to open-access research papers. The model’s predictions produce country estimates of the number of articles using data that are highly correlated with a human-coded approach, with a correlation of 0.99. Analyzing more than 1 million academic articles, the paper finds that the number of articles on a country is strongly correlated with its gross domestic product per capita, population, and the quality of its national statistical system. The paper identifies data sources that are strongly associated with data-driven research and finds that availability of subnational data appears to be particularly important. Finally, the paper classifies countries into groups based on whether they could most benefit from increasing their supply of or demand for data. The findings show that the former applies to many low- and lower-middle-income countries, while the latter applies to many upper-middle- and high-income countries…(More)”.
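
The excerpt does not describe the paper’s model in detail, so the sketch below is only a hedged illustration of the general idea it alludes to: flagging whether a paper appears to use data and attributing it to a country of subject. The toy training examples, the country list, and the TF-IDF/logistic-regression classifier are assumptions made for illustration, not the World Bank’s actual method.

```python
# Minimal sketch (illustrative only): flag whether an abstract appears to use data,
# and attribute it to a country of subject via simple keyword matching.
# Labels, country list, and classifier choice are assumptions, not the paper's model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny placeholder training set: 1 = the paper uses data, 0 = it does not.
abstracts = [
    "We analyse household survey data to estimate poverty rates in rural districts.",
    "Using administrative records, we measure school enrolment over two decades.",
    "This essay reviews theoretical debates on sovereignty and legitimacy.",
    "We offer a conceptual framework for thinking about institutional reform.",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression as a stand-in text classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(abstracts, labels)

COUNTRIES = ["Kenya", "Brazil", "Viet Nam", "Ukraine"]  # illustrative subset

def classify(abstract: str) -> dict:
    """Predict data use and list candidate countries of subject mentioned in the text."""
    uses_data = bool(clf.predict([abstract])[0])
    countries = [c for c in COUNTRIES if c.lower() in abstract.lower()]
    return {"uses_data": uses_data, "countries": countries}

print(classify("We use census microdata from Kenya to study internal migration."))
```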

Are we entering a “Data Winter”?


Article by Stefaan G. Verhulst: “In an era where data drives decision-making, the accessibility of data for public interest purposes has never been more crucial. Whether shaping public policy, responding to disasters, or empowering research, data plays a pivotal role in our understanding of complex social, environmental, and economic issues. In 2015, I introduced the concept of Data Collaboratives to advance new and innovative partnerships between the public and private sectors that could make data more accessible for public interest purposes. More recently, I have been advocating for a reimagined approach to data stewardship to make data collaboration more systematic, agile, sustainable, and responsible.

We may be entering a “Data Winter”

Despite many advances toward data stewardship (especially during COVID-19) and despite the creation of several important data collaboratives (e.g., the Industry Data for Society Partnership), the project of opening access to data is proving increasingly challenging. Indeed, unless we step up our efforts in 2024, we may be entering a prolonged data winter — analogous to previous Artificial Intelligence winters, marked by reduced funding and interest in AI research, in which data assets that could be leveraged for the common good are instead frozen and immobilized. Recent developments, such as a decline in access to social media data for research and the growing privatization of climate data, along with a decrease in open data policy activity, signify a worrying trend. This blog takes stock of these developments and, building on some recent expert commentary, raises a number of concerns about the current state of data accessibility and its implications for the public interest. We conclude by calling for a new Decade of Data — one marked by a reinvigorated commitment to open data and data reuse for the public interest…(More)”.

The world needs an International Decade for Data–or risk splintering into AI ‘haves’ and ‘have-nots,’ UN researchers warn


Article by Tshilidzi Marwala and David Passarelli: “The rapid rise in data-driven technologies is shaping how many of us live–from biometric data collected by our smartwatches, artificial intelligence (AI) tools and models changing how we work, to social media algorithms that seem to know more about our content preferences than we do. Greater amounts of data are affecting all aspects of our lives, and indeed, society at large.

This explosion in data risks creating new inequalities, equipping a new set of “haves” who benefit from the power of data while excluding, or even harming, a set of “have-nots”–and splitting the international community into “data-poor” and “data-rich” worlds.

We know that data, when harnessed correctly, can be a powerful tool for sustainable development. Intelligent and innovative use of data can support public health systems, improve our understanding of climate change and biodiversity loss, anticipate crises, and tackle deep-rooted structural injustices such as racism and economic inequality.

However, the vast quantity of data is fueling an unregulated Wild West. Instead of simply issuing more warnings, governments must work toward good governance of data on a global scale. Due to the rapid pace of technological innovation, policies intended to protect society will inevitably fall behind. We need to be more ambitious.

To begin with, governments must ensure that the benefits derived from data are equitably distributed by establishing global ground rules for data collection, sharing, taxation, and re-use. This includes dealing with synthetic data and cross-border data flows…(More)”.