Systems Ultra: Making Sense of Technology in a Complex World


Book by Georgina Voss: “…explores how we experience complex systems: the mesh of things, people, and ideas interacting to produce their own patterns and behaviours.

What does it mean when a car which runs on code drives dangerously? What does mass-market graphics software tell us about the workplace politics of architects? And, in these human-made systems, which phenomena are designed, and which are emergent? In a world of networked technologies, global supply chains, and supranational regulations, there are growing calls for a new kind of literacy around systems and their ramifications. At the same time, we are often told these systems are impossible to fully comprehend and are far beyond our control.

Drawing on field research and artistic practice around the industrial settings of ports, air traffic control, architectural software, payment platforms in adult entertainment, and car crash testing, Georgina Voss argues that complex systems can be approached as sites of revelation around scale, time, materiality, deviance, and breakages. With humour and guile, she tells the story of what ‘systems’ have come to mean, how they have been sold to us, and the real-world consequences of the power that flows through them.

Systems Ultra goes beyond narratives of technological exceptionalism to explore how we experience the complex systems which influence our lives, how to understand them more clearly, and, perhaps, how to change them…(More)”.

Regulating AI Deepfakes and Synthetic Media in the Political Arena


Report by Daniel Weiner and Lawrence Norden: “…Part I of this resource defines the terms deepfake, synthetic media, and manipulated media in more detail. Part II sets forth some necessary considerations for policymakers, specifically:

  • The most plausible rationales for regulating deepfakes and other manipulated media when used in the political arena. In general, the necessity of promoting an informed electorate and the need to safeguard the overall integrity of the electoral process are among the most compelling rationales for regulating manipulated media in the political space.
  • The types of communications that should be regulated. Regulations should reach synthetic images and audio as well as video. Policymakers should focus on curbing or otherwise limiting depictions of events or statements that did not actually occur, especially those appearing in paid campaign ads and certain other categories of paid advertising or otherwise widely disseminated communications. All new rules should have clear carve-outs for parody, news media stories, and potentially other types of protected speech.
  • How such media should be regulated. Transparency rules — for example, rules requiring a manipulated image or audio recording to be clearly labeled as artificial and not a portrayal of real events — will usually be easiest to defend in court. Transparency will not always be enough, however; lawmakers should also consider outright bans of certain categories of manipulated media, such as deceptive audio and visual material seeking to mislead people about the time, place, and manner of voting.
  • Who regulations should target. Both bans and less burdensome transparency requirements should primarily target those who create or disseminate deceptive media, although regulation of the platforms used to transmit deepfakes may also make sense…(More)”.

The City as a License: Design, Rights and Civics in a Blockchain Society


Special Issue by Martijn de Waal et al: “Building upon critical work on smart cities, platform urbanism and algorithmic governance, this special issue proposes the ‘generative metaphor’ of the City as a License as a lens to analyze the digitally enhanced management of urban resources and infrastructures from a perspective of rights and agency. Such a perspective has become especially urgent with the rise of new data practices around the emergence of distributed ledger technologies, as they may introduce additional layers of complexity to the ‘algorithmic governance’ of cities, particularly through their tokenization of resources, identities, and rights, and their automatic administration of access to urban resources. Contributions in this special issue investigate the affordances of distributed ledger technologies with regard to civic agency in the governance of collective urban resources. Could these newly emerging management systems for energy production and consumption or property rights contribute to pro-social and sustainable ways of governing and managing communities and their resources, according to the logic of the commons? The lens of the City as a License allows not only for an analysis of potentialities but also for a critical view of these techno-social systems: the ways in which they may repeat the inequities and obfuscations of existing systems, produce unintended consequences through complex processes, and complicate accountability…(More)”.
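
To make the metaphor concrete, here is a minimal sketch, in plain Python rather than on an actual distributed ledger, of what the ‘tokenization’ of rights and the automatic administration of access could look like for a shared urban resource. It is purely illustrative and not drawn from the special issue; all names (EnergyToken, CommunityEnergyLedger, the holder IDs) are hypothetical.

```python
# A toy model of the 'City as a License' idea: rights to a shared urban
# resource are represented as tokens, and access is granted or refused
# automatically by checking the ledger. Purely illustrative; all names
# are hypothetical and no real distributed ledger platform is involved.
from dataclasses import dataclass, field


@dataclass
class EnergyToken:
    """A tokenized right to draw a fixed amount of energy from a shared grid."""
    holder: str
    kwh_allowance: float


@dataclass
class CommunityEnergyLedger:
    """Toy ledger that tokenizes rights and automates access decisions."""
    tokens: dict = field(default_factory=dict)  # holder id -> EnergyToken

    def issue(self, holder: str, kwh: float) -> None:
        # Issuing a token is the 'licensing' act: it encodes who may use what.
        self.tokens[holder] = EnergyToken(holder, kwh)

    def request_access(self, holder: str, kwh: float) -> bool:
        # Access is granted or refused automatically, with no human
        # discretion: the 'algorithmic governance' property at issue.
        token = self.tokens.get(holder)
        if token is None or token.kwh_allowance < kwh:
            return False
        token.kwh_allowance -= kwh
        return True


ledger = CommunityEnergyLedger()
ledger.issue("household-17", kwh=10.0)
print(ledger.request_access("household-17", 4.0))  # True: within allowance
print(ledger.request_access("household-17", 8.0))  # False: allowance exceeded
print(ledger.request_access("household-42", 1.0))  # False: holds no token
```

Note that the last two requests are refused automatically and without explanation: even this toy reproduces, in miniature, the questions of accountability and unintended consequences that the contributors raise.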

2023 OECD Digital Government Index


OECD Report: “Digital government is essential to transform government processes and services in ways that improve the responsiveness and reliability of the public sector. During the COVID-19 pandemic it also proved crucial to governments’ ability to continue operating in times of crisis and provide timely services to citizens and businesses. Yet, for the digital transformation to be sustainable in the long term, it needs solid foundations, including adaptable governance arrangements, reliable and resilient digital public infrastructure, and a prospective approach to governing with emerging technologies such as artificial intelligence. This paper presents the main findings of the 2023 edition of the OECD Digital Government Index (DGI), which benchmarks the efforts made by governments to establish the foundations necessary for a coherent, human-centred digital transformation of the public sector. It comprises 155 data points from 33 member countries, 4 accession countries, and 1 partner country, collected in 2022 and covering the period between 1 January 2020 and 31 October 2022…(More)”

AI’s big rift is like a religious schism


Article by Henry Farrell: “…Henri de Saint-Simon, a French utopian, proposed a new religion, worshipping the godlike force of progress, with Isaac Newton as its chief saint. He believed that humanity’s sole uniting interest, “the progress of the sciences”, should be directed by the “elect of humanity”, a 21-member “Council of Newton”. Friedrich Hayek, a 20th-century economist, later gleefully described how this ludicrous “religion of the engineers” collapsed into a welter of feuding sects.

Today, the engineers of artificial intelligence (AI) are experiencing their own religious schism. One sect worships progress, canonising Hayek himself. The other is gripped by terror of godlike forces. Their battle has driven practical questions to the margins of debate…(More)”.

The biggest data protection fight you’ve never heard of


Article by Russell Brandom: “One of the biggest negotiations in tech has been happening almost entirely behind the scenes. Organized as a side letter to the World Trade Organization, the Joint Statement Initiative (JSI) on E-commerce has been developing quietly for more than six years, picking up particular momentum in the last six months. The goal is to codify a new set of rules for international online trade between the United States and 88 other countries throughout Eastern Europe, Latin America, and Southeast Asia.

But while the participants basically agree about the nuts and bolts of copyright and licensing, broader questions of data protection have taken center stage. The group brings together free-market diehards like Singapore with more protectionist countries like Brazil, so it’s no surprise that there are different ideas of privacy in play. But this kind of international bargaining can play a surprising role in shaping what’s possible. Countries can still set tougher privacy rules at a national level, but with the offending parties almost always based overseas, a contravening agreement might make those rules difficult to enforce…(More)”.

Do disappearing data repositories pose a threat to open science and the scholarly record?


Article by Dorothea Strecker, Heinz Pampel, Rouven Schabinger and Nina Leonie Weisweiler: “Research data repositories, such as Zenodo or the UK Data Archive, are specialised information infrastructures that focus on the curation and dissemination of research data. One of repositories’ main tasks is maintaining their collections long-term; see, for example, the TRUST Principles or the requirements of the certification organization CoreTrustSeal. Long-term preservation is also a prerequisite for several data practices that are getting increasing attention, such as data reuse and data citation.

For data to remain usable, the infrastructures that host them also have to be kept operational. However, the long-term operation of research data repositories is challenging, and sometimes, for varying reasons and despite best efforts, they are shut down…

In a recent study we therefore set out to take an infrastructure perspective on the long-term preservation of research data by investigating repositories across disciplines and types that were shut down. We also tried to estimate the impact of repository shutdown on data availability…

We found that repository shutdown was not rare: 6.2% of all repositories listed in re3data were shut down. Since the launch of the registry in 2012, at least one repository has been shut down each year (see Fig. 1). The median age of a repository when shutting down was 12 years…(More)”.
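
As a rough illustration of the study’s headline measures, the sketch below computes a shutdown share and a median age at shutdown from a handful of invented repository records. It is a minimal sketch of the shape of the calculation only, assuming records of launch and shutdown years; it does not query re3data and does not reproduce the study’s data.

```python
# Back-of-the-envelope version of the analysis described above: given
# repository records with launch and shutdown years, compute the share
# that were shut down and the median age at shutdown. The records are
# invented for illustration and do not come from re3data.
from statistics import median

# Hypothetical records: (launch_year, shutdown_year or None if still running)
repositories = [
    (2001, 2013), (2005, None), (2008, 2020), (2010, None),
    (1999, 2011), (2012, None), (2003, 2016), (2007, None),
]

shut_down = [(start, end) for start, end in repositories if end is not None]
shutdown_share = len(shut_down) / len(repositories)
median_age_at_shutdown = median(end - start for start, end in shut_down)

print(f"Share of repositories shut down: {shutdown_share:.1%}")
print(f"Median age at shutdown: {median_age_at_shutdown} years")
```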

Missing Evidence: Tracking Academic Data Use around the World


World Bank Report: “Data-driven research on a country is key to producing evidence-based public policies. Yet little is known about where data-driven research is lacking and how it could be expanded. This paper proposes a method for tracking academic data use by country of subject, applying natural language processing to open-access research papers. The model’s predictions produce country estimates of the number of articles using data that are highly correlated with a human-coded approach, with a correlation of 0.99. Analyzing more than 1 million academic articles, the paper finds that the number of articles on a country is strongly correlated with its gross domestic product per capita, population, and the quality of its national statistical system. The paper identifies data sources that are strongly associated with data-driven research and finds that availability of subnational data appears to be particularly important. Finally, the paper classifies countries into groups based on whether they could most benefit from increasing their supply of or demand for data. The findings show that the former applies to many low- and lower-middle-income countries, while the latter applies to many upper-middle- and high-income countries…(More)”.
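
The following is a deliberately simplified sketch of the pipeline the paper describes: flag open-access articles that appear to use data, attribute them to a country of subject, aggregate counts per country, and validate the automated counts against human coding via correlation. A crude keyword rule stands in for the paper’s NLP model, and the corpus and benchmark counts are invented for illustration.

```python
# Toy version of the paper's validation step: an automated classifier
# produces per-country counts of data-driven articles, which are then
# correlated with human-coded counts. The classifier here is a keyword
# rule standing in for the paper's NLP model; all inputs are invented.
from collections import Counter

DATA_CUES = ("survey data", "census", "household panel", "administrative records")

def uses_data(abstract: str) -> bool:
    """Crude stand-in for the NLP classifier: simple keyword spotting."""
    text = abstract.lower()
    return any(cue in text for cue in DATA_CUES)

# Hypothetical corpus of (country of subject, abstract) pairs
articles = [
    ("Kenya", "We analyse census microdata on rural electrification..."),
    ("Kenya", "A theoretical model of land tenure without empirics..."),
    ("Peru", "Using household panel data, we estimate schooling returns..."),
    ("Peru", "Survey data from three regions inform our poverty estimates..."),
]

model_counts = Counter(c for c, abstract in articles if uses_data(abstract))
human_counts = {"Kenya": 1, "Peru": 2}  # hypothetical human-coded benchmark

# Pearson correlation between the two country-level series
countries = sorted(human_counts)
xs = [model_counts[c] for c in countries]
ys = [human_counts[c] for c in countries]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
var = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
print("model counts:", dict(model_counts))
print("correlation with human coding:", cov / var if var else float("nan"))
```

On real inputs the paper reports a country-level correlation of 0.99 between its model and the human-coded approach; the toy above only shows where such a number comes from.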

Are we entering a “Data Winter”?


Article by Stefaan G. Verhulst: “In an era where data drives decision-making, the accessibility of data for public interest purposes has never been more crucial. Whether shaping public policy, responding to disasters, or empowering research, data plays a pivotal role in our understanding of complex social, environmental, and economic issues. In 2015, I introduced the concept of Data Collaboratives to advance new and innovative partnerships between the public and private sectors that could make data more accessible for public interest purposes. More recently, I have been advocating for a reimagined approach to data stewardship to make data collaboration more systematic, agile, sustainable, and responsible.

We may be entering a “Data Winter”

Despite many advances toward data stewardship (especially during COVID-19), and despite the creation of several important data collaboratives (e.g., the Industry Data for Society Partnership), the project of opening access to data is proving increasingly challenging. Indeed, unless we step up our efforts in 2024, we may be entering a prolonged data winter — analogous to previous Artificial Intelligence winters, marked by reduced funding and interest in AI research, in which data assets that could be leveraged for the common good are instead frozen and immobilized. Recent developments, such as a decline in access to social media data for research and the growing privatization of climate data, along with a decrease in open data policy activity, signify a worrying trend. This blog takes stock of these developments and, building on some recent expert commentary, raises a number of concerns about the current state of data accessibility and its implications for the public interest. We conclude by calling for a new Decade of Data — one marked by a reinvigorated commitment to open data and data reuse for the public interest…(More)”.

The world needs an International Decade for Data — or risk splintering into AI ‘haves’ and ‘have-nots,’ UN researchers warn


Article by Tshilidzi Marwala and David Passarelli: “The rapid rise in data-driven technologies is shaping how many of us live — from biometric data collected by our smartwatches, to artificial intelligence (AI) tools and models changing how we work, to social media algorithms that seem to know more about our content preferences than we do. Greater amounts of data are affecting all aspects of our lives and, indeed, society at large.

This explosion in data risks creating new inequalities, equipping a new set of “haves” who benefit from the power of data while excluding, or even harming, a set of “have-nots” — and splitting the international community into “data-poor” and “data-rich” worlds.

We know that data, when harnessed correctly, can be a powerful tool for sustainable development. Intelligent and innovative use of data can support public health systems, improve our understanding of climate change and biodiversity loss, anticipate crises, and tackle deep-rooted structural injustices such as racism and economic inequality.

However, the vast quantity of data is fueling an unregulated Wild West. Rather than simply issuing more warnings, governments must work toward good governance of data on a global scale. Due to the rapid pace of technological innovation, policies intended to protect society will inevitably fall behind. We need to be more ambitious.

To begin with, governments must ensure that the benefits derived from data are equitably distributed by establishing global ground rules for data collection, sharing, taxation, and re-use. This includes dealing with synthetic data and cross-border data flows…(More)”.