OECD Report: “Digital government is essential to transform government processes and services in ways that improve the responsiveness and reliability of the public sector. During the COVID-19 pandemic it also proved crucial to governments’ ability to continue operating in times of crisis and provide timely services to citizens and businesses. Yet, for the digital transformation to be sustainable in the long term, it needs solid foundations, including adaptable governance arrangements, reliable and resilient digital public infrastructure, and a prospective approach to governing with emerging technologies such as artificial intelligence. This paper presents the main findings of the 2023 edition of the OECD Digital Government Index (DGI), which benchmarks the efforts made by governments to establish the foundations necessary for a coherent, human-centred digital transformation of the public sector. It comprises 155 data points from 33 member countries, 4 accession countries and 1 partner country collected in 2022, covering the period between 01 January 2020 and 31 October 2022…(More)”
AI’s big rift is like a religious schism
Article by Henry Farrell: “…Henri de Saint-Simon, a French utopian, proposed a new religion, worshipping the godlike force of progress, with Isaac Newton as its chief saint. He believed that humanity’s sole uniting interest, “the progress of the sciences”, should be directed by the “elect of humanity”, a 21-member “Council of Newton”. Friedrich Hayek, a 20th-century economist, later gleefully described how this ludicrous “religion of the engineers” collapsed into a welter of feuding sects.
Today, the engineers of artificial intelligence (AI) are experiencing their own religious schism. One sect worships progress, canonising Hayek himself. The other is gripped by terror of godlike forces. Their battle has driven practical questions to the margins of debate…(More)”.
The biggest data protection fight you’ve never heard of
Article by Russell Brandom: “One of the biggest negotiations in tech has been happening almost entirely behind the scenes. Organized as a side letter to the World Trade Organization, the Joint Statement Initiative (JSI) on E-commerce has been developing quietly for more than six years, picking up particular momentum in the last six months. The goal is to codify a new set of rules for international online trade between the United States and 88 other countries throughout Eastern Europe, Latin America, and Southeast Asia.
But while the participants basically agree about the nuts and bolts of copyright and licensing, broader questions of data protection have taken center stage. The group brings together free-market diehards like Singapore with more protectionist countries like Brazil, so it’s no surprise that there are different ideas of privacy in play. But this kind of international bargaining can play a surprising role in shaping what’s possible. Countries can still set tougher privacy rules at a national level, but with the offending parties almost always based overseas, a contravening agreement might make those rules difficult to enforce…(More)”.
Do disappearing data repositories pose a threat to open science and the scholarly record?
Article by Dorothea Strecker, Heinz Pampel, Rouven Schabinger and Nina Leonie Weisweiler: “Research data repositories, such as Zenodo or the UK Data Archive, are specialised information infrastructures that focus on the curation and dissemination of research data. One of repositories’ main tasks is maintaining their collections long-term; see, for example, the TRUST Principles or the requirements of the certification organization CoreTrustSeal. Long-term preservation is also a prerequisite for several data practices that are receiving increasing attention, such as data reuse and data citation.
For data to remain usable, the infrastructures that host them also have to be kept operational. However, the long-term operation of research data repositories is challenging, and sometimes, for varying reasons and despite best efforts, they are shut down….
In a recent study we therefore set out to take an infrastructure perspective on the long-term preservation of research data by investigating repositories across disciplines and types that were shut down. We also tried to estimate the impact of repository shutdown on data availability…
We found that repository shutdown was not rare: 6.2% of all repositories listed in re3data were shut down. Since the launch of the registry in 2012, at least one repository has been shut down each year (see Fig. 1). The median age of a repository when shutting down was 12 years…(More)”.
Missing Evidence: Tracking Academic Data Use around the World
World Bank Report: “Data-driven research on a country is key to producing evidence-based public policies. Yet little is known about where data-driven research is lacking and how it could be expanded. This paper proposes a method for tracking academic data use by country of subject, applying natural language processing to open-access research papers. The model’s predictions produce country estimates of the number of articles using data that are highly correlated with a human-coded approach, with a correlation of 0.99. Analyzing more than 1 million academic articles, the paper finds that the number of articles on a country is strongly correlated with its gross domestic product per capita, population, and the quality of its national statistical system. The paper identifies data sources that are strongly associated with data-driven research and finds that availability of subnational data appears to be particularly important. Finally, the paper classifies countries into groups based on whether they could most benefit from increasing their supply of or demand for data. The findings show that the former applies to many low- and lower-middle-income countries, while the latter applies to many upper-middle- and high-income countries…(More)”.
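To make the abstract’s method concrete, here is a minimal, purely illustrative sketch of tagging an article by country of subject and data use. This is not the paper’s actual model (which trains an NLP classifier on open-access full texts); the keyword lists and function names below are hypothetical stand-ins for illustration only.

```python
# Toy rule-based tagger: flags which countries an abstract mentions and
# whether data-use terms appear. The real approach in the paper uses a
# trained NLP model; these term lists are hypothetical examples.

DATA_TERMS = {"survey", "census", "dataset", "microdata", "administrative records"}
COUNTRIES = {"Kenya", "Brazil", "Viet Nam"}

def tag_abstract(text: str) -> dict:
    """Return countries mentioned and whether data-use terms appear."""
    lowered = text.lower()
    return {
        "countries": sorted(c for c in COUNTRIES if c.lower() in lowered),
        "uses_data": any(term in lowered for term in DATA_TERMS),
    }

example = "We analyse household survey microdata from Kenya to study schooling."
print(tag_abstract(example))
# -> {'countries': ['Kenya'], 'uses_data': True}
```

Aggregating such per-article tags over a large corpus is what yields the country-level counts that the paper then correlates with GDP per capita, population, and statistical-system quality.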
Are we entering a “Data Winter”?
Article by Stefaan G. Verhulst: “In an era where data drives decision-making, the accessibility of data for public interest purposes has never been more crucial. Whether shaping public policy, responding to disasters, or empowering research, data plays a pivotal role in our understanding of complex social, environmental, and economic issues. In 2015, I introduced the concept of Data Collaboratives to advance new and innovative partnerships between the public and private sectors that could make data more accessible for public interest purposes. More recently, I have been advocating for a reimagined approach to data stewardship to make data collaboration more systematic, agile, sustainable, and responsible.
Despite many advances toward data stewardship (especially during COVID-19) and despite the creation of several important data collaboratives (e.g., the Industry Data for Society Partnership), the project of opening access to data is proving increasingly challenging. Indeed, unless we step up our efforts in 2024, we may be entering a prolonged data winter — analogous to previous Artificial Intelligence winters, marked by reduced funding and interest in AI research, in which data assets that could be leveraged for the common good are instead frozen and immobilized. Recent developments, such as a decline in access to social media data for research and the growing privatization of climate data, along with a decrease in open data policy activity, signify a worrying trend. This blog takes stock of these developments and, building on some recent expert commentary, raises a number of concerns about the current state of data accessibility and its implications for the public interest. We conclude by calling for a new Decade of Data — one marked by a reinvigorated commitment to open data and data reuse for the public interest…(More)”.
The world needs an International Decade for Data–or risk splintering into AI ‘haves’ and ‘have-nots,’ UN researchers warn
Article by Tshilidzi Marwala and David Passarelli: “The rapid rise in data-driven technologies is shaping how many of us live–from biometric data collected by our smartwatches, artificial intelligence (AI) tools and models changing how we work, to social media algorithms that seem to know more about our content preferences than we do. Greater amounts of data are affecting all aspects of our lives, and indeed, society at large.
This explosion in data risks creating new inequalities, equipping a new set of “haves” who benefit from the power of data while excluding, or even harming, a set of “have-nots”–and splitting the international community into “data-poor” and “data-rich” worlds.
We know that data, when harnessed correctly, can be a powerful tool for sustainable development. Intelligent and innovative use of data can support public health systems, improve our understanding of climate change and biodiversity loss, anticipate crises, and tackle deep-rooted structural injustices such as racism and economic inequality.
However, the vast quantity of data is fueling an unregulated Wild West. Rather than simply issuing more warnings, governments must work toward good governance of data on a global scale. Due to the rapid pace of technological innovation, policies intended to protect society will inevitably fall behind. We need to be more ambitious.
To begin with, governments must ensure that the benefits derived from data are equitably distributed by establishing global ground rules for data collection, sharing, taxation, and re-use. This includes dealing with synthetic data and cross-border data flows…(More)”.
Avoiding the News
Book by Benjamin Toff, Ruth Palmer, and Rasmus Kleis Nielsen: “A small but growing number of people in many countries consistently avoid the news. They feel they do not have time for it, believe it is not worth the effort, find it irrelevant or emotionally draining, or do not trust the media, among other reasons. Why and how do people circumvent news? Which groups are more and less reluctant to follow the news? In what ways is news avoidance a problem—for individuals, for the news industry, for society—and how can it be addressed?
This groundbreaking book explains why and how so many people consume little or no news despite unprecedented abundance and ease of access. Drawing on interviews in Spain, the United Kingdom, and the United States as well as extensive survey data, Avoiding the News examines how people who tune out traditional media get information and explores their “folk theories” about how news organizations work. The authors argue that news avoidance is about not only content but also identity, ideologies, and infrastructures: who people are, what they believe, and how news does or does not fit into their everyday lives. Because news avoidance is most common among disadvantaged groups, it threatens to exacerbate existing inequalities by tilting mainstream journalism even further toward privileged audiences. Ultimately, this book shows, persuading news-averse audiences of the value of journalism is not simply a matter of adjusting coverage but requires a deeper, more empathetic understanding of people’s relationships with news across social, political, and technological boundaries…(More)”.
The New Knowledge
Book by Blayne Haggart and Natasha Tusikov: “From the global geopolitical arena to the smart city, control over knowledge—particularly over data and intellectual property—has become a key battleground for the exercise of economic and political power. For companies and governments alike, control over knowledge—what scholar Susan Strange calls the knowledge structure—has become a goal unto itself.
The rising dominance of the knowledge structure is leading to a massive redistribution of power, including from individuals to companies and states. Strong intellectual property rights have concentrated economic benefits in a smaller number of hands, while the “internet of things” is reshaping basic notions of property, ownership, and control. In the scramble to create and control data and intellectual property, governments and companies alike are engaging in ever-more surveillance.
The New Knowledge is a guide to and analysis of these changes, and of the emerging phenomenon of the knowledge-driven society. It highlights how the pursuit of the control over knowledge has become its own ideology, with its own set of experts drawn from those with the ability to collect and manipulate digital data. Haggart and Tusikov propose a workable path forward—knowledge decommodification—to ensure that our new knowledge is not treated simply as a commodity to be bought and sold, but as a way to meet the needs of the individuals and communities that create this knowledge in the first place…(More)”.
A Guide to Designing New Institutions
Guide by TIAL: “We have created this guide as part of TIAL’s broader programme of work to help with the design of new institutions needed in fields ranging from environmental change to data stewardship and AI to mental health.
This guide covers all the necessary steps of creating a new institution:
- Preparation
- Design (from structures and capabilities to processes and resources)
- Socialisation (to ensure buy-in and legitimacy)
- Implementation…(More)”.