Report by Dalberg: “Data and data ecosystems enable decision makers to improve lives and livelihoods by better understanding the world around them and acting in more effective and targeted ways. In a time of growing crises and shrinking budgets, it is imperative that every dollar is spent in the most efficient and equitable way. Data ecosystems provide decision makers with the information needed to assess and predict challenges, identify and customize solutions, and monitor and evaluate real-time progress. Together, this enables decisions that are more collaborative, effective, efficient, equitable, timely, and transparent. And this is only getting easier—ongoing advances in our ability to harness and apply data are creating opportunities to better target resources and create even more transformative impact…(More)”.
Eliminate data asymmetries to democratize data use
Article by Rahul Matthan: “Anyone who possesses a large enough store of data can reasonably expect to glean powerful insights from it. These insights are more often than not used to enhance advertising revenues or ensure greater customer stickiness. In other instances, they’ve been subverted to alter our political preferences and manipulate us into taking decisions we otherwise may not have.
The ability to generate insights places those who have access to these data sets at a distinct advantage over those whose data is contained within them. It allows the former to benefit from the data in ways that the latter may not even have thought possible when they consented to provide it. Given how easily these insights can be used to harm the people the data pertains to, there is a need to mitigate the effects of this data asymmetry.
Privacy law attempts to do this by providing data principals with tools they can use to exert control over their personal data. It requires data collectors to obtain informed consent from data principals before collecting their data and forbids them from using it for any purpose other than that which has been previously notified. This is why, even if that consent has been obtained, data fiduciaries cannot collect more data than is absolutely necessary to achieve the stated purpose and are only allowed to retain that data for as long as is necessary to fulfil the stated purpose.
In India, we’ve gone one step further and built techno-legal solutions to help reduce this data asymmetry. The Data Empowerment and Protection Architecture (DEPA) framework makes it possible to extract data from the silos in which it resides and transfer it, on the instructions of the data principal, to other entities, which can then use it to provide other services to the data principal. This data micro-portability dilutes the historical advantage that incumbents enjoy on account of collecting data over the entire duration of their customer engagement. It eliminates data asymmetries by establishing the infrastructure for a competitive market in data-based services, allowing data principals to choose from a range of options as to how their data could be used for their benefit by service providers.
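For illustration, here is a minimal sketch of what consent-mediated micro-portability of this kind could look like in code. The `ConsentArtefact` fields, the entity names, and the `transfer_on_consent` helper are hypothetical simplifications for this example and do not reflect the actual DEPA specification.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentArtefact:
    """Illustrative consent record: whose data may move, to whom, for what, and until when."""
    data_principal: str      # the individual whose data is being moved
    data_provider: str       # the silo currently holding the data (e.g. a bank)
    data_consumer: str       # the service the principal wants to share data with
    purpose: str             # the notified purpose the data may be used for
    expires_at: datetime     # consent is time-bound

def transfer_on_consent(artefact: ConsentArtefact, silo: dict) -> dict:
    """Release only the principal's records, and only while consent is valid."""
    if datetime.utcnow() > artefact.expires_at:
        raise PermissionError("Consent has expired; data cannot be transferred.")
    records = silo.get(artefact.data_principal, [])
    # The consumer receives the data strictly for the notified purpose.
    return {"to": artefact.data_consumer, "purpose": artefact.purpose, "records": records}

# Hypothetical example: a data principal ports her transaction history to a new lender.
bank_silo = {"asha": [{"month": "2022-09", "balance": 41000}]}
consent = ConsentArtefact(
    data_principal="asha",
    data_provider="LegacyBank",
    data_consumer="NewLender",
    purpose="loan eligibility assessment",
    expires_at=datetime.utcnow() + timedelta(days=30),
)
print(transfer_on_consent(consent, bank_silo))
```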
This, however, is not the only type of asymmetry we have to deal with in this age of big data. In a recent article, Stefaan Verhulst of GovLab at New York University pointed out that it is no longer enough to possess large stores of data—you need to know how to effectively extract value from it. Many businesses might have vast stores of data that they have accumulated over the years they have been in operation, but very few of them are able to effectively extract useful signals from that noisy data.
Without the know-how to translate data into actionable information, merely owning a large data set is of little value.
Unlike data asymmetries, which can be mitigated by making data more widely available, information asymmetries can only be addressed by radically democratizing the techniques and know-how that are necessary for extracting value from data. This know-how is largely proprietary and hard to access even in a fully competitive market. What’s more, in many instances, the computation power required far exceeds the capacity of entities for whom data analysis is not the main purpose of their business…(More)”.
Data and displacement: Ethical and practical issues in data-driven humanitarian assistance for IDPs
Blog by Vicki Squire: “Ten years since the so-called “data revolution” (Pearn et al, 2022), the rise of “innovation” and the proliferation of “data solutions” have rendered the assessment of changing data practices within the humanitarian sector ever more urgent. New data acquisition modalities have provoked a range of controversies across multiple contexts and sites (e.g. Human Rights Watch, 2021, 2022a, 2022b). Moreover, a range of concerns have been raised about data sharing (e.g. Fast, 2022) and the inequities embedded within humanitarian data (e.g. Data Values, 2022).
With this in mind, the Data and Displacement project set out to explore the practical and ethical implications of data-driven humanitarian assistance in two contexts characterised by high levels of internal displacement: north-eastern Nigeria and South Sudan. Our interdisciplinary research team includes academics from each of the regions under analysis, as well as practitioners from the International Organization for Migration. From the start, the research was designed to centre the lived experiences of Internally Displaced Persons (IDPs), while also shedding light on the production and use of humanitarian data from multiple perspectives.
We conducted primary research during 2021-2022. Our research combines dataset analysis and visualisation techniques with a thematic analysis of 174 semi-structured qualitative interviews. In total we interviewed 182 people: 42 international data experts, donors, and humanitarian practitioners from a range of governmental and non-governmental organisations; 40 stakeholders and practitioners working with IDPs across north-eastern Nigeria and South Sudan (20 in each region); and 100 IDPs in camp-like settings (50 in each region). Our findings point to a disconnect between international humanitarian standards and practices on the ground, the need to revisit existing ethical guidelines such as informed consent, and the importance of investing in data literacies…(More)”.
A Philosophy for Future Generations
Book by Tiziana Andina: “If societies, like institutions, are built to endure, then the bond that exists between generations must be considered. Constructing a framework to establish a philosophy of future generations, Tiziana Andina explores the factors that make it possible for a society to reproduce over time.
Andina’s study of the diachronic structure of societies considers the never-ending passage of generations, as each new generation comes to form a part of the new social fabric and political model.
Her model draws on the anthropologies offered by classical political philosophers such as Hobbes and Machiavelli and on the philosophies of power as discussed by Nietzsche. She confronts the ethics and function of this fundamental relationship, examines the role of transgenerationality in the formation and endurance of Western democracies and recognizes an often overlooked problem: each new generation must form part of social and political arrangements designed for them by the generations that came before…(More)”.
Inclusive Imaginaries: Catalysing Forward-looking Policy Making through Civic Imagination
UNDP Report: “Today’s complex challenges – including climate change, global health, and international security, among others – are pushing development actors to re-think and re-imagine traditional ways of working and decision-making. Transforming traditional approaches to navigating complexity would support what development thinker Sam Pitroda calls a ‘third vision’: one that demands a mindset rooted in creativity, innovation, and courage in order to transcend national interests and take global issues into account.
Inclusive Imaginaries is an approach that utilises collective reflection and imagination to engage with citizens, towards building more just, equitable and inclusive futures. It seeks to infuse imagination as a key process to support gathering of community perspectives rooted in lived experience and local culture, towards developing more contextual visions for policy and programme development…(More)”.
‘Dark data’ is killing the planet – we need digital decarbonisation
Article by Tom Jackson and Ian R. Hodgkinson: “More than half of the digital data firms generate is collected, processed and stored for single-use purposes. Often, it is never re-used. This could be the multiple near-identical images you keep on Google Photos or iCloud, a business’s outdated spreadsheets that will never be used again, or data from internet-of-things sensors that have no purpose.
This “dark data” is anchored to the real world by the energy it requires. Even data that is stored and never used again takes up space on servers – typically huge banks of computers in warehouses. Those computers and those warehouses all use lots of electricity.
This is a significant energy cost that is hidden in most organisations. Maintaining an effective organisational memory is a challenge, but at what cost to the environment?
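To make the idea of surfacing dark data concrete, here is a small illustrative sketch that flags files untouched for a year. The one-year threshold and the reliance on filesystem access times are assumptions made for this example, not guidance from the article (access times are not recorded reliably on every system).

```python
import time
from pathlib import Path

def find_dark_data(root: str, days_unused: int = 365):
    """Yield (path, size_in_bytes) for files not accessed in `days_unused` days."""
    cutoff = time.time() - days_unused * 24 * 3600
    for path in Path(root).rglob("*"):
        if path.is_file():
            stats = path.stat()
            # st_atime is the last access time; some filesystems do not update it.
            if stats.st_atime < cutoff:
                yield path, stats.st_size

# Example: total up long-untouched files under the current directory.
total = sum(size for _, size in find_dark_data(".", days_unused=365))
print(f"{total / 1e9:.1f} GB not accessed in the last year")
```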
In the drive towards net zero many organisations are trying to reduce their carbon footprints. Guidance has generally centred on reducing traditional sources of carbon production, through mechanisms such as carbon offsetting via third parties (planting trees to make up for emissions from using petrol, for instance).
While most climate change activists are focused on limiting emissions from the automotive, aviation and energy industries, the processing of digital data is already comparable to these sectors and is still growing. In 2020, digitisation was purported to generate 4% of global greenhouse gas emissions. Production of digital data is increasing fast – this year the world is expected to generate 97 zettabytes (that is: 97 trillion gigabytes) of data. By 2025, it could almost double to 181 zettabytes. It is therefore surprising that little policy attention has been placed on reducing the digital carbon footprint of organisations…(More)”.
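As a quick sanity check on the figures quoted above, a few lines of arithmetic using only the numbers in the article (97 zettabytes in 2022, 181 zettabytes projected for 2025):

```python
# Figures quoted in the article: ~97 ZB generated this year, ~181 ZB projected for 2025.
zb_2022 = 97
zb_2025 = 181

# 1 zettabyte = 1e21 bytes = 1e12 gigabytes, i.e. a trillion gigabytes.
gb_per_zb = 1e12
print(f"2022 volume: {zb_2022 * gb_per_zb:.0e} GB")   # ~9.7e13 GB, i.e. 97 trillion GB

# Implied compound annual growth rate over the three years between the two estimates.
years = 2025 - 2022
cagr = (zb_2025 / zb_2022) ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")          # roughly 23% per year
```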
Public procurement of artificial intelligence systems: new risks and future proofing
Paper by Merve Hickok: “Public entities around the world are increasingly deploying artificial intelligence (AI) and algorithmic decision-making systems to provide public services or to use their enforcement powers. The rationale for the public sector to use these systems is similar to that of the private sector: to increase the efficiency and speed of transactions and to lower costs. However, public entities are first and foremost established to meet the needs of the members of society and to protect the safety, fundamental rights, and wellbeing of those they serve. Currently, AI systems are deployed by the public sector at various administrative levels without robust due diligence, monitoring, or transparency. This paper critically maps out the challenges in the procurement of AI systems by public entities and the long-term implications necessitating AI-specific procurement guidelines and processes. This dual-pronged exploration includes the new complexities and risks introduced by AI systems, and the institutional capabilities impacting the decision-making process. AI-specific public procurement guidelines are urgently needed to protect fundamental rights and due process…(More)”.
Tales from a Robotic World: How Intelligent Machines Will Shape Our Future
Book by Dario Floreano and Nicola Nosengo: “Tech prognosticators promised us robots—autonomous humanoids that could carry out any number of tasks. Instead, we have robot vacuum cleaners. But, as Dario Floreano and Nicola Nosengo report, advances in robotics could bring those rosy predictions closer to reality. A new generation of robots, directly inspired by the intelligence and bodies of living organisms, will be able not only to process data but to interact physically with humans and the environment. In this book, Floreano, a roboticist, and Nosengo, a science writer, bring us tales from the future of intelligent machines—from rescue drones to robot spouses—along with accounts of the cutting-edge research that could make it all possible.
These stories from the not-so-distant future show us robots that can be used for mitigating effects of climate change, providing healthcare, working with humans on the factory floor, and more. Floreano and Nosengo tell us how an application of swarm robotics could protect Venice from flooding, how drones could reduce traffic on the congested streets of mega-cities like Hong Kong, and how a “long-term relationship model” robot could supply sex, love, and companionship. After each fictional scenario, they explain the technologies that underlie it, describing advances in such areas as soft robotics, swarm robotics, aerial and mobile robotics, humanoid robots, wearable robots, and even biohybrid robots based on living cells. Robotics technology is no silver bullet for all the world’s problems—but it can help us tackle some of the most pressing challenges we face…(More)”.
Google’s new AI can hear a snippet of song—and then keep on playing
Article by Tammy Xu: “The new AI system can generate natural sounds and voices after being prompted with a few seconds of audio.
AudioLM, developed by Google researchers, produces sounds that match the style of the prompt, including complex sounds like piano music or human voices, in a way that is nearly indistinguishable from the original recording. The technique shows promise for speeding up the training of AI to generate audio, and it could eventually be used to automatically generate music to accompany videos.
AI-generated audio has become ubiquitous: voices on home assistants like Alexa use natural language processing. AI music systems like OpenAI’s Jukebox have produced impressive results, but most current techniques require people to prepare transcriptions and label text-based training data, which takes a lot of time and human labor. Jukebox, for example, uses text-based data to generate its lyrics.
AudioLM, described in a non-peer-reviewed paper last month, is different: it requires no transcription or labeling. Instead, an audio database is fed into the program, and machine learning is used to compress the audio files into sound snippets, called “tokens,” without losing too much information. This tokenized training data is then fed into a machine learning model that uses natural language processing to learn the patterns in the audio.
To generate sound, a few seconds of audio are fed into AudioLM, which then predicts what comes next. The process is similar to the way language models like GPT-3 predict the sentences and words that typically follow one another.
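As a toy illustration of the two-stage idea described above (compress audio into discrete tokens, then learn which tokens tend to follow which and sample a continuation from a short prompt), here is a self-contained sketch. The sample-level quantiser and the bigram predictor are deliberately crude stand-ins chosen for brevity; they bear no relation to the actual AudioLM models.

```python
import numpy as np

def tokenize(waveform: np.ndarray, n_tokens: int = 64) -> np.ndarray:
    """Crudely 'compress' audio by quantising each sample into one of n_tokens bins."""
    lo, hi = waveform.min(), waveform.max()
    return np.digitize(waveform, np.linspace(lo, hi, n_tokens - 1))

def fit_bigram(tokens: np.ndarray, n_tokens: int = 64) -> np.ndarray:
    """Count how often each token follows another (a toy stand-in for the language model)."""
    counts = np.ones((n_tokens, n_tokens))          # add-one smoothing
    for prev, nxt in zip(tokens[:-1], tokens[1:]):
        counts[prev, nxt] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def continue_audio(prompt_tokens, model: np.ndarray, steps: int = 200, seed: int = 0):
    """Given 'prompt' tokens, repeatedly predict and sample what comes next."""
    rng = np.random.default_rng(seed)
    out = list(prompt_tokens)
    for _ in range(steps):
        out.append(rng.choice(len(model), p=model[out[-1]]))
    return np.array(out)

# Example: 'train' on a sine wave, then continue a short prompt taken from it.
t = np.linspace(0, 1, 8000)
audio = np.sin(2 * np.pi * 440 * t)
tokens = tokenize(audio)
model = fit_bigram(tokens)
generated = continue_audio(tokens[:400], model)
print(generated[:20])
```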
Sound clips released by the team sound quite natural. In particular, piano music created with AudioLM sounds more fluid than piano music created with existing AI techniques, which tends to sound chaotic…(More)”.
The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future
Book by Orly Lobel: “Much has been written about the challenges tech presents to equality and democracy. But we can either criticize big data and automation or steer them to do better. Lobel makes a compelling argument that while we cannot stop technological development, we can direct its course according to our most fundamental values.
With provocative insights in every chapter, Lobel masterfully shows that digital technology frequently has a comparative advantage over humans in detecting discrimination, correcting historical exclusions, subverting long-standing stereotypes, and addressing the world’s thorniest problems: climate, poverty, injustice, literacy, accessibility, speech, health, and safety.
Lobel’s vivid examples—from labor markets to dating markets—provide powerful evidence for how we can harness technology for good. The book’s incisive analysis and elegant storytelling will change the debate about technology and restore human agency over our values…(More)”.