Including the underrepresented


Paper by FIDE: “Deliberative democracy is based on the premise that all voices matter and that everyone can participate equally in decision-making. However, structural inequalities might prevent certain groups from being recruited for deliberation, skewing the process towards the socially privileged. Those structural inequalities are also present in the deliberation room, where they can lead to unconscious (or conscious) biases that hinder certain voices while amplifying others, causing particular perspectives to influence decision-making unequally.

This paper presents different methods and strategies applied in previous processes to increase the inclusion of underrepresented groups. We distinguish strategies for the two critical phases of the deliberative process: recruitment and deliberation…(More)”.

Data Maturity Assessment for Government


UK Government: “The Data Maturity Assessment (DMA) for Government is a robust and comprehensive framework, designed by the public sector for the public sector. The DMA represents a big step forward in our shared ambition to establish and strengthen the data foundations in government by enabling a granular view of the current status of our data environments.

The systematic and detailed picture that the DMA results provide can be used to deliver value in the data function and across the enterprise. Maturity results, and the progression behaviours/features outlined in the DMA, will be essential to reviewing and setting data strategy. DMA outputs provide a way to communicate and evidence how the data ecosystem is critical to the business. When considered in the context of organisational priorities and responsibilities, DMA outputs can assist in:

  • identifying and mitigating strategic risk arising from low data maturity, and where higher maturity needs to be maintained
  • targeting and prioritising investment in the most important data initiatives
  • assuring the data environment for new services and programmes…(More)”.

Whose data commons? Whose city?


Blog by Gijs van Maanen and Anna Artyushina: “In 2020, the notion of data commons became a staple of the new European Data Governance Strategy, which envisions data cooperatives as key players in the European Union’s (EU) emerging digital market. In this new legal landscape, public institutions, businesses, and citizens are expected to share their data with licensed data-governance entities that will oversee its responsible reuse. In 2022, the Open Future Foundation released several white papers in which the NGO (non-governmental organisation) detailed a vision for publicly governed and funded EU-level data commons. Some academic researchers see data commons as a way to break the data silos maintained and exploited by Big Tech and, potentially, dismantle surveillance capitalism.

In this blog post, we discuss data commons as a concept and practice. Our argument here is that, for data commons to become a (partial) solution to the issues caused by data monopolies, they need to be politicised. As smart city scholar Shannon Mattern pointedly argues, the city is not a computer. This means that the digitisation and datafication of our cities involve making choices about what is worth digitising and whose interests are prioritised. These choices and their implications must be foregrounded when we discuss data commons or any emerging forms of data governance. It is important to ask whose data is made common and, subsequently, whose city we will end up living in…(More)”.

The Technology/Jobs Puzzle: A European Perspective


Blog by Pierre-Alexandre Balland, Lucía Bosoer and Andrea Renda as part of the work of the Markle Technology Policy and Research Consortium: “In recent years, the creation of “good jobs” – defined as occupations that provide a middle-class living standard, adequate benefits, sufficient economic security, personal autonomy, and career prospects (Rodrik and Sabel 2019; Rodrik and Stantcheva 2021) – has become imperative for many governments. At the same time, developments in industrial value chains and in digital technologies such as Artificial Intelligence (AI) create important challenges for the creation of good jobs. On the one hand, future good jobs may not be found only in manufacturing, and this requires that industrial policy increasingly look at services. On the other hand, AI has shown the potential to automate both routine and non-routine tasks (TTC 2022), and this poses new, important questions about what role humans will play in the industrial value chains of the future.

In the report drafted for the Markle Technology Policy and Research Consortium on The Technology/Jobs Puzzle: A European Perspective, we analyze Europe’s approach to the creation of “good jobs”. By mapping Europe’s technological specialization, we estimate in which sectors good jobs are most likely to emerge, and assess the main opportunities and challenges Europe faces on the road to a resilient, sustainable and competitive future economy.

The report features an important reflection on how to define job quality and, relatedly, “good jobs”. From the perspective of the European Union, job quality can be defined along two distinct dimensions. First, while the internationally agreed definition is rather static (i.e. related to the current conditions of the worker), the emerging interpretation at the EU level incorporates the extent to which a given job nurtures human capital, empowering workers with more skills and well-being over time. Second, job quality can be seen from a “micro” perspective, which only accounts for the condition of the individual worker, or from a more “macro” perspective, which considers whether the sector in which the job emerges is compatible with the EU’s agenda, and in particular with the twin (green and digital) transition. As a result, we argue that Europe should ideally avoid creating “good” jobs in “bad” sectors, as well as “bad” jobs in “good” sectors. The ultimate goal is to create “good” jobs in “good” sectors…(More)”.

How public money is shaping the future of AI


Report by Ethica: “The European Union aims to become the “home of trustworthy Artificial Intelligence” and has committed the largest existing public funding to invest in AI over the next decade. However, the lack of accessible data and comprehensive reporting on the Framework Programmes’ results and impact hinders the EU’s capacity to achieve its objectives and undermines the credibility of its commitments.

This research, commissioned by the European AI & Society Fund, recommends publicly accessible data, effective evaluation of the real-world impacts of funding, and mechanisms for civil society participation in funding before further public funds are invested to achieve the EU’s goal of being the epicenter of trustworthy AI.

Among its findings, the research has highlighted the negative impact of the European Union’s investment in artificial intelligence (AI). The EU invested €10bn into AI via its Framework Programmes between 2014 and 2020, representing 13.4% of all available funding. However, the investment process is top-down, with little input from researchers or feedback from previous grantees or civil society organizations. Furthermore, despite the EU’s aim to fund market-focused innovation, research institutions and higher and secondary education establishments received 73% of the total funding between 2007 and 2020. Germany, France, and the UK were the largest recipients, receiving 37.4% of the total EU budget.

The report also explores the lack of commitment to ethical AI, with only 30.3% of funding calls related to AI mentioning trustworthiness, privacy, or ethics. Additionally, civil society organizations are not involved in the design of funding programs, and there is no evaluation of the economic or societal impact of the funded work. The report calls for political priorities to align with funding outcomes in specific, measurable ways, citing transport as the most funded sector in AI despite not being an EU strategic focus, while programs to promote SME and societal participation in scientific innovation have been dropped….(More)”.

Unpacking Social Capital


Paper by Ruben Durante, Nicola Mastrorocco, Luigi Minale & James M. Snyder Jr.: “We use novel and unique survey data from Italy to shed light on key questions regarding the measurement of social capital and the use of social capital indicators for empirical work. Our data cover a sample of over 600,000 respondents interviewed between 2000 and 2015. We identify four distinct components of social capital – i) social participation, ii) political participation, iii) trust in others, and iv) trust in institutions – and examine how they relate to each other. We then study how each dimension of social capital relates to various socioeconomic factors both at the individual and the aggregate level, and to various proxies of social capital commonly used in the literature. Finally, building on previous work, we investigate to what extent different dimensions of social capital predict differences in key economic, political, and health outcomes. Our findings support the view that social capital is a multifaceted object with multiple dimensions that, while related, are distinct from each other. Future work should take such multidimensionality into account and carefully consider what measure of social capital to use…(More)”.
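
As a purely hypothetical illustration of the general technique: the paper does not publish code, but one standard way to recover distinct yet related components from survey items is exploratory factor analysis. The sketch below uses synthetic responses and an assumed two-items-per-dimension structure; none of it comes from the paper.

```python
# Hypothetical sketch: extracting latent dimensions from survey items with
# factor analysis. Synthetic data; not the authors' code or method.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic responses: 1,000 respondents x 8 items, two items driven by each
# of 4 underlying dimensions (e.g. social participation, political
# participation, trust in others, trust in institutions).
latent = rng.normal(size=(1000, 4))
loadings = np.kron(np.eye(4), np.ones((1, 2)))  # each dimension -> 2 items
items = latent @ loadings + 0.5 * rng.normal(size=(1000, 8))

fa = FactorAnalysis(n_components=4, random_state=0)
scores = fa.fit_transform(items)  # per-respondent scores on each dimension

# Item loadings reveal which questions cluster on which component.
print(np.round(fa.components_, 2))
```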

When Concerned People Produce Environmental Information: A Need to Re-Think Existing Legal Frameworks and Governance Models?


Paper by Anna Berti Suman, Mara Balestrini, Muki Haklay, and Sven Schade: “When faced with an environmental problem, locals are often among the first to act. Citizen science is increasingly one of the forms of participation in which people take action to help solve environmental problems that concern them. This implies, for example, using methods and instruments with scientific validity to collect and analyse data and evidence to understand the problem and its causes. Can the contribution of environmental data by citizens be articulated as a right? In this article, we explore these forms of productive engagement with a local matter of concern, focussing on their potential to challenge traditional allocations of responsibilities. Taking mostly the perspective of the European legal context, we identify an existing gap between the right to obtain environmental information, granted at present by the Aarhus Convention, and “a right to contribute information” and have that information considered by appointed institutions. We also explore what would be required to effectively practise this right in terms of legal and governance processes, capacities, and infrastructures, and we propose a flexible framework to implement it. Situated at the intersection of legal and governance studies, this article builds on existing literature on environmental citizen science, and on its interplay with law and governance. Our methodological approach combines literature review with legal analysis of the relevant conventions and national rules. We conclude by reflecting on the implications of our analysis, and on the benefits of this legal innovation, potentially fostering data altruism and an active citizenship, and shielding ordinary people against possible legal risks…(More)”.

The wisdom of crowds for improved disaster resilience: a near-real-time analysis of crowdsourced social media data on the 2021 flood in Germany


Paper by Mahsa Moghadas, Alexander Fekete, Abbas Rajabifard & Theo Kötter: “Transformative disaster resilience in times of climate change underscores the importance of reflexive governance, facilitation of socio-technical advancement, co-creation of knowledge, and innovative and bottom-up approaches. However, implementing these capacity-building processes by relying on census-based datasets and nomothetic (or top-down) approaches remains challenging for many jurisdictions. Web 2.0 knowledge sharing via online social networks, by contrast, provides a unique opportunity and valuable data sources to complement existing approaches, understand dynamics within large communities of individuals, and incorporate collective intelligence into disaster resilience studies. Using Twitter data (passive crowdsourcing) and an online survey, this study draws on the wisdom of crowds and public judgment in near-real-time disaster phases when the flood disaster hit Germany in July 2021. Latent Dirichlet Allocation, an unsupervised machine-learning technique for topic modeling, was applied to the corpora of the two data sources to identify topics associated with different disaster phases. In addition to semantic (textual) analysis, spatiotemporal patterns of online disaster communication were analyzed to determine the contribution patterns associated with the affected areas. Finally, the extracted topics discussed online were compiled into five themes related to disaster resilience capacities (preventive, anticipative, absorptive, adaptive, and transformative). The near-real-time collective sensing approach reflected optimized diversity and a spectrum of people’s experiences and knowledge regarding flooding disasters and highlighted communities’ sociocultural characteristics. This bottom-up approach could be an innovative alternative to traditional participatory techniques of organizing meetings and workshops for situational analysis and timely unfolding of such events at a fraction of the cost to inform disaster resilience initiatives…(More)”.
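
The topic-modelling step lends itself to a short illustration. The sketch below is not the authors’ pipeline: it assumes a small list of pre-cleaned tweet texts and uses scikit-learn’s LatentDirichletAllocation (the paper does not specify its implementation); the choice of five topics merely echoes the five resilience-capacity themes above.

```python
# Minimal, hypothetical sketch of LDA topic modelling on a tweet corpus
# (illustrative only; not the authors' code or data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Assumed input: pre-cleaned tweet texts from the disaster period.
tweets = [
    "river levels rising fast near the bridge",
    "volunteers needed for sandbag filling tonight",
    "power outage reported across the district",
]

# Bag-of-words representation; stop-word removal keeps topics interpretable.
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
doc_term = vectorizer.fit_transform(tweets)

# Fit LDA; the number of topics is a modelling choice (five here, echoing
# the five resilience-capacity themes, purely for illustration).
lda = LatentDirichletAllocation(n_components=5, random_state=0)
lda.fit(doc_term)

# Show the top words per topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top)}")
```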

Mini Data Centers heat local swimming pools for free


Springwise: “It is now well understood that data centres consume vast amounts of energy. This is because the banks of servers in the data centres require a lot of cooling, which, in turn, uses a lot of energy. But one data-centre operator has found a use for all the heat it generates, one that could also help public facilities such as swimming pools save money on their energy costs.

Deep Green, which runs data centres, has developed small edge data centres that can be installed locally and divert some of their excess heat to warm leisure centres and public swimming pools. The system, dubbed a “digital boiler”, involves immersing central processing unit (CPU) servers in special cooling tubs, which use oil to remove heat from the servers. This oil is then passed through a heat exchanger, which removes the heat and uses it to warm buildings or swimming pools.
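
A rough back-of-envelope calculation suggests why the recovered heat is material. Every figure below is an assumption made for illustration, not a Deep Green specification:

```python
# Illustrative back-of-envelope estimate of heat recovered from an
# immersion-cooled edge data centre. Every figure here is an assumption,
# not a Deep Green specification.

server_power_kw = 28.0    # assumed electrical draw of one "digital boiler"
capture_efficiency = 0.9  # assumed share of heat recovered via the oil loop
hours_per_year = 24 * 365

heat_recovered_kwh = server_power_kw * capture_efficiency * hours_per_year
print(f"Heat recovered per year: {heat_recovered_kwh:,.0f} kWh")

# Assume a mid-size public pool needs ~350,000 kWh of heat per year.
pool_demand_kwh = 350_000
print(f"Share of pool heating covered: {heat_recovered_kwh / pool_demand_kwh:.0%}")
```

On these assumed numbers, one unit covers roughly two-thirds of the pool’s heat demand, of the same order as the roughly 70 per cent gas saving the company cites below.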

The company says the heat donation from one of its digital boilers will cut a public swimming pool’s gas requirements by around 70 per cent, saving leisure centres thousands of pounds every year while also drastically reducing carbon emissions. Deep Green pays for the electricity it uses and donates the heat for free. This is a huge benefit, as Britain’s public swimming pools are facing massive increases in heating bills, which is causing many to close or restrict their hours…(More)”.

Atlas of the Senseable City


Book by Antoine Picon and Carlo Ratti: “What have smart technologies taught us about cities? What lessons can we learn from today’s urbanites to make better places to live? Antoine Picon and Carlo Ratti argue that the answers are in the maps we make. For centuries, we have relied on maps to navigate the enormity of the city. Now, as the physical world combines with the digital world, we need a new generation of maps to navigate the city of tomorrow. Pervasive sensors allow anyone to visualize cities in entirely new ways—ebbs and flows of pollution, traffic, and internet connectivity.
 
This book explores how the growth of digital mapping, spurred by sensing technologies, is affecting cities and daily lives. It examines how new cartographic possibilities aid urban planners, technicians, politicians, and administrators; how digitally mapped cities could reveal ways to make cities smarter and more efficient; how monitoring urbanites has political and social repercussions; and how the proliferation of open-source maps and collaborative platforms can aid activists and vulnerable populations. With its beautiful, accessible presentation of cutting-edge research, this book makes it easy for readers to understand the stakes of the new information age—and appreciate the timeless power of the city….(More)”.