Paper by Carolin Martina Rauter, Sabine Wöhlke & Silke Schicktanz: “Personalized medicine (PM) operates with biological data to optimize therapy or prevention and to achieve cost reduction. Associated data may consist of a wide variety of informational subtypes, e.g. genetic characteristics and their epigenetic modifications, biomarkers or even individual lifestyle factors. Recent innovations in information technology have enabled the processing of increasingly large amounts of such data (‘volume’) from various sources (‘variety’) and of varying accuracy (‘veracity’), facilitating the generation and analysis of messy data sets within a short and highly efficient time period (‘velocity’) and providing insights into previously unknown connections and correlations between different items (‘value’). As such developments are characteristic of Big Data approaches, Big Data itself has become an important catchphrase that is closely linked to the emerging foundations and approaches of PM. However, while experts have already raised ethical concerns in this debate, the moral concerns of stakeholders such as patient organizations (POs) need to be reflected in this context as well. We used an empirical-ethical approach, including a website analysis and 27 telephone interviews, to gain in-depth insight into German POs’ perspectives on PM and Big Data. Our results show that not all POs are stakeholders in the same way. Comparing the perspectives and political engagement of the minority of POs currently actively involved in PM and Big Data-driven research led to four stakeholder sub-classifications: ‘mediators’ support research projects by facilitating researchers’ access to the patient community while simultaneously selecting the projects they prefer to support, whereas ‘cooperators’ contribute more directly to research projects by providing and implementing patient perspectives.
‘Financers’ provide financial resources. ‘Independents’ retain control over their collected samples and associated patient-related information, with a strong interest in making autonomous decisions about their scientific use. A more detailed terminology for the involvement of POs as stakeholders facilitates the addressing of their aims and goals. Based on our results, the ‘independents’ subgroup is a promising candidate for future collaborations in scientific research. Additionally, we identified gaps in POs’ knowledge about PM and Big Data. Based on these findings, approaches can be developed to increase data and statistical literacy. This way, the full potential of stakeholder involvement of POs can be made accessible in discourses around PM and Big Data….(More)”.
Paper by Daniel Innerarity: “Democracy is possible because of an increase in the complexity of society, but that same complexity seems to threaten democracy. There is a clear imbalance between people’s actual competence and the expectation that citizens in a democratic society will be politically competent. It is not only that society has become more complex but that democratization itself increases the degree of social complexity. This unintelligibility can be overcome through the acquisition of some political competence—such as improving individual knowledge, diverse strategies for simplification or recourse to experts—that partially reduces this imbalance. My hypothesis is that despite the attraction of de-democratizing procedures, the best solutions are those that are most democratic: strengthening cooperation and the institutional organization of collective intelligence. The purpose of this article is not to solve all the problems I touch on, but rather to examine how they are related and to provide a general framework for the problem of de-democratization through misunderstanding….(More)”.
Paper by Christopher S. Yoo and Alicia Lai: “Policymakers in the United States have only begun to address the regulation of artificial intelligence technologies in recent years, gaining momentum through calls for additional research funding, piecemeal guidance, proposals, and legislation at all levels of government. This Article provides an overview of high-level federal initiatives for general artificial intelligence (AI) applications set forth by the U.S. president and responding agencies, early indications from the incoming Biden Administration, targeted federal initiatives for sector-specific AI applications, pending federal legislative proposals, and state and local initiatives. The regulation of the algorithmic ecosystem will continue to evolve as the United States searches for the right balance between ensuring public safety and transparency and promoting innovation and competitiveness on the global stage….(More)”.
Paper by Barry Eichengreen, Cevat Aksoy and Orkun Saka: “It is sometimes said that an effect of the COVID-19 pandemic will be heightened appreciation of the importance of scientific research and expertise. We test this hypothesis by examining how exposure to previous epidemics affected trust in science and scientists. Building on the “impressionable years hypothesis” that attitudes are durably formed between the ages of 18 and 25, we focus on individuals exposed to epidemics in their country of residence at this particular stage of the life course. Combining data from a 2018 Wellcome Trust survey of more than 75,000 individuals in 138 countries with data on global epidemics since 1970, we show that such exposure has no impact on views of science as an endeavor but that it significantly reduces trust in scientists and in the benefits of their work. We also show that the decline in trust is driven by individuals with little previous training in science subjects. Finally, our evidence suggests that epidemic-induced distrust translates into lower compliance with health-related policies, in the form of negative views towards vaccines and lower rates of child vaccination….(More)”.
Paper by Kelli A. Bird et al: “Do successful local nudge interventions maintain efficacy when scaled state or nationwide? We investigate, through two randomized controlled trials, the impact of a national and state-level campaign encouraging students to apply for financial aid for college. The campaigns collectively reached over 800,000 students, with multiple treatment arms patterned after prior local interventions in order to explore potential mechanisms. We find no impacts on aid receipt or college enrollment overall or for any subgroups. We find no evidence that different approaches to message framing, delivery, or timing, or access to one-on-one advising affected campaign efficacy. We discuss why nudge strategies that work locally may be hard to scale effectively….(More)”.
Paper by Andrew Gelman and Yotam Margalit: “To explain the political clout of different social groups, traditional accounts typically focus on the group’s size, resources, or commonality and intensity of its members’ interests. We contend that a group’s penumbra—the set of individuals who are personally familiar with people in that group—is another important explanatory factor that merits systematic analysis. To this end, we designed a panel study that allows us to learn about the characteristics of the penumbras of politically relevant groups such as gay people, the unemployed, or recent immigrants. Our study reveals major and systematic differences in the penumbras of various social groups, even ones of similar size. Moreover, we find evidence that entering a group’s penumbra is associated with a change in attitude on group-related policy questions. Taken together, our findings suggest that penumbras are pertinent for understanding variation in the political standing of different groups in society….(More)”.
Paper by Alexandre Pólvora and Susana Nascimento: “This paper presents a theoretical and methodological experimentation approach developed at the EU Policy Lab of the European Commission’s Joint Research Centre. The approach is first framed by its larger institutional context and positioned in a back-end space of public sector innovation. With an internal and self-reflexive point of departure, our purpose is to outline it as a catalyst of future-oriented explorations, nurtured simultaneously by evidence-based knowledge and by its own transdisciplinary set of experimentation concepts and practices. In addition, to show the approach at a practical stage, the paper offers an empirical illustration from a forward-looking project for policy advice.
#Blockchain4EU was an exploration of existing, emerging or potential applications of blockchain in industrial and non-financial sectors, with attention to plausible near-future applications and scenarios, and a focus on possible policy, economic, social, technical, legal and environmental impacts. The approach was anchored in desk and qualitative research throughout the project, but its primary outputs emerged from participatory foresight, collective vision building and co-creation workshops, and the prototyping of speculative artefacts through multi-stakeholder engagement. The purpose is to stimulate anticipatory governance frameworks in general, and to push the frontiers of what is common practice in policy when considering emerging technologies….(More)”
Paper by Josh Cowls, Andreas Tsamados, Mariarosaria Taddeo & Luciano Floridi: “Initiatives relying on artificial intelligence (AI) to deliver socially beneficial outcomes—AI for social good (AI4SG)—are on the rise. However, existing attempts to understand and foster AI4SG initiatives have so far been limited by the lack of normative analyses and a shortage of empirical evidence. In this Perspective, we address these limitations by providing a definition of AI4SG and by advocating the use of the United Nations’ Sustainable Development Goals (SDGs) as a benchmark for tracing the scope and spread of AI4SG. We introduce a database of AI4SG projects gathered using this benchmark, and discuss several key insights, including the extent to which different SDGs are being addressed. This analysis makes possible the identification of pressing problems that, if left unaddressed, risk hampering the effectiveness of AI4SG initiatives….(More)”.
Paper by Chris Norval, Jennifer Cobbe and Jatinder Singh: “As the IoT becomes increasingly ubiquitous, concerns are being raised about how IoT systems are being built and deployed. Connected devices will generate vast quantities of data, which drive algorithmic systems and result in real-world consequences. Things will go wrong, and when they do, how do we identify what happened, why they happened, and who is responsible? Given the complexity of such systems, where do we even begin?
This chapter outlines aspects of accountability as they relate to IoT, in the context of the increasingly interconnected and data-driven nature of such systems. Specifically, we argue the urgent need for mechanisms – legal, technical, and organisational – that facilitate the review of IoT systems. Such mechanisms work to support accountability, by enabling the relevant stakeholders to better understand, assess, interrogate and challenge the connected environments that increasingly pervade our world….(More)”
Paper by Frank Hendriks: “Pushed by technological, cultural and related political drivers, a ‘new plebiscitary democracy’ is emerging which challenges established electoral democracy as well as variants of deliberative democracy. The new plebiscitary democracy reinvents and radicalizes longer-existing methods (initiative, referendum, recall, primary, petition, poll) with new tools and applications (mostly digital). It comes with a comparatively thin conceptualization of democracy, invoking the bare notion of a demos whose aggregated will is to steer actors and issues in public governance in a straightforwardly majoritarian way. In addition to unravelling the reinvented logic of plebiscitary democracy in conceptual terms, this article fleshes out an empirically informed matrix of emerging formats, distinguishing between votations that are ‘political-leader’ versus ‘public-issue’ oriented on the one hand, and ‘inside-out’ versus ‘outside-in’ initiated on the other. Relatedly, it proposes an agenda for systematic research into the various guises, drivers and implications of the new plebiscitary democracy. Finally, it reflects on possible objections to the argumentation….(More)”