Paper by Hani Safadi and Richard Thomas Watson: “The rise of digital platforms creates knowledge monopolies that threaten innovation. Their power derives from the imposition of data obligations and persistent coupling on platform participation, and from their usurpation of the rights to data created by other participants to facilitate information asymmetries. Knowledge monopolies can use machine learning to develop competitive insights unavailable to other platform participants. This information asymmetry stifles innovation, stokes the growth of the monopoly, and reinforces its ascendancy. National or regional governance structures, such as laws and regulatory authorities, constrain economic monopolies deemed not in the public interest. We argue the need for legislation and an associated regulatory mechanism to curtail coercive data obligations and control, eliminate data rights exploitation, and prevent mergers and acquisitions that could create or extend knowledge monopolies…(More)”.
Towards Responsible Quantum Technology
Paper by Mauritz Kop et al: “The expected societal impact of quantum technologies (QT) urges us to proceed and innovate responsibly. This article proposes a conceptual framework for Responsible QT that seeks to integrate considerations about ethical, legal, social, and policy implications (ELSPI) into quantum R&D, while responding to the Responsible Research and Innovation dimensions of anticipation, inclusion, reflection and responsiveness. After examining what makes QT unique, we argue that quantum innovation should be guided by a methodological framework for Responsible QT, aimed at jointly safeguarding against risks by proactively addressing them, engaging stakeholders in the innovation process, and continuing to advance QT (‘SEA’). We further suggest operationalizing the SEA-framework by establishing quantum-specific guiding principles. The impact of quantum computing on information security is used as a case study to illustrate (1) the need for a framework that guides Responsible QT, and (2) the usefulness of the SEA-framework for QT generally. Additionally, we examine how our proposed SEA-framework for responsible innovation can inform the emergent regulatory landscape affecting QT, and provide an outlook on how regulatory interventions for QT as a base-layer technology could be designed, contextualized, and tailored to their exceptional nature in order to reduce the risk of unintended counterproductive effects of policy interventions.
To lay the groundwork for a responsible quantum ecosystem, the research community and other stakeholders are called upon to further develop the recommended guiding principles and to discuss their operationalization into best practices and real-world applications. Our proposed framework should be considered a starting point for these much-needed, highly interdisciplinary efforts…(More)”.
Unpacking Social Capital
Paper by Ruben Durante, Nicola Mastrorocco, Luigi Minale & James M. Snyder Jr.: “We use novel and unique survey data from Italy to shed light on key questions regarding the measurement of social capital and the use of social capital indicators for empirical work. Our data cover a sample of over 600,000 respondents interviewed between 2000 and 2015. We identify four distinct components of social capital – i) social participation, ii) political participation, iii) trust in others, and iv) trust in institutions – and examine how they relate to each other. We then study how each dimension of social capital relates to various socioeconomic factors both at the individual and the aggregate level, and to various proxies of social capital commonly used in the literature. Finally, building on previous work, we investigate to what extent different dimensions of social capital predict differences in key economic, political, and health outcomes. Our findings support the view that social capital is a multifaceted object with multiple dimensions that, while related, are distinct from each other. Future work should take such multidimensionality into account and carefully consider what measure of social capital to use…(More)”.
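To make the multidimensionality point concrete, the sketch below shows one way individual-level survey items could be aggregated into the four dimensions and then correlated. It is a minimal illustration, not the authors' code: the column names, response scales, and simple averaging scheme are assumptions made for the example only.

```python
# Minimal sketch: build one index per social-capital dimension from survey items
# and inspect how the dimensions correlate. Column names and scales are illustrative
# assumptions, not the variables used in the paper.
import pandas as pd

df = pd.DataFrame({
    "volunteering":        [1, 0, 1, 0, 1],   # 1 = respondent volunteers
    "association_member":  [1, 0, 0, 0, 1],   # 1 = member of an association
    "voted_last_election": [1, 1, 0, 1, 1],   # 1 = voted in the last election
    "attended_rally":      [0, 1, 0, 0, 1],   # 1 = attended a political event
    "trust_others":        [4, 2, 3, 1, 5],   # generalized trust, 1-5 scale
    "trust_parliament":    [3, 2, 4, 1, 5],   # trust in institutions, 1-5 scale
})

# One index per dimension, with items rescaled to a common 0-1 range before averaging.
dims = pd.DataFrame({
    "social_participation":    df[["volunteering", "association_member"]].mean(axis=1),
    "political_participation": df[["voted_last_election", "attended_rally"]].mean(axis=1),
    "trust_in_others":         (df["trust_others"] - 1) / 4,
    "trust_in_institutions":   (df["trust_parliament"] - 1) / 4,
})

# Related but distinct dimensions should show positive yet far-from-perfect correlations.
print(dims.corr().round(2))
```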
Responding to the coronavirus disease-2019 pandemic with innovative data use: The role of data challenges
Paper by Jamie Danemayer, Andrew Young, Siobhan Green, Lydia Ezenwa and Michael Klein: “Innovative, responsible data use is a critical need in the global response to the coronavirus disease-2019 (COVID-19) pandemic. Yet potentially impactful data are often unavailable to those who could utilize them, particularly in data-poor settings, posing a serious barrier to effective pandemic mitigation. Data challenges, public calls to action for innovative data use projects, can identify and address these specific barriers. To understand gaps and progress relevant to effective data use in this context, this study thematically analyses three sets of qualitative data focused on/based in low/middle-income countries: (a) a survey of innovators responding to a data challenge, (b) a survey of organizers of data challenges, and (c) a focus group discussion with professionals using COVID-19 data for evidence-based decision-making. Data quality and accessibility, along with human resources/institutional capacity, were frequently reported limitations to effective data use among innovators. New fit-for-purpose tools and the expansion of partnerships were the most frequently noted areas of progress. Discussion participants identified that building capacity for external/national actors to understand the needs of local communities can address a lack of partnerships while de-siloing information. A synthesis of themes demonstrated that gaps, progress, and needs commonly identified by these groups are relevant beyond COVID-19, highlighting the importance of a healthy data ecosystem to address emerging threats. This is supported by data holders prioritizing the availability and accessibility of their data without causing harm; funders and policymakers committed to integrating innovations with existing physical, data, and policy infrastructure; and innovators designing sustainable, multi-use solutions based on principles of good data governance…(More)”.
The Normative Challenges of AI in Outer Space: Law, Ethics, and the Realignment of Terrestrial Standards
Paper by Ugo Pagallo, Eleonora Bassi & Massimo Durante: “The paper examines the open problems that experts in space law will increasingly have to address over the next few years, according to four different sets of legal issues. Such differentiation sheds light on what is old and what is new in today’s troubles of space law, e.g., the privatization of space, vis-à-vis the challenges that AI raises in this field. Some AI challenges depend on its unique features, e.g., autonomy and opacity, and on how they affect pillars of the law, whether on Earth or in space missions. The paper insists on a further class of legal issues that AI systems raise, however, only in outer space. We should never overlook the constraints of a hazardous and hostile environment, such as on a mission between Mars and the Moon. The aim of this paper is to illustrate what is still mostly unexplored or in its infancy in this kind of research, namely, the fourfold ways in which the uniqueness of AI and that of outer space impact both ethical and legal standards. Such standards shall provide the thresholds of evaluation according to which courts and legislators assess the pros and cons of technology. Our claim is that a new generation of sui generis standards of space law, stricter or more flexible standards for AI systems in outer space, down to the “principle of equality” between human standards and robotic standards, will follow as a result of this twofold uniqueness of AI and of outer space…(More)”.
Protecting the integrity of survey research
Paper by Kathleen Hall Jamieson et al.: “Although polling is not irredeemably broken, changes in technology and society create challenges that, if not addressed well, can threaten the quality of election polls and other important surveys on topics such as the economy. This essay describes some of these challenges and recommends remediations to protect the integrity of all kinds of survey research, including election polls. These 12 recommendations specify ways that survey researchers, and those who use polls and other public-oriented surveys, can increase the accuracy and trustworthiness of their data and analyses. Many of these recommendations align practice with the scientific norms of transparency, clarity, and self-correction. The transparency recommendations focus on improving disclosure of factors that affect the nature and quality of survey data. The clarity recommendations call for more precise use of terms such as “representative sample” and clear description of survey attributes that can affect accuracy. The recommendation about correcting the record urges the creation of a publicly available, professionally curated archive of identified technical problems and their remedies. The paper also calls for development of better benchmarks and for additional research on the effects of panel conditioning. Finally, the authors suggest ways to help people who want to use or learn from survey research understand the strengths and limitations of surveys and distinguish legitimate and problematic uses of these methods…(More)”.
When Concerned People Produce Environmental Information: A Need to Re-Think Existing Legal Frameworks and Governance Models?
Paper by Anna Berti Suman, Mara Balestrini, Muki Haklay, and Sven Schade: “When faced with an environmental problem, locals are often among the first to act. Citizen science is increasingly one of the forms of participation in which people take action to help solve environmental problems that concern them. This implies, for example, using methods and instruments with scientific validity to collect and analyse data and evidence to understand the problem and its causes. Can the contribution of environmental data by citizens be articulated as a right? In this article, we explore these forms of productive engagement with a local matter of concern, focussing on their potential to challenge traditional allocations of responsibilities. Taking mostly the perspective of the European legal context, we identify an existing gap between the right to obtain environmental information, granted at present by the Aarhus Convention, and “a right to contribute information” and have that information considered by appointed institutions. We also explore what would be required to effectively practise this right in terms of legal and governance processes, capacities, and infrastructures, and we propose a flexible framework to implement it. Situated at the intersection of legal and governance studies, this article builds on existing literature on environmental citizen science, and on its interplay with law and governance. Our methodological approach combines literature review with legal analysis of the relevant conventions and national rules. We conclude by reflecting on the implications of our analysis, and on the benefits of this legal innovation, potentially fostering data altruism and an active citizenship, and shielding ordinary people against possible legal risks…(More)”.
The wisdom of crowds for improved disaster resilience: a near-real-time analysis of crowdsourced social media data on the 2021 flood in Germany
Paper by Mahsa Moghadas, Alexander Fekete, Abbas Rajabifard & Theo Kötter: “Transformative disaster resilience in times of climate change underscores the importance of reflexive governance, facilitation of socio-technical advancement, co-creation of knowledge, and innovative and bottom-up approaches. However, implementing these capacity-building processes by relying on census-based datasets and nomothetic (or top-down) approaches remains challenging for many jurisdictions. Web 2.0 knowledge sharing via online social networks, by contrast, provides a unique opportunity and valuable data sources to complement existing approaches, understand dynamics within large communities of individuals, and incorporate collective intelligence into disaster resilience studies. Using Twitter data (passive crowdsourcing) and an online survey, this study draws on the wisdom of crowds and public judgment across near-real-time disaster phases when the flood disaster hit Germany in July 2021. Latent Dirichlet Allocation, an unsupervised machine learning technique for topic modeling, was applied to the corpora of the two data sources to identify topics associated with different disaster phases. In addition to semantic (textual) analysis, spatiotemporal patterns of online disaster communication were analyzed to determine the contribution patterns associated with the affected areas. Finally, the extracted topics discussed online were compiled into five themes related to disaster resilience capacities (preventive, anticipative, absorptive, adaptive, and transformative). The near-real-time collective sensing approach reflected optimized diversity and a spectrum of people’s experiences and knowledge regarding flooding disasters and highlighted communities’ sociocultural characteristics. This bottom-up approach could be an innovative alternative to traditional participatory techniques of organizing meetings and workshops for situational analysis, enabling timely tracking of such events at a fraction of the cost and informing disaster resilience initiatives…(More)”.
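For readers unfamiliar with the topic-modeling step, the following is a minimal sketch of fitting Latent Dirichlet Allocation to a small tweet-like corpus with scikit-learn and inspecting the top terms per topic. The placeholder documents, the English stop-word list, and the number of topics are assumptions made for the example; they are not the corpus, preprocessing, or settings used in the study.

```python
# Minimal LDA sketch (illustrative corpus and settings, not the study's pipeline).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "flood warning issued for the river valley tonight",
    "volunteers needed to help clear debris after the flooding",
    "donations collected for families who lost their homes in the flood",
]

# Document-term matrix; the real corpus would need tweet-specific cleaning and German stop words.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(tweets)

# Fit LDA with an arbitrarily chosen number of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

# Print the top terms of each topic so topics can be mapped to disaster phases or resilience themes.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top_terms)}")
```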
China Data Flows and Power in the Era of Chinese Big Tech
Paper by W. Gregory Voss and Emmanuel Pernot-Leplay: “Personal data are of great economic interest today, and their possession and control are the object of geopolitics, leading to their regulation by means that vary depending on the strategic objectives of the jurisdiction considered. This study fills a gap in the literature in this area by analyzing holistically the regulation of personal data flows both into and from China, the world’s second-largest economy. In doing so, it focuses on laws and regulations of three major power blocs: the United States, the European Union, and China, seen within the framework of geopolitics, and considering the rise of Chinese big tech.
First, this study analyzes ways that the United States—the champion of the free flow of data that has helped feed the success of the Silicon Valley system—has in specific cases prevented data flows to China on grounds of individual data protection and national security. The danger of this approach and the alternative of protection through potential U.S. federal data privacy legislation are discussed. Second, the cross-border data flow restriction of the European Union’s General Data Protection Regulation (GDPR) is studied in the context of data exports to China, including where the data transit via the United States prior to their transfer to China. Next, after review of the conditions for a European Commission adequacy determination and an examination of recent data privacy legislation in China, the authors provide a preliminary negative assessment of the potential for such a determination for China, where government access is an important part of the picture. Difficult points are highlighted for investigation by data exporters to China relying on EU transfer mechanisms following the Schrems II jurisprudence.
Finally, recent Chinese regulations establishing requirements for the export of data are studied. In this exercise, light is shed on compliance requirements for companies under Chinese law, provisions of Chinese data transfer regulations that are similar to those of the GDPR, and aspects that show China’s own approach to restrictions on data transfers, such as an emphasis on national security protection. This study concludes with the observation that restrictions on data flows both into and out of China will continue and potentially be amplified, and economic actors will need to prepare themselves to navigate the relevant regulations examined in this study….(More)”.
The pandemic veneer: COVID-19 research as a mobilisation of collective intelligence by the global research community
Paper by Daniel W Hook and James R Wilsdon: “The global research community responded with speed and at scale to the emergence of COVID-19, with around 4.6% of all research outputs in 2020 related to the pandemic. That share almost doubled through 2021, to reach 8.6% of research outputs. This reflects a dramatic mobilisation of global collective intelligence in the face of a crisis. It also raises fundamental questions about the funding, organisation and operation of research. In this Perspective article, we present data suggesting that COVID-19 research reflects the characteristics of the underlying networks from which it emerged, and on which it built. The infrastructures on which COVID-19 research has relied – including highly skilled, flexible research capacity and collaborative networks – predated the pandemic, and are the product of sustained, long-term investment. As such, we argue that COVID-19 research should not be viewed as a distinct field, or a one-off response to a specific crisis, but as a ‘pandemic veneer’ layered on top of longstanding interdisciplinary networks, capabilities and structures. These infrastructures of collective intelligence need to be better understood, valued and sustained as crucial elements of future pandemic or crisis response…(More)”.