Open peer-review platform for COVID-19 preprints


Michael A. Johansson & Daniela Saderi in Nature: “The public call for rapid sharing of research data relevant to the COVID-19 outbreak (see go.nature.com/2t1lyp6) is driving an unprecedented surge in (unrefereed) preprints. To help pinpoint the most important research, we have launched Outbreak Science Rapid PREreview, with support from the London-based charity Wellcome. This is an open-source platform for rapid review of preprints related to emerging outbreaks (see https://outbreaksci.prereview.org).

These reviews comprise responses to short, yes-or-no questions, with optional commenting. The questions are designed to capture structured, high-level input on the importance and quality of the research, which can be aggregated across several reviews. Scientists who have ORCID IDs can submit their reviews as they read the preprints (currently limited to the medRxiv, bioRxiv and arXiv repositories). The reviews are open and can be submitted anonymously.
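
To make the aggregation idea concrete, here is a minimal Python sketch of how short yes-or-no responses could be tallied across several reviews of the same preprint. It is an illustration only: the field names, question labels and preprint identifier are assumptions, not the platform's actual schema or API.

```python
from dataclasses import dataclass

# Illustrative model of a structured rapid review; the question labels and
# field names are hypothetical, not Outbreak Science Rapid PREreview's schema.
@dataclass
class RapidReview:
    preprint_id: str   # e.g. a DOI or repository identifier
    answers: dict      # question label -> True ("yes") / False ("no")
    comment: str = ""  # optional free-text comment

def aggregate(reviews):
    """Return, for each question, the fraction of reviewers answering 'yes'."""
    yes, total = {}, {}
    for review in reviews:
        for question, answer in review.answers.items():
            total[question] = total.get(question, 0) + 1
            yes[question] = yes.get(question, 0) + (1 if answer else 0)
    return {q: yes[q] / total[q] for q in total}

reviews = [
    RapidReview("example-preprint", {"data_available": True, "methods_sound": True}),
    RapidReview("example-preprint", {"data_available": False, "methods_sound": True}),
]
print(aggregate(reviews))  # {'data_available': 0.5, 'methods_sound': 1.0}
```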

Outbreaks of pathogens such as the SARS-CoV-2 coronavirus that is responsible for COVID-19 move fast and can affect anyone. Research to support outbreak response needs to be fast and open, too, as do mechanisms to review outbreak-related research. Help other scientists, as well as the media, journals and public-health officials, to find the most important COVID-19 preprints now….(More)”.

The Economics of Maps


Abhishek Nagaraj and Scott Stern in the Journal of Economic Perspectives: “For centuries, maps have codified the extent of human geographic knowledge and shaped discovery and economic decision-making. Economists across many fields, including urban economics, public finance, political economy, and economic geography, have long employed maps, yet have largely abstracted away from exploring the economic determinants and consequences of maps as a subject of independent study. In this essay, we first review and unify recent literature in a variety of different fields that highlights the economic and social consequences of maps, along with an overview of the modern geospatial industry. We then outline our economic framework in which a given map is the result of economic choices around map data and designs, resulting in variations in private and social returns to mapmaking. We highlight five important economic and institutional factors shaping mapmakers’ data and design choices. Our essay ends by proposing that economists pay more attention to the endogeneity of mapmaking and the resulting consequences for economic and social welfare…(More)”.

Corporate Capitalism's Use of Openness: Profit for Free?


Book by Arwid Lund and Mariano Zukerfeld: “This book tackles the concept of openness (as in open source software, open access and free culture) from a critical political economy perspective, to consider its encroachment by capitalist corporations, but also how it advances radical alternatives to cognitive capitalism.

Drawing on four case studies, Corporate Capitalism’s Use of Openness will add to the discussion on open source software, open access content platforms, open access publishing, and open university courses. These otherwise disparate cases share two fundamental features: informational capitalist corporations base their successful business models on unpaid productive activities (play, attention, knowledge and labour), and they do so crucially by resorting to ideological uses of concepts such as “openness”, “communities” and “sharing”.

The authors present potential solutions and alternative regulations to counter these exploitative and alienating business models, and to foster digital knowledge commons, ranging from co-ops and commons-based peer production to state agencies’ platforms. Their research and findings will appeal to students, academics and activists around the world in fields such as sociology, economics, media and communication, library and information science, political science and technology studies….(More)”.

Automation in Moderation


Article by Hannah Bloch-Wehba: “This Article assesses recent efforts to compel or encourage online platforms to use automated means to prevent the dissemination of unlawful online content before it is ever seen or distributed. As lawmakers in Europe and around the world closely scrutinize platforms’ “content moderation” practices, automation and artificial intelligence appear increasingly attractive options for ridding the Internet of many kinds of harmful online content, including defamation, copyright infringement, and terrorist speech. Proponents of these initiatives suggest that requiring platforms to screen user content using automation will promote healthier online discourse and will aid efforts to limit Big Tech’s power.
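
As a purely illustrative sketch of what such ex ante screening means in practice, the Python below hashes each upload and checks it against a blocklist before anything is published. Real moderation systems rely on perceptual hashing and machine-learning classifiers rather than exact matching; this is only a sketch of the control point the article describes, not any platform's actual pipeline, and the blocklist contents are invented.

```python
import hashlib

# Hypothetical blocklist of hashes of previously identified unlawful content.
BLOCKLIST = {hashlib.sha256(b"previously identified unlawful file").hexdigest()}

def screen_upload(content: bytes) -> bool:
    """Return True if the upload may be published, False if it is blocked
    before distribution (the 'ex ante' moment the article focuses on)."""
    return hashlib.sha256(content).hexdigest() not in BLOCKLIST

print(screen_upload(b"an ordinary user post"))                # True: published
print(screen_upload(b"previously identified unlawful file"))  # False: blocked
```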

In fact, however, the regulations that incentivize platforms to use automation in content moderation come with unappreciated costs for civil liberties and unexpected benefits for platforms. The new automation techniques exacerbate existing risks to free speech and user privacy and create ripe new sources of information for surveillance, aggravating threats to free expression, associational rights, religious freedoms, and equality. Automation also worsens transparency and accountability deficits. Far from curtailing private power, the new regulations endorse and expand platform authority to police online speech, with little in the way of oversight and few countervailing checks. New regulations of online intermediaries should therefore incorporate checks on the use of automation to avoid exacerbating these dynamics. Carefully drawn transparency obligations, algorithmic accountability mechanisms, and procedural safeguards can help to ameliorate the effects of these regulations on users and competition…(More)”.

Many Tech Experts Say Digital Disruption Will Hurt Democracy


Lee Rainie and Janna Anderson at Pew Research Center: “The years of almost unfettered enthusiasm about the benefits of the internet have been followed by a period of techlash as users worry about the actors who exploit the speed, reach and complexity of the internet for harmful purposes. Over the past four years – a time of the Brexit decision in the United Kingdom, the American presidential election and a variety of other elections – the digital disruption of democracy has been a leading concern.

The hunt for remedies is at an early stage. Resistance to American-based big tech firms is increasingly evident, and some tech pioneers have joined the chorus. Governments are actively investigating technology firms, and some tech firms themselves are requesting government regulation. Additionally, nonprofit organizations and foundations are directing resources toward finding the best strategies for coping with the harmful effects of disruption. For example, the Knight Foundation announced in 2019 that it is awarding $50 million in grants to encourage the development of a new field of research centered on technology’s impact on democracy.

In light of this furor, Pew Research Center and Elon University’s Imagining the Internet Center canvassed technology experts in the summer of 2019 to gain their insights about the potential future effects of people’s use of technology on democracy….

The main themes found in an analysis of the experts’ comments are outlined in the next two tables….(More)”.

Can Technology Support Democracy?


Essay by Douglas Schuler: “The utopian optimism about democracy and the internet has given way to disillusionment. At the same time, given the complexity of today’s wicked problems, the need for democracy is critical. Unfortunately, democracy is under attack around the world, and there are ominous signs of its retreat.

How does democracy fare when digital technology is added to the picture? Weaving technology and democracy together is risky, and technologists who begin any digital project with the conviction that technology can and will solve “problems” of democracy are likely to be disappointed. Technology can be a boon to democracy if it is informed technology.

The goal in writing this essay was to encourage people to help develop and cultivate a rich democratic sphere. Democracy has great potential that it rarely achieves. It is radical, critical, complex, and fragile. It takes different forms in different contexts. These forms are complex, and the solutionism promoted by the computer industry and others is not appropriate for democracies. The primary aim of technology in the service of democracy is not merely to make it easier or more convenient, but to improve society’s civic intelligence: its ability to address the problems it faces effectively and equitably….(More)”.

Beyond Takedown: Expanding the Toolkit for Responding to Online Hate


Paper by Molly K. Land and Rebecca J. Hamilton: “The current preoccupation with ‘fake news’ has spurred a renewed emphasis in popular discourse on the potential harms of speech. In the world of international law, however, ‘fake news’ is far from new. Propaganda of various sorts is a well-worn tactic of governments, and in its most insidious form, it has played an instrumental role in inciting and enabling some of the worst atrocities of our time. Yet as familiar as propaganda might be in theory, it is raising new issues as it has migrated to the digital realm. Technological developments have largely outpaced existing legal and political tools for responding to the use of mass communications devices to instigate or perpetrate human rights violations.

This chapter evaluates the current practices of social media companies for responding to online hate, arguing that they are inevitably both overbroad and under-inclusive. Using the example of the role played by Facebook in the recent genocide against the minority Muslim Rohingya population in Myanmar, the chapter illustrates the failure of platform hate speech policies to address pervasive and coordinated online speech, often state-sponsored or state-aligned, that denigrates a particular group and is used to justify or foster impunity for violence against that group. Addressing this “conditioning speech” requires a more tailored response that includes remedies other than content removal and account suspensions. The chapter concludes by surveying a range of innovative responses to harmful online content that would give social media platforms the flexibility to intervene earlier, but with a much lighter touch….(More)”.

Nudge Theory and Decision Making: Enabling People to Make Better Choices


Chapter by Vikramsinh Amarsinh Patil: “This chapter examines the theoretical underpinnings of nudge theory and makes a case for incorporating nudging into the decision-making process in corporate contexts. Nudging and, more broadly, behavioural economics have become buzzwords on account of the seminal work done by economists and the highly publicized interventions employed by governments to support national priorities. Firms are not to be left behind, however. What follows is extensive documentation of firms that have successfully employed nudging techniques. The examples are segmented by nudge recipient: managers, employees, and consumers. Firms can guide managers to become better leaders, employees to become more productive, and consumers to stay loyal. However, nudging is not without its pitfalls. It can be used towards nefarious ends and can be notoriously difficult to implement and execute. Therefore, nudges should be rigorously tested via experimentation and should be ethically sound….(More)”.
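
As a hedged illustration of that last point, the sketch below shows one simple way a firm might evaluate a nudge through a randomized experiment: a two-sided z-test comparing an enrolment rate in a control group against a nudged group. The scenario and numbers are invented for illustration, and the test shown is one common option, not a method prescribed by the chapter.

```python
from math import sqrt, erfc

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in rates between a control
    group (a) and a nudged group (b)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value
    return p_b - p_a, z, p_value

# Invented numbers: 1,000 employees see the standard enrolment form (control),
# 1,000 see the nudged version (e.g. enrolment pre-selected by default).
lift, z, p = two_proportion_ztest(success_a=420, n_a=1000,
                                  success_b=470, n_b=1000)
print(f"lift={lift:.3f}, z={z:.2f}, p={p:.4f}")
```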

Smart Urban Development


Open Access Book edited by Vito Bobek: “Debates about the future of urban development in many countries have been increasingly influenced by discussions of smart cities. Despite numerous examples of this “urban labelling” phenomenon, we know surprisingly little about so-called smart cities. This book provides a preliminary critical discussion of some of the more important aspects of smart cities. Its primary focus is on the experience of some designated smart cities, with a view to problematizing a range of elements that supposedly characterize this new urban form. It also questions some of the underlying assumptions and contradictions hidden within the concept….(More)”.

Irreproducibility is not a sign of failure, but an inspiration for fresh ideas


Editorial at Nature: “Everyone’s talking about reproducibility — or at least they are in the biomedical and social sciences. The past decade has seen a growing recognition that results must be independently replicated before they can be accepted as true.

A focus on reproducibility is necessary in the physical sciences, too — an issue explored in this month’s Nature Physics, in which two metrologists argue that reproducibility should be viewed through a different lens. When results in the science of measurement cannot be reproduced, argue Martin Milton and Antonio Possolo, it’s a sign of the scientific method at work — and an opportunity to promote public awareness of the research process (M. J. T. Milton and A. Possolo Nature Phys. 16, 117–119; 2020)….

However, despite numerous experiments spanning three centuries, the precise value of G, the Newtonian constant of gravitation, remains uncertain. The root of the uncertainty is not fully understood: it could be due to undiscovered errors in how the value is being measured, or it could indicate the need for new physics. One scenario being explored is that G could even vary over time, in which case scientists might have to revise their view that it has a fixed value.

If that were to happen — although physicists think it unlikely — it would be a good example of non-reproduced data being subjected to the scientific process: experimental results questioning a long-held theory, or pointing to the existence of another theory altogether.
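
One way metrologists make that judgement concrete is to pool the measurements and check whether they scatter more than their stated uncertainties allow. The sketch below computes an inverse-variance weighted mean and a Birge ratio; a ratio well above 1 points either to unrecognised measurement errors or to something missing in the underlying model, the two scenarios described above. The numbers are invented to mimic the pattern of discrepant G results and do not correspond to any specific published experiments.

```python
from math import sqrt

def weighted_mean_and_birge(values, sigmas):
    """Inverse-variance weighted mean and Birge ratio of a set of results."""
    weights = [1 / s ** 2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    chi2 = sum(((v - mean) / s) ** 2 for v, s in zip(values, sigmas))
    return mean, sqrt(chi2 / (len(values) - 1))

# Invented values in units of 1e-11 m^3 kg^-1 s^-2, mimicking (not quoting)
# the mutually inconsistent G measurements reported over the years.
values = [6.6740, 6.6756, 6.6742, 6.6749]
sigmas = [0.0002, 0.0002, 0.0003, 0.0002]
mean, birge = weighted_mean_and_birge(values, sigmas)
print(f"weighted mean = {mean:.4f}, Birge ratio = {birge:.2f}")  # ratio >> 1
```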

Questions in biomedicine and in the social sciences do not reduce so cleanly to the determination of a fundamental constant of nature. Compared with metrology, experiments to reproduce results in fields such as cancer biology are likely to include many more sources of variability, which are fiendishly hard to control for.

But metrology reminds us that when researchers attempt to reproduce the results of experiments, they do so using a set of agreed — and highly precise — experimental standards, known in the measurement field as metrological traceability. It is this aspect, the authors contend, that helps to build trust and confidence in the research process….(More)”.