What Big Tech Knows About Your Body


Article by Yael Grauer: “If you were seeking online therapy from 2017 to 2021—and a lot of people were—chances are good that you found your way to BetterHelp, which today describes itself as the world’s largest online-therapy purveyor, with more than 2 million users. Once you were there, after a few clicks, you would have completed a form—an intake questionnaire, not unlike the paper one you’d fill out at any therapist’s office: Are you new to therapy? Are you taking any medications? Having problems with intimacy? Experiencing overwhelming sadness? Thinking of hurting yourself? BetterHelp would have asked you if you were religious, if you were LGBTQ, if you were a teenager. These questions were just meant to match you with the best counselor for your needs, small text would have assured you. Your information would remain private.

Except BetterHelp isn’t exactly a therapist’s office, and your information may not have been completely private. In fact, according to a complaint brought by federal regulators, for years, BetterHelp was sharing user data—including email addresses, IP addresses, and questionnaire answers—with third parties, including Facebook and Snapchat, for the purposes of targeting ads for its services. It was also, according to the Federal Trade Commission, poorly regulating what those third parties did with users’ data once they got them. In July, the company finalized a settlement with the FTC and agreed to refund $7.8 million to consumers whose privacy regulators claimed had been compromised. (In a statement, BetterHelp admitted no wrongdoing and described the alleged sharing of user information as an “industry-standard practice.”)

We leave digital traces about our health everywhere we go: by completing forms like BetterHelp’s. By requesting a prescription refill online. By clicking on a link. By asking a search engine about dosages or directions to a clinic or pain in chest dying. By shopping, online or off. By participating in consumer genetic testing. By stepping on a smart scale or using a smart thermometer. By joining a Facebook group or a Discord server for people with a certain medical condition. By using internet-connected exercise equipment. By using an app or a service to count your steps or track your menstrual cycle or log your workouts. Even demographic and financial data unrelated to health can be aggregated and analyzed to reveal or infer sensitive information about people’s physical or mental-health conditions…(More)”.

From Print to Pixels: The Changing Landscape of the Public Sphere in the Digital Age


Paper by Taha Yasseri: “This Mini Review explores the evolution of the public sphere in the digital age. The public sphere is a social space where individuals come together to exchange opinions, discuss public affairs, and engage in collective decision-making. It is considered a defining feature of modern democratic societies, allowing citizens to participate in public life and promoting transparency and accountability in the political process. This Mini Review discusses the changes and challenges faced by the public sphere in recent years, particularly with the advent of new communication technologies such as the Internet and social media. We highlight benefits such as a) an increase in political participation, b) facilitation of collective action, c) real-time spread of information, and d) democratization of information exchange; and harms such as a) increasing polarization of public discourse, b) the spread of misinformation, and c) the manipulation of public opinion by state and non-state actors. The discussion will conclude with an assessment of the digital age public sphere in established democracies like the US and the UK…(More)”.

On the culture of open access: the Sci-hub paradox


Paper by Abdelghani Maddi and David Sapinho: “Shadow libraries, also known as “pirate libraries”, are online collections of copyrighted publications that have been made available for free without the permission of the copyright holders. They have gradually become key players in scientific knowledge dissemination, despite their illegality in most countries of the world. Many publishers and scientist-editors decry such libraries for their copyright infringement and loss of publication usage information, while some scholars and institutions support them, sometimes in a roundabout way, for their role in reducing inequalities of access to knowledge, particularly in low-income countries. Although there is a wealth of literature on shadow libraries, none of it has focused on their potential role in knowledge dissemination through the open access movement. Here we analyze how shadow libraries can affect researchers’ citation practices, highlighting some counter-intuitive findings about their impact on the Open Access Citation Advantage (OACA). Based on a large randomized sample, this study first shows that OA publications, including those in fully OA journals, receive more citations than their subscription-based counterparts do. However, the OACA has slightly decreased over the last seven years. Distinguishing, among subscription-based publications, between those that are and are not accessible via the Sci-hub platform suggests that the generalization of its use cancels out the positive effect of OA publishing. The results show that publications in fully OA journals are victims of the success of Sci-hub. Thus, paradoxically, although Sci-hub may seem to facilitate access to scientific knowledge, it negatively affects the OA movement as a whole by reducing the comparative advantage of OA publications in terms of visibility for researchers. The democratization of the use of Sci-hub may therefore lead to a vicious cycle, hindering efforts to develop full OA strategies without proposing a credible and sustainable alternative model for the dissemination of scientific knowledge…(More)”.

The Forgotten “Emerging” Technology


Report by Michael Garcia: “The widespread deployment of 5G devices in the United States will spur widespread use of augmented reality, virtual reality, and mixed reality applications—collectively known as extended reality. The over-commercialization of the term “metaverse” has impeded honest conversations about the implications of an insecure metaverse and the technologies associated with it. While these applications and devices will bring significant benefits, they will be accompanied by numerous cybersecurity challenges. As a result, U.S. policymakers risk repeating past mistakes: failing to secure technology before it ushers in a new era of national security concerns. The U.S. government must work closely with industry, academia, nonprofits, and international partners to begin thinking about these consequential issues…(More)”.

Missing Persons: The Case of National AI Strategies


Article by Susan Ariel Aaronson and Adam Zable: “Policy makers should inform, consult and involve citizens as part of their efforts to govern data-driven technologies such as artificial intelligence (AI). Although many users rely on AI systems, they do not understand how these systems use their data to make predictions and recommendations that can affect their daily lives. Over time, if they see their data being misused, users may learn to distrust both the systems and how policy makers regulate them. This paper examines whether officials informed and consulted their citizens as they developed a key aspect of AI policy — national AI strategies. Building on a data set of 68 countries and the European Union, the authors used qualitative methods to examine whether, how and when governments engaged with their citizens on their AI strategies and whether they were responsive to public comment, concluding that policy makers are missing an opportunity to build trust in AI by not using this process to involve a broader cross-section of their constituents…(More)”.

Open Society Barometer: Can Democracy Deliver?


Open Society Foundations Report: “Between May and July of 2023, the Open Society Foundations commissioned a poll of more than 36,000 respondents from 30 countries to gauge the attitudes, concerns, and hopes of people in states with a collective population of over 5.5 billion—making it one of the largest studies of global public opinion on human rights and democracy ever conducted.

The polling, conducted by Savanta as well as local vendors in Ukraine, surveyed participants on questions about democracy and human rights, major issues facing their countries and the world, and international governance.

The report, Open Society Barometer: Can Democracy Deliver?, finds that young people around the world hold the least faith in democracy of any age group.  

While the findings suggest that the concept of democracy remains widely popular, and a vast majority want to live in a democratic state, people cited a number of serious concerns that affect their daily lives, from climate change and political violence to simply affording enough food to eat. At this critical turning point, the question becomes: can democracy deliver what people need most?…(More)”.

These Prisoners Are Training AI


Article by Morgan Meaker: “…Around the world, millions of so-called “clickworkers” train artificial intelligence models, teaching machines the difference between pedestrians and palm trees, or what combination of words describe violence or sexual abuse. Usually these workers are stationed in the global south, where wages are cheap. OpenAI, for example, uses an outsourcing firm that employs clickworkers in Kenya, Uganda, and India. That arrangement works for American companies, operating in the world’s most widely spoken language, English. But there are not a lot of people in the global south who speak Finnish.

That’s why Metroc turned to prison labor. The company gets cheap, Finnish-speaking workers, while the prison system can offer inmates employment that, it says, prepares them for the digital world of work after their release. Using prisoners to train AI creates uneasy parallels with the kind of low-paid and sometimes exploitive labor that has often existed downstream in technology. But in Finland, the project has received widespread support.

“There’s this global idea of what data labor is. And then there’s what happens in Finland, which is very different if you look at it closely,” says Tuukka Lehtiniemi, a researcher at the University of Helsinki, who has been studying data labor in Finnish prisons.

For four months, Marmalade has lived here, in Hämeenlinna prison. The building is modern, with big windows. Colorful artwork tries to enforce a sense of cheeriness on otherwise empty corridors. If it wasn’t for the heavy gray security doors blocking every entry and exit, these rooms could easily belong to a particularly soulless school or university complex.

Finland might be famous for its open prisons—where inmates can work or study in nearby towns—but this is not one of them. Instead, Hämeenlinna is the country’s highest-security institution housing exclusively female inmates. Marmalade has been sentenced to six years. Under privacy rules set by the prison, WIRED is not able to publish Marmalade’s real name, exact age, or any other information that could be used to identify her. But in a country where prisoners serving life terms can apply to be released after 12 years, six years is a heavy sentence. And like the other 100 inmates who live here, she is not allowed to leave…(More)”.

Artificial Intelligence, Climate Change and Innovative Democratic Governance


Paper by Florian Cortez: “This policy-oriented article explores the sustainability dimension of digitalisation and artificial intelligence (AI). While AI can contribute to halting climate change via targeted applications in specific domains, AI technology in general could also have detrimental effects on climate policy goals. Moreover, digitalisation and AI can have an indirect effect on climate policy via their impact on political processes. It will be argued that, if certain conditions are fulfilled, AI-facilitated digital tools could help with setting up frameworks for bottom-up citizen participation that could generate the legitimacy and popular buy-in required for the speedy transformations needed to reach net zero, such as radically revamping the energy infrastructure, among other crucial elements of the green transition. This could help ameliorate a potential dilemma of voice versus speed regarding the green transition. The article will further address the nexus between digital applications such as AI and climate justice. Finally, the article will consider whether innovative governance methods could instil new dynamism into the multi-level global climate regime, such as by facilitating interlinkages and integration between different levels. Before implementing innovative governance arrangements, it is crucial to assess whether they exacerbate old inequalities of access and participation or even generate new ones…(More)”

Open Science and Data Protection: Engaging Scientific and Legal Contexts


Editorial Paper of Special Issue edited by Ludovica Paseri: “This paper analyses the relationship between open science policies and data protection. In order to tackle the research data paradox of contemporary science, i.e., the tension between the pursuit of data-driven scientific research and the crisis of repeatability or reproducibility of science, a theoretical perspective suggests a potential convergence between open science and data protection. Both fields concern governance mechanisms that must take into account the plurality of interests at stake. The aim is to shed light on the processing of personal data for scientific research purposes in the context of open science. The investigation supports a threefold need: to broaden the legal debate; to expand the territorial scope of the analysis, in addition to the extra-territorial effects of the European Union’s law; and to foster an interdisciplinary discussion. Based on these needs, four perspectives are then identified that encompass the challenges related to data processing in the context of open science: (i) the contextual and epistemological perspectives; (ii) the legal coordination perspectives; (iii) the governance perspectives; and (iv) the technical perspectives…(More)”.

Initial policy considerations for generative artificial intelligence


OECD Report: “Generative artificial intelligence (AI) creates new content in response to prompts, offering transformative potential across multiple sectors such as education, entertainment, healthcare and scientific research. However, these technologies also pose critical societal and policy challenges that policy makers must confront: potential shifts in labour markets, copyright uncertainties, and risks associated with the perpetuation of societal biases and the potential for misuse in the creation of disinformation and manipulated content. Consequences could extend to the spreading of mis- and disinformation, the perpetuation of discrimination, the distortion of public discourse and markets, and the incitement of violence. Governments recognise the transformative impact of generative AI and are actively working to address these challenges. This paper aims to inform these policy considerations and support decision makers in addressing them…(More)”.