The promise and pitfalls of the metaverse for science


Paper by Diego Gómez-Zará, Peter Schiffer & Dashun Wang: “The future of the metaverse remains uncertain and continues to evolve, as was the case for many technological advances of the past. Now is the time for scientists, policymakers and research institutions to start considering actions to capture the potential of the metaverse and take concrete steps to avoid its pitfalls. Proactive investments in the form of competitive grants, internal agency efforts and infrastructure building should be considered, supporting innovation and adaptation to the future in which the metaverse may be more pervasive in society.

Government agencies and other research funders could also have a critical role in funding and promoting interoperability and shared protocols among different metaverse technologies and environments. These aspects will help the scientific research community to ensure broad adoption and reproducibility. For example, government research agencies may create an open and publicly accessible metaverse platform with open-source code and standard protocols that can be translated to commercial platforms as needed. In the USA, an agency such as the National Institute of Standards and Technology could set standards for protocols that are suitable for the research enterprise or, alternatively, an international convention could set global standards. Similarly, an agency such as the National Institutes of Health could leverage its extensive portfolio of behavioural research and build and maintain a metaverse for human subjects studies. Within such an ecosystem, researchers could develop and implement their own research protocols with appropriate protections, standardized and reproducible conditions, and secure data management. A publicly sponsored research-focused metaverse — which could be cross-compatible with commercial platforms — may create and capture substantial value for science, from augmenting scientific productivity to protecting research integrity.

There are important precedents for this sort of action in that governments and universities have built open repositories for data in fields such as astronomy and crystallography, and both the US National Science Foundation and the US Department of Energy have built and maintained high-performance computing environments that are available to the broader research community. Such efforts could be replicated and adapted for emerging metaverse technologies, which would be especially beneficial in enabling under-resourced institutions to access and leverage common resources. Critically, the encouragement of private sector innovation and the development of public–private alliances must be balanced with the need for interoperability, openness and accessibility to the broader research community…(More)”.

Best Practices for Disclosure and Citation When Using Artificial Intelligence Tools


Article by Mark Shope: “This article is intended to be a best practices guide for disclosing the use of artificial intelligence tools in legal writing. The article focuses on using artificial intelligence tools that aid in drafting textual material, specifically in law review articles and law school courses. The article’s approach to disclosure and citation is intended to be a starting point for authors, institutions, and academic communities to tailor based on their own established norms and philosophies. Throughout the entire article, the author has used ChatGPT to provide examples of how artificial intelligence tools can be used in writing and how the output of artificial intelligence tools can be expressed in text, including examples of how that use and text should be disclosed and cited. The article will also include policies for professors to use in their classrooms and journals to use in their submission guidelines…(More)”

The Social Side of Evidence-Based Policy


Comment by Adam Gamoran: “To Support Evidence-Based Policymaking, Bring Researchers and Policymakers Together,” by D. Max Crowley and J. Taylor Scott (Issues, Winter 2023), captures a simple truth: getting scientific evidence used in policy is about building relationships of trust between researchers and policymakers—the social side of evidence use. While the idea may seem obvious, it challenges prevailing notions of evidence-based policymaking, which typically rest on a logic akin to “if we build it, they will come.” In fact, the idea that producing high-quality evidence ensures its use is demonstrably false. Even when evidence is timely, relevant, and accessible, and even after researchers have filed their rigorous findings in a clearinghouse, the gap between evidence production and evidence use remains wide.

But how to build such relationships of trust? More than a decade of findings from research supported by the William T. Grant Foundation demonstrates the need for an infrastructure that supports evidence use. Such an infrastructure may involve new roles for staff within policy organizations to engage with research and researchers, as well as provision of resources that build their capacity to do so. For researchers, this infrastructure may involve committing to ongoing, mutual engagement with policymakers, in contrast with the traditional role of conveying written results or presenting findings without necessarily prioritizing policymakers’ concerns. Intermediary organizations such as funders and advocacy groups can play a key role in advancing the two-way streets through which researchers and policymakers can forge closer, more productive relationships…(More)”.

For chemists, the AI revolution has yet to happen


Editorial Team at Nature: “Many people are expressing fears that artificial intelligence (AI) has gone too far — or risks doing so. Take Geoffrey Hinton, a prominent figure in AI, who recently resigned from his position at Google, citing the desire to speak out about the technology’s potential risks to society and human well-being.

But against those big-picture concerns, in many areas of science you will hear a different frustration being expressed more quietly: that AI has not yet gone far enough. One of those areas is chemistry, for which machine-learning tools promise a revolution in the way researchers seek and synthesize useful new substances. But a wholesale revolution has yet to happen — because of the lack of data available to feed hungry AI systems.

Any AI system is only as good as the data it is trained on. These systems rely on what are called neural networks, which their developers teach using training data sets that must be large, reliable and free of bias. If chemists want to harness the full potential of generative-AI tools, they need to help to establish such training data sets. More data are needed — both experimental and simulated — including historical data and otherwise obscure knowledge, such as that from unsuccessful experiments. And researchers must ensure that the resulting information is accessible. This task is still very much a work in progress…(More)”.
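The editorial’s point that a system is only as good as its training data can be made concrete with a deliberately simple sketch, assuming synthetic stand-in data rather than real chemistry: a small neural-network regressor is trained on “descriptor” datasets of different sizes and noise levels, and its held-out accuracy compared. Every name and number below is an illustrative assumption, not drawn from the editorial.

```python
# A minimal, hypothetical sketch (not from the Nature editorial): the same small
# neural network is trained on a tiny, noisy dataset and on a larger, cleaner one,
# to illustrate that model quality tracks the quantity and quality of training data.
# The "descriptors" and "property" are synthetic stand-ins, not real chemical data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

def make_dataset(n_samples, noise):
    """Generate synthetic 'molecular descriptors' X and a target 'property' y."""
    X = rng.normal(size=(n_samples, 5))                 # five numeric descriptors
    y = 2.0 * X[:, 0] - X[:, 1] ** 2 + rng.normal(scale=noise, size=n_samples)
    return X, y

for n, noise in [(100, 1.0), (10_000, 0.1)]:            # small/noisy vs. large/clean
    X, y = make_dataset(n, noise)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    print(f"n={n:>6}, noise={noise}: test R^2 = {r2_score(y_te, model.predict(X_te)):.2f}")
```

The comparison is meant only to make the data-dependence concrete; curating real chemical training sets, including results from unsuccessful experiments, raises issues of bias, coverage, and accessibility that synthetic data cannot capture.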

Digital Anthropology Meets Data Science


Article by Katie Hillier: “Analyzing online ecosystems in real time, teams of anthropologists and data scientists can begin to understand rapid social changes as they happen.

Ask not what data science can do for anthropology, but what anthropology can do for data science. —Anders Kristian Munk, Why the World Needs Anthropologists Symposium 2022

In the last decade, emerging technologies, such as AI, immersive realities, and new and more addictive social networks, have permeated almost every aspect of our lives. These innovations are influencing how we form identities and belief systems. Social media influences the rise of subcultures on TikTok, the communications of extremist communities on Telegram, and the rapid spread of conspiracy theories that bounce around various online echo chambers. 

People with shared values or experiences can connect and form online cultures at unprecedented scales and speeds. But these new cultures are evolving and shifting faster than our current ability to understand them. 

To keep up with the depth and speed of online transformations, digital anthropologists are teaming up with data scientists to develop interdisciplinary methods and tools to bring the deep cultural context of anthropology to scales available only through data science—producing a surge in innovative methodologies for more effectively decoding online cultures in real time…(More)”.

Enhancing Trust in Science and Democracy in an Age of Misinformation 


Article by Marcia McNutt and Michael Crow: “Therefore, we believe the scientific community must more fully embrace its vital role in producing and disseminating knowledge in democratic societies. In Science in a Democratic Society, philosopher Philip Kitcher reminds us that “science should be shaped to promote democratic ideals.” To produce outcomes that advance the public good, scientists must also assess the moral bases of their pursuits. Although the United States has implemented the democratically driven, publicly engaged, scientific culture that Vannevar Bush outlined in Science, the Endless Frontier in 1945, Kitcher’s moral message remains relevant to both conducting science and communicating the results to the public, which pays for much of the enterprise of scientific discovery and technological innovation. It’s on scientists to articulate the moral and public values of the knowledge that they produce in ways that can be understood by citizens and decisionmakers.

However, by organizing themselves largely into groups that rarely reach beyond their own disciplines and by becoming somewhat disconnected from their fellow citizens and from the values of society, many scientists have become less effective than will be necessary in the future. Scientific culture has often left informing or educating the public to other parties such as science teachers, journalists, storytellers, and filmmakers. Instead, scientists principally share the results of their research within the narrow confines of academic and disciplinary journals…(More)”.

Behavioral Economics: Policy Impact and Future Directions


Report from the National Academies of Sciences, Engineering, and Medicine: “Behavioral economics – a field based in collaborations among economists and psychologists – focuses on integrating a nuanced understanding of behavior into models of decision-making. Since the mid-20th century, this growing field has produced research in numerous domains and has influenced policymaking, research, and marketing. However, little has been done to assess these contributions and review evidence of their use in the policy arena.

Behavioral Economics: Policy Impact and Future Directions examines the evidence for behavioral economics and its application in six public policy domains: health, retirement benefits, social safety net benefits, climate change, education, and criminal justice. The report concludes that the principles of behavioral economics are indispensable for the design of policy and recommends integrating behavioral specialists into policy development within government units. In addition, the report calls for strengthening research methodology and identifies research priorities for building on the accomplishments of the field to date…(More)”.

A Genealogy of Open


Paper by Betsy Yoon: “The term open has become a familiar part of library and education practice and discourse, with open source software being a common referent. However, the conditions surrounding the emergence of the open source movement are not well understood within librarianship. After identifying capitalism and neoliberalism as structures that shape library and open practice, this article contextualizes the term open by delineating the discursive struggle within the free software movement that led to the emergence of the open source movement. An understanding of the genealogy of open can lend clarity to many of the contradictions that have been grappled with in the literature, such as what open means, whether it supports social justice aims, and its relation to neoliberal and capitalist structures. The article concludes by inquiring into how librarianship and open can reframe practices that are typically oriented toward mitigation and survival to encompass an orientation toward life and flourishing…(More)”.

Soft power, hard choices: Science diplomacy and the race for solutions


Article by Stephan Kuster and Marga Gual Soler: “…Global challenges demand that we build consensus for action. But reaching agreement on how – and even if – science and technology should be applied, for the aggregate benefit of all, is complex, and increasingly so.

Science and technology are tightly intertwined with fast-changing economic, geopolitical, and ideological agendas. That pace of change complicates, and sometimes diverts, the discussions and decisions that could unlock the positive global impact of scientific advances.

Therefore, anticipation is key. Understanding the societal, economic, and geopolitical consequences of emerging and possible new technologies before they are deployed is critical. Just recently, for example, artificial intelligence (AI) labs have been urged by a large number of researchers and leading industry figures to pause the training of powerful AI systems, given the inherent risks to society and humanity’s existence.

Indeed, the rapid pace of scientific development calls for more effective global governance when it comes to emerging technology. That in turn requires better anticipatory tools and new mechanisms to embed the science community as a key stakeholder and influencer in this work.

The Geneva Science and Diplomacy Anticipator (GESDA) was created with those goals in mind. GESDA identifies the most significant science breakthroughs in the next five, 10, and 25 years. It assesses those advances with the potential to most profoundly impact people, society, and the planet. It then brings together scientific and policy leaders from around the world to devise the diplomatic envelopes and approaches needed to embrace these advances, while minimizing the downside risks of unintended consequences…(More)”.

Institutional review boards need new skills to review data sharing and management plans


Article by Vasiliki Rahimzadeh, Kimberley Serpico & Luke Gelinas: “New federal rules require researchers to submit plans for how to manage and share their scientific data, but institutional ethics boards may be underprepared to review them.

Data sharing is widely considered a conduit to scientific progress, the benefits of which should return to individuals and communities who invested in that science. This is the central premise underpinning changes recently announced by the US Office of Science and Technology Policy (OSTP)1 on sharing and managing data generated from federally funded research. Researchers will now be required to make publicly accessible any scholarly publications stemming from their federally funded research, as well as supporting data, according to the OSTP announcement. However, the attendant risks to individuals’ privacy-related interests and the increasing threat of community-based harms remain barriers to fostering a trustworthy ecosystem of biomedical data science.

Institutional review boards (IRBs) are responsible for ensuring protections for all human participants engaged in research, but they rarely include members with specialized expertise needed to effectively minimize data privacy and security risks. IRBs must be prepared to meet these review demands given the new data sharing policy changes. They will need additional resources to conduct quality and effective reviews of data management and sharing (DMS) plans. Practical ways forward include expanding IRB membership, proactively consulting with researchers, and creating new research compliance resources. This Comment will focus on data management and sharing oversight by IRBs in the US, but the globalization of data science research underscores the need for enhancing similar review capacities in data privacy, management and security worldwide…(More)”.