Science Diplomacy and the Rise of Technopoles


Article by Vaughan Turekian and Peter Gluckman: “…Science diplomacy has an important, even existential imperative to help the world reconsider the necessity of working together toward big global goals. Climate change may be the most obvious example of where global action is needed, but many other issues have similar characteristics—deep ocean resources, space, and other ungoverned areas, to name a few.

However, taking up this mantle requires acknowledging why past efforts have failed to meet their goals. The global commitment to Sustainable Development Goals (SDGs) is an example. Weaknesses in the UN system, compounded by varied commitments from member states, will prevent the achievement of the SDGs by 2030. This year’s UN Summit of the Future is intended to reboot the global commitment to the sustainability agenda. Regardless of what type of agreement is signed at the summit, its impact may be limited.  

The science community must play an active part in ensuring progress is in fact made, but that will require an expansion of the community’s current role. To understand what this might mean, consider that the Pact for the Future agreed in New York City in September 2024 places “science, technology, and innovation” as one of its five themes. But that becomes actionable either in the narrow sense that technology will provide “answers” to global problems or in the platitudinous sense that science provides advice that is not acted upon. This dichotomy of unacceptable approaches has long bedeviled science’s influence.

For the world to make better use of science, science must take on an expanded responsibility in solving problems at both global and local scales. And science itself must become part of a toolkit—both at the practical and the diplomatic level—to address the sorts of challenges the world will face in the future. To make this happen, more countries must make science diplomacy a core part of their agenda by embedding science advisors within foreign ministries, connecting diplomats to science communities.

As the pace of technological change generates both existential risk and economic, environmental, and social opportunities, science diplomacy has a vital task in balancing outcomes for the benefit of more people. It can also bring the science community (including the social sciences and humanities) to play a critical role alongside nation states. And, as new technological developments enable nonstate actors, and especially the private sector, science diplomacy has an important role to play in helping nation states develop policy that can identify common solutions and engage key partners…(More)”.

From Bits to Biology: A New Era of Biological Renaissance powered by AI


Article by Milad Alucozai: “…A new wave of platforms is emerging to address these limitations. Designed with the modern scientist in mind, these platforms prioritize intuitive interfaces, enabling researchers with diverse computational backgrounds to easily navigate and analyze data. They emphasize collaboration, allowing teams to share data and insights seamlessly. And they increasingly incorporate artificial intelligence, offering powerful tools for accelerating analysis and discovery. This shift marks a move towards more user-centric, efficient, and collaborative computational biology, empowering researchers to tackle increasingly complex biological questions. 

Emerging Platforms: 

  • Seqera Labs: Spearheading a movement towards efficient and reproducible research, Seqera Labs provides a suite of tools, including the popular open-source workflow language Nextflow. Their platform empowers researchers to design scalable and reproducible data analysis pipelines, particularly for cloud environments. By emphasizing automation and flexibility, Seqera streamlines complex computational workflows across diverse biological disciplines, making data-intensive research scalable and collaborative. 
  • Form Bio: Aimed at democratizing access to computational biology, Form Bio provides a comprehensive tech suite built to enable accelerated cell and gene therapy development and computational biology at scale. Its emphasis on collaboration and intuitive design fosters a more inclusive research environment to help organizations streamline therapeutic development and reduce time-to-market.  
  • Code Ocean: Addressing the critical need for reproducibility in research, Code Ocean provides a unique platform for sharing and executing research code, data, and computational environments. By encapsulating these elements in a portable and reproducible format, Code Ocean promotes transparency and facilitates the reuse of research methods, ultimately accelerating scientific discovery. 
  • Pluto Biosciences: Championing a collaborative approach to biological discovery, Pluto Biosciences offers an interactive platform for visualizing and analyzing complex biological data. Its intuitive tools empower researchers to explore data, generate insights, and seamlessly share findings with collaborators. This fosters a more dynamic and interactive research process, facilitating knowledge sharing and accelerating breakthroughs. 

 Open Source Platforms: 

  • Galaxy: A widely used open-source platform for bioinformatics analysis. It provides a user-friendly web interface and a vast collection of tools for various tasks, from sequence analysis to data visualization. Its open-source nature fosters community development and customization, making it a versatile tool for diverse research needs (a short sketch of programmatic access to Galaxy appears after this list). 
  • Bioconductor: A prominent open-source platform for bioinformatics analysis that shares Galaxy’s commitment to accessibility and community-driven development. Built on the R programming language, it provides a wealth of packages for tasks ranging from genomic data analysis to statistical modeling. Its open-source nature fosters a collaborative environment where researchers can freely access, use, and contribute to a growing collection of tools…(More)”
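
For readers curious what working with such a platform looks like programmatically, below is a minimal sketch using bioblend, the Python client library for the Galaxy API. The server URL and API key are placeholders, and the snippet assumes you have a Galaxy account with an API key; it is an illustration, not part of the article excerpted above.

```python
# Minimal sketch: querying a Galaxy server through its API with bioblend.
# Assumes `pip install bioblend` and an API key from your Galaxy account;
# the URL and key below are placeholders, not working credentials.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

# List the analysis histories associated with the account.
for history in gi.histories.get_histories():
    print(history["name"], history["id"])

# Browse a few of the tools the server exposes.
for tool in gi.tools.get_tools()[:10]:
    print(tool["name"])
```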

Mapmatics: A Mathematician’s Guide to Navigating the World


Book by Paulina Rowińska: “Why are coastlines and borders so difficult to measure? How does a UPS driver deliver hundreds of packages in a single day? And where do elusive serial killers hide? The answers lie in the crucial connection between maps and math.

In Mapmatics, mathematician Paulina Rowińska leads us on a riveting journey around the globe to discover how maps and math are deeply entwined, and always have been. From a sixteenth-century map, an indispensable navigation tool that exaggerates the size of northern countries, to public transport maps that both guide and confound passengers, to congressional maps that can empower or silence whole communities, she reveals how maps and math have shaped not only our sense of space but our worldview. In her hands, we learn how to read maps like a mathematician—to extract richer information and, just as importantly, to question our conclusions by asking what we don’t see…(More)”.
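
The “sixteenth-century map” here is presumably Mercator’s 1569 projection, and one line of math captures both why navigators prized it and why it exaggerates the north. As a sketch (not taken from the book), the projection sends longitude λ and latitude φ to:

```latex
x = R\lambda, \qquad y = R \ln \tan\!\left(\frac{\pi}{4} + \frac{\varphi}{2}\right),
\qquad \text{local scale factor} = \sec\varphi .
```

Because the stretching is the same in every direction, constant-compass-bearing routes plot as straight lines, which made the map indispensable at sea; but areas are inflated by sec²φ, roughly a factor of 4 at 60°N and more than 30 at 80°N, which is why Greenland looks so enormous.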

Scientists around the world call to protect research on one of humanity’s greatest short-term threats – Disinformation


Forum on Information and Democracy: “At a critical time for understanding digital communications’ impact on societies, research on disinformation is endangered. 

In August, researchers around the world bid farewell to CrowdTangle – the Meta-owned social media monitoring tool. Meta’s decision to shut down the most widely used platform for tracking mis- and disinformation in a major election year, offering only its alternative Meta Content Library and API in its place, has been met with a barrage of criticism.

If, as suggested by the World Economic Forum’s 2024 Global Risks Report, disinformation is one of the biggest short-term threats to humanity, our collective ability to understand how it spreads and how it affects our societies is crucial. Just as we would not impede scientific research into the spread of viruses and disease, into natural ecosystems, or into the historical and social sciences, disinformation research must be allowed to proceed unimpeded and with access to the information needed to understand its complexity. Understanding the political economy of disinformation as well as its technological dimensions is also a matter of public health, democratic resilience, and national security.

By directly affecting the research community’s ability to open social media black boxes, this radical decision will also, in turn, hamper public understanding of how technology affects democracy. Public interest scrutiny is also essential for the next era of technology, notably for the world’s largest AI systems, which are similarly proprietary and opaque. The research community is already calling on AI companies to learn from the mistakes of social media and guarantee protections for good faith research. The solution falls on multiple shoulders and the global scientific community, civil society, public institutions and philanthropies must come together to meaningfully foster and protect public interest research on information and democracy…(More)”.

The Arrival of Field Experiments in Economics


Article by Timothy Taylor: “When most people think of “experiments,” they think of test tubes and telescopes, of Petri dishes and Bunsen burners. But the physical apparatus is not central to what an “experiment” means. Instead, what matters is the ability to specify different conditions–and then to observe how the differences in the underlying conditions alter the outcomes. When “experiments” are understood in this broader way, the application of “experiments” is expanded.

For example, back in 1881 when Louis Pasteur tested his vaccine for sheep anthrax, he gave the vaccine to half of a flock of sheep, exposed the entire group to anthrax, and showed that those with the vaccine survived. More recently, the “Green Revolution” in agricultural technology was essentially a set of experiments: systematically breeding plant varieties and then looking at the outcomes in terms of yield, water use, pest resistance, and the like.

This understanding of “experiment” can be applied in economics, as well. John A. List explains in “Field Experiments: Here Today Gone Tomorrow?” (American Economist, published online August 6, 2024). By “field experiments,” List is seeking to differentiate his topic from “lab experiments,” which for economists refers to experiments carried out in a classroom context, often with students as the subjects, and to focus instead on experiments that involve people in the “field”–that is, in the context of their actual economic activities, including work, selling and buying, charitable giving, and the like. As List points out, these kinds of economic experiments have been going on for decades; government agencies, for instance, have long run field experiments of their own…(More)”.
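
To make the logic concrete, here is a minimal sketch, not drawn from List’s paper, of how data from a simple field experiment, for instance a charitable-giving mailer with a randomized offer, might be analyzed: assign the condition at random, then compare average outcomes across arms.

```python
# Minimal sketch of a field-experiment analysis: randomize a condition,
# then estimate the treatment effect as a difference in mean outcomes.
# The data are simulated for illustration; a real study would use observed
# outcomes (e.g., donations from a charitable-giving mailer) instead.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Randomly assign exactly half of the sample to the treatment condition.
treated = rng.permutation(np.repeat([0, 1], n // 2))

# Simulated outcome: baseline giving plus a $2 average treatment effect.
donations = rng.exponential(scale=10.0, size=n) + 2.0 * treated

# Difference in means and its standard error across the two arms.
effect = donations[treated == 1].mean() - donations[treated == 0].mean()
se = np.sqrt(donations[treated == 1].var(ddof=1) / (n // 2)
             + donations[treated == 0].var(ddof=1) / (n // 2))
print(f"estimated treatment effect: {effect:.2f} (SE {se:.2f})")
```

The same difference-in-means logic underlies far more elaborate designs; it is the randomization that licenses the causal interpretation.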

Synthetic Data and Social Science Research


Paper by Jordan C. Stanley & Evan S. Totty: “Synthetic microdata – data retaining the structure of original microdata while replacing original values with modeled values for the sake of privacy – presents an opportunity to increase access to useful microdata for data users while meeting the privacy and confidentiality requirements for data providers. Synthetic data could be sufficient for many purposes, but lingering accuracy concerns could be addressed with a validation system through which the data providers run the external researcher’s code on the internal data and share cleared output with the researcher. The U.S. Census Bureau has experience running such systems. In this chapter, we first describe the role of synthetic data within a tiered data access system and the importance of synthetic data accuracy in achieving a viable synthetic data product. Next, we review results from a recent set of empirical analyses we conducted to assess accuracy in the Survey of Income & Program Participation (SIPP) Synthetic Beta (SSB), a Census Bureau product that made linked survey-administrative data publicly available. Given this analysis and our experience working on the SSB project, we conclude with thoughts and questions regarding future implementations of synthetic data with validation…(More)”
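
As a toy illustration of the core idea, replacing original values with draws from a model fit to the confidential records, consider the sketch below. It is not the SSB methodology; real synthesizers use far richer sequential models and formal disclosure review. It simply fits and samples a multivariate normal, then runs the kind of accuracy check, comparing an analysis on original versus synthetic data, that a validation system formalizes.

```python
# Toy sketch of synthetic microdata: fit a model to confidential records,
# then release draws from the model instead of the records themselves.
# Real synthesizers (e.g., for the SIPP Synthetic Beta) are far more
# sophisticated; this only illustrates the fit-then-sample idea and a basic
# accuracy check of the kind a validation server formalizes.
import numpy as np

rng = np.random.default_rng(42)

# Pretend these are confidential records: income and years of schooling.
confidential = rng.multivariate_normal(
    mean=[50_000, 13],
    cov=[[1.5e8, 30_000], [30_000, 9]],
    size=5_000,
)

# "Synthesize": estimate the joint distribution, then sample new records.
mu = confidential.mean(axis=0)
sigma = np.cov(confidential, rowvar=False)
synthetic = rng.multivariate_normal(mean=mu, cov=sigma, size=5_000)

# Accuracy check: does a simple analysis give similar answers on both datasets?
slope_orig = np.polyfit(confidential[:, 1], confidential[:, 0], 1)[0]
slope_syn = np.polyfit(synthetic[:, 1], synthetic[:, 0], 1)[0]
print(f"income per year of schooling: original {slope_orig:.0f}, synthetic {slope_syn:.0f}")
```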

The Road to Wisdom: On Truth, Science, Faith, and Trust


Book by Francis Collins: “As the COVID-19 pandemic revealed, we have become not just a hyper-partisan society but also a deeply cynical one, distrustful of traditional sources of knowledge and wisdom. Skepticism about vaccines led to the needless deaths of at least 230,000 Americans. “Do your own research” is now a rallying cry in many online rabbit holes. Yet experts can make mistakes, and institutions can lose their moral compass. So how can we navigate through all this?

In The Road to Wisdom, Francis Collins reminds us of the four core sources of judgment and clear thinking: truth, science, faith, and trust. Drawing on his work leading the Human Genome Project and heading the National Institutes of Health, as well as on ethics, philosophy, and Christian theology, Collins makes a robust, thoughtful case for each of these sources—their reliability, and their limits. Ultimately, he shows how they work together, not separately—and certainly not in conflict. It is only when we relink these four foundations of wisdom that we can begin to discern the best path forward in life.

​Thoughtful, accessible, winsome, and deeply wise, The Road to Wisdom leads us beyond current animosities to surer footing. Here is the moral, philosophical, and scientific framework with which to address the problems of our time—including distrust of public health, partisanship, racism, response to climate change, and threats to our democracy—but also to guide us in our daily lives. This is a book that will repay many readings, and resolve dilemmas that we all face every day…(More)”.

New Data Browser on education, science, and culture


UNESCO: “The UIS is excited to introduce the new UIS Data Browser, which brings together all our data on education, science, and culture, making it a convenient resource for everyone, from policymakers to researchers.

With a refreshed interface, users can easily view and download customized data for their needs. The new browser also offers better tools for exploring metadata and documentation. Plus, the browser has great visualization features. You can filter indicators by country or region and create line or bar charts to see trends over time. It’s easy to share your findings on social media, too!

For those who like to dive deeper, a web-based UIS Data Application Programming Interface (API) allows for more technical data extraction for use in reports and applications. The UIS Data API provides access to all education, science, and culture data available on the UIS data browser through HTTP requests. It allows for the regular retrieval of data for custom analysis, visualizations, and applications…(More)”.
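
As a rough illustration of what such an API workflow can look like in practice, here is a short Python sketch. The endpoint URL, indicator code, query parameters, and response shape below are placeholders and assumptions, not the documented UIS API; consult the official documentation for the real interface.

```python
# Illustrative sketch of pulling an indicator over HTTP from a REST API such
# as the UIS Data API and printing a time series. The base URL, indicator
# code, query parameters, and response shape are placeholders/assumptions;
# consult the official UIS API documentation for the real interface.
import requests

BASE_URL = "https://api.example.org/uis/data"   # placeholder endpoint
params = {
    "indicator": "EDU_COMPLETION_RATE",          # hypothetical indicator code
    "geoUnit": "KEN",                            # hypothetical country filter
    "start": 2010,
    "end": 2023,
    "format": "json",
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
payload = response.json()

# Assumed payload shape: a list of {year, value} observations.
for row in payload.get("observations", []):
    print(row.get("year"), row.get("value"))
```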

Augmenting the availability of historical GDP per capita estimates through machine learning


Paper by Philipp Koch, Viktor Stojkoski, and César A. Hidalgo: “Can we use data on the biographies of historical figures to estimate the GDP per capita of countries and regions? Here, we introduce a machine learning method to estimate the GDP per capita of dozens of countries and hundreds of regions in Europe and North America for the past seven centuries starting from data on the places of birth, death, and occupations of hundreds of thousands of historical figures. We build an elastic net regression model to perform feature selection and generate out-of-sample estimates that explain 90% of the variance in known historical income levels. We use this model to generate GDP per capita estimates for countries, regions, and time periods for which these data are not available and externally validate our estimates by comparing them with four proxies of economic output: urbanization rates in the past 500 years, body height in the 18th century, well-being in 1850, and church building activity in the 14th and 15th centuries. Additionally, we show our estimates reproduce the well-known reversal of fortune between southwestern and northwestern Europe between 1300 and 1800 and find this is largely driven by countries and regions engaged in Atlantic trade. These findings validate the use of fine-grained biographical data as a method to augment historical GDP per capita estimates. We publish our estimates with confidence intervals together with all collected source data in a comprehensive dataset…(More)”.
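
The modeling step is standard enough that a compact sketch conveys it. The snippet below is not the authors' code: it fits an elastic net with cross-validated regularization on simulated stand-in features (think counts of notable births, deaths, and occupations per region and period) to predict log GDP per capita, and reports out-of-sample variance explained, the accuracy measure the abstract cites.

```python
# Sketch of the elastic-net approach described in the abstract: regress known
# (log) GDP per capita on biography-derived features, choose the penalty by
# cross-validation, and judge the model by out-of-sample variance explained.
# Features and data here are simulated stand-ins, not the authors' dataset.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_obs, n_features = 2_000, 50            # region-period observations x features

# Hypothetical features, e.g., counts of notable births/deaths by occupation.
X = rng.poisson(lam=3.0, size=(n_obs, n_features)).astype(float)
true_coef = np.zeros(n_features)
true_coef[:8] = rng.normal(0.1, 0.05, size=8)     # only a few features matter
log_gdp_pc = X @ true_coef + rng.normal(0.0, 0.3, size=n_obs)

X_train, X_test, y_train, y_test = train_test_split(
    X, log_gdp_pc, test_size=0.25, random_state=0
)

# Elastic net with cross-validated penalty; the L1 component is what performs
# the feature selection mentioned in the abstract.
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X_train, y_train)

print("features retained:", int(np.sum(model.coef_ != 0)))
print("out-of-sample R^2:", round(model.score(X_test, y_test), 3))
```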

Revisiting the ‘Research Parasite’ Debate in the Age of AI


Article by C. Brandon Ogbunu: “A 2016 editorial published in the New England Journal of Medicine lamented the existence of “research parasites,” those who pick over the data of others rather than generating new data themselves. The article touched on the ethics and appropriateness of this practice. The most charitable interpretation of the argument centered around the hard work and effort that goes into the generation of new data, which costs millions of research dollars and takes countless person-hours. Whatever the merits of that argument, the editorial and its associated arguments were widely criticized.

Given recent advances in AI, revisiting the research parasite debate offers a new perspective on the ethics of sharing and data democracy. It is ironic that the critics of research parasites might have made a sound argument — but for the wrong setting, aimed at the wrong target, at the wrong time. Specifically, the large language models, or LLMs, that underlie generative AI tools such as OpenAI’s ChatGPT pose an ethical challenge in how they parasitize freely available data. These discussions bring up new conversations about data security that may undermine, or at least complicate, efforts at openness and data democratization.

The backlash to that 2016 editorial was swift and violent. Many arguments centered around the anti-science spirit of the message. For example, meta-analysis – which re-analyzes data from a selection of studies – is a critical practice that should be encouraged. Many groundbreaking discoveries about the natural world and human health have come from this practice, including new pictures of the molecular causes of depression and schizophrenia. Further, the central criticisms of research parasitism undermine the ethical goals of data sharing and ambitions for open science, where scientists and citizen-scientists can benefit from access to data. This differs from the status quo in 2016, when data published in many of the top journals of the world were locked behind a paywall, illegible, poorly labeled, or difficult to use. This remains largely true in 2024…(More)”.