Invention and Innovation: A Brief History of Hype and Failure


Book by Vaclav Smil: “The world is never finished catching up with Vaclav Smil. In his latest and perhaps most readable book, Invention and Innovation, the prolific author—a favorite of Bill Gates—pens an insightful and fact-filled jaunt through the history of human invention. Impatient with the hype that so often accompanies innovation, Smil offers in this book a clear-eyed corrective to the overpromises that accompany everything from new cures for diseases to AI. He reminds us that even after we go quite far along the invention-development-application trajectory, we may never get anything real to deploy. Or worse, even after we have succeeded by introducing an invention, its future may be marked by underperformance, disappointment, demise, or outright harm.

Drawing on his vast breadth of scientific and historical knowledge, Smil explains the difference between invention and innovation, and looks not only at inventions that failed to dominate as promised (such as the airship, nuclear fission, and supersonic flight), but also at those that turned disastrous (leaded gasoline, DDT, and chlorofluorocarbons). And finally, most importantly, he offers a “wish list” of inventions that we most urgently need to confront the staggering challenges of the twenty-first century.

Filled with engaging examples and pragmatic approaches, this book is a sobering account of the folly that so often attends human ingenuity—and how we can, and must, better align our expectations with reality…(More)”.

582,462 and Counting


Series in The New York Times: “They go into the streets in search of data. Peeking behind dumpsters, shining flashlights under bridges, rustling a frosted tent to see if anyone was inside. This is what it takes to count the people in America who don’t have a place to live. To get a number, however flawed, that describes the scope of a deeply entrenched problem and the country’s progress toward fixing it.

Last year, the Biden administration laid out a goal to reduce homelessness by 25 percent by 2025. The problem increasingly animates local politics, with ambitious programs to build affordable housing meeting opposition from homeowners who want encampments gone but the solutions kept far from their communities. Across the country, homelessness is a subject in which declarations of urgency outweigh measurable progress.

Officially called the Point-in-Time Count, the annual tally of those who live outside or in homeless shelters takes place in every corner of the country during the last 10 days of January, and over the past dozen years has found between 550,000 and 650,000 people experiencing homelessness. The endeavor is far from perfect, advocates note, since it captures no more than a few days and is almost certainly a significant undercount. But it’s a snapshot from which resources flow, and it creates a shared understanding of a common problem.

This year, reporters and photographers from The New York Times shadowed the count, using a sampling of four very different communities — warm and cold, big and small, rural and urban — to examine the same problem in vastly different places…(More)”.

Leveraging Data for Racial Equity in Workforce Opportunity


Report by CODE: “Across many decades, obstacles to gainful employment have limited the ability of Black Americans and other people of color to obtain well-paying jobs that create wealth and contribute to health and well-being.

A dearth of opportunity in the job market is related to inequalities in education, bias in hiring, and other forms of systemic inequality in the U.S.

Over time, federal efforts have addressed the need to increase diversity, equity, and inclusion in the government workforce, and promoted similar changes in the broader society. While these efforts have brought progress, they have not been entirely effective. At the same time, federal action has made new kinds of data available—data that can shed light on some of the historic drivers of workforce inequity and help inform solutions to their ongoing impact.

This report explores a number of current opportunities to strengthen longstanding data-driven tools to address workforce inequity. The report shows how the effects of workforce discrimination and other historic practices are still being felt today. At the same time, it outlines opportunities to apply data to increase equity in many areas related to the workforce gap, including disparities in health and wellbeing, socioeconomic status, and housing insecurity…(More)”.

Big Data and Public Policy


Book by Rebecca Moody and Victor Bekkers: “This book provides a comprehensive overview of how the course, content and outcome of policy making is affected by big data. It scrutinises the notion that big and open data makes policymaking a more rational process, in which policy makers are able to predict, assess and evaluate societal problems. It also examines how policy makers deal with big data, the problems and limitations they face, and how big data shapes policymaking on the ground. The book considers big data from various perspectives, not just the political, but also the technological, legal, institutional and ethical dimensions. The potential of big data use in the public sector is also assessed, as well as the risks and dangers this might pose. Through several extended case studies, it demonstrates the dynamics of big data and public policy. Offering a holistic approach to the study of big data, this book will appeal to students and scholars of public policy, public administration and data science, as well as those interested in governance and politics…(More)”.

Understanding how to build a social licence for using novel linked datasets for planning and research in Kent, Surrey and Sussex: results of deliberative focus groups


Paper by Elizabeth Ford et al: “Digital programmes in the newly created NHS integrated care boards (ICBs) in the United Kingdom mean that curation and linkage of anonymised patient data is underway in many areas for the first time. In Kent, Surrey and Sussex (KSS), in Southeast England, public health teams want to use these datasets to answer strategic population health questions, but public expectations around use of patient data are unknown….We aimed to engage with citizens of KSS to gather their views and expectations of data linkage and re-use, through deliberative discussions…
We held five 3-hour deliberative focus groups with 79 citizens of KSS, presenting information about potential uses of data, safeguards, and mechanisms for public involvement in governance and decision making about datasets. After each presentation, participants discussed their views in facilitated small groups which were recorded, transcribed and analysed thematically…
The focus groups generated 15 themes representing participants’ views on the benefits, risks and values for safeguarding linked data. Participants largely supported use of patient data to improve health service efficiency and resource management, preventative services and out of hospital care, joined-up services and information flows. Most participants expressed concerns about data accuracy, breaches and hacking, and worried about commercial use of data. They suggested that transparency of data usage through audit trails and clear information about accountability, ensuring data re-use does not perpetuate stigma and discrimination, ongoing, inclusive and valued involvement of the public in dataset decision-making, and a commitment to building trust, would meet their expectations for responsible data use…
Participants were largely favourable about the proposed uses of patient linked datasets but expected a commitment to transparency and public involvement. Findings were mapped to previous tenets of social license and can be used to inform ICB digital programme teams on how to proceed with use of linked datasets in a trustworthy and socially acceptable way…(More)”.

Secondary data for global health digitalisation


Paper by Anatol-Fiete Näher, et al: “Substantial opportunities for global health intelligence and research arise from the combined and optimised use of secondary data within data ecosystems. Secondary data are information being used for purposes other than those intended when they were collected. These data can be gathered from sources on the verge of widespread use such as the internet, wearables, mobile phone apps, electronic health records, or genome sequencing. To utilise their full potential, we offer guidance by outlining available sources and approaches for the processing of secondary data. Furthermore, in addition to indicators for the regulatory and ethical evaluation of strategies for the best use of secondary data, we also propose criteria for assessing reusability. This overview supports more precise and effective policy decision making leading to earlier detection and better prevention of emerging health threats than is currently the case…(More)”.

Measuring Partial Democracies: Rules and their Implementation


Paper by Debarati Basu, Shabana Mitra & Archana Purohit: “This paper proposes a new index that focuses on capturing the extent of democracy in a country using not only the existence of rules but also the extent of their implementation. The measure, based on the axiomatically robust framework of Alkire and Foster (J Public Econ 95:476–487, 2011), is able to moderate the existence of democratic rules by their actual implementation. This gives us a meaningful way of capturing the notion of a partial democracy on a continuum between non-democratic and democratic, separating out situations in which rules exist but are not implemented well. We construct our index using V-Dem data from 1900 to 2010 for over 100 countries to measure the process of democratization across the world. Our results show that we can track progress in democratization even when the regime remains either a democracy or an autocracy. This is the notion of partial democracy that our implementation-based index captures through a broad-based index that is consistent, replicable, extendable, easy to interpret, and more nuanced in its ability to capture the essence of democracy…(More)”.
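The dual-cutoff idea behind the paper can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors’ actual specification: the function name, the 0–1 implementation scores, the implementation cutoff, and the rule-count cutoff `k` are all assumptions made for illustration; the real index is built from V-Dem indicators within the Alkire–Foster framework.

```python
# Illustrative sketch of an Alkire-Foster-style counting index adapted to
# democracy measurement. Assumption: each country is scored on several
# democratic "rules", each a pair (exists, implementation) with exists in
# {0, 1} and implementation in [0, 1]. A rule counts as achieved only if
# it exists AND is implemented above a cutoff -- so rules on paper alone
# do not raise the score.

def partial_democracy_index(rules, impl_cutoff=0.5, k=2):
    """Return (clears_cutoff, intensity) for one country.

    impl_cutoff: minimum implementation level for a rule to count
                 (first, "deprivation" cutoff; value assumed here).
    k: minimum number of achieved rules to clear the second,
       Alkire-Foster counting cutoff (value assumed here).
    intensity: average implementation across ALL rules, with absent or
       poorly implemented rules contributing zero -- this is what lets
       the index separate partial democracies from full ones.
    """
    achieved = [impl for exists, impl in rules
                if exists == 1 and impl >= impl_cutoff]
    clears_cutoff = len(achieved) >= k
    intensity = sum(achieved) / len(rules) if rules else 0.0
    return clears_cutoff, intensity

# A country with three rules on paper but weak implementation scores
# below one with two well-implemented rules and one missing rule.
on_paper = partial_democracy_index([(1, 0.2), (1, 0.3), (1, 0.4)])
in_practice = partial_democracy_index([(1, 0.9), (1, 0.8), (0, 0.0)])
```

The point of the sketch is the ordering it produces: `on_paper` fails the counting cutoff despite having every rule formally in place, while `in_practice` clears it, which is exactly the distinction between formal and implemented democracy that the paper’s index is designed to capture.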

Big Data and the Law of War


Essay by Paul Stephan: “Big data looms large in today’s world. Much of the tech sector regards the building up of large sets of searchable data as part (sometimes the greater part) of its business model. Surveillance-oriented states, of which China is the foremost example, use big data to guide and bolster monitoring of their own people as well as potential foreign threats. Many other states are not far behind in the surveillance arms race, notwithstanding the attempts of the European Union to put its metaphorical finger in the dike. Finally, ChatGPT has revived popular interest in artificial intelligence (AI) as a cultural, economic, and social phenomenon; AI uses big data to optimize the training and algorithm design on which it depends. 

If big data is growing in significance, might it join territory, people, and property as objects of international conflict, including armed conflict? So far it has not been front and center in Russia’s invasion of Ukraine, the war that currently consumes much of our attention. But future conflicts could certainly feature attacks on big data. China and Taiwan, for example, both have sophisticated technological infrastructures that encompass big data and AI capabilities. The risk that they might find themselves at war in the near future is larger than anyone would like. What, then, might the law of war have to say about big data? More generally, if existing law does not meet our needs,  how might new international law address the issue?

In a recent essay, part of an edited volume on “The Future Law of Armed Conflict,” I argue that big data is a resource and therefore a potential target in an armed conflict. I address two issues: Under the law governing the legality of war (jus ad bellum), what kinds of attacks on big data might justify an armed response, touching off a bilateral (or multilateral) armed conflict (a war)? And within an existing armed conflict, what are the rules (jus in bello, also known as international humanitarian law, or IHL) governing such attacks?

The distinction is meaningful. If cyber operations rise to the level of an armed attack, then the targeted state has, according to Article 51 of the U.N. Charter, an “inherent right” to respond with armed force. Moreover, the target need not confine its response to a symmetrical cyber operation. Once attacked, a state may use all forms of armed force in response, albeit subject to the restrictions imposed by IHL. If the state regards, say, a takedown of its financial system as an armed attack, it may respond with missiles…(More)”.

Work and meaning in the age of AI


Report by Daniel Susskind: “It is often said that work is not only a source of income but also of meaning. In this paper, I explore the theoretical and empirical literature that addresses this relationship between work and meaning. I show that the relationship is far less clear than is commonly supposed: There is great heterogeneity in its nature, both among today’s workers and across workers over time. I explain why this relationship matters for policymakers and economists concerned about the impact of technology on work. In the short term, it is important for predicting labour market outcomes of interest. It also matters for understanding how artificial intelligence (AI) affects not only the quantity of work but its quality as well: These new technologies may erode the meaning that people get from their work. In the medium term, if jobs are lost, this relationship also matters for designing bold policy interventions like the ‘Universal Basic Income’ and ‘Job Guarantee Schemes’: Their design, and any choice between them, is heavily dependent on policymakers’—often tacit—assumptions about the nature of this underlying relationship between work and meaning. For instance, policymakers must decide whether to simply focus on replacing lost income alone (as with a Universal Basic Income) or, if they believe that work is an important and non-substitutable source of meaning, on protecting jobs for that additional role as well (as with a Job Guarantee Scheme). In closing, I explore the challenge that the age of AI presents for an important feature of liberal political theory: the idea of ‘neutrality’…(More)”

Ready, set, share: Researchers brace for new data-sharing rules


Jocelyn Kaiser and Jeffrey Brainard in Science: “…By 2025, new U.S. requirements for data sharing will extend beyond biomedical research to encompass researchers across all scientific disciplines who receive federal research funding. Some funders in the European Union and China have also enacted data-sharing requirements. The new U.S. moves are feeding hopes that a worldwide movement toward increased sharing is in the offing. Supporters think it could speed the pace and reliability of science.

Some scientists may only need to make a few adjustments to comply with the policies. That’s because data sharing is already common in fields such as protein crystallography and astronomy. But in other fields the task could be weighty, because sharing is often an afterthought. For example, a study involving 7750 medical research papers found that just 9% of those published from 2015 to 2020 promised to make their data publicly available, and the authors of just 3% actually shared them, says lead author Daniel Hamilton of the University of Melbourne, who described the finding at the International Congress on Peer Review and Scientific Publication in September 2022. Even when authors promise to share their data, they often fail to follow through. A study published in PLOS ONE in 2020 found that of 21,000 journal articles that included data-sharing plans, fewer than 21% provided links to the repository storing the data.

Journals and funders, too, have a mixed record when it comes to supporting data sharing. Research presented at the September 2022 peer-review congress found only about half of the 110 largest public, corporate, and philanthropic funders of health research around the world recommend or require grantees to share data…

“Health research is the field where the ethical obligation to share data is the highest,” says Aidan Tan, a clinician-researcher at the University of Sydney who led the study. “People volunteer in clinical trials and put themselves at risk to advance medical research and ultimately improve human health.”

Across many fields of science, researchers’ support for sharing data has increased during the past decade, surveys show. But given the potential cost and complexity, many are apprehensive about the NIH policy and the other requirements that will follow. “How we get there is pretty messy right now,” says Parker Antin, a developmental biologist and associate vice president for research at the University of Arizona. “I’m really not sure whether the total return will justify the cost. But I don’t know of any other way to find out than trying to do it.”

Science offers this guide as researchers prepare to plunge in….(More)”.