Book Review by Reema Patel of “In AI We Trust: Power, Illusion and Control of Predictive Algorithms”, Helga Nowotny, Polity (2021): “In the 1980s, a plaque at NASA’s Johnson Space Center in Houston, Texas, declared: “In God we trust. All others must bring data.” Helga Nowotny’s latest book, In AI We Trust, is more than a play on the first phrase in this quote, attributed to statistician W. Edwards Deming. It is most occupied with the second idea.
What happens, Nowotny asks, when we deploy artificial intelligence (AI) without interrogating its effectiveness, simply trusting that it ‘works’? What happens when we fail to take a data-driven approach to things that are themselves data-driven? And what about when AI is shaped and influenced by human bias? Data can be inaccurate, of poor quality or missing. And technologies are, Nowotny reminds us, “intrinsically intertwined with conscious or unconscious bias since they reflect existing inequalities and discriminatory practices in society”.
Nowotny, a founding member and former president of the European Research Council, has a track record of trenchant thought on how society should handle innovation. Here, she offers a compelling analysis of the risks and challenges of the AI systems that pervade our lives. She makes a strong case for digital humanism: “Human values and perspectives ought to be the starting point” for the design of systems that “claim to serve humanity”….(More)”.