Daniel Winter at the Financial Times: “…Concern over technology’s capacity both to shrink the world and complicate it has grown steadily since the second world war — little wonder, perhaps, when the existential threats it throws up have expanded from nuclear weapons to encompass climate change (and any consequent geoengineering), gene editing and AI as well. The financial crisis of 2008, in which poorly understood investment instruments made economies totter, has added to the unease over our ability to make sense of things.
From preoccupying cold war planners, attempts to codify best practice in sense-making have gone on to exercise (often profitably) business academics and management consultants, and now draw large audiences online.
Blogs, podcasts and YouTube channels such as Rebel Wisdom and Future Thinkers aim to arm their followers with the tools they need to understand the world, and make the right decisions. Daniel Schmachtenberger is one such voice, whose interviews on YouTube and his podcast Civilization Emerging have reached hundreds of thousands of people.
“Due to increasing technological capacity — increasing population multiplied by increasing impact per person — we’re making more and more consequential choices with worse and worse sense-making to inform those choices,” he says in one video. “Exponential tech is leading to exponential disinformation.”

Strengthening individuals’ ability to handle and filter information would go a long way towards improving the “information ecology”, Mr Schmachtenberger argues. People need to get used to handling complex information and should train themselves to be less distracted. “The impulse to say, ‘hey, make it really simple so everyone can get it’ and the impulse to say ‘[let’s] help people actually make sense of the world well’ are different things,” he says.

Of course, societies have long been accustomed to handling complexity. No one person can possibly memorise the entirety of US law or be an expert in every field of medicine. Libraries, databases, and professional and academic networks exist to aggregate expertise.
The increasing bombardment of data — the growing amount of evidence that can inform any course of action — pushes such systems to the limit, prompting people to offload the work to computers. Yet this only defers the problem. As AI becomes more sophisticated, its decision-making processes become more opaque. The choice as to whether to trust it — to let it run a self-driving car in a crowded town, say — still rests with us.
Far from being able to outsource all complex thinking to the cloud, leaders will need to be as skilled as ever at handling and critically evaluating information, Prof Guillén warns. It will be vital, he suggests, to build flexibility into the policymaking process.
“The feedback loop between the effects of the policy and how you need to recalibrate the policy in real time becomes so much faster and so much more unpredictable,” he says. “That’s the effect that complex policies produce.” A more piecemeal approach could better suit regulation in fast-moving fields, he argues, with shorter “bursts” of rulemaking, followed by analysis of the effects and then adjustments or additions where necessary.
Yet however adept policymakers become at dealing with a complex world, their task will at some point always resist simplification. That point is where the responsibility resides. Much as we may wish it otherwise, governance will always be as much an art as a science….(More)”.