How to improve economic forecasting


Article by Nicholas Gruen: “Today’s four-day weather forecasts are as accurate as one-day forecasts were 30 years ago. Economic forecasts, on the other hand, aren’t noticeably better than they were then. Former Federal Reserve chair Ben Bernanke should ponder this in his forthcoming review of the Bank of England’s forecasting.

There’s growing evidence that we can improve. But myopia and complacency get in the way. Myopia is an issue because economists think technical expertise is the essence of good forecasting when, actually, two things matter more: forecasters’ understanding of the limits of their expertise and their judgment in handling those limits.

Enter Philip Tetlock, whose 2005 book on geopolitical forecasting showed how little experts added to forecasting done by informed non-experts. To compare forecasts between the two groups, he forced participants to drop their vague weasel words — “probably”, “can’t be ruled out” — and specify exactly what they were forecasting and with what probability. 

That started sorting the sheep from the goats. The simple “point forecasts” provided by economists — such as “growth will be 3.0 per cent” — are doubly unhelpful in this regard. They’re silent about what success looks like. If I have forecast 3.0 per cent growth and actual growth comes in at 3.2 per cent — did I succeed or fail? Such predictions also don’t tell us how confident the forecaster is.

By contrast, “a 70 per cent chance of rain” specifies a clear event with a precise estimate of the weather forecaster’s confidence. Having rigorously specified the rules of the game, Tetlock has since shown how what he calls “superforecasting” is possible and how diverse teams of superforecasters do even better.
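Scoring is what turns this rigor into rankings. Forecasting tournaments like Tetlock's grade such probability forecasts with the Brier score, which a few lines of Python can illustrate; the forecasts and outcomes below are invented for illustration.

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between stated probabilities and outcomes.
    0.0 is perfect; always answering 0.5 scores 0.25; 1.0 is worst."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# "A 70 per cent chance of rain" style forecasts for five events...
probabilities = [0.7, 0.9, 0.2, 0.6, 0.1]
# ...and whether each event actually happened (1 = yes, 0 = no).
occurred = [1, 1, 0, 0, 0]

print(f"Brier score: {brier_score(probabilities, occurred):.3f}")  # 0.102
# A point forecast like "growth will be 3.0 per cent" specifies neither
# an event nor a probability, so it cannot be scored at all.
```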

What qualities does Tetlock see in superforecasters? As well as mastering necessary formal techniques, they’re open-minded, careful, curious and self-critical — in other words, they’re not complacent. Aware, like Socrates, of how little they know, they’re constantly seeking to learn — from unfolding events and from colleagues…(More)”.

Informing the Global Data Future: Benchmarking Data Governance Frameworks


Paper by Sara Marcucci, Natalia González Alarcón, Stefaan G. Verhulst and Elena Wüllhorst: “Data has become a critical trans-national and cross-border resource. Yet, the lack of a well-defined approach to using it poses challenges to harnessing its value. This article explores the increasing importance of global data governance due to the rapid growth of data, and the need for responsible data practices. The purpose of this paper is to compare approaches and identify patterns in the emergent data governance ecosystem within sectors close to the international development field, ultimately presenting key takeaways and reflections on when and why a global data governance framework may be needed. Overall, the paper provides information about the conditions under which a more holistic, coordinated transnational approach to data governance may be needed to responsibly manage the global flow of data. The report does this by (a) considering conditions specified by the literature that may be conducive to global data governance, and (b) analyzing and comparing existing frameworks, specifically investigating six key elements: purpose, principles, anchoring documents, data description and lifecycle, processes, and practices. The article closes with a series of final recommendations, which include adopting a broader concept of data stewardship to reconcile data protection and promotion, focusing on responsible reuse of data to unlock socioeconomic value, harmonizing meanings to operationalize principles, incorporating global human rights frameworks to provide common North Stars, unifying key definitions of data, adopting a data lifecycle approach, incorporating participatory processes and collective agency, investing in new professions with specific roles, improving accountability through oversight and compliance mechanisms, and translating recommendations into practical tools…(More)”

It’s like jury duty, but for getting things done


Article by Hollie Russon Gilman and Amy Eisenstein: “Citizens’ assemblies have the potential to repair our broken politics…Imagine a democracy where people come together and their voices are heard and translated directly into policy. Frontline workers, doctors, teachers, friends, and neighbors — young and old — are brought together in a random, representative sample to deliberate the most pressing issues facing our society. And they are compensated for their time.

The concept may sound radical. But we already use this method for jury duty. Why not try this widely accepted practice to tackle the deepest, most crucial, and most divisive issues facing our democracy?

The idea — known today as citizens’ assemblies — originated in ancient Athens. Instead of a top-down government, Athens used sortition — a system that was horizontal and distributive. The kleroterion, an allotment machine, randomly selected citizens to hold civic office, ensuring that the people had a direct say in their government’s dealings….(More)”.
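For readers curious about the mechanics: a modern kleroterion typically amounts to stratified random sampling, so that a random draw still mirrors the population. Below is a toy sketch in Python; the pool, strata, and quotas are invented for illustration.

```python
# Toy sketch of sortition with stratification, the selection method modern
# citizens' assemblies typically use. Pool, strata, and quotas are invented.
import random

random.seed(42)  # reproducible draw for the example

# A registration pool of volunteers, each tagged with one demographic stratum.
pool = [
    {"name": f"citizen_{i}", "age_band": random.choice(["18-34", "35-59", "60+"])}
    for i in range(1000)
]

# Seats per stratum, proportional to a hypothetical population's age structure.
quotas = {"18-34": 4, "35-59": 5, "60+": 3}  # a 12-member panel

panel = []
for band, seats in quotas.items():
    eligible = [p for p in pool if p["age_band"] == band]
    panel.extend(random.sample(eligible, seats))  # the random draw itself

for member in panel:
    print(member["name"], member["age_band"])
```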

The Design of Digital Democracy


Book by Gianluca Sgueo: “Ever-stronger ties between technology, entertainment and design are transforming our relationship with democratic decision-making. When we are online, or when we use digital products and services, we tend to focus more on certain factors like speed of service and user-friendliness, and to overlook the costs – both for ourselves and others. As a result, a widening gap separates our expectations of everything related to digitalization – including government – and the actual practice of democratic governance. Democratic regulators, unable to meet citizens’ demands for tangible, fast and gratifying returns, are seeing the poorest results ever recorded in terms of interest, engagement and retention, despite using the most cutting-edge technologies.

This book explores various aspects of the relationship between democracy, technology and entertainment. These include, on the one hand, the role that digital technology has in strengthening our collective intelligence, nurturing empathic relations between citizens and democratic institutions, and supporting processes of political aggregation, deliberation and collaboration. On the other hand, they comprise the challenges accompanying digital technology for representation, transparency and inclusivity in democratic decision-making.

The book’s main argument is that digital democratic spaces should be redesigned to narrow the gap between the expectations and outcomes of democratic decision-making. It suggests abandoning the notion of digital participatory rights as being fast and easy to enjoy. It also refutes the notion that digital democratic decision-making can only be effective when it delivers rapid and successful responses to the issues of the day, regardless of their complexity.

Ultimately, the success or failure of digital democracy will depend on the ability of public regulators to design digital public spaces with a commitment to complexity, so as to make them appealing, but also effective at engaging citizens…(More)”.

The Legal Singularity


Book by Abdi Aidid and Benjamin Alarie: “…argue that the proliferation of artificial intelligence–enabled technology – and specifically the advent of legal prediction – is on the verge of radically reconfiguring the law, our institutions, and our society for the better.

Revealing the ways in which our legal institutions underperform and are expensive to administer, the book highlights the negative social consequences associated with our legal status quo. Given the infirmities of the current state of the law and our legal institutions, the silver lining is that there is ample room for improvement. With concerted action, technology can help us to ameliorate the problems of the law and improve our legal institutions. Inspired in part by the concept of the “technological singularity,” The Legal Singularity presents a future state in which technology facilitates the functional “completeness” of law, where the law is at once extraordinarily more complex in its specification than it is today, and yet operationally, the law is vastly more knowable, fairer, and clearer for its subjects. Aidid and Alarie describe the changes that will culminate in the legal singularity and explore the implications for the law and its institutions…(More)”.

Data Governance and Policy in Africa


This open access book edited by Bitange Ndemo, Njuguna Ndung’u, Scholastica Odhiambo and Abebe Shimeles: “…examines data governance and its implications for policymaking in Africa. Bringing together economists, lawyers, statisticians, and technology experts, it assesses gaps in both the availability and use of existing data across the continent, and argues that data creation, management and governance need to improve if private and public sectors are to reap the benefits of big data and digital technologies. It also considers lessons from across the globe to assess principles, norms and practices that can guide the development of data governance in Africa….(More)”.

What if You Knew What You Were Missing on Social Media?


Article by Julia Angwin: “Social media can feel like a giant newsstand, with more choices than any newsstand ever. It contains news not only from journalism outlets, but also from your grandma, your friends, celebrities and people in countries you have never visited. It is a bountiful feast.

But so often you don’t get to pick from the buffet. On most social media platforms, algorithms use your behavior to narrow in on the posts you are shown. If you send a celebrity’s post to a friend but breeze past your grandma’s, the platform may display more posts like the celebrity’s in your feed. Even when you choose which accounts to follow, the algorithm still decides which posts to show you and which to bury.

There are a lot of problems with this model. There is the possibility of being trapped in filter bubbles, where we see only news that confirms our existing beliefs. There are rabbit holes, where algorithms can push people toward more extreme content. And there are engagement-driven algorithms that often reward content that is outrageous or horrifying.
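To make the engagement problem concrete, here is a toy feed ranker; the posts and weights are invented, since no platform publishes its actual formula.

```python
# Toy sketch of engagement-based feed ranking. Posts and weights are invented.
posts = [
    {"text": "grandma's garden update",   "likes": 12, "shares": 1,  "replies": 2},
    {"text": "celebrity red-carpet clip", "likes": 90, "shares": 30, "replies": 15},
    {"text": "outrage-bait hot take",     "likes": 40, "shares": 80, "replies": 120},
]

def engagement_score(post):
    # Replies and shares are weighted above likes here; heavier weights on
    # high-effort reactions are one reason provocative posts float upward.
    return post["likes"] + 5 * post["shares"] + 10 * post["replies"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["text"])

# A chronological feed, or one ranked by a rule the reader chooses, would
# simply sort on a different key. That is the choice the article asks for.
```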

Yet not one of those problems is as damaging as the problem of who controls the algorithms. Never has the power to control public discourse been so completely in the hands of a few profit-seeking corporations with no requirements to serve the public good.

Elon Musk’s takeover of Twitter, which he renamed X, has shown what can happen when an individual pushes a political agenda by controlling a social media company.

Since Mr. Musk bought the platform, he has repeatedly declared that he wants to defeat the “woke mind virus” — which he has struggled to define but largely seems to mean Democratic and progressive policies. He has reinstated accounts that were banned because of the white supremacist and antisemitic views they espoused. He has banned journalists and activists. He has promoted far-right figures such as Tucker Carlson and Andrew Tate, who were kicked off other platforms. He has changed the rules so that users can pay to have some posts boosted by the algorithm, and has purportedly changed the algorithm to boost his own posts. The result, as Charlie Warzel said in The Atlantic, is that the platform is now a “far-right social network” that “advances the interests, prejudices and conspiracy theories of the right wing of American politics.”

The Twitter takeover has been a public reckoning with algorithmic control, but any tech company could do something similar. To prevent those who would hijack algorithms for power, we need a pro-choice movement for algorithms. We, the users, should be able to decide what we read at the newsstand…(More)”.

An AI Model Tested In The Ukraine War Is Helping Assess Damage From The Hawaii Wildfires


Article by Irene Benedicto: “On August 7, 2023, the day before the Maui wildfires started in Hawaii, a constellation of earth-observing satellites took multiple pictures of the island at noon, local time. Everything was quiet, still. The next day, at the same time, the same satellites captured images of fires consuming the island. Planet, a San Francisco-based company that owns the largest fleet of satellites taking pictures of the Earth daily, provided this raw imagery to Microsoft engineers, who used it to train an AI model designed to analyze the impact of disasters. Comparing photographs taken before and after the fire, the AI model created maps that highlighted the most devastated areas of the island.
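At its core, this kind of damage mapping is change detection between co-registered before-and-after images. The NumPy sketch below is a deliberately crude stand-in for the idea; real systems train segmentation models on labeled imagery rather than thresholding raw pixel differences.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for two co-registered grayscale tiles of the same area (0 to 1).
before = rng.uniform(0.5, 1.0, size=(100, 100))  # bright, undamaged terrain
after = before.copy()
after[40:60, 40:60] *= 0.3                       # a burned patch darkens

difference = np.abs(after - before)  # per-pixel change magnitude
damage_mask = difference > 0.3       # threshold flags strongly changed pixels

print(f"Flagged {damage_mask.mean():.1%} of the tile as changed")  # 4.0%
```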

With this information, the Red Cross rearranged its work in the field that same day to respond to the most urgent priorities first, helping evacuate thousands of people affected by one of the deadliest fires in over a century. The Hawaii wildfires have already killed over a hundred people; a hundred more remain missing, and at least 11,000 people have been displaced. The relief efforts are ongoing 10 days after the start of the fire, which burned over 3,200 acres. Hawaii Governor Josh Green estimated the recovery efforts could cost $6 billion.

Planet and Microsoft AI were able to pull and analyze the satellite imagery so quickly because they’d struggled to do so the last time they deployed their system: during the Ukraine war. The successful response in Maui is the result of a year and a half of building a new AI tool that corrected fundamental flaws in the previous system, which didn’t accurately recognize collapsed buildings in a background of concrete.

“When Ukraine happened, all the AI models failed miserably,” Juan Lavista, chief scientist at Microsoft AI, told Forbes.

The problem was that the company’s previous AI models were mainly trained with natural disasters in the U.S. and Africa. But devastation doesn’t look the same when it is caused by war and unfolds in an Eastern European city. “We learned that having one single model that would adapt to every single place on earth was likely impossible,” Lavista said…(More)”.

Driving Excellence in Official Statistics: Unleashing the Potential of Comprehensive Digital Data Governance


Paper by Hossein Hassani and Steve McFeely: “With the ubiquitous use of digital technologies and the consequent data deluge, official statistics faces new challenges and opportunities. In this context, strengthening official statistics through effective data governance will be crucial to ensure reliability, quality, and access to data. This paper presents a comprehensive framework for digital data governance for official statistics, addressing key components, such as data collection and management, processing and analysis, data sharing and dissemination, as well as privacy and ethical considerations. The framework integrates principles of data governance into digital statistical processes, enabling statistical organizations to navigate the complexities of the digital environment. Drawing on case studies and best practices, the paper highlights successful implementations of digital data governance in official statistics. The paper concludes by discussing future trends and directions, including emerging technologies and opportunities for advancing digital data governance…(More)”.
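As a rough illustration of how such governance components might become machine-readable in practice, here is a hypothetical record a statistical office could attach to a dataset; the field names are invented, not the paper's specification.

```python
# Illustrative sketch only: one way a statistical office might encode
# governance components as metadata attached to a dataset. All field
# names and values below are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class DatasetGovernanceRecord:
    name: str
    steward: str                 # accountable owner (collection and management)
    methodology: str             # documented basis (processing and analysis)
    sharing_license: str         # terms of dissemination (sharing)
    privacy_controls: list[str] = field(default_factory=list)  # privacy/ethics

record = DatasetGovernanceRecord(
    name="monthly_labour_force_survey",
    steward="national-statistics-office",
    methodology="documented sampling and seasonal adjustment, v2",
    sharing_license="CC-BY-4.0",
    privacy_controls=["k-anonymity >= 5", "no direct identifiers"],
)
print(record)
```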

Should Computers Decide How Much Things Cost?


Article by Colin Horgan: “In the summer of 2012, the Wall Street Journal reported that the travel booking website Orbitz had, in some cases, been suggesting to Apple users hotel rooms that cost more per night than those it was showing to Windows users. The company found that people who used Mac computers spent as much as 30 percent more a night on hotels. It was one of the first high-profile instances in which the predictive capabilities of algorithms were shown to affect consumer-facing prices.

Since then, the pool of data available to corporations about each of us (the information we’ve either volunteered or that can be inferred from our web browsing and buying histories) has expanded significantly, helping companies build ever more precise purchaser profiles. Personalized pricing is now widespread, even if many consumers are only just realizing what it is. Recently, other algorithm-driven pricing models, like Uber’s surge or Ticketmaster’s dynamic pricing for concerts, have surprised users and fans. In the past few months, dynamic pricing—which is based on factors such as demand—has pushed up prices of some concert tickets even before they hit the resale market, including for artists like Drake and Taylor Swift. And while personalized pricing is slightly different, these examples of computer-driven pricing have spawned headlines and social media posts that reflect a growing frustration with data’s role in how prices are dictated.
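A bare-bones sketch captures the demand-driven logic behind surge-style pricing; the formula and cap here are invented, and real systems are far more elaborate.

```python
# Bare-bones sketch of demand-driven dynamic pricing. The cap and formula
# are invented; real systems (Uber's surge, Ticketmaster's dynamic pricing)
# are far more elaborate and may fold in many more signals.

def dynamic_price(base_price, demand, supply, cap=3.0):
    """Scale a base price by the demand-to-supply ratio, up to a cap."""
    multiplier = min(max(demand / max(supply, 1), 1.0), cap)
    return round(base_price * multiplier, 2)

print(dynamic_price(100.00, demand=80,  supply=100))  # no surge: 100.0
print(dynamic_price(100.00, demand=150, supply=100))  # 1.5x surge: 150.0
print(dynamic_price(100.00, demand=500, supply=100))  # capped at 3x: 300.0
```

Personalized pricing then swaps the aggregate demand signal for buyer-specific data, which is the shift the article flags.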

The marketplace is assumed to be a realm of fairness, dictated by the rules of competition: an objective environment where one consumer is the same as any other. But this idea is being undermined by the same opaque and confusing programmatic data profiling that’s slowly encroaching on other parts of our lives—the algorithms. The Canadian government is currently considering new consumer-protection regulations, including what to do to control algorithm-based pricing. While strict market regulation is considered by some to be a political risk, another solution may exist—not at the point of sale but at the point where our data is gathered in the first place.

In theory, pricing algorithms aren’t necessarily bad…(More)”.