The Case for Abolishing Elections


Essay by Nicholas Coccoma: “Terry Bouricius remembers the moment he converted to democracy by lottery. A bookish Vermonter, now 68, he was elected to the State House in 1990 after working for years as a public official in Burlington. At first state government excited him, but he quickly grew disillusioned. “During my time as a legislator,” he told me in an interview last year, “it became obvious to me that the ‘people’s house’ was not very representative of the people who actually lived in Vermont.”

The revelation came while Bouricius was working on a housing committee. “The committee members were an outgoing and garrulous bunch,” he observed. “Shy wallflowers almost never become legislators.” More disturbing, he noted how his fellow politicians—all of whom owned their homes—tended to legislate in favor of landlords and against tenants. “I saw that the experiences and beliefs of legislators shape legislation far more than facts,” he said. “After that, I frequently commented that any 150 Vermonters pulled from the phone book would be more representative than the elected House membership.”

There is widespread disgust with electoral politics and a hunger for greater responsiveness—a hunger, in other words, for democracy.

Many Americans agree. In a poll conducted in January 2020, 65 percent of respondents said that everyday people selected by lottery—who meet some basic requirements and are willing and able to serve—would perform better or much better than elected politicians. In March last year, a Pew survey found that a staggering 79 percent believe it’s very or somewhat important for the government to create assemblies where everyday citizens from all walks of life can debate issues and make recommendations about national laws. “My decade of experience serving in the state legislature convinces me that this popular assessment is correct,” Bouricius said.

The idea—technically known as “sortition”—has been spreading. Perhaps its most prominent academic advocate is Yale political theorist Hélène Landemore. Her 2020 book Open Democracy: Reinventing Popular Rule for the Twenty-First Century explores the limitations of both direct democracy and electoral-representative democracy, advocating instead for government by large, randomly selected “mini-publics.” As she put it in conversation with Ezra Klein at the New York Times last year, “I think we are realizing the limits of just being able to choose rulers, as opposed to actually being able to choose outcomes.” She is not alone. Rutgers philosopher Alex Guerrero and Belgian public intellectual David Van Reybrouck have made similar arguments in favor of democracy by lottery. In the 2016 translation of his book Against Elections, Van Reybrouck characterizes elections as “the fossil fuel of politics.” “Whereas once they gave democracy a huge boost,” he writes, “much like the boost that oil gave the economy, now it turns out they cause colossal problems of their own.”…(More)”.
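In practice, assembling one of Landemore’s randomly selected “mini-publics” is a stratified-lottery problem: draw names at random, but enforce quotas so the body mirrors the population on traits such as region, age, or, recalling Bouricius’s housing committee, tenure. Below is a minimal sketch of that selection step; the pool, strata, and quotas are hypothetical, not any real assembly’s procedure.

```python
import random
from collections import defaultdict

def draw_assembly(pool, quotas, seed=None):
    """Draw a citizens' assembly by stratified lottery.

    pool:   list of dicts, each with a "stratum" key (e.g. a region or age band)
    quotas: dict mapping stratum -> number of seats it should receive
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for person in pool:
        by_stratum[person["stratum"]].append(person)
    assembly = []
    for stratum, seats in quotas.items():
        candidates = by_stratum.get(stratum, [])
        if len(candidates) < seats:
            raise ValueError(f"not enough willing candidates in {stratum!r}")
        assembly.extend(rng.sample(candidates, seats))
    return assembly

# Hypothetical example: a 150-seat "people's house" drawn from 10,000 volunteers,
# with seats split so renters are represented alongside homeowners.
pool = [{"id": i, "stratum": random.choice(["renter", "homeowner"])}
        for i in range(10_000)]
members = draw_assembly(pool, {"renter": 50, "homeowner": 100})
print(len(members))  # 150
```

Real sortition bodies, such as the citizens’ assemblies convened in Ireland and France, layer consent, replacement, and multiple overlapping demographic quotas on top of this basic draw.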

Algorithms Quietly Run the City of DC—and Maybe Your Hometown


Article by Khari Johnson: “Washington, DC, is the home base of the most powerful government on earth. It’s also home to 690,000 people—and 29 obscure algorithms that shape their lives. City agencies use automation to screen housing applicants, predict criminal recidivism, identify food assistance fraud, determine if a high schooler is likely to drop out, inform sentencing decisions for young people, and many other things.

That snapshot of semiautomated urban life comes from a new report from the Electronic Privacy Information Center (EPIC). The nonprofit spent 14 months investigating the city’s use of algorithms and found they were used across 20 agencies, with more than a third deployed in policing or criminal justice. For many systems, city agencies would not provide full details of how their technology worked or was used. The project team concluded that the city is likely using still more algorithms that they were not able to uncover.

The findings are notable beyond DC because they add to the evidence that many cities have quietly put bureaucratic algorithms to work across their departments, where they can contribute to decisions that affect citizens’ lives.

Government agencies often turn to automation in hopes of adding efficiency or objectivity to bureaucratic processes, but it can be difficult for citizens to know when these systems are at work, and some have been found to discriminate and lead to decisions that ruin human lives. In Michigan, an unemployment-fraud detection algorithm with a 93 percent error rate caused 40,000 false fraud allegations. A 2020 analysis by Stanford University and New York University found that nearly half of federal agencies are using some form of automated decision-making systems…(More)”.
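To make the Michigan figure concrete: a 93 percent error rate means roughly 93 of every 100 fraud determinations were wrong. A back-of-the-envelope sketch follows; only the 93 percent and the 40,000 come from the article, and the derived totals are rough inferences, not official counts.

```python
# Back-of-the-envelope arithmetic on Michigan's automated fraud system.
# Only the 93% error rate and the 40,000 false allegations come from the
# article; the derived totals below are rough inferences, not official counts.
error_rate = 0.93
false_allegations = 40_000

total_determinations = false_allegations / error_rate             # ~43,011
correct_determinations = total_determinations * (1 - error_rate)  # ~3,011

print(f"implied total fraud determinations: {total_determinations:,.0f}")
print(f"implied correct determinations:     {correct_determinations:,.0f}")
```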

How Food Delivery Workers Shaped Chinese Algorithm Regulations


Article by Matt Sheehan and Sharon Du: “In 2021, China issued a series of policy documents aimed at governing the algorithms that underpin much of the internet today. The policies included a regulation on recommendation algorithms and a draft regulation on synthetically generated media, commonly known as deepfakes. Domestically, Chinese media touted the recommendation engine regulations for the options they gave Chinese internet users, such as the choice to “turn off the algorithm” on major platforms. Outside China, these regulations have largely been seen through the prism of global geopolitics, framed as questions over whether China is “ahead” in algorithm regulations or whether it will export a “Chinese model” of artificial intelligence (AI) governance to the rest of the world.

These are valid questions with complex answers, but they overlook the core driver of China’s algorithm regulations: they are designed primarily to address China’s domestic social, economic, and political problems. The Chinese Communist Party (CCP) is the ultimate arbiter here, deciding both what counts as a problem and how it should be solved. But the CCP doesn’t operate in a vacuum. Like any governing party, it is constantly creating new policies to try to put out fires, head off problems, and respond to public desires.

Through a short case study, we can see how Chinese food delivery drivers, investigative journalists, and academics helped shape one part of the world’s first regulations on recommendation algorithms. From that process, we can learn how international actors might better predict and indirectly influence Chinese algorithm policy…(More)”.

What Moneyball-for-Everything Has Done to American Culture


Article by Derek Thompson: “…The analytics revolution, which began with the movement known as Moneyball, led to a series of offensive and defensive adjustments that were, let’s say, catastrophically successful. Seeking strikeouts, managers increased the number of pitchers per game and pushed up the average velocity and spin rate per pitcher. Hitters responded by increasing the launch angles of their swings, raising the odds of a home run, but making strikeouts more likely as well. These decisions were all legal, and more important, they were all correct from an analytical and strategic standpoint….

When universal smarts lead to universal strategies, the product can become more homogeneous. Take the NBA. When every basketball team wakes up to the calculation that three points is 50 percent more than two points, you get a league-wide blitz of three-point shooting to take advantage of the discrepancy. Before the 2011–12 season, the league as a whole had never averaged more than 20 three-point-shot attempts per game. This year, no team is attempting fewer than 25 threes per game; four teams are attempting more than 40.
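The arithmetic every front office ran is simple expected value. Here is a sketch with illustrative shooting percentages; the exact figures shift from season to season.

```python
# Expected points per shot attempt, with illustrative shooting percentages.
three_pt_pct = 0.36  # roughly a league-average 3P% in recent seasons
two_pt_pct = 0.52    # roughly a league-average 2P% in recent seasons

ev_three = 3 * three_pt_pct  # 1.08 expected points per attempt
ev_two = 2 * two_pt_pct      # 1.04 expected points per attempt

print(f"three-pointer: {ev_three:.2f} expected points per attempt")
print(f"two-pointer:   {ev_two:.2f} expected points per attempt")
```

Because every team’s spreadsheet returns the same answer, every team converges on the same shot diet.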

As I’ve written before, the quantitative revolution in culture is a living creature that consumes data and spits out homogeneity. Take the music industry. Before the ’90s, music labels routinely lied to Billboard about their sales figures to boost their preferred artists. In 1991, Billboard switched methodologies to use more objective data, including point-of-sale information and radio surveys that didn’t rely on input from the labels. The charts changed overnight. Rock-and-roll bands were toppled, and hip-hop and country surged. When the charts became more honest, they also became more static. Popular songs stick around longer than they used to. One analysis of the history of pop-music styles found that rap and hip-hop have dominated American pop music longer than any other musical genre. As the analytics revolution in music grew, radio playlists became more repetitive, and by some measures, the most popular songs became more similar to one another…(More)”.

How Technology Companies Are Shaping the Ukraine Conflict


Article by Abishur Prakash: “Earlier this year, Meta, the company that owns Facebook and Instagram, announced that people could create posts calling for violence against Russia on its social media platforms. This was unprecedented. One of the world’s largest technology firms very publicly picked sides in a geopolitical conflict. Russia was now not just fighting a country but also multinational companies with financial stakes in the outcome. In response, Russia announced a ban on Instagram within its borders. The fallout was significant. The ban, which eventually included Facebook, cost Meta close to $2 billion.

Through the war in Ukraine, technology companies are showing how their decisions can affect geopolitics, which is a massive shift from the past. Technology companies have either been dragged into conflicts because of how customers were using their services (e.g., people putting their houses in the West Bank on Airbnb) or followed the foreign policy of governments (e.g., SpaceX supplying Internet to Iran after the United States removed some sanctions)…(More)”.

Everything dies, including information


Article by Erik Sherman: “Everything dies: people, machines, civilizations. Perhaps we can find some solace in knowing that all the meaningful things we’ve learned along the way will survive. But even knowledge has a life span. Documents fade. Art goes missing. Entire libraries and collections can face quick and unexpected destruction. 

Surely, we’re at a stage technologically where we might devise ways to make knowledge available and accessible forever. After all, the density of data storage is already incomprehensibly high. In the ever-growing museum of the internet, one can move smoothly from images from the James Webb Space Telescope through diagrams explaining Pythagoras’s philosophy on the music of the spheres to a YouTube tutorial on blues guitar soloing. What more could you want?

Quite a bit, according to the experts. For one thing, what we think is permanent isn’t. Digital storage systems can become unreadable in as little as three to five years. Librarians and archivists race to copy things over to newer formats. But entropy is always there, waiting in the wings. “Our professions and our people often try to extend the normal life span as far as possible through a variety of techniques, but it’s still holding back the tide,” says Joseph Janes, an associate professor at the University of Washington Information School. 
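One standard technique in the archivist’s toolkit is the fixity check: hash every file before and after it is copied to a newer format, and flag any mismatch before the old copy is retired. A minimal sketch, with hypothetical directory paths:

```python
import hashlib
from pathlib import Path

def fixity(path, algo="sha256", chunk_size=1 << 20):
    """Compute a checksum for one file, reading it in chunks."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_migration(old_dir, new_dir):
    """Compare checksums of every file copied from old_dir to new_dir."""
    for old_file in Path(old_dir).rglob("*"):
        if not old_file.is_file():
            continue
        new_file = Path(new_dir) / old_file.relative_to(old_dir)
        if not new_file.exists() or fixity(old_file) != fixity(new_file):
            print(f"FIXITY FAILURE: {old_file}")

# Hypothetical usage, e.g. after copying a tape archive to newer media:
# verify_migration("/archive/old_format", "/archive/new_format")
```

The checksums catch silent corruption during a copy, but as Janes notes, they only hold back the tide: the formats, the media, and the software that reads them all still age.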

To complicate matters, archivists are now grappling with an unprecedented deluge of information. In the past, materials were scarce and storage space limited. “Now we have the opposite problem,” Janes says. “Everything is being recorded all the time.”…(More)”.

Democratised and declassified: the era of social media war is here


Essay by David V. Gioe & Ken Stolworthy: “In October 1962, Adlai Stevenson, US ambassador to the United Nations, grilled Soviet Ambassador Valerian Zorin about whether the Soviet Union had deployed nuclear-capable missiles to Cuba. While Zorin waffled (and didn’t know in any case), Stevenson went in for the kill: ‘I am prepared to wait for an answer until Hell freezes over… I am also prepared to present the evidence in this room.’ Stevenson then theatrically revealed several poster-sized photographs from a US U-2 spy plane, showing Soviet missile bases in Cuba, directly contradicting Soviet claims to the contrary. It was the first time that (formerly classified) imagery intelligence (IMINT) had been marshalled as evidence to publicly refute another state in high-stakes diplomacy, but it also revealed the capabilities of US intelligence collection to a stunned audience. 

During the Cuban missile crisis — and indeed until the end of the Cold War — such exquisite airborne and satellite collection was exclusively the purview of the US, UK and USSR. The world (and the world of intelligence) has come a long way in the past 60 years. By the time President Putin launched his ‘special military operation’ in Ukraine in late February 2022, IMINT and geospatial intelligence (GEOINT) were already highly democratised. Commercial satellite companies, such as Maxar or Google Earth, provide high resolution images free of charge. Thanks to such ubiquitous imagery online, anyone could see – in remarkable clarity – that the Russian military was massing on Ukraine’s border. Geolocation-stamped photos and user generated videos uploaded to social media platforms, such as Telegram or TikTok, enabled further refinement of – and confidence in – the view of Russian military activity. And continued citizen collection showed a change in Russian positions over time without waiting for another satellite to pass over the area. Of course, such a show of force was not guaranteed to presage an invasion, but there was no hiding the composition and scale of the build-up.
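The geolocation stamp on such photos is typically EXIF metadata written by the phone’s camera. Here is a minimal sketch of how an analyst might read it with the Pillow library; the file name and output coordinates are hypothetical, and many platforms strip this metadata on upload, which is one reason analysts also geolocate from visual landmarks.

```python
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def gps_from_photo(path):
    """Extract (lat, lon) from a photo's EXIF metadata, if present."""
    exif = Image.open(path)._getexif() or {}
    gps_raw = next((v for k, v in exif.items() if TAGS.get(k) == "GPSInfo"), None)
    if gps_raw is None:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

    def to_degrees(dms, ref):
        # EXIF stores coordinates as (degrees, minutes, seconds) rationals.
        degrees = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -degrees if ref in ("S", "W") else degrees

    return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
            to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

# Hypothetical usage; the file and coordinates are illustrative only:
# print(gps_from_photo("convoy_photo.jpg"))  # e.g. (50.4501, 30.5234)
```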

Once the Russians actually invaded, there was another key development – the democratisation of near real-time battlefield awareness. In a digitally connected context, everyone can be a sensor or intelligence collector, wittingly or unwittingly. This dispersed and crowd-sourced collection against the Russian campaign was based on the huge number of people taking pictures of Russian military equipment and formations in Ukraine and posting them online. These average citizens likely had no idea what exactly they were snapping a picture of, but established military experts on the internet did. Sometimes within minutes, internet platforms such as Twitter had thread after thread identifying what the pictures were and what they revealed, providing what intelligence professionals call Russian ‘order of battle’…(More)”.

Citizen science tackles plastics in Ghana


Interview with Dilek Fraisl and Omar Seidu by Stephanie Olen: “An estimated 8 million tonnes of plastic waste leaks into the ocean every year, and Ghana generates approximately 1.1 million tonnes of plastics per year. This is due to the substantial economic growth that Ghana has experienced in recent years, as well as annual population growth of 2.2%, which has pushed the Ghanaian authorities to act. Ghana was the first African country to join the Global Plastic Action Partnership in 2019. Ghana also has a growing and active citizen science beach clean-up community, including one of our project partners, the Smart Nature Freak Youth Volunteers Foundation (SNFYVF).

Before our work, Ghana had no official data available related to marine plastic litter. Based on the data collected through citizen science initiatives in the country and our project ‘Citizen Science for the SDGs in Ghana’ (CS4SDGs), we now know that in 2020 alone more than 152 million plastic items were found along the beaches in the country…

One of the key factors in the success of our project was Ghana’s progressive approach to the use of new sources of data for official statistics. For example, the Ghanaian Government passed the new Statistical Service Act in 2019, which mandates the GSS to coordinate statistical information across the whole government system, develop and raise awareness of codes of ethics and practices to produce data, and include new sources of data as a valid input for the production of official statistics. This shows that effective legal arrangements can lay the groundwork for citizen science data to be used as official statistics and for SDG monitoring and reporting. Political commitment from the partners in Ghana also helped to achieve success. Ultimately, without the support of citizen science and action groups in the country that actually collected the litter and the data on the ground, this project would never have been successful. Since the start, citizen scientists have been willing to work with the government agencies and international partners, as well as other key stakeholders, to support our project, which played a significant role in achieving our result…(More)”.

Public Access to Advance Equity


Essay by Alondra Nelson, Christopher Marcum and Jedidah Isler: “Open science began in the scientific community as a movement committed to making all aspects of research freely available to all members of society. 

As a member of the Organisation for Economic Co-operation and Development (OECD), the United States is committed to promoting open science, which the OECD defines as “unhindered access to scientific articles, access to data from public research, and collaborative research enabled by information and communication technology tools and incentives.”

At the White House Office of Science and Technology Policy (OSTP), we have been inspired by the movement to push for openness in research by community activists, researchers, publishers, higher-education leaders, policymakers, patient advocates, scholarly associations, librarians, open-government proponents, philanthropic organizations, and the public. 

Open science is an essential part of the Biden-Harris administration’s broader commitment to providing public access to data, publications, and the other important products of the nation’s taxpayer-supported research and innovation enterprise. We look to the lessons, methods, and products of open science to deliver on this commitment to policy that advances equity, accelerates discovery and innovation, provides opportunities for all to participate in research, promotes public trust, and is evidence-based. Here, we detail some of the ways OSTP is working to expand the American public’s access to the federal research and development ecosystem, and to ensure it is open, equitable, and secure…(More)”.

Could an algorithm predict the next pandemic?


Article by Simon Makin: “Leap is a machine-learning algorithm that uses sequence data to classify influenza viruses as either avian or human. The model had been trained on a huge number of influenza genomes — including examples of H5N8 — to learn the differences between those that infect people and those that infect birds. But the model had never seen an H5N8 virus categorized as human, and Carlson was curious to see what it made of this new subtype.

Somewhat surprisingly, the model identified it as human with 99.7% confidence. Rather than simply reiterating patterns in its training data, such as the fact that H5N8 viruses do not typically infect people, the model seemed to have inferred some biological signature of compatibility with humans. “It’s stunning that the model worked,” says Carlson. “But it’s one data point; it would be more stunning if I could do it a thousand more times.”
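The article does not describe Leap’s internals, but host classifiers of this general kind are often built by slicing genome sequences into overlapping k-mers and training a standard supervised model on labeled examples. The sketch below illustrates that generic pipeline only; it is not Carlson’s experiment or the actual Leap model, and the sequences are fabricated placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def kmerize(seq, k=4):
    """Represent a sequence as a 'sentence' of overlapping k-mers."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy training data: these strings are fabricated placeholders, not real genomes.
train_seqs = ["ATGCGTACGTTAGC", "ATGCGTCCGTTAGC", "TTGACGATCGGCTA", "TTGACGTTCGGCTA"]
train_hosts = ["human", "human", "avian", "avian"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit([kmerize(s) for s in train_seqs], train_hosts)

# Classify an unseen sequence and report the model's confidence.
new_seq = "ATGCGTACGTTGGC"
label = model.predict([kmerize(new_seq)])[0]
confidence = model.predict_proba([kmerize(new_seq)]).max()
print(f"{label} ({confidence:.1%} confidence)")
```

A real model would train on thousands of labeled genomes with far richer features; the point is only the shape of the pipeline: sequence in, host-compatibility probability out.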

The zoonotic process of viruses jumping from wildlife to people causes most pandemics. As climate change and human encroachment on animal habitats increase the frequency of these events, understanding zoonoses is crucial to efforts to prevent pandemics, or at least to be better prepared.

Researchers estimate that around 1% of the mammalian viruses on the planet have been identified, so some scientists have attempted to expand our knowledge of this global virome by sampling wildlife. This is a huge task, but over the past decade or so, a new discipline has emerged — one in which researchers use statistical models and machine learning to predict aspects of disease emergence, such as global hotspots, likely animal hosts or the ability of a particular virus to infect humans. Advocates of such ‘zoonotic risk prediction’ technology argue that it will allow us to better target surveillance to the right areas and situations, and guide the development of vaccines and therapeutics that are most likely to be needed.

However, some researchers are sceptical of the ability of predictive technology to cope with the scale and ever-changing nature of the virome. Efforts to improve the models and the data they rely on are under way, but these tools will need to be a part of a broader effort if they are to mitigate future pandemics…(More)”.