Statistical comfort distorts our politics


Wolfgang Münchau at the Financial Times: “…So how should we deal with data and statistics in areas where we are not experts?

My most important advice is to treat statistics as tools to help you ask questions, not to answer them. If you have to seek answers from data, make sure that you understand the issues and that the data are independently verified by people with no skin in the game.

What I am issuing here is a plea for perspective, not a rant against statistics. On the contrary. I am in awe of mathematical statistics and its theoretical foundations.

Modern statistics has a profound impact on our daily lives. I rely on Google’s statistical translation technology to obtain information from Danish newspapers, for example. Statistical advances allow our smartphone cameras to see in the dark, or medical imaging devices to detect disease. But political data are of a much more uncertain quality. In political discussions, especially on social networks, statistics are used almost entirely to confirm political biases or as weapons in an argument. To the extent that this is so, you are better off without them….(More)”.

A World With a Billion Cameras Watching You Is Just Around the Corner


Liza Lin and Newley Purnell at the Wall Street Journal: “As governments and companies invest more in security networks, hundreds of millions more surveillance cameras will be watching the world in 2021, mostly in China, according to a new report.

The report, from industry researcher IHS Markit, to be released Thursday, said the number of cameras used for surveillance would climb above 1 billion by the end of 2021. That would represent an almost 30% increase from the 770 million cameras today. China would continue to account for a little over half the total.

Fast-growing, populous nations such as India, Brazil and Indonesia would also help drive growth in the sector, the report said. The number of surveillance cameras in the U.S. would grow to 85 million by 2021, from 70 million last year, as American schools, malls and offices seek to tighten security on their premises, IHS analyst Oliver Philippou said.

Mr. Philippou said government programs to implement widespread video surveillance to monitor the public would be the biggest catalyst for the growth in China. City surveillance also was driving demand elsewhere.

“It’s a public-safety issue,” Mr. Philippou said in an interview. “There is a big focus on crime and terrorism in recent years.”

The global security-camera industry has been energized by breakthroughs in image quality and artificial intelligence. These allow better and faster facial recognition and video analytics, which governments are using to do everything from managing traffic to predicting crimes.

China leads the world in the rollout of this kind of technology. It is home to the world’s largest camera makers, with its cameras on street corners, along busy roads and in residential neighborhoods….(More)”.

Is There a Crisis of Truth?


Essay by Steven Shapin: “…It seems irresponsible or perverse to reject the idea that there is a Crisis of Truth. No time now for judicious reflection; what’s needed is a full-frontal attack on the Truth Deniers. But it’s good to be sure about the identity of the problem before setting out to solve it. Conceiving the problem as a Crisis of Truth, or even as a Crisis of Scientific Authority, is not, I think, the best starting point. There’s no reason for complacency, but there is reason to reassess which bits of our culture are in a critical state and, once they are securely identified, what therapies are in order.

Start with the idea of Truth. What could be more important, especially if the word is used — as it often is in academic writing — as a placeholder for Reality? But there’s a sort of luminous glow around the notion of Truth that prejudges and pre-processes the attitudes proper to entertain about it. The Truth goes marching on. God is Truth. The Truth shall set you free. Who, except the mad and the malevolent, could possibly be against Truth? It was, after all, Pontius Pilate who asked, “What is Truth?” — and then went off to wash his hands.

So here’s an only apparently pedantic hint about how to construe Truth and also about why our current problem might not be described as a Crisis of Truth. In modern common usage, Truth is a notably uncommon term. The natural home of Truth is not in the workaday vernacular but in weekend, even language-gone-on-holiday, scenes. The notion of Truth tends to crop up when statements about “what’s the case” are put under pressure, questioned, or picked out for celebration. Statements about “the case” can then become instances of the Truth, surrounded by an epistemic halo. Truth is invoked when we swear to tell it — “the whole Truth and nothing but” — in legal settings or in the filling-out of official forms when we’re cautioned against departing from it; or in those sorts of school and bureaucratic exams where we’re made to choose between True and False. Truth is brought into play when it’s suspected that something of importance has been willfully obscured — as when Al Gore famously responded to disbelief in climate change by insisting on “an inconvenient truth” or when we demand to be told the Truth about the safety of GMOs. [2]

Truth-talk appears in such special-purpose forums as valedictory statements where scientists say that their calling is a Search for Truth. And it’s worth considering the difference between saying that and saying they’re working to sequence a breast cancer gene or to predict when a specific Indonesian volcano is most likely to erupt. Truth stands to Matters-That-Are-the-Case roughly as incantations, proverbs, and aphorisms stand to ordinary speech. Truth attaches more to some formal intellectual practices than to others — to philosophy, religion, art, and, of course, science, even though in science there is apparent specificity. Compare those sciences that seem good fits with the notion of a Search for Truth to those that seem less good fits: theoretical physics versus seismology, academic brain science versus research on the best flavoring for a soft drink. And, of course, Truth echoes around philosophy classrooms and journals, where theories of what it is are advanced, defended, and endlessly disputed. Philosophers collectively know that Truth is very important, but they don’t collectively know what it is.

I’ve said that Truth figures in worries about the problems of knowledge we’re said to be afflicted with, where saying that we have a Crisis of Truth both intensifies the problem and gives it a moral charge. In May 2019, Angela Merkel gave the commencement speech at Harvard. Prettily noting the significance of Harvard’s motto, Veritas, the German Chancellor described the conditions for academic inquiry, which, she said, requires that “we do not describe lies as truth and truth as lies,” nor that “we accept abuses [Missstände] as normal.” The Harvard audience stood and cheered: they understood the coded political reference to Trump and evidently agreed that the opposite of Truth was a lie — not just a statement that didn’t match reality but an intentional deception. You can, however, think of Truth’s opposite as nonsense, error, or bullshit, but calling it a lie was to position Truth in a moral field. Merkel was not giving Harvard a lesson in philosophy but a lesson in global civic virtue….(More)”.

The Downside of Tech Hype


Jeffrey Funk at Scientific American: “Science and technology have been the largest drivers of economic growth for more than 100 years. But this contribution seems to be declining. Growth in labor productivity has slowed, corporate revenue growth per research dollar has fallen, the value of Nobel Prize–winning research has declined, and the number of researchers needed to develop new molecular entities (e.g., drugs) and to achieve the same percentage improvements in crop yields and in the number of transistors on a microprocessor chip (commonly known as Moore’s Law) has risen. More recently, the percentage of profitable start-ups at the time of their initial public stock offering has dropped to record lows not seen since the dot-com bubble, and start-ups such as Uber, Lyft and WeWork have accumulated losses much larger than ever seen by start-ups, including Amazon.

Although the reasons for these changes are complex and unclear, one thing is certain: excessive hype about new technologies makes it harder for scientists, engineers and policy makers to objectively analyze and understand these changes, or to make good decisions about new technologies.

One driver of hype is the professional incentives of venture capitalists, entrepreneurs, consultants and universities. Venture capitalists have convinced decision makers that venture capitalist funding and start-ups are the new measures of their success. Professional and business service consultants hype technology for both incumbents and start-ups to make potential clients believe that new technologies make existing strategies, business models and worker skills obsolete every few years.

Universities are themselves a major source of hype. Their public relations offices often exaggerate the results of research papers, commonly implying that commercialization is close at hand, even though the researchers know it will take many years if not decades. Science and engineering courses often imply an easy path to commercialization, while misleading and inaccurate forecasts from Technology Review and Scientific American make it easier for business schools and entrepreneurship programs to claim that opportunities are everywhere and that incumbent firms are regularly being disrupted. With a growth in entrepreneurship programs from about 16 in 1970 to more than 2,000 in 2014, many young people now believe that being an entrepreneur is the cool thing to be, regardless of whether they have a good idea.

Hype from these types of experts is exacerbated by the growth of social media, the falling cost of website creation, blogging, posting of slides and videos, and the growing number of technology news, investor and consulting websites….(More)”.

Facial recognition needs a wider policy debate


Editorial Team of the Financial Times: “In his dystopian novel 1984, George Orwell warned of a future under the ever vigilant gaze of Big Brother. Developments in surveillance technology, in particular facial recognition, mean the prospect is no longer the stuff of science fiction.

In China, the government was this year found to have used facial recognition to track the Uighurs, a largely Muslim minority. In Hong Kong, protesters took down smart lamp posts for fear of their actions being monitored by the authorities. In London, the consortium behind the King’s Cross development was forced to halt the use of two cameras with facial recognition capabilities after regulators intervened. All over the world, companies are pouring money into the technology.

At the same time, governments and law enforcement agencies of all hues are proving willing buyers of a technology that is still evolving — and doing so despite concerns over the erosion of people’s privacy and human rights in the digital age. Flaws in the technology have, in certain cases, led to inaccuracies, in particular when identifying women and minorities.

The news this week that Chinese companies are shaping new standards at the UN is the latest sign that it is time for a wider policy debate. Documents seen by this newspaper revealed Chinese companies have proposed new international standards at the International Telecommunication Union, or ITU, a Geneva-based organisation of industry and official representatives, for things such as facial recognition. Setting standards for what is a revolutionary technology — one recently described as the “plutonium of artificial intelligence” — before a wider debate about its merits and what limits should be imposed on its use, can only lead to unintended consequences. Crucially, standards ratified in the ITU are commonly adopted as policy by developing nations in Africa and elsewhere — regions where China has long wanted to expand its influence. A case in point is Zimbabwe, where the government has partnered with Chinese facial recognition company CloudWalk Technology. The investment, part of Beijing’s Belt and Road investment in the country, will see CloudWalk technology monitor major transport hubs. It will give the Chinese company access to valuable data on African faces, helping to improve the accuracy of its algorithms….

Progress is needed on regulation. Proposals by the European Commission for laws to give EU citizens explicit rights over the use of their facial recognition data as part of a wider overhaul of regulation governing artificial intelligence are welcome. The move would bolster citizens’ protection above existing restrictions laid out under its general data protection regulation. Above all, policymakers should be mindful that if the technology’s unrestrained rollout continues, it could hold implications for other, potentially more insidious, innovations. Western governments should step up to the mark — or risk having control of the technology’s future direction taken from them….(More)”.

The Challenges of Sharing Data in an Era of Politicized Science


Editorial by Howard Bauchner in JAMA: “The goal of making science more transparent—sharing data, posting results on trial registries, use of preprint servers, and open access publishing—may enhance scientific discovery and improve individual and population health, but it also comes with substantial challenges in an era of politicized science, enhanced skepticism, and the ubiquitous world of social media. The recent announcement by the Trump administration of plans to proceed with an updated version of the proposed rule “Strengthening Transparency in Regulatory Science,” stipulating that all underlying data from studies that underpin public health regulations from the US Environmental Protection Agency (EPA) must be made publicly available so that those data can be independently validated, epitomizes some of these challenges. According to EPA Administrator Andrew Wheeler: “Good science is science that can be replicated and independently validated, science that can hold up to scrutiny. That is why we’re moving forward to ensure that the science supporting agency decisions is transparent and available for evaluation by the public and stakeholders.”

Virtually every time JAMA publishes an article on the effects of pollution or climate change on health, the journal immediately receives demands from critics to retract the article for various reasons. Some individuals and groups simply do not believe that pollution or climate change affects human health. Research on climate change, and the effects of climate change on the health of the planet and human beings, if made available to anyone for reanalysis, could be manipulated to find a different outcome than initially reported. In an age of skepticism about many issues, including science, with the ability to use social media to disseminate unfounded and at times potentially harmful ideas, it is challenging to balance the potential benefits of sharing data with the harms that could be done by reanalysis.

Can the experience of sharing data derived from randomized clinical trials (RCTs)—either as mandated by some funders and journals or as supported by individual investigators—serve as an example of a way to safeguard “truth” in science….

Although the sharing of data may have numerous benefits, it also comes with substantial challenges particularly in highly contentious and politicized areas, such as the effects of climate change and pollution on health, in which the public dialogue appears to be based on as much fiction as fact. The sharing of data, whether mandated by funders, including foundations and government, or volunteered by scientists who believe in the principle of data transparency, is a complicated issue in the evolving world of science, analysis, skepticism, and communication. Above all, the scientific process—including original research and reanalysis of shared data—must prevail, and the inherent search for evidence, facts, and truth must not be compromised by special interests, coercive influences, or politicized perspectives. There are no simple answers, just words of caution and concern….(More)”.

A New Wave of Deliberative Democracy


Essay by Claudia Chwalisz: “….Deliberative bodies such as citizens’ councils, assemblies, and juries are often called “deliberative mini-publics” in academic literature. They are just one aspect of deliberative democracy and involve randomly selected citizens spending a significant period of time developing informed recommendations for public authorities. Many scholars emphasize two core defining features: deliberation (careful and open discussion to weigh the evidence about an issue) and representativeness, achieved through sortition (random selection).

Of course, the principles of deliberation and sortition are not new. Rooted in ancient Athenian democracy, they were used at various points in history until around two to three centuries ago. Evoked by the Greek statesman Pericles in 431 BCE, the ideas—that “ordinary citizens, though occupied with the pursuits of industry, are still fair judges of public matters” and that instead of being a “stumbling block in the way of action . . . [discussion] is an indispensable preliminary to any wise action at all”—faded to the background when elections came to dominate the contemporary notion of democracy.

But the belief in the ability of ordinary citizens to deliberate and participate in public decisionmaking has come back into vogue over the past several decades. And it is modern applications of the principles of sortition and deliberation, meaning their adaption in the context of liberal representative democratic institutions, that make them “democratic innovations” today. This is not to say that there are no longer proponents who claim that governance should be the domain of “experts” who are committed to govern for the general good and have superior knowledge to do it. Originally espoused by Plato, the argument in favor of epistocracy—rule by experts—continues to be reiterated, such as in Jason Brennan’s 2016 book Against Democracy. It is a reminder that the battle of ideas for democracy’s future is nothing new and requires constant engagement.

Today’s political context—characterized by political polarization; mistrust in politicians, governments, and fellow citizens; voter apathy; increasing political protests; and a new context of misinformation and disinformation—has prompted politicians, policymakers, civil society organizations, and citizens to reflect on how collective public decisions are being made in the twenty-first century. In particular, political tensions have raised the need for new ways of achieving consensus and taking action on issues that require long-term solutions, such as climate change and technology use. Assembling ordinary citizens from all parts of society to deliberate on a complex political issue has thus become even more appealing.

Some discussions have returned to exploring democracy’s deliberative roots. An ongoing study by the Organization for Economic Co-operation and Development (OECD) is analyzing over 700 cases of deliberative mini-publics commissioned by public authorities to inform their decisionmaking. The forthcoming report assesses the mini-publics’ use, principles of good practice, and routes to institutionalization. This new area of work stems from the 2017 OECD Recommendation of the Council on Open Government, which recommends that adherents (OECD members and some nonmembers) grant all stakeholders, including citizens, “equal and fair opportunities to be informed and consulted and actively engage them in all phases of the policy-cycle” and “promote innovative ways to effectively engage with stakeholders to source ideas and co-create solutions.” A better understanding of how public authorities have been using deliberative mini-publics to inform their decisionmaking around the world, not just in OECD countries, should provide a richer understanding of what works and what does not. It should also reveal the design principles needed for mini-publics to effectively function, deliver strong recommendations, increase legitimacy of the decisionmaking process, and possibly even improve public trust….(More)”.
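To make the mechanics of sortition described above concrete, here is a minimal illustrative sketch of a stratified random draw for a citizens' assembly. The pool, strata, and quota rule are invented for illustration only; real-world selections, including the OECD cases mentioned in the essay, rely on census-based targets and more elaborate quota algorithms.

```python
import random
from collections import Counter

def sortition(pool, strata_keys, n_seats, seed=None):
    """Draw a mini-public of n_seats from a volunteer pool, allocating seats
    to demographic strata in proportion to their share of the pool, then
    sampling at random within each stratum (illustrative sketch only)."""
    rng = random.Random(seed)
    # Group the pool by the chosen demographic profile (e.g. age band + region).
    strata = {}
    for person in pool:
        key = tuple(person[k] for k in strata_keys)
        strata.setdefault(key, []).append(person)
    # Proportional seat quotas, rounded down; hand leftover seats to the
    # most under-represented strata until all seats are allocated.
    quotas = {k: len(members) * n_seats // len(pool) for k, members in strata.items()}
    while sum(quotas.values()) < n_seats:
        k = max(strata, key=lambda s: len(strata[s]) / len(pool) - quotas[s] / n_seats)
        quotas[k] += 1
    # Random draw within each stratum.
    selected = []
    for k, members in strata.items():
        selected.extend(rng.sample(members, min(quotas[k], len(members))))
    return selected

# Hypothetical pool of 10,000 volunteers tagged with age band and region.
rng = random.Random(0)
pool = [{"id": i,
         "age_band": rng.choice(["18-34", "35-54", "55+"]),
         "region": rng.choice(["north", "south", "east", "west"])}
        for i in range(10_000)]

assembly = sortition(pool, ["age_band", "region"], n_seats=100, seed=42)
print(Counter(p["age_band"] for p in assembly))  # roughly mirrors the pool's age mix
```

The point of the sketch is simply that random selection within proportional quotas yields a body whose demographic mix approximates the wider population, which is what gives deliberative mini-publics their claim to representativeness.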

The people, not governments, should exercise digital sovereignty


John Thornhill at the Financial Times: “European politicians who have been complaining recently about the loss of “digital sovereignty” to US technology companies are like children grumbling in the back of a car about where they are heading. …

Sovereign governments used to wield exclusive power over validating identity, running critical infrastructure, regulating information flows and creating money. Several of those functions are being usurped by the latest tech.

Emmanuel Macron, France’s president, recently told The Economist that Europe had inadvertently abandoned the “grammar” of sovereignty by allowing private companies, rather than public interest, to decide on digital infrastructure. In 10 years’ time, he feared, Europe would no longer be able to guarantee the soundness of its cyber infrastructure or control its citizens’ and companies’ data.

The instinctive response of many European politicians is to invest in grand, state-led projects and to regulate the life out of Big Tech. A recent proposal to launch a European cloud computing company, called Gaia-X, reflects the same impulse that lay behind the creation of Quaero, the Franco-German search engine set up in 2008 to challenge Google. That you have to Google “Quaero” rather than Quaero “Quaero” tells you how that fared. The risk of ill-designed regulation is that it can stifle innovation and strengthen the grip of dominant companies.

Rather than just trying to shore up the diminishing sovereignty of European governments and prop up obsolete national industrial champions, leaders may do better to reshape the rules of the data economy to empower users and stimulate a new wave of innovation. True sovereignty, after all, lies in the hands of the people. To this end, Europe should encourage greater efforts to “re-decentralise the web”, as computer scientists say, to accelerate the development of the next generation internet. The principle of privacy by design should be enshrined in the next batch of regulations, following the EU’s landmark General Data Protection Regulation, and written into all public procurement contracts. …(More)”.

Is Plagiarism Wrong?


Agnes Callard at The Point: “…Academia has confused a convention with a moral rule, and this confusion is not unmotivated. We academics cannot make much money off the papers and books in which we express our ideas, and ideas cannot be copyrighted, so we have invented a moral law that offers us the “property rights” the legal system denies us.

Here is an analogy. Suppose that I legally own a tree on the edge of my property, but not the apples that fall into the road. I might create a set of norms that shame people who take those apples: if you want one of “my” road-apples, you must first bow down to me or kiss my ring. Otherwise I will call you a “thief,” and if you insist that the apple you have just picked up is your own, I will compound the charge with “liar.”

The academic’s problem is that all of their apples fall into the road. Academia is an honor-culture, in which recognition—in the form of citations—serves as a kind of ersatz currency. In ancient Greek, there is a word “pleonexia,” which means “grasping after more than your share.” Plagiarism norms encourage pleonectic overreach. One can see such overreach in the fact that those with perfect job-security—famous, tenured faculty—do not seem less given to touchiness about having “their” ideas surface in the work of another, unattributed. Quite the contrary. The higher one rises, the louder the call for obeisance: kiss my ring! Stigmatizing plagiarism serves those at the top.

But isn’t there some form of reward—respect, gratitude, admiration, eternal life in historical memory—that people are entitled to on the basis of their intellectual work?

No. If you are an academic and you want to feel waves upon waves of gratitude, I have a simple recommendation for you: do a half-decent job teaching undergraduates. You don’t need to—and probably shouldn’t—teach them “your” ideas; if you help them furnish their minds with some of the great ideas of the past few millennia, they will thank you in ways no citation can: gushing notes of heartfelt appreciation, trinkets to fill your office, email messages of remembrance a decade later. They will come to your funeral. They will tell their children about you.

More generally, if I may indulge in some moralism myself, I would insist that no one can be entitled to gratitude or remembrance or appreciation. Write something worth reading. Put your ideas out there, and hope that someone will make something of them. Give with an open hand, and stop thinking about the tokens with which you will be repaid. Be happy to be worth stealing from. The future owes you nothing….(More)”.

Fixing Democracy Demands the Building and Aligning of People’s Motivation and Authority to Act


Hahrie Han at SSIR: “Power operates in every domain of human life: in families and communities; in social, civic, and economic organizations; and in political states and regimes. Reclaiming democracy means contending with power.

Yet reformers are often reluctant to confront problems of power. Revealing underlying power dynamics can be complex and uncomfortable. It is often tempting to try to solve problems by instead looking for policy fixes, new technologies, and informational solutions.

In fact, some problems can be solved through policy, technology, and information. For instance, when doctors wanted to reduce the rate of Sudden Infant Death Syndrome (SIDS) in the early 1990s, they launched a campaign to teach parents to put babies to sleep on their backs instead of on their stomachs. Once parents had the knowledge that babies who sleep on their backs are less likely to suffocate, they made the necessary change and the SIDS rates dramatically declined. When scientists used technology to create the polio vaccine, they were able to basically eradicate polio. In these examples, there is an alignment, broadly speaking, between the motivation to act and the authority to act. Because parents have both the motivation to protect their children and the authority to determine how they sleep, when they had the information they needed, they adjusted their behaviors.

Problems of power, however, are different because there is usually a misalignment between motivation and authority. Either those who have the motivation to make change lack the authority or capacity to act, or those who have the authority lack the motivation. Solving problems of power, then, requires bringing motivation and authority into alignment.

Recasting challenges of democracy as problems of power makes visible a distinct set of solutions. Considered in this frame, the embrace of antidemocratic authoritarian ideologies around the world is not just a rejection of particular candidates, parties, or policies. Instead, it is a reflection of the profound mismatch between the motivations or interests of the public and the actions of those with authority to act. If people are left feeling powerless, they might believe they have no choice but to blow up the system.

But giving up on democracy is not the only solution. Reformers can also seek to strengthen the capacity of people to exercise their voices in the democratic process—and instantiate the authority they have to hold economic and political leaders accountable within institutions. Realizing democracy must be about building the motivation, capacity, and authority that people of all kinds need to act as a source of countervailing power to institutions of the economy and the state. That is realizing the promise of democracy….(More)”.