Might social intelligence save Latin America from its governments in times of Covid-19?


Essay by Thamy Pogrebinschi: “…In such scenarios, it seems relevant to acknowledge the limits of the state to deal with huge and unpredictable challenges and thus the need to resort to civil society. State capacity cannot be built overnight, but social intelligence is an unlimited and permanently available resource. In recent years, digital technology has multiplied what has long been called social intelligence (Dewey) and is now more often known as collective intelligence (Lévy), the wisdom of crowds (Surowiecki), or democratic reason (Landemore).

Taken together, these concepts point to the most powerful tool available to governments facing hard problems and unprecedented challenges: the sourcing and sharing of knowledge, information, skills, resources, and data from citizens in order to address social and political problems.

The Covid-19 pandemic presents an opportunity to test the potential of social intelligence as fuel for processes of creative collaboration that may aid governments to reinvent themselves and prepare for the challenges that will remain after the virus is gone. By creative collaboration, I mean a range of forms of communication, action, and connection among citizens themselves, between citizens and civil society organizations (CSOs), and between the latter two and their governments, all with the common aim of addressing problems that affect all and that the state for various reasons cannot (satisfactorily) respond to alone.

While several Latin American countries have been stuck in the Covid-19 crisis, with governments unable or unwilling to contain it or to reduce its damage, a substantial number of digital democratic innovations have been advanced by civil society in the past few months. These comprise institutions, processes, and mechanisms that rely on digital citizen participation as a means to address social and political problems – and, more recently, also problems of a humanitarian nature….

Between March 16 and July 1 of this year, at least 400 digital democratic innovations were created across 18 countries in Latin America with the specific aim of handling the Covid-19 crisis and mitigating its impact, according to recent data from the LATINNO project. These innovations are essentially mechanisms and processes in which citizens, with the aid of digital tools, are enabled to address social, political, and humanitarian problems related to the pandemic.

Citizens engage in and contribute to three levels of responses, which are based on information, connection, and action. About one-fourth of these digital democratic innovations clearly rely on crowdsourcing social intelligence.

The great majority of those digital innovations have been developed by CSOs. Around 75% of them have no government involvement at all, which is striking in a region known for implementing state-driven citizen participation as a result of the democratization processes that took place in the late 20th century. Civil society has stepped in across most countries, particularly where government responses were absent (Brazil and Nicaragua), slow (Mexico), insufficient due to lack of economic resources (Argentina) or infrastructure (Peru), or simply inefficient (Chile).

Based on these data from 18 Latin American countries, one can observe that digital democratic innovations address challenges posed by the Covid-19 outbreak in five main ways: first, generating verified information and reliable data; second, geolocating problems, needs, and demands; third, mobilizing resources, skills, and knowledge to address those problems, needs, and demands; fourth, connecting demand (individuals and organizations in need) and supply (individuals and organizations willing to provide whatever is needed); and fifth and finally, implementing and monitoring public policies and actions. In some countries, there is a sixth use that cuts across the other five: assisting vulnerable groups such as the elderly, women, children and youth, indigenous peoples, and Afro-descendants….(More)”

COVID Data Failures Create Pressure for Public Health System Overhaul


Kaiser Health News: “After terrorists slammed a plane into the Pentagon on 9/11, ambulances rushed scores of the injured to community hospitals, but only three of the patients were taken to specialized trauma wards. The reason: The hospitals and ambulances had no real-time information-sharing system.

Nineteen years later, there is still no national data network that enables the health system to respond effectively to disasters and disease outbreaks. Many doctors and nurses must fill out paper forms on COVID-19 cases and available beds and fax them to public health agencies, causing critical delays in care and hampering the effort to track and block the spread of the coronavirus.

There are signs the COVID-19 pandemic has created momentum to modernize the nation’s creaky, fragmented public health data system, in which nearly 3,000 local, state and federal health departments set their own reporting rules and vary greatly in their ability to send and receive data electronically.

Sutter Health and UC Davis Health, along with nearly 30 other provider organizations around the country, recently launched a collaborative effort to speed and improve the sharing of clinical data on individual COVID cases with public health departments.

But even that platform, which contains information about patients’ diagnoses and response to treatments, doesn’t yet include data on the availability of hospital beds, intensive care units or supplies needed for a seamless pandemic response.

The federal government spent nearly $40 billion over the past decade to equip hospitals and physicians’ offices with electronic health record systems for improving treatment of individual patients. But no comparable effort has emerged to build an effective system for quickly moving information on infectious disease from providers to public health agencies.

In March, Congress approved $500 million over 10 years to modernize the public health data infrastructure. But the amount falls far short of what’s needed to update data systems and train staff at local and state health departments, said Brian Dixon, director of public health informatics at the Regenstrief Institute in Indianapolis….(More)”.

A need for open public data standards and sharing in light of COVID-19


Lauren Gardner, Jeremy Ratcliff, Ensheng Dong and Aaron Katz at the Lancet: “The disjointed public health response to the COVID-19 pandemic has demonstrated one clear truth: the value of timely, publicly available data. The Johns Hopkins University (JHU) Center for Systems Science and Engineering’s COVID-19 dashboard exists to provide this information. What grew from a modest effort to track a novel cause of pneumonia in China quickly became a mainstay symbol of the pandemic, receiving over 1 billion hits per day within weeks of its creation, primarily driven by the general public seeking information on the emerging health crisis. Critically, the data supporting the visualisation were provided in a publicly accessible repository and eagerly adopted by policy makers and the research community for purposes of modelling and planning, as evidenced by more than 1200 citations within the first 4 months of publication. Six months into the pandemic, the JHU COVID-19 dashboard still stands as the authoritative source of global COVID-19 epidemiological data.

Similar commendable efforts to facilitate public understanding of COVID-19 have since been introduced by various academic, industry, and public health entities. These costly and disparate efforts around the world were necessary to fill the gap left by the lack of an established infrastructure for real-time reporting and open data sharing during an ongoing public health crisis…

Although existing systems were in place to achieve such objectives, they were not empowered or equipped to fully meet the public’s expectation for timely open data at an actionable level of spatial resolution. Moving forward, it is imperative that a standardised reporting system for systematically collecting, visualising, and sharing high-quality data on emerging infectious and notifiable diseases in real time be established. The data should be made available at a spatial and temporal scale that is granular enough to prove useful for planning and modelling purposes. Additionally, a critical component of the proposed system is the democratisation of data; all collected information (observing necessary privacy standards) should be made publicly available immediately upon release, in machine-readable formats, and based on open data standards…(More)”. (See also https://data4covid19.org/)
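As a side note on what “machine-readable formats and open data standards” make possible: because the JHU data are published as plain CSV files in a public GitHub repository, a few lines of code are enough to pull and aggregate them. The sketch below assumes the repository’s published time-series layout (metadata columns followed by one cumulative-count column per date):

```python
# Minimal sketch of consuming open, machine-readable COVID-19 data.
# Assumes the column layout of the public JHU CSSE time-series CSV
# (Province/State, Country/Region, Lat, Long, then one column per date).
import pandas as pd

URL = ("https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
       "csse_covid_19_data/csse_covid_19_time_series/"
       "time_series_covid19_confirmed_global.csv")

def latest_confirmed(country):
    """Return the most recent cumulative confirmed count for a country."""
    df = pd.read_csv(URL)
    rows = df[df["Country/Region"] == country]
    # The last column holds the latest cumulative figure; provinces
    # are summed to produce a national total.
    return int(rows.iloc[:, -1].sum())

if __name__ == "__main__":
    print(latest_confirmed("Brazil"))
```

That the same short query works for every country on every day is precisely the dividend of granular, consistently formatted, openly licensed data.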

How Philanthropy Can Help Governments Accelerate a Real Recovery


Essay by Michele Jolin and David Medina: “The cracks and design flaws of our nation’s public systems have been starkly exposed as governments everywhere struggle to respond to health and economic crises that disproportionately devastate Black residents and communities of color. As government leaders respond to the immediate emergencies, they also operate within a legacy of government practices, policies and systems that have played a central role in creating and maintaining racial inequity. 

Philanthropy can play a unique and catalytic role in accelerating a real recovery by helping government leaders make smarter decisions: it can help them develop and effectively use the data-and-evidence capacity they need to spotlight and understand the root causes of community challenges, especially racial disparities, and to increase the impact of government investments that could close racial gaps and accelerate economic opportunity. Philanthropy can uniquely support leaders within government who are best positioned to redesign and reimagine public systems to deliver equity and impact.

We are already seeing that the growing number of governments that have built data-driven “Moneyball” muscles are better positioned both to manage through this crisis and to dismantle racist government practices. While we recognize that data and evidence can sometimes reinforce biases, we also know that government decision-makers who have access to more and better information—and who are trained to navigate the nuance and possible bias in this information—can use data to identify disparate racial outcomes, understand the core problems and target resources to close gaps. Government decision-makers who have the skills to test, learn, and improve government programs can prioritize resource allocation toward programs that both deliver better results and address the complexity of social problems.

Philanthropy can accelerate this public sector transformation by supporting change led by internal government champions who are challenging the status quo. By doing so, philanthropic leaders can increase the impact of the trillions of dollars invested by governments each year. Philanthropies such as Ballmer Group, Bloomberg Philanthropies, Blue Meridian Partners, the Bill & Melinda Gates Foundation, and Arnold Ventures understand this and are already putting their money where their mouths are. By helping governments make smarter budget and policy decisions, they can ensure that public dollars flow toward solutions that make a meaningful, measurable difference on our biggest challenges, whether it’s increasing economic mobility, reducing racial disparities in health and other outcomes, or addressing racial bias in government systems.

We need other donors to join them in prioritizing this kind of systems change….(More)”.

Why Personal Data Is a National Security Issue


Article by Susan Ariel Aaronson: “…Concerns about the national security threat from personal data held by foreigners first emerged in 2013. Several U.S. entities, including Target, J.P. Morgan, and the U.S. Office of Personnel Management, were hacked. Many attributed the hacking to Chinese actors. Administration officials concluded that the Chinese government could cross-reference legally obtained and hacked data sets to reveal information about U.S. objectives and strategy.

Personal data troves can also be cross-referenced to identify individuals, putting both personal and national security at risk. Even U.S. firms pose a direct and indirect security threat to individuals and the nation because of their failure to adequately protect personal data. For example, Facebook has a disturbing history of sharing personal data without consent and allowing its clients to use that data to manipulate users. Some app designers have enabled functionality unnecessary for their software’s operation, while others, like Anomaly 6, embedded their software in mobile apps without the permission of users or firms. Other companies use personal data without user permission to create new products. Clearview AI scraped billions of images from major web services such as Facebook, Google, and YouTube, and sold these images to law enforcement agencies around the world.

Firms can also inadvertently aggregate personal data and in so doing threaten national security. Strava, an athletes’ social network, released a heat map of its global users’ activities in 2018. Savvy analysts were able to use the heat map to reveal secret military bases and patrol routes. Chinese-owned data firms could be a threat to national security if they share data with the Chinese government. But the problem lies in the U.S.’s failure to adequately protect personal data and police the misuse of data collected without the permission of users….(More)”.

From Desert Battlefields To Coral Reefs, Private Satellites Revolutionize The View


NPR Story: “As the U.S. military and its allies attacked the last Islamic State holdouts last year, it wasn’t clear how many civilians were still in the besieged desert town of Baghouz, Syria.

So Human Rights Watch asked a private satellite company, Planet, for its regular daily photos and also made a special request for video.

“That live video actually was instrumental in convincing us that there were thousands of civilians trapped in this pocket,” said Josh Lyons of Human Rights Watch. “Therefore, the coalition forces absolutely had an obligation to stop and to avoid bombardment of that pocket at that time.”

Which they did until the civilians fled.

Lyons, who’s based in Geneva, has a job title you wouldn’t expect at a human rights group: director of geospatial analysis. He says satellite imagery is increasingly a crucial component of human rights investigations, bolstering traditional eyewitness accounts, especially in areas where it’s too dangerous to send researchers.

“Then we have this magical sort of fusion of data between open-source, eyewitness testimony and data from space. And that becomes essentially a new gold standard for investigations,” he said.

‘A string of pearls’

Satellite photos used to be restricted to the U.S. government and a handful of other nations. Now such imagery is available to everyone, creating a new world of possibilities for human rights groups, environmentalists and researchers who monitor nuclear programs.

They get those images from a handful of private, commercial satellite companies, like Planet and Maxar….(More)”.

Building and maintaining trust in research


Daniel Nunan at the International Journal of Market Research: “One of the many indirect consequences of the COVID pandemic for the research sector may be the impact upon consumers’ willingness to share data. This is reflected in concerns that government-mandated “apps” designed to facilitate COVID testing and tracking schemes will undermine trust in the commercial collection of personal data (WARC, 2020). For example, uncertainty over the consequences of handing over data and the ways in which it might be used could reduce consumers’ willingness to share data with organizations, and reverse a trend that has seen growing levels of consumer confidence in Data Protection Regulations (Data & Direct Marketing Association [DMA], 2020). This highlights how central the role of trust has become in contemporary research practice, and how fragile the process of building trust can be due to the ever-competing demands of public and private data collectors.

For researchers, there are two sides to trust. One relates to building sufficient trust with research participants to facilitate data collection, and the second is building trust with the users of research. Trust has long been understood as a key factor in effective research relationships, with trust between researchers and users of research being the key factor in determining the extent to which research is actually used (Moorman et al., 1993). In other words, a trusted messenger is just as important as the contents of the message. In recent years, there has been growing concern over declining trust in research from research participants and the general public, manifested in declining response rates and challenges in gaining participation. Understanding how to build consumer trust is more important than ever, as the shift of communication and commercial activity to digital platforms alters the mechanisms through which trust is built. Trust is therefore essential both for ensuring that accurate data can be collected and for ensuring that insights from research have the necessary legitimacy to be acted upon. The two research notes in this issue provide an insight into new areas where the issue of trust needs to be considered within research practice….(More)”.

How open data could tame Big Tech’s power and avoid a breakup


Patrick Leblond at The Conversation: “…Traditional antitrust approaches such as breaking up Big Tech firms and preventing potential competitor acquisitions are never-ending processes. Even if you break them up and block their ability to acquire other, smaller tech firms, Big Tech will start growing again because of network effects and their data advantage.

And how do we know when a tech firm is big enough to ensure competitive markets? What are the size or scope thresholds for breaking up firms or blocking mergers and acquisitions?

A small startup acquired for millions of dollars can be worth billions of dollars for a Big Tech acquirer once integrated in its ecosystem. A series of small acquisitions can result in a dominant position in one area of the digital economy. Knowing this, competition/antitrust authorities would potentially have to examine every tech transaction, however small.

Not only would this be administratively costly and resource intensive, but it would also be difficult for government officials to assess with any precision (and therefore legitimacy) the likely future economic impact of an acquisition in a rapidly evolving technological environment.

Open data access to level the playing field

Given that mass data collection is at the core of Big Tech’s power as gatekeepers to customers, a key solution is to open up data access for other firms so that they can compete better.

Anonymized data (to protect an individual’s privacy rights) about people’s behaviour, interests, views, etc., should be made available for free to anyone wanting to pursue a commercial or non-commercial endeavour. Data about a firm’s operations or performance would, however, remain private.
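What counts as adequately “anonymized” is itself a design question. Below is a minimal sketch of one standard approach, k-anonymity-style generalization and suppression; the field names, bucket sizes, and threshold are illustrative assumptions, not a reference to any existing data release:

```python
# Hedged sketch: release behavioural data only after generalizing
# quasi-identifiers (age -> decade bands) and suppressing any group
# smaller than k, a k-anonymity-style threshold. All field names
# here are hypothetical.
import pandas as pd

def k_anonymize(df, quasi_ids, k=5):
    """Keep only records whose quasi-identifier group has at least k members."""
    group_sizes = df.groupby(quasi_ids)[quasi_ids[0]].transform("size")
    return df[group_sizes >= k].copy()

records = pd.DataFrame({
    "age":      [23, 27, 24, 61, 25, 22],
    "region":   ["N", "N", "N", "S", "N", "N"],
    "interest": ["sports", "music", "sports", "news", "music", "sports"],
})
records["age_band"] = (records["age"] // 10) * 10  # generalize before grouping

released = k_anonymize(records, ["age_band", "region"], k=5)
print(released)  # the single southern 61-year-old is suppressed, not exposed
```

Thresholding like this trades some analytic detail for the assurance that no released row describes a recognizably unique individual, which is the balance the proposal above gestures at.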

Using an analogy from the finance world, Big Tech firms act as insider traders. Stock market insiders often possess insider (or private) information about companies that the public does not have. Such individuals then have an incentive to profit by buying or selling shares in those companies before the public becomes aware of the information.

Big Tech’s incentives are no different from those of stock market insiders. They trade on exclusively available private information (data) to generate extraordinary profits.

Continuing the finance analogy, financial securities regulators forbid the use of inside or non-publicly available information for personal benefit. Individuals found to illegally use such information are punished with jail time and fines.

They also require companies to publicly report relevant information that affects or could significantly affect their performance. Finally, they oblige insiders to publicly report when they buy and sell shares in a company in which they have access to privileged information.

Transposing stock market insider trading regulation to Big Tech implies that data access and use should be monitored by an independent regulatory body — call it a Data Market Authority. Such a body would be responsible for setting and enforcing principles, rules and standards of behaviour among individuals and organizations in the data-driven economy.

For example, a Data Market Authority would require firms to publicly report how they acquire and use personal data. It would prohibit personal data hoarding by ensuring that data is easily portable from one platform, network or marketplace to another. It would also prohibit the buying and selling of personal data as well as protect individuals’ privacy by imposing penalties on firms and individuals in cases of non-compliance.
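To make the reporting requirement concrete, here is a hypothetical sketch of the kind of machine-readable “data use disclosure” a firm might file with such a Data Market Authority; the schema and field names are invented for illustration, not an existing regulatory standard:

```python
# Hypothetical sketch of a machine-readable "data use disclosure" a
# Data Market Authority might require. The schema and field names are
# illustrative assumptions, not an existing regulatory standard.
import json
from dataclasses import dataclass, asdict

@dataclass
class DataUseDisclosure:
    firm: str
    data_categories: list        # kinds of personal data collected
    acquisition_methods: list    # how the data was obtained
    purposes: list               # what the data is used for
    portable_export_format: str  # format users can carry to a rival platform

disclosure = DataUseDisclosure(
    firm="ExampleCo",
    data_categories=["browsing history", "location"],
    acquisition_methods=["first-party collection"],
    purposes=["ad targeting", "product analytics"],
    portable_export_format="JSON",
)

# Published openly so regulators, rivals, and users can audit it.
print(json.dumps(asdict(disclosure), indent=2))
```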

Data openly and freely available under a strict regulatory environment would likely be a better way to tame Big Tech’s power than breaking them up and having antitrust authorities approving every acquisition that they wish to make….(More)”.

Numbers are arguably humankind’s most useful technology.


Introduction by Jay Tolson to a Special Issue of the Hedgehog Review: “At a time when distraction and mendacity degrade public discourse, the heartbreaking toll of the current pandemic should at least remind us that quantification—data, numbers, statistics—is vitally important to policy, governance, and decision-making more broadly.

Confounding as they may be to some of us, numbers are arguably humankind’s most useful technology—our greatest discovery, or possibly our greatest invention. But the current global crisis should also remind us of something equally important: Good numbers, like good science, can only do so much to inform wise decisions about our personal and collective good. They cannot, in any true sense, make those decisions for us. “Let the numbers speak for themselves” is the rhetoric of the naïf or the con artist, and should long ago have been consigned to the dustbin of pernicious hokum. Yet how seldom in these Big Data days, in our Big Data daze, does it go challenged.

Or—to consider the flip side of the current bedazzlement—how often it goes challenged in exactly the wrong way, in a way that declares all facts, all data, all science to be nothing but relative, your facts versus our facts, “alternative facts.” That is the way of sophistry, where cynicism rules and might alone makes right.

Excessive or misplaced faith in the tools that should assist us in arriving at truth—a faith that can engender dangerously unreasoning or cynical reactions—is the theme of this issue. In six essays, we explore the ways the quantitative imperative has insinuated itself into various corners of our culture and society, asserting primacy if not absolute authority in matters where it should tread modestly. In the name of numbers that measure everything from GDP to personal well-being, technocrats and other masters of the postmodern economy have engineered an increasingly soulless, instrumentalizing culture whose denizens either submit to its dictates or flail darkly and destructively against them.

The origins of this nightmare version of modernity, a version that grows increasingly real, date from at least the first stirrings of modern science in the fifteenth and sixteenth centuries, but its distinctive institutional features emerged most clearly in the early part of the last century, when progressive thinkers and leaders in politics, business, and other walks of life sought to harness humankind’s physical and mental energies to the demands of an increasingly technocratic, consumerist society.

The subjugation of human vitality to the quantifying schedules and metrics of modernity is the story that historian Jackson Lears limns in the opening essay, “Quantifying Vitality: The Progressive Paradox.” As he explains, “The emergence of statistical selves was not simply a rationalization of everyday life, a search for order…. The reliance on statistical governance coincided with and complemented a pervasive revaluation of primal spontaneity and vitality, an effort to unleash hidden strength from an elusive inner self. The collectivization epitomized in the quantitative turn was historically compatible with radically individualist agendas for personal regeneration—what later generations would learn to call positive thinking.”…(More)”.

Race and America: why data matters


Federica Cocco and Alan Smith at the Financial Times: “… To understand the historical roots of black data activism, we have to return to October 1899. Back then, Thomas Calloway, a clerk in the War Department, wrote to the educator Booker T Washington about his pitch for an “American Negro Exhibit” at the 1900 Exposition Universelle in Paris. It was right in the middle of the scramble for Africa and Europeans had developed a morbid fascination with the people they were trying to subjugate.

To Calloway, the Paris exhibition offered a unique venue to sway the global elite to acknowledge “the possibilities of the Negro” and to influence cultural change in the US from an international platform.

It is hard to overstate the importance of international fairs at the time. They were a platform to bolster the prestige of nations. In Delivering Views: Distant Cultures in Early Postcards, Robert Rydell writes that fairs had become “a vehicle that, perhaps next to the church, had the greatest capacity to influence a mass audience”….

For the Paris World Fair, Du Bois and a team of Atlanta University students and alumni designed and drew by hand more than 60 bold data portraits. A first set used Georgia as a case study to illustrate the progress made by African Americans since the Civil War.

A second set showed how “the descendants of former African slaves now in residence in the United States of America” had become lawyers, doctors, inventors and musicians. For the first time, the growth of literacy and employment rates, the value of assets and land owned by African Americans and their growing consumer power were there for everyone to see. At the 1900 World Fair, the “Exhibit of American Negroes” took up a prominent spot in the Palace of Social Economy. “As soon as they entered the building, visitors were inundated by examples of black excellence,” says Whitney Battle-Baptiste, director of the WEB Du Bois Center at the University of Massachusetts Amherst and co-author of WEB Du Bois’s Data Portraits: Visualizing Black America….(More)”

Working with students and alumni from Atlanta University, Du Bois created 60 bold data portraits for the ‘Exhibit of American Negroes’ © Library of Congress, Prints & Photographs Division