Building and maintaining trust in research


Daniel Nunan at the International Journal of Market Research: “One of the many indirect consequences of the COVID pandemic for the research sector may be the impact upon consumers’ willingness to share data. This is reflected in concerns that government-mandated “apps” designed to facilitate COVID testing and tracking schemes will undermine trust in the commercial collection of personal data (WARC, 2020). For example, uncertainty over the consequences of handing over data and the ways in which it might be used could reduce consumers’ willingness to share data with organizations, and reverse a trend that has seen growing levels of consumer confidence in Data Protection Regulations (Data & Direct Marketing Association [DMA], 2020). This highlights how central the role of trust has become in contemporary research practice, and how fragile the process of building trust can be due to the ever-competing demands of public and private data collectors.

For researchers, there are two sides to trust. One relates to building sufficient trust with research participants to facilitate data collection, and the second is building trust with the users of research. Trust has long been understood as a key factor in effective research relationships, with trust between researchers and users of research determining the extent to which research is actually used (Moorman et al., 1993). In other words, a trusted messenger is just as important as the contents of the message. In recent years, there has been growing concern over declining trust in research from research participants and the general public, manifested in declining response rates and challenges in gaining participation. Understanding how to build consumer trust is more important than ever, as the shift of communication and commercial activity to digital platforms alters the mechanisms through which trust is built. Trust is therefore essential both for ensuring that accurate data can be collected, and that insights from research have the necessary legitimacy to be acted upon. The two research notes in this issue provide an insight into new areas where the issue of trust needs to be considered within research practice….(More)”.

How open data could tame Big Tech’s power and avoid a breakup


Patrick Leblond at The Conversation: “…Traditional antitrust approaches such as breaking up Big Tech firms and preventing potential competitor acquisitions are never-ending processes. Even if you break them up and block their ability to acquire other, smaller tech firms, Big Tech will start growing again because of network effects and their data advantage.

And how do we know when a tech firm is big enough to ensure competitive markets? What are the size or scope thresholds for breaking up firms or blocking mergers and acquisitions?

A small startup acquired for millions of dollars can be worth billions of dollars for a Big Tech acquirer once integrated in its ecosystem. A series of small acquisitions can result in a dominant position in one area of the digital economy. Knowing this, competition/antitrust authorities would potentially have to examine every tech transaction, however small.

Not only would this be administratively costly or burdensome on resources, but it would also be difficult for government officials to assess with some precision (and therefore legitimacy), the likely future economic impact of an acquisition in a rapidly evolving technological environment.

Open data access to level the playing field

Given that mass data collection is at the core of Big Tech’s power as gatekeepers to customers, a key solution is to open up data access for other firms so that they can compete better.

Anonymized data (to protect an individual’s privacy rights) about people’s behaviour, interests, views, etc., should be made available for free to anyone wanting to pursue a commercial or non-commercial endeavour. Data about a firm’s operations or performance would, however, remain private.

Using an analogy from the finance world, Big Tech firms act as insider traders. Stock market insiders often possess insider (or private) information about companies that the public does not have. Such individuals then have an incentive to profit by buying or selling shares in those companies before the public becomes aware of the information.

Big Tech’s incentives are no different from those of stock market insiders. They trade on exclusively available private information (data) to generate extraordinary profits.

Continuing the finance analogy, financial securities regulators forbid the use of inside or non-publicly available information for personal benefit. Individuals found to illegally use such information are punished with jail time and fines.

They also require companies to publicly report relevant information that affects or could significantly affect their performance. Finally, they oblige insiders to publicly report when they buy and sell shares in a company in which they have access to privileged information.

Transposing stock market insider trading regulation to Big Tech implies that data access and use should be monitored by an independent regulatory body — call it a Data Market Authority. Such a body would be responsible for setting and enforcing principles, rules and standards of behaviour among individuals and organizations in the data-driven economy.

For example, a Data Market Authority would require firms to publicly report how they acquire and use personal data. It would prohibit personal data hoarding by ensuring that data is easily portable from one platform, network or marketplace to another. It would also prohibit the buying and selling of personal data as well as protect individuals’ privacy by imposing penalties on firms and individuals in cases of non-compliance.

Data openly and freely available under a strict regulatory environment would likely be a better way to tame Big Tech’s power than breaking them up and having antitrust authorities approve every acquisition that they wish to make….(More)”.

Numbers are arguably humankind’s most useful technology.


Introduction by Jay Tolson to a Special Issue of the Hedgehog Review: “At a time when distraction and mendacity degrade public discourse, the heartbreaking toll of the current pandemic should at least remind us that quantification—data, numbers, statistics—are vitally important to policy, governance, and decision-making more broadly.

Confounding as they may be to some of us, numbers are arguably humankind’s most useful technology—our greatest discovery, or possibly our greatest invention. But the current global crisis should also remind us of something equally important: Good numbers, like good science, can only do so much to inform wise decisions about our personal and collective good. They cannot, in any true sense, make those decisions for us. Let the numbers speak for themselves is the rhetoric of the naïf or the con artist, and should long ago have been consigned to the dustbin of pernicious hokum. Yet how seldom in these Big Data days, in our Big Data daze, does it go challenged.

Or—to consider the flip side of the current bedazzlement—how often it goes challenged in exactly the wrong way, in a way that declares all facts, all data, all science to be nothing but relative, your facts versus our facts, “alternative facts.” That is the way of sophistry, where cynicism rules and might alone makes right.

Excessive or misplaced faith in the tools that should assist us in arriving at truth—a faith that can engender dangerously unreasoning or cynical reactions—is the theme of this issue. In six essays, we explore the ways the quantitative imperative has insinuated itself into various corners of our culture and society, asserting primacy if not absolute authority in matters where it should tread modestly. In the name of numbers that measure everything from GDP to personal well-being, technocrats and other masters of the postmodern economy have engineered an increasingly soulless, instrumentalizing culture whose denizens either submit to its dictates or flail darkly and destructively against them.

The origins of this nightmare version of modernity, a version that grows increasingly real, date from at least the first stirrings of modern science in the fifteenth and sixteenth centuries, but its distinctive institutional features emerged most clearly in the early part of the last century, when progressive thinkers and leaders in politics, business, and other walks of life sought to harness humankind’s physical and mental energies to the demands of an increasingly technocratic, consumerist society.

The subjugation of human vitality to the quantifying schedules and metrics of modernity is the story that historian Jackson Lears limns in the opening essay, “Quantifying Vitality: The Progressive Paradox.” As he explains, “The emergence of statistical selves was not simply a rationalization of everyday life, a search for order…. The reliance on statistical governance coincided with and complemented a pervasive revaluation of primal spontaneity and vitality, an effort to unleash hidden strength from an elusive inner self. The collectivization epitomized in the quantitative turn was historically compatible with radically individualist agendas for personal regeneration—what later generations would learn to call positive thinking.”…(More)”.

Race and America: why data matters


Federica Cocco and Alan Smith at the Financial Times: “… To understand the historical roots of black data activism, we have to return to October 1899. Back then, Thomas Calloway, a clerk in the War Department, wrote to the educator Booker T Washington about his pitch for an “American Negro Exhibit” at the 1900 Exposition Universelle in Paris. It was right in the middle of the scramble for Africa and Europeans had developed a morbid fascination with the people they were trying to subjugate.

To Calloway, the Paris exhibition offered a unique venue to sway the global elite to acknowledge “the possibilities of the Negro” and to influence cultural change in the US from an international platform.

It is hard to overstate the importance of international fairs at the time. They were a platform to bolster the prestige of nations. In Delivering Views: Distant Cultures in Early Postcards, Robert Rydell writes that fairs had become “a vehicle that, perhaps next to the church, had the greatest capacity to influence a mass audience”….

For the Paris World Fair, Du Bois and a team of Atlanta University students and alumni designed and drew by hand more than 60 bold data portraits. A first set used Georgia as a case study to illustrate the progress made by African Americans since the Civil War.

A second set showed how “the descendants of former African slaves now in residence in the United States of America” had become lawyers, doctors, inventors and musicians. For the first time, the growth of literacy and employment rates, the value of assets and land owned by African Americans and their growing consumer power were there for everyone to see. At the 1900 World Fair, the “Exhibit of American Negroes” took up a prominent spot in the Palace of Social Economy. “As soon as they entered the building, visitors were inundated by examples of black excellence,” says Whitney Battle-Baptiste, director of the WEB Du Bois Center at the University of Massachusetts Amherst and co-author of WEB Du Bois’s Data Portraits: Visualizing Black America….(More)”

Working with students and alumni from Atlanta University, Du Bois created 60 bold data portraits for the ‘Exhibit of American Negroes’ © Library of Congress, Prints & Photographs Division

Why Hundreds of Mathematicians Are Boycotting Predictive Policing


Courtney Linder at Popular Mechanics: “Several prominent academic mathematicians want to sever ties with police departments across the U.S., according to a letter submitted to Notices of the American Mathematical Society on June 15. The letter arrived weeks after widespread protests against police brutality, and has inspired over 1,500 other researchers to join the boycott.

These mathematicians are urging fellow researchers to stop all work related to predictive policing software, which broadly includes any data analytics tools that use historical data to help forecast future crime, potential offenders, and victims. The technology is supposed to use probability to help police departments tailor their neighborhood coverage so it puts officers in the right place at the right time….

[Flow chart: how predictive policing works. Source: RAND]

According to a 2013 research briefing from the RAND Corporation, a nonprofit think tank in Santa Monica, California, predictive policing is made up of a four-part cycle (shown above). In the first two steps, researchers collect and analyze data on crimes, incidents, and offenders to come up with predictions. From there, police intervene based on the predictions, usually taking the form of an increase in resources at certain sites at certain times. The fourth step is, ideally, reducing crime.

“Law enforcement agencies should assess the immediate effects of the intervention to ensure that there are no immediately visible problems,” the authors note. “Agencies should also track longer-term changes by examining collected data, performing additional analysis, and modifying operations as needed.”
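The four-part cycle RAND describes can be sketched as a simple loop. This is an illustrative sketch only: the function names, data shapes and the naive "busiest areas stay busiest" forecast are assumptions for the example, not RAND's or any vendor's actual models.

```python
# Illustrative sketch of the four-part predictive policing cycle:
# (1) collect data, (2) analyze/predict, (3) intervene, (4) assess.
# All names and data structures here are hypothetical.

from collections import Counter

def analyze(incidents):
    """Steps 1-2: aggregate historical incidents into per-area counts."""
    return Counter(area for area, _time in incidents)

def predict(counts, top_n=3):
    """Step 2: naively forecast that the busiest areas stay busiest."""
    return [area for area, _ in counts.most_common(top_n)]

def intervene(hotspots):
    """Step 3: allocate extra patrol resources to predicted hotspots."""
    return {area: "extra patrol" for area in hotspots}

def assess(before, after):
    """Step 4: compare incident totals to gauge the intervention."""
    return sum(after.values()) - sum(before.values())

# Hypothetical incident log: (area, time) pairs.
incidents = [("downtown", 1), ("downtown", 2), ("harbor", 1),
             ("downtown", 3), ("harbor", 2), ("uptown", 1)]
counts = analyze(incidents)
hotspots = predict(counts, top_n=2)
plan = intervene(hotspots)
print(hotspots)  # ['downtown', 'harbor']

# Hypothetical post-intervention counts for the assessment step.
after = Counter({"downtown": 2, "harbor": 2, "uptown": 1})
change = assess(counts, after)  # negative means fewer recorded incidents
```

Note that even this toy version hints at the feedback problem critics raise: sending more patrols to the predicted hotspots tends to generate more recorded incidents there, which the next cycle then treats as confirmation.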

In many cases, predictive policing software was meant to be a tool to augment police departments facing budget crises and fewer officers to cover a region. If cops can target certain geographical areas at certain times, then they can get ahead of the 911 calls and maybe even reduce the rate of crime.

But in practice, the accuracy of the technology has been contested—and it’s even been called racist….(More)”.

Why real-time economic data need to be treated with caution


The Economist: “The global downturn of 2020 is probably the most quantified on record. Economists, firms and statisticians seeking to gauge the depth of the collapse in economic activity and the pace of the recovery have seized upon a new dashboard of previously obscure indicators. Investors eagerly await the release of mobility statistics from tech companies such as Apple or Google, or restaurant-booking data from OpenTable, in a manner once reserved for official inflation and unemployment estimates. Central bankers pepper their speeches with novel barometers of consumer spending. Investment-bank analysts and journalists tout hot new measures of economic activity in the way that hipsters discuss the latest bands. Those who prefer to wait for official measures are regarded as being like fans of U2, a sanctimonious Irish rock group: stuck behind the curve as the rest of the world has moved on.

The main attraction of real-time data to policymakers and investors alike is timeliness. Whereas official, so-called hard data, such as inflation, employment or output measures, tend to be released with a lag of several weeks, or even months, real-time data, as the name suggests, can offer a window on today’s economic conditions. The depth of the downturns induced by Covid-19 has put a premium on swift intelligence. The case for hard data has always been their quality, but this has suffered greatly during the pandemic. Compilers of official labour-market figures have struggled to account for furlough schemes and the like, and have plastered their releases with warnings about unusually high levels of uncertainty. Filling in statisticians’ forms has probably fallen to the bottom of firms’ to-do lists, reducing the accuracy of official output measures….

The value of real-time measures will be tested once the swings in economic activity approach a more normal magnitude. Mobility figures for March and April did predict the scale of the collapse in GDP, but that could have been estimated just as easily by stepping outside and looking around (at least in the places where that sort of thing was allowed during lockdown). Forecasters in rich countries are more used to quibbling over whether economies will grow at an annual rate of 2% or 3% than whether output will shrink by 20% or 30% in a quarter. Real-time measures have disappointed before. Immediately after Britain’s vote to leave the European Union in 2016, for instance, the indicators then watched by economists pointed to a sharp slowdown. It never came.

Real-time data, when used with care, have been a helpful supplement to official measures so far this year. With any luck the best of the new indicators will help official statisticians improve the quality and timeliness of their own figures. But, much like U2, the official measures have been around for a long time thanks to their tried and tested formula—and they are likely to stick around for a long time to come….(More)”.

German coronavirus experiment enlists help of concertgoers


Philip Oltermann at the Guardian: “German scientists are planning to equip 4,000 pop music fans with tracking gadgets and bottles of fluorescent disinfectant to get a clearer picture of how Covid-19 could be prevented from spreading at large indoor concerts.

As cultural mass gatherings across the world remain on hold for the foreseeable future, researchers in eastern Germany are recruiting volunteers for a “coronavirus experiment” with the singer-songwriter Tim Bendzko, to be held at an indoor stadium in the city of Leipzig on 22 August.

Participants, aged between 18 and 50, will wear matchstick-sized “contact tracer” devices on chains around their necks that transmit a signal at five-second intervals and collect data on each person’s movements and proximity to other members of the audience.
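The published accounts do not specify how the researchers reduce these five-second pings to contact events, but the idea can be sketched roughly as follows. Everything here is an assumption for illustration: the record format, the 1.5-metre contact radius and the one-minute threshold are hypothetical, not the Restart-19 protocol.

```python
# Hypothetical sketch: reducing 5-second proximity pings from the
# "contact tracer" devices to close-contact pairs. The record format
# (timestamp, device_id, x, y) and the thresholds are assumptions.

from itertools import combinations
from collections import defaultdict

PING_INTERVAL_S = 5        # devices transmit every five seconds
CONTACT_DISTANCE_M = 1.5   # assumed "close contact" radius

def close_contacts(pings, min_duration_s=60):
    """pings: iterable of (timestamp, device_id, x, y) tuples.
    Returns device pairs that were within CONTACT_DISTANCE_M of each
    other for at least min_duration_s of accumulated time."""
    by_time = defaultdict(list)
    for t, dev, x, y in pings:
        by_time[t].append((dev, x, y))

    time_near = defaultdict(float)
    for devices in by_time.values():
        for (d1, x1, y1), (d2, x2, y2) in combinations(devices, 2):
            dist = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            if dist <= CONTACT_DISTANCE_M:
                # each ping stands in for one five-second interval
                time_near[tuple(sorted((d1, d2)))] += PING_INTERVAL_S
    return [pair for pair, secs in time_near.items() if secs >= min_duration_s]
```

For example, two attendees standing about a metre apart for twelve consecutive pings (one minute) would register as a contact pair, while someone passing by for a few seconds would not.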

Inside the venue, they will also be asked to disinfect their hands with a fluorescent hand-sanitiser – designed not just to add a layer of protection but also to allow scientists to scour the venue with UV lights after the concerts to identify surfaces where a transmission of the virus through smear infection is most likely to take place.

Vapours from a fog machine will help visualise the possible spread of coronavirus via aerosols, which the scientists will try to predict via computer-generated models in advance of the event.

The €990,000 cost of the Restart-19 project will be shared between the federal states of Saxony and Saxony-Anhalt. The project’s organisers say the aim is to “identify a framework” for how larger cultural and sports events could be held “without posing a danger for the population” after 30 September….

To stop the Leipzig experiment from becoming the source of a new outbreak, signed-up volunteers will be sent a DIY test kit and have a swab at a doctor’s practice or laboratory 48 hours before the concert starts. Those who cannot show proof of a negative test at the door will be denied entry….(More)”.

Coronavirus: how the pandemic has exposed AI’s limitations


Kathy Peach at The Conversation: “It should have been artificial intelligence’s moment in the sun. With billions of dollars of investment in recent years, AI has been touted as a solution to every conceivable problem. So when the COVID-19 pandemic arrived, a multitude of AI models were immediately put to work.

Some hunted for new compounds that could be used to develop a vaccine, or attempted to improve diagnosis. Some tracked the evolution of the disease, or generated predictions for patient outcomes. Some modelled the number of cases expected given different policy choices, or tracked similarities and differences between regions.

The results, to date, have been largely disappointing. Very few of these projects have had any operational impact – hardly living up to the hype or the billions in investment. At the same time, the pandemic highlighted the fragility of many AI models. From entertainment recommendation systems to fraud detection and inventory management – the crisis has seen AI systems go awry as they struggled to adapt to sudden collective shifts in behaviour.

The unlikely hero

The unlikely hero emerging from the ashes of this pandemic is instead the crowd. Crowds of scientists around the world sharing data and insights faster than ever before. Crowds of local makers manufacturing PPE for hospitals failed by supply chains. Crowds of ordinary people organising through mutual aid groups to look after each other.

COVID-19 has reminded us of just how quickly humans can adapt existing knowledge, skills and behaviours to entirely new situations – something that highly specialised AI systems just can’t do. At least not yet….

In one of the experiments, researchers from the Istituto di Scienze e Tecnologie della Cognizione in Rome studied the use of an AI system designed to reduce social biases in collective decision-making. The AI, which held back information from the group members on what others thought early on, encouraged participants to spend more time evaluating the options by themselves.

The system succeeded in reducing the tendency of people to “follow the herd” by failing to hear diverse or minority views, or challenge assumptions – all of which are criticisms that have been levelled at the British government’s scientific advisory committees throughout the pandemic…(More)”.

A Letter on Justice and Open Debate


Letter in Harper’s Magazine signed by 153 prominent artists and intellectuals: “Our cultural institutions are facing a moment of trial. Powerful protests for racial and social justice are leading to overdue demands for police reform, along with wider calls for greater equality and inclusion across our society, not least in higher education, journalism, philanthropy, and the arts. But this needed reckoning has also intensified a new set of moral attitudes and political commitments that tend to weaken our norms of open debate and toleration of differences in favor of ideological conformity. As we applaud the first development, we also raise our voices against the second. The forces of illiberalism are gaining strength throughout the world and have a powerful ally in Donald Trump, who represents a real threat to democracy. But resistance must not be allowed to harden into its own brand of dogma or coercion—which right-wing demagogues are already exploiting. The democratic inclusion we want can be achieved only if we speak out against the intolerant climate that has set in on all sides.

The free exchange of information and ideas, the lifeblood of a liberal society, is daily becoming more constricted. While we have come to expect this on the radical right, censoriousness is also spreading more widely in our culture: an intolerance of opposing views, a vogue for public shaming and ostracism, and the tendency to dissolve complex policy issues in a blinding moral certainty. We uphold the value of robust and even caustic counter-speech from all quarters. But it is now all too common to hear calls for swift and severe retribution in response to perceived transgressions of speech and thought. More troubling still, institutional leaders, in a spirit of panicked damage control, are delivering hasty and disproportionate punishments instead of considered reforms. Editors are fired for running controversial pieces; books are withdrawn for alleged inauthenticity; journalists are barred from writing on certain topics; professors are investigated for quoting works of literature in class; a researcher is fired for circulating a peer-reviewed academic study; and the heads of organizations are ousted for what are sometimes just clumsy mistakes. Whatever the arguments around each particular incident, the result has been to steadily narrow the boundaries of what can be said without the threat of reprisal. We are already paying the price in greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement.

This stifling atmosphere will ultimately harm the most vital causes of our time. The restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation. The way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away. We refuse any false choice between justice and freedom, which cannot exist without each other. As writers we need a culture that leaves us room for experimentation, risk taking, and even mistakes. We need to preserve the possibility of good-faith disagreement without dire professional consequences. If we won’t defend the very thing on which our work depends, we shouldn’t expect the public or the state to defend it for us….(More)”.

How urban design can make or break protests


Peter Schwartzstein in Smithsonian Magazine: “If protesters could plan a perfect stage to voice their grievances, it might look a lot like Athens, Greece. Its broad, yet not overly long, central boulevards are almost tailor-made for parading. Its large parliament-facing square, Syntagma, forms a natural focal point for marchers. With a warren of narrow streets surrounding the center, including the rebellious district of Exarcheia, it’s often remarkably easy for demonstrators to steal away if the going gets rough.

Los Angeles, by contrast, is a disaster for protesters. It has no wholly recognizable center, few walkable distances, and little in the way of protest-friendly space. As far as longtime city activists are concerned, just amassing small crowds can be an achievement. “There’s really just no place to go, the city is structured in a way that you’re in a city but you’re not in a city,” says David Adler, general coordinator at the Progressive International, a new global political group. “While a protest is the coming together of a large group of people and that’s just counter to the idea of L.A.”

Among the complex medley of moving parts that guide protest movements, urban design might seem like a fairly peripheral concern. But try telling that to demonstrators from Houston to Beijing, two cities that have geographic characteristics that complicate public protest. Low urban density can thwart mass participation. Limited public space can deprive protesters of the visibility and hence the momentum they need to sustain themselves. On those occasions when proceedings turn messy or violent, alleyways, parks, and labyrinthine apartment buildings can mean the difference between detention and escape….(More)”.