Inaccurate Data, Half-Truths, Disinformation, and Mob Violence



Image credit: Kayla Velasquez/Unsplash.

Selected Readings by Fiona Cece, Uma Kalkar, and Stefaan Verhulst: “The mob attack on the US Congress was alarming and the result of various efforts to undermine trust in, and the legitimacy of, longstanding democratic processes and institutions. In particular, the use of inaccurate data, half-truths, and disinformation to spread hate and division is considered a key driver behind last week’s attack. Altering data to support conspiracy theories, or challenging and undermining the credibility of trusted data sources to allow alternative narratives to flourish, has consequences if left unchallenged — including the increased acceptance and use of violence both offline and online.

Everyone working on data and information needs to be aware of the implications of altering or misusing data (including election results) to support malicious objectives. The January 6th riot is unfortunately not a unique event, nor is it confined to the US. Below, we provide a curation of findings and readings that illustrate the global danger of inaccurate data, half-truths, and willful disinformation….(Readings)”.

On the Importance of Human-Centricity and Data


World Economic Forum: “…Human-centricity is a game changer in terms of unleashing the latent potential in data to empower societies. A human-centric shift in viewpoint is a simple, yet profound, change in thinking. It offers ground-breaking consequences for the availability, usability and quality of data for any person or organization, public or private. Far from being in conflict with the aims of business, human-centricity can drive new paradigms and ideas for innovation, capitalizing on the possibilities that better data availability can unleash for societies…

Organization-centric approaches turn data into resources for companies (business), control what organizations can and cannot do with data (regulation) and aim to enable the most efficient possible data utilization for data-controlling organizations (technology) — without fully taking into account the interests of people or society. In contrast, a human-centric approach looks at data as it relates to the people about whom that data exists — while still accounting for the interests of business. Human-centric business models (be they B2C, B2B, or B2G) take people’s and societies’ interests as a guiding principle; regulation is designed in a way that assumes and guarantees that data about people is being used for their and their communities’ benefit, and technology prioritizes humane values (including social and ecological ones)….(More)”.

Cognitive Science as a New People Science for the Future of Work


Brief by Frida Polli et al: “The notion of studying people in jobs as a science—in fields such as human resource management, people analytics, and industrial-organizational psychology—dates back to at least the early 20th century. In 1919, Yale psychologist Henry Charles Link wrote, “The application of science to the problem of employment is just beginning to receive serious attention,” at last providing an alternative to the “hire and fire” methods of 19th-century employers. A year later, prominent organizational theorists Ordway Tead and Henry C. Metcalf claimed, “The new focus in administration is to be the human element. The new center of attention and solicitude is the individual person, the worker.” The overall conclusion at the time was that various social and psychological factors governed differences in employee productivity and satisfaction….This brief proceeds in five sections:

● First, we review the limitations of traditional approaches to people science. In particular, we focus on four needs of the modern employer that are not satisfied by the status quo: job fit, soft skills, fairness, and flexibility.

● Second, we present the foundations of a new people science by explaining how advancements in fields like cognitive science and neuroscience can be used to understand the individual differences between humans.

● Third, we describe four best practices that should govern the application of the new people science theories to real-world employment contexts.

● Fourth, we present a case study of how one platform company has used the new people science to create hiring models for five high-growth roles.

● Finally, we explain how the type of insights presented in Section IV can be made actionable in the context of retraining employees for the future of work….(More)”.

Predictive Policing and Artificial Intelligence


Book edited by John McDaniel and Ken Pease: “This edited text draws together the insights of numerous worldwide eminent academics to evaluate the condition of predictive policing and artificial intelligence (AI) as interlocked policy areas. Predictive and AI technologies are growing in prominence at an unprecedented rate. Powerful digital crime mapping tools are being used to identify crime hotspots in real-time, as pattern-matching and search algorithms sort through huge police databases populated by growing volumes of data in an effort to identify people liable to experience (or commit) crime, places likely to host it, and variables associated with its solvability. Facial and vehicle recognition cameras are locating criminals as they move, while police services develop strategies informed by machine learning and other kinds of predictive analytics. Many of these innovations are features of modern policing in the UK, the US and Australia, among other jurisdictions.

AI promises to reduce unnecessary labour, speed up various forms of police work, encourage police forces to more efficiently apportion their resources, and enable police officers to prevent crime and protect people from a variety of future harms. However, the promises of predictive and AI technologies and innovations do not always match reality. They often have significant weaknesses, come at a considerable cost and require challenging trade-offs to be made. Focusing on the UK, the US and Australia, this book explores themes of choice architecture, decision-making, human rights, accountability and the rule of law, as well as future uses of AI and predictive technologies in various policing contexts. The text contributes to ongoing debates on the benefits and biases of predictive algorithms, big data sets, machine learning systems, and broader policing strategies and challenges.

Written in a clear and direct style, this book will appeal to students and scholars of policing, criminology, crime science, sociology, computer science, cognitive psychology and all those interested in the emergence of AI as a feature of contemporary policing….(More)”.

Rescuing Our Democracy by Rethinking New York Times Co. v. Sullivan


Paper by David Andrew Logan: “New York Times v. Sullivan (1964) is an iconic decision, foundational to modern First Amendment theory, and in a string of follow-on decisions the Court firmly grounded free speech theory and practice in the need to protect democratic discourse. To do this the Court provided broad and deep protections to the publishers of falsehoods. This article recognizes that New York Times and its progeny made sense in the “public square” of an earlier era, but the justices could never have foreseen the dramatic changes in technology and the media environment in the years since, nor could they have predicted that by making defamation cases virtually impossible to win they were harming, rather than helping, self-government. In part because of New York Times, the First Amendment has been weaponized, frustrating a basic requirement of a healthy democracy: the development of a set of broadly agreed-upon facts. Instead, we are subject to waves of falsehoods that swamp the ability of citizens to effectively self-govern. As a result, and despite its iconic status, New York Times needs to be reexamined and retooled to better serve our democracy….(More)”

Blockchain and Citizenship: Uneasy Bedfellows


Paper by Oskar Josef Gstrein and Dimitry Kochenov: “…Distributed Ledger Technology can be an effective tool for resource distribution. As individuals and organisations explore innovations which allow them to redefine the rules of access, possession and sharing, these developments also become important for the future of self-determination. Demonstrated through credit scoring and ‘social credit systems’, the identity of an individual is intertwined with resource access, possession and transferability. A key pre-requisite for participation is formal legal status, which translates to citizenship. However, many proponents of Distributed Ledger Technology focus predominantly on technological features and capabilities, which might enable the implementation of concepts such as decentralised governance, ‘self-sovereign identity’ management, and trust-less transactions based on ‘zero-knowledge proof’. Nevertheless, such narrow consideration overlooks existing legal and political realities. Considering the lessons learned from citizenship, it becomes questionable whether Blockchain as a player in the area of identity management will ultimately increase human dignity, or further entrench traditional patterns of discrimination and inequality….(More)”.

The Problem with Science: The Reproducibility Crisis and What to do About It


Book by R. Barker Bausell: “Recent events have vividly underscored the societal importance of science, yet the majority of the public are unaware that a large proportion of published scientific results are simply wrong. The Problem with Science is an exploration of the manifestations and causes of this scientific crisis, accompanied by a description of the very promising corrective initiatives largely developed over the past decade to stem the spate of irreproducible results that have come to characterize many of our sciences.

More importantly, Dr. R. Barker Bausell has designed it to provide guidance to practicing and aspiring scientists regarding how (a) to change the way in which science has come to be both conducted and reported in order to avoid producing false positive, irreproducible results in their own work and (b) to change those institutional practices (primarily but not exclusively involving the traditional journal publishing process and the academic reward system) that have unwittingly contributed to the present crisis. There is a need for change in the scientific culture itself: a culture which prioritizes conducting research correctly in order to get things right, rather than simply getting it published….(More)”.

The pandemic has pushed citizen panels online


Article by Claudia Chwalisz: “…Until 2020, most assemblies took place in person. We know what they require to produce useful recommendations and gain public trust: time (usually many days over many months), access to broad and varied information, facilitated discussion, and transparency. Successful assemblies take on a pressing public issue, secure politicians’ commitment to respond, have mechanisms to ensure independence, and provide facilities such as stipends and childcare, so all can participate. The diversity of people in the room is what delivers the magic of collective intelligence.

However, the pandemic has forced new approaches. Online discussions might be in real time or asynchronous; facilitators and participants might be identifiable or anonymous. My team at the OECD is exploring how virtual deliberation works best. We have noticed a shift: from text-based interactions to video; from an emphasis on openness to one on representativeness; and from individual to group deliberation.

Some argue that online deliberation is less expensive than in-person processes, but the costs are similar when designed to be as democratic as possible. The new wave pays much more attention to inclusivity. For many online citizens’ assemblies this year (for example, in Belgium, Canada and parts of the United Kingdom), participants without equipment were given computers or smartphones, along with training and support to use them. A digital mediator is now essential for any plans to conduct online deliberation inclusively.

Experiments have also started to transcend national borders. Last October, the German Bertelsmann Stiftung, a private foundation for political reform, and the European Commission ran a Citizens’ Dialogue with 100 randomly selected citizens from Denmark, Germany, Ireland, Italy and Lithuania. They spent three days discussing Europe’s democratic, digital and green future. The Global Citizens’ Assembly on Genome Editing will take place in 2021–22, as will the Global Citizens’ Assembly for the United Nations Climate Change Conference.

However, virtual meetings do not replace in-person interactions. Practitioners adapting assemblies to the virtual world warn that online processes could push people into more linear and binary thinking through voting tools, rather than seeking a nuanced understanding of other people’s reasoning and values….(More)”.

Rising to the Challenge: how to get the best value from using prizes to drive innovation for development


Report by Cheryl Brown, Catherine Gould, Clare Stott: “An innovation inducement prize enables funders to pursue development goals without them having to know in advance which approaches or participants are most likely to succeed. Innovation prizes also often directly engage with the intended beneficiaries or those connected with them, in solving the problems.

At a time when development spending is under increasing pressure to show value for money (VFM), innovation prizes are considered an alternative to mainstream funding options. While costs are likely to accrue through prize design and management, no cash payments are made until the prize is successfully awarded. The funder may anticipate obtaining more results than those directly paid for through the prize award.

The purpose of this report is to answer two questions: do innovation prizes work for development, and if so, when do they offer value over other forms of funding?

To date, few evaluations have been published that would help funders answer these questions for themselves. DFID commissioned the Ideas to Impact programme, which was delivered by an IMC Worldwide-led consortium and evaluated by Itad, to fill this gap by testing a range of innovation prizes targeted at different development issues. This report synthesises the findings from the evaluations and follow-up reviews of six of these prizes….(More)”.

An Open Data Team Experiments with a New Way to Tell City Stories


Article by Sean Finnan: “Can you see me?” says Mark Linnane, over Zoom, as he walks around a plastic structure on the floor of an office at Maynooth University. “That gives you some sense of the size of it. It’s 3.5 metres by 2.”

Linnane trails his laptop’s webcam over the surface of the off-white 3D model, giving a bird’s-eye view of tens of thousands of tiny buildings, the trails of roads and the clear pathway of the Liffey.

This replica of the heart of the city from Phoenix Park to Dublin Port was created to scale by the university’s Building City Dashboards team, using data from Ordnance Survey Ireland.

In the five years since they started to grapple with the question of how to present data about the city in an engaging and accessible way, the team has experimented with virtual reality and augmented reality — and most recently, with this new form of mapping, which blends the Lego-like miniature of Dublin’s centre with changeable data projected onto it.

This could really come into its own as a public exhibit if they start to tell meaningful data-driven and empirical stories, says Linnane, a digital exhibition developer at Maynooth University.

Stories that are “relevant in terms of the everyday daily lives of people who will be coming to see it”, he says.

Layers of Meaning

Getting the projector that throws the visualisations onto the model to work right was Linnane’s job, he says.

He had to mesh the Ordnance Survey data with other datasets that showed building heights, for example. “Every single building down to the sheds in someone’s garden has a unique identifier,” says Linnane.

Projectors are built to project onto flat surfaces, not 3D models, so that had to be finessed, too, he says. “Every step on the way was a new development. There wasn’t really a process there before.”
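The kind of join Linnane describes, matching each building footprint to its height via a shared unique identifier, can be sketched in a few lines. This is only an illustrative sketch with made-up field names and values; the real Ordnance Survey Ireland data uses its own schema.

```python
import pandas as pd

# Hypothetical footprint records, keyed by a per-building identifier.
footprints = pd.DataFrame({
    "building_id": ["B001", "B002", "B003"],
    "footprint_m2": [120.5, 64.0, 15.2],
})

# Hypothetical height records from a second dataset, same identifiers.
heights = pd.DataFrame({
    "building_id": ["B001", "B002", "B003"],
    "height_m": [12.0, 8.5, 2.4],
})

# Join the two datasets on the shared identifier so every footprint
# gains a height, ready to be extruded into a 3D city model.
model_input = footprints.merge(heights, on="building_id", how="left")
print(model_input)
```

A left join keeps every footprint even when a height record is missing, which matters when one dataset covers small outbuildings the other omits.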

The printed 3D model shows 7km by 4km of Dublin and 122,355 structures, says Linnane. That includes bigger buildings but also small outbuildings, railway platforms, public toilets and glasshouses – all mocked up and serving as a canvas for a kaleidoscope of data.

“We’re just projecting data on to it and seeing what’s going on with that,” says Rob Kitchin, principal investigator at Maynooth University’s Programmable City project….(More)”

Image of model courtesy of Mark Linnane.