The Problem With Facts


Tim Harford: “…In 1995, Robert Proctor, a historian at Stanford University who has studied the tobacco case closely, coined the word “agnotology”. This is the study of how ignorance is deliberately produced; the entire field was started by Proctor’s observation of the tobacco industry. The facts about smoking — indisputable facts, from unquestionable sources — did not carry the day. The indisputable facts were disputed. The unquestionable sources were questioned. Facts, it turns out, are important, but facts are not enough to win this kind of argument.

Agnotology has never been more important. “We live in a golden age of ignorance,” says Proctor today. “And Trump and Brexit are part of that.”

In the UK’s EU referendum, the Leave side pushed the false claim that the UK sent £350m a week to the EU. It is hard to think of a previous example in modern western politics of a campaign leading with a transparent untruth, maintaining it when refuted by independent experts, and going on to triumph anyway. That performance was soon to be eclipsed by Donald Trump, who offered wave upon shameless wave of demonstrable falsehood, only to be rewarded with the presidency. The Oxford Dictionaries declared “post-truth” the word of 2016. Facts just didn’t seem to matter any more.

The instinctive reaction from those of us who still care about the truth — journalists, academics and many ordinary citizens — has been to double down on the facts. Fact-checking organisations, such as Full Fact in the UK and PolitiFact in the US, evaluate prominent claims by politicians and journalists. They judge what’s true rather than faithfully reporting both sides as a traditional journalist would. I should confess a personal bias: I have served as a fact checker myself on the BBC radio programme More or Less, and I often rely on fact-checking websites. Public, transparent fact checking has become such a feature of today’s political reporting that it’s easy to forget it’s barely a decade old.

Mainstream journalists, too, are starting to embrace the idea that lies or errors should be prominently identified. Consider a story on the NPR website about Donald Trump’s speech to the CIA in January: “He falsely denied that he had ever criticised the agency, falsely inflated the crowd size at his inauguration on Friday…” It’s a bracing departure from the norms of American journalism, but then President Trump has been a bracing departure from the norms of American politics.

Facebook has also drafted in the fact checkers, announcing a crackdown on the “fake news” stories that had become prominent on the network after the election. Facebook now allows users to report hoaxes. The site will send questionable headlines to independent fact checkers, flag discredited stories as “disputed”, and perhaps downgrade them in the algorithm that decides what each user sees when visiting the site.

We need some agreement about facts or the situation is hopeless. And yet: will this sudden focus on facts actually lead to a more informed electorate, better decisions, a renewed respect for the truth? The history of tobacco suggests not. The link between cigarettes and cancer was supported by the world’s leading medical scientists and, in 1964, the US surgeon general himself. The story was covered by well-trained journalists committed to the values of objectivity. Yet the tobacco lobbyists ran rings round them.

In the 1950s and 1960s, journalists had an excuse for their stumbles: the tobacco industry’s tactics were clever, complex and new. First, the industry appeared to engage, promising high-quality research into the issue. The public were assured that the best people were on the case. The second stage was to complicate the question and sow doubt: lung cancer might have any number of causes, after all. And wasn’t lung cancer, not cigarettes, what really mattered? Stage three was to undermine serious research and expertise. Autopsy reports would be dismissed as anecdotal, epidemiological work as merely statistical, and animal studies as irrelevant. Finally came normalisation: the industry would point out that the tobacco-cancer story was stale news. Couldn’t journalists find something new and interesting to say?

Such tactics are now well documented — and researchers have carefully examined the psychological tendencies they exploited. So we should be able to spot their re-emergence on the political battlefield.

“It’s as if the president’s team were using the tobacco industry’s playbook,” says Jon Christensen, a journalist turned professor at the University of California, Los Angeles, who wrote a notable study in 2008 of the way the tobacco industry tugged on the strings of journalistic tradition.

One infamous internal memo from the Brown & Williamson tobacco company, typed up in the summer of 1969, sets out the thinking very clearly: “Doubt is our product.” Why? Because doubt “is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also the means of establishing a controversy.” Big Tobacco’s mantra: keep the controversy alive.

Doubt is usually not hard to produce, and facts alone aren’t enough to dispel it. We should have learnt this lesson already; now we’re going to have to learn it all over again.

Tempting as it is to fight lies with facts, there are three problems with that strategy….(More)”

iGod


Novel by Willemijn Dicke and Dirk Helbing: “iGod is a science fiction novel with heroes, love, defeat and hope. But it is much more than that. This book aims to explore how societies may develop, given the technologies that we see at present. As Dirk Helbing describes it in his introduction:

We have come to the conclusion that neither a scientific study nor an investigative report would allow one to talk about certain things that, we believe, need to be thought and talked about. So, a science fiction story appeared to be the right approach. It seems the perfect way to think “what if” scenarios through. It is not the first time that this avenue has been taken. George Orwell’s “1984” and “Animal Farm” come to mind, or Dave Eggers’ “The Circle”. The film ‘The Matrix’ and the Netflix series ‘Black Mirror’ are good examples too.

“iGod” outlines how life could be in a couple of years from now, certainly in our lifetime. At some places, this story about our future society seems far-fetched. For example, in “iGod”, all citizens have a Social Citizen Score. This score is established based on their buying habits, their communication in social media and the social contacts they maintain. It is obtained by mass surveillance and has a major impact on everyone’s life. It determines whether you are entitled to get a loan, what jobs you are offered, and even how long you will receive medical care.

The book is set in the near future in Amsterdam, the Netherlands. Lex is an unemployed biologist. One day he is contacted by a computer which gradually reveals the machinery behind the reality we see. It is a bleak world. Together with his girlfriend Diana and Seldon, a professor at Amsterdam Tech, he starts the quest to regain freedom….(More) (Individual chapters)”

Google DeepMind and healthcare in an age of algorithms


Julia Powles and Hal Hodson in Health and Technology: “Data-driven tools and techniques, particularly machine learning methods that underpin artificial intelligence, offer promise in improving healthcare systems and services. One of the companies aspiring to pioneer these advances is DeepMind Technologies Limited, a wholly-owned subsidiary of the Google conglomerate, Alphabet Inc. In 2016, DeepMind announced its first major health project: a collaboration with the Royal Free London NHS Foundation Trust, to assist in the management of acute kidney injury. Initially received with great enthusiasm, the collaboration has suffered from a lack of clarity and openness, with issues of privacy and power emerging as potent challenges as the project has unfolded. Taking the DeepMind-Royal Free case study as its pivot, this article draws a number of lessons on the transfer of population-derived datasets to large private prospectors, identifying critical questions for policy-makers, industry and individuals as healthcare moves into an algorithmic age….(More)”

Digital Democracy in Belgium and the Netherlands. A Socio-Legal Analysis of Citizenlab.be and Consultatie.nl


Chapter by Koen Van Aeken in: Prins, C. et al. (eds.) Digital Democracy in a Globalized World (Edward Elgar, 2017), Forthcoming: “The research question is how technologies characterized by ubiquitous Web 2.0 interactivity may contribute to democracy. Following a case study design, two applications were evaluated: the Belgian CitizenLab, a mobile, social and local private application to support public decision making in cities, and the Dutch governmental website Internetconsultatie. Available data suggest that the Dutch consultation platform is mainly visited by the ‘usual suspects’ and lacks participatory functionalities. In contrast, CitizenLab explicitly aims at policy co-creation through broad participation. Its novelty, however, prevents making sound empirical statements.

A comprehensive conceptualization precedes the case studies. To avoid instrumentalist reduction, the social setting of the technologies is reconstructed. Since its constituents, embedding and expectations – initially represented as the nation state and representative democracy – are increasingly challenged, their transformations are consequently discussed. The new embedding emerges as a governance constellation; new expectations concern the participatory dimension of politics. Future assessments of technologies may benefit from this conceptualization….(More)”

Did artificial intelligence deny you credit?


 in The Conversation: “People who apply for a loan from a bank or credit card company, and are turned down, are owed an explanation of why that happened. It’s a good idea – because it can help teach people how to repair their damaged credit – and it’s a federal law, the Equal Credit Opportunity Act. Getting an answer wasn’t much of a problem in years past, when humans made those decisions. But today, as artificial intelligence systems increasingly assist or replace people making credit decisions, getting those explanations has become much more difficult.

Traditionally, a loan officer who rejected an application could tell a would-be borrower there was a problem with their income level, or employment history, or whatever the issue was. But computerized systems that use complex machine learning models are difficult to explain, even for experts.

Consumer credit decisions are just one way this problem arises. Similar concerns exist in health care, online marketing and even criminal justice. My own interest in this area began when a research group I was part of discovered gender bias in how online ads were targeted, but could not explain why it happened.

All those industries, and many others, that use machine learning to analyze processes and make decisions have a little over a year to get a lot better at explaining how their systems work. In May 2018, the new European Union General Data Protection Regulation takes effect, including a section giving people a right to get an explanation for automated decisions that affect their lives. What shape should these explanations take, and can we actually provide them?

Identifying key reasons

One way to describe why an automated decision came out the way it did is to identify the factors that were most influential in the decision. How much of a credit denial decision was because the applicant didn’t make enough money, or because he had failed to repay loans in the past?

My research group at Carnegie Mellon University, including PhD student Shayak Sen and then-postdoc Yair Zick, created a way to measure the relative influence of each factor. We call it Quantitative Input Influence.
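The intervention idea behind Quantitative Input Influence can be sketched in a few lines: hold everything else about an applicant fixed, randomly resample one input, and measure how often the decision flips. The toy credit model, feature names and applicant data below are invented purely for illustration — QII treats the model as a black box, and the actual estimator is considerably more sophisticated:

```python
import random

# Hypothetical hand-rolled credit model, invented for illustration only;
# QII is model-agnostic, so any black-box classifier would do here.
def approve(applicant):
    score = 0
    score += 2 if applicant["income"] >= 40_000 else 0
    score += 3 if applicant["past_defaults"] == 0 else -2
    score += 1 if applicant["years_employed"] >= 2 else 0
    return score >= 4

def influence(feature, applicants, trials=1000, seed=0):
    """Approximate a feature's influence as the probability that randomly
    resampling that one feature (an 'intervention') flips the decision."""
    rng = random.Random(seed)
    pool = [a[feature] for a in applicants]
    flips = 0
    for _ in range(trials):
        a = rng.choice(applicants)
        perturbed = dict(a)
        perturbed[feature] = rng.choice(pool)  # intervene on one input
        flips += approve(a) != approve(perturbed)
    return flips / trials

applicants = [
    {"income": 50_000, "past_defaults": 0, "years_employed": 3},
    {"income": 30_000, "past_defaults": 1, "years_employed": 1},
    {"income": 50_000, "past_defaults": 1, "years_employed": 3},
    {"income": 30_000, "past_defaults": 0, "years_employed": 3},
]

# Rank features by how often intervening on them changes the outcome.
ranking = sorted(
    ("income", "past_defaults", "years_employed"),
    key=lambda f: influence(f, applicants),
    reverse=True,
)
```

On this made-up data the repayment history dominates the ranking, which is exactly the kind of answer a rejected applicant could be given.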

In addition to giving better understanding of an individual decision, the measurement can also shed light on a group of decisions: Did an algorithm deny credit primarily because of financial concerns, such as how much an applicant already owes on other debts? Or was the applicant’s ZIP code more important – suggesting more basic demographics such as race might have come into play?…(More)”

Fighting Corruption in Health Care? There’s an App for That


Akjibek Beishebaeva at Voices (OSF): “As an industry that relies heavily on approvals from government officials, the pharmaceutical field in places like Ukraine and Kyrgyzstan—which lack strong mechanisms for public oversight—is particularly susceptible to corruption.

The problem in those countries is exacerbated by the absence of any reliable system to monitor market prices for drugs. For example, a hospital manager bribed by a pharmaceutical representative could agree to procure a drug at a price 10 times higher than at a neighboring hospital. In addition, those medicines procured by the state and meant to be dispensed freely to patients often appear for sale at hospital-based pharmacies instead.

These aren’t victimless crimes. The most needy patients are often the first to suffer when funds are diverted away from lifesaving treatments and medicines.

To tackle this issue, last year the Soros Foundation–Kyrgyzstan and the International Renaissance Foundation jointly conducted the Health Data Hackathon in the Yssyk-Kul region of Kyrgyzstan. Two teams from Ukraine and three teams from Kyrgyzstan—consisting of coders, journalists, and activists—took part. Their goal was to find innovative solutions to address corruption in public procurements and access to health services for vulnerable populations.

Over the two-and-a-half-day effort, one of the Ukrainian teams developed a prototype for a software application to improve the e-tendering platform for all public procurement in Ukraine—ProZorro.

ProZorro itself revolutionized the tender process when it first launched in 2015, combining a centralized database with online marketplaces and opening the data to the public. Journalists, activists, and patients today can log in to the system and scrutinize tenders approved by the government. The transparency provided by the system has already shown savings of more than a billion UAH (US$37 million). However, the database is huge and can be tricky to navigate without training.

The application developed at the hackathon makes it even easier to monitor the purchase prices of medicines in Ukraine. Specifically, it will allow users to automatically and instantly compare prices for the same products—a process which previously took many days of manual effort.

The application also offers a more intuitive interface and improved search functionality that will help further reduce corruption and save money—savings that can be redirected towards treatments for people living with HIV, cancer, and hepatitis C. The team is now testing the software and working with the government to introduce it early this year.
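The core of such a price-comparison tool can be sketched in a few lines: group purchases by product and flag any buyer paying far above the median unit price. The records, field names and threshold below are illustrative inventions, not the actual ProZorro schema:

```python
from statistics import median

# Simplified tender records; field names and prices are made up for
# illustration and do not reflect the real ProZorro data model.
tenders = [
    {"drug": "paracetamol 500mg", "buyer": "Hospital A", "unit_price": 0.90},
    {"drug": "paracetamol 500mg", "buyer": "Hospital B", "unit_price": 1.05},
    {"drug": "paracetamol 500mg", "buyer": "Hospital C", "unit_price": 9.80},
]

def flag_overpriced(records, threshold=3.0):
    """Group purchases by product and flag any buyer paying more than
    `threshold` times the median unit price for that product."""
    by_drug = {}
    for r in records:
        by_drug.setdefault(r["drug"], []).append(r)
    flags = []
    for drug, rows in by_drug.items():
        m = median(r["unit_price"] for r in rows)
        for r in rows:
            if r["unit_price"] > threshold * m:
                flags.append((r["buyer"], drug, r["unit_price"], m))
    return flags
```

Comparing against the median rather than the mean keeps one inflated contract from masking itself by dragging the benchmark up.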

Another team came up with the idea to let patients monitor supplies of medicine at facilities in real time. If a hospital representative says that a patient needs to buy drugs that should be readily available, for example, the patient can check online and hold the hospital accountable if the medicines are meant to be provided for free. The tool, called WikiLiky, has already been implemented in the Sumy region of Ukraine.

Likewise, one of the Kyrgyz teams looked at price monitoring in their own country, focusing on the inefficient and mistake-prone acquisition process. For instance, the name of one drug might be misspelled in several different ways, making it difficult to track prices accurately. The team redesigned the functionality of the government e-procurement portal called Codifier, creating uniformity across the system of names, dosages, and other medical specifications….(More)”
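One way such name uniformity might be enforced is fuzzy string matching against a canonical drug list — a hedged sketch, not how the Codifier portal actually works; the names and cutoff below are illustrative:

```python
import difflib

# Illustrative canonical list; a real deployment would load the national
# drug registry rather than hard-coding three names.
CANONICAL = ["amoxicillin", "ibuprofen", "omeprazole"]

def normalize(name, cutoff=0.8):
    """Map a possibly misspelled drug name to its canonical spelling,
    or return None when nothing is close enough to trust."""
    candidate = name.lower().strip()
    matches = difflib.get_close_matches(candidate, CANONICAL, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

With every variant spelling mapped to one canonical name, price tracking across procurements becomes a straightforward group-by instead of a manual hunt.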

Big data helps Belfort, France, allocate buses on routes according to demand


 in Digital Trends: “As modern cities smarten up, the priority for many will be transportation. Belfort, a mid-sized French industrial city of 50,000, serves as proof of concept for improved urban transportation that does not require the time and expense of covering the city with sensors and cameras.

Working with Tata Consultancy Services (TCS) and GFI Informatique, the Board of Public Transportation of Belfort overhauled bus service management of the city’s 100-plus buses. The project entailed a combination of ID cards, GPS-equipped card readers on buses, and big data analysis. The collected data was used to measure bus speed from stop to stop, passenger flow to observe when and where people got on and off, and bus route density. From start to finish, the proof of concept project took four weeks.

Using the TCS Intelligent Urban Exchange system, operations managers were able to detect when and where about 20 percent of all bus passengers boarded and got off on each city bus route. Utilizing big data and artificial intelligence, the city’s urban planners were able to make cost-effective adjustments, including allocating additional buses on routes and at times of greater passenger demand. They were also able to cut back on buses for minimally used routes and stops. In addition, the system provided feedback on the effect of city construction projects on bus service….
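The heart of this kind of analysis is simple aggregation: count boardings per route and hour from the card-reader taps, then rank the results so planners see where extra buses pay off. The log format below is an assumption for illustration, not the actual TCS data feed:

```python
from collections import Counter

# Illustrative card-reader log: one (route, hour_of_day) tuple per
# boarding tap; the real GPS-equipped readers record far richer data.
taps = [
    ("Route 3", 8), ("Route 3", 8), ("Route 3", 8),
    ("Route 3", 17), ("Route 7", 8), ("Route 7", 14),
]

def peak_demand(log):
    """Count boardings per (route, hour) and return them sorted from
    busiest to quietest, so planners can reallocate buses accordingly."""
    return Counter(log).most_common()
```

The same counts read from the bottom of the list identify the minimally used routes and stops that were candidates for cutbacks.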

Going forward, continued data analysis will help the city budget wisely for infrastructure changes and new equipment purchases. The goal is to put the money where the needs are greatest rather than just spending and then waiting to see if usage justified the expense. The push for smarter cities has to be not just about improved services, but also smart resource allocation — in the Belfort project, the use of big data showed how to do both….(More)”

Just Change: How to Collaborate for Lasting Impact


Book by Tynesia Boyea-Robinson: “… is a collection of stories and case studies to evolve the way we think about and approach systemic causes of inequities facing low-income communities, particularly communities of color. The book successfully addresses:

  • Cross-sector collaboration as a requirement for sustainable social change;
  • Moving away from siloed programs with single-focused solutions to building systems and infrastructures that improve inequities at the population-level; and
  • Reframing how to think about and measure success in order to achieve scale and impact.

Read about leaders across the country who have successfully created sustainable, long-lasting solutions to address key root causes of inequities in their communities:

  • How the Detroit Corridor Initiative, Cincinnati, and Minneapolis-St. Paul used shared results for successful cross-sector partnerships
  • How Nexus Community Partners in Minneapolis changed how they collaborate with the community they’re serving towards a more authentic community engagement
  • How Best Start for Kids in Seattle/King County effectively used cross-sector partnerships
  • How Camden City in New Jersey partnered with Campbell’s Soup for better health outcomes

Discover tested tools and strategies to implement change in your own communities, such as:

  • How the Model Behavior, Align Resources, Catalyze Change (MAC) framework harnesses intrinsic motivation for behavior change
  • How the Data Inventory helps you figure out what data needs to be collected and how to get it
  • Four components of creating effective shared results that will drive your cross-sector partnership towards success…(More)”.

Migration tracking is a mess


Huub Dijstelbloem in Nature: “As debates over migration, refugees and freedom of movement intensify, technologies are increasingly monitoring the movements of people. Biometric passports and databases containing iris scans or fingerprints are being used to check a person’s right to travel through or stay within a territory. India, for example, is establishing biometric identification for its 1.3 billion citizens.

But technologies are spreading beyond borders. Security policies and humanitarian considerations have broadened the landscape. Drones and satellite images inform policies and direct aid to refugees. For instance, the United Nations Institute for Training and Research (UNITAR) maps refugee camps in Jordan and elsewhere with its Operational Satellite Applications Programme (UNOSAT; see www.unitar.org/unosat/map/1928).

Three areas are in need of research, in my view: the difficulties of joining up disparate monitoring systems; privacy issues and concerns over the inviolability of the human body; and ‘counter-surveillance’ deployed by non-state actors to highlight emergencies or contest claims that governments make.

Ideally, state monitoring of human mobility would be bound by ethical principles, solid legislation, periodical evaluations and the checks and balances of experts and political and public debates. In reality, it is ad hoc. Responses are arbitrary, fuelled by the crisis management of governments that have failed to anticipate global and regional migration patterns. Too often, this results in what the late sociologist Ulrich Beck called organized irresponsibility: situations of inadequacy in which it is hard to blame a single actor.

Non-governmental organizations, activists and migrant groups are using technologies to register incidents and to blame and shame states. For example, the Forensic Architecture research agency at Goldsmiths, University of London, has used satellite imagery and other evidence to reconstruct the journey of a boat that left Tripoli on 27 March 2011 with 72 passengers. A fortnight later, it returned to the Libyan coast with just 9 survivors. Although the boat had been spotted by several aircraft and vessels, no rescue operation had been mounted (go.nature.com/2mbwvxi). Whether the states involved can be held accountable is still being considered.

In the end, technologies to monitor mobility are political tools. Their aims, design, use, costs and consequences should be developed and evaluated accordingly….(More)”.

UK’s Digital Strategy


Executive Summary: “This government’s Plan for Britain is a plan to build a stronger, fairer country that works for everyone, not just the privileged few. …Our digital strategy now develops this further, applying the principles outlined in the Industrial Strategy green paper to the digital economy. The UK has a proud history of digital innovation: from the earliest days of computing to the development of the World Wide Web, the UK has been a cradle for inventions which have changed the world. And from Ada Lovelace – widely recognised as the first computer programmer – to the pioneers of today’s revolution in artificial intelligence, the UK has always been at the forefront of invention. …

Maintaining the UK government as a world leader in serving its citizens online

From personalised services in health, to safer care for the elderly at home, to tailored learning in education and access to culture – digital tools, techniques and technologies give us more opportunities than ever before to improve the vital public services on which we all rely.

The UK is already a world leader in digital government, but we want to go further and faster. The new Government Transformation Strategy published on 9 February 2017 sets out our intention to serve the citizens and businesses of the UK with a better, more coherent experience when using government services online – one that meets the raised expectations set by the many other digital services and tools they use every day. So, we will continue to develop single cross-government platform services, including by working towards 25 million GOV.UK Verify users by 2020 and adopting new services onto the government’s GOV.UK Pay and GOV.UK Notify platforms.

We will build on the ‘Government as a Platform’ concept, ensuring we make greater reuse of platforms and components across government. We will also continue to move towards common technology, ensuring that where it is right we are consuming commodity hardware or cloud-based software instead of building something that is needlessly government specific.

We will also continue to work, across government and the public sector, to harness the potential of digital to radically improve the efficiency of our public services – enabling us to provide a better service to citizens and service users at a lower cost. In education, for example, we will address the barriers faced by schools in regions not connected to appropriate digital infrastructure and we will invest in the Network of Teaching Excellence in Computer Science to help teachers and school leaders build their knowledge and understanding of technology. In transport, we will make our infrastructure smarter, more accessible and more convenient for passengers. At Autumn Statement 2016 we announced that the National Productivity Investment Fund would allocate £450 million from 2018-19 to 2020-21 to trial digital signalling technology on the rail network. And in policing, we will enable police officers to use biometric applications to match fingerprint and DNA from scenes of crime and return results including records and alerts to officers over mobile devices at the crime scene.

Read more about digital government.

Unlocking the power of data in the UK economy and improving public confidence in its use

As part of creating the conditions for sustainable growth, we will take the actions needed to make the UK a world-leading data-driven economy, where data fuels economic and social opportunities for everyone, and where people can trust that their data is being used appropriately.

Data is a global commodity and we need to ensure that our businesses can continue to compete and communicate effectively around the world. To maintain our position at the forefront of the data revolution, we will implement the General Data Protection Regulation by May 2018. This will ensure a shared and higher standard of protection for consumers and their data.

Read more about data….(More)”