Examining the Mistrust of Science


Proceedings of a National Academies Workshop: “The Government-University-Industry Research Roundtable held a meeting on February 28 and March 1, 2017, to explore trends in public opinion of science, examine potential sources of mistrust, and consider ways that cross-sector collaboration between government, universities, and industry may improve public trust in science and scientific institutions in the future. The keynote address on February 28 was given by Shawn Otto, co-founder and producer of the U.S. Presidential Science Debates and author of The War on Science.

“There seems to be an erosion of the standing and understanding of science and engineering among the public,” Otto said. “People seem much more inclined to reject facts and evidence today than in the recent past. Why could that be?” Otto began exploring that question after the candidates in the 2008 presidential election declined an invitation to debate science-driven policy issues and instead chose to debate faith and values.

“Wherever the people are well-informed, they can be trusted with their own government,” wrote Thomas Jefferson. Now, some 240 years later, science is so complex that it is difficult even for scientists and engineers to understand the science outside their particular fields. Otto argued:

“The question is, are people still well-enough informed to be trusted with their own government? Of the 535 members of Congress, only 11—less than 2 percent—have a professional background in science or engineering. By contrast, 218—41 percent—are lawyers. And lawyers approach a problem in a fundamentally different way than a scientist or engineer. An attorney will research both sides of a question, but only so that he or she can argue against the position that they do not support. A scientist will approach the question differently, not starting with a foregone conclusion and arguing towards it, but examining both sides of the evidence and trying to make a fair assessment.”

According to Otto, anti-science positions are now acceptable in public discourse, in Congress, state legislatures, and city councils, in popular culture, and in presidential politics. Discounting factually incorrect statements does not necessarily reshape public opinion in the way some expect it to. What is driving this change? “Science is never partisan, but science is always political,” said Otto. “Science takes nothing on faith; it says, ‘show me the evidence and I’ll judge for myself.’ But the discoveries that science makes either confirm or challenge somebody’s cherished beliefs or vested economic or ideological interests. Science creates knowledge—knowledge is power, and that power is political.”…(More)”.

Big Data: A Twenty-First Century Arms Race


Report by Atlantic Council and Thomson Reuters: “We are living in a world awash in data. Accelerated interconnectivity, driven by the proliferation of internet-connected devices, has led to an explosion of data—big data. A race is now underway to develop new technologies and implement innovative methods that can handle the volume, variety, velocity, and veracity of big data and apply it smartly to provide decisive advantage and help solve major challenges facing companies and governments.

For policy makers in government, big data and associated technologies like machine learning and artificial intelligence have the potential to drastically improve their decision-making capabilities. How governments use big data may be a key factor in improved economic performance and national security. This publication looks at how big data can maximize the efficiency and effectiveness of government and business, while minimizing modern risks. Five authors explore big data across three cross-cutting issues: security, finance, and law.

Chapter 1, “The Conflict Between Protecting Privacy and Securing Nations,” Els de Busser
Chapter 2, “Big Data: Exposing the Risks from Within,” Erica Briscoe
Chapter 3, “Big Data: The Latest Tool in Fighting Crime,” Benjamin Dean, Fellow
Chapter 4, “Big Data: Tackling Illicit Financial Flows,” Tatiana Tropina
Chapter 5, “Big Data: Mitigating Financial Crime Risk,” Miren Aparicio….Read the Publication (PDF)

What Bhutanese hazelnuts tell us about using data for good


Bruno Sánchez-Andrade Nuño at WEForum: “How are we going to close the $2.5 trillion/year finance gap to achieve the Sustainable Development Goals (SDGs)? Whose money? What business model? How to scale it that much? If you read the recent development economics literature, or Jim Kim’s new financing approach of the World Bank, you might hear the benefits of “blended finance” or “triple bottom lines.” I want to tell you instead about a real case that makes a dent. I want to tell you about Sonam.

Sonam is a 60-year-old farmer in rural Bhutan. His children left for the capital, Thimphu, like many are doing nowadays. Four years ago, he decided to plant 2 acres of hazelnuts on an unused rocky piece of his land. Hazelnut saplings, training, and regular supervision all come from “Mountain Hazelnuts”, Bhutan’s only 100% foreign-invested company. The company funds the cost of the trees and helps him manage his orchard. In return, when the nuts come, he will sell his harvest to the company above a guaranteed floor price, which will double his income at a time when he will be too old to work in his rice field.

You could find similar impact stories for the roughly 10,000 farmers participating in this operation across the country. The farmers are carefully selected to ensure productivity and to maximize social and environmental benefits, such as supporting vulnerable households or reducing land erosion.

But Sonam also gets a visit from Kinzang every month. This is Kinzang’s first job. Otherwise, he would have moved to the city in hopes of finding a low-paying job, but more likely would have joined the many unemployed youth from the countryside. Kinzang carefully records data on his smartphone, talks to Sonam, and digitally transmits the data back to the company HQ. There, if a problem is recorded with irrigation or pests, or if there is any data anomaly, a team of experts (locally trained agronomists) will visit his orchard to figure out a solution.

The whole system of support, monitoring, and optimization lives on a carefully crafted data platform that feeds information to and from the farmers, the monitors, the agronomist experts, and local government authorities. It ensures that all 10 million trees are healthy and productive, minimizes extra costs, and tests and tracks the effectiveness of new treatments….
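The anomaly-flagging step described above can be sketched in a few lines. This is an illustrative example only — the function name, the trailing-window rule, and the threshold are assumptions, not Mountain Hazelnuts’ actual system:

```python
import statistics

def flag_anomalies(readings, window=6, threshold=2.5):
    """Flag readings that deviate sharply (in z-score terms) from the
    trailing window of recent values, e.g. monthly soil-moisture numbers
    recorded for one orchard."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.mean(recent)
        spread = statistics.stdev(recent) or 1e-9  # avoid divide-by-zero
        if abs(readings[i] - mean) / spread > threshold:
            flagged.append(i)
    return flagged
```

In a system like the one described, a flagged index would be routed to the agronomist team for a field visit rather than acted on automatically.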

This is also a story which demonstrates how “Data is the new oil” is not the right approach. If Data is the new oil, you extract value from the data, without much regard to feeding back value to the source of the data. However, in this system, “Data is the new soil.” Data creates a higher ground in which value flows back and forth. It lifts the source of the data -the farmers- into new income generation, it enables optimized operations; and it also helps the whole country: Much of the data (such as road quality used by the monitors) is made open for the benefit of the Bhutanese people, without contradiction or friction with the business model….(More)”.

A Road-Map To Transform The Secure And Accessible Use Of Data For High Impact Program Management, Policy Development, And Scholarship


Preface and Roadmap by Andrew Reamer and Julia Lane: “Throughout the United States, there is broad emerging support for significantly enhancing the nation’s capacity for evidence-based policymaking. This support is shared across the public and private sectors and all levels of geography. In recent years, efforts to enable evidence-based analysis have been authorized by the U.S. Congress and funded by state and local governments and philanthropic foundations.

The potential exists for substantial change. There has been dramatic growth in technological capabilities to organize, link, and analyze massive volumes of data from multiple, disparate sources. A major resource is administrative data, which offer both advantages and challenges in comparison to the survey data that have been the basis for much policymaking to date. So far, however, capability-building efforts have been largely “artisanal” in nature. As a result, the ecosystem of evidence-based policymaking capacity-building efforts is thin and weakly connected.

Each attempt to add a node to the system faces multiple barriers that require substantial time, effort, and luck to address. Those barriers are systemic. Too much attention is paid to the interests of researchers rather than to the engagement of data producers, and individual projects serve focused needs and operate at a relative distance from one another. Researchers, policymakers, and funding agencies thus need to move from these artisanal efforts to new, generalized solutions that will catalyze the creation of a robust, large-scale data infrastructure for evidence-based policymaking.

This infrastructure will have to be a “complex, adaptive ecosystem” that expands, regenerates, and replicates as needed while allowing customization and local control. To create a path for achieving this goal, the U.S. Partnership on Mobility from Poverty commissioned 12 papers and then hosted a day-long gathering (January 23, 2017) of over 60 experts to discuss findings and implications for action. Funded by the Gates Foundation, the papers and workshop panels were organized around three topics: privacy and confidentiality, data providers, and comprehensive strategies.

This issue of the Annals showcases those 12 papers, which jointly propose solutions for catalyzing the development of a data infrastructure for evidence-based policymaking.

This preface:

  • places current evidence-based policymaking efforts in historical context,
  • briefly describes the nature of multiple current efforts,
  • provides a conceptual framework for catalyzing the growth of any large institutional ecosystem,
  • identifies the major dimensions of the data infrastructure ecosystem,
  • describes key barriers to the expansion of that ecosystem, and
  • suggests a roadmap for catalyzing that expansion….(More)

(All 12 papers can be accessed here).

Rawification and the careful generation of open government data


Paper in Social Studies of Science: “Drawing on a two-year ethnographic study within several French administrations involved in open data programs, this article aims to investigate the conditions of the release of government data – the rawness of which open data policies require. This article describes two sets of phenomena. First, far from being taken for granted, open data emerge in administrations through a progressive process that entails uncertain collective inquiries and extraction work. Second, the opening process draws on a series of transformations, as data are modified to satisfy an important criterion of open data policies: the need for both human and technical intelligibility. There are organizational consequences of these two points, which can notably lead to the visibilization or the invisibilization of data labour. Finally, the article invites us to reconsider the apparent contradiction between the process of data release and the existence of raw data. Echoing the vocabulary of one of the interviewees, the multiple operations can be seen as a ‘rawification’ process by which open government data are carefully generated. Such a notion notably helps to build a relational model of what counts as data and what counts as work….(More)”.

Is Crowdsourcing Patient-Reported Outcomes the Future of Evidence-Based Medicine?


Paper by Mor Peleg, Tiffany I. Leung, Manisha Desai and Michel Dumontier: “Evidence is lacking for patient-reported effectiveness of treatments for most medical conditions and specifically for lower back pain. In this paper, we examined a consumer-based social network that collects patients’ treatment ratings as a potential source of evidence. Acknowledging the potential biases of this data set, we used propensity score matching and generalized linear regression to account for confounding variables. To evaluate validity, we compared results obtained by analyzing the patient reported data to results of evidence-based studies. Overall, there was agreement on the relationship between back pain and being obese. In addition, there was agreement about which treatments were effective or had no benefit. The patients’ ratings also point to new evidence that postural modification treatment is effective and that surgery is harmful to a large proportion of patients….(More)”.
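Propensity score matching of the kind the authors describe can be sketched as follows. This is a generic illustration, not the paper’s analysis code; the gradient-descent logistic fit and the greedy 1:1 matching rule are assumptions chosen for brevity:

```python
import numpy as np

def fit_propensity(X, treated, lr=0.1, steps=2000):
    """Estimate P(treatment | covariates) with a small logistic regression
    fit by gradient descent (no external ML library needed)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - treated) / len(treated)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def match_treated_to_controls(scores, treated):
    """Greedy 1:1 nearest-neighbour matching on the propensity score."""
    treated_idx = np.where(treated == 1)[0]
    controls = set(np.where(treated == 0)[0])
    pairs = []
    for i in treated_idx:
        j = min(controls, key=lambda c: abs(scores[c] - scores[i]))
        pairs.append((i, j))
        controls.remove(j)  # each control is used at most once
    return pairs
```

Comparing outcomes within the matched pairs, rather than across the raw groups, is what lets an observational data set of self-reported ratings approximate a controlled comparison.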

Powerlessness and the Politics of Blame


The Jefferson Lecture in the Humanities by Martha C. Nussbaum: “… I believe the Greeks and Romans are right: anger is a poison to democratic politics, and it is all the worse when fueled by a lurking fear and a sense of helplessness. As a philosopher I have been working on these ideas for some time, first in a 2016 book called Anger and Forgiveness, and now in a book in progress called The Monarchy of Fear, investigating the relationship between anger and fear. In my work, I draw not only on the Greeks and Romans, but also on some recent figures, as I shall tonight. I conclude that we should resist anger in ourselves and inhibit its role in our political culture.

That idea, however, is radical and evokes strong opposition. For anger, with all its ugliness, is a popular emotion. Many people think that it is impossible to care for justice without anger at injustice, and that anger should be encouraged as part of a transformative process. Many also believe that it is impossible for individuals to stand up for their own self-respect without anger, that someone who reacts to wrongs and insults without anger is spineless and downtrodden. Nor are these ideas confined to the sphere of personal relations. The most popular position in the sphere of criminal justice today is retributivism, the view that the law ought to punish aggressors in a manner that embodies the spirit of justified anger. And it is also very widely believed that successful challenges against great injustice need anger to make progress.

Still, we may persist in our Aeschylean skepticism, remembering that recent years have seen three noble and successful freedom movements conducted in a spirit of non-anger: those of Mohandas Gandhi, Martin Luther King, Jr., and Nelson Mandela—surely people who stood up for their self-respect and that of others, and who did not acquiesce in injustice.

I’ll now argue that a philosophical analysis of anger can help us support these philosophies of non-anger, showing why anger is fatally flawed from a normative viewpoint—sometimes incoherent, sometimes based on bad values, and especially poisonous when people use it to deflect attention from real problems that they feel powerless to solve.  Anger pollutes democratic politics and is of dubious value in both life and the law. I’ll present my general view, and then show its relevance to thinking well about the struggle for political justice, taking our own ongoing struggle for racial justice as my example. And I’ll end by showing why these arguments make it urgent for us to learn from literature and philosophy, keeping the humanities strong in our society….(More)”

Public Data Is More Important Than Ever–And Now It’s Easier To Find


Meg Miller at Co.Design: “Public data, in theory, is meant to be accessible to everyone. But in practice, even finding it can be near impossible, to say nothing of figuring out what to do with it once you do. Government data websites are often clunky and outdated, and some data is still trapped on physical media–like CDs or individual hard drives.

Tens of thousands of these CDs and hard drives, full of data on topics from Arkansas amusement parks to fire incident reporting, have arrived at the doorstep of the New York-based start-up Enigma over the past four years. The company has obtained thousands upon thousands more datasets by way of Freedom of Information Act (FOIA) requests. Enigma specializes in open data: gathering it, curating it, and analyzing it for insights into a client’s industry, for example, or for public service initiatives.

Enigma also shares its 100,000 datasets with the world through an online platform called Public—the broadest collection of public data that is open and searchable by everyone. Public has been around since Enigma launched in 2013, but today the company is introducing a redesigned version of the site that’s fresher and more user-friendly, with easier navigation and additional features that allow users to drill further down into the data.

But while the first iteration of Public was mostly concerned with making Enigma’s enormous trove of data—which it was already gathering and reformatting for client work—accessible to the public, the new site focuses more on linking that data in new ways. For journalists, researchers, and data scientists, the tool will offer more sophisticated ways of making sense of the data that they have access to through Enigma….

…the new homepage also curates featured datasets and collections to reinforce a sense of discoverability. For example, an Enigma-curated collection of U.S. sanctions data from the U.S. Treasury Department’s Office of Foreign Assets Control (OFAC) shows data on the restrictions on entities or individuals that American companies can and can’t do business with in an effort to achieve specific national security or foreign policy objectives. A new round of sanctions against Russia has been in the news lately, as an effort by President Trump to loosen restrictions on blacklisted businesses and individuals in Russia was overruled by the Senate last week. Enigma’s curated data selection on U.S. sanctions could help journalists contextualize recent events with data that shows changes in sanctions lists over time by presidential administration, for instance–or they could compare the U.S. sanctions list to the European Union’s….(More).

Blockchains, personal data and the challenge of governance


Theo Bass at NESTA: “…There are a number of dominant internet platforms (Google, Facebook, Amazon, etc.) that hoard, analyse and sell information about their users in the name of a more personalised and efficient service. This has become a problem.

People feel they are losing control over how their data is used and reused on the web. 500 million adblocker downloads is a symptom of a market which isn’t working well for people. As Irene Ng mentions in a recent guest blog on the Nesta website, the secondary data market is thriving (online advertising is a major player), as companies benefit from the opacity surrounding where profit is made from personal data.

It’s said that blockchain’s key characteristics could provide a foundational protocol for a fairer digital identity system on the web. Beyond its application as digital currency, blockchain could provide a new set of technical standards for transparency, openness, and user consent, on top of which a whole new generation of services might be built.

While the aim is ambitious, a handful of projects are rising to the challenge.

Blockstack is creating a global system of digital IDs, which are written into the bitcoin blockchain. Nobody can touch them other than the owner of that ID. Blockstack is building a new generation of applications on top of this infrastructure which promises to provide “a new decentralized internet where users own their data and apps run locally”.

Sovrin attempts to provide users with “self-sovereign identity”. The argument is that “centralized” systems for storing personal data make it a “treasure chest for attackers”. Sovrin argues that users should more easily be able to have “ownership” over their data, and the exchange of data should be made possible through a decentralised, tamper-proof ledger of transactions between users.

Our own DECODE project is piloting a set of collaboratively owned, local sharing economy platforms in Barcelona and Amsterdam. The blockchain aims to provide a public record of entitlements over where people’s data is stored, who can access it and for what purpose (with some additional help from new techniques in zero-knowledge cryptography to preserve people’s privacy).
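The idea of a public, tamper-evident record of entitlements can be illustrated with a toy hash chain. To be clear, this is a sketch and not DECODE’s protocol — it omits decentralisation, consensus, and the zero-knowledge techniques mentioned above, and the class and field names are hypothetical:

```python
import hashlib
import json

class EntitlementLedger:
    """Toy append-only ledger: each entry commits to the previous entry's
    hash, so any later tampering breaks the chain and is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, owner, accessor, purpose):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"owner": owner, "accessor": accessor,
                "purpose": purpose, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})
        return digest

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("owner", "accessor", "purpose", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != digest:
                return False
            prev = e["hash"]
        return True
```

Because each entry’s hash covers the previous hash, rewriting any recorded entitlement invalidates every entry that follows it — which is the property that makes such a record auditable by the people whose data it describes.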

There’s no doubt this is an exciting field of innovation. But the debate is characterised by a lot of hype. The following sections therefore discuss some of the challenges thrown up when we start thinking about implementations beyond bitcoin.

Blockchains and the challenge of governance

As mentioned above, bitcoin is a “bearer asset”. This is a necessary feature of decentralisation — all users maintain sole ownership over the digital money they hold on the network. If users get hacked (digital wallets sometimes do), or if a password gets lost, the money is irretrievable.

While the example of losing a password might seem trivial, it highlights some difficult questions for proponents of blockchain’s wider uses. What happens if there’s a dispute over an online transaction, but no intermediary to settle it? What happens if someone’s digital assets or digital identity is breached and sensitive data falls into the wrong hands? It might be necessary to assign responsibility to a governing actor to help resolve the issue, but of course this would require the introduction of a trusted middleman.

Bitcoin doesn’t try to answer these questions; its anonymous creators deliberately tried to avoid implementing a clear model of governance over the network, probably because they knew that bitcoin would be used by people as a method for subverting the law. Bitcoin still sees a lot of use in gray economies, including for the sale of drugs and gambling.

But if blockchains are set to enter the mainstream, providing for businesses, governments and nonprofits, then they won’t be able to function irrespective of the law. They will need to find use-cases that can operate alongside legal frameworks and jurisdictional boundaries. They will need to demonstrate regulatory compliance, create systems of rules and provide accountability when things go awry. This cannot just be solved through increasingly sophisticated coding.

All of this raises a potential paradox recently elaborated in a post by Vili Lehdonvirta of the Oxford Internet Institute: is it possible to successfully govern blockchains without undermining their entire purpose?….

If blockchain advocates only work towards purely technical solutions and ignore real-world challenges of trying to implement decentralisation, then we’ll only ever see flawed implementations of the technology. This is already happening in the form of centrally administered, proprietary or ‘half-baked’ blockchains, which don’t offer much more value than traditional databases….(More)”.

The Age of Customer.gov: Can the Tech that Drives 311 Help Government Deliver an Amazon-like Experience?


Tod Newcombe  at GovTech: “The Digital Communities Special … June 2017 report explores the idea that the tech that drives 311 can help government deliver an Amazon-like experience.

PART 1: 311: FROM A HOTLINE TO A PLATFORM FOR CITIZEN ENGAGEMENT

PART 2: CLOUD 311 POPULARITY GROWS AS CITIES OF ALL SIZES MOVE TO REMOTELY HOSTED CRM

PART 3: THE FUTURE OF CRM AND CUSTOMER SERVICE: LOOK TO BOSTON

PART 4: CRM USE IS GAINING TRACTION IN LOCAL GOVERNMENT — HERE ARE THE NUMBERS TO PROVE IT…(More)”.