
Stefaan Verhulst

Open Access Book by Justin Parkhurst: “There has been an enormous increase in interest in the use of evidence for public policymaking, but the vast majority of work on the subject has failed to engage with the political nature of decision making and how this influences the ways in which evidence will be used (or misused) within political arenas. This book provides new insights into the nature of political bias with regards to evidence and critically considers what an ‘improved’ use of evidence would look like from a policymaking perspective.

Part I describes the great potential for evidence to help achieve social goals, as well as the challenges raised by the political nature of policymaking. It explores the concern of evidence advocates that political interests drive the misuse or manipulation of evidence, as well as counter-concerns of critical policy scholars about how appeals to ‘evidence-based policy’ can depoliticise political debates. Both concerns reflect forms of bias – the first representing technical bias, whereby evidence use violates principles of scientific best practice, and the second representing issue bias in how appeals to evidence can shift political debates to particular questions or marginalise policy-relevant social concerns.

Part II then draws on the fields of policy studies and cognitive psychology to understand the origins and mechanisms of both forms of bias in relation to political interests and values. It illustrates how such biases are not only common, but can be much more predictable once we recognise their origins and manifestations in policy arenas.

Finally, Part III discusses ways to move forward for those seeking to improve the use of evidence in public policymaking. It explores what constitutes ‘good evidence for policy’, as well as the ‘good use of evidence’ within policy processes, and considers how to build evidence-advisory institutions that embed key principles of both scientific good practice and democratic representation. Taken as a whole, the approach promoted is termed the ‘good governance of evidence’ – a concept that represents the use of rigorous, systematic and technically valid pieces of evidence within decision-making processes that are representative of, and accountable to, populations served….(More)”.

The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence

Paper by Michael McGann, Emma Blomkamp and Jenny M. Lewis in Policy Sciences: “Governments are increasingly turning to public sector innovation (PSI) labs to take new approaches to policy and service design. This turn towards PSI labs, which has accelerated in more recent years, has been linked to a number of trends. These include growing interest in evidence-based policymaking and the application of ‘design thinking’ to policymaking, although these trends sit uncomfortably together. According to their proponents, PSI labs are helping to create a new era of experimental government and rapid experimentation in policy design.

But what do these PSI labs do? How do they differ from other public sector change agents and policy actors? What approaches do they bring to addressing contemporary policymaking? And how do they relate to other developments in policy design such as the growing interest in evidence-based policy and design experiments? The rise of PSI labs has thus far received little attention from policy scientists. Focusing on the problems associated with conceptualising PSI labs and clearly situating them in the policy process, this paper provides an analysis of some of the most prominent PSI labs. It examines whether labs can be classified into distinct types, their relationship to government and other policy actors and the principal methodological practices and commitments underpinning their approach to policymaking. Throughout, the paper considers how the rise of PSI labs may challenge positivist framings of policymaking as an empirically driven decision process….(More)”.

The rise of public sector innovation labs: experiments in design thinking for policy

Paper by Rachel Baker, Thomas Dee, Brent Evans and June John: “While online learning environments are increasingly common, relatively little is known about issues of equity in these settings. We test for the presence of race and gender biases among postsecondary students and instructors in online classes by measuring student and instructor responses to discussion comments we posted in the discussion forums of 124 different online courses. Each comment was randomly assigned a student name connoting a specific race and gender. We find that instructors are 94% more likely to respond to forum posts by White male students. In contrast, we do not find general evidence of biases in student responses. However, we do find that comments placed by White females are more likely to receive a response from White female peers. We discuss the implications of our findings for our understanding of social identity dynamics in classrooms and the design of equitable online learning environments….(More)”.

Bias in Online Classes: Evidence from a Field Experiment

Benson S. Hsu, MD and Emily Griese in Harvard Business Review: “At Sanford Health, a $4.5 billion rural integrated health care system, we deliver care to over 2.5 million people in 300 communities across 250,000 square miles. In the process, we collect and store vast quantities of patient data – everything from admission, diagnostic, treatment and discharge data to online interactions between patients and providers, as well as data on providers themselves. All this data clearly represents a rich resource with the potential to improve care, but until recently was underutilized. The question was, how best to leverage it.

While we have a mature data infrastructure including a centralized data and analytics team, a standalone virtual data warehouse linking all data silos, and strict enterprise-wide data governance, we reasoned that the best way forward would be to collaborate with other institutions that had additional and complementary data capabilities and expertise.

We reached out to potential academic partners who were leading the way in data science, from university departments of math, science, and computer informatics to business and medical schools and invited them to collaborate with us on projects that could improve health care quality and lower costs. In exchange, Sanford created contracts that gave these partners access to data whose use had previously been constrained by concerns about data privacy and competitive-use agreements. With this access, academic partners are advancing their own research while providing real-world insights into care delivery.

The resulting Sanford Data Collaborative, now in its second year, has attracted regional and national partners and is already beginning to deliver data-driven innovations that are improving care delivery, patient engagement, and care access. Here we describe three that hold particular promise.

  • Developing Prescriptive Algorithms…
  • Augmenting Patient Engagement…
  • Improving Access to Care…(More)”.
Making Better Use of Health Care Data

Yomi Kazeem in Quartz: “On Mar. 7, elections in Sierra Leone marked a global landmark: the world’s first ever blockchain-powered presidential elections….

In Sierra Leone’s Western District, the most populous in the country, votes cast were manually recorded by Agora, a Swiss foundation offering digital voting solutions, using a permissioned blockchain. The idea was simple: just like blockchain technology helps ensure transparency with cryptocurrency transactions using public ledgers, by recording each vote on blockchain, Agora ensured transparency with votes cast in the district. While entries on permissioned blockchains can be viewed by everyone, entries can only be validated by authorized persons.
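The core idea of a permissioned, publicly readable ledger can be sketched in a few lines. The following is a minimal illustration, not Agora's actual system: the class name, the writer identifiers, and the vote fields are all hypothetical, and a real deployment would involve distributed consensus rather than a single in-memory chain. Each entry commits to the hash of the previous one, so recorded votes cannot be silently altered, while anyone may read and re-verify the chain:

```python
import hashlib
import json

class PermissionedLedger:
    """Minimal append-only hash chain: anyone can read and verify the
    entries, but only authorized writers may add (validate) new ones."""

    def __init__(self, authorized_writers):
        self.authorized = set(authorized_writers)  # who may validate entries
        self.chain = []                            # publicly readable ledger

    def record_vote(self, writer, vote):
        if writer not in self.authorized:
            raise PermissionError(f"{writer} is not an authorized validator")
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        payload = json.dumps({"vote": vote, "prev": prev_hash}, sort_keys=True)
        entry = {"vote": vote, "prev": prev_hash,
                 "hash": hashlib.sha256(payload.encode()).hexdigest()}
        self.chain.append(entry)
        return entry["hash"]

    def verify(self):
        """Recompute every hash; any tampering breaks the chain."""
        prev = "0" * 64
        for e in self.chain:
            payload = json.dumps({"vote": e["vote"], "prev": e["prev"]},
                                 sort_keys=True)
            if e["prev"] != prev or e["hash"] != hashlib.sha256(payload.encode()).hexdigest():
                return False
            prev = e["hash"]
        return True

# Hypothetical usage: one accredited observer records two ballots.
ledger = PermissionedLedger(authorized_writers={"observer-1"})
ledger.record_vote("observer-1", {"station": "WD-014", "candidate": "A"})
ledger.record_vote("observer-1", {"station": "WD-014", "candidate": "B"})
print(ledger.verify())  # True: the chain is internally consistent
```

Because each entry's hash covers the previous entry's hash, changing any earlier vote invalidates every later entry, which is what makes the public ledger auditable after the fact.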

A lack of transparency has plagued many elections around the world, particularly in some African countries where large sections of the electorate often suspect that incumbent parties or ethnic loyalties have driven the manipulation of results in favor of one candidate or another. These suspicions remain even when there is little evidence of manipulation. A more transparent system could help restore trust.

Leonardo Gammar, CEO of Agora, says Sierra Leone’s NEC was “open minded” about the potential of blockchain in its elections after talks began late last year. “I also thought that if we can do it in Sierra Leone, we can do it everywhere else,” he says. That thinking is rooted in Sierra Leone’s developmental challenges which make electoral transparency difficult: poor network connectivity, low literacy levels and frequent electoral violence.

The big picture for Agora is to deploy solutions to automate the entire electoral process with citizens voting electronically using biometric data and personalized cryptographic keys and the votes in turn validated by blockchain. Gammar hopes Agora can replicate its work in other African elections on a larger scale but admits that doing so will require understanding the differing challenges each country faces.

Gammar says blockchain-powered electronic voting will be cheaper for African countries by cutting out the printing cost of paper-based elections but perhaps, more importantly, vastly reduce electoral violence…(More)”.

The world’s first blockchain-powered elections just happened in Sierra Leone

Fei-Fei Li in the New York Times: “For a field that was not well known outside of academia a decade ago, artificial intelligence has grown dizzyingly fast. Tech companies from Silicon Valley to Beijing are betting everything on it, venture capitalists are pouring billions into research and development, and start-ups are being created on what seems like a daily basis. If our era is the next Industrial Revolution, as many claim, A.I. is surely one of its driving forces.

It is an especially exciting time for a researcher like me. When I was a graduate student in computer science in the early 2000s, computers were barely able to detect sharp edges in photographs, let alone recognize something as loosely defined as a human face. But thanks to the growth of big data, advances in algorithms like neural networks and an abundance of powerful computer hardware, something momentous has occurred: A.I. has gone from an academic niche to the leading differentiator in a wide range of industries, including manufacturing, health care, transportation and retail.

I worry, however, that enthusiasm for A.I. is preventing us from reckoning with its looming effects on society. Despite its name, there is nothing “artificial” about this technology — it is made by humans, intended to behave like humans and affects humans. So if we want it to play a positive role in tomorrow’s world, it must be guided by human concerns.

I call this approach “human-centered A.I.” It consists of three goals that can help responsibly guide the development of intelligent machines.

First, A.I. needs to reflect more of the depth that characterizes our own intelligence….

No technology is more reflective of its creators than A.I. It has been said that there are no “machine” values at all, in fact; machine values are human values. A human-centered approach to A.I. means these machines don’t have to be our competitors, but partners in securing our well-being. However autonomous our technology becomes, its impact on the world — for better or worse — will always be our responsibility….(More).

How to Make A.I. That’s Good for People

Jennifer L. Gustetic et al in Space Policy: “Beginning in 2012, NASA utilized a strategic process to identify broad societal questions, or grand challenges, that are well suited to the aerospace sector and align with national priorities. This effort generated NASA’s first grand challenge, the Asteroid Grand Challenge (AGC), a large-scale effort using multi-disciplinary collaborations and innovative engagement mechanisms focused on finding and addressing asteroid threats to human populations. In April 2010, President Barack Obama announced a mission to send humans to an asteroid by 2025. This resulted in the agency’s Asteroid Redirect Mission (ARM) to leverage and maximize existing robotic and human efforts to capture and reroute an asteroid, with the goal of eventual human exploration. The AGC, initiated in 2013, complemented ARM by expanding public participation, partnerships, and other approaches to find, understand, and overcome these potentially harmful asteroids.

This paper describes a selection of AGC activities implemented from 2013 to 2017 and their results, excluding those conducted by NASA’s Near-Earth Object Observations Program and other organizations. The strategic development of the initiative is outlined as well as initial successes, strengths, and weaknesses resulting from the first four years of AGC activities and approaches. Finally, we describe lessons learned and areas for continued work and study. The AGC lessons learned and strategies could inform the work of other agencies and organizations seeking to conduct a global scientific investigation with matrixed organizational support, multiple strategic partners, and numerous internal and external open innovation approaches and audiences….(More)”.

 

NASA’s Asteroid Grand Challenge: Strategy, results, and lessons learned

Cathie Anderson in the Sacramento Bee: “Tech entrepreneurs and academic researchers are tracking the spread of flu in real-time, collecting data from social media and internet-connected devices that show startling accuracy when compared against surveillance data that public health officials don’t report until a week or two later….

Smart devices and mobile apps have the potential to reshape public health alerts and responses… For instance, the staff of smart thermometer maker Kinsa were receiving temperature readings that augured the surge of flu patients in emergency rooms there.

Kinsa thermometers are part of the movement toward the Internet of Things – devices that automatically transmit information to a database. No personal information is shared, unless users decide to input information such as age and gender. Using data from more than 1 million devices in U.S. homes, the staff is able to track fever as it hits and use an algorithm to estimate impact for a broader population….

Computational researcher Aaron Miller worked with an epidemiological team at the University of Iowa to assess the feasibility of using Kinsa data to forecast the spread of flu. He said the team first built a model using surveillance data from the CDC and used it to forecast the spread of influenza. Then the team created a model where they integrated the data from Kinsa along with that from the CDC.

“We got predictions that were … 10 to 50 percent better at predicting the spread of flu than when we used CDC data alone,” Miller said. “Potentially, in the future, if you had granular information from the devices and you had enough information, you could imagine doing analysis on a really local level to inform things like school closings.”
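The integration Miller describes, augmenting a CDC-only forecasting model with a second real-time data stream, can be illustrated with a toy autoregression. This is a sketch under loose assumptions, not the Iowa team's actual model: the weekly numbers below are invented, and the method (next-week prediction via least squares, with the thermometer signal as an optional extra feature) is deliberately simplified:

```python
import numpy as np

# Toy weekly series: CDC influenza-like-illness rates (reported with a lag)
# and same-week aggregate fever readings from home thermometers.
# All numbers are hypothetical.
cdc_ili = np.array([1.2, 1.4, 1.9, 2.6, 3.5, 4.1, 3.8, 3.0, 2.2, 1.6])
fever_rate = np.array([1.5, 1.8, 2.4, 3.2, 4.0, 4.3, 3.6, 2.7, 2.0, 1.4])

def forecast_rmse(use_fever):
    """Predict next week's CDC rate from this week's CDC rate,
    optionally adding the thermometer signal as a second feature."""
    features = [np.ones(len(cdc_ili) - 1), cdc_ili[:-1]]
    if use_fever:
        features.append(fever_rate[:-1])
    X = np.column_stack(features)
    y = cdc_ili[1:]                       # next week's observed rate
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    return float(np.sqrt(np.mean(residuals ** 2)))  # in-sample RMSE

rmse_cdc_only = forecast_rmse(use_fever=False)
rmse_combined = forecast_rmse(use_fever=True)
print(rmse_cdc_only, rmse_combined)
```

In this in-sample sketch the combined model fits at least as well by construction; the substantive claim in the article, that the real-time stream improved out-of-sample forecasts by 10 to 50 percent, is a stronger result that requires proper holdout evaluation.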

While Kinsa uses readings taken in homes, academic researchers and companies such as sickweather.com are using crowdsourcing from social media networks to provide information on the spread of flu. Siddharth Shah, a transformational health industry analyst at Frost & Sullivan, pointed to an award-winning international study led by researchers at Northeastern University that tracked flu through Twitter posts and other key parameters of flu.

When compared with official influenza surveillance systems, the researchers said, the model accurately forecast the evolution of influenza up to six weeks in advance, much earlier than prior models. Such advance warnings would give health agencies significantly more time to expand upon medical resources or to alert the public to measures they can take to prevent transmission of the disease….

For now, Shah said, technology will probably only augment or complement traditional public data streams. However, he added, innovations already are changing how diseases are tracked. Chronic disease management, for instance, is going digital with offerings such as Omada Health, which helps people with Type 2 diabetes better manage health challenges, and Noom, a mobile app that helps people stop dieting and instead work toward true lifestyle change….(More).

How tech used to track the flu could change the game for public health response

Chapter by Sheila Foster and Christian Iaione in Routledge Handbook of the Study of the Commons (Dan Cole, Blake Hudson, Jonathan Rosenbloom eds.): “If cities are the places where most of the world’s population will be living in the next century, as is predicted, it is not surprising that they have become sites of contestation over use and access to urban land, open space, infrastructure, and culture. The question posed by Saskia Sassen in a recent essay—who owns the city?—is arguably at the root of these contestations and of social movements that resist the enclosure of cities by economic elites (Sassen 2015). One answer to the question of who owns the city is that we all do. In our work we argue that the city is a common good or a “commons”—a shared resource that belongs to all of its inhabitants, and to the public more generally.

We have been writing about the urban commons for the last decade, very much inspired by the work of Jane Jacobs and Elinor Ostrom. The idea of the urban commons captures the ecological view of the city that characterizes Jane Jacobs’s classic work, The Death and Life of Great American Cities. (Foster 2006) It also builds on Elinor Ostrom’s finding that common resources are capable of being collectively managed by users in ways that support their needs yet sustain the resource over the long run (Ostrom 1990).

Jacobs analyzed cities as complex, organic systems and observed the activity within them at the neighborhood and street level, much like an ecologist would study natural habitats and the species interacting within them. She emphasized the diversity of land use, of people and neighborhoods, and the interaction among them as important to maintaining the ecological balance of urban life in great cities like New York. Jacobs’s critique of the urban renewal slum clearance programs of the 1940s and 50s in the United States was focused not just on the destruction of physical neighborhoods, but also on the destruction of the “irreplaceable social capital” — the networks of residents who build and strengthen working relationships over time through trust and voluntary cooperation — necessary for “self-governance” of urban neighborhoods. (Jacobs 1961) As political scientist Douglas Rae has written, this social capital is the “civic fauna” of urbanism (Rae 2003)…(More)”.

Ostrom in the City: Design Principles and Practices for the Urban Commons

Christophe Koettl in the New York Times: “In mid-February a source in the human rights community told me that villages in a remote region of the Democratic Republic of Congo were being burned amid a renewal of communal fighting. People fleeing the violence told aid workers of arson attacks.

The clashes between the Hema and Lendu communities — on the eastern side of the Ituri province, bordering Uganda — started in December and escalated in early February.

Historically, these distant conflicts have been difficult to analyze. But new technologies allow us to investigate them in close to real time.

I immediately collected active-fire data from NASA — thermal anomalies, or hot spots, that are recorded daily. It showed dozens of fires on the densely forested mountain ridge and along the shoreline of Lake Albert, one of the African Great Lakes between Congo and Uganda.

(Human rights groups also used this type of data, in combination with other evidence, to document the military’s scorched-earth campaign against the Rohingya in Myanmar.)

Active-fire data does not provide the cause of a fire, so one must exercise caution in interpreting it, especially when researching violence. It is more commonly used to track wildfires and agricultural fires.

The satellites that collect this information do not provide actual images; they only record the location of active fires, and very large ones at that. So don’t get your hopes up about watching your neighbors barbecue from space — we aren’t quite there yet.

Google and other online mapping platforms often show only blurry satellite images, or have no location names for remote areas such as the small fishing villages around Lake Albert. This makes it difficult to find places where people live. To deal with this challenge, I exported residential data from the online mapping site Openstreetmap.

I then overlaid the NASA data with this new data in Google Earth to look for recorded fires that were in or near populated places. This process gave me a shortlist of 10 locations to investigate.
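The overlay step, intersecting active-fire detections with populated places, amounts to a proximity filter. The sketch below is a hypothetical reconstruction of that workflow with invented coordinates, not the author's actual data or tooling: it shortlists villages that have a satellite-recorded fire within a chosen radius, using a great-circle distance:

```python
import math

# Hypothetical inputs: NASA active-fire detections and village centroids
# exported from OpenStreetMap, both as (latitude, longitude) pairs.
fires = [(1.42, 30.85), (1.75, 30.55), (2.10, 31.20)]
villages = {"village_a": (1.43, 30.86), "village_b": (2.50, 30.00)}

def haversine_km(p, q):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def fires_near_settlements(fires, villages, radius_km=5):
    """Shortlist villages with at least one detection within radius_km."""
    shortlist = {}
    for name, centre in villages.items():
        hits = [f for f in fires if haversine_km(centre, f) <= radius_km]
        if hits:
            shortlist[name] = hits
    return shortlist

print(fires_near_settlements(fires, villages))
# Only village_a has a detection nearby and warrants imagery review.
```

As the article notes, proximity alone says nothing about the cause of a fire; the shortlist is only a starting point for verification against high-resolution imagery.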


Location of satellite-recorded active fires (the flames) and residential area data (the white outlines) helped to identify remote locations that had possibly been burned. Credit: © Google Earth/DigitalGlobe

Next, the satellite company DigitalGlobe provided me with high-resolution satellite imagery and analysis of these places. The results were disturbing: All the villages I had identified were at least partially burned, with hundreds of destroyed homes.

As this was not a comprehensive analysis of the whole area affected by violence, the actual number of burned villages is probably much higher. Aid organizations are reporting around 70 burned villages and more than 2,000 destroyed homes.

This new visual evidence provided us with a strong basis to report out the whole story. We now had details from both sides of the lake, not just at the refugee landing site in Uganda….(More)”

How We Identified Burned Villages in the Democratic Republic of Congo
