
Paper by Jonas Traub et al.: “Data science and artificial intelligence are driven by a plethora of diverse data-related assets including datasets, data streams, algorithms, processing software, compute resources, and domain knowledge. As providing all these assets requires a huge investment, data science and artificial intelligence are currently dominated by a small number of providers who can afford these investments. In this paper, we present a vision of a data ecosystem to democratize data science and artificial intelligence. In particular, we envision a data infrastructure for fine-grained asset exchange in combination with scalable systems operation. This will overcome lock-in effects and remove entry barriers for new asset providers. Our goal is to enable companies, research organizations, and individuals to have equal access to data, data science, and artificial intelligence. Such an open ecosystem has recently been put on the agenda of several governments and industrial associations. We point out the requirements and the research challenges as well as outline an initial data infrastructure architecture for building such a data ecosystem…(More)”.

Agora: Towards An Open Ecosystem for Democratizing Data Science & Artificial Intelligence

David Spiegelhalter at Aeon: “…Many criticised the Leave campaign for its claim that Britain sends the EU £350 million a week. When Boris Johnson repeated it in 2017 – by which time he was Foreign Secretary – the chair of the UK Statistics Authority (the official statistical watchdog) rebuked him, noting it was a ‘clear misuse of official statistics’. A private criminal prosecution was even made against Johnson for ‘misconduct in a public office’, but it was halted by the High Court.

The message on the bus had a strong emotional resonance with millions of people, even though it was essentially misinformation. The episode demonstrates both the power and weakness of statistics: they can be used to amplify an entire worldview, and yet they often do not stand up to scrutiny. This is why statistical literacy is so important – in an age in which data plays an ever-more prominent role in society, the ability to spot ways in which numbers can be misused, and to be able to deconstruct claims based on statistics, should be a standard civic skill.

Statistics are not cold hard facts – as Nate Silver writes in The Signal and the Noise (2012): ‘The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning.’ Not only has someone used extensive judgment in choosing what to measure, how to define crucial ideas, and how to analyse them, but the manner in which they are communicated can utterly change their emotional impact. Let’s assume that £350 million is the actual weekly contribution to the EU. I often ask audiences to suggest what they would put on the side of the bus if they were on the Remain side. A standard option for making an apparently big number look small is to consider it as a proportion of an even bigger number: for example, the UK’s GDP is currently around £2.3 trillion, and so this contribution would comprise less than 1 per cent of GDP, around six months’ typical growth. An alternative device is to break down expenditure into smaller, more easily grasped units: for example, as there are 66 million people in the UK, £350 million a week is equivalent to around 75p a day, less than $1, say about the cost of a small packet of crisps (potato chips). If the bus had said: ‘We each send the EU the price of a packet of crisps each day’, the campaign might not have been so successful.

Numbers are often used to persuade rather than inform, statistical literacy needs to be improved, and so surely we need more statistics courses in schools and universities? Well, yes, but this should not mean more of the same. After years of researching and teaching statistical methods, I am not alone in concluding that the way in which we teach statistics can be counterproductive, with an overemphasis on mathematical foundations through probability theory, long lists of tests and formulae to apply, and toy problems involving, say, calculating the standard deviation of the weights of cod. The American Statistical Association’s Guidelines for Assessment and Instruction in Statistics Education (2016) strongly recommended changing the pedagogy of statistics into one based on problem-solving, real-world examples, and an emphasis on communication….(More)”.
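
A quick way to make Spiegelhalter’s bus-slogan arithmetic concrete is to reproduce it directly. The following minimal sketch (an illustration added here, not from the article) uses only the excerpt’s own figures of £350 million a week, a £2.3 trillion GDP, and 66 million people:

    # Reproducing the two reframings quoted above (illustrative sketch only).
    weekly_contribution = 350e6   # claimed weekly payment to the EU, in GBP
    uk_gdp = 2.3e12               # approximate UK GDP, in GBP
    uk_population = 66e6          # approximate UK population

    share_of_gdp = weekly_contribution * 52 / uk_gdp
    pence_per_person_per_day = weekly_contribution / uk_population / 7 * 100

    print(f"Annual contribution as a share of GDP: {share_of_gdp:.2%}")  # ~0.79%
    print(f"Cost per person per day: {pence_per_person_per_day:.0f}p")   # ~76p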

Citizens need to know numbers

Paper by Albert Bravo-Biosca: “Experimental approaches are increasingly being adopted across many policy fields, but innovation policy has been lagging. This paper reviews the case for policy experimentation in this field, describes the different types of experiments that can be undertaken, discusses some of the unique challenges to the use of experimental approaches in innovation policy, and summarizes some of the emerging lessons, with a focus on randomized trials. The paper concludes by describing how, at the Innovation Growth Lab, we have been working with governments across the OECD to help them overcome the barriers to policy experimentation in order to make their policies more impactful….(More)”.

Experimental Innovation Policy

Blog post by Jillian Campbell and David E. Jensen: “A range of frontier and digital technologies have dramatically boosted the ways in which we can monitor the health of our planet and sustain our future on it (Figure 1).

Figure 1. A range of frontier and digital technologies can be combined to monitor our planet and the sustainable use of natural resources (1)

If we can leverage this technology effectively, we will be able to assess and predict risks, increase transparency and accountability in the management of natural resources and inform markets as well as consumer choice. These actions are all required if we are to stand a better chance of achieving the Sustainable Development Goals (SDGs).

However, for this vision to become a reality, public and private sector actors must take deliberate action and collaborate to build a global digital ecosystem for the planet — one consisting of data, infrastructure, rapid analytics, and real-time insights. We are now at a pivotal moment in the history of our stewardship of this planet. A “tipping point” of sorts. And in order to guide the political action which is required to counter the speed, scope and severity of the environmental and climate crises, we must acquire and deploy these data sets and frontier technologies. Doing so can fundamentally change our economic trajectory and underpin a sustainable future.

This article shows how such a global digital ecosystem for the planet can be achieved — as well as what we risk if we do not take decisive action within the next 12 months….(More)”.

The promise and peril of a digital ecosystem for the planet

Claudia Williams at MedCityNews: “The path to value-based care is arduous. For health plans, the ability to manage care, assess quality, lower costs, and streamline reporting depends directly on access to clinical data. For providers, the same is true of claims data, which they often lack.

Providers and health plans are increasingly demanding integrated claims and clinical data to drive and support value-based care programs. These organizations know that clinical and claims information from more than a single organization is the only way to get a true picture of patient care. From avoiding medication errors to enabling an evidence-based approach to treatment or identifying at-risk patients, the value of integrated claims and clinical data is immense — and will have far-reaching influence on both health outcomes and costs of care over time.

On July 30, Medicare announced the Data at the Point of Care pilot to share valuable claims data with Medicare providers in order to “fill in information gaps for clinicians, giving them a more structured and complete patient history with information like previous diagnoses, past procedures, and medication lists.” But that’s not the only example. To transition from fee-for-service to value-based care, providers and health plans have begun to partner with health data networks to access integrated clinical and claims data: 

Health plan adoption of integrated data strategy

A California health plan is partnering with one of the largest nonprofit health data networks in California to better integrate clinical and claims data. …

Providers leveraging claims data to understand patient medication patterns 

Doctors using advanced health data networks typically see a full list of patients’ medications, derived from claims, when they treat them. With this information available, doctors can avoid dangerous drug-to-drug interactions when they prescribe new medications. After a visit, they can also follow up and see if a patient actually filled a prescription and is still taking it….(More)”.
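
As a rough illustration of the kind of safety check this enables, the sketch below flags known interactions against a claims-derived medication list. The drug names, interaction table, and function are hypothetical examples, not part of any product described above:

    # Hypothetical sketch: flag known drug-to-drug interactions before prescribing.
    INTERACTING_PAIRS = {
        frozenset({"warfarin", "aspirin"}),
        frozenset({"simvastatin", "clarithromycin"}),
    }

    def check_new_prescription(current_meds, new_drug):
        """Return the current medications known to interact with new_drug."""
        return [med for med in current_meds
                if frozenset({med, new_drug}) in INTERACTING_PAIRS]

    # Medication list reconstructed from pharmacy fill claims (toy data).
    claims_derived_meds = ["warfarin", "lisinopril"]
    print(check_new_prescription(claims_derived_meds, "aspirin"))  # ['warfarin']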

The business case for integrating claims and clinical data

Brief from the Data2X Big Data and Gender Brief Series by The GovLab, UNICEF, Universidad Del Desarrollo, Telefónica R&D Center, ISI Foundation, and DigitalGlobe: “Mobility is gendered. For example, the household division of labor in many societies leads women and girls to take more multi-purpose, multi-stop trips than men. Women-headed households also tend to work more in the informal sector, with limited access to transportation subsidies, and use of public transit is further reduced by the risk of violence in public spaces.

This brief summarizes a recent analysis of gendered urban mobility in 51 (out of 52) neighborhoods of Santiago, Chile, relying on the call detail records (CDRs) of a large sample of mobile phone users over a period of three months. We found that: 1) women move less overall than men; 2) they have a smaller radius of movement; and 3) they tend to concentrate their time in a smaller set of locations. These mobility gaps are linked to lower average incomes and fewer public and private transportation options. These insights, taken from large volumes of passively generated, inexpensive data streaming in real time, can help policymakers design more gender-inclusive urban transit systems….(More)”.
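
The brief’s actual pipeline is not reproduced here, but a minimal sketch of the two mobility measures it reports, radius of movement and concentration of time across locations, might look like the following. The record format, the use of raw coordinates, and the entropy-based concentration measure are simplifying assumptions rather than the authors’ method:

    # Toy CDR-style records: (latitude, longitude) of the tower handling each call.
    import math
    from collections import Counter

    records = [(-33.45, -70.66), (-33.45, -70.66), (-33.44, -70.65), (-33.50, -70.70)]

    def radius_of_gyration(points):
        """Root-mean-square distance of visited points from their centroid.
        (In degrees here; a real analysis would project to metres first.)"""
        lat_c = sum(p[0] for p in points) / len(points)
        lon_c = sum(p[1] for p in points) / len(points)
        return math.sqrt(sum((la - lat_c) ** 2 + (lo - lon_c) ** 2
                             for la, lo in points) / len(points))

    def location_entropy(points):
        """Shannon entropy of visits per location: lower means more concentrated."""
        counts = Counter(points)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(radius_of_gyration(records))  # smaller value = tighter movement radius
    print(location_entropy(records))    # smaller value = fewer distinct places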

Gender Gaps in Urban Mobility

Book edited by Albert Ali Salah, Alex Pentland, Bruno Lepri and Emmanuel Letouzé: “After the start of the Syrian Civil War in 2011–12, increasing numbers of civilians sought refuge in neighboring countries. By May 2017, Turkey had received over 3 million refugees — the largest refugee population in the world. Some lived in government-run camps near the Syrian border, but many have moved to cities looking for work and better living conditions. They faced problems of integration, income, welfare, employment, health, education, language, social tension, and discrimination. In order to develop sound policies to solve these interlinked problems, a good understanding of refugee dynamics is necessary.

This book summarizes the most important findings of the Data for Refugees (D4R) Challenge, which was a non-profit project initiated to improve the conditions of the Syrian refugees in Turkey by providing a database for the scientific community to enable research on urgent problems concerning refugees. The database, based on anonymized mobile call detail records (CDRs) of phone calls and SMS messages of one million Turk Telekom customers, indicates the broad activity and mobility patterns of refugees and citizens in Turkey for the period 1 January to 31 December 2017. Over 100 teams from around the globe applied to take part in the challenge, and 61 teams were granted access to the data.

This book describes the challenge, and presents selected and revised project reports on the five major themes: unemployment, health, education, social integration, and safety. These are complemented by additional invited chapters describing related projects from international governmental organizations, the underlying technological infrastructure, and ethical aspects. The last chapter includes policy recommendations, based on the lessons learned.

The book will serve as a guideline for creating innovative data-centered collaborations between industry, academia, government, and non-profit humanitarian agencies to deal with complex problems in refugee scenarios. It illustrates the possibilities of big data analytics in coping with refugee crises and humanitarian responses, by showcasing innovative approaches drawing on multiple data sources, information visualization, pattern analysis, and statistical analysis. It will also provide researchers and students working with mobility data with excellent coverage across data science, economics, sociology, urban computing, education, migration studies, and more….(More)”.

Guide to Mobile Data Analytics in Refugee Scenarios

Henry Farrell and Abraham L. Newman in International Security: “Liberals claim that globalization has led to fragmentation and decentralized networks of power relations. This does not explain how states increasingly “weaponize interdependence” by leveraging global networks of informational and financial exchange for strategic advantage. The theoretical literature on network topography shows how standard models predict that many networks grow asymmetrically so that some nodes are far more connected than others. This model nicely describes several key global economic networks, centering on the United States and a few other states. Highly asymmetric networks allow states with (1) effective jurisdiction over the central economic nodes and (2) appropriate domestic institutions and norms to weaponize these structural advantages for coercive ends. In particular, two mechanisms can be identified. First, states can employ the “panopticon effect” to gather strategically valuable information. Second, they can employ the “chokepoint effect” to deny network access to adversaries. Tests of the plausibility of these arguments across two extended case studies that provide variation both in the extent of U.S. jurisdiction and in the presence of domestic institutions—the SWIFT financial messaging system and the internet—confirm the framework’s expectations. A better understanding of the policy implications of the use and potential overuse of these tools, as well as the response strategies of targeted states, will recast scholarly debates on the relationship between economic globalization and state coercion….(More)”
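
The abstract’s point about asymmetric growth rests on standard preferential-attachment models of network formation. The minimal simulation below (an illustration added here, not the authors’ analysis) shows how such growth concentrates connections in a few hub nodes:

    # Preferential attachment with one edge per new node (Barabasi-Albert style):
    # new nodes link to existing nodes with probability proportional to degree.
    import random

    def preferential_attachment(n_nodes, seed=0):
        random.seed(seed)
        # 'stubs' lists each node once per edge end, so uniform sampling
        # from it is degree-proportional sampling.
        stubs = [0, 1]
        degree = {0: 1, 1: 1}
        for new in range(2, n_nodes):
            target = random.choice(stubs)
            degree[new] = 1
            degree[target] += 1
            stubs.extend([new, target])
        return degree

    deg = preferential_attachment(10_000)
    ranked = sorted(deg.values(), reverse=True)
    print("Top 5 node degrees:", ranked[:5])           # a few heavily connected hubs
    print("Median degree:", ranked[len(ranked) // 2])  # the typical node has one link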

Weaponized Interdependence: How Global Economic Networks Shape State Coercion

Press Release: “JFF, a national nonprofit driving transformation in the American workforce and education systems, and Persistence Plus, which pairs behavioral insights with intelligent text messaging to improve student success, today released the findings from an analysis that examined the effects of personalized nudging on nearly 10,000 community college students. The study, conducted over two years at four community colleges, found that behavioral nudging had a significant impact on student persistence rates—with strong improvements among students of color and older adult learners, who are often underrepresented among graduates of STEM (science, technology, engineering, and math) programs.

“These results offer powerful evidence on the potential, and imperative, of using technology to support students during the most in-demand, and often most challenging, courses and majors,” said Maria Flynn, president and CEO of JFF. “With millions of STEM jobs going unfilled, closing the gap in STEM achievement has profound economic—and equity—implications.” 

In a multiyear initiative called “Nudging to STEM Success,” which was funded by the Helmsley Charitable Trust, JFF and Persistence Plus selected four colleges to implement the nudging initiative campuswide: Lakeland Community College in Kirtland, Ohio; Lorain County Community College in Elyria, Ohio; Stark State College in North Canton, Ohio; and John Tyler Community College in Chester, Virginia.

A randomized control trial in the summer of 2017 showed that the nudges increased first-to-second-year persistence for STEM students by 10 percentage points. The results of that trial will be presented in an upcoming peer-reviewed paper titled “A Summer Nudge Campaign to Motivate Community College STEM Students to Reenroll.” The paper will be published in AERA Open, an open-access journal published by the American Educational Research Association. 

Following the 2017 trial, the four colleges scaled the support to nearly 10,000 students, and over the next two years, JFF and Persistence Plus found that the nudging support had a particularly strong impact on students of color and students over the age of 25—two groups that have historically had lower persistence rates than other students….(More)”.

Community Colleges Boost STEM Student Success Through Behavioral Nudging

Philip Zelikow at the Texas National Security Review: “Policymaking is a discipline, a craft, and a profession. Policymakers apply specialized knowledge — about other countries, politics, diplomacy, conflict, economics, public health, and more — to the practical solution of public problems. Effective policymaking is difficult. The “hardware” of policymaking — the tools and structures of government that frame the possibilities for useful work — is obviously important. Less obvious is that policy performance in practice often rests more on the “software” of public problem-solving: the way people size up problems, design actions, and implement policy. In other words, the quality of the policymaking.

Like policymaking, engineering is a discipline, a craft, and a profession. Engineers learn how to apply specialized knowledge — about chemistry, physics, biology, hydraulics, electricity, and more — to the solution of practical problems. Effective engineering is similarly difficult. People work hard to learn how to practice it with professional skill. But, unlike the methods taught for engineering, the software of policy work is rarely recognized or studied. It is not adequately taught. There are no canons or norms of professional practice. American policymaking is less about deliberate engineering and more about improvised guesswork and bureaucratized habits.

My experience is as a historian who studies the details of policy episodes and the related staff work, but also as a former official who has analyzed a variety of domestic and foreign policy issues at all three levels of American government, including federal work from different bureaucratic perspectives in five presidential administrations from Ronald Reagan to Barack Obama. From this historical and contemporary vantage point, I am struck (and a bit depressed) that the quality of U.S. policy engineering is actually much, much worse in recent decades than it was throughout much of the 20th century. This is not a partisan observation — the decline spans both Republican and Democratic administrations.

I am not alone in my observations. Francis Fukuyama recently concluded that “[t]he overall quality of the American government has been deteriorating steadily for more than a generation,” notably since the 1970s. In the United States, “the apparently irreversible increase in the scope of government has masked a large decay in its quality.”1 This worried assessment is echoed by other nonpartisan and longtime scholars who have studied the workings of American government.2 The 2003 National Commission on Public Service observed,

The notion of public service, once a noble calling proudly pursued by the most talented Americans of every generation, draws an indifferent response from today’s young people and repels many of the country’s leading private citizens. … The system has evolved not by plan or considered analysis but by accretion over time, politically inspired tinkering, and neglect. … The need to improve performance is urgent and compelling.3

And they wrote that as the American occupation of Iraq was just beginning.

In this article, I offer hypotheses to help explain why American policymaking has declined, and why it was so much more effective in the mid-20th century than it is today. I offer a brief sketch of how American education about policy work evolved over the past hundred years, and I argue that the key software qualities that made for effective policy engineering neither came out of the academy nor migrated back into it.

I then outline a template for doing and teaching policy engineering. I break the engineering methods down into three interacting sets of analytical judgments: about assessment, design, and implementation. In teaching, I lean away from new, cumbersome standalone degree programs and toward more flexible forms of education that can pair more easily with many subject-matter specializations. I emphasize the value of practicing methods in detailed and more lifelike case studies. I stress the significance of an organizational culture that prizes written staff work of the quality that used to be routine but has now degraded into bureaucratic or opinionated dross….(More)”.

To Regain Policy Competence: The Software of American Public Problem-Solving
