
Stefaan Verhulst

Testimony by Stefaan Verhulst before New York City Council Committee on Technology and the Commission on Public Information and Communication (COPIC): “We live in challenging times. From climate change to economic inequality, the difficulties confronting New York City, its citizens, and decision-makers are unprecedented in their variety, and also in their complexity and urgency. Our standard policy toolkit increasingly seems stale and ineffective. Existing governance institutions and mechanisms seem outdated and distrusted by large sections of the population.

To tackle today’s problems we need not only new solutions but also new methods for arriving at solutions. Data can play a central role in this task. Access to and the use of data in a trusted and responsible manner is central to meeting the challenges we face and enabling public innovation.

This hearing, called by the Technology Committee and the Commission on Public Information and Communication, is therefore timely and very important. It is my firm belief that rapid progress on developing an effective data sharing framework is among the most important steps our New York City leaders can take to tackle the myriad of 21st-century challenges....

I am joined today by some of my distinguished NYU colleagues, Prof. Julia Lane and Prof. Julia Stoyanovich, who have worked extensively on the technical and privacy challenges associated with data sharing. To avoid duplicating our testimonies, I won’t focus on issues of privacy, trust, and how to establish a responsible data sharing infrastructure, although these are central considerations for the type of data-driven approaches I will discuss. I am, of course, happy to elaborate on these topics during the question and answer session.

Instead, I want to focus on four core issues associated with data collaboration. I phrase these issues as answers to four questions. For each of these questions, I also provide a set of recommended actions that this Committee could consider undertaking or studying.

The four core questions are:

  • First, why should NYC care about data and data sharing?
  • Second, if you build a data-sharing framework, will they come?
  • Third, how can we best engage the private sector when it comes to sharing and using their data?
  • And fourth, is technology the main (or best) answer?…(More)”.
Leveraging and Sharing Data for Urban Flourishing

Springwise: “UK-based Maynard Design Consultancy has developed a system to help people navigate the changing landscape of city neighbourhoods. A prototype of a wayfinding solution for districts in London combines smart physical markers and navigational apps. The physical markers, inspired by traditional mile markers, include a digital screen. They provide real-time information, including daily news and messages from local businesses. The markers also track how people use the park, providing valuable information to the city and urban planners. The partnering apps provide up-to-date information about the changing environment in the city, such as ongoing construction and delays due to large-scale events.

Unlike traditional, smartphone-based navigational apps, this concept uses technology to help us reconnect with our surroundings, Maynard Design said.

The proposal won the Smart London District Challenge competition set by the Institute for Sustainability. Maynard is currently looking for partner companies to pilot its concept.

Takeaway: The Maynard design represents the latest efforts to use smartphones to amplify public safety announcements, general information and local businesses. The concept moves past traditional wayfinding markers to link people to a smart-city grid. By tracking how people use parks and other urban spaces, the markers will provide valuable insight for city officials. We expect more innovations like this as cities increasingly move toward seamless communication between services and city residents, aided by smart technologies. Over the past several months, we have seen technology to connect drivers to parking spaces and a prototype pavement that can change functions based on people’s needs….(More)”

Digital mile-markers provide navigation in cities

Paper by Daniel Berliner, Alex Ingrams and Suzanne J. Piotrowski: “July 4, 2016 marked the fiftieth anniversary of the 1966 Freedom of Information Act of the United States. Freedom of Information (FOI) has become a vital element of the American political process, become recognized as a core value of democracy, and helped to inspire similar laws and movements around the world. FOI has always faced myriad challenges, including resistance, evasion, and poor implementation and enforcement. Yet the last decade has brought a change of a very different form to the evolution of FOI policy—the emergence of another approach to transparency that is in some ways similar to FOI, and in other ways distinct: open government. The open government agenda, driven by technological developments and motivated by a broader conception of transparency, today rivals, or by some measures, even eclipses FOI in terms of political attention and momentum. What have been the consequences of these trends? How does the advent of new technologies and new agendas shape the transparency landscape?

The political and policy contexts for FOI have fundamentally shifted due to the rise of the open government reform agenda. FOI was at one point the primary tool used to promote governance transparency. FOI is now just one good governance tool in an increasingly crowded field of transparency policy areas. Focus is increasingly shifting toward technology-enabled open data reforms. While many open government reformers see these as positive developments, many traditional FOI proponents have raised concerns. With a few notable exceptions, the academic literature has been silent on this issue. We offer a systematic framework for understanding the potential consequences—both positive and negative—of the open government agenda for FOI policy and implementation….(More)”.

The Future of FOIA in an Open Government World: Implications of the Open Government Agenda for Freedom of Information Policy and Implementation

Paper by Andreas Rasche, Mette Morsing and Erik Wetter in Business and Society: “This article examines the legitimacy attached to different types of multi-stakeholder data partnerships occurring in the context of sustainable development. We develop a framework to assess the democratic legitimacy of two types of data partnerships: open data partnerships (where data and insights are mainly freely available) and closed data partnerships (where data and insights are mainly shared within a network of organizations). Our framework specifies criteria for assessing the legitimacy of relevant partnerships with regard to their input legitimacy as well as their output legitimacy. We demonstrate which particular characteristics of open and closed partnerships can be expected to influence an analysis of their input and output legitimacy….(More)”.

Assessing the Legitimacy of “Open” and “Closed” Data Partnerships for Sustainable Development

Report by James G. McGann: “The Think Tanks and Civil Societies Program (TTCSP) of the Lauder Institute at the University of Pennsylvania conducts research on the role policy institutes play in governments and civil societies around the world. Often referred to as the “think tanks’ think tank,” TTCSP examines the evolving role and character of public policy research organizations. Over the last 27 years, the TTCSP has developed and led a series of global initiatives that have helped bridge the gap between knowledge and policy in critical policy areas such as international peace and security, globalization and governance, international economics, environmental issues, information and society, poverty alleviation, and healthcare and global health. These international collaborative efforts are designed to establish regional and international networks of policy institutes and communities that improve policy making while strengthening democratic institutions and civil societies around the world.

The TTCSP works with leading scholars and practitioners from think tanks and universities in a variety of collaborative efforts and programs, and produces the annual Global Go To Think Tank Index that ranks the world’s leading think tanks in a variety of categories. This is achieved with the help of a panel of over 1,796 peer institutions and experts from the print and electronic media, academia, public and private donor institutions, and governments around the world. We have strong relationships with leading think tanks around the world, and our annual Think Tank Index is used by academics, journalists, donors and the public to locate and connect with the leading centers of public policy research around the world. Our goal is to increase the profile and performance of think tanks and raise the public awareness of the important role think tanks play in governments and civil societies around the globe.”…(More)”.

2018 Global Go To Think Tank Index Report

Paper by Huimin Xia et al. in Nature Medicine: “Artificial intelligence (AI)-based methods have emerged as powerful tools to transform medical care. Although machine learning classifiers (MLCs) have already demonstrated strong performance in image-based diagnoses, analysis of diverse and massive electronic health record (EHR) data remains challenging. Here, we show that MLCs can query EHRs in a manner similar to the hypothetico-deductive reasoning used by physicians and unearth associations that previous statistical methods have not found. Our model applies an automated natural language processing system using deep learning techniques to extract clinically relevant information from EHRs. In total, 101.6 million data points from 1,362,559 pediatric patient visits presenting to a major referral center were analyzed to train and validate the framework.

Our model demonstrates high diagnostic accuracy across multiple organ systems and is comparable to experienced pediatricians in diagnosing common childhood diseases. Our study provides a proof of concept for implementing an AI-based system as a means to aid physicians in tackling large amounts of data, augment diagnostic evaluations, and provide clinical decision support in cases of diagnostic uncertainty or complexity. Although this impact may be most evident in areas where healthcare providers are in relative shortage, the benefits of such an AI system are likely to be universal….(More)”.

Evaluation and accurate diagnoses of pediatric diseases using artificial intelligence

Jon Askonas at The New Atlantis: “The rumors spread like wildfire: Muslims were secretly lacing a Sri Lankan village’s food with sterilization drugs. Soon, a video circulated that appeared to show a Muslim shopkeeper admitting to drugging his customers — he had misunderstood the question that was angrily put to him. Then all hell broke loose. Over a several-day span, dozens of mosques and Muslim-owned shops and homes were burned down across multiple towns. In one home, a young journalist was trapped, and perished.

Mob violence is an old phenomenon, but the tools encouraging it, in this case, were not. As the New York Times reported in April, the rumors were spread via Facebook, whose newsfeed algorithm prioritized high-engagement content, especially videos. “Designed to maximize user time on site,” as the Times article describes, the newsfeed algorithm “promotes whatever wins the most attention. Posts that tap into negative, primal emotions like anger or fear, studies have found, produce the highest engagement, and so proliferate.” On Facebook in Sri Lanka, posts with incendiary rumors had among the highest engagement rates, and so were among the most highly promoted content on the platform. Similar cases of mob violence have taken place in India, Myanmar, Mexico, and elsewhere, with misinformation spread mainly through Facebook and the messaging tool WhatsApp.

This is in spite of Facebook’s decision in January 2018 to tweak its algorithm, apparently to prevent the kind of manipulation we saw in the 2016 U.S. election, when posts and election ads originating from Russia reportedly showed up in newsfeeds of up to 126 million American Facebook users. The company explained that the changes to its algorithm will mean that newsfeeds will be “showing more posts from friends and family and updates that spark conversation,” and “less public content, including videos and other posts from publishers or businesses.” But these changes, which Facebook had tested out in countries like Sri Lanka in the previous year, may actually have exacerbated the problem — which is that incendiary content, when posted by friends and family, is guaranteed to “spark conversation” and therefore to be prioritized in newsfeeds. This is because “misinformation is almost always more interesting than the truth,” as Mathew Ingram provocatively put it in the Columbia Journalism Review.

How did we get here, from Facebook’s mission to “give people the power to build community and bring the world closer together”? Riot-inducing “fake news” and election meddling are obviously far from what its founders intended for the platform. Likewise, Google’s founders surely did not build their search engine with the intention of its being censored in China to suppress free speech, and yet, after years of refusing this demand from Chinese leadership, Google has recently relented rather than pull their search engine from China entirely. And YouTube’s creators surely did not intend their feature that promotes “trending” content to help clickbait conspiracy-theory videos go viral.

These outcomes — not merely unanticipated by the companies’ founders but outright opposed to their intentions — are not limited to social media. So far, Big Tech companies have presented issues of incitement, algorithmic radicalization, and “fake news” as merely bumps on the road of progress, glitches and bugs to be patched over. In fact, the problem goes deeper, to fundamental questions of human nature. Tools based on the premise that access to information will only enlighten us and social connectivity will only make us more humane have instead fanned conspiracy theories, information bubbles, and social fracture. A tech movement spurred by visions of libertarian empowerment and progressive uplift has instead fueled a global resurgence of populism and authoritarianism.

Despite the storm of criticism, Silicon Valley has still failed to recognize in these abuses a sharp rebuke of its sunny view of human nature. It remains naïvely blind to how its own aspirations for social engineering are on a spectrum with the tools’ “unintended” uses by authoritarian regimes and nefarious actors….(More)”.

How Tech Utopia Fostered Tyranny

Blog post by Bridget Konadu Gyamfi and Bethany Park: “Researchers are often invested in disseminating the results of their research to the practitioners and policymakers who helped enable it—but disseminating a paper, developing a brief, or even holding an event may not truly empower decision-makers to make changes based on the research.  

Disseminate results in stages and determine next steps

Mapping evidence to real-world decisions and processes in order to determine the right course of action can be complex. Together with our partners, we gather the troops—researchers, implementers, and IPA’s research and policy team—and have a discussion around what the implications of the research are for policy and practice.

This staged dissemination is critically important: having private discussions first helps partners digest the results and think through their reactions in a lower-stakes setting. We help the partners think about not only the results, but how their stakeholders will respond to the results, and how we can support their ongoing learning, whether results are “good” or not as hoped. Later, we hold larger dissemination events to inform the public. But we try to work closely with researchers and implementers to think through next steps right after results are available—before the window of opportunity passes.

Identify & prioritize policy opportunities

Many of our partners have already written smart advice about how to identify policy opportunities (windows, openings… etc.), so there’s no need for us to restate all that great thinking (go read it!). However, we get asked frequently how we prioritize policy opportunities, and we do have a clear internal process for making that decision. Here are our criteria:

  1. A body of evidence to build on: One single study doesn’t often present the best policy opportunities. This is a generalization, of course, and there are exceptions, but typically our policy teams pay the most attention to bodies of evidence that are coming to a consensus. These are the opportunities for which we feel most able to recommend next steps related to policy and practice—there is a clearer message to communicate and research conclusions we can state with greater confidence.
  2. Relationships to open doors: Our long-term in-country presence and deep involvement with partners through research projects means that we have many relationships and doors open to us. Yet some of these relationships are stronger than others, and some partners are more influential in the processes we want to impact. We use stakeholder mapping tools to clarify who is invested and who has influence. We also track our stakeholder outreach to make sure our relationships stay strong and mutually beneficial.
  3. A concrete decision or process that we can influence: This is the typical understanding of a “policy opening,” and it’s an important one. What are the partner’s priorities, felt needs, and open questions? Where do those create opportunities for our influence? If the evidence would indicate one course of action, but that course isn’t even an option our partner would consider or be able to consider (for cost or other practical reasons), we have to give the opportunity a pass.
  4. Implementation funding: In the countries where we work, even when we have strong relationships, strong evidence, and the partner is open to influence, there is still one crucial ingredient missing: implementation funding. Addressing this constraint means getting evidence-based programming onto the agenda of major donors.
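The stakeholder mapping mentioned in point 2 is often sketched as a classic interest/influence grid. The snippet below is a minimal, hypothetical illustration of that idea — the stakeholder names, scores, and threshold are invented for the example, not drawn from IPA’s actual tools.

```python
# A minimal interest/influence stakeholder grid (hypothetical data).
# Each stakeholder gets two scores in [0, 1]: how invested they are
# (interest) and how much sway they hold over the decision (influence).

def quadrant(interest, influence, threshold=0.5):
    """Place a stakeholder into one of the four classic grid quadrants."""
    if influence >= threshold:
        return "manage closely" if interest >= threshold else "keep satisfied"
    return "keep informed" if interest >= threshold else "monitor"

# Illustrative stakeholders: (interest, influence)
stakeholders = {
    "ministry of health": (0.9, 0.8),
    "local NGO": (0.8, 0.3),
    "donor agency": (0.4, 0.9),
}

for name, (interest, influence) in stakeholders.items():
    print(f"{name}: {quadrant(interest, influence)}")
```

Mapping each partner into a quadrant makes the prioritization criteria above concrete: the “relationships to open doors” tend to cluster in the high-influence half of the grid.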

Get partners on board

Forming a coalition of partners and funders who will work with us as we move forward is crucial. As a research and policy organization, we can’t scale effective solutions alone—nor is that the specialty that we want to develop, since there are others to fill that role. We need partners like Evidence Action Beta to help us pressure test solutions as they move towards scale, or partners like Living Goods who already have nationwide networks of community health workers who can reach communities efficiently and effectively. And we need governments who are willing to make public investments and decisions based on evidence….(More)”.

How to keep good research from dying a bad death: Strategies for co-creating research with impact

Paper by Q. Dos Santos et al: “To test the impact of a nudge strategy (dish of the day strategy) and the factors associated with vegetable dish choice, upon food selection by European adolescents in a real foodservice setting.

A cross-sectional quasi-experimental study was implemented in restaurants in four European countries: Denmark, France, Italy and United Kingdom. In total, 360 individuals aged 12-19 years were allocated into control or intervention groups, and asked to select from meat-based, fish-based, or vegetable-based meals. All three dishes were identically presented in appearance (balls with similar size and weight) and with the same sauce (tomato sauce) and side dishes (pasta and salad). In the intervention condition, the vegetable-based option was presented as the “dish of the day” and numbers of dishes chosen by each group were compared using the Pearson chi-square test. Multivariate logistic regression analysis was run to assess associations between choice of vegetable-based dish and its potential associated factors (adherence to Mediterranean diet, food neophobia, attitudes towards nudging for vegetables, food choice questionnaire, human values scale, social norms and self-estimated health, country, gender and belonging to control or intervention groups). All analyses were run in SPSS 22.0.
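The group comparison described above rests on a Pearson chi-square test of independence on a 2×3 contingency table (group × dish chosen). As a rough sketch of the computation — with invented counts, since the study’s data are not reproduced here — the statistic can be computed in plain Python:

```python
# Hedged sketch: Pearson chi-square statistic for a contingency table.
# The counts below are illustrative only, not the study's actual data.

def chi_square(table):
    """Pearson chi-square statistic for a table given as a list of rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: control vs. "dish of the day" group; columns: meat, fish, vegetable.
counts = [
    [40, 30, 20],  # control (hypothetical)
    [38, 28, 24],  # intervention (hypothetical)
]
print(round(chi_square(counts), 3))
```

A small statistic (relative to the chi-square distribution with 2 degrees of freedom here) yields a large p-value, which is the pattern the study reports: no detectable group difference in dish choice.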

The nudging strategy (dish of the day) did not show a difference in the choice of the vegetable-based option among adolescents tested (p = 0.80 for both Denmark and France; p = 0.69 for Italy and p = 0.53 for the UK). However, the natural dimension of the food choice questionnaire, social norms and attitudes towards vegetable nudging were all positively associated with the choice of the vegetable-based dish. Being male was negatively associated with choosing the vegetable-based dish.

The “dish of the day” strategy did not work under the study conditions. Choice of the vegetable-based dish was predicted by the natural dimension, social norms, gender and attitudes towards vegetable nudging. An understanding of factors related to choosing vegetable-based dishes is necessary for the development and implementation of public policy interventions aiming to increase the consumption of vegetables among adolescents….(More)”

Impact of a nudging intervention and factors associated with vegetable dish choice among European adolescents

Paper by André Eberhardt and Milene Selbach Silveira: “During the last years many government organizations have adopted Open Government Data policies to make their data publicly available. Although governments are having success on publishing their data, the availability of the datasets is not enough for people to make use of them, due to lack of technical expertise such as programming skills and knowledge on data management. In this scenario, Visualization Techniques can be applied to Open Government Data in order to help to solve this problem.

In this sense, we analyzed previously published papers related to Open Government Data Visualization in order to provide an overview of how visualization techniques are being applied to Open Government Data and what the most common challenges are when dealing with it. A systematic mapping study was conducted to survey the papers that were published in this area. The study found 775 papers and, after applying all inclusion and exclusion criteria, 32 papers were selected. Among other results, we found that datasets related to transportation are the main ones being used and maps are the most used visualization technique. Finally, we report that data quality is the main challenge being reported by studies that applied visualization techniques to Open Government Data…(More)”.

Show me the Data! A Systematic Mapping on Open Government Data Visualization
