
Stefaan Verhulst

Introduction by Julia Lane and Andrew Reamer of a Special Issue of the Annals of the American Academy of Political and Social Science: “Throughout the United States, there is broad interest in expanding the nation’s capacity to design and implement public policy based on solid evidence. That interest has been stimulated by the new types of data that are available that can transform the way in which policy is designed and implemented. Yet progress in making use of sensitive data has been hindered by the legal, technical, and operational obstacles to access for research and evaluation. Progress has also been hindered by an almost exclusive focus on the interest and needs of the data users, rather than the interest and needs of the data providers. In addition, data stewardship is largely artisanal in nature.

There are very real consequences that result from lack of action. State and local governments are often hampered in their capacity to effectively mount and learn from innovative efforts. Although jurisdictions often have treasure troves of data from existing programs, the data are stove-piped, underused, and poorly maintained. The experience reported by one large city public health commissioner is too common: “We commissioners meet periodically to discuss specific childhood deaths in the city. In most cases, we each have a thick file on the child or family. But the only time we compare notes is after the child is dead.”1 In reality, most localities lack the technical, analytical, staffing, and legal capacity to make effective use of existing and emerging resources.

It is our sense that fundamental changes are necessary and a new approach must be taken to building data infrastructures. In particular,

  1. Privacy and confidentiality issues must be addressed at the beginning—not added as an afterthought.
  2. Data providers must be involved as key stakeholders throughout the design process.
  3. Workforce capacity must be developed at all levels.
  4. The scholarly community must be engaged to identify the value to research and policy….

To develop a roadmap for the creation of such an infrastructure, the Bill and Melinda Gates Foundation, together with the Laura and John Arnold Foundation, hosted a day-long workshop of more than sixty experts to discuss the findings of twelve commissioned papers and their implications for action. This volume of The ANNALS showcases those twelve articles. The workshop papers were grouped into three thematic areas: privacy and confidentiality, the views of data producers, and comprehensive strategies that have been used to build data infrastructures in other contexts. The authors and the attendees included computer scientists, social scientists, practitioners, and data producers.

This introductory article places the research in both an historical and a current context. It also provides a framework for understanding the contribution of the twelve articles….(More)”.

A Roadmap to a Nationwide Data Infrastructure for Evidence-Based Policymaking

Book by Jerry Z. Muller on “How the obsession with quantifying human performance threatens our schools, medical care, businesses, and government…

Today, organizations of all kinds are ruled by the belief that the path to success is quantifying human performance, publicizing the results, and dividing up the rewards based on the numbers. But in our zeal to instill the evaluation process with scientific rigor, we’ve gone from measuring performance to fixating on measuring itself. The result is a tyranny of metrics that threatens the quality of our lives and most important institutions. In this timely and powerful book, Jerry Muller uncovers the damage our obsession with metrics is causing–and shows how we can begin to fix the problem.

Filled with examples from education, medicine, business and finance, government, the police and military, and philanthropy and foreign aid, this brief and accessible book explains why the seemingly irresistible pressure to quantify performance distorts and distracts, whether by encouraging “gaming the stats” or “teaching to the test.” That’s because what can and does get measured is not always worth measuring, may not be what we really want to know, and may draw effort away from the things we care about. Along the way, we learn why paying for measured performance doesn’t work, why surgical scorecards may increase deaths, and much more. But metrics can be good when used as a complement to—rather than a replacement for—judgment based on personal experience, and Muller also gives examples of when metrics have been beneficial…(More)”.

The Tyranny of Metrics

Essay by Charles Ess at Javnost: “In the late 1980s and early 1990s, the emerging internet and World Wide Web inspired both popular and scholarly optimism that these new communication technologies would inevitably “democratise”—in local organisations, larger civic and political institutions, and, indeed, the world itself. The especially Habermas- and feminist-inspired notions of deliberative democracy in an electronic public sphere at work here are subsequently challenged, however, by both theoretical and empirical developments such as the Arab Winter and platform imperialism. Nonetheless, a range of other developments—from Edward Snowden to the emergence of virtue ethics and slow tech as increasingly central to the design of ICTs—argue that resistance in the name of democracy and emancipation is not futile….(More)”.

Democracy and the Internet: A Retrospective

Paper by Oskar Josef Gstrein and Gerard Jan Ritsema van Eck: “Various smartphone apps and services are available which encourage users to report where and when they feel they are in an unsafe or threatening environment. This user generated content may be used to build datasets, which can show areas that are considered ‘bad,’ and to map out ‘safe’ routes through such neighbourhoods.

Despite certain advantages, this data inherently carries the danger that streets or neighbourhoods become stigmatized and already existing prejudices might be reinforced. Such stigmas might also result in negative consequences for property values and businesses, causing irreversible damage to certain parts of a municipality. Overcoming such an “evidence-based stigma” — even if based on biased, unreviewed, outdated, or inaccurate data — becomes nearly impossible and raises the question of how such data should be managed….(More)”.

Mobile Devices as Stigmatizing Security Sensors: The GDPR and a Future of Crowdsourced ‘Broken Windows’

Ramy Ghorayeb at Medium: “…Why can’t governments move as fast as private industry? We have seen big corporations and startups adapting and evolving with the needs of their customers. Why can’t governments adapt to the needs of their populations? Why are they so slow?

Truth is, innovating in the public sector cannot and should not be like in the private one. Startups and corporations are about maximizing their internalities while public administrations are also about minimizing the externalities.

Straightforward example: Let’s imagine the US authorizes online voting, in addition to physical voting, for the next presidential election. Obviously, it is a good way to incentivize people to vote: you would be able to vote from anywhere at any time, and, more importantly, the cost of making one hundred people vote would be the same as for one thousand or one million.

But on the other side, you are favoring the population with easy access to the Internet, meaning the middle and upper classes. What’s more, you are also favoring younger generations over older ones.
These populations have known differences in political opinion. Ultimately, you are deliberately modifying the distribution of voting power in the country. That is not necessarily a bad or a good thing (keeping only physical voting also favors a specific demographic segment), but there are many issues that need to be worked through thoroughly before making any change to the democratic balance. I’d like to call this the participatory bias.

This participatory bias is the reason why the public side will always have a latency to adopt technology.

On the private side, when a business wants to work on a new product, it will focus only on its customers. The goal of a startup is even to find a specific segment of the population with its own needs and problems, a niche, in order to test innovative solutions, improve the experience, and optimize the acquisition and retention of this population. In other words, it will maximize internalities.
But the public side needs to look at the externalities that its new products can create. It cannot isolate one population; it has to consider the negative effects on everyone else. Moreover, like a big corporation, it cannot experiment and fail the way a startup does, because it has to preserve its reputation and legacy of trust.

But the situation isn’t locked. Thanks to the civic tech ecosystem, governments have found a way to externalize innovation and to learn from experiments, failures, and successes without running them themselves. Startups and labs are taking on the difficult role of inventor, showing good ways to use tech for citizens, iteration by iteration. More interestingly, they are also showing that they are not threatened by public-sector replication of their work. In fact, the two sides are finding their complementarity….(More)”

Governments are not startups

 at The Conversation: “We all take weather forecasts for granted, so why isn’t there a ‘nature forecast’ to answer these questions? Enter the new scientific field of ecological forecasting. Ecologists have long sought to understand the natural world, but only recently have they begun to think systematically about forecasting.

Much of the current research in ecological forecasting is focused on long-term projections. It considers questions that play out over decades to centuries, such as how species may shift their ranges in response to climate change, or whether forests will continue to take up carbon dioxide from the atmosphere.

However, in a new article that I co-authored with 18 other scientists from universities, private research institutes and the U.S. Geological Survey, we argue that focusing on near-term forecasts over spans of days, seasons and years will help us better understand, manage and conserve ecosystems. Developing this ability would be a win-win for both science and society….

Big data is driving many of the advances in ecological forecasting. Today ecologists have orders of magnitude more data compared to just a decade ago, thanks to sustained public funding for basic science and environmental monitoring. This investment has given us better sensors, satellites and organizations such as the National Ecological Observatory Network, which collects high-quality data from 81 field sites across the United States and Puerto Rico. At the same time, cultural shifts across funding agencies, research networks and journals have made that data more open and available.

Digital technologies make it possible to access this information more quickly than in the past. Field notebooks have given way to tablets and cell networks that can stream new data into supercomputers in real time. Computing advances allow us to build better models and use more sophisticated statistical methods to produce forecasts….(More)”.

Can scientists learn to make ‘nature forecasts’ just as we forecast the weather?

Springwise: “Beam is a new crowdfunding website that has been launched to try and help homeless people based in London get back into work. The idea behind the website is that people donate money online to homeless individuals, which will then be used to give them the qualifications and training that they will need to become fully employed and therefore able to also get themselves secure in their own accommodation.

A new member of Beam is allocated a Member Manager, who works with them to find the best employment avenue for them to take based on their likes and experience. The Member Manager helps them prepare their fundraising campaign, and even helps members with childcare costs while they’re training.

Beam was founded by Alex Stephany, who is also a board advisor of the car parking app JustPark. It launched with a pilot scheme in September 2017 and has the full support of Sadiq Khan, the Mayor of London, and the innovation foundation Nesta.

There are many ways that technology is helping the homeless. Recently, blockchain tech has been used to connect the homeless in New York with valuable services. And new keycard-accessed vending machines are to be installed in safe spaces to provide the homeless with 24-hour access to essential items….(More)”.

Crowdfunding site aims to get homeless back into work

 et al in PeerJ Computer Science: “We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. We capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs.

We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals….(More)”.
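The key move in the pattern described above is separating the control plane (authentication and authorization, handled by the portal’s web tier) from the data plane (bulk transfer, handled by a dedicated service such as a Globus endpoint). A toy sketch of that separation, not the actual Globus API, might look like the following; all names and the signing scheme here are hypothetical illustrations:

```python
# Sketch of the "disaggregated portal" idea: the portal signs a short-lived
# transfer grant (control plane); a separate transfer service verifies it and
# moves the data (data plane), so large files never pass through the web server.
import hashlib
import hmac
import json
import time

SECRET = b"portal-signing-key"  # hypothetical; a real portal would use its auth service

def issue_transfer_grant(dataset_id: str, user: str, endpoint: str, ttl_s: int = 3600) -> dict:
    """Control plane: return a signed descriptor instead of streaming bytes."""
    grant = {
        "dataset": dataset_id,
        "user": user,
        "endpoint": endpoint,
        "expires": int(time.time()) + ttl_s,
    }
    payload = json.dumps(grant, sort_keys=True).encode()
    grant["signature"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return grant

def verify_transfer_grant(grant: dict) -> bool:
    """Data plane: check signature and expiry without calling back to the portal."""
    unsigned = {k: v for k, v in grant.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(grant.get("signature", ""), expected) and grant["expires"] > time.time()
```

Because the grant is self-verifying, the transfer service can be deployed anywhere (a supercomputer site, a data enclave, cloud storage) without sharing state with the portal, which is what allows the order-of-magnitude performance gains the authors describe.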

The Modern Research Data Portal: a design pattern for networked, data-intensive science

Antoine Dubois, Emilio Zagheni, Kiran Garimella, and Ingmar Weber at arXiv: “Migrants’ assimilation is a major challenge for European societies, in part because of the sudden surge of refugees in recent years and in part because of long-term demographic trends. In this paper, we use Facebook’s data for advertisers to study the levels of assimilation of Arabic-speaking migrants in Germany, as seen through the interests they express online. Our results indicate a gradient of assimilation along demographic lines, language spoken and country of origin. Given the difficulty of collecting timely migration data, in particular for traits related to cultural assimilation, the methods that we develop and the results that we provide open new lines of research that computational social scientists are well-positioned to address….(More)”.

Studying Migrant Assimilation Through Facebook Interests

Beth Noveck at the Yale Human Rights and Development Journal: “Open data policy mandates that government proactively publish its data online for the public to reuse. It is a radically different approach to transparency than traditional right-to-know strategies as embodied in Freedom of Information Act (FOIA) legislation in that it involves ex ante rather than ex post disclosure of whole datasets. Although both open data and FOIA deal with information sharing, the normative essence of open data is participation rather than litigation. By fostering public engagement, open data shifts the relationship between state and citizen from a monitorial to a collaborative one, centered around using information to solve problems together. This Essay explores the theory and practice of open data in comparison to FOIA and highlights its uses as a tool for advancing human rights, saving lives, and strengthening democracy. Although open data undoubtedly builds upon the fifty-year legal tradition of the right to know about the workings of one’s government, open data does more than advance government accountability. Rather, it is a distinctly twenty-first century governing practice borne out of the potential of big data to help solve society’s biggest problems. Thus, this Essay charts a thoughtful path toward a twenty-first century transparency regime that takes advantage of and blends the strengths of open data’s collaborative and innovation-centric approach and the adversarial and monitorial tactics of freedom of information regimes….(More)”.

Rights-Based and Tech-Driven: Open Data, Freedom of Information, and the Future of Government Transparency
