Stefaan Verhulst
Chapter by Michiel de Lange in The Right to the Smart City: “The current datafication of cities raises questions about what Lefebvre and many after him have called “the right to the city.” In this contribution, I investigate how the use of data for civic purposes may strengthen the “right to the datafied city,” that is, the degree to which different people engage and participate in shaping urban life and culture, and experience a sense of ownership. The notion of the commons acts as the prism to see how data may serve to foster this participatory “smart citizenship” around collective issues. This contribution critically engages with recent attempts to theorize the city as a commons. Instead of seeing the city as a whole as a commons, it proposes a more fine-grained perspective of the “commons-as-interface.” The “commons-as-interface,” it is argued, productively connects urban data to the human-level political agency implied by “the right to the city” through processes of translation and collectivization. The term is applied to three short case studies, to analyze how these processes engender a “right to the datafied city.” The contribution ends by considering the connections between two seemingly opposed discourses about the role of data in the smart city – the cybernetic view versus a humanist view. It is suggested that the commons-as-interface allows for more detailed investigations of mediation processes between data, human actors, and urban issues….(More)”.
Kiona N. Smith at Forbes: “What could the 107-year-old tragedy of the Titanic possibly have to do with modern problems like sustainable agriculture, human trafficking, or health insurance premiums? Data turns out to be the common thread. The modern world, for better or worse, increasingly turns to algorithms to look for patterns in the data and make predictions based on those patterns. And the basic methods are the same whether the question they’re trying to answer is “Would this person survive the Titanic sinking?” or “What are the most likely routes for human trafficking?”
An Enduring Problem
Predicting survival at sea based on the Titanic dataset is a standard practice problem for aspiring data scientists and programmers. Here’s the basic challenge: feed your algorithm a portion of the Titanic passenger list, which includes some basic variables describing each passenger and their fate. From that data, the algorithm (if you’ve programmed it well) should be able to draw some conclusions about which variables made a person more likely to live or die on that cold April night in 1912. To test its success, you then give the algorithm the rest of the passenger list (minus the outcomes) and see how well it predicts their fates.
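The setup described above can be sketched in a few lines of Python. This is a minimal illustration only: the passenger rows below are hypothetical (not actual Titanic records), and the "model" is a simple majority-vote rule over sex and ticket class, whereas real attempts on the Kaggle dataset typically use richer features and a library such as scikit-learn.

```python
from collections import Counter, defaultdict

# Hypothetical training rows: (sex, passenger_class, survived)
train = [
    ("female", 1, 1), ("female", 2, 1), ("female", 3, 0),
    ("male", 1, 0), ("male", 2, 0), ("male", 3, 0),
    ("female", 1, 1), ("male", 3, 0), ("female", 2, 1),
]

# "Train": tally outcomes for each (sex, class) group.
groups = defaultdict(Counter)
for sex, pclass, survived in train:
    groups[(sex, pclass)][survived] += 1

def predict(sex, pclass):
    """Predict the majority outcome for this passenger's group."""
    counts = groups.get((sex, pclass))
    if counts is None:
        return 0  # unseen group: fall back to the overall majority (most died)
    return counts.most_common(1)[0][0]

# "Test": held-out rows whose outcomes the model never saw.
test = [("female", 1, 1), ("male", 3, 0), ("female", 3, 0)]
accuracy = sum(predict(s, c) == y for s, c, y in test) / len(test)
print(f"accuracy: {accuracy:.2f}")
```

Swapping the majority-vote rule for a decision tree or logistic regression, and the toy rows for the full passenger list, turns this sketch into the standard exercise the article describes.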
Online communities like Kaggle.com have held competitions to see who can develop the algorithm that predicts survival most accurately, and it’s also a common problem presented to university classes. The passenger list is big enough to be useful, but small enough to be manageable for beginners. There’s a simple set of outcomes — life or death — and around a dozen variables to work with, so the problem is simple enough for beginners to tackle but just complex enough to be interesting. And because the Titanic’s story is so famous, even more than a century later, the problem still resonates.
“It’s interesting to see that even in such a simple problem as the Titanic, there are nuggets,” said Sagie Davidovich, Co-Founder & CEO of SparkBeyond, who used the Titanic problem as an early test for SparkBeyond’s AI platform and still uses it as a way to demonstrate the technology to prospective customers….(More)”.
Chapter by Matt Laessig, Bryon Jacob and Carla AbouZahr in The Palgrave Handbook of Global Health Data Methods for Policy and Practice: “…provide best practices for organizations to adopt to disseminate data openly for others to use. They describe development of the open data movement and its rapid adoption by governments, non-governmental organizations, and research groups. The authors provide examples from the health sector—an early adopter—but acknowledge concerns specific to health relating to informed consent, intellectual property, and ownership of personal data. Drawing on their considerable contributions to the open data movement, Laessig and Jacob share their Open Data Progression Model. They describe six stages to make data open: from data collection, documentation of the data, opening the data, engaging the community of users, making the data interoperable, to finally linking the data….(More)”
Announcement: “Healthcare technologies are rapidly evolving, producing new data sources, data types, and data uses, which precipitate more rapid and complex data sharing. Novel technologies—such as artificial intelligence tools and new internet of things (IoT) devices and services—are providing benefits to patients, doctors, and researchers. Data-driven products and services are deepening patients’ and consumers’ engagement and helping to improve health outcomes. Understanding the evolving health data ecosystem presents new challenges for policymakers and industry. There is an increasing need to better understand and document the stakeholders, the emerging data types and their uses.
The Future of Privacy Forum (FPF) and the Information Accountability Foundation (IAF) partnered to form the FPF-IAF Joint Health Initiative in 2018. Today, the Initiative is releasing A Taxonomy of Definitions for the Health Data Ecosystem; the publication is intended to enable a more nuanced, accurate, and common understanding of the current state of the health data ecosystem. The Taxonomy outlines the established and emerging language of the health data ecosystem. The Taxonomy includes definitions of:
- The stakeholders currently involved in the health data ecosystem and examples of each;
- The common and emerging data types that are being collected, used, and shared across the health data ecosystem;
- The purposes for which data types are used in the health data ecosystem; and
- The types of actions that are now being performed and which we anticipate will be performed on datasets as the ecosystem evolves and expands.
This report is an educational resource that will enable a deeper understanding of the current landscape of stakeholders and data types….(More)”.
Jukka Vahti at Sitra: “The Finnish tradition of establishing, maintaining and developing data registers goes back to the 1600s, when parish records were first kept.
When this old custom is combined with the opportunities afforded by digitisation, the positive approach Finns have towards research and technology, and the recently updated legislation enabling the data economy, Finland and the Finnish people can lead the way as Europe gradually, or even suddenly, switches to a fair data economy.
The foundations for a fair data economy already exist
The fair data economy is a natural continuation of the former projects promoting e-services that were undertaken in Finland.
For example, the Data Exchange Layer is already speeding up the transfer of data from one system to another in Finland and in Estonia, the country where the system originated; the system remains unique to these two countries.
In May 2019 Finland also saw the entry into force of the Act on the Secondary Use of Health and Social Data, according to which the information on social welfare and healthcare held in registers may be used for purposes of statistics, research, education, knowledge management, control and supervision conducted by authorities, and development and innovation activity.
The new law will make the work of researchers and service developers more effective, as the business of acquiring a permit will take place through a one-stop-shop principle and it will be possible to use data from more than one source more readily than before….(More)”.
TEDx Talk by Gianluca Sgueo: “How does gaming link people in today’s society? In business, companies use gamification as a marketing tool to attract customers, while governments and non-governmental organizations deploy it to connect citizens and public powers. Gianluca Sgueo, a professor of public law and a policy analyst, tells us how a gamified government facilitates and engages citizens in the policy-making process, as well as the inconspicuous but important impacts this has on our lives. Gianluca Sgueo is Global Media Professor at New York University in Florence, Visiting Professor at HEC Paris and Research Associate at the Center of Social Studies of the University of Coimbra. His area of expertise is the public sector, to which he provides professional services. His academic work focuses on participatory democracy, lobbying and globalization, and he is the author of a recent work about Games, Powers & Democracies….(More)

Press Release: “The Governance Lab at the NYU Tandon School of Engineering announced the launch of the 100 Questions Initiative — an effort to identify the most important societal questions whose answers can be found in data and data science if the power of data collaboratives is harnessed.
The initiative, launched with initial support from Schmidt Futures, seeks to address challenges on numerous topics, including migration, climate change, poverty, and the future of work.
For each of these areas and more, the initiative will seek to identify questions that could help unlock the potential of data and data science with the broader goal of fostering positive social, environmental, and economic transformation. These questions will be sourced by leveraging “bilinguals” — practitioners across disciplines from all over the world who possess both domain knowledge and data science expertise.
The 100 Questions Initiative starts by identifying 10 key questions related to migration. These include questions related to the geographies of migration, migrant well-being, enforcement and security, and the vulnerabilities of displaced people. This inaugural effort involves partnerships with the International Organization for Migration (IOM) and the European Commission, both of which will provide subject-matter expertise and facilitation support within the framework of the Big Data for Migration Alliance (BD4M).
“While there have been tremendous efforts to gather and analyze data relevant to many of the world’s most pressing challenges, as a society, we have not taken the time to ensure we’re asking the right questions to unlock the true potential of data to help address these challenges,” said Stefaan Verhulst, co-founder and chief research and development officer of The GovLab. “Unlike other efforts focused on data supply or data science expertise, this project seeks to radically improve the set of questions that, if answered, could transform the way we solve 21st century problems.”
In addition to identifying key questions, the 100 Questions Initiative will also focus on creating new data collaboratives. Data collaboratives are an emerging form of public-private partnership that helps unlock the public interest value of previously siloed data. The GovLab has conducted significant research into the value of data collaboration, finding that inter-sectoral collaboration can both increase access to information (e.g., the vast stores of data held by private companies) and unleash the potential of that information to serve the public good….(More)”.
Amy Maxmen in Nature: “After an earthquake tore through Haiti in 2010, killing more than 100,000 people, aid agencies spread across the country to work out where the survivors had fled. But Linus Bengtsson, a graduate student studying global health at the Karolinska Institute in Stockholm, thought he could answer the question from afar. Many Haitians would be using their mobile phones, he reasoned, and those calls would pass through phone towers, which could allow researchers to approximate people’s locations. Bengtsson persuaded Digicel, the biggest phone company in Haiti, to share data from millions of call records from before and after the quake. Digicel replaced the names and phone numbers of callers with random numbers to protect their privacy.
Bengtsson’s idea worked. The analysis wasn’t completed or verified quickly enough to help people in Haiti at the time, but in 2012, he and his collaborators reported that the population of Haiti’s capital, Port-au-Prince, dipped by almost one-quarter soon after the quake, and slowly rose over the next 11 months1. That result aligned with an intensive, on-the-ground survey conducted by the United Nations.
Humanitarians and researchers were thrilled. Telecommunications companies scrutinize call-detail records to learn about customers’ locations and phone habits and improve their services. Researchers suddenly realized that this sort of information might help them to improve lives. Even basic population statistics are murky in low-income countries where expensive household surveys are infrequent, and where many people don’t have smartphones, credit cards and other technologies that leave behind a digital trail, making remote-tracking methods used in richer countries too patchy to be useful.
Since the earthquake, scientists working under the rubric of ‘data for good’ have analysed calls from tens of millions of phone owners in Pakistan, Bangladesh, Kenya and at least two dozen other low- and middle-income nations. Humanitarian groups say that they’ve used the results to deliver aid. And researchers have combined call records with other information to try to predict how infectious diseases travel, and to pinpoint locations of poverty, social isolation, violence and more (see ‘Phone calls for good’)….(More)”.
Report by the Caixa Foundation: “…The Work4Progress programme thus supports the creation of “Open Innovation Platforms for the creation of employment in Peru, India and Mozambique” by means of collaborative partnerships between local civil society organisations, the private sector, public administration, universities and Spanish NGOs.
The main innovation of this programme is the incorporation of new tools and methodologies in: (1) listening and identification of community needs, (2) the co-creation and prototyping of new solutions, (3) the exploration of instruments for scaling, (4) governance, (5) evolving evaluation systems and (6) financing strategies. The goal of all of the above is to try to incorporate innovation strategies comprehensively in all components.
Work4Progress has been designed with a Think-and-Do-Tank mentality. The member organisations of the platforms are experimenting in the field, while a group of international experts helps us to obtain this knowledge and share it with centres of thought and action at international level. In fact, this is the objective of this publication: to share the theoretical framework of the programme, to connect these ideas with concrete examples and to continue to strengthen the meeting point between social innovation and development cooperation.
Work4Progress is offered as a ‘living lab’ to test new methodologies that may be useful for other philanthropic institutions, governments or entities specialising in international development….(More)”.
Paper by Eric Rosenbach and Katherine Mansted: “Information is now the world’s most consequential and contested geopolitical resource. The world’s most profitable businesses have asserted for years that data is the “new oil.” Political campaigns—and foreign intelligence operatives—have shown over the past two American presidential elections that data-driven social media is the key to public opinion. Leading scientists and technologists understand that good datasets, not just algorithms, will give them a competitive edge.
Data-driven innovation is not only disrupting economies and societies; it is reshaping relations between nations. The pursuit of information power—involving states’ ability to use information to influence, decide, create and communicate—is causing states to rewrite their terms of engagement with markets and citizens, and to redefine national interests and strategic priorities. In short, information power is altering the nature and behavior of the fundamental building block of international relations, the state, with potentially seismic consequences.
Authoritarian governments recognize the strategic importance of information and over the past five years have operationalized powerful domestic and international information strategies. They are cauterizing their domestic information environments and shutting off their citizens from global information flows, while weaponizing information to attack and destabilize democracies. In particular, China and Russia believe that strategic competition in the 21st century is characterized by a zero-sum contest for control of data, as well as the technology and talent needed to convert data into useful information.
Democracies remain fundamentally unprepared for strategic competition in the Information Age. For the United States in particular, as the importance of information as a geopolitical resource has waxed, its information dominance has waned. Since the end of the Cold War, America’s supremacy in information technologies seemed unassailable—not least because of its central role in creating the Internet and overall economic primacy. Democracies have also considered any type of information strategy to be largely unneeded: government involvement in the domestic information environment feels Orwellian, while democracies believed that their “inherently benign” foreign policy didn’t need extensive influence operations.
However, to compete and thrive in the 21st century, democracies, and the United States in particular, must develop new national security and economic strategies that address the geopolitics of information. In the 20th century, market capitalist democracies geared infrastructure, energy, trade, and even social policy to protect and advance that era’s key source of power—manufacturing. In this century, democracies must better account for information geopolitics across all dimensions of domestic policy and national strategy….(More)”.