Using mobile money data and call detail records to explore the risks of urban migration in Tanzania


Paper by Rosa Lavelle-Hill: “Understanding what factors predict whether an urban migrant will end up in a deprived neighbourhood or not could help prevent the exploitation of vulnerable individuals. This study leveraged pseudonymized mobile money interactions combined with cell phone data to shed light on urban migration patterns and deprivation in Tanzania. Call detail records were used to identify individuals who migrated to Dar es Salaam, Tanzania’s largest city. A street survey of the city’s subwards was used to determine which individuals moved to more deprived areas. t-tests showed that people who settled in poorer neighbourhoods had less money coming into their mobile money account after they moved, but not before. A machine learning approach was then utilized to predict which migrants will move to poorer areas of the city, making them arguably more vulnerable to poverty, unemployment and exploitation. Features indicating the strength and location of people’s social connections in Dar es Salaam before they moved (‘pull factors’) were found to be most predictive, more so than traditional ‘push factors’ such as proxies for poverty in the migrant’s source region…(More)”.
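To make the analytic sequence concrete, here is a minimal sketch in Python of a before/after group comparison followed by a prediction step with feature importances. Column names, the input file, and the choice of classifier are assumptions for illustration only, not the authors' actual pipeline or data.

```python
# Illustrative only: column names, the input file, and the classifier are assumptions,
# not the authors' actual pipeline.
import pandas as pd
from scipy import stats
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# One row per migrant: mobile money inflows before/after the move, a binary label for
# settling in a deprived subward, and candidate predictors observed before the move.
df = pd.read_csv("migrants.csv")  # hypothetical input

deprived = df[df["deprived_destination"] == 1]
other = df[df["deprived_destination"] == 0]

# Group comparison of inflows after the move (and, as a check, before it)
for col in ["inflow_post_move", "inflow_pre_move"]:
    t, p = stats.ttest_ind(deprived[col], other[col], equal_var=False)
    print(f"{col}: t = {t:.2f}, p = {p:.3f}")

# Predict destination deprivation from pre-move 'pull' and 'push' features
features = [
    "calls_to_dar_contacts", "contacts_in_deprived_subwards",   # pull factors (assumed names)
    "source_region_poverty_proxy", "inflow_pre_move",           # push factors (assumed names)
]
X, y = df[features], df["deprived_destination"]
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))

# Feature importances give a rough sense of whether pull or push factors carry more signal
clf.fit(X, y)
for name, imp in sorted(zip(features, clf.feature_importances_), key=lambda kv: -kv[1]):
    print(f"{name}: {imp:.3f}")
```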

A Consumer Price Index for the 21st Century


Press Release by the National Academies of Sciences, Engineering, and Medicine: “The Bureau of Labor Statistics (BLS) should undertake a new strategy to modernize the Consumer Price Index by accelerating its use of new data sources and developing price indexes based on different income levels, says a new report from the National Academies of Sciences, Engineering, and Medicine.

The Consumer Price Index is the most widely used measure of inflation in the U.S. It is used to determine cost-of-living allowances and, importantly, influences monetary policy, among many other private- and public-sector applications. The new report, Modernizing the Consumer Price Index for the 21st Century, says the index has traditionally relied on field-generated data, such as prices observed in person at grocery stores or major retailers. These data have become more challenging and expensive to collect, and the availability of vast digital sources of consumer price data presents an opportunity. BLS has begun tapping into these data and has said its objective is to switch a significant portion of its measurement to nontraditional and digital data sources by 2024.

“The enormous economic disruption of the COVID-19 pandemic presents a perfect case study for the need to rapidly employ new data sources for the Consumer Price Index,” said Daniel E. Sichel, professor of economics at Wellesley College, and chair of the committee that wrote the report. “Modernizing the Consumer Price Index can help our measurement of household costs and inflation be more accurate, timelier, and ultimately more useful for policymakers responding to rapidly changing economic conditions.”…
The report says BLS should embark on a strategy of accelerating and enhancing the use of scanner, web-scraped, and digital data directly from retailers in compiling the Consumer Price Index. Scanner data — recorded at the point of sale or by consumers in their homes — can expand the variety of products represented in the Consumer Price Index, and better detect shifts in buying patterns. Web-scraped data can more nimbly track the prices of online goods, and goods where one company dominates the market. Permanently automating web-scraping of price data should be a high priority for the Consumer Price Index program, especially for food, electronics, and apparel, the report says.
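As a simple illustration of how scanner-style data can feed an index, the sketch below computes a fixed-basket (Laspeyres-type) price relative on toy data. Products, prices, quantities, and the weighting scheme are invented for illustration and do not reflect BLS methodology.

```python
# Toy example of a fixed-basket (Laspeyres-type) price index from scanner-style data.
# Products, prices, and quantities are invented; this is not BLS methodology.
import pandas as pd

scanner = pd.DataFrame({
    "product_id": ["milk", "bread", "eggs"] * 2,
    "period":     ["2023-01"] * 3 + ["2023-02"] * 3,
    "price":      [3.50, 2.20, 4.10, 3.65, 2.25, 4.00],
    "quantity":   [120, 300, 80, 115, 310, 85],
})

base = scanner[scanner["period"] == "2023-01"].set_index("product_id")
curr = scanner[scanner["period"] == "2023-02"].set_index("product_id")

# Laspeyres index: current-period prices weighted by base-period quantities
index = (curr["price"] * base["quantity"]).sum() / (base["price"] * base["quantity"]).sum()
print(f"Price index (base period = 100): {100 * index:.1f}")
```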

Embracing these alternative data sources now will ensure that the accuracy and timeliness of the Consumer Price Index will not be compromised in the future, the report adds. Moreover, accelerating this process will give BLS time to carefully assess new data sources and methodologies before taking the decision to incorporate them in the official index….(More)”

European Health Union: A European Health Data Space for people and science


Press Release: “Today, the European Commission launched the European Health Data Space (EHDS), one of the central building blocks of a strong European Health Union. The EHDS will help the EU to achieve a quantum leap forward in the way healthcare is provided to people across Europe. It will empower people to control and utilise their health data in their home country or in other Member States. It fosters a genuine single market for digital health services and products. And it offers a consistent, trustworthy and efficient framework to use health data for research, innovation, policy-making and regulatory activities, while ensuring full compliance with the EU’s high data protection standards…

Putting people in control of their own health data, in their country and cross-border

  • Thanks to the EHDS, people will have immediate and easy access to the data in electronic form, free of charge. They can easily share these data with other health professionals in and across Member States to improve health care delivery. Citizens will be in full control of their data and will be able to add information, rectify wrong data, restrict access to others and obtain information on how their data are used and for which purpose.
  • Member States will ensure that patient summaries, ePrescriptions, images and image reports, laboratory results, and discharge reports are issued and accepted in a common European format.
  • Interoperability and security will become mandatory requirements. Manufacturers of electronic health record systems will need to certify compliance with these standards.
  • To ensure that citizens’ rights are safeguarded, all Member States have to appoint digital health authorities. These authorities will participate in the cross-border digital infrastructure (MyHealth@EU) that will support patients to share their data across borders.

Improving the use of health data for research, innovation and policymaking

  • The EHDS creates a strong legal framework for the use of health data for research, innovation, public health, policy-making and regulatory purposes. Under strict conditions, researchers, innovators, public institutions or industry will have access to large amounts of high-quality health data, crucial to developing life-saving treatments, vaccines or medical devices and to ensuring better access to healthcare and more resilient health systems.
  • The access to such data by researchers, companies or institutions will require a permit from a health data access body, to be set up in all Member States. Access will only be granted if the requested data is used for specific purposes, in closed, secure environments and without revealing the identity of the individual. It is also strictly prohibited to use the data for decisions that are detrimental to citizens, such as designing harmful products or services or increasing an insurance premium.
  • The health data access bodies will be connected to the new decentralised EU-infrastructure for secondary use (HealthData@EU) which will be set up to support cross-border projects…(More)”

Data scientists are using the most annoying feature on your phones to save lives in Ukraine


Article by Bernhard Warner: “In late March, five weeks into Russia’s war on Ukraine, an international team of researchers, aid agency specialists, public health experts, and data nerds gathered on a Zoom call to discuss one of the tragic by-products of the war: the refugee crisis.

The numbers discussed were grim. The United Nations had just declared Ukraine was facing the biggest humanitarian crisis to hit Europe since World War II as more than 4 million Ukrainians—roughly 10% of the population—had been forced to flee their homes to evade Russian President Vladimir Putin’s deadly and indiscriminate bombing campaign. That total has since swelled to 5.5 million, the UN estimates.

What the aid specialists on the call wanted to figure out was how many Ukrainian refugees still remained in the country (a population known as “internally displaced people”) and how many had crossed borders to seek asylum in the neighboring European Union countries of Poland, Slovakia, and Hungary, or south into Moldova. 

Key to an effective humanitarian response of this magnitude is getting accurate and timely data on the flow of displaced people traveling from a Point A danger zone to a Point B safe space. And nobody on the call, which was organized by CrisisReady, an A-team of policy experts and humanitarian emergency responders, had anything close to precise numbers.

But they did have a kind of secret weapon: mobility data.

“The importance of mobility data is often overstated,” Rohini Sampoornam Swaminathan, a crisis specialist at Unicef, told her colleagues on the call. Such anonymized data—pulled from social media feeds, geolocation apps like Google Maps, cell phone towers and the like—may not give the precise picture of what’s happening on the ground in a moment of extreme crisis, “but it’s valuable” as it can fill in points on a map. “It’s important,” she added, “to get a picture for where people are moving, especially in the first days.”

Ukraine, a nation of relatively tech-savvy social media devotees and mobile phone users, is rich in mobility data, and that’s profoundly shaped the way the world sees and interprets the deadly conflict. The CrisisReady group believes the data has an even higher calling—that it can save lives.

Since the first days of Putin’s bombing campaign, various international teams have been tapping publicly available mobility data to map the refugee crisis and coordinate an effective response. They believe the data can reveal where war-torn Ukrainians are now, and even where they’re heading. In the right hands, the data can provide local authorities the intel they need to get essential aid—medical care, food, and shelter—to the right place at the right time…(More)”

Data sharing between humanitarian organisations and donors


Report by Larissa Fast: “This report investigates issues related to data sharing between humanitarian actors and donors, with a focus on two key questions:

  • What formal or informal frameworks govern the collection and sharing of disaggregated humanitarian data between humanitarian actors and donors?
  • How are these frameworks and the related requirements understood or perceived by humanitarian actors and donors?

Drawing on interviews with donors and humanitarians about data sharing practices and an examination of formal documents, the research finds that, overall and perhaps most importantly, references to ‘data’ in the context of humanitarian operations are usually generic and lack a consistent definition or even a shared terminology. Complex regulatory frameworks, variability in donor expectations (both among and within donor governments, e.g., at the country or field/headquarters levels), and differing humanitarian experiences of data sharing all complicate the nature and handling of data sharing requests. Both the lack of data literacy and differing perceptions of operational data management risks exacerbate many issues related to data sharing and create inconsistent practice (see the full summary of findings in Table 3).

More specifically, while much formal documentation about data sharing between humanitarians and donors is available in the public domain, few documents contain explicit policies or clauses on data sharing, instead referring only to financial or compliance data and programme reporting requirements. Additionally, the justifications for sharing disaggregated humanitarian data are framed most often in terms of accountability, compliance, efficiency, and programme design. Most requests for data are linked to monitoring and compliance, as well as requests for data as ‘assurances’. Even so, donors indicated that although they request detailed/disaggregated data, they may not have the time or the human and/or technical capacity to deal with it properly. In general, donor interviewees insisted that no record-level data is shared within their governments, only aggregated data or data in low- or no-sensitivity formats….(More)”.

Selected Readings on Digital Self-Determination for Migrants


By Uma Kalkar, Marine Ragnet, and Stefaan Verhulst

Digital self-determination (DSD) is a multidisciplinary concept that extends self-determination to the digital sphere. Self-determination places humans (and their ability to make ‘moral’ decisions) at the center of decision-making actions. While self-determination is considered a jus cogens rule (i.e., a global norm), the concept of digital self-determination only came to light in the early 2010s as a result of the increasing digitization of most aspects of society.

While digitalization has opened up new opportunities for self-expression and communication for individuals across the globe, its reach and benefits have not been evenly distributed. For instance, migrants and refugees are particularly vulnerable to the deepening inequalities and power structures brought on by increased digitization and the subsequent datafication. Further, non-traditional data, such as social media and telecom data, hold great potential to improve our understanding of the migration experience and patterns of mobility, which can inform more targeted migration policies and services. Yet they have also raised new concerns about migrants’ lack of agency in determining how their data are used and who shapes the migration narrative.

These selected readings look at DSD in light of the growing ubiquity of technology applications and specifically focus on their impacts on migrants. They were produced to inform the first studio on DSD and migration co-hosted by the Big Data for Migration Alliance and the International Digital Self Determination Network. The readings are listed in alphabetical order.

These readings serve as a primer to offer base perspectives on DSD and its manifestations, as well as provide a better understanding of how migration data is managed today to advance or hinder life for those on the move. Please alert us to any other publications we should include moving forward.

Berens, Jos, Nathaniel Raymond, Gideon Shimshon, Stefaan Verhulst, and Lucy Bernholz. “The Humanitarian Data Ecosystem: the Case for Collective Responsibility.” Stanford Center for Philanthropy and Civil Society, 2017.

  • The authors explore the challenges to, and potential solutions for, the responsible use of digital data in the context of international humanitarian action. Data governance is related to DSD because it oversees how the information extracted from an individual—understood by DSD as an extension of oneself in the digital sphere—is handled.
  • They argue that in the digital age, the basic service provision activities of NGOs and aid organizations have become data collection processes. However, the ecosystem of actors is “uncoordinated” creating inefficiencies and vulnerabilities in the humanitarian space.
  • The paper presents a new framework for responsible data use in the humanitarian domain. The authors advocate for data users to follow three steps: 
  1. “[L]ook beyond the role they take up in the ‘data-lifecycle’ and consider previous and following steps and roles;
  2. Develop sound data responsibility strategies not only to prevent harm to their own operations but also to other organizations in the ‘data-lifecycle;’ and, 
  3. Collaborate with and learn from other organizations, both in the humanitarian field and beyond, to establish broadly supported guidelines and standards for humanitarian data use.”

Currion, Paul. “The Refugee Identity.” Caribou Digital (via Medium), March 13, 2018.

  • Developed as part of a DFID-funded initiative, this essay outlines the Data Requirements for Service Delivery within Refugee Camps project that investigated current data standards and design of refugee identity systems.
  • Currion finds that since “the digitisation of aid has already begun…aid agencies must therefore pay more attention to the way in which identity systems affect the lives and livelihoods of the forcibly displaced, both positively and negatively.” He argues that an interoperable digital identity for refugees is essential not only to access financial, social, and material resources while on the move but also to tap into IoT services.
  • However, many refugees are wary of digital tracking and data collection services that could further marginalize them as they search for safety. At present, there are no sector-level data standards around refugee identity data collection, combination, and centralization. How can regulators balance data protection against government and NGO data requirements while serving refugees in the ways they want and upholding their DSD?
  • Currion argues that a Responsible Data approach, as opposed to a process defined by a Data Minimization principle, provides “useful guidelines” but notes that data responsibility “still needs to be translated into organizational policy, then into institutional processes, and finally into operational practice.” He further adds that “the digitization of aid, if approached from a position that empowers the individual as much as the institution, offers a chance to give refugees back their voices.”

Decker, Rianne, Paul Koot, S. Ilker Birbil, and Mark van Embden Andres. “Co-designing algorithms for governance: Ensuring responsible and accountable algorithmic management of refugee camp supplies.” Big Data and Society, April 2022.

  • While recent literature has looked at the negative impacts of big data and algorithms in public governance, claiming they may reinforce existing biases and defy scrutiny by public officials, this paper argues that designing algorithms with relevant government and society stakeholders might be a way to make them more accountable and transparent. 
  • It presents a case study of the development of an algorithmic tool to estimate the populations of refugee camps to manage the delivery of emergency supplies. The algorithms included in this tool were co-designed with relevant stakeholders. 
  • This may provide a way to uphold DSD by contributing to the “accountability of the algorithm by making the estimations transparent and explicable to its users.”
  • The authors found that the co-design process enabled better accuracy and responsibility and fostered collaboration between partners, creating a suitable purpose for the tool and making the algorithm understandable to its users. This enabled algorithmic accountability. 
  • The authors note, however, that the beneficiaries of the tools were not included in the design process, limiting the legitimacy of the initiative. 

European Migration Network. “The Use of Digitalisation and Artificial Intelligence in Migration Management.” EMN-OECD Inform Series, February 2022.

  • This paper explores the role of new digital technologies in the management of migration and asylum, focusing specifically on where digital technologies, such as online portals, blockchain, and AI-powered speech and facial recognition systems, are being used across Europe to navigate the processes of obtaining visas, claiming asylum, gaining citizenship, and managing border control.
  • Further, it points to friction between the GDPR and new technologies like blockchain—which by design does not accommodate the right to be forgotten—and potential workarounds, such as two-step pseudonymisation (a minimal illustrative sketch follows this list).
  • It also highlights steps taken to oversee and open up data protection processes for immigration. Austria, Belgium, and France have begun to conduct Data Protection Impact Assessments; France has a portal that allows one to request the right to be forgotten; Ireland informs online service users about how data can be shared with or used by third-party agencies; and Spain outlines which personal data are used in immigration as per the Registry Public Treatment Activities.
  • Lastly, the paper points out next steps for policy development that upholds DSD, including universal access and digital literacy, trust in digital systems, willingness for government digital transformations, and bias and risk reduction.
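As a rough illustration of the two-step pseudonymisation workaround mentioned above, the sketch below replaces a direct identifier with a keyed hash and keeps the key with a separate custodian. The fields and key-handling details are assumptions for illustration, not details drawn from the EMN-OECD inform.

```python
# Hedged illustration of keyed (two-step) pseudonymisation: step 1 derives a token from a
# direct identifier with a secret key; step 2 stores only the token, while the key is held
# by a separate custodian so the linkage can be rotated or severed. Fields are invented.
import hashlib
import hmac
import secrets

pseudonymisation_key = secrets.token_bytes(32)  # held apart from the shared dataset

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Only the token enters the shared record; re-identification requires the key holder
record = {
    "applicant_token": pseudonymise("passport-X1234567", pseudonymisation_key),
    "application_status": "pending",
}
print(record)
```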

Martin, Aaron, Gargi Sharma, Siddharth Peter de Souza, Linnet Taylor, Boudewijn van Eerd, Sean Martin McDonald, Massimo Marelli, Margie Cheesman, Stephan Scheel, and Huub Dijstelbloem. “Digitisation and Sovereignty in Humanitarian Space: Technologies, Territories and Tensions.” Geopolitics (2022): 1-36.

  • This paper explores how digitisation and datafication are reshaping sovereign authority, power, and control in humanitarian spaces.
  • Building on the notion that technology is political, Martin et al. discuss three cases where digital tools, built through partnerships between international organizations or NGOs and private firms such as Palantir and Facebook, have raised concerns that data may be “repurposed” to undermine national sovereignty and distort humanitarian aims with for-profit motivations.
  • The authors draw attention to how cyber dependencies threaten international humanitarian organizations’ purported digital sovereignty. They touch on the tensions between national and digital sovereignty and self-governance.
  • The paper further argues that the rise of digital technologies in the governance of international mobility and migration policies “has all kinds of humanitarian and security consequences,” including (but not limited to) surveillance, privacy infringement, profiling, selection, inclusion/exclusion, and access barriers. Specifically, Scheel introduces the notion of function creep—the use of digital data beyond initially defined purposes—and emphasizes its common use in the context of migration as part “of the modus operandi of sovereign power.”

McAuliffe, Marie, Jenna Blower, and Ana Beduschi. “Digitalization and Artificial Intelligence in Migration and Mobility: Transnational Implications of the COVID-19 Pandemic.” Societies 11, no. 135 (2021): 1-13.

  • This paper critically examines the implications of intensifying digitalization and AI for migration and mobility systems in a post-COVID transnational context.
  • The authors first situate digitalization and AI in migration by analyzing their uptake throughout the Migration Cycle, i.e., to verify identities and visas, “enable ‘smart’ border processing,” and understand travelers’ adherence to legal frameworks. It then evaluates the current challenges and opportunities for migrants and migration systems brought about by deepening digitalization due to COVID-19. For example, contact tracing, infection screening, and quarantining procedures generate increased data about an individual and are meant, by design, to track and trace people, which raises concerns about migrants’ safety, privacy, and autonomy.
  • This essay argues that recent changes show the need for further computational advances that incorporate human rights throughout the design and development stages, “to mitigate potential risks to migrants’ human rights.” AI is severely flawed when it comes to decision-making around minority groups because of biased training data and could further marginalize vulnerable populations, while intrusive data collection for public health could erode the universal right to privacy. Leaving migrants at the mercy of black-box AI systems fails to uphold their right to DSD because it forces them to relinquish their agency and power to an opaque system.

Ponzanesi, Sandra. “Migration and Mobility in a Digital Age: (Re)Mapping Connectivity and Belonging.” Television & New Media 20, no. 6 (2019): 547-557.

  • This article explores the role of new media technologies in rethinking the dynamics of migration and globalization by focusing on the role of migrant users as “connected” and active participants, as well as “screened” and subject to biometric datafication, visualization, and surveillance.
  • Elaborating on concepts such as “migration” and “mobility,” the article analyzes the paradoxes of intermittent connectivity and troubled belonging, which are seen as relational definitions that are always fluid, negotiable, and porous.
  • It states that a city’s digital infrastructures are “complex sociotechnical systems” that have a functional side related to access and connectivity and a performative side where people engage with technology. Digital access and action represent areas of individual and collective manifestations of DSD. For migrants, gaining digital access and skills and “enacting citizenship” are important for resettlement. Ponzanesi advocates for further research conducted both bottom-up, drawing on migrants’ experiences of using technology to resettle and to remain in contact with their homeland, and top-down, examining datafication, surveillance, and digital/e-governance as part of the larger technology application ecosystem, in order to understand contemporary processes and problems of migration.

Remolina, Nydia, and Mark James Findlay. “The Paths to Digital Self-Determination — A Foundational Theoretical Framework.” SMU Centre for AI & Data Governance Research Paper No. 03 (2021): 1-34.

  • Remolina and Findlay stress that self-determination is the vehicle by which people “decide their own destiny in the international order.” Decision-making ability allows people to take control of their own lives and to pursue a chosen course of action. Collective action, or the ability to make decisions as part of a group—be it based on ethnicity, nationality, shared viewpoints, etc.—further reinforces this sense of agency.
  • The authors discuss how the European Union and European Court of Human Rights’ “principle of subsidiarity” aligns with self-determination because it advocates for power to be placed at the lowest level possible to preserve bottom-up agency with a “reasonable level of efficiency.” In practice, the results of subsidiarity have been disappointing.
  • The paper provides examples of indigenous populations’ fight for self-determination, offline and online. Here, digital self-determination refers to the challenges indigenous peoples face in accessing and benefiting from growing government uses of technology, owing to a lack of physical infrastructure rooted in structural and social inequities between settler and indigenous communities.
  • Understanding self-determination (and, by extension, digital self-determination) as a human right, the report investigates how autonomy, sovereignty, the legal definition of a ‘right,’ inclusion, agency, data governance, data ownership, data control, and data quality each bear on DSD.
  • Lastly, the paper presents a foundational theoretical framework that goes beyond just protecting personal data and privacy. Understanding that DSD “cannot be detached from duties for responsible data use,” the authors present a collective and an individual dimension to DSD. They extend the individual dimension of DSD to include both ‘my data’ and ‘data about me’ that can be used to influence a person’s actions through micro-targeting and nudge techniques. They update the collective dimension of DSD to include the views and influences of organizations, businesses, and communities online and call for a better way of visualizing the ‘social self’ and its control over data.

Ziebart, Astrid, and Jessica Bither. “AI, Digital Identities, Biometrics, Blockchain: A Primer on the Use of Technology in Migration Management.” Migration Strategy Group on International Cooperation and Development, June 2020.

  • Ziebart and Bither note the implications of increasingly sophisticated use of technology and data collection by governments with respect to their citizens. They note that migrants and refugees “often are exposed to particular vulnerabilities” during these processes and underscore the need to bring migrants into data gathering and use policy conversations.  
  • The authors discuss the promise of technology—i.e., to predict migration through AI-powered analyses, employ technologies to reduce friction in the asylum-seeking processes, and the power of digital identities for those on the move. However, they stress the need to combine these tools with informational self-determination that allows migrants to own and control what data they share and how and where the data are used.
  • The migration and refugee policy space faces issues of “tech evangelism,” where technologies are being employed just because they exist, rather than because they serve an actual policy need or provide an answer to a particular policy question. This supply-driven policy implementation signals the need for more migrant voices to inform policymakers on what tools are actually useful for the migratory experience. In order to advance the digital agency of migrants, the paper offers recommendations for some of the ethical challenges these technologies might pose and ultimately advocates for greater participation of migrants and refugees in devising technology-driven policy instruments for migration issues.

On-the-go interesting resources 

  • Empowering Digital Self-Determination, mediaX at Stanford University: This short video presents definitions of DSD, digital personhood, identity, and privacy, along with an overview of their applications across ethics, law, and the private sector.
  • Digital Self-Determination — A Living Syllabus: This syllabus and assorted materials have been created and curated from the 2021 Research Sprint run by the Digital Asia Hub and Berkman Klein Center for Internet & Society at Harvard University. It introduces learners to the fundamentals of DSD across a variety of industries to enrich understanding of its existing and potential applications.
  • Digital Self-Determination Wikipedia Page: This Wikipedia page was developed by the students who took part in the Berkman Klein Center research sprint on digital self-determination. It provides a comprehensive overview of DSD definitions and its key elements, which include human-centered design, robust privacy mandates and data governance, and control over data use to give data subjects the ability to choose how algorithms manipulate their data for autonomous decision-making.
  • Roger Dubach on Digital Self-Determination: This short video presents DSD in the public sector, arguing that the aim is not to create a ‘data-protected’ world but rather to understand how governments can use data efficiently while protecting privacy. Note: this video is part of the Living Syllabus course materials (Digital Self-Determination/Module 1: Beginning Inquiries).

Responsiveness of open innovation to COVID-19 pandemic: The case of data for good


Paper by Francesco Scotti, Francesco Pierri, Giovanni Bonaccorsi, and Andrea Flori: “Due to the COVID-19 pandemic, countries around the world are facing one of the most severe health and economic crises of recent history, and human society is called upon to devise effective responses. However, as current measures have not produced valuable solutions, a multidisciplinary and open approach, enabling collaborations across private and public organizations, is crucial to unleash successful contributions against the disease. Indeed, COVID-19 represents a Grand Challenge for which joining forces and extending disciplinary boundaries have been recognized as main imperatives. As a consequence, Open Innovation represents a promising solution to provide a fast recovery. In this paper we present a practical application of this approach, showing how knowledge sharing constitutes one of the main drivers to tackle pressing social needs. To demonstrate this, we propose a case study regarding a data sharing initiative promoted by Facebook, the Data For Good program. We leverage a large-scale dataset provided by Facebook to the research community to offer a representation of the evolution of Italian mobility during the lockdown. We show that this repository makes it possible to capture different patterns of movement across the territory at increasing levels of detail. We integrate this information with Open Data provided by the Lombardy region to illustrate how data sharing can also provide insights for private businesses and local authorities. Finally, we show how to interpret Data For Good initiatives in light of the Open Innovation Framework and discuss the barriers to adoption faced by public administrations regarding these practices…(More)”.

The ethical imperative to identify and address data and intelligence asymmetries


Article by Stefaan Verhulst in AI & Society: “The insight that knowledge, resulting from having access to (privileged) information or data, is power is more relevant today than ever before. The data age has redefined the very notion of knowledge and information (as well as power), leading to a greater reliance on dispersed and decentralized datasets as well as to new forms of innovation and learning, such as artificial intelligence (AI) and machine learning (ML). As Thomas Piketty (among others) has shown, we live in an increasingly stratified world, and our society’s socio-economic asymmetries are often grafted onto data and information asymmetries. As we have documented elsewhere, data access is fundamentally linked to economic opportunity, improved governance, better science and citizen empowerment. The need to address data and information asymmetries—and their resulting inequalities of political and economic power—is therefore emerging as among the most urgent ethical challenges of our era, yet often not recognized as such.

Even as awareness grows of this imperative, society and policymakers lag in their understanding of the underlying issue. Just what are data asymmetries? How do they emerge, and what form do they take? And how do data asymmetries accelerate information and other asymmetries? What forces and power structures perpetuate or deepen these asymmetries, and vice versa? I argue that it is a mistake to treat this problem as homogenous. In what follows, I suggest the beginning of a taxonomy of asymmetries. Although closely related, each one emerges from a different set of contingencies, and each is likely to require different policy remedies. The focus of this short essay is to start outlining these different types of asymmetries. Further research could deepen and expand the proposed taxonomy as well as help define solutions that are contextually appropriate and fit for purpose….(More)”.

When Launching a Collaboration, Keep It Agile


Essay by the Stakeholder Alignment Collaborative: “Conventional wisdom holds that large-scale societal challenges require large-scale responses. By contrast, we argue that progress on major societal challenges can and often should begin with small, agile initiatives—minimum viable consortia (MVC)—that learn and adapt as they build the scaffolding for large-scale change. MVCs can address societal challenges by overcoming institutional inertia, opposition, capability gaps, and other barriers because they require less energy for activation, reveal dead ends early on, and can more easily adjust and adapt over time.

Large-scale societal challenges abound, and organizations and institutions are increasingly looking for ways to deal with them. For example, the National Academy of Engineering (NAE) has identified 14 Grand Societal Challenges for “sustaining civilization’s continuing advancement while still improving the quality of life” in the 21st century. They include making solar energy economical, developing carbon sequestration methods, advancing health informatics, and securing cyberspace. The United Nations has set 17 Sustainable Development Goals (SDGs) to achieve by 2030 for a better future for humanity. They include everything from eliminating hunger to reducing inequality.

Tackling such universal goals requires large-scale cooperation, because existing organizations and institutions simply do not have the ability to resolve these challenges independently. Further note that the NAE’s announcement of the challenges stated that “governmental and institutional, political and economic, and personal and social barriers will repeatedly arise to impede the pursuit of solutions to problems.” The United Nations included two enabling SDGs: “peace, justice, and strong institutions” and “partnership for the goals.” The question is how to bring such large-scale partnerships and institutional change into existence.

We are members of the Stakeholder Alignment Collaborative, a research consortium of scholars at different career stages, spanning multiple fields and disciplines. We study collaboration collaboratively and maintain a very flat structure. We have published on multistakeholder consortia associated with science and provided leadership and facilitation for the launch and sustainment of many of these consortia. Based on our research into the problem of developing large-scale, multistakeholder partnerships, we believe that MVCs provide an answer.

MVCs are less vulnerable to the many barriers to large-scale solutions, better able to forge partnerships, and provide a more agile framework for making needed adjustments. To demonstrate these points, we focus on examples of MVCs in the domain of scientific research data and computing infrastructure. Research data are essential for virtually all societal challenges, and an upsurge of multistakeholder consortia has occurred in this domain. But the MVC concept is not limited to these challenges, nor to digitally oriented settings. We have chosen this sphere because it offers a diversity of MVC examples for illustration….(More)”. (See also “The Potential and Practice of Data Collaboratives for Migration”).

Mapping of exposed water tanks and swimming pools based on aerial images can help control dengue


Press Release by Fundação de Amparo à Pesquisa do Estado de São Paulo: “Brazilian researchers have developed a computer program that locates swimming pools and rooftop water tanks in aerial photographs with the aid of artificial intelligence to help identify areas vulnerable to infestation by Aedes aegypti, the mosquito that transmits dengue, zika, chikungunya and yellow fever. 

The innovation, which can also be used as a public policy tool for dynamic socio-economic mapping of urban areas, resulted from research and development work by professionals at the University of São Paulo (USP), the Federal University of Minas Gerais (UFMG) and the São Paulo State Department of Health’s Endemic Control Superintendence (SUCEN), as part of a project supported by FAPESP. An article about it is published in the journal PLOS ONE

“Our work initially consisted of creating a model based on aerial images and computer science to detect water tanks and pools, and to use them as a socio-economic indicator,” said Francisco Chiaravalloti Neto, last author of the article. He is a professor in the Epidemiology Department at USP’s School of Public Health (FSP), with a first degree in engineering. 

As the article notes, previous research had already shown that dengue tends to be most prevalent in deprived urban areas, so that prevention of dengue, zika and other diseases transmitted by the mosquito can be made considerably more effective by use of a relatively dynamic socio-economic mapping model, especially given the long interval between population censuses in Brazil (ten years or more). 

“This is one of the first steps in a broader project,” Chiaravalloti Neto said. Among other aims, he and his team plan to detect other elements of the images and quantify real infestation rates in specific areas so as to be able to refine and validate the model. 

“We want to create a flow chart that can be used in different cities to pinpoint at-risk areas without the need for inspectors to call on houses, buildings and other breeding sites, as this is time-consuming and a waste of the taxpayer’s money,” he added…(More)”.
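As a loose sketch of the general approach described in the release, the code below scores small aerial-image tiles for the presence of a pool or exposed water tank using a pretrained CNN backbone from torchvision. It is an assumption-laden illustration, not the model published in PLOS ONE, and it presumes that fine-tuned weights and labelled tiles already exist.

```python
# Hedged sketch only: a generic tile classifier for aerial imagery, not the published model.
# Assumes the final layer has already been fine-tuned on labelled aerial tiles.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # classes: background vs. pool/water tank
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def score_tile(path: str) -> float:
    """Probability that an aerial tile contains a pool or exposed water tank."""
    tile = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(tile), dim=1)
    return probs[0, 1].item()

# Example usage on a hypothetical tile:
# print(score_tile("tile_0001.png"))
```

Tiles scored above a chosen threshold could then be aggregated per subward to flag potentially vulnerable areas, in the spirit of the mapping goal described above.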