U.S. Government Effort to Tap Private Weather Data Moves Along Slowly


Article by Isabelle Bousquette: “The U.S. government’s six-year-old effort to improve its weather forecasting ability by purchasing data from private-sector satellite companies has started to show results, although the process is moving more slowly than anticipated.

After a period of testing, the National Oceanic and Atmospheric Administration, a scientific, service and regulatory arm of the Commerce Department, began purchasing data from two satellite companies, Spire Global Inc. of Vienna, Va., and GeoOptics Inc. of Pasadena, Calif.

The weather data from these two companies fills gaps in coverage left by NOAA’s own satellites, the agency said. NOAA also began testing data from a third company this year.

Beyond these companies, new entrants to the field offering weather data based on a broader range of technologies have been slow to emerge, the agency said.

“We’re getting a subset of what we hoped,” said Dan St. Jean, deputy director of the Office of System Architecture and Advanced Planning at NOAA’s Satellite and Information Service.

NOAA’s weather forecasts help the government formulate hurricane evacuation plans and make other important decisions. The agency began seeking out private sources of satellite weather data in 2016. The idea was to find a more cost-effective alternative to funding NOAA’s own satellite constellations, the agency said. It also hoped to seed competition and innovation in the private satellite sector.

It isn’t yet clear whether there is a cost benefit to using private data, in part because the relatively small number of competitors in the market has made it challenging to determine a steady market price, NOAA said.

“All the signs in the nascent ‘new space’ industry indicated that there would be a plethora of venture capitalists wanting to compete for NOAA’s commercial pilot/purchase dollars. But that just never materialized,” said Mr. St. Jean…(More)”.

Reorganise: 15 stories of workers fighting back in a digital age 


Book edited by Hannah O’Rourke & Edward Saperia: “In only a decade, the labour market has changed beyond all recognition – from zero-hour contracts to platform monopolies. As capitalism has re-created itself for the digital age, so too must the workers whose labour underpins it.

From a union for Instagram influencers to roadworkers organising through a Facebook Group, former WSJ journalist Lucy Harley-McKeown takes us on a journey to discover how workers are fighting back in the 21st century…(More)”.

Mapping community resources for disaster preparedness: humanitarian data capability and automated futures


Report by Anthony McCosker et al: “This report details the rationale, background research and design for a platform to help local communities map resources for disaster preparedness. It sets out a first step in improving community data capability through resource mapping to enhance humanitarian action before disaster events occur. The project seeks to enable local community disaster preparedness, and thus build community resilience, by improving the quality of data about community strengths, resources and assets.

In this report, the authors identify a gap in existing humanitarian mapping approaches and in the uses of open, public and social media data in humanitarian contexts. The report surveys current knowledge and presents a selection of case studies delivering data and humanitarian mapping in local communities.

Drawing on this knowledge and practice review and stakeholder workshops throughout 2021, the authors also define a method and toolkit for the effective use of community assets data…(More)”

Measuring human rights: facing a necessary challenge


Essay by Eduardo Burkle: “Given the abundance of data available today, many assume the world already has enough accurate metrics on human rights performance. However, the political sensitivity of human rights has proven a significant barrier to access. Governments often avoid producing and sharing this type of information.

States’ compliance with their human rights obligations often receives a lot of attention. But there is still much discussion about how to measure it. At the same time, statistics and data increasingly drive political and bureaucratic decisions. This, in turn, brings some urgency to the task of ensuring the best possible data are available.

Establishing cross-national human rights measures is vital for research, advocacy, and policymaking. It can also have a direct effect on people’s enjoyment of human rights. Good data allow states and actors to evaluate how well their country is performing. They also enable comparisons that highlight which policies and institutions are truly effective in promoting human rights.

Good human rights data does more than simply evaluate how well a country is performing – it also identifies which policies and institutions are truly effective in promoting human rights

Such context makes it crucial to arm researchers, journalists, advocates, practitioners, investors, and companies with reliable information when raising human rights issues in their countries, and around the world…(More)”.

To Fix Tech, Democracy Needs to Grow Up


Article by Divya Siddarth: “There isn’t much we can agree on these days. But two sweeping statements that might garner broad support are “We need to fix technology” and “We need to fix democracy.”

There is growing recognition that rapid technology development is producing society-scale risks: state and private surveillance, widespread labor automation, ascending monopoly and oligopoly power, stagnant productivity growth, algorithmic discrimination, and the catastrophic risks posed by advances in fields like AI and biotechnology. Less often discussed, but in my view no less important, is the loss of potential advances that lack short-term or market-legible benefits. These include vaccine development for emerging diseases and open source platforms for basic digital affordances like identity and communication.

At the same time, as democracies falter in the face of complex global challenges, citizens (and increasingly, elected leaders) around the world are losing trust in democratic processes and are being swayed by autocratic alternatives. Nation-state democracies are, to varying degrees, beset by gridlock and hyper-partisanship, little accountability to the popular will, inefficiency, flagging state capacity, inability to keep up with emerging technologies, and corporate capture. While smaller-scale democratic experiments are growing, locally and globally, they remain far too fractured to handle consequential governance decisions at scale.

This puts us in a bind. Clearly, we could be doing a better job directing the development of technology towards collective human flourishing—this may be one of the greatest challenges of our time. If actually existing democracy is so riddled with flaws, it doesn’t seem up to the task. This is what rings hollow in many calls to “democratize technology”: Given the litany of complaints, why subject one seemingly broken system to governance by another?…(More)”.

(Re)making data markets: an exploration of the regulatory challenges


Paper by Linnet Taylor, Hellen Mukiri-Smith, Tjaša Petročnik, Laura Savolainen & Aaron Martin: “Regulating the data market will be one of the major challenges of the twenty-first century. In order to think about regulating this market, however, we first need to make its dimensions and dynamics more accessible to observation and analysis. In this paper we explore what the state of the sociological and legal research on markets can tell us about the market for data: what kind of market it is, the practices and configurations of actors that constitute it, and what kinds of data are traded there. We start from the subjective opacity of this market to researchers interested in regulation and governance, review conflicting positions on its extent, diversity and regulability, and then explore comparisons from food and medicine regulation to understand the possible normative and practical implications and aims inherent in attempting to regulate how data is shared and traded. We conclude that there is a strong argument for a normative shift in the aims of regulation with regard to the data market, away from a prioritisation of the economic value of data and toward a more nuanced approach that aims to align the uses of data with the needs and rights of the communities reflected in it…(More)”

The fear of technology-driven unemployment and its empirical base


Article by Kerstin Hötte, Melline Somers and Angelos Theodorakopoulos: “New technologies may replace human labour, but can simultaneously create jobs if workers are needed to use these technologies or if new economic activities emerge. At the same time, technology-driven productivity growth may increase disposable income, stimulating a demand-induced employment expansion. Based on a systematic review of the empirical literature on technological change and its impact on employment published in the past four decades, this column suggests that the empirical support for the labour-creating effects of technological change dominates that for labour-replacement…(More)”.

Toward a Demand-Driven, Collaborative Data Agenda for Adolescent Mental Health


Paper by Stefaan Verhulst et al: “Existing datasets and research in the field of adolescent mental health do not always meet the needs of practitioners, policymakers, and program implementers, particularly in the context of vulnerable populations. Here, we introduce a collaborative, demand-driven methodology for the development of a strategic adolescent mental health research agenda. Ultimately, this agenda aims to guide future data sharing and collection efforts that meet the most pressing data needs of key stakeholders…

We conducted a rapid literature search to summarize common themes in adolescent mental health research into a “topic map”. We then hosted two virtual workshops with a range of international experts to discuss the topic map and identify shared priorities for future collaboration and research…

Our topic map identifies 10 major themes in adolescent mental health, organized into system-level, community-level, and individual-level categories. The engagement of cross-sectoral experts resulted in the validation of the mapping exercise, critical insights for refining the topic map, and a collaborative list of priorities for future research…

This innovative agile methodology enables a focused deliberation with diverse stakeholders and can serve as the starting point for data generation and collaboration practices, both in the field of adolescent mental health and other topics…(More)”.

Forest data governance as a reflection of forest governance: Institutional change and endurance in Finland and Canada


Paper by Salla Rantala, Brent Swallow, Anu Lähteenmäki-Uutela and Riikka Paloniemi: “The rapid development of new digital technologies for natural resource management has created a need to design and update governance regimes for effective and transparent generation, sharing and use of digital natural resource data. In this paper, we contribute to this novel area of investigation from the perspective of institutional change. We develop a conceptual framework to analyze how emerging natural resource data governance is shaped by related natural resource governance; complex, multilevel systems of actors, institutions and their interplay. We apply this framework to study forest data governance and its roots in forest governance in Finland and Canada. In Finland, an emphasis on open forest data and the associated legal reform represents the institutionalization of a mixed open data-bioeconomy discourse, pushed by higher-level institutional requirements towards greater openness and shaped by changing actor dynamics in relation to diverse forest values. In Canada, a strong institutional lock-in around public-private partnerships in forest management has engendered an approach that is based on voluntary data sharing agreements and fragmented data management, conforming with the entrenched interests of autonomous sub-national actors and thus extending the path-dependence of forest governance to forest data governance. We conclude by proposing how the framework could be further developed and tested to help explain which factors condition the formation of natural resource data institutions and subsequently the (re-)distribution of benefits they govern. Transparent and efficient data approaches can be enabled only if the analysis of data institutions is given attention equal to the technological development of data solutions…(More)”.

Who Should Represent Future Generations in Climate Planning?


Paper by Morten Fibieger Byskov and Keith Hyams: “Extreme impacts from climate change are already being felt around the world. The policy choices that we make now will affect not only how high global temperatures rise but also how well-equipped future economies and infrastructures are to cope with these changes. The interests of future generations must therefore be central to climate policy and practice. This raises the questions: Who should represent the interests of future generations with respect to climate change? And according to which criteria should we judge whether a particular candidate would make an appropriate representative for future generations? In this essay, we argue that potential representatives of future generations should satisfy what we call a “hypothetical acceptance criterion,” which requires that the representative could reasonably be expected to be accepted by future generations. This overarching criterion in turn gives rise to two derivative criteria. These are, first, the representative’s epistemic and experiential similarity to future generations, and second, his or her motivation to act on behalf of future generations. We conclude that communities already adversely affected by climate change best satisfy these criteria and are therefore able to command the hypothetical acceptance of future generations…(More)”.