Pre-Publication Paper by Douglas R. Leasure et al: “In times of crisis, real-time data mapping population displacements are invaluable for targeted humanitarian response. The Russian invasion of Ukraine on February 24, 2022, forcibly displaced millions of people from their homes, including nearly 6m refugees flowing across the border in just a few weeks, but information was scarce regarding displaced and vulnerable populations who remained inside Ukraine. We leveraged near real-time social media marketing data to estimate sub-national population sizes every day, disaggregated by age and sex. Our metric of internal displacement estimated that 5.3m people had been internally displaced away from their baseline administrative region by March 14. Results revealed four distinct displacement patterns: large-scale evacuations, refugee staging areas, internal areas of refuge, and irregular dynamics. While this innovative approach provided one of the only quantitative estimates of internal displacement in virtual real-time, we conclude by acknowledging risks and challenges for the future…(More)”.
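The displacement metric the abstract describes could, in its simplest form, compare each region's current population estimate against its pre-invasion baseline and sum the net outflows. The sketch below is a hypothetical illustration of that idea only; the region names, counts, and the exact formula are invented here, not taken from the paper.

```python
# Hypothetical sketch of a displacement metric: people estimated to have
# left their baseline administrative region. Names and counts are
# illustrative, not the paper's data or its actual estimator.

def internally_displaced(baseline: dict, current: dict) -> int:
    """Sum, over regions, of the population shortfall vs. baseline."""
    return sum(
        max(baseline[region] - current.get(region, 0), 0)
        for region in baseline
    )

baseline = {"Kyiv": 4_000_000, "Kharkiv": 2_700_000, "Lviv": 2_500_000}
current  = {"Kyiv": 2_900_000, "Kharkiv": 1_900_000, "Lviv": 3_400_000}

print(internally_displaced(baseline, current))  # 1,100,000 + 800,000 = 1,900,000
```

Note that regions gaining population (here, Lviv) contribute zero rather than offsetting losses elsewhere, since inflows to areas of refuge are themselves displaced people.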
China May Be Chasing Impossible Dream by Trying to Harness Internet Algorithms
Article by Karen Hao: “China’s powerful cyberspace regulator has taken the first step in a pioneering—and uncertain—government effort to rein in the automated systems that shape the internet.
Earlier this month, the Cyberspace Administration of China published summaries of 30 core algorithms belonging to two dozen of the country’s most influential internet companies, including TikTok owner ByteDance Ltd., e-commerce behemoth Alibaba Group Holding Ltd. and Tencent Holdings Ltd., owner of China’s ubiquitous WeChat super app.
The milestone marks the first systematic effort by a regulator to compel internet companies to reveal information about the technologies powering their platforms, which have shown the capacity to radically alter everything from pop culture to politics. It also puts Beijing on a path that some technology experts say few governments, if any, are equipped to handle….
One important question the effort raises, algorithm experts say, is whether direct government regulation of algorithms is practically possible.
The majority of today’s internet platform algorithms are based on a technology called machine learning, which automates decisions such as ad-targeting by learning to predict user behaviors from vast repositories of data. Unlike traditional algorithms that contain explicit rules coded by engineers, most machine-learning systems are black boxes, making it hard to decipher their logic or anticipate the consequences of their use.
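The contrast the article draws can be made concrete with a toy example. Below, the rule-based targeter is fully inspectable by a regulator, while the "learned" one derives its decision boundary from data; both the features and the training history are invented for illustration, and a real recommendation model would be vastly more opaque than this single learned threshold.

```python
# Illustrative contrast between the two kinds of systems described above.
# All fields, data, and thresholds here are made up.

def rule_based_target(user: dict) -> bool:
    # Explicit, auditable rule written by an engineer.
    return user["age"] < 30 and "sports" in user["interests"]

def train_targeter(history):
    # A minimal "learned" model: pick a score threshold midway between
    # past clickers and non-clickers. Even this one learned number is
    # not written down anywhere in the source code; a deep network has
    # millions of such numbers, hence the "black box" problem.
    clicked = [u["score"] for u, did_click in history if did_click]
    not_clicked = [u["score"] for u, did_click in history if not did_click]
    threshold = (min(clicked) + max(not_clicked)) / 2
    return lambda user: user["score"] > threshold

history = [({"score": 0.9}, True), ({"score": 0.8}, True),
           ({"score": 0.3}, False), ({"score": 0.2}, False)]
learned_target = train_targeter(history)
print(learned_target({"score": 0.7}))  # True
```

A regulator can read the first function and predict its behavior exactly; the second can only be characterized by probing it with inputs, which is the practical difficulty the experts quoted in the article point to.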
Beijing’s interest in regulating algorithms started in 2020, after TikTok sought an American buyer to avoid being banned in the U.S., according to people familiar with the government’s thinking. When several bidders for the short-video platform lost interest after Chinese regulators announced new export controls on information-recommendation technology, it tipped off Beijing to the importance of algorithms, the people said…(More)”.
Closing the Data Divide for a More Equitable U.S. Digital Economy
Report by Gillian Diebold: “In the United States, access to many public and private services, including those in the financial, educational, and health-care sectors, is intricately linked to data. But adequate data is not collected equitably from all Americans, creating a new challenge: the data divide, in which not everyone has enough high-quality data collected about them or their communities and therefore cannot benefit from data-driven innovation. This report provides an overview of the data divide in the United States and offers recommendations for how policymakers can address these inequalities…(More)”.
Top-Down and Bottom-Up Solutions to the Problem of Political Ignorance
Chapter by Hana Samaržija and Quassim Cassam: “There is broad, though not universal, agreement that widespread voter ignorance and irrational evaluation of evidence are serious threats to democracy. But there is deep disagreement over strategies for mitigating the danger. ‘Top-down’ approaches, such as epistocracy and lodging more authority in the hands of experts, seek to mitigate ignorance by concentrating more political power in the hands of the more knowledgeable segments of the population. By contrast, ‘bottom-up’ approaches seek to either raise the political competence of the general public or empower ordinary people in ways that give them better incentives to make good decisions than conventional ballot-box voting does. Examples of bottom-up strategies include increasing voter knowledge through education, various ‘sortition’ proposals, and shifting more decisions to institutions where citizens can ‘vote with their feet’.
This chapter surveys and critiques a range of both top-down and bottom-up strategies. I conclude that top-down strategies have systematic flaws that severely limit their potential. While they should not be categorically rejected, we should be wary of adopting them on a large scale. Bottom-up strategies have significant limitations of their own. But expanding foot voting opportunities holds more promise than any other currently available option. The idea of paying voters to increase their knowledge also deserves serious consideration…(More)”.
OSTP Issues Guidance to Make Federally Funded Research Freely Available Without Delay
The White House: “Today, the White House Office of Science and Technology Policy (OSTP) updated U.S. policy guidance to make the results of taxpayer-supported research immediately available to the American public at no cost. In a memorandum to federal departments and agencies, Dr. Alondra Nelson, the head of OSTP, delivered guidance for agencies to update their public access policies as soon as possible to make publications and research funded by taxpayers publicly accessible, without an embargo or cost. All agencies will fully implement updated policies, including ending the optional 12-month embargo, no later than December 31, 2025.
This policy will likely yield significant benefits on a number of key priorities for the American people, from environmental justice to cancer breakthroughs, and from game-changing clean energy technologies to protecting civil liberties in an automated world.
For years, President Biden has been committed to delivering policy based on the best available science, and to working to ensure the American people have access to the findings of that research. “Right now, you work for years to come up with a significant breakthrough, and if you do, you get to publish a paper in one of the top journals,” said then-Vice President Biden in remarks to the American Association for Cancer Research in 2016. “For anyone to get access to that publication, they have to pay hundreds, or even thousands, of dollars to subscribe to a single journal. And here’s the kicker — the journal owns the data for a year. The taxpayers fund $5 billion a year in cancer research every year, but once it’s published, nearly all of that taxpayer-funded research sits behind walls. Tell me how this is moving the process along more rapidly.” The new public access guidance was developed with the input of multiple federal agencies over the course of this year, to enable progress on a number of Biden-Harris Administration priorities.
“When research is widely available to other researchers and the public, it can save lives, provide policymakers with the tools to make critical decisions, and drive more equitable outcomes across every sector of society,” said Dr. Alondra Nelson, head of OSTP. “The American people fund tens of billions of dollars of cutting-edge research annually. There should be no delay or barrier between the American public and the returns on their investments in research.”…(More)”.
U.S. Government Effort to Tap Private Weather Data Moves Along Slowly
Article by Isabelle Bousquette: “The U.S. government’s six-year-old effort to improve its weather forecasting ability by purchasing data from private-sector satellite companies has started to show results, although the process is moving more slowly than anticipated.
After a period of testing, the National Oceanic and Atmospheric Administration, a scientific, service and regulatory arm of the Commerce Department, began purchasing data from two satellite companies, Spire Global Inc. of Vienna, Va., and GeoOptics Inc. of Pasadena, Calif.
The weather data from these two companies fills gaps in coverage left by NOAA’s own satellites, the agency said. NOAA also began testing data from a third company this year.
Beyond these companies, new entrants to the field offering weather data based on a broader range of technologies have been slow to emerge, the agency said.
“We’re getting a subset of what we hoped,” said Dan St. Jean, deputy director of the Office of System Architecture and Advanced Planning at NOAA’s Satellite and Information Service.
NOAA’s weather forecasts help the government formulate hurricane evacuation plans and make other important decisions. The agency began seeking out private sources of satellite weather data in 2016. The idea was to find a more cost-effective alternative to funding NOAA’s own satellite constellations, the agency said. It also hoped to seed competition and innovation in the private satellite sector.
It isn’t yet clear whether there is a cost benefit to using private data, in part because the relatively small number of competitors in the market has made it challenging to determine a steady market price, NOAA said.
“All the signs in the nascent ‘new space’ industry indicated that there would be a plethora of venture capitalists wanting to compete for NOAA’s commercial pilot/purchase dollars. But that just never materialized,” said Mr. St. Jean…(More)”.
Reorganise: 15 stories of workers fighting back in a digital age
Book edited by Hannah O’Rourke & Edward Saperia: “In only a decade, the labour market has changed beyond all recognition – from zero-hour contracts to platform monopolies. As capitalism has re-created itself for the digital age, so too must the workers whose labour underpins it.
From a union for Instagram influencers to roadworkers organising through a Facebook Group, former WSJ journalist Lucy Harley-McKeown takes us on a journey to discover how workers are fighting back in the 21st century…(More)”.
Public preferences for governing AI technology: Comparative evidence
Paper by Soenke Ehret: “Citizens’ attitudes concerning aspects of AI such as transparency, privacy, and discrimination have received considerable attention. However, it is an open question to what extent economic consequences affect preferences for public policies governing AI. When does the public demand imposing restrictions on – or even prohibiting – emerging AI technologies? Do average citizens’ preferences depend causally on normative and economic concerns or only on one of these causes? If both, how might economic risks and opportunities interact with assessments based on normative factors? And to what extent does the balance between the two kinds of concerns vary by context? I answer these questions using a comparative conjoint survey experiment conducted in Germany, the United Kingdom, India, Chile, and China. The data analysis suggests strong effects regarding AI systems’ economic and normative attributes. Moreover, I find considerable cross-country variation in normative preferences regarding the prohibition of AI systems vis-a-vis economic concerns…(More)”.
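In a conjoint survey experiment like the one the abstract describes, respondents evaluate profiles whose attribute levels are independently randomized, so the causal effect of each attribute (economic vs. normative) can be estimated separately. The sketch below shows only that profile-generation mechanic; the attribute names and levels are invented for illustration and are not the paper's actual design.

```python
# A minimal sketch of conjoint profile generation. Attributes and levels
# are hypothetical, not taken from the study.
import random

ATTRIBUTES = {
    "economic_effect": ["creates jobs", "displaces jobs", "no effect"],
    "transparency": ["fully explained", "black box"],
    "discrimination_risk": ["independently audited", "unaudited"],
}

def random_profile(rng: random.Random) -> dict:
    # Levels are drawn independently, so no attribute is confounded
    # with another across many profiles.
    return {attr: rng.choice(levels) for attr, levels in ATTRIBUTES.items()}

rng = random.Random(0)
# A respondent might be asked: "Which of these two AI systems should
# be prohibited?" for each randomly generated pair.
pair = (random_profile(rng), random_profile(rng))
print(pair)
```

Because economic and normative attributes vary independently, regressing choices on attribute levels separates the two kinds of concerns, which is what lets the author ask whether they act causally on their own or interact.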
Mapping community resources for disaster preparedness: humanitarian data capability and automated futures
Report by Anthony McCosker et al: “The project seeks to enable local community disaster preparedness and thus build community resilience by improving the quality of data about community strengths, resources and assets.
In this report, the authors define a gap in existing humanitarian mapping approaches and the uses of open, public and social media data in humanitarian contexts. The report surveys current knowledge and presents a selection of case studies of data and humanitarian mapping in local communities.
Drawing on this knowledge and practice review and stakeholder workshops throughout 2021, the authors also define a method and toolkit for the effective use of community assets data…(More)”.
Measuring human rights: facing a necessary challenge
Essay by Eduardo Burkle: “Given the abundance of data available today, many assume the world already has enough accurate metrics on human rights performance. However, the political sensitivity of human rights has proven a significant barrier to access. Governments often avoid producing and sharing this type of information.
States’ compliance with their human rights obligations often receives a lot of attention. But there is still much discussion about how to measure it. At the same time, statistics and data increasingly drive political and bureaucratic decisions. This, in turn, brings some urgency to the task of ensuring the best possible data are available.
Establishing cross-national human rights measures is vital for research, advocacy, and policymaking. It can also have a direct effect on people’s enjoyment of human rights. Good data allow states and other actors to evaluate how well their country is performing and let them make comparisons that highlight which policies and institutions are truly effective in promoting human rights.
Such context makes it crucial to arm researchers, journalists, advocates, practitioners, investors, and companies with reliable information when raising human rights issues in their countries, and around the world…(More)”.