“How Dare They Peep into My Private Life”


Report by Human Rights Watch on “Children’s Rights Violations by Governments that Endorsed Online Learning During the Covid-19 Pandemic”: “The coronavirus pandemic upended the lives and learning of children around the world. Most countries pivoted to some form of online learning, replacing physical classrooms with EdTech websites and apps; this helped fill urgent gaps in delivering some form of education to many children.

But in their rush to connect children to virtual classrooms, few governments checked whether the EdTech they were rapidly endorsing or procuring for schools were safe for children. As a result, children whose families were able to afford access to the internet and connected devices, or who made hard sacrifices in order to do so, were exposed to the privacy practices of the EdTech products they were told or required to use during Covid-19 school closures.

Human Rights Watch conducted its technical analysis of the products between March and August 2021, and subsequently verified its findings as detailed in the methodology section. Each analysis essentially took a snapshot of the prevalence and frequency of tracking technologies embedded in each product on a given date in that window. That prevalence and frequency may fluctuate over time based on multiple factors, meaning that an analysis conducted on later dates might observe variations in the behavior of the products…(More)”
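The snapshot approach described above can be pictured with a minimal sketch: load a product's page on a given date and record which known tracker domains its scripts reference. This is an illustration only, not HRW's actual toolchain (which is detailed in the report's methodology section); the tracker list and product URL below are placeholder assumptions.

```python
# Illustrative sketch only: a dated "snapshot" of third-party trackers
# referenced by a product's web page. Not HRW's methodology or tooling;
# the tracker list and URL below are placeholder assumptions.
from datetime import date
from urllib.parse import urlparse
from urllib.request import urlopen
import re

KNOWN_TRACKER_DOMAINS = {          # placeholder list for illustration
    "google-analytics.com",
    "doubleclick.net",
    "facebook.net",
}

def tracker_snapshot(product_url: str) -> dict:
    """Return which known tracker domains appear in the page's script tags."""
    html = urlopen(product_url, timeout=10).read().decode("utf-8", "ignore")
    # Collect the hosts of all externally loaded scripts.
    script_srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, re.I)
    hosts = {urlparse(src).netloc.lower() for src in script_srcs if "//" in src}
    found = {
        tracker for tracker in KNOWN_TRACKER_DOMAINS
        if any(host.endswith(tracker) for host in hosts)
    }
    return {
        "url": product_url,
        "date": date.today().isoformat(),  # findings are tied to this date
        "trackers_found": sorted(found),
        "tracker_count": len(found),
    }

if __name__ == "__main__":
    print(tracker_snapshot("https://example-edtech-product.test"))
```

Because a product's embedded scripts can change between releases, re-running the same snapshot on a later date can return a different count, which is the fluctuation the report cautions about.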

Use of Population-Level Administrative Data in Developmental Science


Paper by Barry J. Milne: “Population-level administrative data—data on individuals’ interactions with administrative systems (e.g., health, criminal justice, and education)—have substantially advanced our understanding of life-course development. In this review, we focus on five areas where research using these data has made significant contributions to developmental science: (a) understanding small or difficult-to-study populations, (b) evaluating intergenerational and family influences, (c) enabling estimation of causal effects through natural experiments and regional comparisons, (d) identifying individuals at risk for negative developmental outcomes, and (e) assessing neighborhood and environmental influences. Further advances will be made by linking prospective surveys to administrative data to expand the range of developmental questions that can be tested; supporting efforts to establish new linked administrative data resources, including in developing countries; and conducting cross-national comparisons to test findings’ generalizability. New administrative data initiatives should involve consultation with population subgroups including vulnerable groups, efforts to obtain social license, and strong ethical oversight and governance arrangements…(More)”.

Public Management in an Information Age


Book by Albert Meijer, Alex Ingrams and Stavros Zouridis: “New information and communication technologies have drastically changed public management. Public managers are increasingly dependent on information gathered from complex systems, and they need to be able to put in place sound IT and communication structures.

This accessible text, aimed specifically at those studying and working in public management, offers readers a comprehensive understanding of ICTs and their implications for public management. It provides aspiring and current public managers with a framework for the development of strategic public information management across the full range of public organizations.

Written by leading experts in this area, Public Management in an Information Age offers:

– A thorough grounding in the latest research
– Examples of issues and practices from different contexts and types of organizations around the world
– A range of tools and techniques to help readers analyse concrete situations and develop appropriate solutions
– Summary boxes on key ICTs in non-technical language…(More)”.

Making IP a force-enabler for solving big problems


Article by Hossein Nowbar: “The world continues to confront compounding health, economic and humanitarian crises. We face urgent challenges like carbon in our atmosphere and declining growth of the working age population in developed countries. Microsoft believes that technology – particularly artificial intelligence (AI) – has great potential to help address these problems. The ability to uncover new insights in large datasets will drive new advances in climate science and improve workforce productivity. But success requires more innovation in more fields in less time than any other technological era in human history. And this innovation will be distributed. No one person or company will invent all of the advances in technology necessary to solve these complex problems. It will take collaboration and the fostering of community.

To address these challenges, we need an IP system that promotes pragmatic and practical mechanisms with a focus on how the system can enable innovation, not impede it…

I suggested some ideas the IP community can consider in evolving our IP systems to enable faster progress towards a better future:

  1. Adopt new licensing mechanisms to enable widespread and friction-free use of technology to solve important problems and help inventors obtain economic benefit for their IP. For example, there should be a rate court that establishes license fees for standards-essential patents that would eliminate the ambiguity and uncertainty around licensing such technologies.
  2. Promote exceptions to IP that improve knowledge-sharing, collaboration and development of new technologies like machine learning, such as the text and data mining exceptions adopted in Europe and Japan.
  3. Improve transparency and information flow about IP, including improving patent quality, standardizing licensing models, promoting multiparty cross-licensing, and making economic terms of licenses transparent to everyone in the innovation ecosystem.
  4. Provide economic incentives for collaboration, rewarding those who make their patents freely available for use to address important social problems. We need to promote widespread and friction-free use of technology to take on these important challenges…(More)”.

Farmer-Centric Data Governance: Towards A New Paradigm


Report, six Deep Dives, and nine Case Studies by The Development Gateway: “..provide user-centric approaches to data governance that place farmers and their communities at the center of data-gathering initiatives and aim to reduce the negative effects of centralized power. The findings are based on literature, interviews, and workshops gathering the experiences of change-makers, and the work aims to:
• Raise awareness around the current political economy of agricultural data and its implications;
• Identify user-centric data governance models and mechanisms, particularly in LMICs;
• Demonstrate the purpose, value, benefits, and challenges of these models for all stakeholders; and
• Identify appropriate and relevant actionable principles, recommendations, and considerations related to user-centric data governance in the agriculture sector for the donor community…(More)”

COVID isn’t going anywhere, neither should our efforts to increase responsible access to data


Article by Andrew J. Zahuranec, Hannah Chafetz and Stefaan Verhulst: “..Moving forward, institutions will need to consider how to embed non-traditional data capacity into their decision-making to better understand the world around them and respond to it.

For example, wastewater surveillance programmes that emerged during the pandemic continue to provide valuable insights about outbreaks before they are reported by clinical testing and have the potential to be used for other emerging diseases.

We need these and other programmes now more than ever. Governments and their partners need to maintain and, in many cases, strengthen the collaborations they established through the pandemic.

To address future crises, we need to institutionalize new data capacities – particularly those involving non-traditional datasets that may capture digital information that traditional health surveys and statistical methods often miss.

A figure in the original article (Image: The GovLab) summarizes the types and sources of non-traditional data sources that stood out most during the COVID-19 response.

In our report, we suggest four pathways to advance the responsible access to non-traditional data during future health crises…(More)”.

Data solidarity: why sharing is not always caring 


Essay by Barbara Prainsack: “To solve these problems, we need to think about data governance in new ways. It is no longer enough to assume that asking people to consent to how their data is used is sufficient to prevent harm. In our example of telehealth, and in virtually all data-related scandals of the last decade, from Cambridge Analytica to Robodebt, informed consent did not, or could not, prevent the problem. We all regularly agree to data uses that we know are problematic – not because we do not care about privacy. We agree because this is the only way to get access to benefits, a mortgage, or teachers and health professionals. In a world where face-to-face assessments are unavailable or excessively expensive, opting out of digital practices would no longer be an option (Prainsack, 2017, pp. 126-131; see also Oudshoorn, 2011).

Solidarity-based data governance (in short: data solidarity) can help us to distribute the risks and the benefits of digital practices more equitably. The details of the framework are spelled out in full elsewhere (Prainsack et al., 2022a, b). In short, data solidarity seeks to facilitate data uses that create significant public value, and at the same time prevent and mitigate harm (McMahon et al., 2020). One important step towards both goals is to stop ascribing risks to data types, and to distinguish between different types of data use instead. In some situations, harm can be prevented by making sure that data is not used for harmful purposes, such as online tracking. In other contexts, however, harm prevention can require that we do not collect the data in the first place. Not recording something, making it invisible and uncountable to others, can be the most responsible way to act in some contexts.

This means that recording and sharing data should not become a default. More data is not always better. Instead, policymakers need to consider carefully – in a dialogue with the people and communities that have a stake in it – what should be recorded, where it will be stored and who governs the data once it has been collected – if at all (see also Kukutai and Taylor, 2016)…(More)”.

Researchers scramble as Twitter plans to end free data access


Article by Heidi Ledford: “Akin Ünver has been using Twitter data for years. He investigates some of the biggest issues in social science, including political polarization, fake news and online extremism. But earlier this month, he had to set aside time to focus on a pressing emergency: helping relief efforts in Turkey and Syria after the devastating earthquake on 6 February.

Aid workers in the region have been racing to rescue people trapped by debris and to provide health care and supplies to those displaced by the tragedy. Twitter has been invaluable for collecting real-time data and generating crucial maps to direct the response, says Ünver, a computational social scientist at Özyeğin University in Istanbul.

So when he heard that Twitter was about to end its policy of providing free access to its application programming interface (API) — a pivotal set of rules that allows people to extract and process large amounts of data from the platform — he was dismayed. “Couldn’t come at a worse time,” he tweeted. “Most analysts and programmers that are building apps and functions for Turkey earthquake aid and relief, and are literally saving lives, are reliant on Twitter API.”…

Twitter has long offered academics free access to its API, an unusual approach that has been instrumental in the rise of computational approaches to studying social media. So when the company announced on 2 February that it would end that free access in a matter of days, it sent the field into a tailspin. “Thousands of research projects running over more than a decade would not be possible if the API wasn’t free,” says Patty Kostkova, who specializes in digital health studies at University College London…(More)”.
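For readers unfamiliar with what free API access meant in practice, the sketch below shows the kind of call researchers script against Twitter's v2 recent-search endpoint. It is illustrative only: the helper name, query, and token handling are placeholders, and under the new pricing such requests generally require a paid tier.

```python
# Illustrative sketch of the programmatic access the article describes:
# pulling recent tweets through Twitter's v2 search API. The bearer token
# and query are placeholders, not credentials or a recommended workflow.
import os
import requests

SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

def fetch_recent_tweets(query: str, bearer_token: str, max_results: int = 100) -> list:
    """Return a single page of recent tweets matching `query`."""
    response = requests.get(
        SEARCH_URL,
        headers={"Authorization": f"Bearer {bearer_token}"},
        params={
            "query": query,
            "max_results": max_results,          # 10-100 per request
            "tweet.fields": "created_at,geo,lang",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])

if __name__ == "__main__":
    token = os.environ["TWITTER_BEARER_TOKEN"]   # assumed to be set by the user
    tweets = fetch_recent_tweets("earthquake help lang:tr -is:retweet", token)
    print(f"Fetched {len(tweets)} tweets")
```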

How ChatGPT Hijacks Democracy


Article by Nathan E. Sanders and Bruce Schneier: “…But for all the consternation over the potential for humans to be replaced by machines in formats like poetry and sitcom scripts, a far greater threat looms: artificial intelligence replacing humans in the democratic processes — not through voting, but through lobbying.

ChatGPT could automatically compose comments submitted in regulatory processes. It could write letters to the editor for publication in local newspapers. It could comment on news articles, blog entries and social media posts millions of times every day. It could mimic the work that the Russian Internet Research Agency did in its attempt to influence our 2016 elections, but without the agency’s reported multimillion-dollar budget and hundreds of employees.

Automatically generated comments aren’t a new problem. For some time, we have struggled with bots, machines that automatically post content. Five years ago, at least a million automatically drafted comments were believed to have been submitted to the Federal Communications Commission regarding proposed regulations on net neutrality. In 2019, a Harvard undergraduate, as a test, used a text-generation program to submit 1,001 comments in response to a government request for public input on a Medicaid issue. Back then, submitting comments was just a game of overwhelming numbers…(More)”

The Economics of Digital Privacy


Paper by Avi Goldfarb & Verina F. Que: “There has been increasing attention to privacy in the media and in regulatory discussions. This is a consequence of the increased usefulness of digital data. The literature has emphasized the benefits and costs of digital data flows to consumers and firms. The benefits arise in the form of data-driven innovation, higher quality products and services that match consumer needs, and increased profits. The costs relate to intrinsic and instrumental values of privacy. Under standard economic assumptions, this framing of a cost-benefit tradeoff might suggest little role for regulation beyond ensuring consumers are appropriately informed in a robust competitive environment. The empirical literature thus far has focused on this direct cost-benefit assessment, examining how privacy regulations have affected various market outcomes. However, an increasing body of theory work emphasizes externalities related to data flows. These externalities, both positive and negative, suggest benefits to the targeted regulation of digital privacy…(More)”.
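As a loose formalization of the externality argument (not a formula from the paper), one can contrast an individual's private data-sharing calculus with the social one, where the extra term captures spillovers onto others, such as data about one person revealing information about similar people:

```latex
% Illustrative formalization only, not taken from the paper.
% Individual i shares data d_i when her private benefit exceeds her private
% cost; social welfare adds an externality term over everyone else (e_j can
% be positive, e.g. data-driven innovation, or negative, e.g. privacy spillovers).
\begin{align*}
  \text{private calculus:} \quad & b_i(d_i) - c_i(d_i) \ge 0 \\
  \text{social welfare:}   \quad & W(d_i) = b_i(d_i) - c_i(d_i) + \sum_{j \neq i} e_j(d_i)
\end{align*}
```

When the spillover term is negative, individually rational sharing can reduce welfare; when it is positive, too little sharing occurs. Either way, private and social optima diverge, which is the wedge behind the case for targeted regulation that the abstract mentions.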