Law Enforcement and Technology: Using Social Media


Congressional Research Service Report: “As the ways in which individuals interact continue to evolve, social media has had an increasing role in facilitating communication and the sharing of content online—including moderated and unmoderated, user-generated content. Over 70% of U.S. adults are estimated to have used social media in 2021. Law enforcement has also turned to social media to help in its operations. Broadly, law enforcement relies on social media as a tool for information sharing as well as for gathering information to assist in investigations.


Social Media as a Communications Tool. Social media is one of many tools law enforcement can use to connect with the community. Agencies may use it, for instance, to push out bulletins on wanted persons and to establish tip lines that crowdsource potential investigative leads. It offers a speed and reach unmatched by many other forms of communication available to law enforcement. Officials and researchers have highlighted social media as a tool that, if used properly, can enhance community policing.

Social Media and Investigations. Social media is one tool in agencies’ investigative toolkits to help establish investigative leads and assemble evidence on potential suspects. There are no federal laws that specifically govern law enforcement agencies’ use of information obtained from social media sites, but their ability to obtain or use certain information may be influenced by social media companies’ policies as well as law enforcement agencies’ own social media policies and the rules of criminal procedure. When individuals post content on social media platforms without audience restrictions, anyone—including law enforcement—can access this content without court authorization. However, some information that individuals post on social media may be restricted—by user choice or platform policies—in the scope of audience that may access it. In the instances where law enforcement does not have public access to information, they may rely on a number of tools and techniques, such as informants or undercover operations, to gain access to it. Law enforcement may also require social media platforms to provide access to certain restricted information through a warrant, subpoena, or other court order.

Social Media and Intelligence Gathering. The use of social media to gather intelligence has generated particular interest from policymakers, analysts, and the public. Social media companies have weighed in on the issue of social media monitoring by law enforcement, and some platforms have modified their policies to expressly prohibit their user data from being used by law enforcement to monitor social media. Law enforcement agencies themselves have reportedly grappled with the extent to which they should gather and rely on information and intelligence gleaned from social media. For instance, some observers have suggested that agencies may be reluctant to regularly analyze public social media posts because that could be viewed as spying on the American public and could subsequently chill free speech protected under the First Amendment…(More)”.

Surveillance Publishing


Working paper by Jefferson D. Pooley: “…This essay lingers on a prediction too: Clarivate’s business model is coming for scholarly publishing. Google is one peer, but the company’s real competitors are Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE. Elsevier, in particular, has been moving into predictive analytics for years now. Of course the publishing giants have long profited off of academics and our university employers—by packaging scholars’ unpaid writing-and-editing labor only to sell it back to us as usuriously priced subscriptions or APCs. That’s a lucrative business that Elsevier and the others won’t give up. But they’re layering another business on top of their legacy publishing operations, in the Clarivate mold. The data trove that publishers are sitting on is, if anything, far richer than the citation graph alone. Why worry about surveillance publishing? One reason is the balance sheet, since the companies’ trading in academic futures will further pad profits at the expense of taxpayers and students. The bigger reason is that our behavior—once alienated from us and abstracted into predictive metrics—will double back onto our work lives. Existing biases, like male academics’ propensity for self-citation, will receive a fresh coat of algorithmic legitimacy. More broadly, the academic reward system is already distorted by metrics. To the extent that publishers’ tallies and indices get folded into grant-making, tenure-and-promotion, and other evaluative decisions, the metric tide will gain power. The biggest risk is that scholars will internalize an analytics mindset, one already encouraged by citation counts and impact factors….(More)”.
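The self-citation worry is easy to make concrete. Below is a minimal sketch in Python; the data and the citation_count helper are invented for illustration and are not any publisher’s actual metric. It shows how a naive citation tally quietly rewards self-citation, so any predictive index built on the raw tally inherits the bias:

```python
# Toy data, invented for illustration: (citing_author, cited_author) pairs.
citations = [
    ("alice", "bob"), ("carol", "bob"),   # bob: 2 citations from others
    ("dave", "dave"), ("dave", "dave"),   # dave: 2 self-citations
    ("erin", "dave"),                     # dave: 1 citation from others
]

def citation_count(author, include_self=True):
    """Naive tally of citations received, optionally dropping self-citations."""
    return sum(
        1 for citing, cited in citations
        if cited == author and (include_self or citing != cited)
    )

for author in ("bob", "dave"):
    print(author,
          "raw:", citation_count(author),
          "excluding self:", citation_count(author, include_self=False))
# Raw counts rank dave (3) above bob (2); excluding self-citations
# reverses the ranking to bob (2), dave (1).
```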

Legal study on Government access to data in third countries


Report commissioned by the European Data Protection Board: “The present report is part of a study analysing the implications for the work of the European Union (EU)/European Economic Area (EEA) data protection supervisory authorities (SAs) in relation to transfers of personal data to third countries after the Court of Justice of the European Union (CJEU) judgment C-311/18 on Data Protection Commissioner v. Facebook Ireland Ltd, Maximilian Schrems (Schrems II). Data controllers and processors may transfer personal data to third countries or international organisations only if the controller or processor has provided appropriate safeguards, and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.

Whereas it is the primary responsibility of data exporters and data importers to assess that the legislation of the country of destination enables the data importer to comply with any of the appropriate safeguards, SAs will play a key role when issuing further decisions on transfers to third countries. Hence, this report provides the European Data Protection Board (EDPB) and the SAs in the EEA/EU with information on the legislation and practice in China, India and Russia on their governments’ access to personal data processed by economic operators. The report contains an overview of the relevant information in order for the SAs to assess whether and to what extent legislation and practices in the abovementioned countries imply massive and/or indiscriminate access to personal data processed by economic operators…(More)”.

Nudges: Four reasons to doubt popular technique to shape people’s behavior


Article by Magda Osman: “Throughout the pandemic, many governments have had to rely on people doing the right thing to reduce the spread of the coronavirus – ranging from social distancing to handwashing. Many enlisted the help of psychologists for advice on how to “nudge” the public to do what was deemed appropriate.

Nudges have been around since the 1940s and originally were referred to as behavioural engineering. They are a set of techniques developed by psychologists to promote “better” behaviour through “soft” interventions rather than “hard” ones (mandates, bans, fines). In other words, people aren’t punished if they fail to follow them. The nudges are based on psychological and behavioural economic research into human behaviour and cognition.

The nudges can involve subtle as well as obvious methods. Authorities may set a “better” choice, such as donating your organs, as a default – so people have to opt out of a register rather than opt in. Or they could make a healthy option more attractive through food labelling.

But, despite the soft approach, many people aren’t keen on being nudged. During the pandemic, for example, scientists examined people’s attitudes to nudging in social and news media in the UK, and discovered that half of the sentiments expressed in social media posts were negative…(More)”.

Interoperable, agile, and balanced


Brookings Paper on Rethinking technology policy and governance for the 21st century: “Emerging technologies are shifting market power and introducing a range of risks that can only be managed through regulation. Unfortunately, current approaches to governing technology are insufficient, fragmented, and lack focus on actionable goals. This paper proposes three tools that can be leveraged to support fit-for-purpose technology regulation for the 21st century: First, transparent and holistic policymaking levers that clearly communicate goals and identify trade-offs at the national and international levels; second, revamped efforts to collaborate across jurisdictions, particularly through standard-setting and evidence gathering of critical incidents; and third, a shift towards agile governance, whether acquired through the system, design, or both…(More)”.

Technology and the Global Struggle for Democracy


Essay by Manuel Muniz: “The commemoration of the first anniversary of the January 6, 2021, attack on the US Capitol by supporters of former President Donald Trump showed that the extreme political polarization that fueled the riot also frames Americans’ interpretations of it. It would, however, be gravely mistaken to view what happened as a uniquely American phenomenon with uniquely American causes. The disruption of the peaceful transfer of power that day was part of something much bigger.

As part of the commemoration, President Joe Biden said that a battle is being fought over “the soul of America.” What is becoming increasingly clear is that this is also true of the international order: its very soul is at stake. China is rising and asserting itself. Populism is widespread in the West and major emerging economies. And chauvinistic nationalism has re-emerged in parts of Europe. All signs point to increasing illiberalism and anti-democratic sentiment around the world.

Against this backdrop, the US hosted in December a (virtual) “Summit for Democracy” that was attended by hundreds of national and civil-society leaders. The message of the gathering was clear: democracies must assert themselves firmly and proactively. To that end, the summit devoted numerous sessions to studying the digital revolution and its potentially harmful implications for our political systems.

Emerging technologies pose at least three major risks for democracies. The first concerns how they structure public debate. Social networks balkanize public discourse by segmenting users into ever smaller like-minded communities. Algorithmically driven information echo chambers make it difficult to build social consensus. Worse, social networks are not liable for the content they distribute, which means they can allow misinformation to spread on their platforms with impunity…(More)”.
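The balkanization mechanism the essay names can be sketched with a classic bounded-confidence model of opinion dynamics (Hegselmann–Krause style). This is a deliberately simplified abstraction, not any platform’s actual recommender; the THRESHOLD parameter is an assumption standing in for algorithmic filtering:

```python
# Bounded-confidence opinion dynamics: each user only "sees" opinions
# within a similarity threshold and drifts toward their average, so an
# evenly spread population fragments into isolated like-minded clusters.
opinions = [i / 10 for i in range(-10, 11)]  # 21 users, evenly spread over [-1, 1]
THRESHOLD = 0.3  # how dissimilar a view can be and still be shown

for _ in range(30):
    updated = []
    for mine in opinions:
        visible = [o for o in opinions if abs(o - mine) <= THRESHOLD]
        updated.append(sum(visible) / len(visible))  # never empty: includes `mine`
    opinions = updated

print("surviving opinion clusters:", sorted({round(o, 2) for o in opinions}))
# 21 distinct starting positions collapse into a handful of mutually
# invisible clusters -- echo chambers in miniature.
```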

A data ‘black hole’: Europol ordered to delete vast store of personal data


Article by Apostolis Fotiadis, Ludek Stavinoha, Giacomo Zandonini, Daniel Howden: “…The EU’s police agency, Europol, will be forced by the bloc’s data protection watchdog to delete much of a vast store of personal data that it has been found to have amassed unlawfully. The unprecedented finding from the European Data Protection Supervisor (EDPS) targets what privacy experts are calling a “big data ark” containing billions of points of information. Sensitive data in the ark has been drawn from crime reports, hacked from encrypted phone services and sampled from asylum seekers never involved in any crime.

According to internal documents seen by the Guardian, Europol’s cache contains at least 4 petabytes – equivalent to 3m CD-ROMs or a fifth of the entire contents of the US Library of Congress. Data protection advocates say the volume of information held on Europol’s systems amounts to mass surveillance and is a step on its road to becoming a European counterpart to the US National Security Agency (NSA), the organisation whose clandestine online spying was revealed by whistleblower Edward Snowden….(More)”.
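The scale comparison is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes the decimal convention for a petabyte and a standard 700 MB CD-ROM; the article’s rounder “3m” figure implies a larger per-disc capacity:

```python
# Back-of-the-envelope check of the reported data volume.
PETABYTE = 10**15        # bytes, decimal convention
CD_ROM = 700 * 10**6     # ~700 MB for a standard CD-ROM

cache_bytes = 4 * PETABYTE
print(f"CD-ROMs needed: {cache_bytes / CD_ROM:,.0f}")
# ~5,714,286 discs at 700 MB each; the "3m CD-ROMs" comparison works
# out if one assumes roughly 1.3 GB per disc.
```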

Are we witnessing the dawn of post-theory science?


Essay by Laura Spinney: “Does the advent of machine learning mean the classic methodology of hypothesise, predict and test has had its day?…

Isaac Newton apocryphally began thinking about gravity after an apple fell on his head. Much experimentation and data analysis later, he realised there was a fundamental relationship between force, mass and acceleration. He formulated a theory to describe that relationship – his second law of motion, expressible as the equation F=ma – and used it to predict the behaviour of objects other than apples. His predictions turned out to be right (if not always precise enough for those who came later).
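In executable form, the theory is nothing more than an explicit, inspectable rule. A minimal illustration (the function name and values below are arbitrary):

```python
# Newton's second law as a predictive theory: a stated rule that turns
# any (force, mass) pair into a predicted acceleration.
def predicted_acceleration(force_newtons: float, mass_kg: float) -> float:
    """a = F / m, rearranged from F = m * a."""
    return force_newtons / mass_kg

# e.g. a ~9.8 N force (roughly the weight of a 1 kg apple at Earth's
# surface) acting on that 1 kg apple:
print(predicted_acceleration(9.8, 1.0), "m/s^2")  # -> 9.8
```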

Contrast how science is increasingly done today. Facebook’s machine learning tools predict your preferences better than any psychologist. AlphaFold, a program built by DeepMind, has produced the most accurate predictions yet of protein structures based on the amino acids they contain. Both are completely silent on why they work: why you prefer this or that information; why this sequence generates that structure.

You can’t lift a curtain and peer into the mechanism. They offer up no explanation, no set of rules for converting this into that – no theory, in a word. They just work, and they work well. We witness the social effects of Facebook’s predictions daily. AlphaFold has yet to make its impact felt, but many are convinced it will change medicine.

Somewhere between Newton and Mark Zuckerberg, theory took a back seat. In 2008, Chris Anderson, the then editor-in-chief of Wired magazine, predicted its demise. So much data had accumulated, he argued, and computers were already so much better than us at finding relationships within it, that our theories were being exposed for what they were – oversimplifications of reality. Soon, the old scientific method – hypothesise, predict, test – would be relegated to the dustbin of history. We’d stop looking for the causes of things and be satisfied with correlations.
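A theory-free predictor of the kind Anderson had in mind can be sketched in a few lines: it memorizes observed cases and answers with the nearest remembered one, predicting reasonably well without ever containing the rule it mimics. The observations and predict names below are invented for illustration:

```python
# A correlation-only predictor: no equation inside, just memorized
# observations and a nearest-neighbour lookup. The physical rule is
# used only to simulate the "measurements" the predictor sees.
observations = [((f, m), f / m)
                for f in (1.0, 2.0, 5.0, 9.8, 20.0)
                for m in (0.5, 1.0, 2.0, 4.0)]

def predict(force: float, mass: float) -> float:
    # Return the outcome of the most similar remembered case; nothing
    # in here represents, or explains, F = m * a.
    _, answer = min(
        observations,
        key=lambda obs: (obs[0][0] - force) ** 2 + (obs[0][1] - mass) ** 2,
    )
    return answer

print(predict(9.8, 1.0))   # 9.8 (an exact remembered case)
print(predict(10.0, 2.0))  # 4.9 (approximates the true 5.0)
```

It predicts well within the range of its data and offers no insight beyond it: Anderson’s bargain in miniature.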

With the benefit of hindsight, we can say that what Anderson saw is true (he wasn’t alone). The complexity that this wealth of data has revealed to us cannot be captured by theory as traditionally understood. “We have leapfrogged over our ability to even write the theories that are going to be useful for description,” says computational neuroscientist Peter Dayan, director of the Max Planck Institute for Biological Cybernetics in Tübingen, Germany. “We don’t even know what they would look like.”

But Anderson’s prediction of the end of theory looks to have been premature – or maybe his thesis was itself an oversimplification. There are several reasons why theory refuses to die, despite the successes of such theory-free prediction engines as Facebook and AlphaFold. All are illuminating, because they force us to ask: what’s the best way to acquire knowledge and where does science go from here?…(More)”.

A data-based participatory approach for health equity and digital inclusion: prioritizing stakeholders


Paper by Aristea Fotopoulou, Harriet Barratt, and Elodie Marandet: “This article starts from the premise that projects informed by data science can address social concerns, beyond prioritizing the design of efficient products or services. How can we bring the stakeholders and their situated realities back into the picture? It is argued that data-based, participatory interventions can improve health equity and digital inclusion while avoiding the pitfalls of top-down, technocratic methods. A participatory framework places users, patients and citizens, as stakeholders, at the centre of the process, and can offer complex, sustainable benefits that go beyond simply the experience of participation or the development of an innovative design solution. A significant benefit, for example, is the development of skills, which should not be seen as a by-product of the participatory process but as a central element of empowering marginalized or excluded communities to participate in public life. By drawing from different examples in various domains, the article discusses what can be learnt from implementations of schemes using data science for social good, human-centric design, arts and wellbeing, to argue for a data-centric, creative and participatory approach to address health equity and digital inclusion in tandem…(More)”.

The Tech That Comes Next


Book by Amy Sample Ward and Afua Bruce: “Who is part of technology development, who funds that development, and how we put technology to use all influence the outcomes that are possible. To change those outcomes, we must – all of us – shift our relationship to technology, how we use it, build it, fund it, and more. In The Tech That Comes Next, Amy Sample Ward and Afua Bruce – two leaders in equitable design and use of new technologies – invite you to join them in asking big questions and making change from wherever you are today.

This book connects ideas and conversations across sectors from artificial intelligence to data collection, community-centered design to collaborative funding, and social media to digital divides. Technology and equity are inextricably connected, and The Tech That Comes Next helps you accelerate change for the better…(More)”.