Mathematicians are deploying algorithms to stop gerrymandering


Article by Siobhan Roberts: “The maps for US congressional and state legislative races often resemble electoral bestiaries, with bizarrely shaped districts emerging from wonky hybrids of counties, precincts, and census blocks.

It’s the drawing of these maps, more than anything—more than voter suppression laws, more than voter fraud—that determines how votes translate into who gets elected. “You can take the same set of votes, with different district maps, and get very different outcomes,” says Jonathan Mattingly, a mathematician at Duke University in the purple state of North Carolina. “The question is, if the choice of maps is so important to how we interpret these votes, which map should we choose, and how should we decide if someone has done a good job in choosing that map?”

Over recent months, Mattingly and like-minded mathematicians have been busy in anticipation of a data release expected today, August 12, from the US Census Bureau. Every decade, new census data launches the decennial redistricting cycle—state legislators (or sometimes appointed commissions) draw new maps, moving district lines to account for demographic shifts.

In preparation, mathematicians are sharpening new algorithms—open-source tools, developed over recent years—that detect and counter gerrymandering, the egregious practice giving rise to those bestiaries, whereby politicians rig the maps and skew the results to favor one political party over another. Republicans have openly declared that with this redistricting cycle they intend to gerrymander a path to retaking the US House of Representatives in 2022….(More)”.
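Mattingly's point, that an identical set of votes can produce very different outcomes under different maps, is easy to see with a toy calculation. The sketch below uses invented precinct tallies and two hypothetical three-district plans (it is not data or code from the Duke group's tools): the same 302-to-298 vote split gives party A two seats under one plan and one seat under the other.

```python
# Toy illustration: the same precinct-level votes, assigned to districts two
# different ways, produce different seat counts. All numbers are invented.

from collections import defaultdict

# Votes per precinct: (party A, party B)
precinct_votes = {
    "p1": (60, 40), "p2": (55, 45), "p3": (45, 55),
    "p4": (40, 60), "p5": (70, 30), "p6": (32, 68),
}

# Two alternative plans assigning the same six precincts to three districts.
plan_1 = {"p1": 1, "p2": 1, "p3": 2, "p4": 2, "p5": 3, "p6": 3}
plan_2 = {"p1": 1, "p5": 1, "p2": 2, "p6": 2, "p3": 3, "p4": 3}  # packs A's strongest precincts together

def seats_won_by_a(plan):
    """Count districts in which party A outpolls party B under a given plan."""
    totals = defaultdict(lambda: [0, 0])
    for precinct, district in plan.items():
        a, b = precinct_votes[precinct]
        totals[district][0] += a
        totals[district][1] += b
    return sum(1 for a, b in totals.values() if a > b)

# Identical votes (A: 302, B: 298), different maps, different outcomes.
print("Plan 1: party A carries", seats_won_by_a(plan_1), "of 3 districts")  # 2
print("Plan 2: party A carries", seats_won_by_a(plan_2), "of 3 districts")  # 1
```

Detection tools in this space typically scale the idea up, comparing an enacted map's partisan outcomes against large ensembles of computer-generated alternative maps drawn under neutral criteria and flagging maps whose results are statistical outliers.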

The controversy over the term ‘citizen science’


CBC News: “The term citizen science has been around for decades. Its original definition, coined in the 1990s, refers to institution-guided projects that invite the public to contribute to scientific knowledge in all kinds of ways, from the cataloguing of plants, animals and insects in people’s backyards to watching space.

Anyone is invited to participate in citizen science, regardless of whether they have an academic background in the sciences, and every year these projects number in the thousands. 

Recently, however, some large institutions, scientists and community members have proposed replacing the term citizen science with “community science.” 

Those in favour of the terminology change — such as eBird, one of the world’s largest biodiversity databases — say they want to avoid using the word citizen. They do so because they want to be “welcoming to any birder or person who wants to learn more about bird watching, regardless of their citizen status,” said Lynn Fuller, an eBird spokesperson, in a news release earlier this year. 

Some argue that while the intention is valid, the term community science already holds another definition — namely, projects that bring together different groups of people around environmental justice, with a focus on social action. 

To add to the confusion, renaming citizen science could impact policies and legislation that have been established in countries such as the U.S. and Canada to support projects and efforts in favour of citizen science. 

For example, if we suddenly decided to call all species of birds “waterbirds,” then the specific meaning of this category of bird species that lives on or around water would eventually be lost. This would, in turn, make communication between people and the various fields of science incredibly difficult. 

A paper published in Science magazine last month pointed out some of the reasons why rebranding citizen science in the name of inclusion could backfire. 

Caren Cooper, a professor of forestry and environmental resources at North Carolina State University and one of the authors of the paper, said the term citizen science was not originally meant to imply that people need a certain citizenship status to participate in such projects. 

Rather, citizen science is meant to convey the idea of responsibilities and rights to access science. 

She said there are other terms being used to describe this meaning, including “public science, participatory science [and] civic science.”

Chris Hawn, a professor of geography and environmental systems at the University of Maryland Baltimore County and one of Cooper’s co-authors, said that being aware of the need for change is a good first step, but any decision to rename should be made carefully….(More)”.

Indigenous Peoples Rise Up: The Global Ascendency of Social Media Activism


Book edited by Bronwyn Carlson and Jeff Berglund: “…illustrates the impact of social media in expanding the nature of Indigenous communities and social movements. Social media has bridged distance, time, and nation states to mobilize Indigenous peoples to build coalitions across the globe and to stand in solidarity with one another. These movements have succeeded and gained momentum and traction precisely because of the strategic use of social media. Social media—Twitter and Facebook in particular—has also served as a platform for fostering health, well-being, and resilience, recognizing Indigenous strength and talent, and sustaining and transforming cultural practices when great distances divide members of the same community.
 
Including a range of international Indigenous voices from the US, Canada, Australia, Aotearoa (New Zealand) and Africa, the book takes an interdisciplinary approach, bridging Indigenous studies, media studies, and social justice studies. With examples such as Idle No More in Canada, the Australian Recognise! campaign, and social media campaigns to maintain the Māori language, Indigenous Peoples Rise Up serves as one of the first studies of Indigenous social media use and activism…(More)”.

The Society of Algorithms


Paper by Jenna Burrell and Marion Fourcade: “The pairing of massive data sets with processes—or algorithms—written in computer code to sort through, organize, extract, or mine them has made inroads in almost every major social institution. This article proposes a reading of the scholarly literature concerned with the social implications of this transformation. First, we discuss the rise of a new occupational class, which we call the coding elite. This group has consolidated power through their technical control over the digital means of production and by extracting labor from a newly marginalized or unpaid workforce, the cybertariat. Second, we show that the implementation of techniques of mathematical optimization across domains as varied as education, medicine, credit and finance, and criminal justice has intensified the dominance of actuarial logics of decision-making, potentially transforming pathways to social reproduction and mobility but also generating a pushback by those so governed. Third, we explore how the same pervasive algorithmic intermediation in digital communication is transforming the way people interact, associate, and think. We conclude by cautioning against the wildest promises of artificial intelligence but acknowledging the increasingly tight coupling between algorithmic processes, social structures, and subjectivities….(More)”.

Whose Streets? Our Streets!


Report by Rebecca Williams: “The extent to which “smart city” technology is altering our sense of freedom in public spaces deserves more attention if we want a democratic future. Democracy–the rule of the people–constitutes our collective self-determination and protects us against domination and abuse. Democracy requires safe spaces, or commons, for people to organically and spontaneously convene regardless of their background or position to campaign for their causes, discuss politics, and protest. These commons, where anyone can take a stand and be noticed, are where a notion of the collective good can be developed and communicated. Public spaces, like our streets, parks, and squares, have historically played a significant role in the development of democracy. We should fight to preserve the freedoms intrinsic to our public spaces because they make democracy possible.

Last summer, approximately 15 to 26 million people participated in Black Lives Matter protests after the murder of George Floyd, making it the largest mass movement in U.S. history. In June, the San Diego Police Department obtained footage of Black Lives Matter protesters from “smart streetlight” cameras, sparking shock and outrage from San Diego community members. These “smart streetlights” were promoted as part of citywide efforts to become a “smart city” to help with traffic control and air quality monitoring. Despite discoverable documentation about the streetlights’ capabilities and data policies on their website, including a data-sharing agreement about how they would share data with the police, the community had no expectation that the streetlights would be surveilling protestors. After media coverage and ongoing advocacy from the Transparent and Responsible Use of Surveillance Technology San Diego (TRUSTSD) coalition, the City Council set aside the funding for the streetlights until a surveillance technology ordinance was considered, and the Mayor ordered the 3,000+ streetlight cameras off. Due to the way power was supplied to the cameras, they remained on, but the city reported it no longer had access to the data they collected. In November, the City Council voted unanimously in favor of a surveillance ordinance and to establish a Privacy Advisory Board. In May, it was revealed that the San Diego Police Department had previously (in 2017) withheld materials from Congress’ House Committee on Oversight and Reform about their use of facial recognition technology. This story, with its mission creep and mishaps, is representative of a broader set of “smart city” cautionary trends that took place in the last year. These cautionary trends call us to question: if our public spaces become places where one fears punishment, how will that affect collective action and political movements?

This report is an urgent warning of where we are headed if we maintain our current trajectory of augmenting our public space with trackers of all kinds. In this report, I outline how current “smart city” technologies can watch you. I argue that all “smart city” technology trends toward corporate and state surveillance, and that if we don’t stop and blunt these trends now, totalitarianism, panopticonism, discrimination, privatization, and solutionism will challenge our democratic possibilities. This report examines these harms through cautionary trends supported by examples from this last year and provides 10 calls to action for advocates, legislatures, and technology companies to prevent these harms. If we act now, we can ensure that the technology in our public spaces protects and promotes democracy and that we do not continue down this path of an elite few tracking the many….(More)”

Designing data collaboratives to better understand human mobility and migration in West Africa


“The Big Data for Migration Alliance (BD4M) has released the report “Designing Data Collaboratives to Better Understand Human Mobility and Migration in West Africa,” providing findings from a first-of-its-kind rapid co-design and prototyping workshop, or “Studio.” The first BD4M Studio convened over 40 stakeholders in government, international organizations, research, civil society, and the public sector to develop concrete strategies for developing and implementing cross-sectoral data partnerships, or “data collaboratives,” to improve ethical and secure access to data for migration-related policymaking and research in West Africa.

BD4M is an effort spearheaded by the International Organization for Migration’s Global Migration Data Analysis Centre (IOM GMDAC), the European Commission’s Joint Research Centre (JRC), and The GovLab to accelerate the responsible and ethical use of novel data sources and methodologies—such as social media, mobile phone data, satellite imagery, and artificial intelligence—to support migration-related programming and policy at the global, national, and local levels. 

The BD4M Studio was informed by The Migration Domain of The 100 Questions Initiative — a global agenda-setting exercise to define the most impactful questions related to migration that could be answered through data collaboration. Inspired by the outputs of The 100 Questions, Studio participants designed data collaboratives that could produce answers to three key questions: 

  1. How can data be used to estimate current cross-border migration and mobility by sex and age in West Africa?
  2. How can data be used to assess the current state of diaspora communities and their migration behavior in the region?
  3. How can we use data to better understand the drivers of migration in West Africa?…(More)”
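
As a purely illustrative sketch of what an answer to the first question might involve, the snippet below aggregates invented, anonymized mobile phone records into origin-destination counts of cross-border moves broken down by sex and age group. The records, country codes, and logic are hypothetical, and any real data collaborative would add the ethical and secure data-access safeguards the BD4M effort emphasizes.

```python
# Hypothetical sketch: turn anonymized, monthly location records into
# origin-destination counts of cross-border moves by sex and age group.
# All records and country codes below are invented for illustration.

from collections import Counter

# Each record: (anonymous user id, month, country observed, sex, age_group)
records = [
    ("u1", "2021-01", "SEN", "F", "18-35"), ("u1", "2021-02", "MLI", "F", "18-35"),
    ("u2", "2021-01", "NGA", "M", "36-60"), ("u2", "2021-02", "NGA", "M", "36-60"),
    ("u3", "2021-01", "GHA", "M", "18-35"), ("u3", "2021-02", "CIV", "M", "18-35"),
]

def od_flows(records):
    """Count users whose observed country changes between consecutive months."""
    by_user = {}
    for uid, month, country, sex, age in sorted(records):
        by_user.setdefault(uid, []).append((month, country, sex, age))
    flows = Counter()
    for obs in by_user.values():
        for (_m1, c1, sex, age), (_m2, c2, _, _) in zip(obs, obs[1:]):
            if c1 != c2:  # a cross-border move between consecutive observations
                flows[(c1, c2, sex, age)] += 1
    return flows

for (origin, dest, sex, age), n in od_flows(records).items():
    print(f"{origin} -> {dest}  sex={sex} age={age}  movers={n}")
```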

Off-Label: How tech platforms decide what counts as journalism


Essay by Emily Bell: “…But putting a stop to militarized fascist movements—and preventing another attack on a government building—will ultimately require more than content removal. Technology companies need to fundamentally recalibrate how they categorize, promote, and circulate everything under their banner, particularly news. They have to acknowledge their editorial responsibility.

The extraordinary power of tech platforms to decide what material is worth seeing—under the loosest possible definition of who counts as a “journalist”—has always been a source of tension with news publishers. These companies have now been put in the position of being held accountable for developing an information ecosystem based in fact. It’s unclear how much they are prepared to do, if they will ever really invest in pro-truth mechanisms on a global scale. But it is clear that, after the Capitol riot, there’s no going back to the way things used to be.

Between 2016 and 2020, Facebook, Twitter, and Google made dozens of announcements promising to increase the exposure of high-quality news and get rid of harmful misinformation. They claimed to be investing in content moderation and fact-checking; they assured us that they were creating helpful products like the Facebook News Tab. Yet the result of all these changes has been hard to examine, since the data is both scarce and incomplete. Gordon Crovitz—a former publisher of the Wall Street Journal and a cofounder of NewsGuard, which applies ratings to news sources based on their credibility—has been frustrated by the lack of transparency: “In Google, YouTube, Facebook, and Twitter we have institutions that we know all give quality ratings to news sources in different ways,” he told me. “But if you are a news organization and you want to know how you are rated, you can ask them how these systems are constructed, and they won’t tell you.” Consider the mystery behind blue-check certification on Twitter, or the absurdly wide scope of the “Media/News” category on Facebook. “The issue comes down to a fundamental failure to understand the core concepts of journalism,” Crovitz said.

Still, researchers have managed to put together a general picture of how technology companies handle various news sources. According to Jennifer Grygiel, an assistant professor of communications at Syracuse University, “we know that there is a taxonomy within these companies, because we have seen them dial up and dial down the exposure of quality news outlets.” Internally, platforms rank journalists and outlets and make certain designations, which are then used to develop algorithms for personalized news recommendations and news products….(More)”
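Grygiel’s description of an internal taxonomy suggests the basic mechanics: a source-level quality designation can be blended with a personalized relevance score, and dialing the weight on quality up or down changes which outlets surface. The sketch below is a hypothetical illustration of that mechanism only; the tiers, weights, and formula are invented and do not describe any platform’s actual system.

```python
# Hypothetical sketch of how an internal source-quality designation might be
# blended with personalized relevance when ranking news items in a feed.
# Tiers, weights, and the scoring formula are invented for illustration only.

from dataclasses import dataclass

# Invented quality tiers a platform might assign to outlets internally.
QUALITY_TIER = {"authoritative": 1.0, "general": 0.6, "unrated": 0.3}

@dataclass
class NewsItem:
    title: str
    outlet_tier: str   # internal designation for the source
    relevance: float   # personalized relevance score, 0..1

def feed_score(item: NewsItem, quality_weight: float = 0.5) -> float:
    """Blend source quality and personal relevance into one ranking score."""
    quality = QUALITY_TIER.get(item.outlet_tier, QUALITY_TIER["unrated"])
    return quality_weight * quality + (1 - quality_weight) * item.relevance

items = [
    NewsItem("Local election results", "authoritative", relevance=0.4),
    NewsItem("Viral rumor roundup", "unrated", relevance=0.9),
]

# Raising quality_weight "dials up" rated outlets; lowering it favors raw engagement.
for item in sorted(items, key=feed_score, reverse=True):
    print(f"{feed_score(item):.2f}  {item.title}")
```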

It’s hard to be a moral person. Technology is making it harder.


Article by Sigal Samuel: “The idea of moral attention goes back at least as far as ancient Greece, where the Stoics wrote about the practice of attention (prosoché) as the cornerstone of a good spiritual life. In modern Western thought, though, ethicists didn’t focus too much on attention until a band of female philosophers came along, starting with Simone Weil.

Weil, an early 20th-century French philosopher and Christian mystic, wrote that “attention is the rarest and purest form of generosity.” She believed that to be able to properly pay attention to someone else — to become fully receptive to their situation in all its complexity — you need to first get your own self out of the way. She called this process “decreation,” and explained: “Attention consists of suspending our thought, leaving it detached, empty … ready to receive in its naked truth the object that is to penetrate it.”

Weil argued that plain old attention — the kind you use when reading novels, say, or birdwatching — is a precondition for moral attention, which is a precondition for empathy, which is a precondition for ethical action.

Later philosophers, like Iris Murdoch and Martha Nussbaum, picked up and developed Weil’s ideas. They garbed them in the language of Western philosophy; Murdoch, for example, appeals to Plato as she writes about the need for “unselfing.” But this central idea of “unselfing” or “decreation” is perhaps most reminiscent of Eastern traditions like Buddhism, which has long emphasized the importance of relinquishing our ego and training our attention so we can perceive and respond to others’ needs. It offers tools like mindfulness meditation for doing just that…(More)”

Who will benefit from big data? Farmers’ perspective on willingness to share farm data


Paper by Airong Zhang et al.: “Agricultural industries are facing the dual challenge of increasing production to meet the needs of a growing population under a disruptive, changing climate while, at the same time, reducing their environmental impacts. Digital agriculture supported by big data technology has been regarded as a solution to these challenges. However, realising the potential value promised by big data technology depends upon farm-level data generated by digital agriculture being aggregated at scale. Yet there is limited understanding of farmers’ willingness to contribute agricultural data for analysis and of how that willingness could be affected by their perceived beneficiary of the aggregated data.

The present study aimed to investigate farmers’ perspectives on who would benefit most from aggregated agricultural data, and their willingness to share their input and output farm data with a range of agricultural sector stakeholders (i.e. other farmers, industry and government statistical organisations, technology businesses, and research institutions). To do this, we conducted a computer-assisted telephone interview with 880 Australian farmers from broadacre agricultural sectors. The results show that only 34 % of participants regarded farmers as the primary beneficiary of aggregated agricultural data, while 35 % named agribusiness and 21 % named government as the main beneficiary. The participants’ willingness to share data was mostly positive. However, the level of willingness fluctuated depending on who was perceived as the primary beneficiary and with which stakeholder the data would be shared. While participants reported concerns over aggregated farm data being misused and over the privacy of their own farm data, the perception of farmers as the primary beneficiary led to the lowest levels of concern. The findings highlight that, to seize the opportunities of sustainable agriculture through applying big data technologies, significant value propositions need to be created to give farmers a reason to share data, and a higher level of trust between farmers and stakeholders, especially technology and service providers, needs to be established….(More)”.

Ethical Governance of Artificial Intelligence in the Public Sector


Book by Liza Ireni-Saban and Maya Sherman: “This book argues that ethical evaluation of AI should be an integral part of public service ethics and that an effective normative framework is needed to provide ethical principles and evaluation for decision-making in the public sphere, at both local and international levels.

It shows how the tenets of prudential rationality ethics, through critical engagement with intersectionality, can help to navigate the challenges created by technological innovations in AI and afford a relational, interactive, flexible and fluid framework suited to the features of AI research projects, so that core public and individual values are still honoured in the face of technological development….(More)”.