How Native Americans Are Trying to Debug A.I.’s Biases


Alex V. Cipolle in The New York Times: “In September 2021, Native American technology students in high school and college gathered at a conference in Phoenix and were asked to create photo tags — word associations, essentially — for a series of images.

One image showed ceremonial sage in a seashell; another, a black-and-white photograph circa 1884, showed hundreds of Native American children lined up in uniform outside the Carlisle Indian Industrial School, one of the most prominent boarding schools run by the American government during the 19th and 20th centuries.

For the ceremonial sage, the students chose the words “sweetgrass,” “sage,” “sacred,” “medicine,” “protection” and “prayers.” They gave the photo of the boarding school tags with a different tone: “genocide,” “tragedy,” “cultural elimination,” “resiliency” and “Native children.”

The exercise was for the workshop Teaching Heritage to Artificial Intelligence Through Storytelling at the annual conference for the American Indian Science and Engineering Society. The students were creating metadata that could train a photo recognition algorithm to understand the cultural meaning of an image.

The workshop presenters — Chamisa Edmo, a technologist and citizen of the Navajo Nation, who is also Blackfeet and Shoshone-Bannock; Tracy Monteith, a senior Microsoft engineer and member of the Eastern Band of Cherokee Indians; and the journalist Davar Ardalan — then compared these answers with those produced by a major image recognition app.

For the ceremonial sage, the app’s top tag was “plant,” but other tags included “ice cream” and “dessert.” The app tagged the school image with “human,” “crowd,” “audience” and “smile” — the last a particularly odd descriptor, given that few of the children are smiling.

The image recognition app botched its task, Mr. Monteith said, because it didn’t have proper training data. Ms. Edmo explained that tagging results are often “outlandish” and “offensive,” recalling how one app identified a Native American person wearing regalia as a bird. And yet similar image recognition apps have identified with ease a St. Patrick’s Day celebration, Ms. Ardalan noted as an example, because of the abundance of data on the topic….(More)”.
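The gap the workshop exposed can be made concrete with a small sketch. This is purely illustrative (the function and variable names are invented, not from the workshop), but it shows how the students' community-authored tags constitute training metadata, and how little overlap a generic model's output has with them:

```python
# Hypothetical sketch: community-authored tags as training metadata,
# compared against a generic app's output using Jaccard set overlap.

def jaccard(a, b):
    """Similarity between two tag sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Tags the students assigned to the ceremonial-sage image (from the article)
community_tags = ["sweetgrass", "sage", "sacred", "medicine",
                  "protection", "prayers"]

# Tags the image recognition app returned for the same image
app_tags = ["plant", "ice cream", "dessert"]

overlap = jaccard(community_tags, app_tags)
print(f"tag overlap: {overlap:.2f}")  # no shared tags at all
```

Run on these two tag sets, the overlap is zero: without culturally informed training data of the kind the students produced, the model has nothing connecting the image to its meaning.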

The first answer for food insecurity: data sovereignty


Interview by Brian Oaster: “For two years now, the COVID-19 pandemic has exacerbated almost every structural inequity in Indian Country. Food insecurity is high on that list.

Like other inequities, it’s an intergenerational product of dispossession and congressional underfunding — nothing new for Native communities. What is new, however, is the ability of Native organizations and sovereign nations to collectively study and understand the needs of the many communities facing the issue. The age of data sovereignty has (finally) arrived.

To that end, the Native American Agriculture Fund (NAAF) partnered with the Indigenous Food and Agriculture Initiative (IFAI) and the Food Research and Action Center (FRAC) to produce a special report, Reimagining Hunger Responses in Times of Crisis, which was released in January.

According to the report, 48% of the more than 500 Native respondents surveyed across the country agreed that “sometimes or often during the pandemic the food their household bought just didn’t last, and they didn’t have money to get more.” Food security and access were especially low among Natives with young children or elders at home, people in fair to poor health and those whose employment was disrupted by the pandemic. “Native households experience food insecurity at shockingly higher rates than the general public and white households,” the report noted.

It also detailed how, throughout the pandemic, Natives overwhelmingly turned to their tribal governments and communities — as opposed to state or federal programs — for help. State and federal programs, like the Supplemental Nutrition Assistance Program, or SNAP, don’t always mesh with the needs of rural reservations. A benefits card is useless if there’s no food store in your community. In response, tribes and communities came together and worked to get their people fed.

Understanding how and why will help pave the way for legislation that empowers tribes to provide for their own people, by using federal funding to build local agricultural infrastructure, for instance, instead of relying on assistance programs that don’t always work. HCN spoke with the Native American Agriculture Fund’s CEO, Toni Stanger-McLaughlin (Colville), to find out more…(More)”.

Towards a Standard for Identifying and Managing Bias in Artificial Intelligence


NIST Report: “As individuals and communities interact in and with an environment that is increasingly virtual, they are often vulnerable to the commodification of their digital exhaust. Concepts and behavior that are ambiguous in nature are captured in this environment, quantified, and used to categorize, sort, recommend, or make decisions about people’s lives. While many organizations seek to utilize this information in a responsible manner, biases remain endemic across technology processes and can lead to harmful impacts regardless of intent. These harmful outcomes, even if inadvertent, create significant challenges for cultivating public trust in artificial intelligence (AI)….(More)”

Crypto’s “Freedom to Transact” May Actually Threaten Human Rights


Essay by Elizabeth M. Renieris: “What began as a small convoy of truck drivers protesting COVID-19 vaccine mandates in late January quickly grew to a large-scale protest blocking nearly $350 million a day in trade and crippling the transport of vital supplies across the US-Canada border for more than three weeks. After struggling to disband the protesters, Canadian Prime Minister Justin Trudeau invoked the Emergencies Act for the first time since its passage in 1988, compelling financial institutions to freeze the assets of protesters and urging local cryptocurrency exchanges not to process transactions from 253 bitcoin addresses suspected of supporting their efforts. Cryptocurrency promoters responded with outrage, siding with truckers, and calling Trudeau’s actions authoritarian, even comparing the Canadian prime minister to Hitler.

Days later, Russian President Vladimir Putin plunged the world into geopolitical instability with a full-scale unprovoked military invasion of Ukraine, resulting in mounting civilian casualties and sparking the biggest refugee crisis since the Second World War. Fearing the ramifications of a military response, governments around the world imposed an array of targeted financial sanctions, freezing and seizing the assets of Russian politicians and oligarchs, blocking transactions with Russia’s central bank and removing others from the SWIFT international payments network. Companies, including legacy payment processors Mastercard and Visa and tech platforms Apple Pay and Google Pay, followed with similar measures. However, as with the Canadian truckers, cryptocurrency exchanges have resisted similar steps, even when implored by Ukrainian officials, with one CEO remarking that sanctioning Russian users would “fly in the face of the reason crypto exists” — namely, for the “freedom to transact.”

As recently summarized by one journalist, the freedom to transact is a core tenet of crypto-libertarian ideology whereby “the individual is sovereign, and the state has no authority to limit what a person can do with their assets, digital or otherwise,” and money is magically apolitical. An extension of the same school of thought that elevates economic freedom above all other social, cultural and political interests, the freedom to transact is increasingly invoked by cryptocurrency promoters and right-wing politicians, who share similar ideological leanings, in response to measures by governments and private sector actors to impose political consequences through economic means, including in situations such as the Canadian truckers’ blockade or Russia’s recent assault on Ukraine…(More)”.

Privacy As/And Civil Rights


Paper by Tiffany C. Li: “Decades have passed since the modern American civil rights movement began, but the fight for equality is far from over. Systemic racism, sexism, and discrimination against many marginalized groups are still rampant in our society. Tensions rose to a fever pitch in 2020, with a summer of Black Lives Matter protests, sparked by the police killing of George Floyd, leading into an attempted armed insurrection and attack on the U.S. Capitol on January 6, 2021. Asian-Americans faced rising rates of racism and hate crimes, spurred in part by inflammatory statements from the then-sitting President of the United States. Members of the LGBT community faced attacks on their civil rights during the Trump administration, including a rolling back of protections awarded to transgender individuals.

At the same time, the world faced a deadly pandemic that exposed the inequalities tearing the fabric of our society. The battle for civil rights is clearly not over, and the nation and the world have faced setbacks in the fight for equality, brought on by the pandemic, political pressures, and other factors. Meanwhile, the role of technology is also changing, with new technologies like facial recognition, artificial intelligence, and connected devices posing new threats, and perhaps offering new hope, for civil rights. To understand privacy at our current point in time, we must consider the role of privacy in civil rights—and even, as scholars like Alvaro Bedoya have suggested, privacy itself as a civil right.

This Article is an attempt to expand upon the work of privacy and civil rights scholars in conceptualizing privacy as a civil right and situating this concept within the broader field of privacy studies. This Article builds on the work of scholars who have analyzed critical dimensions of privacy and privacy law, and who have advocated for changes in privacy law that can move our society forward to protect privacy and equality for all…(More)”.

Web3 and the Trap of ‘For Good’


Article by Scott Smith & Lina Srivastava: “There are three linked challenges baked into Web3 that any proponent of positive social impact must solve.

1. Decentralized tech doesn’t equal distributed power. Web3 has become synonymous with the decentralized web, and one of the selling points of Web3 technologies is decentralization or shared ownership of web infrastructure. But in reality, ownership is too often centralized by and for those with resources already, the wealthy (even if only coin-wealthy) and corporations.

As the example of NFT marketplace OpenSea demonstrates, risks are too easily distributed onto the users, even as the gains remain very much centralized for platform owners and a small minority of participants. Even Ethereum co-creator Vitalik Buterin has issued warnings about power concentration in Web3 token-based economies, saying crypto “whales” can have too much power in these economies. Systems become inherently extractive unless ownership is shared and distributed by a majority, particularly by those who are traditionally most vulnerable to exploitation.

For this reason, equitable power structures must be proactively designed in Web3 systems.

2. A significant percentage of existing power holders are already building their Web3 business models on exploitation and extraction. At present, these business models mine energy and other resources to the detriment of our climate and environment and of energy-poor communities, in some cases actively resuscitating wasteful or harmful power projects. They do so without addressing these concerns in their core business model, or even offsetting them — a less desirable alternative, but still better than nothing.

These models are meant to avoid accountability to platform users or vulnerable communities in either economic or environmental terms. But they nevertheless ask for our trust?

3. Building community trust takes more than decentralization. Those who are building over distributed technologies often claim it as a solution to a trust deficit, that “trust” is inherent to the systems. Except that it isn’t…(More)”

Privacy and/or Trade


Paper by Anupam Chander and Paul M. Schwartz: “International privacy and trade law developed together, but now are engaged in significant conflict. Current efforts to reconcile the two are likely to fail, and the result for globalization favors the largest international companies able to navigate the regulatory thicket. In a landmark finding, this Article shows that more than sixty countries outside the European Union are now evaluating whether foreign countries have privacy laws that are adequate to receive personal data. This core test for deciding on the permissibility of global data exchanges is currently applied in a nonuniform fashion with ominous results for the data flows that power trade today.

The promise of a global internet, with access for all, including companies from the Global South, is increasingly remote. This Article uncovers the forgotten and fateful history of the international regulation of privacy and trade that led to our current crisis and evaluates possible solutions to the current conflict. It proposes a Global Agreement on Privacy enforced within the trade order, but with external data privacy experts developing the treaty’s substantive norms….(More)”.

How privacy’s past may shape its future


Essay by Alessandro Acquisti, Laura Brandimarte and Jeff Hancock: “Continued expansion of human activities into digital realms gives rise to concerns about digital privacy and its invasions, often expressed in terms of data rights and internet surveillance. It may thus be tempting to construe privacy as a modern phenomenon—something our ancestors lacked and technological innovation and urban growth made possible. Research from history, anthropology, and ethnography suggests otherwise. The evidence for peoples seeking to manage the boundaries of private and public spans time and space, social class, and degree of technological sophistication. Privacy—not merely the hiding of data, but the selective opening and closing of the self to others—appears to be both culturally specific and culturally universal. But what could explain the simultaneous universality and diversity of a human drive for privacy? An account of the evolutionary roots of privacy may offer an answer and teach us about privacy’s digital future and how to manage it….(More)”.

Free Speech: A History from Socrates to Social Media


Book by Jacob Mchangama: A global history of free speech, from the ancient world to today. Hailed as the “first freedom,” free speech is the bedrock of democracy. But it is a challenging principle, subject to erosion in times of upheaval. Today, in democracies and authoritarian states around the world, it is on the retreat.

In Free Speech, Jacob Mchangama traces the riveting legal, political, and cultural history of this idea. Through captivating stories of free speech’s many defenders—from the ancient Athenian orator Demosthenes and the ninth-century freethinker al-Rāzī, to the anti-lynching crusader Ida B. Wells and modern-day digital activists—Mchangama reveals how the free exchange of ideas underlies all intellectual achievement and has enabled the advancement of both freedom and equality worldwide. Yet the desire to restrict speech, too, is a constant, and he explores how even its champions can be led down this path when the rise of new and contrarian voices challenges power and privilege of all stripes.

Meticulously researched and deeply humane, Free Speech demonstrates how much we have gained from this principle—and how much we stand to lose without it…(More)”.

Oversight Board publishes policy advisory opinion on the sharing of private residential information


Press Release by Oversight Board: “Last year, Meta requested a policy advisory opinion from the Board on the sharing of private residential addresses and images, and the contexts in which this information may be published on Facebook and Instagram. Meta considers this a difficult question because, while access to such information can be relevant to journalism and civic activism, “exposing this information without consent can create a risk to residents’ safety and infringe on an individual’s privacy.”

Meta’s request noted several potential harms linked to releasing personal information, including residential addresses and images. These include “doxing” (a term derived from “dox,” an abbreviation of “documents”), where information that can identify someone is revealed online. Meta noted that doxing can have negative real-world consequences, such as harassment or stalking…

The Board understands that the sharing of private residential addresses and images represents a potentially serious violation of the right to privacy both for people who use Facebook and Instagram, and those who do not.

Once this information is shared, the harms that can result, such as doxing, are difficult to remedy. Harms resulting from doxing disproportionately affect groups such as women, children and LGBTQIA+ people, and can include emotional distress, loss of employment and even physical harm or death.

As the potential for harm is particularly context specific, it is challenging to develop objective and universal indicators that would allow content reviewers to distinguish the sharing of content that would be harmful from shares that would not be. That is why the Board believes that the Privacy Violations policy should be more protective of privacy.

International human rights standards permit necessary and proportionate restrictions on expression to protect people’s right to privacy. As such, the Board favors narrowing the exceptions to the Privacy Violations policy to help Meta better protect the private residential information of people both on and off its platforms.

In exchanges with the Board, Meta stressed that “ensuring that the ‘publicly available’ definition does not exempt content from removal that poses a risk of offline harm” is a “persistent concern.” Public records and other sources of what could be considered “publicly available” information still require resources and effort to be accessed by the general public. On social media, however, such information may be shared and accessed more quickly, and on a much bigger scale, which significantly increases the risk of harm. As such, the Board proposes removing the “publicly available” exception for the sharing of both private residential addresses and images that meet certain criteria….(More)”.
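The Board's proposal can be read as a change to a moderation decision rule. The sketch below is purely illustrative — the field and function names are invented, and it simplifies the Board's criteria to the single change discussed above: whether "publicly available" still exempts a non-consensual share of a private residential address from removal.

```python
# Illustrative toy model of the policy change, not Meta's actual logic.
from dataclasses import dataclass

@dataclass
class Post:
    shares_residential_address: bool  # does the post reveal a home address?
    subject_consented: bool           # did the resident consent to sharing?
    publicly_available: bool          # is the address in public records?

def should_remove(post: Post, board_proposal: bool) -> bool:
    """Decide removal under either the current policy or the Board's proposal."""
    if not post.shares_residential_address or post.subject_consented:
        return False
    # Current policy: "publicly available" information is exempt from removal.
    # Board's proposal: that exemption no longer applies to home addresses.
    if post.publicly_available and not board_proposal:
        return False
    return True

# A non-consensual share of an address found in public records:
post = Post(shares_residential_address=True,
            subject_consented=False,
            publicly_available=True)
print(should_remove(post, board_proposal=False))  # False: exempted today
print(should_remove(post, board_proposal=True))   # True: removed under proposal
```

The comparison makes the Board's reasoning concrete: the same post is exempt under the current "publicly available" carve-out but removable under the proposed narrower exception.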