Group Backed by Top Companies Moves to Combat A.I. Bias in Hiring


Steve Lohr at The New York Times: “Artificial intelligence software is increasingly used by human resources departments to screen résumés, conduct video interviews and assess a job seeker’s mental agility.

Now, some of the largest corporations in America are joining an effort to prevent that technology from delivering biased results that could perpetuate or even worsen past discrimination.

The Data & Trust Alliance, announced on Wednesday, has signed up major employers across a variety of industries, including CVS Health, Deloitte, General Motors, Humana, IBM, Mastercard, Meta (Facebook’s parent company), Nike and Walmart.

The corporate group is not a lobbying organization or a think tank. Instead, it has developed an evaluation and scoring system for artificial intelligence software.

The Data & Trust Alliance, tapping corporate and outside experts, has devised a 55-question evaluation, which covers 13 topics, and a scoring system. The goal is to detect and combat algorithmic bias.

“This is not just adopting principles, but actually implementing something concrete,” said Kenneth Chenault, co-chairman of the group and a former chief executive of American Express, which has agreed to adopt the anti-bias tool kit…(More)”.
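The alliance's actual rubric is not published in this excerpt, but the mechanics of a questionnaire-plus-scoring tool are straightforward to picture. Here is a minimal, hedged sketch of how per-question scores might roll up into topic-level averages; the topics, scale, and field names are invented for illustration:

```python
# Hypothetical sketch of a questionnaire scoring roll-up. The real
# 55-question, 13-topic rubric is not public here; everything below
# (topics, 0-4 scale, field names) is an assumption for illustration.
answers = [
    {"topic": "training_data", "score": 3},  # scores on an assumed 0-4 scale
    {"topic": "training_data", "score": 4},
    {"topic": "bias_testing", "score": 2},
]

def topic_scores(answers):
    """Average the question scores within each topic."""
    totals, counts = {}, {}
    for a in answers:
        totals[a["topic"]] = totals.get(a["topic"], 0) + a["score"]
        counts[a["topic"]] = counts.get(a["topic"], 0) + 1
    return {t: totals[t] / counts[t] for t in totals}

print(topic_scores(answers))  # average score per topic
```

A vendor's overall result could then be compared against per-topic thresholds rather than a single aggregate number.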

What Biden’s Democracy Summit Is Missing


Essay by Hélène Landemore: “U.S. President Joe Biden is set to host a virtual summit this week for leaders from government, civil society, and the private sector to discuss the renewal of democracy. We can expect to see plenty of worthy yet predictable issues discussed: the threat of foreign agents interfering in elections, online disinformation, political polarization, and the temptation of populist and authoritarian alternatives. For the United States specifically, the role of money in politics, partisan gerrymandering, endless gridlock in Congress, and the recent voter suppression efforts targeting Black communities in the South should certainly be on the agenda.

All are important and relevant topics. Something more fundamental, however, is needed.

The clear erosion of our political institutions is just the latest evidence, if any more was needed, that it’s past time to discuss what democracy actually means—and why we should care about it. We have to question, moreover, whether the political systems we have are even worth restoring or if we should more substantively alter them, including through profound constitutional reforms.

Such a discussion has never been more vital. The systems in place today once represented a clear improvement on prior regimes—monarchies, theocracies, and other tyrannies—but it may be a mistake to call them adherents of democracy at all. The word roughly translates from its original Greek as “people’s power.” But the people writ large don’t hold power in these systems. Elites do. Consider that in the United States, according to a 2014 study by the political scientists Martin Gilens and Benjamin Page, only the richest 10 percent of the population seems to have any causal effect on public policy. The other 90 percent, they argue, is left with “democracy by coincidence”—getting what they want only when they happen to want the same thing as the people calling the shots.

This discrepancy between reality—democracy by coincidence—and the ideal of people’s power is baked in as a result of fundamental design flaws dating back to the 18th century. The only way to rectify those mistakes is to rework the design—to fully reimagine what it means to be democratic. Tinkering at the edges won’t do….(More)”

“If Everybody’s White, There Can’t Be Any Racial Bias”: The Disappearance of Hispanic Drivers From Traffic Records


Article by Richard A. Webster: “When sheriff’s deputies in Jefferson Parish, Louisiana, pulled over Octavio Lopez for an expired inspection tag in 2018, they wrote on his traffic ticket that he is white. Lopez, who is from Nicaragua, is Hispanic and speaks only Spanish, said his wife.

In fact, of the 167 tickets issued by deputies to drivers with the last name Lopez over a nearly six-year span, not one of the motorists was labeled as Hispanic, according to records provided by the Jefferson Parish clerk of court. The same was true of the 252 tickets issued to people with the last name of Rodriguez, 234 named Martinez, 223 with the last name Hernandez and 189 with the surname Garcia.
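The anomaly the reporters describe is easy to surface once ticket records are tabulated: group tickets by surname and count the recorded race labels. The sketch below is illustrative only; the records and field names are invented, not the Jefferson Parish data:

```python
from collections import Counter

# Hypothetical sketch of the tabulation described above: tally the race
# recorded on each ticket, grouped by driver surname. These records and
# field names are invented for illustration.
tickets = [
    {"surname": "Lopez", "recorded_race": "white"},
    {"surname": "Lopez", "recorded_race": "white"},
    {"surname": "Rodriguez", "recorded_race": "white"},
    {"surname": "Smith", "recorded_race": "white"},
]

def race_counts_by_surname(tickets):
    """Return {surname: Counter of recorded race labels}."""
    counts = {}
    for t in tickets:
        counts.setdefault(t["surname"], Counter())[t["recorded_race"]] += 1
    return counts

for surname, counts in race_counts_by_surname(tickets).items():
    hispanic = counts["hispanic"]  # Counter returns 0 for missing labels
    total = sum(counts.values())
    print(f"{surname}: {hispanic}/{total} tickets labeled Hispanic")
```

Run against real ticket data, a result of zero Hispanic labels across hundreds of tickets for common Spanish surnames is exactly the pattern the clerk-of-court records showed.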

This kind of misidentification is widespread — and not without harm. Across America, law enforcement agencies have been accused of targeting Hispanic drivers, failing to collect data on those traffic stops, and covering up potential officer misconduct and aggressive immigration enforcement by identifying people as white on tickets.

“If everybody’s white, there can’t be any racial bias,” Frank Baumgartner, a political science professor at the University of North Carolina at Chapel Hill, told WWNO/WRKF and ProPublica.

Nationally, states have tried to patch this data loophole and tighten controls against racial profiling. In recent years, legislators have passed widely hailed traffic stop data-collection laws in California, Colorado, Illinois, Oregon, Virginia and Washington, D.C. This April, Alabama became the 22nd state to enact similar legislation.

Though Louisiana has had its own data-collection requirement for two decades, it contains a loophole unlike any other state: It exempts law enforcement agencies from collecting and delivering data to the state if they have an anti-racial-profiling policy in place. This has rendered the law essentially worthless, said Josh Parker, a senior staff attorney at the Policing Project, a public safety research nonprofit at the New York University School of Law.

Louisiana State Rep. Royce Duplessis, D-New Orleans, attempted to remove the exemption two years ago, but law enforcement agencies protested. Instead, he was forced to convene a task force to study the issue, which thus far hasn’t produced any results, he said.

“They don’t want the data because they know what it would reveal,” Duplessis said of law enforcement agencies….(More)”.

A Fix-It Job for Government Tech


Shira Ovide at the New York Times: “U.S. government technology has a mostly deserved reputation for being expensive and awful.

Computer systems sometimes operate with Sputnik-era software. A Pentagon project to modernize military technology has little to show after five years. During the coronavirus pandemic, millions of Americans struggled to get government help like unemployment insurance, vaccine appointments and food stamps because of red tape, inflexible technology and other problems.

Whether you believe that the government should be more involved in Americans’ lives or less, taxpayers deserve good value for the technology we pay for. And we often don’t get it. It’s part of Robin Carnahan’s job to take on this problem.

A former secretary of state for Missouri and a government tech consultant, Carnahan had been one of my guides to how public sector technology could work better. Then in June, she was confirmed as the administrator of the General Services Administration, the agency that oversees government acquisitions, including of technology.

Carnahan said that she and other Biden administration officials wanted technology used for fighting wars or filing taxes to be as efficient as our favorite app.

“Bad technology sinks good policy,” Carnahan told me. “We’re on a mission to make government tech more user-friendly and be smarter about how we buy it and use it.”

Carnahan highlighted three areas she wanted to address: First, change the process for government agencies to buy technology to recognize that tech requires constant updates. Second, simplify the technology for people using government services. And third, make it more appealing for people with tech expertise to work for the government, even temporarily.

All of that is easier said than done, of course. People in government have promised similar changes before, and it’s not a quick fix. Technology dysfunction is also often a symptom of poor policies.

But in Carnahan’s view, one way to build faith in government is to prove that it can be competent. And technology is an essential area to show that…(More)”.

Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them


Article by Aaron Sankin, Dhruv Mehrotra for Gizmodo, Surya Mattu, and Annie Gilbertson: “Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime prediction software called PredPol.

The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.

The Markup and Gizmodo analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.

These communities weren’t just targeted more—in some cases they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years. A few neighborhoods in our data were the subject of more than 11,000 predictions.

The software often recommended daily patrols in and around public and subsidized housing, targeting the poorest of the poor.

“Communities with troubled relationships with police—this is not what they need,” said Jay Stanley, a senior policy analyst at the ACLU Speech, Privacy, and Technology Project. “They need resources to fill basic social needs.”…(More)”.
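The core of The Markup and Gizmodo's analysis is a join between prediction counts and neighborhood demographics. A minimal sketch of that comparison is below; the block groups, percentages, and counts are invented placeholders, not the outlet's actual data:

```python
# Illustrative sketch of the demographic comparison described above.
# These block groups and numbers are invented, not The Markup's data.
block_groups = [
    {"pct_white": 0.85, "median_income": 92_000, "predictions": 0},
    {"pct_white": 0.78, "median_income": 71_000, "predictions": 12},
    {"pct_white": 0.22, "median_income": 31_000, "predictions": 9_400},
    {"pct_white": 0.18, "median_income": 28_000, "predictions": 11_250},
]

def mean_predictions(groups):
    return sum(g["predictions"] for g in groups) / len(groups)

# Split neighborhoods by demographic majority and compare averages.
majority_white = [g for g in block_groups if g["pct_white"] >= 0.5]
majority_nonwhite = [g for g in block_groups if g["pct_white"] < 0.5]

ratio = mean_predictions(majority_nonwhite) / mean_predictions(majority_white)
print(f"Majority-nonwhite areas received {ratio:.0f}x more predictions")
```

The same structure extends to income, subsidized-housing proximity, or free-lunch eligibility, which is how the reporters found the patterns described above.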

How Courts Embraced Technology, Met the Pandemic Challenge, and Revolutionized Their Operations


Report by The Pew Charitable Trusts: “To begin to assess whether, and to what extent, the rapid improvements in court technology undertaken in 2020 and 2021 made the civil legal system easier to navigate, The Pew Charitable Trusts examined pandemic-related emergency orders issued by the supreme courts of all 50 states and Washington, D.C. The researchers supplemented that review with an analysis of court approaches to virtual hearings, e-filing, and digital notarization, with a focus on how these tools affected litigants in three of the most common types of civil cases: debt claims, evictions, and child support. The key findings of this research are:

  • Civil courts’ adoption of technology was unprecedented in pace and scale. Despite having almost no history of using remote civil court proceedings, beginning in March 2020 every state and D.C. initiated online hearings at record rates to resolve many types of cases. For example, the Texas court system, which had never held a civil hearing via video before the pandemic, conducted 1.1 million remote proceedings across its civil and criminal divisions between March 2020 and February 2021. Similarly, Michigan courts held more than 35,000 video hearings totaling nearly 200,000 hours between April 1 and June 1, 2020, compared with no such hearings during the same two months in 2019. Courts moved other routine functions online as well. Before the pandemic, 37 states and D.C. allowed people without lawyers to electronically file court documents in at least some civil cases. But since March 2020, 10 more states have created similar processes, making e-filing available to more litigants in more jurisdictions and types of cases. In addition, after 11 states and D.C. made pandemic-driven changes to their policies on electronic notarization (e-notarization), 42 states and D.C. either allowed it or had waived notarization requirements altogether as of fall 2020.
  • Courts leveraged technology not only to stay open, but also to improve participation rates and help users resolve disputes more efficiently. Arizona civil courts, for example, saw an 8% drop year-over-year in June 2020 in the rate of default, or automatic, judgment—which results when defendants fail to appear in court—indicating an increase in participation. Although national and other state data is limited, court officials across the country, including judges, administrators, and attorneys, report increases in civil court appearance rates.
  • The accelerated adoption of technology disproportionately benefited people and businesses with legal representation—and in some instances, made the civil legal system more difficult to navigate for those without...(More)”.

NativeDATA


About: “NativeDATA is a free online resource that offers practical guidance for Tribes and Native-serving organizations. For this resource, Native-serving organizations include Tribal and urban Indian organizations and Tribal Epidemiology Centers (TECs).

Tribal and urban Indian communities need correct health information (data), so that community leaders can:

  • Watch disease trends
  • Respond to health threats
  • Create useful health policies…

Throughout, this resource offers practical guidance for obtaining and sharing health data in ways that honor Tribal sovereignty, data sovereignty, and public health authority. Public health authority is the authority of a sovereign government to protect the health, safety, and welfare of its citizens. As sovereign nations, Tribes have the power to define how they will use this authority to protect and promote the health of their communities. The federal government recognizes Tribes and Tribal Epidemiology Centers (TECs) as public health authorities under federal law.

Inside you will find expert advice to help you…(More)”.

Academic Incentives and Research Impact: Developing Reward and Recognition Systems to Better People’s Lives


Report by Jonathan Grant: “…offers new strategies to increase the societal impact that health research can have on the community and critiques the existing academic reward structure that determines the career trajectories of so many academics—including tenure, peer-reviewed publication, citations, and grant funding, among others. The new assessment illustrates how these incentives can lead researchers to produce studies as an end-goal, rather than pursuing impact by applying the work in real world settings.

Dr. Grant also outlines new system-, institution-, and person-level changes to academic incentives that, if implemented, could make societal impact an integral part of the research process. Among the changes offered by Dr. Grant are tying a percentage of grant funding to the impact the research has on the community, breaking from the tenure model to incentivize ongoing development and quality research, and encouraging academics themselves to prioritize social impact when submitting or reviewing research and grant proposals…(More)”.

Open Data Standard and Analysis Framework: Towards Response Equity in Local Governments


Paper by Joy Hsu, Ramya Ravichandran, Edwin Zhang, and Christine Keung: “There is an increasing need for open data in governments and systems to analyze equity at large scale. Local governments often lack the necessary technical tools to identify and tackle inequities in their communities. Moreover, these tools may not generalize across departments and cities nor be accessible to the public. To this end, we propose a system that facilitates centralized analyses of publicly available government datasets through 1) a US Census-linked API, 2) an equity analysis playbook, and 3) an open data standard to regulate data intake and support equitable policymaking….(More)”.
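The paper's Census-linked approach boils down to joining city datasets, keyed by census geography, to tract-level demographics so that raw counts become comparable rates. A minimal sketch under that assumption follows; the tract IDs, dataset, and figures are invented, not the authors' API:

```python
# Hedged sketch of a census-linked equity analysis: join a city dataset
# (keyed by census tract) to tract populations and compute per-capita
# rates. Tract IDs and figures below are invented for illustration.
service_requests = {"06081600100": 840, "06081600200": 120}  # requests per tract
tract_population = {"06081600100": 4_200, "06081600200": 3_000}

def per_capita_rate(requests, population):
    """Per-resident rate for every tract present in both datasets."""
    return {
        tract: requests[tract] / population[tract]
        for tract in requests
        if tract in population
    }

rates = per_capita_rate(service_requests, tract_population)
# e.g. 840 / 4200 = 0.2 requests per resident in the first tract
```

Normalizing by population (or by demographic subgroup) is what lets a department ask whether responses are equitable rather than merely counting them, which is the gap the proposed data standard aims to close.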

New York City passed a bill requiring ‘bias audits’ of AI hiring tech


Kate Kaye at Protocol: “Let the AI auditing vendor brigade begin. A year after it was introduced, the New York City Council passed a bill earlier this week requiring companies that sell AI technologies for hiring to obtain audits assessing the potential of those products to discriminate against job candidates. The bill requiring “bias audits” passed with overwhelming support in a 38-4 vote.

The bill is intended to weed out the use of tools that enable already unlawful employment discrimination in New York City. If signed into law, it will require providers of automated employment decision tools to have those systems evaluated each year by an audit service and provide the results to companies using those systems.

AI for recruitment can include software that uses machine learning to sift through resumes and help make hiring decisions, systems that attempt to decipher the sentiments of a job candidate, or even tech involving games to pick up on subtle clues about someone’s hiring worthiness. The NYC bill attempts to encompass the full gamut of AI by covering everything from old-school decision trees to more complex systems operating through neural networks.

The legislation calls on companies using automated decision tools for recruitment not only to tell job candidates when they’re being used, but to tell them what information the technology used to evaluate their suitability for a job.

The bill, however, fails to go into detail on what constitutes a bias audit other than to define one as “an impartial evaluation” that involves testing. And it already has critics who say it was rushed into passage and doesn’t address discrimination related to disability or age…(More)”.
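The bill leaves the audit itself undefined, but one long-standing baseline test in employment contexts is the selection-rate impact ratio, the EEOC's "four-fifths rule": a group whose screen-in rate falls below 80% of the highest group's rate is flagged for possible adverse impact. A minimal sketch with invented screening outcomes (the bill does not prescribe this test):

```python
# Sketch of a selection-rate impact ratio check (the "four-fifths rule"),
# a common baseline in employment bias testing. The groups and outcomes
# below are invented; the NYC bill does not mandate this specific test.
outcomes = {
    # group: (candidates screened in, total candidates)
    "group_a": (60, 100),
    "group_b": (33, 100),
}

def impact_ratios(outcomes):
    """Each group's selection rate relative to the highest-rate group."""
    rates = {g: passed / total for g, (passed, total) in outcomes.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

ratios = impact_ratios(outcomes)
flagged = [g for g, r in ratios.items() if r < 0.8]  # below four-fifths
print(flagged)  # group_b's rate (0.33) is 55% of group_a's (0.60)
```

Whether an "impartial evaluation" under the bill means this, a richer statistical battery, or something else entirely is exactly the ambiguity critics are pointing at.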