Paper by Meghan J. Ryan: “Burgeoning science and technology have provided the criminal justice system with the opportunity to address some of its shortcomings. And the criminal justice system has significant shortcomings. Among other issues, we have a mass incarceration problem; clearance rates are surprisingly low; there are serious concerns about wrongful convictions; and the system is layered with racial, religious, and other biases. Innovations that are widely used across industries, as well as those directed specifically at the criminal justice system, have the potential to improve upon such problems. But it is important to recognize that these innovations also have downsides, and criminal justice actors must proceed with caution and understand not only the potential of these interventions but also their limitations. Relevant to this calculation of caution is whether the innovation is broadly used across industry sectors or, rather, whether it has been specifically developed for use within the criminal justice system. These latter innovations have a record of not being sufficiently vetted for accuracy and reliability. Accordingly, criminal justice actors must be sufficiently well versed in basic science and technology so that they have the ability and the confidence to critically assess the usefulness of the various criminal justice innovations in light of their limitations. Considering lawyers’ general lack of competency in these areas, scientific and technological training is necessary to mold them into modern competent criminal justice actors. 
This training must be more than superficial subject-specific training, though; it must dig deeper, delving into critical thinking skills that include evaluating the accuracy and reliability of the innovation at issue, as well as assessing broader concerns such as the need for development transparency, possible intrusions on individual privacy, and incentives to curtail individual liberties given the innovation at hand….(More)”
Paper by Sara B. Heller, Benjamin Jakubowski, Zubin Jelveh & Max Kapustin: “This paper shows that shootings are predictable enough to be preventable. Using arrest and victimization records for almost 644,000 people from the Chicago Police Department, we train a machine learning model to predict the risk of being shot in the next 18 months. We address central concerns about police data and algorithmic bias by predicting shooting victimization rather than arrest, which we show accurately captures risk differences across demographic groups despite bias in the predictors. Out-of-sample accuracy is strikingly high: of the 500 people with the highest predicted risk, 13 percent are shot within 18 months, a rate 130 times higher than the average Chicagoan. Although Black male victims more often have enough police contact to generate predictions, those predictions are not, on average, inflated; the demographic composition of predicted and actual shooting victims is almost identical. There are legal, ethical, and practical barriers to using these predictions to target law enforcement. But using them to target social services could have enormous preventive benefits: predictive accuracy among the top 500 people justifies spending up to $123,500 per person for an intervention that could cut their risk of being shot in half….(More)”.
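The abstract's headline numbers imply a simple break-even calculation. The sketch below reconstructs it under stated assumptions: the ~$1.9M social cost per shooting is derived from the figures quoted (it is not stated in the excerpt), and the 50% risk reduction is the hypothetical intervention effect the authors posit.

```python
# Hedged sketch of the cost-benefit arithmetic implied by the paper's
# headline numbers. The implied ~$1.9M social cost per shooting is a
# derived assumption (123,500 / 0.065), not a figure quoted in the excerpt.

top_500_risk = 0.13        # 13% of the 500 highest-risk people shot within 18 months
risk_reduction = 0.5       # intervention assumed to cut that risk in half
shootings_averted_per_person = top_500_risk * risk_reduction  # 0.065

max_spend_per_person = 123_500  # break-even spend cited in the abstract
implied_cost_per_shooting = max_spend_per_person / shootings_averted_per_person

print(f"Shootings averted per treated person: {shootings_averted_per_person:.3f}")
print(f"Implied social cost per shooting: ${implied_cost_per_shooting:,.0f}")
```

The point of the arithmetic: because predictive accuracy concentrates so much risk in the top 500, even an expensive per-person social-service intervention clears the cost-benefit bar.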
Article by Emily Davies: “Chris Ford stepped on the gas in his police cruiser and rolled down Gold Cup Drive to catch the SUV pushing 30 mph in a 15 mph zone. Eleven hours and 37 minutes into his shift, the corporal was ready for his first traffic stop of the day.
“Look at him being sneaky,” Ford said, his blue lights flashing on a quiet road in this small town where a busy day could mean animals escaped from a local slaughterhouse.
Ford parked, walked toward the SUV and greeted the man who had ignored the speed limit at exactly the wrong time.
“I was doing 15,” said the driver, a Black man in a mostly White neighborhood of a mostly White town.
The officer took his license and registration back to the cruiser.
“Every time I pull over someone of color, they’re standoffish with me. Like, ‘Here’s a White police officer, here we go again,’ ” Ford, 56, said. “So I just try to be nice.”
Ford knew the stop would be scrutinized — and not just by the reporter who was allowed to ride along on his shift.
After every significant encounter with residents, officers in Warrenton are required to hand out a QR code, which is on the back of their business card, asking for feedback on the interaction. Through a series of questions, citizens can use a star-based system to rate officers on their communication, listening skills and fairness. The responses are anonymous and can be completed any time after the interaction to encourage people to give honest assessments. The program, called Guardian Score, is supposed to give power to those stopped by police in a relationship that has historically felt one-sided — and to give police departments a tool to evaluate their force on more than arrests and tickets.
“If we started to measure how officers are treating community members, we realized we could actually infuse this into the overall evaluation process of individual officers,” said Burke Brownfeld, a founder of Guardian Score and a former police officer in Alexandria. “The definition of doing a good job could change. It would also include: How are your listening skills? How fairly are you treating people based on their perception?”…(More)”.
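The article describes Guardian Score's mechanics (anonymous star ratings on communication, listening, and fairness, folded into officer evaluations) without detailing the scoring. A minimal hypothetical sketch of such an aggregation, assuming simple per-dimension averaging — the dimension names come from the article, but the scoring logic is an illustration, not the product's actual method:

```python
# Hypothetical aggregation of anonymous star ratings per officer.
# Dimensions are those named in the article; the averaging scheme
# is an assumption for illustration only.
from collections import defaultdict
from statistics import mean

DIMENSIONS = ("communication", "listening", "fairness")

def aggregate(ratings):
    """ratings: iterable of (officer_id, {dimension: stars 1-5}) tuples."""
    by_officer = defaultdict(lambda: {d: [] for d in DIMENSIONS})
    for officer_id, scores in ratings:
        for dim in DIMENSIONS:
            by_officer[officer_id][dim].append(scores[dim])
    # Average each dimension, then average the dimensions, per officer.
    return {
        officer: {
            **{d: round(mean(v), 2) for d, v in dims.items()},
            "overall": round(mean(mean(v) for v in dims.values()), 2),
        }
        for officer, dims in by_officer.items()
    }

ratings = [
    ("officer_17", {"communication": 5, "listening": 4, "fairness": 5}),
    ("officer_17", {"communication": 3, "listening": 4, "fairness": 4}),
]
print(aggregate(ratings))
```

Because responses are anonymous and per-interaction, a scheme like this yields exactly the kind of evaluation signal Brownfeld describes: one not reducible to arrests and tickets.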
Report by The Justice and Home Affairs Committee (House of Lords): “In recent years, and without many of us realising it, Artificial Intelligence has begun to permeate every aspect of our personal and professional lives. We live in a world of big data; more and more decisions in society are being taken by machines using algorithms built from that data, be it in healthcare, education, business, or consumerism. Our Committee has limited its investigation to only one area–how these advanced technologies are used in our justice system. Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline.
We began our work on the understanding that Artificial Intelligence (AI), used correctly, has the potential to improve people’s lives through greater efficiency, improved productivity, and in finding solutions to often complex problems. But while acknowledging the many benefits, we were taken aback by the proliferation of Artificial Intelligence tools potentially being used without proper oversight, particularly by police forces across the country. Facial recognition may be the best known of these new technologies but in reality there are many more already in use, with more being developed all the time.
When deployed within the justice system, AI technologies have serious implications for a person’s human rights and civil liberties. At what point could someone be imprisoned on the basis of technology that cannot be explained? Informed scrutiny is therefore essential to ensure that any new tools deployed in this sphere are safe, necessary, proportionate, and effective. This scrutiny is not happening. Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.
Public bodies and all 43 police forces are free to individually commission whatever tools they like or buy them from companies eager to get in on the burgeoning AI market. And the market itself is worryingly opaque. We were told that public bodies often do not know much about the systems they are buying and will be implementing, due to the seller’s insistence on commercial confidentiality–despite the fact that many of these systems will be harvesting, and relying on, data from the general public.
This is particularly concerning in light of evidence we heard of dubious selling practices and claims made by vendors as to their products’ effectiveness which are often untested and unproven…(More)”.
Project by OCCRP: “In the wake of Russia’s brutal assault on Ukraine, governments around the world have imposed sanctions on many of Putin’s enablers. But they have learned to keep their wealth obscured, hiring an army of lawyers to hide it in secretive bank accounts and corporate structures that reach far offshore. Figuring out who owns what, and how much of it, is a tall order even for experienced police investigators.
That’s why we decided to follow the trail, tracking down as many of these assets as possible and compiling them in a database for the public to see and use. We started with a list of names of people who “actively participate in the oppression and corruption of Putin’s regime” drawn up by the Anti-Corruption Foundation, led by opposition leader Alexei Navalny. We’ll be expanding it soon to include other Russians sanctioned for corruption or their support of Putin.
We looked for land, mansions, companies, boats, planes, and anything else of value that could be tied through documentary evidence to Putin’s circle. Some of these assets have been reported before. Some are being revealed here for the first time. Some are still to be discovered: We’ll keep searching for more properties and yachts, adding more names, and updating this database regularly. If you are aware of anything we’ve missed, please let us know by filling out this form.
For now, we’ve uncovered over $17.5 billion in assets, and counting….(More)”.
Article by Elicia John & Shawn D. Bushway: “Community policing is often held up as an instrumental part of reforms to make policing less harmful, particularly in low-income communities that have high rates of violence. But building collaborative relationships between communities and police is hard. Writing in Nature, Shah and LaForest describe a large field experiment revealing that giving residents cards and letters with basic information about local police officers can prevent crime. Combining these results with those from Internet-based experiments, the authors attribute the observed reduction in crime to perceived ‘information symmetry’.
Known strangers are individuals whom we’ve never met but still know something about, such as celebrities. We tend to assume, erroneously, that known strangers know as much about us as we do about them. This tendency to see information symmetry when there is none is referred to as a social heuristic — a shortcut in our mental processing…
Collaborating with the New York Police Department, the authors sent letters and cards to residents of 39 public-housing developments, providing information about the developments’ local community police officers, called neighbourhood coordination officers. These flyers included personal details, such as the officers’ favourite food, sports team or superhero. Thirty control developments had neighbourhood coordination officers, but did not receive flyers….
This field experiment provided convincing evidence that a simple intervention can reduce crime. Indeed, in the three months after the intervention, the researchers observed a 5–7% drop in crime in the developments that received the information compared with neighbourhoods that did not. This level of reduction is similar to that of more-aggressive policing policies. The drop in crime lessened after three months, which the authors suggest is due to the light touch and limited duration of the intervention. Interventions designed to keep officers’ information at the top of residents’ minds (such as flyers sent over a longer period at a greater frequency) might therefore result in longer-term effects.
The authors attribute the reduction in crime to a heightened perception among residents receiving flyers that the officer would find out if they committed a crime. Such findings are potentially exciting, because they imply that a police officer who is perceived as a real person can prevent crime without tactics such as the New York City Police Department’s ‘stop, question and frisk’ policy, which tended to create animosity between community members and the police….(More)”
Book by Benjamin van Rooij and Adam Fine: “Why do most Americans wear seatbelts but continue to speed even though speeding fines are higher? Why could park rangers reduce theft by removing “no stealing” signs? Why was a man who stole 3 golf clubs sentenced to 25 years in prison?
Some laws radically change behavior whereas others are consistently ignored and routinely broken. And yet we keep relying on harsh punishment against crime despite its continued failure.
Professors Benjamin van Rooij and Adam Fine draw on decades of research to uncover the behavioral code: the root causes and hidden forces that drive human behavior and our responses to society’s laws. In doing so, they present the first accessible analysis of behavioral jurisprudence, which will fundamentally alter how we understand the connection between law and human behavior.
The Behavioral Code offers a necessary and different approach to battling crime and injustice that is based in understanding the science of human misconduct—rather than relying on our instinctual drive to punish as a way to shape behavior. The book reveals the behavioral code’s hidden role through illustrative examples like:
• The illusion of the US’s beloved tax refund
• German walls that “pee back” at public urinators
• The $1,000 monthly “good behavior” reward that reduced gun violence
• Uber’s backdoor “Greyball” app that helped the company evade Seattle’s taxi regulators
• A $2.3 billion legal settlement against Pfizer that revealed how whistleblower protections fail to reduce corporate malfeasance
• A toxic organizational culture playing a core role in Volkswagen’s emissions cheating scandal
• How Peter Thiel helped Hulk Hogan sue Gawker into oblivion…(More)”.
Paper by Emile Loza de Siles: “In 1968, the Reverend Martin Luther King Jr. foresaw the inevitability of society’s eventual triumph over the deep racism of his time, a stain that continues to cast its destructive, oppressive pall today. From the pulpit of the nation’s church, Dr. King said, “We shall overcome because the arc of the moral universe is long but it bends toward justice”. More than 40 years later, Eric Holder, the first African American United States Attorney General, agreed, but only if people acting with conviction exert themselves to pull that arc towards justice.
With artificial intelligence (AI) bias and discrimination rampant, the need to pull the moral arc towards algorithmic justice is urgent. This article offers empowering clarity by conceptually bifurcating AI bias problems into AI bias engineering and organisational AI governance problems, revealing proven legal development pathways to protect against the corrosive harms of AI bias and discrimination…(More)”.
Paul Wormeli at The Criminologist: “To the extent that a paradigm is defined as the way we view things, the crime statistics paradigm in the United States is inadequate and requires reinvention….The statement—”not all crime is reported to the police”—lies at the very heart of why our current crime data are inherently incomplete. It is a direct reference to the fact that not all “street crime” is reported and that state and local law enforcement are not the only entities responsible for overseeing violations of societally established norms (“street crime” or otherwise). Two significant gaps exist, in that: 1) official reporting of crime from state and local law enforcement agencies cannot provide insight into unreported incidents, and 2) state and local law enforcement may not have or acknowledge jurisdiction over certain types of matters, such as cybercrime, corruption, environmental crime, or terrorism, and therefore cannot or do not report on those incidents…
All of these gaps in crime reporting mask the portrait of crime in the U.S. If there were a complete accounting of crime that could serve as the basis of policy formulation, including the distribution of federal funds to state and local agencies, there could be a substantial impact across the nation. Such a calculation would move the country toward a more rational basis for determining federal support for communities based on a comprehensive measure of community wellness.
In its deliberations, the NAS Panel recognized that it is essential to consider both the concepts of classification and the rules of counting as we seek a better and more practical path to describing crime in the U.S. and its consequences. The panel postulated that a meaningful classification of incidents found to be crimes would go beyond the traditional emphasis on street crime and include all crime categories.
The NAS study identified the missing elements of a national crime report as including more complete data on crimes involving drug-related offenses, criminal acts where juveniles are involved, so-called white-collar crimes such as fraud and corruption, cybercrime, crime against businesses, environmental crimes, and crimes against animals. Just as one example, it is highly unlikely that we will know the full extent of fraudulent claims against all federal, state, and local governments in the face of the massive influx of funding from recent and forthcoming Congressional action.
In proposing a set of crime classifications, the NAS panel recommended 11 major categories, 5 of which are not addressed in our current crime data collection systems. While there are parallel data systems that collect some of the missing data within these five crime categories, it remains unclear which federal agency, if any, has the authority to gather the information and aggregate it to give us anywhere near a complete estimate of crime in the United States. No federal or national entity has the assignment of estimating the total amount of crime that takes place in the United States. Without such leadership, we are left with an uninformed understanding of the health and wellness of communities throughout the country…(More)”
Kevin Roose at the New York Times: “…Especially at a time when many of tech’s leaders seem more interested in building new, virtual worlds than improving the world we live in, it’s worth praising the technologists who are stepping up to solve some of our biggest problems.
So here, without further ado, are this year’s Good Tech Awards…
One of the year’s most exciting A.I. breakthroughs came in July when DeepMind — a Google-owned artificial intelligence company — published data and open-source code from its groundbreaking AlphaFold project.
The project, which used A.I. to predict the structures of proteins, solved a problem that had vexed scientists for decades, and was hailed by experts as one of the greatest scientific discoveries of all time. And by publishing its data freely, AlphaFold set off a frenzy among researchers, some of whom are already using it to develop new drugs and better understand the proteins involved in viruses like SARS-CoV-2.
Google’s overall A.I. efforts have been fraught with controversy and missteps, but AlphaFold seems like an unequivocally good use of the company’s vast expertise and resources…
Prisons aren’t known as hotbeds of innovation. But two tech projects this year tried to make our criminal justice system more humane.
Recidiviz is a nonprofit tech start-up that builds open-source data tools for criminal justice reform. It was started by Clementine Jacoby, a former Google employee who saw an opportunity to corral data about the prison system and make it available to prison officials, lawmakers, activists and researchers to inform their decisions. Its tools are in use in seven states, including North Dakota, where the data tools helped prison officials assess the risk of Covid-19 outbreaks and identify incarcerated people who were eligible for early release….(More)”.