Algorithm Claims to Predict Crime in US Cities Before It Happens


Article by Carrington York: “A new computer algorithm can now forecast crime in a big city near you — apparently. 

The algorithm, which was formulated by social scientists at the University of Chicago and touts 90% accuracy, divides cities into 1,000-square-foot tiles, according to a study published in Nature Human Behaviour. Researchers used historical data on violent crimes and property crimes from Chicago to test the model, which detects patterns over time in these tiled areas and uses them to predict future events. It performed just as well using data from other big cities, including Atlanta, Los Angeles and Philadelphia, the study showed.
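The paper's model is far more sophisticated than a few lines of code, but the tiling idea itself is simple to illustrate. Here is a minimal Python sketch that bins point events into fixed-size tiles and time windows, with a moving-average forecast standing in for the study's actual learning method; the window length, lookback, and forecast rule are our illustrative assumptions, not the paper's approach:

```python
from collections import defaultdict

TILE_FT = 1000      # tile edge length in feet, per the article
WINDOW_DAYS = 7     # illustrative time bin, not from the paper

def tile_of(x_ft, y_ft):
    """Map a point on a city-plane grid (in feet) to a tile index."""
    return (int(x_ft // TILE_FT), int(y_ft // TILE_FT))

def bin_events(events):
    """events: iterable of (x_ft, y_ft, day). Returns counts keyed by (tile, window)."""
    counts = defaultdict(int)
    for x, y, day in events:
        counts[(tile_of(x, y), day // WINDOW_DAYS)] += 1
    return counts

def naive_forecast(counts, tile, window, lookback=4):
    """Moving average over recent windows; a stand-in for the paper's model."""
    history = [counts.get((tile, w), 0) for w in range(window - lookback, window)]
    return sum(history) / lookback

# Fabricated events for illustration: (x_ft, y_ft, day)
events = [(1200, 3400, 3), (1100, 3300, 9), (1250, 3500, 16), (1230, 3450, 20)]
counts = bin_events(events)
print(naive_forecast(counts, tile_of(1200, 3400), window=4))
```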

The new tool contrasts with previous models for prediction, which depict crime as emerging from “hotspots” that spread to surrounding areas. Such an approach tends to miss the complex social environment of cities, as well as the nuanced relationship between crime and the effects of police enforcement, thus leaving room for bias, according to the report.

“It is hard to argue that bias isn’t there when people sit down and determine which patterns they will look at to predict crime because these patterns, by themselves, don’t mean anything,” said Ishanu Chattopadhyay, Assistant Professor of Medicine at the University of Chicago and senior author of the study. “But now, you can ask the algorithm complex questions like: ‘What happens to the rate of violent crime if property crimes go up?’”

But Emily M. Bender, professor of linguistics at the University of Washington, said in a series of tweets that the focus should be on targeting underlying inequities rather than on predictive policing, while also noting that the research appears to ignore securities fraud or environmental crimes…(More)”

Police Violence In Puerto Rico: Flooded With Data


Blog by Christine Grillo: “For María Mari-Narváez, a recent decision by the Supreme Court of Puerto Rico was both a victory and a moment of reckoning. The Court granted Kilómetro Cero, a citizen-led police accountability project in Puerto Rico, full access to every use-of-force report filed by the Puerto Rico Police Department since 2014. The decision will make it possible for advocates such as Mari to get a clear picture of how state police officers are using force, and when that use of force crosses the line into abuse. But the court victory flooded her small organization with data.

“We won, finally, and then I realized I was going to be receiving thousands of documents that I had zero capacity to process,” says Mari.

“One of the things that’s important to me when analyzing data is to find out where the gaps are, why those gaps exist, and what those gaps represent.” —Tarak Shah, data scientist

The Court made its decision in April 2021, and the police department started handing over PDF files in July. By the end, up to 10,000 documents could be turned over. In addition to incident reports, the police had to provide their use-of-force database. Together, these records form a complicated mixture of quantitative and qualitative data that can be analyzed to answer questions about what the state police are doing to citizens during police interventions. In particular, Kilómetro Cero, which Mari founded, wants to find out if some Puerto Ricans are more likely to be victims of police violence than others.
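Processing a flood of PDF reports at this scale is itself a technical hurdle. Purely as an illustration (not Kilómetro Cero's actual pipeline), a minimal Python sketch using the open-source pypdf library might begin like this; it assumes the files carry a machine-readable text layer, whereas scanned forms would need OCR first:

```python
from pathlib import Path
from pypdf import PdfReader  # pip install pypdf

def extract_reports(pdf_dir):
    """Pull raw text out of every PDF in a directory, one record per file."""
    records = []
    for path in sorted(Path(pdf_dir).glob("*.pdf")):
        reader = PdfReader(path)
        text = "\n".join(page.extract_text() or "" for page in reader.pages)
        records.append({"file": path.name, "text": text})
    return records

# Hypothetical directory name, for illustration only.
reports = extract_reports("use_of_force_reports/")
print(f"{len(reports)} reports extracted")
```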

“We’re looking for bias,” says Mari. “Bias against poor people, or people who live in a certain neighborhood. Gender bias. Language bias. Bias against drug users, sex workers, immigrants, people who don’t have a house. We’re trying to analyze the language of vulnerability.”…(More)”.

Understanding Criminal Justice Innovations


Paper by Meghan J. Ryan: “Burgeoning science and technology have provided the criminal justice system with the opportunity to address some of its shortcomings. And the criminal justice system has significant shortcomings. Among other issues, we have a mass incarceration problem; clearance rates are surprisingly low; there are serious concerns about wrongful convictions; and the system is layered with racial, religious, and other biases. Innovations that are widely used across industries, as well as those directed specifically at the criminal justice system, have the potential to ameliorate such problems. But it is important to recognize that these innovations also have downsides, and criminal justice actors must proceed with caution and understand not only the potential of these interventions but also their limitations. Relevant to this calculation of caution is whether the innovation is broadly used across industry sectors or, rather, whether it has been specifically developed for use within the criminal justice system. These latter innovations have a record of not being sufficiently vetted for accuracy and reliability. Accordingly, criminal justice actors must be sufficiently well versed in basic science and technology so that they have the ability and the confidence to critically assess the usefulness of the various criminal justice innovations in light of their limitations. Considering lawyers’ general lack of competency in these areas, scientific and technological training is necessary to mold them into modern competent criminal justice actors. This training must be more than superficial subject-specific training, though; it must dig deeper, delving into critical thinking skills that include evaluating the accuracy and reliability of the innovation at issue, as well as assessing broader concerns such as the need for development transparency, possible intrusions on individual privacy, and incentives to curtail individual liberties given the innovation at hand….(More)”

Machine Learning Can Predict Shooting Victimization Well Enough to Help Prevent It


Paper by Sara B. Heller, Benjamin Jakubowski, Zubin Jelveh & Max Kapustin: “This paper shows that shootings are predictable enough to be preventable. Using arrest and victimization records for almost 644,000 people from the Chicago Police Department, we train a machine learning model to predict the risk of being shot in the next 18 months. We address central concerns about police data and algorithmic bias by predicting shooting victimization rather than arrest, which we show accurately captures risk differences across demographic groups despite bias in the predictors. Out-of-sample accuracy is strikingly high: of the 500 people with the highest predicted risk, 13 percent are shot within 18 months, a rate 130 times higher than that of the average Chicagoan. Although Black male victims more often have enough police contact to generate predictions, those predictions are not, on average, inflated; the demographic composition of predicted and actual shooting victims is almost identical. There are legal, ethical, and practical barriers to using these predictions to target law enforcement. But using them to target social services could have enormous preventive benefits: predictive accuracy among the top 500 people justifies spending up to $123,500 per person for an intervention that could cut their risk of being shot in half….(More)”.
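The headline evaluation is easy to state precisely: rank everyone by predicted risk, take the top 500, and count how many are shot within 18 months. Below is a minimal sketch of that top-k check, with the paper's reported figures worked through in comments; the roughly $1.9 million implied value of averting a shooting is our back-of-envelope inference from the $123,500 figure, not a number taken from the paper:

```python
def top_k_hit_rate(risk_scores, shot_labels, k=500):
    """Fraction of the k highest-risk people actually shot within 18 months."""
    ranked = sorted(zip(risk_scores, shot_labels), key=lambda pair: -pair[0])
    return sum(label for _, label in ranked[:k]) / k

# Reported figures: 13% of the top 500 are shot within 18 months,
# roughly 130 times the average Chicagoan's rate.
# Back-of-envelope inference (ours, not the paper's derivation):
# halving a 13% risk averts 0.065 shootings per person, so $123,500
# per person breaks even if averting one shooting is worth about
# $123,500 / 0.065, i.e. roughly $1.9 million.
break_even_value = 123_500 / (0.13 * 0.5)
print(f"${break_even_value:,.0f}")  # about $1,900,000
```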

In this small Va. town, citizens review police like Uber drivers


Article by Emily Davies: “Chris Ford stepped on the gas in his police cruiser and rolled down Gold Cup Drive to catch the SUV pushing 30 mph in a 15 mph zone. Eleven hours and 37 minutes into his shift, the corporal was ready for his first traffic stop of the day.

“Look at him being sneaky,” Ford said, his blue lights flashing on a quiet road in this small town where a busy day could mean animals escaped from a local slaughterhouse.

Ford parked, walked toward the SUV and greeted the man who had ignored the speed limit at exactly the wrong time.

“I was doing 15,” said the driver, a Black man in a mostly White neighborhood of a mostly White town.

The officer took his license and registration back to the cruiser.

“Every time I pull over someone of color, they’re standoffish with me. Like, ‘Here’s a White police officer, here we go again.’ ” Ford, 56, said. “So I just try to be nice.”

Ford knew the stop would be scrutinized — and not just by the reporter who was allowed to ride along on his shift.

After every significant encounter with residents, officers in Warrenton are required to hand out a QR code, which is on the back of their business card, asking for feedback on the interaction. Through a series of questions, citizens can use a star-based system to rate officers on their communication, listening skills and fairness. The responses are anonymous and can be completed any time after the interaction to encourage people to give honest assessments. The program, called Guardian Score, is supposed to give power to those stopped by police in a relationship that has historically felt one-sided — and to give police departments a tool to evaluate their force on more than arrests and tickets.
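Guardian Score's internals are not public, but the description above implies a simple data model: anonymous per-interaction records scored on a few dimensions, aggregated per officer. A hedged sketch of what that might look like, where the field names and the 1-to-5 star scale are our assumptions:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Feedback:
    officer_id: str      # identifies the officer, never the respondent
    communication: int   # 1-5 stars
    listening: int       # 1-5 stars
    fairness: int        # 1-5 stars

def officer_averages(responses, officer_id):
    """Average each dimension across one officer's anonymous responses."""
    mine = [r for r in responses if r.officer_id == officer_id]
    return {
        "communication": mean(r.communication for r in mine),
        "listening": mean(r.listening for r in mine),
        "fairness": mean(r.fairness for r in mine),
    }

# Fabricated responses for illustration.
responses = [Feedback("W-12", 4, 5, 4), Feedback("W-12", 3, 4, 5)]
print(officer_averages(responses, "W-12"))
```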

“If we started to measure how officers are treating community members, we realized we could actually infuse this into the overall evaluation process of individual officers,” said Burke Brownfeld, a founder of Guardian Score and a former police officer in Alexandria. “The definition of doing a good job could change. It would also include: How are your listening skills? How fairly are you treating people based on their perception?”…(More)”.

Technology rules? The advent of new technologies in the justice system


Report by The Justice and Home Affairs Committee (House of Lords): “In recent years, and without many of us realising it, Artificial Intelligence has begun to permeate every aspect of our personal and professional lives. We live in a world of big data; more and more decisions in society are being taken by machines using algorithms built from that data, be it in healthcare, education, business, or consumerism. Our Committee has limited its investigation to only one area–how these advanced technologies are used in our justice system. Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline.

We began our work on the understanding that Artificial Intelligence (AI), used correctly, has the potential to improve people’s lives through greater efficiency, improved productivity, and solutions to often complex problems. But while acknowledging the many benefits, we were taken aback by the proliferation of Artificial Intelligence tools potentially being used without proper oversight, particularly by police forces across the country. Facial recognition may be the best known of these new technologies but in reality there are many more already in use, with more being developed all the time.

When deployed within the justice system, AI technologies have serious implications for a person’s human rights and civil liberties. At what point could someone be imprisoned on the basis of technology that cannot be explained? Informed scrutiny is therefore essential to ensure that any new tools deployed in this sphere are safe, necessary, proportionate, and effective. This scrutiny is not happening. Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.
Public bodies and all 43 police forces are free to individually commission whatever tools they like or buy them from companies eager to get in on the burgeoning AI market. And the market itself is worryingly opaque. We were told that public bodies often do not know much about the systems they are buying and will be implementing, due to the seller’s insistence on commercial confidentiality–despite the fact that many of these systems will be harvesting, and relying on, data from the general public.
This is particularly concerning in light of evidence we heard of dubious selling practices and claims made by vendors as to their products’ effectiveness which are often untested and unproven…(More)”.

Russian Asset Tracker


Project by OCCRP: “In the wake of Russia’s brutal assault on Ukraine, governments around the world have imposed sanctions on many of Putin’s enablers. But they have learned to keep their wealth obscured, hiring an army of lawyers to hide it in secretive bank accounts and corporate structures that reach far offshore. Figuring out who owns what, and how much of it, is a tall order even for experienced police investigators.

That’s why we decided to follow the trail, tracking down as many of these assets as possible and compiling them in a database for the public to see and use. We started with a list of names of people who “actively participate in the oppression and corruption of Putin’s regime” drawn up by the Anti-Corruption Foundation, led by opposition leader Alexei Navalny. We’ll be expanding it soon to include other Russians sanctioned for corruption or their support of Putin.

We looked for land, mansions, companies, boats, planes, and anything else of value that could be tied through documentary evidence to Putin’s circle. Some of these assets have been reported before. Some are being revealed here for the first time. Some are still to be discovered: We’ll keep searching for more properties and yachts, adding more names, and updating this database regularly. If you are aware of anything we’ve missed, please let us know by filling out this form.

For now, we’ve uncovered over $17.5 billion in assets, and counting….(More)”.

Letters and cards telling people about local police reduce crime


Article by Elicia John & Shawn D. Bushway: “Community policing is often held up as an instrumental part of reforms to make policing less harmful, particularly in low-income communities that have high rates of violence. But building collaborative relationships between communities and police is hard. Writing in Nature, Shah and LaForest describe a large field experiment revealing that giving residents cards and letters with basic information about local police officers can prevent crime. Combining these results with those from Internet-based experiments, the authors attribute the observed reduction in crime to perceived ‘information symmetry’.

Known strangers are individuals whom we’ve never met but still know something about, such as celebrities. We tend to assume, erroneously, that known strangers know as much about us as we do about them. This tendency to see information symmetry when there is none is referred to as a social heuristic — a shortcut in our mental processing…

Collaborating with the New York Police Department, the authors sent letters and cards to residents of 39 public-housing developments, providing information about the developments’ local community police officers, called neighbourhood coordination officers. These flyers included personal details, such as the officers’ favourite food, sports team or superhero. Thirty control developments had neighbourhood coordination officers, but did not receive flyers….

This field experiment provided convincing evidence that a simple intervention can reduce crime. Indeed, in the three months after the intervention, the researchers observed a 5–7% drop in crime in the developments that received the information compared with those that did not. This level of reduction is similar to that of more-aggressive policing policies. The drop in crime lessened after three months, which the authors suggest is due to the light touch and limited duration of the intervention. Interventions designed to keep officers’ information at the top of residents’ minds (such as flyers sent over a longer period at a greater frequency) might therefore result in longer-term effects.
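The design is a cluster-randomized comparison: crime in the 39 developments that received flyers versus the 30 that did not. A minimal sketch of the headline effect calculation, with fabricated rates chosen only to illustrate a drop in the reported 5–7% range:

```python
def percent_change(treated_rate, control_rate):
    """Relative difference in post-intervention crime rates."""
    return 100 * (treated_rate - control_rate) / control_rate

# Fabricated illustrative rates: crimes per development per month.
treated_rate = 9.4   # the 39 developments that received the flyers
control_rate = 10.0  # the 30 developments that did not
print(f"{percent_change(treated_rate, control_rate):+.1f}%")  # -> -6.0%
```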

The authors attribute the reduction in crime to a heightened perception among residents receiving flyers that the officer would find out if they committed a crime. Such findings are potentially exciting, because the work implies that a police officer who is perceived as a real person can prevent crime without tactics such as the New York City Police Department’s ‘stop, question and frisk’ policy, which tended to create animosity between community members and the police….(More)”

The Behavioral Code


Book by Benjamin van Rooij and Adam Fine: “Why do most Americans wear seatbelts but continue to speed even though speeding fines are higher? Why could park rangers reduce theft by removing “no stealing” signs? Why was a man who stole 3 golf clubs sentenced to 25 years in prison?

Some laws radically change behavior whereas others are consistently ignored and routinely broken. And yet we keep relying on harsh punishment against crime despite its continued failure.

Professors Benjamin van Rooij and Adam Fine draw on decades of research to uncover the behavioral code: the root causes and hidden forces that drive human behavior and our responses to society’s laws. In doing so, they present the first accessible analysis of behavioral jurisprudence, which will fundamentally alter how we understand the connection between law and human behavior.

The Behavioral Code offers a necessary and different approach to battling crime and injustice that is based on understanding the science of human misconduct—rather than relying on our instinctual drive to punish as a way to shape behavior. The book reveals the behavioral code’s hidden role through illustrative examples like:

   • The illusion of the US’s beloved tax refund
   • German walls that “pee back” at public urinators
   • The $1,000 monthly “good behavior” reward that reduced gun violence
   • Uber’s backdoor “Greyball” app that helped the company evade Seattle’s taxi regulators
   • A $2.3 billion legal settlement against Pfizer that revealed how whistleblower protections fail to reduce corporate malfeasance
   • A toxic organizational culture playing a core role in Volkswagen’s emissions cheating scandal
   • How Peter Thiel helped Hulk Hogan sue Gawker into oblivion…(More)”.

Artificial Intelligence Bias and Discrimination: Will We Pull the Arc of the Moral Universe Towards Justice?


Paper by Emile Loza de Siles: “In 1968, the Reverend Martin Luther King Jr. foresaw the inevitability of society’s eventual triumph over the deep racism of his time and the stain that continues to cast its destructive, oppressive pall today. From the pulpit of the nation’s church, Dr King said, “We shall overcome because the arc of the moral universe is long but it bends toward justice”. More than 40 years later, Eric Holder, the first African American United States Attorney General, agreed, but only if people acting with conviction exert themselves to pull that arc towards justice.

With artificial intelligence (AI) bias and discrimination rampant, the need to pull the moral arc towards algorithmic justice is urgent. This article offers empowering clarity by conceptually bifurcating AI bias problems into AI bias engineering and organisational AI governance problems, revealing proven legal development pathways to protect against the corrosive harms of AI bias and discrimination…(More)”.