Interview with Paul Fockens of Sustainable Rescue: “Human trafficking still takes place on a large scale, and still too often under the radar. That does not make it easy for organisations that want to combat human trafficking. Sharing data between various kinds of organisations (the government and the police, but also banks) plays a crucial role in mapping the networks of criminals involved in human trafficking, including their victims. Data sharing contributes to tackling this criminal business not only reactively, but also proactively….Sustainable Rescue tries to make largely invisible human trafficking visible. Bundling data, and therefore knowledge, is crucial in this. Paul: “It’s about combining the routes criminals (and their victims) take from A to B, the financial transactions they make, the websites they visit, the hotels where they check in, et cetera. All those signs of human trafficking can be found in the data of various types of organisations: the police, municipalities, the Public Prosecution Service, charities such as the Salvation Army, but also banks and insurance institutions. The problem is that you need to collect all the pieces of the puzzle to get clear insights. As long as this relevant data is not combined through data sharing, it is very difficult to obtain those insights. In nine out of ten cases, these authorities are not willing and/or not allowed to share their data, mainly because of its privacy sensitivity. However, in order to eliminate human trafficking, that data will have to be bundled. Only then can analyses be made of the patterns of a human trafficking network.”…(More)”.
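The data bundling Fockens describes, joining travel routes, transactions, and hotel check-ins held by different organisations under a shared identifier, can be illustrated with a minimal sketch. All names, feeds, and fields below are hypothetical; real sharing of this kind would require pseudonymisation and a lawful data-sharing framework.

```python
from collections import defaultdict

# Hypothetical signal feeds from different organisations; in practice these
# would be pseudonymised records shared under a legal agreement.
police_routes = [{"id": "p17", "signal": "repeated A-to-B travel"}]
bank_txns = [{"id": "p17", "signal": "cash deposits just under limit"},
             {"id": "p42", "signal": "late-night hotel payments"}]
hotel_checkins = [{"id": "p17", "signal": "frequent short stays"}]

def bundle(*feeds):
    """Group signals from independent feeds by shared pseudonymous ID."""
    merged = defaultdict(list)
    for feed in feeds:
        for rec in feed:
            merged[rec["id"]].append(rec["signal"])
    return merged

merged = bundle(police_routes, bank_txns, hotel_checkins)
# IDs flagged by two or more independent sources become candidates
# for closer network analysis.
flagged = [pid for pid, signals in merged.items() if len(signals) >= 2]
print(flagged)  # ['p17']
```

The point of the sketch is only the join itself: no single feed shows a pattern, but the combined record does, which is why the interview argues the puzzle pieces must be brought together before analysis is possible.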
Book edited by John McDaniel and Ken Pease: “This edited text draws together the insights of numerous worldwide eminent academics to evaluate the condition of predictive policing and artificial intelligence (AI) as interlocked policy areas. Predictive and AI technologies are growing in prominence and at an unprecedented rate. Powerful digital crime mapping tools are being used to identify crime hotspots in real-time, as pattern-matching and search algorithms are sorting through huge police databases populated by growing volumes of data in an effort to identify people liable to experience (or commit) crime, places likely to host it, and variables associated with its solvability. Facial and vehicle recognition cameras are locating criminals as they move, while police services develop strategies informed by machine learning and other kinds of predictive analytics. Many of these innovations are features of modern policing in the UK, the US and Australia, among other jurisdictions.
AI promises to reduce unnecessary labour, speed up various forms of police work, encourage police forces to more efficiently apportion their resources, and enable police officers to prevent crime and protect people from a variety of future harms. However, the promises of predictive and AI technologies and innovations do not always match reality. They often have significant weaknesses, come at a considerable cost and require challenging trade-offs to be made. Focusing on the UK, the US and Australia, this book explores themes of choice architecture, decision-making, human rights, accountability and the rule of law, as well as future uses of AI and predictive technologies in various policing contexts. The text contributes to ongoing debates on the benefits and biases of predictive algorithms, big data sets, machine learning systems, and broader policing strategies and challenges.
Written in a clear and direct style, this book will appeal to students and scholars of policing, criminology, crime science, sociology, computer science, cognitive psychology and all those interested in the emergence of AI as a feature of contemporary policing….(More)”.
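The hotspot identification the blurb describes, binning reported incidents into spatial cells and flagging concentrations, can be sketched in a few lines. The coordinates, cell size, and threshold below are invented for illustration, not drawn from any real policing system.

```python
from collections import Counter

# Hypothetical incident coordinates (x, y); a real system would consume
# geocoded reports from a records-management database.
incidents = [(1.2, 3.1), (1.4, 3.3), (1.3, 3.2), (5.0, 0.4), (1.1, 3.4)]

def hotspots(points, cell_size=1.0, threshold=3):
    """Bin points into a square grid and return cells with >= threshold hits."""
    counts = Counter((int(x // cell_size), int(y // cell_size))
                     for x, y in points)
    return {cell: n for cell, n in counts.items() if n >= threshold}

print(hotspots(incidents))  # {(1, 3): 4}
```

Even this toy version surfaces the book's central policy question: the output depends entirely on which incidents enter the database and where the threshold is set, which is where the debates over bias in the underlying data begin.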
Press Release by the Council of Europe: “The European Commission for the Efficiency of Justice (CEPEJ) has adopted a feasibility study on the possible establishment of a certification mechanism for artificial intelligence tools and services. The study is based on the CEPEJ Charter on the use of artificial intelligence in judicial systems and their environment, adopted in December 2018. The Council of Europe, if it decides to create such a mechanism, could be a pioneer in this field. After consultation with all member and observer states, this feasibility study will be followed by an action plan that the CEPEJ will prepare and send to the Committee of Ministers for examination in 2021….(Study)”.
Paper by Wayne A. Logan: “Crowdsourcing, which leverages the collective expertise and resources of (mainly online) communities to achieve specified objectives, today figures prominently in a broad array of realms, including business, human rights, and medical and scientific research. It also now plays a significant role in governmental crime control efforts. Web and forensic–genetic sleuths, armchair detectives, and the like are collecting and analyzing evidence and identifying criminal suspects, at the behest of and with varying degrees of assistance from police officials.
Unfortunately, as with so many other aspects of modern society, current criminal procedure doctrine is ill-equipped to address this development. In particular, for decades it has been accepted that the Fourth Amendment only limits searches and seizures undertaken by public law enforcement, not private actors. Crowdsourcing, however, presents considerable taxonomic difficulty for existing doctrine, making the already often permeable line between public and private behavior considerably more so. Moreover, although crowdsourcing promises considerable benefit as an investigative force multiplier for police, it poses risks, including misidentification of suspects, violation of privacy, a diminution of governmental transparency and democratic accountability, and the fostering of a mutual social suspicion that is inimical to civil society.
Despite its importance, government use of crowdsourcing to achieve crime control goals has not yet been examined by legal scholars. Like the internet on which it predominantly relies, crowdsourcing is not going away; if anything, it will proliferate in coming years. The challenge lies in harnessing its potential, while protecting against the significant harms that will accrue should it go unregulated. This Essay describes the phenomenon and provides a framework for its regulation, in the hope of ensuring that the wisdom of the crowd does not become the tyranny of the crowd….(More)”.
Ann Marimow in the Washington Post: “Leaders of the federal judiciary are working to block bipartisan legislation designed to create a national database of court records that would provide free access to case documents.
Backers of the bill, who are pressing for a House vote in the coming days, envision a streamlined, user-friendly system that would allow citizens to search for court documents and dockets without having to pay. Under the current system, users pay 10 cents per page to view the public records through the service known as PACER, an acronym for Public Access to Court Electronic Records.
“Everyone wants to have a system that is technologically first class and free,” said Rep. Hank Johnson (D-Ga.), a sponsor of the legislation with Rep. Douglas A. Collins (R-Ga.).
A modern system, he said, “is more efficient and brings more transparency into the equation and is easier on the pocketbooks of regular people.”…(More)”.
Paper by Madeleine Waller and Paul Waller: “This paper collates multidisciplinary perspectives on the use of predictive analytics in government services. It moves away from the hyped narratives of “AI” or “digital”, and the broad usage of the notion of “ethics”, to focus on highlighting the possible risks of the use of prediction algorithms in public administration. Guidelines for AI use in public bodies are currently available; however, there is little evidence that they are being followed or that they are being written into new mandatory regulations. The use of algorithms is not just an issue of whether they are fair and safe to use, but whether they abide by the law and whether they actually work.
Particularly in public services, there are many things to consider before implementing predictive analytics algorithms, as flawed use in this context can lead to harmful consequences for citizens, individually and collectively, and for public sector workers. All stages of the implementation process of algorithms are discussed, from the specification of the problem and model design through to the context of their use and the outcomes.
Evidence is drawn from case studies of use in child welfare services, the US justice system and UK public examination grading in 2020. The paper argues that the risks and drawbacks of such technological approaches need to be more comprehensively understood, and testing done in the operational setting, before implementing them. The paper concludes that while algorithms may be useful in some contexts and help to solve problems, those aimed at predicting real life seem to have a long way to go before they are safe and trusted for use. As “ethics” are located in time, place and social norms, the authors suggest that in the context of public administration, laws on human rights, statutory administrative functions, and data protection — all within the principles of the rule of law — provide the basis for appraising the use of algorithms, with maladministration being the primary concern rather than a breach of “ethics”….(More)”
Report by the Center for Data Ethics and Innovation (CDEI) (UK): “Unfair biases, whether conscious or unconscious, can be a problem in many decision-making processes. This review considers the impact that an increasing use of algorithmic tools is having on bias in decision-making, the steps that are required to manage risks, and the opportunities that better use of data offers to enhance fairness. We have focused on the use of algorithms in significant decisions about individuals, looking across four sectors (recruitment, financial services, policing and local government), and making cross-cutting recommendations that aim to help build the right systems so that algorithms improve, rather than worsen, decision-making…(More)”.
About: “The criminal legal system is a maze of laws, language, and unwritten rules that lawyers are trained to maneuver to represent defendants.
However, according to the Bureau of Justice Statistics, only 27% of county public defenders’ offices meet national caseload recommendations for cases per attorney, meaning that most public defenders are overworked, leaving their clients underrepresented.
Defendants must complete an estimated 200 discrete tasks during their legal proceeding. This leaves them overwhelmed, lost, and profoundly disadvantaged while attempting to navigate the system….
We have… created a product that acts as the trusted advisor for defendants and their families as they navigate the criminal legal system. We aim to deliver valuable and relevant legal information (but not legal advice) to the user in plain language, empowering them to advocate for themselves and proactively plan for the future and access social services if necessary. The user is also encouraged to give feedback on their experience at each step of the process in the hope that this can be used to improve the system….(More)”
Trace Labs is a nonprofit organization whose mission is to accelerate the family reunification of missing persons while training members in the tradecraft of open source intelligence (OSINT)….We crowdsource open source intelligence through both the Trace Labs OSINT Search Party CTFs and Ongoing Operations with our global community. Our highly skilled intelligence analysts then triage the data collected to produce actionable intelligence reports on each missing persons subject. These intelligence reports give the law enforcement agencies we work with the ability to quickly see any new details required to reopen a cold case and/or take immediate action on a missing subject.(More)”
Book by Sarah Brayne: “The scope of criminal justice surveillance has expanded rapidly in recent decades. At the same time, the use of big data has spread across a range of fields, including finance, politics, healthcare, and marketing. While law enforcement’s use of big data is hotly contested, very little is known about how the police actually use it in daily operations and with what consequences.
In Predict and Surveil, Sarah Brayne offers an unprecedented, inside look at how police use big data and new surveillance technologies, leveraging on-the-ground fieldwork with one of the most technologically advanced law enforcement agencies in the world: the Los Angeles Police Department. Drawing on original interviews and ethnographic observations, Brayne examines the causes and consequences of algorithmic control. She reveals how the police use predictive analytics to deploy resources, identify suspects, and conduct investigations; how the adoption of big data analytics transforms police organizational practices; and how the police themselves respond to these new data-intensive practices. Although big data analytics holds potential to reduce bias and increase efficiency, Brayne argues that it also reproduces and deepens existing patterns of social inequality, threatens privacy, and challenges civil liberties.
A groundbreaking examination of the growing role of the private sector in public policing, this book challenges the way we think about the data-heavy supervision law enforcement increasingly imposes upon civilians in the name of objectivity, efficiency, and public safety….(More)”.