Index: Crime and Criminal Justice Data


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on crime and criminal justice data and was originally published in 2015.

This index provides information about the types of crime and criminal justice data collected, shared and used in the United States. Because data related to the criminal justice system is often unreliable, or simply missing, this index also highlights some of the issues that stand in the way of accessing useful and in-demand statistics.

Data Collections: National Crime Statistics

  • Number of incident-based crime datasets created by the Federal Bureau of Investigation (FBI): 2
  • Number of U.S. statistical agencies: 13
    • How many of those are focused on criminal justice: 1, the Bureau of Justice Statistics (BJS)
    • Number of data collections focused on criminal justice the BJS produces: 61
  • Number of federal-level APIs available for crime or criminal justice data: 1, the National Crime Victimization Survey (NCVS)
    • Frequency of the NCVS: annually
  • Number of Statistical Analysis Centers (SACs) – organizations that are essentially clearinghouses for crime and criminal justice data – serving each state, the District of Columbia, Puerto Rico and the Northern Mariana Islands: 53

Open data, data use and the impact of those efforts

  • Number of datasets that are returned when “criminal justice” is searched for on Data.gov: 417, including federal-, state- and city-level datasets
  • Number of datasets that are returned when “crime” is searched for on Data.gov: 281
  • The percentage that public complaints dropped after officers started wearing body cameras, according to a study done in Rialto, Calif.: 88
  • The percentage that reported incidents of officer use of force fell after officers started wearing body cameras, according to a study done in Rialto, Calif.: 5
  • The percentage that crime decreased during an experiment in predictive policing in Shreveport, La.: 35
  • Number of crime data sets made available by the Seattle Police Department – generally seen as a leader in police data innovation – on the Seattle.gov website: 4
    • Major crime stats by category in aggregate
    • Crime trend reports
    • Precinct data by beat
    • State sex offender database
  • Number of datasets mapped by the Seattle Police Department: 2
    • 911 incidents
    • Police reports
  • Number of states where risk assessment tools must be used in pretrial proceedings to help determine whether an offender is released from jail before a trial: at least 11
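The Data.gov counts above can also be retrieved programmatically: Data.gov's catalog runs on CKAN, whose `package_search` action reports a total match count for a free-text query. A minimal sketch, assuming the standard CKAN endpoint on catalog.data.gov (the sample JSON below is illustrative, and live counts change over time):

```python
import json
from urllib.parse import urlencode

def search_url(query: str) -> str:
    """Build a CKAN package_search URL that asks only for the match count."""
    base = "https://catalog.data.gov/api/3/action/package_search"
    return base + "?" + urlencode({"q": query, "rows": 0})

def dataset_count(response_json: str) -> int:
    """Extract the total match count from a CKAN package_search response."""
    payload = json.loads(response_json)
    return payload["result"]["count"]

# A trimmed sample of the JSON shape CKAN returns for such a query:
sample = '{"success": true, "result": {"count": 281, "results": []}}'

print(search_url("crime"))
print(dataset_count(sample))  # → 281
```

Pinning `rows` to 0 keeps the response small when only the count is of interest.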

Police Data

  • Number of federally mandated databases that collect information about officer use of force or officer-involved shootings, nationwide: 0
  • The year a crime bill was passed that called for data on excessive force to be collected for research and statistical purposes, but has never been funded: 1994
  • Number of police departments that committed to being a part of the White House’s Police Data Initiative: 21
  • Percentage of police departments surveyed in 2013 by the Office of Community Oriented Policing Services within the Department of Justice that were not using body cameras, and therefore not collecting body camera data: 75

The criminal justice system

  • Parts of the criminal justice system where data about an individual can be created or collected: at least 6
    • Entry into the system (arrest)
    • Prosecution and pretrial
    • Sentencing
    • Corrections
    • Probation/parole
    • Recidivism

The Trouble With Disclosure: It Doesn’t Work


Jesse Eisinger at ProPublica: “Louis Brandeis was wrong. The lawyer and Supreme Court justice famously declared that sunlight is the best disinfectant, and we have unquestioningly embraced that advice ever since.

 Over the last century, disclosure and transparency have become our regulatory crutch, the answer to every vexing problem. We require corporations and government to release reams of information on food, medicine, household products, consumer financial tools, campaign finance and crime statistics. We have a booming “report card” industry for a range of services, including hospitals, public schools and restaurants.

All this sunlight is blinding. As new scholarship is demonstrating, the value of all this information is unproved. Paradoxically, disclosure can be useless — and sometimes actually harmful or counterproductive.

“We are doing disclosure as a regulatory move all over the board,” says Adam J. Levitin, a law professor at Georgetown. “The funny thing is, we are doing this despite very little evidence of its efficacy.”

Let’s start with something everyone knows about — the “terms of service” agreements for the likes of iTunes. Like everybody else, I click the “I agree” box, feeling a flash of resentment. I’m certain that in Paragraph 184 is a clause signing away my firstborn to a life of indentured servitude to Timothy D. Cook as his chief caviar spoon keeper.

Our legal theoreticians have determined these opaque monstrosities work because someone, somewhere reads the fine print in these contracts and keeps corporations honest. It turns out what we laymen intuit is true: No one reads them, according to research by a New York University law professor, Florencia Marotta-Wurgler.

In real life, there is no critical mass of readers policing the agreements. And if there were an eagle-eyed crew of legal experts combing through these agreements, what recourse would they have? Most people don’t even know that the Supreme Court has gutted their rights to sue in court, and they instead have to go into arbitration, which usually favors corporations.

The disclosure bonanza is easy to explain. Nobody is against it. It’s politically expedient. Companies prefer such rules, especially in lieu of actual regulations that would curtail bad products or behavior. The opacity lobby — the remora fish class of lawyers, lobbyists and consultants in New York and Washington — knows that disclosure requirements are no bar to dodgy practices. You just have to explain what you’re doing in sufficiently incomprehensible language, a task that earns those lawyers a hefty fee.

Of course, some disclosure works. Professor Levitin cites two examples. The first is an olfactory disclosure: methane doesn’t have any scent, but a foul smell is added to alert people to a gas leak. The second is ATM fees. A study in Australia showed that once fees were disclosed, people avoided the high-fee machines and withdrew more cash when they did have to use them.

But to Omri Ben-Shahar, co-author of a recent book, “More Than You Wanted to Know: The Failure of Mandated Disclosure,” these are cherry-picked examples in a world awash in useless disclosures. Of course, information is valuable. But disclosure as a regulatory mechanism doesn’t work nearly well enough, he argues….(More)

From Governmental Open Data Toward Governmental Open Innovation (GOI)


Chapter by Daniele Archibugi et al in The Handbook of Global Science, Technology, and Innovation: “Today, governments release governmental data that were previously hidden to the public. This democratization of governmental open data (OD) aims to increase transparency but also fuels innovation. Indeed, the release of governmental OD is a global trend, which has evolved into governmental open innovation (GOI). In GOI, governmental actors purposively manage the knowledge flows that span organizational boundaries and reveal innovation-related knowledge to the public with the aim to spur innovation for a higher economic and social welfare at regional, national, or global scale. GOI subsumes different revealing strategies, namely governmental OD, problem, and solution revealing. This chapter introduces the concept of GOI that has evolved from global OD efforts. It presents a historical analysis of the emergence of GOI in four different continents, namely, Europe (UK and Denmark), North America (United States and Mexico), Australia, and China to highlight the emergence of GOI at a global scale….(More)”

Transforming Government Information


Sharyn Clarkson at the (Interim) Digital Transformation Office (Australia): “Our challenge: How do we get the right information and services to people when and where they need it?

The public relies on Government for a broad range of information – advice for individuals and businesses, what services are available and how to access them, and how various rules and laws impact our lives.

The government’s digital environment has grown organically over the last couple of decades. At the moment, information is largely created and managed within agencies and published across more than 1200 disparate gov.au websites, plus a range of social media accounts, apps and other digital formats.

This creates some difficulties for people looking for government information. By publishing within agency silos we are presenting people with an agency-centric view of government information. This is a problem because people largely don’t understand or care about how government organises itself and the structure of government does not map to the needs of people. Having a baby or travelling overseas? Up to a dozen government agencies may have information relevant to you. And as people’s needs span more than one agency, they end up with a disjointed and confusing user experience as they have to navigate across disparate government sites. And even if you begin at your favourite search engine how do you know which of the many government search results is the right place to start?

There are two government entry points already in place to help users – Australia.gov.au and business.gov.au – but they largely act as an umbrella across the 1200+ sites and currently only provide a very thin layer of whole of government information and mainly refer people off to other websites.

The establishment of the DTO has provided the first opportunity for people to come together and better understand how our underlying structural landscape is impacting people’s experience with government. It’s also given us an opportunity to take a step back and ask some of the big questions about how we manage information and what problems can only really be solved through whole of government transformation.

How do we make information and services easier to find? How do we make sure we provide information that people can trust and rely upon at times of need? How should the gov.au landscape be organised to make it easier for us to meet users’ needs and expectations? How many websites should we have – assuming 1200 is too many? What makes up a better user experience – does it mean all sites should look and feel the same? How can we provide government information at the places people naturally go looking for assistance – even if these are not government sites?

As we asked these questions we started to come across some central ideas:

  • What if we could decouple the authoring and management of information from the publishing process, so the subject experts in government still manage their content but we have flexibility to present it in more user-centric ways?
  • What if we unleashed government information? Making it possible for state and local governments, non-profit groups and businesses to deliver content and services alongside their own information to give better value to users.
  • Should we move the bureaucratic content (information about agencies and how they are managed such as annual reports, budget statements and operating rules) out of the way of core content and services for people? Can we simplify our environment and base it around topics and life events instead of agencies? What if we had people in government responsible for curating these topics and life events across agencies and creating simpler pathways for users?…(More)”

Big Data’s Impact on Public Transportation


InnovationEnterprise: “Getting around any big city can be a real pain. Traffic jams seem to be a constant complaint, and simply getting to work can turn into a chore, even on the best of days. With more people than ever before flocking to the world’s major metropolitan areas, the issues of crowding and inefficient transportation only stand to get much worse. Luckily, the traditional methods of managing public transportation could be on the verge of changing thanks to advances in big data. While big data use cases have been a part of the business world for years now, city planners and transportation experts are quickly realizing how valuable it can be when making improvements to city transportation. That hour-long commute may no longer be something travellers will have to worry about in the future.

In much the same way that big data has transformed businesses around the world by offering greater insight into the behavior of their customers, it can also provide a deeper look at travellers. Like retail customers, commuters have certain patterns they like to keep to when on the road or riding the rails. Travellers also have their own motivations and desires, and getting to the heart of their actions is all part of what big data analytics is about. By analyzing these actions and the factors that go into them, transportation experts can gain a better understanding of why people choose certain routes or why they prefer one method of transportation over another. Based on these findings, planners can then figure out where to focus their efforts and respond to the needs of millions of commuters.

Gathering the accurate data needed to make knowledgeable decisions regarding city transportation can be a challenge in itself, especially considering how many people commute to work in a major city. New methods of data collection have made that effort easier and a lot less costly. One way that’s been implemented is through the gathering of call detail records (CDRs). From regular transactions made from mobile devices, information about location, time, and duration of an action (like a phone call) can give data scientists the necessary details on where people are traveling to, how long it takes them to get to their destination, and other useful statistics. The valuable part of this data is the sample size, which provides a much bigger picture of the transportation patterns of travellers.
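As a rough illustration of the analysis described above, the sketch below derives origin-destination flows and travel durations from a handful of hypothetical CDR-style records (the device IDs, timestamps and cell areas are all made up):

```python
from collections import Counter
from datetime import datetime

# Hypothetical CDR-style records: (device, timestamp, cell area of the activity).
records = [
    ("dev1", "2015-06-01 08:00", "Suburb-A"),
    ("dev1", "2015-06-01 08:45", "Downtown"),
    ("dev2", "2015-06-01 08:10", "Suburb-A"),
    ("dev2", "2015-06-01 09:05", "Downtown"),
    ("dev1", "2015-06-01 17:30", "Downtown"),
    ("dev1", "2015-06-01 18:20", "Suburb-A"),
]

def extract_trips(records):
    """Pair consecutive sightings of each device in different areas,
    yielding (origin, destination, minutes elapsed) tuples."""
    last_seen = {}
    trips = []
    for device, ts, area in sorted(records, key=lambda r: (r[0], r[1])):
        when = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        prev = last_seen.get(device)
        if prev and prev[1] != area:
            trips.append((prev[1], area, int((when - prev[0]).total_seconds() // 60)))
        last_seen[device] = (when, area)
    return trips

trips = extract_trips(records)
flows = Counter((origin, dest) for origin, dest, _ in trips)
print(flows.most_common(1))  # → [(('Suburb-A', 'Downtown'), 2)]
```

Aggregated over millions of devices, the same pairing idea is what lets planners see which corridors carry the most commuters and how long those journeys take.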

That’s not the only way cities are using big data to improve public transportation though. Melbourne in Australia has long been considered one of the world’s best cities for public transit, and much of that is thanks to big data. With big data and ad hoc analysis, Melbourne’s acclaimed tram system can automatically reconfigure routes in response to sudden problems or challenges, such as a major city event or natural disaster. Data is also used in this system to fix problems before they turn serious. Sensors located in equipment like tram cars and tracks can detect when maintenance is needed on a specific part. Crews are quickly dispatched to repair what needs fixing, and the tram system continues to run smoothly. This is similar to the idea of the Internet of Things, wherein embedded sensors collect data that is then analyzed to identify problems and improve efficiency.
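At its simplest, the “fix problems before they turn serious” idea amounts to comparing a sensor’s recent readings against its own baseline. A minimal sketch with hypothetical vibration readings (the window size and threshold factor are illustrative, not the parameters of any real tram system):

```python
def needs_maintenance(readings, window=3, factor=1.5):
    """Flag a sensor when the mean of its last `window` readings exceeds
    `factor` times the baseline mean of all earlier readings."""
    if len(readings) <= window:
        return False  # not enough history to form a baseline
    baseline = sum(readings[:-window]) / (len(readings) - window)
    recent = sum(readings[-window:]) / window
    return recent > factor * baseline

# Hypothetical vibration levels from two track sensors:
healthy = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0]
wearing = [1.0, 1.1, 0.9, 1.7, 1.9, 2.1]

print(needs_maintenance(healthy))  # → False
print(needs_maintenance(wearing))  # → True
```

Production systems use far richer models, but the design choice is the same: dispatch crews on a statistical signal of drift rather than waiting for a component to fail in service.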

Sao Paulo, Brazil, is another city that sees the value of using big data for its public transportation. The city’s efforts concentrate on improving the management of its bus fleet. With big data collected in real time, the city can get a more accurate picture of just how many people are riding the buses, which routes are on time, how drivers respond to changing conditions, and many other factors. Based on this information, Sao Paulo can optimize its operations, providing added vehicles where demand is genuine whilst finding which routes are the most efficient. Without big data analytics, this process would have taken a very long time and would likely be hit-or-miss in terms of accuracy, but now, big data provides more certainty in a shorter amount of time….(More)”

Government data does not mean data governance: Lessons learned from a public sector application audit


Paper by Nik Thompson, Ravi Ravindran, and Salvatore Nicosia: “Public sector agencies routinely store large volumes of information about individuals in the community. The storage and analysis of this information benefits society, as it enables relevant agencies to make better informed decisions and to address the individual’s needs more appropriately. Members of the public often assume that the authorities are well equipped to handle personal data; however, due to implementation errors and lack of data governance, this is not always the case. This paper reports on an audit conducted in Western Australia, focusing on findings in the Police Firearms Management System and the Department of Health Information System. In the case of the Police, the audit revealed numerous data protection issues leading the auditors to report that they had no confidence in the accuracy of information on the number of people licensed to possess firearms or the number of licensed firearms. Similarly alarming conclusions were drawn in the Department of Health as auditors found that they could not determine which medical staff member was responsible for clinical data entries made. The paper describes how these issues often do not arise from existing business rules or the technology itself, but a lack of sound data governance. Finally, a discussion section presents key data governance principles and best practices that may guide practitioners involved in data management. These cases highlight the very real data management concerns, and the associated recommendations provide the context to spark further interest in the applied aspects of data protection….(More)”

 

Shifting from research governance to research ethics: A novel paradigm for ethical review in community-based research


Paper by Jay Marlowe and Martin Tolich: “This study examines a significant gap in the role of providing ethical guidance and support for community-based research. University and health-based ethical review committees in New Zealand predominantly serve as ‘gatekeepers’ that consider the ethical implications of a research design in order to protect participants and the institution from harm. However, in New Zealand, community-based researchers routinely do not have access to this level of support or review. A relatively new group, the New Zealand Ethics Committee (NZEC), formed in 2012, responds to the uneven landscape of access for community-based research. By offering ethical approval inclusive of the review of a project’s study design outside institutional settings, NZEC has endeavoured to move beyond a gatekeeping research governance function to that of bridge-building. This change of focus presents rich possibilities but also a number of limitations for providing ethical review outside conventional institutional contexts. This paper reports on the NZEC’s experience of working with community researchers to ascertain the possibilities and tensions of shifting ethics review processes from research governance to a focus on research ethics in community-based participatory research….(More)”

Ready Steady Gov


Joshua Chambers at FutureGov: “…two public servants in Western Australia have come up with an alternative way of pushing forwards their government’s digital delivery.

Their new project, Ready Steady Gov, provides free web templates based on an open source CMS so that any agency can quickly upgrade their web site, for free. The officials’ templates are based on the web site guidance published by the state: the Web Governance Framework and the Common Website Elements documentation.

The site was motivated by a desire to quickly improve government web sites. “I’m sure you’ve heard the phrase… ‘Everything takes longer in government’. We want building websites to become an exception to this rule,” wrote Jessy Yuen and Vincent Manera, the project’s founders.

They have created five open source templates “which are lightly styled so that you can easily integrate your own branding”. They are responsive so that they fit all screen sizes, and meet the required accessibility standards….(More)”

The International Handbook Of Public Administration And Governance


New book edited by Andrew Massey and Karen Johnston: “…Handbook explores key questions around the ways in which public administration and governance challenges can be addressed by governments in an increasingly globalized world. World-leading experts explore contemporary issues of government and governance, as well as the relationship between civil society and the political class. The insights offered will allow policy makers and officials to explore options for policy making in a new and informed way.

Adopting global perspectives of governance and public sector management, the Handbook includes scrutiny of current issues such as: public policy capacity, wicked policy problems, public sector reforms, the challenges of globalization and complexity management. Practitioners and scholars of public administration deliver a range of perspectives on the abiding wicked issues and challenges to delivering public services, and the way that delivery is structured. The Handbook uniquely provides international coverage of perspectives from Africa, Asia, North and South America, Europe and Australia.

Practitioners and scholars of public administration, public policy, public sector management and international relations will learn a great deal from this Handbook about the issues and structures of government and governance in an increasingly complex world. (Full table of contents)… (More).”

Study to examine Australian businesses’ use of government data


ComputerWorld: “New York University’s GovLab and the federal Department of Communications have embarked on a study of how Australian organisations are employing government data sets.

The ‘Open Data 500’ study was launched today at the Locate15 conference. It aims to provide a basis for assessing the value of open data and encourage the development of new businesses based on open data, as well as encourage discussion about how to make government data more useful to businesses and not-for-profit organisations.

The study is part of a series of studies taking place under the auspices of the OD500 Global Network.

“This study will help ensure the focus of Government is on the publication of high value datasets, with an emphasis on quality rather than quantity,” a statement issued by the Department of Communications said.

“Open Data 500 advances the government’s policy of increasing the number of high value public datasets in Australia in an effort to drive productivity and innovation, as well as its commitment to greater consultation with private sector stakeholders on open data,” Communications Minister Malcolm Turnbull said in remarks prepared for the Locate 15 conference….(More)”