The Unsung Role That Ordinary Citizens Played in the Great Crime Decline


Emily Badger in The New York Times: “Most theories for the great crime decline that swept across nearly every major American city over the last 25 years have focused on the would-be criminals.

Their lives changed in many ways starting in the 1990s: Strict new policing tactics kept closer watch on them. Mass incarceration locked them up in growing numbers. The crack epidemic that ensnared many began to recede. Even the more unorthodox theories — around the rise of abortion, the reduction in lead or the spread of A.D.H.D. medication — have argued that larger shifts in society altered the behavior (and existence) of potential criminals.

But none of these explanations have paid much attention to the communities where violence plummeted the most. New research suggests that people there were working hard, with little credit, to address the problem themselves.

Local nonprofit groups that responded to the violence by cleaning streets, building playgrounds, mentoring children and employing young men had a real effect on the crime rate. That’s what Patrick Sharkey, a sociologist at New York University, argues in a new study and a forthcoming book. Mr. Sharkey doesn’t contend that community groups alone drove the national decline in crime, but rather that their impact is a major missing piece.

“This was a part that has been completely overlooked and ignored in national debates over the crime drop,” he said. “But I think it’s fundamental to what happened.”…(More)”.

Once Upon an Algorithm: How Stories Explain Computing


Book by Martin Erwig: “Picture a computer scientist, staring at a screen and clicking away frantically on a keyboard, hacking into a system, or perhaps developing an app. Now delete that picture. In Once Upon an Algorithm, Martin Erwig explains computation as something that takes place beyond electronic computers, and computer science as the study of systematic problem solving. Erwig points out that many daily activities involve problem solving. Getting up in the morning, for example: You get up, take a shower, get dressed, eat breakfast. This simple daily routine solves a recurring problem through a series of well-defined steps. In computer science, such a routine is called an algorithm.
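The routine can even be written down literally as code. Here is a minimal Python sketch (our illustration, not from the book) of the morning routine as an explicit sequence of well-defined steps:

```python
# A minimal sketch (not from the book): the recurring "start the day"
# problem solved by a fixed sequence of well-defined steps.

def morning_routine():
    steps = ["get up", "take a shower", "get dressed", "eat breakfast"]
    completed = []
    for step in steps:          # execute each step, in order
        completed.append(step)  # stand-in for actually performing the step
    return completed

print(morning_routine())
# ['get up', 'take a shower', 'get dressed', 'eat breakfast']
```

The point is not the code but the form: the same steps, in the same order, every day, and that repeatability is exactly what makes the routine an algorithm.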

Erwig illustrates a series of concepts in computing with examples from daily life and familiar stories. Hansel and Gretel, for example, execute an algorithm to get home from the forest. The movie Groundhog Day illustrates the problem of unsolvability; Sherlock Holmes manipulates data structures when solving a crime; the magic in Harry Potter’s world is understood through types and abstraction; and Indiana Jones demonstrates the complexity of searching. Along the way, Erwig also discusses representations and different ways to organize data; “intractable” problems; language, syntax, and ambiguity; control structures, loops, and the halting problem; different forms of recursion; and rules for finding errors in algorithms.
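The "complexity of searching" that the Indiana Jones example dramatizes can be made concrete with a small sketch (our example, not taken from the book): checking items one by one versus repeatedly halving a sorted collection.

```python
# Comparing two search strategies on a made-up sorted "shelf" of 1,000 items.
# Each function returns how many comparisons it needed to find the target.

def linear_search(items, target):
    steps = 0
    for item in items:          # check every item, one by one
        steps += 1
        if item == target:
            return steps
    return steps

def binary_search(items, target):   # items must be sorted
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2        # look at the middle item
        if items[mid] == target:
            return steps
        elif items[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return steps

shelf = list(range(1000))
print(linear_search(shelf, 999))    # 1000 comparisons in the worst case
print(binary_search(shelf, 999))    # 10 comparisons (about log2 of 1000)
```

Halving the search space at every step is why a sorted collection of a million items needs only about twenty comparisons instead of a million.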

This engaging book explains computation accessibly and shows its relevance to daily life. Something to think about next time we execute the algorithm of getting up in the morning…(More)”.

Public Brainpower: Civil Society and Natural Resource Management


Book edited by Indra Øverland: “…examines how civil society, public debate and freedom of speech affect natural resource governance. Drawing on the theories of Robert Dahl, Jürgen Habermas and Robert Putnam, the book introduces the concept of ‘public brainpower’, proposing that good institutions require: fertile public debate involving many and varied contributors to provide a broad base for conceiving new institutions; checks and balances on existing institutions; and the continuous dynamic evolution of institutions as the needs of society change.

The book explores the strength of these ideas through case studies of 18 oil- and gas-producing countries: Algeria, Angola, Azerbaijan, Canada, Colombia, Egypt, Iraq, Kazakhstan, Libya, the Netherlands, Nigeria, Norway, Qatar, Russia, Saudi Arabia, the UAE, the UK and Venezuela. The concluding chapter includes 10 tenets on how states can maximize their public brainpower, and a ranking of 33 resource-rich countries and the degree to which they succeed in doing so.

The Introduction and the chapters ‘Norway: Public Debate and the Management of Petroleum Resources and Revenues’, ‘Kazakhstan: Civil Society and Natural-Resource Policy in Kazakhstan’, and ‘Russia: Public Debate and the Petroleum Sector’ of this book are available open access under a CC BY 4.0 license at link.springer.com….(More)”.

Government 3.0 – Next Generation Government Technology Infrastructure and Services


Book edited by Adegboyega Ojo and Jeremy Millard: “Historically, technological change has had a significant effect on the locus of administrative activity, the cost of carrying out administrative tasks, the skill sets officials need to function effectively, rules and regulations, and the types of interactions citizens have with their public authorities. Next-generation public sector innovation will be “Government 3.0”, powered by innovations related to open and big data, administrative and business process management, the Internet of Things and blockchains, to drive improvements in service delivery, decision and policy making and resource management. This book provides fresh insights into this transformation while also examining possible negative side effects of the increasing openness of governments through the adoption of these new innovations. The goal is for technology policy makers to engage with the visions of Government 3.0. Researchers should be able to critically examine some of the innovations described in the book as the basis for developing research agendas related to challenges associated with the adoption and use of the associated technologies. The book serves as a rich source of materials from leading experts in the field, enabling public administration practitioners to better understand how these new technologies affect traditional public administration paradigms. The book is suitable for graduate courses in Public Sector Innovation, Innovation in Public Administration, E-Government and Information Systems. Public sector technology policy makers, and e-government, information systems and public administration researchers and practitioners, should all benefit from reading this book….(More).”

Our laws don’t do enough to protect our health data


At The Conversation: “A particularly sensitive type of big data is medical big data. Medical big data can consist of electronic health records, insurance claims, information entered by patients into websites such as PatientsLikeMe, and more. Health information can even be gleaned from web searches, Facebook and your recent purchases.

Such data can be used for beneficial purposes by medical researchers, public health authorities, and healthcare administrators. For example, they can use it to study medical treatments, combat epidemics and reduce costs. But others who can obtain medical big data may have more selfish agendas.

I am a professor of law and bioethics who has researched big data extensively. Last year, I published a book entitled Electronic Health Records and Medical Big Data: Law and Policy.

I have become increasingly concerned about how medical big data might be used and who could use it. Our laws currently don’t do enough to prevent harm associated with big data.

What your data says about you

Personal health information could be of interest to many, including employers, financial institutions, marketers and educational institutions. Such entities may wish to exploit it for decision-making purposes.

For example, employers presumably prefer healthy employees who are productive, take few sick days and have low medical costs. However, the Americans with Disabilities Act (ADA) and the Genetic Information Nondiscrimination Act (GINA) prohibit employers from discriminating against workers because of their health conditions. So, employers are not permitted to reject qualified applicants simply because they have diabetes, depression or a genetic abnormality.

However, the same is not true for most predictive information regarding possible future ailments. Nothing prevents employers from rejecting or firing healthy workers out of the concern that they will later develop an impairment or disability, unless that concern is based on genetic information.

What non-genetic data can provide evidence regarding future health problems? Smoking status, eating preferences, exercise habits, weight and exposure to toxins are all informative. Scientists believe that biomarkers in your blood and other health details can predict cognitive decline, depression and diabetes.

Even bicycle purchases, credit scores and voting in midterm elections can be indicators of your health status.

Gathering data

How might employers obtain predictive data? An easy source is social media, where many individuals publicly post very private information. Through social media, your employer might learn that you smoke, hate to exercise or have high cholesterol.

Another potential source is wellness programs. These programs seek to improve workers’ health through incentives to exercise, stop smoking, manage diabetes, obtain health screenings and so on. While many wellness programs are run by third-party vendors that promise confidentiality, that is not always the case.

In addition, employers may be able to purchase information from data brokers that collect, compile and sell personal information. Data brokers mine sources such as social media, personal websites, U.S. Census records, state hospital records, retailers’ purchasing records, real property records, insurance claims and more. Two well-known data brokers are Spokeo and Acxiom.

Some of the data employers can obtain identify individuals by name. But even information that does not provide obvious identifying details can be valuable. Wellness program vendors, for example, might provide employers with summary data about their workforce but strip away particulars such as names and birthdates. Nevertheless, de-identified information can sometimes be re-identified by experts. Data miners can match information to data that is publicly available….(More)”.
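Re-identification of this kind typically works by linking the "anonymous" records to a named public dataset on shared quasi-identifiers. A toy sketch, using entirely made-up records:

```python
# Toy illustration with hypothetical data: "de-identified" health records
# can be re-identified by joining on quasi-identifiers (ZIP code, birth
# date, sex) that also appear in a public, named dataset.

deidentified_health = [
    {"zip": "10001", "birthdate": "1980-03-14", "sex": "F", "diagnosis": "diabetes"},
    {"zip": "94105", "birthdate": "1975-07-02", "sex": "M", "diagnosis": "depression"},
]

public_records = [  # e.g. a voter roll: names attached to the same fields
    {"name": "Alice Smith", "zip": "10001", "birthdate": "1980-03-14", "sex": "F"},
    {"name": "Bob Jones", "zip": "94105", "birthdate": "1975-07-02", "sex": "M"},
]

def reidentify(health_rows, public_rows):
    quasi = ("zip", "birthdate", "sex")
    # Index the named records by their quasi-identifier combination...
    index = {tuple(r[k] for k in quasi): r["name"] for r in public_rows}
    # ...then join the "anonymous" rows against that index.
    return {index[key]: r["diagnosis"]
            for r in health_rows
            if (key := tuple(r[k] for k in quasi)) in index}

print(reidentify(deidentified_health, public_records))
# {'Alice Smith': 'diabetes', 'Bob Jones': 'depression'}
```

Real linkage attacks use the same join logic at much larger scale; combinations such as ZIP code, birth date and sex turn out to be surprisingly identifying.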

The Arsenal of Exclusion and Inclusion


Book by Tobias Armborst, Daniel D’Oca and Georgeen Theodore: “Urban History 101 teaches us that the built environment is not the product of invisible, uncontrollable market forces, but of human-made tools that could have been used differently (or not at all). The Arsenal of Exclusion & Inclusion is an encyclopedia of 202 tools–or what we call “weapons”–that architects, planners, policy-makers, developers, real estate brokers, activists, and other urban actors in the United States use to restrict or increase access to urban space. The Arsenal of Exclusion & Inclusion inventories these weapons, examines how they have been used, and speculates about how they might be deployed (or retired) to make more open cities in which more people feel welcome in more spaces.

The Arsenal of Exclusion & Inclusion includes minor, seemingly benign weapons like no loitering signs and bouncers, but also big, headline-grabbing things like eminent domain and city-county consolidation. It includes policies like expulsive zoning and annexation, but also practices like blockbusting, institutions like neighborhood associations, and physical things like bombs and those armrests that park designers put on benches to make sure homeless people don’t get too comfortable. It includes historical things that aren’t talked about too much any more (e.g., ugly laws), things that seem historical but aren’t (e.g., racial steering), and things that are brand new (e.g., aging improvement districts).

With contributions from over fifty of the best minds in architecture, urban planning, urban history, and geography, The Arsenal of Exclusion & Inclusion offers a wide-ranging view of the policies, institutions, and social practices that shape our cities. It can be read as a historical account of the making of the modern American city, as a toolbox of best practices for creating better, more just spaces, or as an introduction to the process of city-making in the United States….(More)”.

Open Space: The Global Effort for Open Access to Environmental Satellite Data


Book by Mariel Borowitz: “Key to understanding and addressing climate change is continuous and precise monitoring of environmental conditions. Satellites play an important role in collecting climate data, offering comprehensive global coverage that can’t be matched by in situ observation. And yet, as Mariel Borowitz shows in this book, much satellite data is not freely available but restricted; this remains true despite the data-sharing advocacy of international organizations and a global open data movement. Borowitz examines policies governing the sharing of environmental satellite data, offering a model of data-sharing policy development and applying it in case studies from the United States, Europe, and Japan—countries responsible for nearly half of the unclassified government Earth observation satellites.

Borowitz develops a model that centers on the government agency as the primary actor while taking into account the roles of outside actors, such as other government officials and non-governmental actors, as well as the economic, security, and normative attributes of the data itself. The case studies include the U.S. National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), and the United States Geological Survey (USGS); the European Space Agency (ESA) and the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT); and the Japan Aerospace Exploration Agency (JAXA) and the Japan Meteorological Agency (JMA). Finally, she considers the policy implications of her findings for the future and provides recommendations on how to increase global sharing of satellite data….(More)”.

Decoding the Social World: Data Science and the Unintended Consequences of Communication


Book by Sandra González-Bailón: “Social life is full of paradoxes. Our intentional actions often trigger outcomes that we did not intend or even envision. How do we explain those unintended effects and what can we do to regulate them? In Decoding the Social World, Sandra González-Bailón explains how data science and digital traces help us solve the puzzle of unintended consequences—offering the solution to a social paradox that has intrigued thinkers for centuries. Communication has always been the force that makes a collection of people more than the sum of individuals, but only now can we explain why: digital technologies have made it possible to parse the information we generate by being social in new, imaginative ways. And yet we must look at that data, González-Bailón argues, through the lens of theories that capture the nature of social life. The technologies we use, in the end, are also a manifestation of the social world we inhabit.

González-Bailón discusses how the unpredictability of social life relates to communication networks, social influence, and the unintended effects that derive from individual decisions. She describes how communication generates social dynamics in aggregate (leading to episodes of “collective effervescence”) and discusses the mechanisms that underlie large-scale diffusion, when information and behavior spread “like wildfire.” She applies the theory of networks to illuminate why collective outcomes can differ drastically even when they arise from the same individual actions. By opening the black box of unintended effects, González-Bailón identifies strategies for social intervention and discusses the policy implications—and how data science and evidence-based research embolden critical thinking in a world that is constantly changing….(More)”.

Policy Analytics, Modelling, and Informatics


Book edited by J. Ramon Gil-Garcia, Theresa A. Pardo and Luis F. Luna-Reyes: “This book provides a comprehensive approach to the study of policy analytics, modelling and informatics. It includes theories and concepts for understanding tools and techniques used by governments seeking to improve decision making through the use of technology, data, modelling, and other analytics, and provides relevant case studies and practical recommendations. Governments around the world face policy issues that require strategies and solutions using new technologies, new access to data, and new analytical tools and techniques such as computer simulation, geographic information systems, and social network analysis for the successful implementation of public policy and government programs. Chapters include cases, concepts, methodologies, theories, experiences, and practical recommendations on data analytics and modelling for public policy and practice, and address a diversity of data tools applied to different policy stages, in several contexts, and at different levels and branches of government. This book will be of interest to researchers, students, and practitioners in e-government, public policy, public administration, policy analytics and policy informatics….(More)”.

The application of crowdsourcing approaches to cancer research: a systematic review


Paper by Young Ji Lee, Janet A. Arida, and Heidi S. Donovan at Cancer Medicine: “Crowdsourcing is “the practice of obtaining participants, services, ideas, or content by soliciting contributions from a large group of people, especially via the Internet.” (Ranard et al. J. Gen. Intern. Med. 29:187, 2014) Although crowdsourcing has been adopted in healthcare research and its potential for analyzing large datasets and obtaining rapid feedback has recently been recognized, no systematic reviews of crowdsourcing in cancer research have been conducted. Therefore, we sought to identify applications of and explore potential uses for crowdsourcing in cancer research. We conducted a systematic review of articles published between January 2005 and June 2016 on crowdsourcing in cancer research, using PubMed, CINAHL, Scopus, PsycINFO, and Embase. Data from the 12 identified articles were summarized but not combined statistically. The studies addressed a range of cancers (e.g., breast, skin, gynecologic, colorectal, prostate). Eleven studies collected data on the Internet using web-based platforms; one recruited participants in a shopping mall using paper-and-pen data collection. Four studies used Amazon Mechanical Turk for recruiting and/or data collection. Study objectives comprised categorizing biopsy images (n = 6), assessing cancer knowledge (n = 3), refining a decision support system (n = 1), standardizing survivorship care-planning (n = 1), and designing a clinical trial (n = 1). Although one study demonstrated that “the wisdom of the crowd” (NCI Budget Fact Book, 2017) could not replace trained experts, five studies suggest that distributed human intelligence could approximate or support the work of trained experts. Despite limitations, crowdsourcing has the potential to improve the quality and speed of research while reducing costs. Longitudinal studies should confirm and refine these findings….(More)”