A theoretical framework explaining the mechanisms of nudging


Paper by Åsa Löfgren and Katarina Nordblom: “…we develop a theoretical model to clarify the underlying mechanisms that drive individual decision making and responses to behavioral interventions, such as nudges. The contribution of the paper is three-fold: First, the model provides a theoretical framework that comprehensively structures the individual decision-making process applicable to a wide range of choice situations. Second, we reduce the confusion regarding what should be called a nudge by offering a clear classification of behavioral interventions. We distinguish among what we label as pure nudges, preference nudges, and other behavioral interventions. Third, we identify the mechanisms behind the effectiveness of behavioral interventions based on the structured decision-making process. Hence, the model can be used to predict under which circumstances, and in which choice situations, a nudge is likely to be effective….(More)”

The Governance of Digital Technology, Big Data, and the Internet: New Roles and Responsibilities for Business


Introduction to Special Issue of Business and Society by Dirk Matten, Ronald Deibert & Mikkel Flyverbom: “The importance of digital technologies for social and economic developments and a growing focus on data collection and privacy concerns have made the Internet a salient and visible issue in global politics. Recent developments have increased the awareness that the current approach of governments and business to the governance of the Internet and the adjacent technological spaces raises a host of ethical issues. The significance and challenges of the digital age have been further accentuated by a string of highly exposed cases of surveillance and a growing concern about issues of privacy and the power of this new industry. This special issue explores what some have referred to as the “Internet-industrial complex”—the intersections between business, states, and other actors in the shaping, development, and governance of the Internet…(More)”.

e-Citizens: Toward a New Model of (Inter)active Citizenry


Book by Alfredo M. Ronchi: “…This book explores a society currently being transformed by the influence of advanced information technology, and provides insights into the main technological and human issues and a holistic approach to inclusion, security, safety and, last but not least, privacy and freedom of expression. Its main aim is to bridge the gap between technological solutions, their successful implementation, and the fruitful utilization of the main set of e-Services offered by governments, private institutions, and commercial companies.
Today, various parameters actively influence e-Services’ success or failure: cultural aspects, organisational issues, bureaucracy and workflow, infrastructure and technology in general, user habits, literacy, capacity or merely interaction design. The purpose of this book is to help in outlining and understanding a realistic scenario of what we can term e-Citizenry. It identifies today’s citizen, who is surrounded by an abundance of digital services, as an “e-Citizen” and explores the transition from their traditional role and behaviour to new ones. The respective chapters presented here will lay the foundation of the technological and social environment in which this societal transition takes place…(More)”.

The Palgrave Handbook of Global Health Data Methods for Policy and Practice


Book edited by Sarah B. Macfarlane and Carla AbouZahr: “This handbook compiles methods for gathering, organizing and disseminating data to inform policy and manage health systems worldwide. Contributing authors describe national and international structures for generating data and explain the relevance of ethics, policy, epidemiology, health economics, demography, statistics, geography and qualitative methods to describing population health. The reader, whether a student of global health, public health practitioner, programme manager, data analyst or policymaker, will appreciate the methods, context and importance of collecting and using global health data….(More)”.

The People’s Republic of Walmart


Book by Leigh Phillips and Michal Rozworski: “For the left and the right, major multinational companies are held up as the ultimate expressions of free-market capitalism. Their remarkable success appears to vindicate the old idea that modern society is too complex to be subjected to a plan. And yet, as Leigh Phillips and Michal Rozworski argue, much of the economy of the West is centrally planned at present. Not only is planning on vast scales possible, we already have it and it works. The real question is whether planning can be democratic. Can it be transformed to work for us?

An engaging, polemical romp through economic theory, computational complexity, and the history of planning, The People’s Republic of Walmart revives the conversation about how society can extend democratic decision-making to all economic matters. With the advances in information technology in recent decades and the emergence of globe-straddling collective enterprises, democratic planning in the interest of all humanity is more important and closer to attainment than ever before….(More)”.

Negotiating Internet Governance


(Open Access) Book by Roxana Radu: “… provides an incisive analysis of the emergence and evolution of global Internet governance, revealing its mechanisms, key actors and dominant community practices. Based on extensive empirical analysis covering more than four decades, it presents the evolution of Internet regulation from the early days of networking to more recent debates on algorithms and artificial intelligence, putting into perspective its politically-mediated system of rules built on technical features and power differentials. 

For anyone interested in understanding contemporary global developments, this book is a primer on how norms of behaviour online and Internet regulation are renegotiated in numerous fora by a variety of actors – including governments, businesses, international organisations, civil society, technical and academic experts – and what that means for everyday users….(More)”.

Regulating disinformation with artificial intelligence


Paper for the European Parliamentary Research Service: “This study examines the consequences of the increasingly prevalent use of artificial intelligence (AI) in disinformation initiatives upon freedom of expression, pluralism and the functioning of a democratic polity. The study examines the trade-offs in using automated technology to limit the spread of disinformation online. It presents options (from self-regulatory to legislative) to regulate automated content recognition (ACR) technologies in this context. Special attention is paid to the opportunities for the European Union as a whole to take the lead in setting the framework for designing these technologies in a way that enhances accountability and transparency and respects free speech. The present project reviews some of the key academic and policy ideas on technology and disinformation and highlights their relevance to European policy.

Chapter 1 introduces the background to the study and presents the definitions used. Chapter 2 scopes the policy boundaries of disinformation from economic, societal and technological perspectives, focusing on the media context, behavioural economics and technological regulation. Chapter 3 maps and evaluates existing regulatory and technological responses to disinformation. In Chapter 4, policy options are presented, paying particular attention to interactions between technological solutions, freedom of expression and media pluralism….(More)”.

Using street imagery and crowdsourcing internet marketplaces to measure motorcycle helmet use in Bangkok, Thailand


Hasan S. Merali, Li-Yi Lin, Qingfeng Li, and Kavi Bhalla in Injury Prevention: “The majority of Thailand’s road traffic deaths occur on motorised two-wheeled or three-wheeled vehicles. Accurately measuring helmet use is important for the evaluation of new legislation and enforcement. Current methods for estimating helmet use involve roadside observation or surveillance of police and hospital records, both of which are time-consuming and costly. Our objective was to develop a novel method of estimating motorcycle helmet use.

Using Google Maps, 3000 intersections in Bangkok were selected at random. At each intersection, hyperlinks of four images 90° apart were extracted. These 12 000 images were processed in Amazon Mechanical Turk using crowdsourcing to identify images containing motorcycles. The remaining images were sorted manually to determine helmet use.

After processing, 462 unique motorcycle drivers were analysed. The overall helmet wearing rate was 66.7% (95% CI 62.6% to 71.0%). …

This novel method of estimating helmet use has produced results similar to traditional methods. Applying this technology can reduce time and monetary costs and could be used anywhere street imagery is used. Future directions include automating this process through machine learning….(More)”.
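
The pipeline described above reduces to two steps that are easy to sketch: pulling four static street images, 90° apart, for each sampled intersection, and computing a confidence interval for the observed helmet-wearing proportion. The short Python sketch below illustrates both under stated assumptions: the Street View Static API endpoint and the Bangkok bounding box are illustrative (the paper only says image hyperlinks were extracted from Google Maps), and the counts plugged into the interval are the figures reported in the abstract (roughly 308 of 462 drivers wearing helmets).

```python
import math
import random

# Step 1: sample intersection coordinates and build street-image URLs.
# The paper extracts hyperlinks of four images, 90 degrees apart, for each of
# 3000 randomly selected Bangkok intersections. The Street View Static API
# endpoint and the bounding box below are illustrative assumptions, not
# details taken from the paper.
BANGKOK_BBOX = (13.60, 100.40, 13.95, 100.75)  # approx (lat_min, lng_min, lat_max, lng_max)

def sample_point(bbox):
    """Draw a random coordinate inside the bounding box."""
    lat = random.uniform(bbox[0], bbox[2])
    lng = random.uniform(bbox[1], bbox[3])
    return lat, lng

def image_urls(lat, lng, api_key="YOUR_KEY"):
    """Four static street images, 90 degrees apart, at one location."""
    base = "https://maps.googleapis.com/maps/api/streetview"
    return [
        f"{base}?size=640x640&location={lat:.6f},{lng:.6f}&heading={h}&key={api_key}"
        for h in (0, 90, 180, 270)
    ]

# Step 2: estimate the helmet-wearing rate with a 95% confidence interval.
# 308 of 462 drivers corresponds to the 66.7% reported in the abstract.
def proportion_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

lat, lng = sample_point(BANGKOK_BBOX)
print(image_urls(lat, lng)[0])

p, lo, hi = proportion_ci(successes=308, n=462)
print(f"helmet use: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
# -> helmet use: 66.7% (95% CI 62.4% to 71.0%)
```

A simple Wald interval on those counts gives roughly 62.4% to 71.0%, close to the interval reported by the authors; the paper may well have used a different interval method, so the sketch should be read as a rough reconstruction rather than the authors' exact calculation.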

Identifying commonly used and potentially unsafe transit transfers with crowdsourcing


Paper by Elizabeth J. Traut and Aaron Steinfeld: “Public transit is an important contributor to sustainable transportation as well as a public service that makes necessary travel possible for many. Poor transit transfers can lead to both a real and perceived reduction in convenience and safety, especially for people with disabilities. Poor transfers can expose riders to inclement weather and crime, and they can reduce transit ridership by motivating riders who have the option of driving or using paratransit to elect a more expensive and inefficient travel mode. Unfortunately, knowledge about inconvenient, missed, and unsafe transit transfers is sparse and incomplete.

We show that crowdsourced public transit ridership data, which is more scalable than conducting traditional surveys, can be used to analyze transit transfers. The Tiramisu Transit app merges open transit data with information contributed by users about which trips they take. We use Tiramisu data to do origin-destination analysis and identify connecting trips to create a better understanding of where and when poor transfers are occurring in the Pittsburgh region. We merge the results with data from other open public data sources, including crime data, to create a data resource that can be used for planning and identification of locations where bus shelters and other infrastructure improvements may facilitate safer and more comfortable waits and more accessible transfers. We use generalizable methods to ensure broader value to both science and practitioners.

We present a case study of the Pittsburgh region, in which we identified and characterized 338 transfers from 142 users. We found that 66.6% of transfers were within 0.4 km (0.25 mi.) and 44.1% of transfers were less than 10 min. We identified the geographical distribution of transfers and found several highly utilized transfer locations that were not identified by the Port Authority of Allegheny County as recommended transfer points, and so might need more planning attention. We cross-referenced transfer location and wait time data with crime levels to provide additional planning insight….(More)”.
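
The core of the origin-destination step is pairing consecutive trips by the same rider whose alighting point and next boarding point are close in space and time. The Python sketch below shows one way to do that with a haversine distance and a wait-time window; the trip schema, the 0.4 km walking threshold (borrowed from the statistic reported above), and the 30-minute wait cutoff are illustrative assumptions rather than the authors' exact parameters.

```python
import math
from datetime import timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_transfers(trips, max_walk_km=0.4, max_wait=timedelta(minutes=30)):
    """Pair consecutive trips by the same rider into candidate transfers.

    `trips` is a list of dicts with keys: user, route, board_time, board_lat,
    board_lon, alight_time, alight_lat, alight_lon (a hypothetical schema,
    standing in for the merged Tiramisu/open-transit records).
    """
    transfers = []
    trips = sorted(trips, key=lambda t: (t["user"], t["board_time"]))
    for prev, nxt in zip(trips, trips[1:]):
        if prev["user"] != nxt["user"]:
            continue
        wait = nxt["board_time"] - prev["alight_time"]
        walk = haversine_km(prev["alight_lat"], prev["alight_lon"],
                            nxt["board_lat"], nxt["board_lon"])
        if timedelta(0) <= wait <= max_wait and walk <= max_walk_km:
            transfers.append({
                "user": prev["user"],
                "from_route": prev["route"],
                "to_route": nxt["route"],
                "wait_min": wait.total_seconds() / 60,
                "walk_km": walk,
                "location": (nxt["board_lat"], nxt["board_lon"]),
            })
    return transfers
```

The wait-time cap is a design choice: without it, the end of one journey and the start of an unrelated later journey would be counted as a transfer. The resulting transfer locations can then be joined against open crime data, as the paper describes, to flag candidate sites for shelters or other improvements.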

Toward an Open Data Bias Assessment Tool: Measuring Bias in Open Spatial Data


Working Paper by Ajjit Narayanan and Graham MacDonald: “Data is a critical resource for government decisionmaking, and in recent years, local governments, in a bid for transparency, community engagement, and innovation, have released many municipal datasets on publicly accessible open data portals. At the same time, advocates, reporters, and others have voiced concerns about the bias of algorithms used to guide public decisions and the data that power them.

Although significant progress is being made in developing tools for assessing algorithmic bias and transparency, we could not find any standardized tools available for assessing bias in open data itself. In other words, how can policymakers, analysts, and advocates systematically measure the level of bias in the data that power city decisionmaking, whether an algorithm is used or not?

To fill this gap, we present a prototype of an automated bias assessment tool for geographic data. This new tool will allow city officials, concerned residents, and other stakeholders to quickly assess the bias and representativeness of their data. The tool allows users to upload a file with latitude and longitude coordinates and receive simple metrics of spatial and demographic bias across their city.

The tool is built on geographic and demographic data from the Census and assumes that the population distribution in a city represents the “ground truth” of the underlying distribution in the data uploaded. To provide an illustrative example of the tool’s use and output, we test our bias assessment on three datasets—bikeshare station locations, 311 service request locations, and Low Income Housing Tax Credit (LIHTC) building locations—across a few hand-selected example cities….(More)”
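
The comparison the tool makes, uploaded point locations against Census population counts treated as the ground truth, can be sketched as a per-tract difference in shares plus a dissimilarity-style summary score. The Python sketch below assumes the uploaded latitude/longitude points have already been joined to census tracts; the specific metric shown is an illustrative choice, not necessarily the one implemented in the prototype.

```python
from collections import Counter

def spatial_bias_metrics(point_tracts, tract_population):
    """Compare where data points fall against where people live.

    point_tracts: list of census-tract IDs, one per uploaded lat/lon point
        (assumes a point-in-polygon join to tracts has already been done).
    tract_population: dict mapping tract ID -> population count.
    Returns per-tract over/under-representation and a dissimilarity-style
    summary score (0 = data mirrors the population, 1 = maximally skewed).
    """
    counts = Counter(point_tracts)
    n_points = sum(counts.values())
    total_pop = sum(tract_population.values())

    per_tract = {}
    dissimilarity = 0.0
    for tract, pop in tract_population.items():
        data_share = counts.get(tract, 0) / n_points
        pop_share = pop / total_pop
        per_tract[tract] = data_share - pop_share      # >0: over-represented
        dissimilarity += abs(data_share - pop_share)
    return per_tract, dissimilarity / 2

# Illustrative use with made-up tract IDs and populations:
points = ["T1", "T1", "T1", "T2"]                 # e.g. bikeshare station tracts
population = {"T1": 1000, "T2": 3000, "T3": 2000}
per_tract, score = spatial_bias_metrics(points, population)
print(per_tract, round(score, 3))
```

In this toy example the bikeshare-style points cluster in the least populous tract, so the per-tract differences and the summary score flag exactly the kind of spatial skew the working paper sets out to surface for city officials and residents.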