Stefaan Verhulst
Article by Michael Walton: “Two ideas in development – activating agency of citizens and using “nudges” to change their behavior – seem diametrically opposed in spirit: activating latent agency at the ground level versus top-down designs that exploit people’s behavioral responses. Yet both start from a psychological focus and a belief that changes in people’s behavior can lead to “better” outcomes, for the individuals involved and for society. So how should we think of these contrasting sets of ideas? When should each approach be used?…
Let’s compare the two approaches with respect to diagnostic frame, practice and ethics.
Diagnostic frame.
The common ground is recognition that people use short-cuts in decision-making, in ways that can hurt their own interests. Both approaches emphasize that decision-making is particularly tough for poor people, given the sheer weight of daily problem-solving. In behavioral economics, one core idea is that we have limited mental “bandwidth,” and this form of scarcity hampers decision-making. In the “agency” tradition, however, there is much more emphasis on unearthing and working with the origins of the prevailing mental models: social exclusion, stigmatization, and the typically unequal economic and cultural relations with more powerful groups in a society. One approach works more with symptoms, the other with root causes.
Implications for practice.
The two approaches on display in Cerrito both concern social gains, and both involve a role for an external actor. But here the contrast is sharp. In the “nudge” approach the external actor is a beneficent technocrat, trying out alternative offers to poor (or non-poor) people to improve outcomes. A vivid example is alternative messages to taxpayers in Guatemala that induce varying improvements in tax payments. In the “agency” approach the essence of the interaction is between a front-line worker and an individual or family, with a co-created diagnosis and plan, designed around goals and specific actions that the poor person chooses. This is akin to what anthropologist Arjun Appadurai termed increasing the “capacity to aspire,” and can extend to greater engagement in civic and political life.
Ethics.
In both approaches, ethics is central. As the concern over “nudging for social good as opposed to electoral gain” implies, some form of ethical regulation is surely needed. In “action to activate agency,” the central ethical issue is maintaining equality in design between activist and citizen, and explicit ownership of any decisions.
What does this imply?
To some degree this is a question of domain of action. Nudging is most appropriate where a program already enjoys full political and social support and the issue is how to make it work (as in paying taxes). The agency approach has a broader ambition, but starts from domains that are potentially within an individual’s control once the sources of “ineffective” or inhibited behavior are tackled, including via front-line interactions with public or private actors….(More)”.
Introduction by Matt Hancock MP, Secretary of State for Digital, Culture, Media and Sport to the UK’s Data Ethics Framework: “Making better use of data offers huge benefits, in helping us provide the best possible services to the people we serve.
However, all new opportunities present new challenges. Technology is changing so fast that we need to make sure we are constantly adapting our codes and standards. Those of us in the public sector need to lead the way.
As we set out to develop our National Data Strategy, getting the ethics right, particularly in the delivery of public services, is critical. To do this, it is essential that we agree collective standards and ethical frameworks.
Ethics and innovation are not mutually exclusive. Thinking carefully about how we use our data can help us be better at innovating when we use it.
Our new Data Ethics Framework sets out clear principles for how data should be used in the public sector. It will help us maximise the value of data whilst also setting the highest standards for transparency and accountability when building or buying new data technology.
We have come a long way since we published the first version of the Data Science Ethical Framework. This new version focuses on the need for technology, policy and operational specialists to work together, so we can make the most of expertise from across disciplines.
We want to work with others to develop transparent standards for using new technology in the public sector, promoting innovation in a safe and ethical way.
This framework will build the confidence in public sector data use needed to underpin a strong digital economy. I am looking forward to working with all of you to put it into practice…. (More)”
The Data Ethics Framework principles
1. Start with clear user need and public benefit
2. Be aware of relevant legislation and codes of practice
3. Use data that is proportionate to the user need
4. Understand the limitations of the data
5. Ensure robust practices and work within your skillset
Technical note by Mariano Lafuente and Sebastián González, prepared as part of the Inter-American Development Bank’s (IDB) agenda on Center of Government: “… analyzes how delivery units (DUs) have been adapted by Latin American and Caribbean governments, the degree to which they have contributed to meeting governments’ priority goals between 2007 and 2018, and the lessons learned along the way. The analysis, which draws lessons from 14 governments in the region, shows that the implementation of the DU model has varied as it has been tailored to each country’s context and that, under certain preconditions, it has contributed to: (i) improved management using specific tools in contexts where institutional development is low; and (ii) attainment of results that have a direct impact on citizens. The objective of this document is to serve as a guide for governments interested in applying similar management models, as well as to set out an agenda for the future of DUs in the region….(More)”.
World Bank Policy Research Paper by Daniel Ayalew Ali, Klaus Deininger and Michael Wild: “The technical complexity of ensuring that tax rolls are complete and valuations current is often perceived as a major barrier to bringing in more property tax revenues in developing countries.
This paper shows how high-resolution satellite imagery makes it possible to assess the completeness of existing tax maps by estimating built-up areas based on building heights and footprints. Together with information on sales prices from the land registry, targeted surveys, and routine statistical data, this makes it possible to use mass valuation procedures to generate tax maps. The example of Kigali illustrates the reliability of the method and the potentially far-reaching revenue impacts. Estimates show that heightened compliance and a move to a 1 percent ad valorem tax would yield a tenfold increase in revenue from public land….(More)”.
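As a rough illustration of the mass-valuation arithmetic described above, here is a minimal Python sketch: estimate each parcel’s taxable value from satellite-derived building footprints and heights plus per-square-metre sale prices, then apply a flat 1 percent ad valorem rate. All field names, zones and figures below are hypothetical, not the paper’s data or code.

    # Minimal sketch (hypothetical data): built-up area x unit price, taxed ad valorem.
    def estimate_ad_valorem_revenue(parcels, price_per_sqm_by_zone, tax_rate=0.01):
        """Return total annual revenue under a flat ad valorem property tax."""
        total = 0.0
        for p in parcels:
            built_area = p["footprint_sqm"] * p["floors"]  # proxy for the taxable structure
            value = built_area * price_per_sqm_by_zone[p["zone"]]
            total += value * tax_rate
        return total

    parcels = [
        {"footprint_sqm": 120, "floors": 1, "zone": "residential"},
        {"footprint_sqm": 450, "floors": 3, "zone": "commercial"},
    ]
    prices = {"residential": 300.0, "commercial": 800.0}  # USD per square metre, invented
    print(estimate_ad_valorem_revenue(parcels, prices))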
The Conversation: “Social media sites’ responses to the Facebook-Cambridge Analytica scandal and new European privacy regulations have given users much more control over who can access their data, and for what purposes. To me, as a social media user, these are positive developments: It’s scary to think what these platforms could do with the troves of data available about me. But as a researcher, increased restrictions on data sharing worry me.
I am among the many scholars who depend on data from social media to gain insights into people’s actions. In a rush to protect individuals’ privacy, I worry that an unintended casualty could be knowledge about human nature. My most recent work, for example, analyzes feelings people express on Twitter to explain why the stock market fluctuates so much over the course of a single day. There are applications well beyond finance. Other scholars have studied mass transit rider satisfaction, emergency alert systems’ function during natural disasters and how online interactions influence people’s desire to lead healthy lifestyles.
This poses a dilemma – not just for me personally, but for society as a whole. Most people don’t want social media platforms to share or sell their personal information, unless specifically authorized by the individual user. But as members of a collective society, it’s useful to understand the social forces at work influencing everyday life and long-term trends. Before the recent crises, Facebook and other companies had already been making it hard for legitimate researchers to use their data, including by making it more difficult and more expensive to download and access data for analysis. The renewed public pressure for privacy means it’s likely to get even tougher….
It’s true – and concerning – that some presumably unethical people have tried to use social media data for their own benefit. But the data are not the actual problem, and cutting researchers’ access to data is not the solution. Doing so would also deprive society of the benefits of social media analysis.
Fortunately, there is a way to resolve this dilemma. Anonymization of data can keep people’s individual privacy intact, while giving researchers access to collective data that can yield important insights.
There’s even a strong model for how to strike that balance efficiently: the U.S. Census Bureau. For decades, that government agency has collected extremely personal data from households all across the country: ages, employment status, income levels, Social Security numbers and political affiliations. The results it publishes are very rich, but also not traceable to any individual.
It is often technically possible to reverse anonymity protections, using multiple pieces of anonymized information to identify the person they all relate to. The Census Bureau takes steps to prevent this.
For instance, when members of the public access census data, the Census Bureau restricts information that is likely to identify specific individuals, such as reporting there is just one person in a community with a particularly high- or low-income level.
For researchers the process is somewhat different, but provides significant protections both in law and in practice. Scholars have to pass the Census Bureau’s vetting process to make sure they are legitimate, and must undergo training about what they can and cannot do with the data. The penalties for violating the rules include not only being barred from using census data in the future, but also civil fines and even criminal prosecution.
Even then, what researchers get comes without a name or Social Security number. Instead, the Census Bureau uses what it calls “protected identification keys”: random numbers that replace the data that would allow researchers to identify individuals.
Each person’s data is labeled with his or her own identification key, allowing researchers to link information of different types. For instance, a researcher wanting to track how long it takes people to complete a college degree could follow individuals’ education levels over time, thanks to the identification keys.
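As a rough illustration of how such linkage keys work, here is a minimal Python sketch; the field names, records and key format are hypothetical, not the Census Bureau’s actual system.

    # Minimal sketch: replace a direct identifier with a random key that still
    # lets vetted researchers link records of different types for the same person.
    import secrets

    def assign_keys(person_ids):
        """Map each real identifier (e.g. an SSN) to a random protected key."""
        return {pid: secrets.token_hex(8) for pid in person_ids}

    def pseudonymize(records, key_map, id_field):
        """Replace the identifying field with the protected key."""
        out = []
        for rec in records:
            rec = dict(rec)
            rec["protected_key"] = key_map[rec.pop(id_field)]
            out.append(rec)
        return out

    education = [{"ssn": "123-45-6789", "year": 2014, "highest_degree": "BA"}]
    earnings = [{"ssn": "123-45-6789", "year": 2014, "income": 42000}]

    key_map = assign_keys({"123-45-6789"})
    edu = pseudonymize(education, key_map, "ssn")
    earn = pseudonymize(earnings, key_map, "ssn")

    # The two datasets can now be joined on protected_key without exposing the SSN.
    print(edu[0]["protected_key"] == earn[0]["protected_key"])  # True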
Social media platforms could implement a similar anonymization process instead of increasing hurdles – and cost – to access their data…(More)”.
UK Government Press Release: “…the government has announced that key parts of the OS MasterMap will be made openly available for the public and businesses to use.
It is estimated that this will boost the UK economy by at least £130m each year, as innovative companies and startups use the data.
The release of OS MasterMap data is one of the first projects to be delivered by the new Geospatial Commission, in conjunction with Ordnance Survey. The aim is to continue to drive forward the UK as a world leader in location data, helping to grow the UK’s digital economy by an estimated £11bn each year.
This is a step on a journey towards more open geospatial data infrastructure for the UK.
Chancellor of the Duchy of Lancaster and Minister for the Cabinet Office, David Lidington, said:
Opening up OS MasterMap underlines this Government’s commitment to ensuring the UK continues to lead the way in digital innovation. Releasing this valuable government data for free will help stimulate innovation in the economy, generate jobs and improve public services.
Location-aware technologies – using geospatial data – are revolutionising our economy. From navigating public transport to tracking supply chains and planning efficient delivery routes, these digital services are built on location data that has become part of everyday life and business.
The newly available data should be particularly useful to small firms and entrepreneurs to realise their ideas and compete with larger organisations, encouraging greater competition and innovation….(More)”.
Paper by Serena Zheng, Marshini Chetty, and Nick Feamster: “Despite the increasing presence of Internet of Things (IoT) devices inside the home, we know little about how users feel about their privacy living with Internet-connected devices that continuously monitor and collect data in their homes. To gain insight into this state of affairs, we conducted eleven semi-structured interviews with owners of smart homes, investigating privacy values and expectations.
In this paper, we present the findings that emerged from our study: First, users prioritize the convenience and connectedness of their smart homes, and these values dictate their privacy opinions and behaviors. Second, user opinions about who should have access to their smart home data depend on the perceived benefit. Third, users assume their privacy is protected because they trust the manufacturers of their IoT devices. Our findings raise several implications for IoT privacy, including the need to design for privacy and to develop evaluation standards….(More)”.
Special issue by The Economist: “…the relationship between information and crime has changed in two ways, one absolute, one relative. In absolute terms, people generate more searchable information than they used to. Smartphones passively track and record where people go, who they talk to and for how long; their apps reveal subtler personal information, such as their political views, what they like to read and watch and how they spend their money. As more appliances and accoutrements become networked, so the amount of information people inadvertently create will continue to grow.
To track a suspect’s movements and conversations, police chiefs no longer need to allocate dozens of officers for round-the-clock stakeouts. They just need to seize the suspect’s phone and bypass its encryption. If he drives, police cars, streetlights and car parks equipped with automatic number-plate readers (ANPRs, known in America as automatic licence-plate readers or ALPRs) can track all his movements.
In relative terms, the gap between information technology and policy gapes ever wider. Most privacy laws were written for the age of postal services and fixed-line telephones. Courts give citizens protection from governments entering their homes or rifling through their personal papers. The law on people’s digital presence is less clear. In most liberal countries, police still must convince a judge to let them eavesdrop on phone calls.
But mobile-phone “metadata”—not the actual conversations, but data about who was called and when—enjoy less stringent protections. In 2006 the European Union issued a directive requiring telecom firms to retain customer metadata for up to two years for use in potential crime investigations. The European Court of Justice invalidated that law in 2014, after numerous countries challenged it in court, saying that it interfered with “the fundamental rights to respect for private life”. Today data-retention laws vary widely in Europe. Laws, and their interpretation, are changing in America, too. A case before the Supreme Court will determine whether police need a warrant to obtain metadata.
Less shoe leather
If you drive in a city anywhere in the developed world, ANPRs are almost certainly tracking you. This is not illegal. Police do not generally need a warrant to follow someone in public. However, people not suspected of committing a crime do not usually expect authorities to amass terabytes of data on every person they have met and every business visited. ANPRs offer a lot of that.
To some people, this may not matter. Toplines, an Israeli ANPR firm, wants to add voice- and facial-recognition to its Bluetooth-enabled cameras, and install them on private vehicles, turning every car on the road into a “mobile broadcast system” that collects and transmits data to a control centre that security forces can access. Its founder posits that insurance-rate discounts could incentivise drivers to become, in effect, freelance roving crime-detection units for the police, subjecting unwitting citizens to constant surveillance. In answer to a question about the implications of such data for privacy, a Toplines employee shrugs: Facebook and WhatsApp are spying on us anyway, he says. If the stream of information keeps people safer, who could object? “Privacy is dead.”
It is not. But this dangerously complacent attitude brings its demise ever closer….(More)”.
Paper by Chloe Lim: “Journalists now regularly trumpet fact-checking as an important tool to hold politicians accountable for their public statements, but fact-checking’s effect has only been assessed anecdotally and in experiments on politicians holding lower-level offices.
Using a rigorous research design to estimate the effects of fact-checking on presidential candidates, this paper shows that a fact-checker deeming a statement false causes a 9.5 percentage-point reduction in the probability that the candidate repeats the claim. To eliminate alternative explanations that could confound this estimate, I use two types of difference-in-differences analyses, each using true-rated claims and “checkable but unchecked” claims, a placebo test using hypothetical fact-check dates, and a topic model to condition on the topic of the candidate’s statement.
This paper contributes to the literature on how news media can hold politicians accountable, showing that when news organizations label a statement as inaccurate, they affect candidate behavior…(More)”.
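As a rough illustration of the difference-in-differences logic behind such an estimate, here is a minimal Python sketch; the repeat counts and group labels are invented for illustration and are not the paper’s data.

    # Minimal sketch: compare the change in repeat rates for false-rated claims with
    # the change for comparable unchecked claims, netting out common trends.
    def repeat_rate(claims):
        return sum(c["repeated"] for c in claims) / len(claims)

    checked = {    # claims rated false by a fact-checker (invented records)
        "pre":  [{"repeated": 1}, {"repeated": 1}, {"repeated": 0}, {"repeated": 1}],
        "post": [{"repeated": 0}, {"repeated": 1}, {"repeated": 0}, {"repeated": 0}],
    }
    unchecked = {  # "checkable but unchecked" comparison claims (invented records)
        "pre":  [{"repeated": 1}, {"repeated": 0}, {"repeated": 1}, {"repeated": 1}],
        "post": [{"repeated": 1}, {"repeated": 0}, {"repeated": 1}, {"repeated": 0}],
    }

    did = (repeat_rate(checked["post"]) - repeat_rate(checked["pre"])) - (
        repeat_rate(unchecked["post"]) - repeat_rate(unchecked["pre"])
    )
    print(did)  # a negative value indicates fact-checking reduced repetition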
Jon Huggett at the Stanford Social Innovation Review: “…power is the secret sauce of nonprofit collaborations. Great collaborations between organizations achieve more than either organization could achieve by itself. But when nonprofit collaborations don’t talk about power and address the implications of power imbalances openly, each party runs the risk of stumbling into (or contributing to) an ugly, counterproductive situation. This is true on an organizational level and a personal level, as relationships naturally grow and evolve over time. Sometimes, organizational and personal issues are one and the same. And sometimes the breakdown is irrevocable, and each party regretfully—and usually wrongly—walks away thinking the other was ultimately too uncollaborative.
The true nature of the problem
Over the past year, I have interviewed dozens of collaborators all over the world, at the request of a group of Australian nonprofits whose leaders wanted to better understand what effective collaboration looked like before working closely together. I observed many effective collaborations. I also observed an assortment of dysfunctional ones, where leaders and others privately confided that they felt the other party was uncollaborative. Based on these interviews and my own experience, I’ve identified three major types of power struggles, where one party (either an organization or individual) implies the other is “bad,” “sad,” or “mad.”…
It doesn’t have to be this way. In fact, many of the successful collaborations I’ve observed seem to get power right from day one. Specifically, ones that:
1. Set clear goals…
2. Recognize each other’s legitimate needs, which may differ….
3. Set clear roles, showing which parties have more power than others, and why…(More)”.