Old Dog, New Tricks: Retraining and the Road to Government Reform

Essay by Beth Noveck: “…To be sure, one strategy for modernizing government is hiring new people with fresh skills in the fields of technology, data science, design, and marketing. Today, only 6 percent of the federal workforce is under 30 and, if age is any proxy for mastery of these in-demand new skills, then efforts by non-profits such as the Partnership for Public Service and the Tech Talent Project to attract a younger generation to work in the public sector are crucial. But we will not reinvent government fast enough through hiring alone.

The crucial and overlooked mechanism for improving government effectiveness is, therefore, to change how people work by training public servants across departments to use data and collective intelligence at each stage of the problem-solving process to foster more informed decision-making, more innovative solutions to problems, and more agile implementation of what works. All around the world we have witnessed how, when public servants work differently, government solves problems better.

Jonathan Wachtel, the lone city planner in Lakewood, Colorado, a suburb of Denver, has been able to undertake 500 sustainability projects because he knows how to collaborate and codesign with a network of 20,000 residents. When former Mayor of New Orleans Mitch Landrieu launched an initiative to start using data and resident engagement to address the city’s abysmal murder rate, that effort led to a 25 percent reduction in homicides in two years and a further decline to its lowest levels in 50 years by 2019. Because Samir Brahmachari, former Secretary of India’s Department of Scientific and Industrial Research, turned to crowdsourcing and engaged the assistance of 7,900 contributors, he was able to identify six already-approved drugs that showed promise in the fight against tuberculosis….(More)”.

Nudgeability: Mapping Conditions of Susceptibility to Nudge Influence

Paper by Denise de Ridder, Floor Kroese, and Laurens van Gestel: “Nudges are behavioral interventions to subtly steer citizens’ choices toward “desirable” options. An important topic of debate concerns the legitimacy of nudging as a policy instrument, and there is a focus on issues relating to nudge transparency, the role of preexisting preferences people may have, and the premise that nudges primarily affect people when they are in “irrational” modes of thinking. Empirical insights into how these factors affect the extent to which people are susceptible to nudge influence (i.e., “nudgeable”) are lacking in the debate. This article introduces the new concept of nudgeability and makes a first attempt to synthesize the evidence on when people are responsive to nudges. We find that nudge effects do not hinge on transparency or modes of thinking but that personal preferences moderate effects such that people cannot be nudged into something they do not want. We conclude that, in view of these findings, concerns about nudging legitimacy should be softened and that future research should attend to these and other conditions of nudgeability….(More)”.

Little Rock Shows How Open Data Drives Resident Engagement

Blog by Ross Schwartz: “The 12th Street corridor is in the heart of Little Rock, stretching west from downtown across multiple neighborhoods. But for years the area has suffered from high crime rates and disinvestment, and is considered a food desert.

With the intention of improving public safety and supporting efforts to revitalize the area, the City built a new police station on the street in 2014. And in the years following, as city staff ramped up efforts to place data at the center of problem-solving, it began to hold two-day “Data Academy” trainings for city employees and residents on foundational data practices, including data analysis.

Responding to public safety concerns, a 2018 Data Academy training focused on 12th Street. A cross-department team dug into data sets to understand the challenges facing the area, looking at variables including crime, building code violations, and poverty. It turned out the neighborhood with the highest levels of crime and blight was actually blocks away from 12th Street itself, in Midtown. A predominantly African-American neighborhood just east of the University of Arkansas at Little Rock campus, Midtown has a mix of older longtime homeowners and younger renters.
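The cross-dataset exercise described above, combining indicators like crime, code violations, and poverty to see which neighborhood they actually point to, can be sketched in a few lines. This is a purely illustrative toy: the neighborhood names, numbers, and equal-weight scoring are hypothetical, not Little Rock's actual data or method.

```python
# Toy illustration of the kind of cross-dataset ranking a "Data Academy"
# exercise might perform. All names and numbers are made up.

datasets = {
    "crime_incidents":  {"Downtown": 120, "Midtown": 310, "12th Street": 180},
    "code_violations":  {"Downtown": 40,  "Midtown": 95,  "12th Street": 60},
    "poverty_rate_pct": {"Downtown": 18,  "Midtown": 31,  "12th Street": 24},
}

def normalize(values):
    """Scale one indicator to the 0-1 range so different units are comparable."""
    lo, hi = min(values.values()), max(values.values())
    return {k: (v - lo) / (hi - lo) for k, v in values.items()}

# Composite score: equal-weight mean of the normalized indicators.
neighborhoods = next(iter(datasets.values())).keys()
score = {
    n: sum(normalize(d)[n] for d in datasets.values()) / len(datasets)
    for n in neighborhoods
}
ranked = sorted(score, key=score.get, reverse=True)
print(ranked[0])  # → Midtown
```

Even this simple composite reproduces the "a-ha" moment in the story: the indicators, taken together, single out a neighborhood that none of them was nominally about.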

“It was a real data-driven ‘a-ha’ moment — an example of what you can understand about a city if you have the right data sets and look in the right places,” says Melissa Bridges, Little Rock’s performance and innovation coordinator. With support from What Works Cities (WWC), for the last five years she’s led Little Rock’s efforts to build open data and performance measurement resources and infrastructure…

Newly aware of Midtown’s challenges, city officials decided to engage residents in the neighborhood and adjacent areas. Data Academy members hosted a human-centered design workshop, during which residents were given the opportunity to self-prioritize their pressing concerns. Rather than lead the workshop, officials from various city departments quietly observed the discussion.

The main issue that emerged? Many parts of Midtown were poorly lit due to broken or blocked streetlights. Many residents didn’t feel safe and didn’t know how to alert the City to get lights fixed or vegetation cut back. A review of 311 request data showed that few streetlight problems in the area were ever reported to the City.

Aware of studies showing the correlation between dark streets and crime, the City designed a streetlight canvassing project in partnership with area neighborhood associations to engage and empower residents. Bridges and her team built canvassing route maps using Google Maps and Little Rock Citizen Connect, which collects 311 requests and other data sets. Then they gathered resident volunteers to walk or drive Midtown’s streets on a Friday night, using the City’s 311 mobile app to make a light service request and tag the location….(More)”.

New York City to Require Food Delivery Services to Share Customer Data with Restaurants

Hunton Privacy Blog: “On August 29, 2021, a New York City Council bill amending the New York City Administrative Code to address customer data collected by food delivery services from online orders became law after the 30-day period for the mayor to sign or veto lapsed. Effective December 27, 2021, the law will permit restaurants to request customer data from third-party food delivery services and require delivery services to provide, on at least a monthly basis, such customer data until the restaurant “requests to no longer receive such customer data.” Customer data includes name, phone number, email address, delivery address and contents of the order.

Although customers are permitted to request that their customer data not be shared, the presumption under the law is that “customers have consented to the sharing of such customer data applicable to all online orders, unless the customer has made such a request in relation to a specific online order.” The food delivery services are required to provide on their websites a way for customers to request that their data not be shared “in relation to such online order.” To “assist its customers with deciding whether their data should be shared,” delivery services must disclose to the customer (1) the data that may be shared with the restaurant and (2) the restaurant fulfilling the order as the recipient of the data.

The law will permit restaurants to use the customer data for marketing and other purposes, and prohibit delivery apps from restricting such activities by restaurants. Restaurants that receive the customer data, however, must allow customers to request and delete their customer data. In addition, restaurants are not permitted to sell, rent or disclose customer data to any other party in exchange for financial benefit, except with the express consent of the customer….(More)”.

The Battle for Digital Privacy Is Reshaping the Internet

Brian X. Chen at The New York Times: “Apple introduced a pop-up window for iPhones in April that asks people for their permission to be tracked by different apps.

Google recently outlined plans to disable a tracking technology in its Chrome web browser.

And Facebook said last month that hundreds of its engineers were working on a new method of showing ads without relying on people’s personal data.

The developments may seem like technical tinkering, but they were connected to something bigger: an intensifying battle over the future of the internet. The struggle has entangled tech titans, upended Madison Avenue and disrupted small businesses. And it heralds a profound shift in how people’s personal information may be used online, with sweeping implications for the ways that businesses make money digitally.

At the center of the tussle is what has been the internet’s lifeblood: advertising.

More than 20 years ago, the internet drove an upheaval in the advertising industry. It eviscerated newspapers and magazines that had relied on selling classified and print ads, and threatened to dethrone television advertising as the prime way for marketers to reach large audiences….

If personal information is no longer the currency that people give for online content and services, something else must take its place. Media publishers, app makers and e-commerce shops are now exploring different paths to surviving a privacy-conscious internet, in some cases overturning their business models. Many are choosing to make people pay for what they get online by levying subscription fees and other charges instead of using their personal data.

Jeff Green, the chief executive of the Trade Desk, an ad-technology company in Ventura, Calif., that works with major ad agencies, said the behind-the-scenes fight was fundamental to the nature of the web…(More)”

Harms of AI

Paper by Daron Acemoglu: “This essay discusses several potential economic, political and social costs of the current path of AI technologies. I argue that if AI continues to be deployed along its current trajectory and remains unregulated, it may produce various social, economic and political harms. These include: damaging competition, consumer privacy and consumer choice; excessively automating work, fueling inequality, inefficiently pushing down wages, and failing to improve worker productivity; and damaging political discourse, democracy’s most fundamental lifeblood. Although there is no conclusive evidence suggesting that these costs are imminent or substantial, it may be useful to understand them before they are fully realized and become harder or even impossible to reverse, precisely because of AI’s promising and wide-reaching potential. I also suggest that these costs are not inherent to the nature of AI technologies, but are related to how they are being used and developed at the moment – to empower corporations and governments against workers and citizens. As a result, efforts to limit and reverse these costs may need to rely on regulation and policies to redirect AI research. Attempts to contain them just by promoting competition may be insufficient….(More)”.

Government Lawyers: Technicians, Policy Shapers and Organisational Brakes

Paper by Philip S.C. Lewis and Linda Mulcahy: “Government lawyers have been rather neglected by scholars interested in the workings of the legal profession and the role of professional groups in contemporary society. This is surprising given their potential to influence the internal workings of an increasingly legalistic and centralized state. This article aims to partly fill that gap by looking at the way lawyers employed by the government, and the administrators they work with, talk about their day-to-day practices. It draws on the findings of a large-scale empirical study of government lawyers in seven departments, funded by the ESRC. The study was undertaken between 2002 and 2003 by Philip Lewis and is reported for the first time here. By looking at lawyers in bureaucracies, the interviews sought to explore what government lawyers do, how they talk about their work, and what distinguishes them from the administrative-grade clients and colleagues they work with….(More)”.

Enrollment algorithms are contributing to the crises of higher education

Paper by Alex Engler: “Hundreds of higher education institutions are procuring algorithms that strategically allocate scholarships to convince more students to enroll. In doing so, these enrollment management algorithms help colleges vary the cost of attendance according to students’ willingness to pay, a crucial aspect of competition in the higher education market. This paper elaborates on the specific two-stage process by which these algorithms first predict how likely prospective students are to enroll, and second help decide how to disburse scholarships to convince more of those prospective students to attend the college. These algorithms are valuable to colleges for institutional planning and financial stability, as well as for reaching their preferred financial, demographic, and scholastic outcomes for the incoming student body.
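The two-stage process described above can be sketched in miniature: a stage-one model estimates each prospect's probability of enrolling at a given aid level, and stage two picks the scholarship that maximizes expected net tuition revenue. This is a hypothetical toy, not the paper's model; the logistic coefficients, aid tiers, and tuition figure are invented for illustration.

```python
# Illustrative two-stage enrollment management sketch.
# All coefficients and numbers are hypothetical.

import math

def enroll_probability(aid: float, base_utility: float) -> float:
    """Stage 1: toy logistic model of a prospect's enrollment probability.
    More aid raises the probability; base_utility captures everything else."""
    return 1 / (1 + math.exp(-(base_utility + 0.0004 * aid)))

def best_offer(base_utility: float, tuition: float,
               offers=(0, 5000, 10000, 15000)) -> float:
    """Stage 2: choose the scholarship maximizing expected net revenue,
    i.e. P(enroll | aid) * (tuition - aid)."""
    return max(offers,
               key=lambda aid: enroll_probability(aid, base_utility) * (tuition - aid))

# A price-sensitive prospect (low base utility) draws a larger offer
# than one who is already likely to enroll.
print(best_offer(base_utility=-2.0, tuition=40000))  # → 10000
print(best_offer(base_utility=1.0, tuition=40000))   # → 5000
```

The sketch makes the paper's concern concrete: the optimizer routes the smallest aid package consistent with enrollment to each student, which is precisely what it means to price to willingness to pay.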

Unfortunately, the widespread use of enrollment management algorithms may also be hurting students, especially due to their narrow focus on enrollment. The prevailing evidence suggests that these algorithms generally reduce the amount of scholarship funding offered to students. Further, algorithms excel at identifying a student’s exact willingness to pay, meaning they may drive enrollment while also reducing students’ chances to persist and graduate. The use of this two-step process also opens many subtle channels for algorithmic discrimination to perpetuate unfair financial aid practices. Higher education is already suffering from low graduation rates, high student debt, and stagnant inequality for racial minorities—crises that enrollment algorithms may be making worse.

This paper offers a range of recommendations to ameliorate the risks of enrollment management algorithms in higher education. Categorically, colleges should not use predicted likelihood of enrollment in either the admissions process or in awarding need-based aid—these determinations should be made based only on the applicant’s merit and financial circumstances, respectively. When colleges do use algorithms to distribute scholarships, they should proceed cautiously and document their data, processes, and goals. They should examine how scholarship changes affect students’ likelihood of graduating and whether they deepen inequities between student populations. Colleges should also ensure an active role for humans in these processes, such as exclusively using people to evaluate application quality and hiring internal data scientists who can challenge algorithmic specifications. State policymakers should consider the expanding role of these algorithms too, and should work to create more transparency about their use in public institutions. More broadly, policymakers should treat enrollment management algorithms as a concerning symptom of pre-existing trends toward higher tuition, more debt, and reduced accessibility in higher education….(More)”.

The Future of Citizen Engagement: Rebuilding the Democratic Dialogue

Report by the Congressional Management Foundation: “The Future of Citizen Engagement: Rebuilding the Democratic Dialogue” explores the current challenges to engagement and trust between Senators and Representatives and their constituents; proposes principles for rebuilding that fundamental democratic relationship; and describes innovative practices in federal, state, local, and international venues that Congress could look to for modernizing the democratic dialogue.

The report answers the following questions:

  • What factors have contributed to the deteriorating state of communications between citizens and Congress?
  • What principles should guide Congress as it tries to transform its communications systems and practices from administrative transactions to substantive interactions with the People it represents?
  • What models at the state and international level can Congress follow as it modernizes and rebuilds the democratic dialogue?

The findings and recommendations in this report are based on CMF’s long history of researching the relationship between Members of Congress and their constituents…(More)”.

The State of Consumer Data Privacy Laws in the US (And Why It Matters)

Article by Thorin Klosowski at the New York Times: “With more of the things people buy being internet-connected, more of our reviews and recommendations at Wirecutter are including lengthy sections detailing the privacy and security features of such products, everything from smart thermostats to fitness trackers. As the data these devices collect is sold and shared—and hacked—deciding what risks you’re comfortable with is a necessary part of making an informed choice. And those risks vary widely, in part because there’s no single, comprehensive federal law regulating how most companies collect, store, or share customer data.

Most of the data economy underpinning common products and services is invisible to shoppers. As your data gets passed around between countless third parties, there aren’t just more companies profiting from your data, but also more possibilities for your data to be leaked or breached in a way that causes real harm. In just the past year, we’ve seen a news outlet use pseudonymous app data, allegedly leaked from an advertiser associated with the dating app Grindr, to out a priest. We’ve read about the US government buying location data from a prayer app. Researchers have found opioid-addiction treatment apps sharing sensitive data. And T-Mobile recently suffered a data breach that affected at least 40 million people, some who had never even had a T-Mobile account.

“We have these companies that are amassing just gigantic amounts of data about each and every one of us, all day, every day,” said Kate Ruane, senior legislative counsel for the First Amendment and consumer privacy at the American Civil Liberties Union. Ruane also pointed out how data ends up being used in surprising ways—intentionally or not—such as in targeting ads or adjusting interest rates based on race. “Your data is being taken and it is being used in ways that are harmful.”

Consumer data privacy laws can give individuals rights to control their data, but if poorly implemented such laws could also maintain the status quo. “We can stop it,” Ruane continued. “We can create a better internet, a better world, that is more privacy protective.”…(More)”