Arts Data in the Public Sector: Strategies for Local Arts Agencies


Report by Bloomberg Associates: “Cities are increasingly using data to help shape policy and identify service gaps, but data about arts and culture is often met with skepticism. Local arts agencies, the city and county entities at the forefront of understanding and serving their local creative communities, often face difficulties in identifying meaningful metrics that capture quality as well as quantity in this unique field. With the Covid-19 pandemic and intensifying demand for equity, the desire for reliable, longitudinal information will only increase in the coming years as municipalities with severely limited resources face critical decisions in their effort toward recovery.

So how can arts-minded cities leverage data to better serve grantees, promote equity in service delivery, and demonstrate the impact of arts and culture across a range of significant policy priorities, among other ambitions?

Produced by our Cultural Assets Management team, Arts Data in the Public Sector highlights the data practices of fifteen local arts agencies across the U.S. to capture a meaningful cross-section of constituencies, resources, and strategies. Through best practices and case studies, the Guide offers useful insights and practical resources that can assist and inspire local government arts funders and advocates as they work to establish more equitable and inclusive practices and to affirm the importance of arts and culture as a public service well into the future…(More)”.

Checks in the Balance: Legislative Capacity and the Dynamics of Executive Power


Book by Alexander Bolton and Sharece Thrower: “The specter of unbridled executive power looms large in the American political imagination. Are checks and balances enough to constrain ambitious executives? Checks in the Balance presents a new theory of separation of powers that brings legislative capacity to the fore, explaining why Congress and state legislatures must possess both the opportunities and the means to constrain presidents and governors—and why, without these tools, executive power will prevail.

Alexander Bolton and Sharece Thrower reveal how legislative capacity—which they conceive of as the combination of a legislature’s resources and policymaking powers—is the key to preventing the accumulation of power in the hands of an encroaching executive. They show how low-capacity legislatures face difficulties checking the executive through mechanisms such as discretion and oversight, and how presidents and governors unilaterally bypass such legislative adversaries to impose their will. When legislative capacity is high, however, the legislative branch can effectively stifle executives. Bolton and Thrower draw on a wealth of historical evidence on congressional capacity, oversight, discretion, and presidential unilateralism. They also examine thousands of gubernatorial executive orders, demonstrating how varying capacity in the states affects governors’ power.

Checks in the Balance affirms the centrality of legislatures in tempering executive power—and sheds vital new light on how and why they fail….(More)”.

Sharing Student Data Across Public Sectors: Importance of Community Engagement to Support Responsible and Equitable Use


Report by CDT: “Data and technology play a critical role in today’s education institutions, with 85 percent of K-12 teachers anticipating that online learning and the use of education technology at their school will play a larger role in the future than they did before the pandemic. The growth in data-driven decision-making has helped fuel the increasing prevalence of data sharing practices between K-12 education agencies and adjacent public sectors like social services. Yet the sharing of personal data can pose risks as well as benefits, and many communities have historically experienced harm as a result of irresponsible data sharing practices. For example, if the underlying data itself is biased, sharing that information exacerbates those inequities and increases the likelihood that potential harms fall disproportionately on certain communities. As a result, it is critical that agencies participating in data sharing initiatives take steps to ensure the benefits are available to all and that no group of students experiences disproportionate harm.

A core component of sharing data responsibly is proactive, robust community engagement with the group of people whose data is being shared, as well as their surrounding community. This population has the greatest stake in the success or failure of a given data sharing initiative; as such, public agencies have a practical incentive, and a moral obligation, to engage them regarding decisions being made about their data…

This paper presents guidance on how practitioners can conduct effective community engagement around the sharing of student data between K-12 education agencies and adjacent public sectors. We explore the importance of community engagement around data sharing initiatives, and highlight four dimensions of effective community engagement:

  • Plan: Establish Goals, Processes, and Roles
  • Enable: Build Collective Capacity
  • Resource: Dedicate Appropriate People, Time, and Money
  • Implement: Carry Out Vision Effectively and Monitor Implementation…(More)”.

How Smart Tech Is Transforming Nonprofits


Essay by Allison Fine and Beth Kanter: “Covid-19 created cascades of shortages, disruptions, and problems that rolled downhill and landed in the most vulnerable neighborhoods. In these neighborhoods, it’s often nonprofit organizations that provide services to members of the community. While the pandemic accelerated the need for digital transformation throughout the economy, the nonprofit sector was not immune to the need for nearly overnight innovation. As experts on the use of technology for social good, we’ve observed the many ways that nonprofits have been adopting “smart tech” to further social change in the wake of the pandemic, which we chronicle in our upcoming book, The Smart Nonprofit.

We use “smart tech” as an umbrella term for advanced digital technologies that make decisions for people. It includes artificial intelligence (AI) and its subsets and cousins, such as machine learning, natural language processing, smart forms, chatbots, robots, and more.

The use of smart tech by social service agencies and other nonprofits exploded during the pandemic. For example, food banks deployed robots to pack meals; homeless services agencies used chatbots to give legal and mental health advice; and fundraising departments turned to AI-powered software to identify potential donors.

When the pandemic began and schools switched to remote learning, many students who relied on school lunches could no longer receive them. Here is where nonprofits stepped in to use smart technologies for social good. For example, researchers at Carnegie Mellon University used machine learning to flip the system on its head: instead of buses delivering children to schools, new routes were computed to bring meals to children across the Pittsburgh area as efficiently as possible.
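
The underlying idea is a routing problem turned on its head: rather than optimizing routes that carry children to a school, optimize routes that carry meals from a kitchen to children. The sketch below illustrates that flipped framing with a simple greedy nearest-neighbor heuristic; it is not the CMU team’s actual model, and the stop names and coordinates are hypothetical.

```python
# Illustrative only: a greedy nearest-neighbor heuristic for ordering
# meal-delivery stops. The CMU project used more sophisticated machine
# learning and optimization; this sketch only shows the flipped framing,
# where the bus visits children rather than children traveling to school.
from math import dist

# Hypothetical depot (school kitchen) and delivery stops (planar coordinates).
depot = ("kitchen", (0.0, 0.0))
stops = {
    "stop_a": (1.0, 2.0),
    "stop_b": (4.0, 1.5),
    "stop_c": (2.5, 3.0),
    "stop_d": (5.0, 4.0),
}

def greedy_route(start, points):
    """Visit the nearest unvisited stop until every stop is served."""
    route, current, remaining = [], start, dict(points)
    while remaining:
        # Pick whichever remaining stop is closest to the current position.
        name, coord = min(remaining.items(), key=lambda kv: dist(current, kv[1]))
        route.append(name)
        current = coord
        del remaining[name]
    return route

print(greedy_route(depot[1], stops))  # ['stop_a', 'stop_c', 'stop_b', 'stop_d']
```

A production system would add vehicle capacities, time windows, and real travel times, but the reversed objective is the essential change.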

The use of chatbots to provide support and deliver services to vulnerable populations increased tremendously during the pandemic. For instance, the Rentervention chatbot was developed by legal aid nonprofits in Illinois to help tenants navigate eviction and other housing issues they were experiencing due to Covid-19. It also directs renters to pro bono legal advice….(More)”.

Senators unveil bipartisan bill requiring social media giants to open data to researchers


Article by Rebecca Klar: “Meta and other social media companies would be required to share their data with outside researchers under a new bill announced by a bipartisan group of senators on Thursday. 

Sens. Chris Coons (D-Del.), Amy Klobuchar (D-Minn.) and Rob Portman (R-Ohio) underscored the need for their bill based on information leaked about Meta’s platforms in the so-called Facebook Papers, though the proposal would also apply to other social media companies.

The bill, the Platform Accountability and Transparency Act, would allow independent researchers to submit proposals to the National Science Foundation. If the requests are approved, social media companies would be required to provide the necessary data subject to certain privacy protections. 

“It’s increasingly clear that more transparency is needed so that the billions of people who use Facebook, Twitter, and similar platforms can fully understand the impact of those tradeoffs. This bipartisan proposal is an important step that will bring much needed information about the impact of social media companies to light and ought to be a crucial part of any comprehensive strategy that Congress can take to regulate major social media companies,” Coons said in a statement. 

If companies failed to comply with the requirement under the bill, they would be subject to enforcement from the Federal Trade Commission (FTC) and face losing immunity under Section 230 of the Communications Decency Act. Section 230 is a controversial provision that provides immunity for internet companies based on content posted by third parties, and lawmakers on both sides of the aisle have proposed measures to weaken its reach….(More)”.

Group Backed by Top Companies Moves to Combat A.I. Bias in Hiring


Steve Lohr at The New York Times: “Artificial intelligence software is increasingly used by human resources departments to screen résumés, conduct video interviews and assess a job seeker’s mental agility.

Now, some of the largest corporations in America are joining an effort to prevent that technology from delivering biased results that could perpetuate or even worsen past discrimination.

The Data & Trust Alliance, announced on Wednesday, has signed up major employers across a variety of industries, including CVS Health, Deloitte, General Motors, Humana, IBM, Mastercard, Meta (Facebook’s parent company), Nike and Walmart.

The corporate group is not a lobbying organization or a think tank. Instead, it has developed an evaluation and scoring system for artificial intelligence software.

The Data & Trust Alliance, tapping corporate and outside experts, has devised a 55-question evaluation, which covers 13 topics, and a scoring system. The goal is to detect and combat algorithmic bias.“This is not just adopting principles, but actually implementing something concrete,” said Kenneth Chenault, co-chairman of the group and a former chief executive of American Express, which has agreed to adopt the anti-bias tool kit…(More)”.

What Biden’s Democracy Summit Is Missing


Essay by Hélène Landemore: “U.S. President Joe Biden is set to host a virtual summit this week for leaders from government, civil society, and the private sector to discuss the renewal of democracy. We can expect to see plenty of worthy yet predictable issues discussed: the threat of foreign agents interfering in elections, online disinformation, political polarization, and the temptation of populist and authoritarian alternatives. For the United States specifically, the role of money in politics, partisan gerrymandering, endless gridlock in Congress, and the recent voter suppression efforts targeting Black communities in the South should certainly be on the agenda.

All are important and relevant topics. Something more fundamental, however, is needed.

The clear erosion of our political institutions is just the latest evidence, if any more was needed, that it’s past time to discuss what democracy actually means—and why we should care about it. We have to question, moreover, whether the political systems we have are even worth restoring or if we should more substantively alter them, including through profound constitutional reforms.

Such a discussion has never been more vital. The systems in place today once represented a clear improvement on prior regimes—monarchies, theocracies, and other tyrannies—but it may be a mistake to call them adherents of democracy at all. The word roughly translates from its original Greek as “people’s power.” But the people writ large don’t hold power in these systems. Elites do. Consider that in the United States, according to a 2014 study by the political scientists Martin Gilens and Benjamin Page, only the richest 10 percent of the population seems to have any causal effect on public policy. The other 90 percent, they argue, is left with “democracy by coincidence”—getting what they want only when they happen to want the same thing as the people calling the shots.

This discrepancy between reality—democracy by coincidence—and the ideal of people’s power is baked in as a result of fundamental design flaws dating back to the 18th century. The only way to rectify those mistakes is to rework the design—to fully reimagine what it means to be democratic. Tinkering at the edges won’t do….(More)”

“If Everybody’s White, There Can’t Be Any Racial Bias”: The Disappearance of Hispanic Drivers From Traffic Records


Article by Richard A. Webster: “When sheriff’s deputies in Jefferson Parish, Louisiana, pulled over Octavio Lopez for an expired inspection tag in 2018, they wrote on his traffic ticket that he is white. Lopez, who is from Nicaragua, is Hispanic and speaks only Spanish, said his wife.

In fact, of the 167 tickets issued by deputies to drivers with the last name Lopez over a nearly six-year span, not one of the motorists was labeled as Hispanic, according to records provided by the Jefferson Parish clerk of court. The same was true of the 252 tickets issued to people with the last name of Rodriguez, 234 named Martinez, 223 with the last name Hernandez and 189 with the surname Garcia.
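
The reporters’ check is straightforward to reproduce against any ticket database: count tickets issued to drivers with common Hispanic surnames and tally how their race was recorded. A minimal sketch follows, assuming a hypothetical tickets.csv with last_name and recorded_race columns; the file and column names are illustrative, not the clerk of court’s actual schema.

```python
# A minimal sketch of the surname/race-label check described above.
# Assumes a hypothetical "tickets.csv" with columns "last_name" and
# "recorded_race"; both the file and the column names are assumptions.
import pandas as pd

tickets = pd.read_csv("tickets.csv")
surnames = ["Lopez", "Rodriguez", "Martinez", "Hernandez", "Garcia"]

# Keep only tickets issued to drivers with these surnames, then count
# how often each surname was recorded under each race label.
subset = tickets[tickets["last_name"].isin(surnames)]
summary = (
    subset.groupby("last_name")["recorded_race"]
    .value_counts()
    .unstack(fill_value=0)
)
print(summary)  # e.g. how often each surname was coded White vs. Hispanic
```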

This kind of misidentification is widespread — and not without harm. Across America, law enforcement agencies have been accused of targeting Hispanic drivers, failing to collect data on those traffic stops, and covering up potential officer misconduct and aggressive immigration enforcement by identifying people as white on tickets.

“If everybody’s white, there can’t be any racial bias,” Frank Baumgartner, a political science professor at the University of North Carolina at Chapel Hill, told WWNO/WRKF and ProPublica.

Nationally, states have tried to patch this data loophole and tighten controls against racial profiling. In recent years, legislators have passed widely hailed traffic stop data-collection laws in California, Colorado, Illinois, Oregon, Virginia and Washington, D.C. This April, Alabama became the 22nd state to enact similar legislation.

Though Louisiana has had its own data-collection requirement for two decades, it contains a loophole unlike any other state: It exempts law enforcement agencies from collecting and delivering data to the state if they have an anti-racial-profiling policy in place. This has rendered the law essentially worthless, said Josh Parker, a senior staff attorney at the Policing Project, a public safety research nonprofit at the New York University School of Law.

Louisiana State Rep. Royce Duplessis, D-New Orleans, attempted to remove the exemption two years ago, but law enforcement agencies protested. Instead, he was forced to convene a task force to study the issue, which thus far hasn’t produced any results, he said.

“They don’t want the data because they know what it would reveal,” Duplessis said of law enforcement agencies….(More)”.

A Fix-It Job for Government Tech


Shira Ovide at the New York Times: “U.S. government technology has a mostly deserved reputation for being expensive and awful.

Computer systems sometimes operate with Sputnik-era software. A Pentagon project to modernize military technology has little to show after five years. During the coronavirus pandemic, millions of Americans struggled to get government help like unemployment insurance, vaccine appointments and food stamps because of red tape, inflexible technology and other problems.

Whether you believe that the government should be more involved in Americans’ lives or less, taxpayers deserve good value for the technology we pay for. And we often don’t get it. It’s part of Robin Carnahan’s job to take on this problem.

A former secretary of state for Missouri and a government tech consultant, Carnahan had been one of my guides to how public sector technology could work better. Then in June, she was confirmed as the administrator of the General Services Administration, the agency that oversees government acquisitions, including of technology.

Carnahan said that she and other Biden administration officials wanted technology used for fighting wars or filing taxes to be as efficient as our favorite app.

“Bad technology sinks good policy,” Carnahan told me. “We’re on a mission to make government tech more user-friendly and be smarter about how we buy it and use it.”

Carnahan highlighted three areas she wanted to address: First, change the process for government agencies to buy technology to recognize that tech requires constant updates. Second, simplify the technology for people using government services. And third, make it more appealing for people with tech expertise to work for the government, even temporarily.

All of that is easier said than done, of course. People in government have promised similar changes before, and it’s not a quick fix. Technology dysfunction is also often a symptom of poor policies.

But in Carnahan’s view, one way to build faith in government is to prove that it can be competent. And technology is an essential area to show that…(More)”.

Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them


Article by Aaron Sankin, Dhruv Mehrotra for Gizmodo, Surya Mattu, and Annie Gilbertson: “Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime prediction software called PredPol.

The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.

The Markup and Gizmodo analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.
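
The comparison behind these findings, prediction volume per neighborhood joined against demographic data, can be sketched as a simple merge and group-by. The block below is a rough illustration rather than The Markup’s actual methodology; the file names, column names, and census fields are assumptions.

```python
# Rough illustration of correlating prediction volume with neighborhood
# demographics. Not The Markup's actual pipeline; "predictions.csv",
# "census.csv", and their columns are hypothetical.
import pandas as pd

preds = pd.read_csv("predictions.csv")   # one row per crime prediction, tagged with a census tract id
census = pd.read_csv("census.csv")       # tract_id, pct_black, pct_latino, median_income

# Count predictions per tract, then attach demographic data for every tract,
# treating tracts with no predictions as zero.
counts = preds.groupby("tract_id").size().rename("n_predictions").reset_index()
merged = counts.merge(census, on="tract_id", how="right").fillna({"n_predictions": 0})

# Compare demographics of heavily targeted tracts vs. those with few or no predictions.
heavily_targeted = merged["n_predictions"] > merged["n_predictions"].median()
print(merged.groupby(heavily_targeted)[["pct_black", "pct_latino", "median_income"]].mean())
```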

These communities weren’t just targeted more—in some cases they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years. A few neighborhoods in our data were the subject of more than 11,000 predictions.

The software often recommended daily patrols in and around public and subsidized housing, targeting the poorest of the poor.

“Communities with troubled relationships with police—this is not what they need,” said Jay Stanley, a senior policy analyst at the ACLU Speech, Privacy, and Technology Project. “They need resources to fill basic social needs.”…(More)”.