The Trouble With Disclosure: It Doesn’t Work


Jesse Eisinger at ProPublica: “Louis Brandeis was wrong. The lawyer and Supreme Court justice famously declared that sunlight is the best disinfectant, and we have unquestioningly embraced that advice ever since.

 Over the last century, disclosure and transparency have become our regulatory crutch, the answer to every vexing problem. We require corporations and government to release reams of information on food, medicine, household products, consumer financial tools, campaign finance and crime statistics. We have a booming “report card” industry for a range of services, including hospitals, public schools and restaurants.

All this sunlight is blinding. As new scholarship is demonstrating, the value of all this information is unproved. Paradoxically, disclosure can be useless — and sometimes actually harmful or counterproductive.

“We are doing disclosure as a regulatory move all over the board,” says Adam J. Levitin, a law professor at Georgetown. “The funny thing is, we are doing this despite very little evidence of its efficacy.”

Let’s start with something everyone knows about — the “terms of service” agreements for the likes of iTunes. Like everybody else, I click the “I agree” box, feeling a flash of resentment. I’m certain that in Paragraph 184 is a clause signing away my firstborn to a life of indentured servitude to Timothy D. Cook as his chief caviar spoon keeper.

Our legal theoreticians have determined these opaque monstrosities work because someone, somewhere reads the fine print in these contracts and keeps corporations honest. It turns out what we laymen intuit is true: No one reads them, according to research by a New York University law professor, Florencia Marotta-Wurgler.

In real life, there is no critical mass of readers policing the agreements. And if there were an eagle-eyed crew of legal experts combing through these agreements, what recourse would they have? Most people don’t even know that the Supreme Court has gutted their rights to sue in court, and they instead have to go into arbitration, which usually favors corporations.

The disclosure bonanza is easy to explain. Nobody is against it. It’s politically expedient. Companies prefer such rules, especially in lieu of actual regulations that would curtail bad products or behavior. The opacity lobby — the remora fish class of lawyers, lobbyists and consultants in New York and Washington — knows that disclosure requirements are no bar to dodgy practices. You just have to explain what you’re doing in sufficiently incomprehensible language, a task that earns those lawyers a hefty fee.

Of course, some disclosure works. Professor Levitin cites two examples. The first is an olfactory disclosure. Methane doesn’t have any scent, but a foul smell is added to alert people to a gas leak. The second is ATM fees. A study in Australia showed that once fees were disclosed, people avoided the high-fee machines and took out more cash when they did have to use them.

But to Omri Ben-Shahar, co-author of a recent book, “More Than You Wanted to Know: The Failure of Mandated Disclosure,” these are cherry-picked examples in a world awash in useless disclosures. Of course, information is valuable. But disclosure as a regulatory mechanism doesn’t work nearly well enough, he argues….(More)

Tools to Innovate: Data Analytics, Risk Management, and Shared Services


New report by the IBM Center for The Business of Government: “Today, governments have access to a variety of tools to successfully implement agency programs. For example, Data Analytics—especially of financial data—can be used to better inform decision-making by ensuring agencies have the information they need at the point in time when it can be most effective. In addition, governments at all levels can more effectively address risks using new Risk Management approaches. And finally, Shared Services can not only save money, but also stimulate innovation, improve decision-making, and increase the quality of services expected by citizens.

The IBM Center has published a variety of reports related to these topics and accordingly, we have brought key findings on these topics together in the compilation that follows. We welcome your thoughts on these issues, and look forward to a continued dialogue with government leaders and stakeholders on actions to help agencies achieve their mission effectively and efficiently….(More)”

Why transparency can be a dirty word


Francis Fukuyama in the Financial Times: “It is hard to think of a political good that is more universally praised than government transparency. Whereas secrecy shelters corruption, abuse of power, undue influence and a host of other evils, transparency allows citizens to keep their rulers accountable. Or that is the theory.

It is clear that there are vast areas in which modern governments should reveal more. Edward Snowden’s revelations of eavesdropping by the National Security Agency have encouraged the belief that the US government has been not nearly transparent enough. But is it possible to have too much transparency? The answer is clearly yes: demands for certain kinds of transparency have hurt government effectiveness, particularly with regard to its ability to deliberate.

The US has a number of statutes mandating transparency passed decades ago in response to perceived government abuses, and motivated by perfectly reasonable expectations that the government should operate under greater scrutiny. Yet they have had a number of unfortunate consequences.

The Federal Advisory Committee Act, for example, places onerous requirements on any public agency seeking to consult a group outside the government, requiring that they are formally approved and meet various criteria for political balance. Meetings must be held in public. The Government in the Sunshine Act stipulates that, with certain exceptions, “every portion of every meeting of an agency shall be open to public observation”.

These obligations put a serious damper on informal consultations with citizens, and even make it difficult for officials to talk to one another. Deliberation, whether in the context of a family or a federal agency, requires people to pose hypotheticals and, when trying to reach agreement, make concessions.

When the process itself is open to public scrutiny, officials fear being hounded for a word taken out of context. They resort to cumbersome methods of circumventing the regulations, such as having one-on-one discussions so as not to trigger a group rule, or having subordinates do all the serious work.

The problem with the Freedom of Information Act is different. It was meant to serve investigative journalists looking into abuses of power. But today a large number of FOIA requests are filed by corporate sleuths trying to ferret out secrets for competitive advantage, or simply by individuals curious to find out what the government knows about them. The FOIA can be “weaponised”, as when the activist group Judicial Watch used it to obtain email documents on the Obama administration’s response to the 2012 attack on the US compound in Benghazi…..

National security aside, the federal government’s executive branch is probably one of the most transparent organisations on earth — no corporation, labour union, lobbying group or non-profit organisation is subject to such scrutiny. The real problem, as Professor John DiIulio of the University of Pennsylvania has pointed out, is that most of the work of government has been outsourced to contractors who face none of the transparency requirements of the government itself. It is an impossible task even to establish the number of such contractors in a single American city, much less how they are performing their jobs.

In Europe, where there is no equivalent to the FACA or the Sunshine Act, governments can consult citizens’ groups more flexibly. There is, of course, a large and growing distrust of European institutions by citizens. But America’s experience suggests that greater transparency requirements do not necessarily lead to more trust in government….(More)”

 

Postal Service’s Futuristic Vision for the Internet of Things


Mohana Ravindranath at NextGov: “The U.S. Postal Service is betting a new device will soon enter high-tech homes, alongside the self-adjusting thermostats, text-enabled washing machines, and fridges notifying owners when groceries run out: the smart mailbox.

It’s the future of mail delivery, according to USPS’ Office of the Inspector General: a mailbox, equipped with tiny sensors that can collect data on mail delivery and pickup times, or the outside temperature. Owners might control the box’s internal temperature and locking mechanism through a smartphone app.

The smart mailbox is just one element of the Postal Service’s larger vision for the Internet of Things, a term for a connected network of devices and sensors. In a report about that vision — dubbed the “Internet of Postal Things” — the Inspector General’s Office, in collaboration with IBM, paints a picture of post office systems that auto-fill paperwork for customers as they walk in, delivery vehicles that monitor themselves for maintenance, and sensors that notify package recipients upon delivery, among other scenarios.

The IG’s office recommended the Postal Service “start experimenting with Internet of Things technologies” to modernize its business, “as well as develop new business models to stay relevant in the digital age.”

A more advanced technological infrastructure could prepare the agency for same-day delivery and real-time rerouting for packages, options which “have the potential to grow in importance as consumers continue to buy more items online,” the report said.

For instance, sensors in a smart mailbox could scan a barcode or read an RFID tag on a letter, which could confirm delivery and replace the hand scan and delivery signature process, the report said. If the mailbox were temperature controlled, employees could deliver groceries or temperature-sensitive medicine.

These new mailboxes could also make the agency money. “If 5 percent of the 117 million U.S. households rented such a box for $3 a month, the product would generate $210 million a year in revenue,” the report said.
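
The arithmetic behind that revenue figure is simple to reproduce; a quick sketch follows, where the 5 percent take-up rate and the $3 monthly fee are the report’s own assumptions rather than measured demand:

```python
# Back-of-the-envelope check of the report's smart-mailbox revenue figure:
# 5 percent of 117 million U.S. households renting a box at $3 per month.
households = 117_000_000
adoption_rate = 0.05   # the report's assumed take-up, not measured demand
monthly_fee = 3        # dollars per household per month

annual_revenue = households * adoption_rate * monthly_fee * 12
print(f"${annual_revenue / 1e6:.1f} million per year")  # -> $210.6 million per year
```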

Postal Service vehicles could use sensors to drive routing systems, modeled after DHL’s SmartTruck program, which can recalculate routes based on real-time events such as traffic, weather, or new pick-up requests. Vehicle sensors could also weigh cargo to make sure the truck is fully packed.

In the 2012 fiscal year, the Postal Service spent $926 million on fuel for more than 15,000 highway routes, some of which are coast-to-coast — sensors could help manage those contracts better, the report said.

In-vehicle sensors could also “incentivize contract drivers as well as its own carriers to adopt fuel-efficient behavior.”

The sensors could additionally collect large volumes of data on routes, helping the agency identify operational inefficiencies. The Postal Service has about 200,000 vehicles traveling more than 1.2 billion miles annually, often the same route six days a week — and plans to acquire about 10,000 vehicles over the next two years, the report said.

The IG also suggested post offices themselves get a tech upgrade. More efficient lighting or air conditioning systems might turn off when the office is empty. During business hours, beacon technology could detect when a customer enters the post office, send them a push notification through a smartphone app, and direct them to the right counter.
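
A minimal sketch of how such a beacon-triggered flow could be wired together is below; the beacon identifiers, counter messages, and the send_push helper are hypothetical placeholders for illustration, not anything specified in the IG report:

```python
# Hypothetical sketch: greet a customer and point them to the right counter
# when a Bluetooth beacon in the lobby detects their phone via the app.
# Beacon IDs, messages, and send_push are illustrative placeholders.

COUNTER_BY_BEACON = {
    "lobby-entrance": "Welcome! Tap here to pre-fill your forms.",
    "passport-area":  "Passport services are at counter 3.",
    "package-pickup": "Package pickup is at counter 1.",
}

def send_push(customer_id: str, message: str) -> None:
    """Stand-in for a real push-notification service call."""
    print(f"push -> {customer_id}: {message}")

def on_beacon_detected(customer_id: str, beacon_id: str) -> None:
    message = COUNTER_BY_BEACON.get(beacon_id)
    if message is not None:
        send_push(customer_id, message)

on_beacon_detected("customer-42", "package-pickup")
```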

Customers could also use an app to pre-fill paperwork, such as customs forms, which could then be displayed on a clerk’s monitor. They may also use a smartphone app to pay for postage….(More)”

Yelp’s Consumer Protection Initiative: ProPublica Partnership Brings Medical Info to Yelp


Yelp Official Blog: “…exists to empower and protect consumers, and we’re continually focused on how we can enhance our service while enhancing the ability for consumers to make smart transactional decisions along the way.

A few years ago, we partnered with local governments to launch the LIVES open data standard. Now, millions of consumers find restaurant inspection scores when that information is most relevant: while they’re in the middle of making a dining decision (instead of when they’re signing the check). Studies have shown that displaying this information more prominently has a positive impact.
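
What makes this kind of in-context display cheap to build is that LIVES publishes inspection results as plain CSV feeds keyed to individual businesses. A rough sketch of pulling the latest score per business from such a feed is below; the column names and date format are assumptions for illustration, so the actual LIVES specification should be consulted for the real schema:

```python
# Rough sketch: pick out the most recent inspection score per business
# from a LIVES-style inspections CSV. Column names (business_id, date,
# score) and the YYYYMMDD date format are assumptions for illustration.
import csv

def latest_scores(inspections_csv: str) -> dict[str, dict]:
    latest: dict[str, dict] = {}
    with open(inspections_csv, newline="") as f:
        for row in csv.DictReader(f):
            biz, date = row["business_id"], row["date"]
            # YYYYMMDD-style dates compare correctly as strings
            if biz not in latest or date > latest[biz]["date"]:
                latest[biz] = {"date": date, "score": int(row["score"])}
    return latest
```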

Today we’re excited to announce we’ve joined forces with ProPublica to incorporate health care statistics and consumer opinion survey data onto the Yelp business pages of more than 25,000 medical treatment facilities. Read more in today’s Washington Post story.

We couldn’t be more excited to partner with ProPublica, the Pulitzer Prize-winning non-profit newsroom that produces investigative journalism in the public interest.

The information is compiled by ProPublica from their own research and the Centers for Medicare and Medicaid Services (CMS) for 4,600 hospitals, 15,000 nursing homes, and 6,300 dialysis clinics in the US and will be updated quarterly. Hover text on the business page will explain the statistics, which include the number of serious deficiencies and fines per nursing home and emergency room wait times for hospitals. For example, West Kendall Baptist Hospital has better than average doctor communication and an average 33-minute ER wait time, Beachside Nursing Center currently has no deficiencies, and San Mateo Dialysis Center has a better than average patient survival rate.

Now the millions of consumers who use Yelp to find and evaluate everything from restaurants to retail will have even more information at their fingertips when they are in the midst of the most critical life decisions, like which hospital to choose for a sick child or which nursing home will provide the best care for aging parents….(More)

Confronting the Internet’s Dark Side: Moral and Social Responsibility on the Free Highway


New book by Raphael Cohen-Almagor: “Terrorism, cyberbullying, child pornography, hate speech, cybercrime: along with unprecedented advancements in productivity and engagement, the Internet has ushered in a space for violent, hateful, and antisocial behavior. How do we, as individuals and as a society, protect against dangerous expressions online? Confronting the Internet’s Dark Side is the first book on social responsibility on the Internet. It aims to strike a balance between the free speech principle and the responsibilities of the individual, corporation, state, and the international community. This book brings a global perspective to the analysis of some of the most troubling uses of the Internet. It urges net users, ISPs, and liberal democracies to weigh freedom and security, finding the golden mean between unlimited license and moral responsibility. This judgment is necessary to uphold the very liberal democratic values that gave rise to the Internet and that are threatened by an unbridled use of technology. (More)

Transforming public services the right way with impact management


Emily Bazalgette at FutureGov: “…Impact evaluation involves using a range of research methodologies to investigate whether our products and services are having an impact on users’ lives. ….Rigorous academic impact evaluation wasn’t really designed for rapidly iterating products made by a fast-moving digital and design company like FutureGov. Our products can change significantly over short periods of time — for instance, in a single workshop Doc Ready evolved from a feature-rich social media platform to a stripped-down checklist builder — and that can create a tension between our agile process and traditional evaluation methodologies, which tend to require a fixed product to support a long-term evaluation plan.

We’ve decided to embrace this tension by using Theories of Change, a useful evaluation tool recommended to us by our investors and partners Nesta Impact Investments. To give you a flavour (excuse the pun), below we have Casserole Club’s Theory of Change.

[Figure: Casserole Club’s Theory of Change]

The problem we’re trying to solve (reducing social isolation) doesn’t tend to change, but the way we solve it might (the inputs and short to medium-term outcomes). In future, we may find that we need to adapt to serve new user groups, or operate in different channels, or that there are mediating outcomes for social isolation that Casserole Club produces other than social contact with a Casserole Club cook. Theories of Change allow us to stay focused on big-picture outcomes, while being flexible about how the product delivers on these outcomes.

Another lesson is to make evaluation everyone’s business. Like many young-ish companies, FutureGov is not at the stage where we have the resources to support a full-time, dedicated Head of Impact. But we’ve found that you can get pretty far if you’ve got a flat structure and lots of passionate people (both of which, luckily, we have). Our lack of hierarchy means that anyone can take up a project and run with it, and collaboration across the company is encouraged. Product impact evaluation is owned by the product teams who manage the product over time. This means we can get more done, that research design benefits from the deep knowledge of our product teams, and that evaluation skills (like how to design a decent survey or depth interview) have started to spread across the organisation….(More)”

Unpacking Civic Tech – Inside and Outside of Government


David Moore at Participatory Politics Foundation: “…I’ll argue it’s important to unpack the big-tent term “civic tech” to at least five major component areas, overlapping in practice & flexible of course – in order to more clearly understand what we have and what we need:

  • Responsive & efficient city services (e.g., SeeClickFix)
  • Open data portals & open government data publishing / visualization (Socrata, OpenGov.com)
  • Engagement platforms for government entities (Mindmixer aka Sidewalk)
  • Community-focused organizing services (Change, NextDoor, Brigade; these could validly be split, as NextDoor is of course place-based IRL)
  • Geo-based services & open mapping data (e.g., Civic Insight)

More precisely, instead of “civic tech”, the term #GovTech can be productively applied to companies whose primary business model is vending to government entities – some #govtech is #opendata, some is civic #engagement, and that’s healthy & brilliant. But it doesn’t make sense to me to conflate as “civic tech” both government software vendors and the open-data work of good-government watchdogs. Another framework for understanding the inside / outside relationship to government, in company incorporation strategies & priorities, is broadly as follows:

  • tech entirely-outside government (such as OpenCongress or OpenStates);
  • tech mostly-outside government, where some elected officials volunteer to participate (such as AskThem, Councilmatic, DemocracyOS, or Change Decision Makers);
  • tech mostly-inside government, paid-for-by-government (such as Mindmixer or SpeakUp or OpenTownHall) where elected officials or gov’t staff sets the priorities, with the strong expectation of an official response;
  • deep legacy tech inside government, the enterprise vendors of closed-off CRM software to Congressional offices (including major defense contractors!).

These are the websites up and running today in the civic tech ecosystem – surveying them, I see there’s a lot of work still to do on developing advanced metrics towards thicker civic engagement. Towards evaluating whether the existing tools are having the impact we hope and expect them to at their level of capitalization, and to better contextualize the role of very-small non-profit alternatives….

One question to study is whether the highest-capitalized U.S. civic tech companies (Change, NextDoor, Mindmixer, Socrata, possibly Brigade) – which also generally have the most users – are meeting ROI on continual engagement within communities.

  • If it’s a priority metric for users of a service to attend a community meeting, for example, are NextDoor or Mindmixer having expected impact?
  • How about metrics on return participation, joining an advocacy group, attending a district meeting with their U.S. reps, organizing peer-to-peer with neighbors?
  • How about writing or annotating their own legislation at the city level, introducing it for an official hearing, and moving it up the chain of government to state and even federal levels for consideration? What actual new popular public policies or systemic reforms are being carefully, collaboratively passed?
  • Do less-capitalized, community-based non-profits (AskThem, 596 Acres, OpenPlans’ much-missed Shareabouts, CKAN data portals, LittleSis, BeNeighbors, PBNYC tools) – with less scale, but with more open-source, open-data tools that can be remixed – improve on the tough metric of ROI on continual engagement or research-impact in the news?…(More)

Data Ethics in the Age of the Quantified Self


Video of Aspen Ideas Festival Session on Data Ethics: “Leading thinkers from business, government, civil society, and academia explore and debate ethics in the age of the quantified society. What role do ethics play in guiding existing efforts to develop and deploy data and information technologies? Does data ethics need to develop as a field to help guide policy, research, and practice — just as bioethics did in order to guide medicine and biology? Why or why not? Speakers: Kate Crawford, Jonathan Zittrain, Ashkan Soltani, Alexis Madrigal….(More)”

The Causes, Costs and Consequences of Bad Government Data


Katherine Barrett & Richard Greene in Governing: “Data is the lifeblood of state government. It’s the crucial commodity that’s necessary to manage projects, avoid fraud, assess program performance, keep the books in balance and deliver services efficiently. But even as the trend toward greater reliance on data has accelerated over the past decades, the information itself has fallen dangerously short of the mark. Sometimes it doesn’t exist at all. But worse than that, all too often it’s just wrong.

There are examples everywhere. Last year, the California auditor’s office issued a report that looked at accounting records at the State Controller’s Office to see whether it was accurately recording sick leave and vacation credits. “We found circumstances where instead of eight hours, it was 80 and in one case, 800,” says Elaine Howle, the California state auditor. “And the system didn’t have controls to say that’s impossible.” The audit found 200,000 questionable hours of leave due to data entry errors, with a value of $6 million.
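
The control the auditor describes as missing is straightforward to express; a hedged sketch is below, where the eight-hour daily ceiling and the field names are illustrative assumptions rather than the Controller’s actual rules:

```python
# Illustrative sketch of the kind of input control the auditor found missing:
# reject leave entries that are physically impossible for a single day.
# The 8-hour ceiling and field names are assumptions for illustration.
MAX_DAILY_LEAVE_HOURS = 8

def validate_leave_entry(employee_id: str, hours: float) -> list[str]:
    errors = []
    if hours < 0:
        errors.append(f"{employee_id}: negative leave hours ({hours})")
    if hours > MAX_DAILY_LEAVE_HOURS:
        errors.append(
            f"{employee_id}: {hours} hours exceeds the "
            f"{MAX_DAILY_LEAVE_HOURS}-hour daily maximum"
        )
    return errors

# A data-entry slip of 80 (or 800) instead of 8 would be caught immediately:
print(validate_leave_entry("E-1001", 80))
```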

Mistakes like that are embarrassing, and can lead to unequal treatment of valued employees. Sometimes, however, decisions made with bad data can have deeper consequences. In 2012, the secretary of environmental protection in Pennsylvania told Congress that there was no evidence the state’s water quality had been affected by fracking. “Tens of thousands of wells have been hydraulically fractured in Pennsylvania,” he said, “without any indication that groundwater quality has been impacted.”

But by August 2014, the same department published a list of 248 incidents of damage to well water due to gas development. Why didn’t the department pick up on the water problems sooner? A key reason was that the data collected by its six regional offices had not been forwarded to the central office. At the same time, the regions differed greatly in how they collected, stored, transmitted and dealt with the information. An audit concluded that Pennsylvania’s complaint tracking system for water quality was ineffective and failed to provide “reliable information to effectively manage the program.”

When data is flawed, the consequences can reach throughout the entire government enterprise. Services are needlessly duplicated; evaluation of successful programs is difficult; tax dollars go uncollected; infrastructure maintenance is conducted inefficiently; health-care dollars are wasted. The list goes on and on. Increasingly, states are becoming aware of just how serious the problem is. “The poor quality of government data,” says Dave Yost, Ohio’s state auditor, “is probably the most important emerging trend for government executives, across the board, at all levels.”

Just how widespread a problem is data quality? In a Governing telephone survey with more than 75 officials in 46 states, about 7 out of 10 said that data problems were frequently or often an impediment to doing their business effectively. No one who worked with program data said this was rarely the case. (View the full results of the survey in this infographic.)…(More)

See also: Bad Data Is at All Levels of Government and The Next Big Thing in Data Analytics