RegData


“RegData, developed by Patrick A. McLaughlin, Omar Al-Ubaydli, and the Mercatus Center at George Mason University, improves dramatically on the existing methods used to quantify regulations. Previous efforts to assess the extent of regulation in the United States have used imprecise variables such as the number of pages published in the Federal Register or the number of new rules created annually. However, not all regulations are equal in their effects on the economy or on different sectors of the economy. One page of regulatory text is often quite different from another page in content and consequence.
RegData improves upon existing metrics of regulation in three principal ways:

  1. RegData provides a novel measure that quantifies regulations based on the actual content of regulatory text. In other words, RegData examines the regulatory text itself, counting the number of binding constraints or “restrictions”—words that indicate an obligation to comply, such as “shall” or “must.” This is important because some regulatory programs can be hundreds of pages long with relatively few restrictions, while others only have a few paragraphs with a relatively high number of restrictions.
  2. RegData quantifies regulation by industry. It uses the same industry classes as the North American Industry Classification System (NAICS), which categorizes and describes each industry in the US economy. Using industry-specific quantifications of regulation, users can examine the growth of regulation relevant to a particular industry over time or compare growth rates across industries.
    There are several potential uses of a tool that measures regulation relevant to specific industries. Both the causes and consequences of regulation are likely to differ from one industry to the next, and by quantifying regulations for all industries, individuals can test whether industry characteristics, such as dynamism, unionization, or a penchant for lobbying, are correlated with industry-specific regulation levels.
    For example, if someone wanted to know whether high unionization rates are correlated with heavy regulation, the person could compare RegData’s measure of industry-specific regulation for highly unionized industries to industries with little to no unionization.
  3. *NEW* RegData 2.0 provides the user with the ability to quantify the regulation that specific federal regulators (including agencies, offices, bureaus, commissions, or administrations) have produced. For example, a user can now see how many restrictions a specific administration of the Department of Transportation (e.g., the National Highway Traffic Safety Administration) has produced in each year.”
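The restriction-counting approach described in the excerpt above can be sketched in a few lines of code. The word list and sample text below are illustrative assumptions, not RegData's official lexicon or methodology:

```python
import re

# Terms indicating binding constraints, per the RegData description above.
# This exact list is an illustrative assumption, not the project's lexicon.
RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restrictions(text: str) -> int:
    """Count occurrences of binding-constraint terms in regulatory text."""
    lowered = text.lower()
    total = 0
    for term in RESTRICTION_TERMS:
        # \b word boundaries avoid matching substrings like "mustard"
        total += len(re.findall(r"\b" + re.escape(term) + r"\b", lowered))
    return total

sample = "The operator shall file a report. Operators must not exceed limits."
print(count_restrictions(sample))  # → 2
```

A count like this, computed per industry-tagged section of the Code of Federal Regulations, is the kind of content-based measure the excerpt contrasts with page counts.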

For Big-Data Scientists, ‘Janitor Work’ Is Key Hurdle to Insights


in the New York Times: “Technology revolutions come in measured, sometimes foot-dragging steps. The lab science and marketing enthusiasm tend to underestimate the bottlenecks to progress that must be overcome with hard work and practical engineering.

The field known as “big data” offers a contemporary case study. The catchphrase stands for the modern abundance of digital data from many sources — the web, sensors, smartphones and corporate databases — that can be mined with clever software for discoveries and insights. Its promise is smarter, data-driven decision-making in every field. That is why data scientist is the economy’s hot new job.

Yet far too much handcrafted work — what data scientists call “data wrangling,” “data munging” and “data janitor work” — is still required. Data scientists, according to interviews and expert estimates, spend from 50 percent to 80 percent of their time mired in this more mundane labor of collecting and preparing unruly digital data, before it can be explored for useful nuggets….”
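The "data janitor work" described above is easy to illustrate. The records and field names below are invented, but the normalization steps (stripping whitespace, unifying case, coercing numeric strings) are typical of what consumes so much of a data scientist's time before analysis can begin:

```python
# A minimal illustration of data wrangling: normalizing inconsistent
# records before any analysis is possible. All values are invented.
raw_records = [
    {"city": " new york ", "visits": "1,204"},
    {"city": "New York", "visits": "980"},
    {"city": "CHICAGO", "visits": "n/a"},
]

def clean(record: dict) -> dict:
    """Normalize one record: trim and title-case the city, parse the count."""
    city = record["city"].strip().title()
    visits_text = record["visits"].replace(",", "")
    visits = int(visits_text) if visits_text.isdigit() else None
    return {"city": city, "visits": visits}

cleaned = [clean(r) for r in raw_records]
print(cleaned)
```

Even in this toy case, three records yield three distinct cleaning problems; real datasets multiply them by millions of rows and dozens of sources.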

Bloomberg Philanthropies Announces Major New Investment In City Halls' Capacity To Innovate


Press Release: “Bloomberg Philanthropies today announced a new $45 million investment to boost the capacity of city halls to use innovation to tackle major challenges and improve urban life. The foundation will direct significant funding and other assistance to help dozens of cities adopt the Innovation Delivery model, an approach to generating and implementing new ideas that has been tested and refined over the past three years in partnership with city leaders in Atlanta, Chicago, Louisville, Memphis, and New Orleans. …

The foundation has invited over 80 American cities to apply for Innovation Delivery grants. Eligible cities have at least 100,000 residents and mayors with at least two years left in office. Grantees will be selected in the fall. They will receive from $250,000 to $1,000,000 annually over three years, as well as implementation support and peer-to-peer learning opportunities. Newly formed Innovation Delivery Teams will hit the ground running in each city no later than spring 2015.
Innovation Delivery Teams use best-in-class idea generation techniques with a structured, data-driven approach to delivering results. Operating as an in-house innovation consultancy, they have enabled mayors in the original five cities to produce clear results, such as:

  • New Orleans reduced murder in 2013 by 19% compared to the previous year, resulting in the lowest number of murders in New Orleans since 1985.
  • Memphis reduced retail vacancy rates by 30% along key commercial corridors.
  • Louisville redirected 26% of low-severity 911 medical calls to a doctor’s office or immediate care center instead of requiring an ambulance trip to the emergency room.
  • Chicago cut the licensing time for new restaurants by 33%; more than 1,000 new restaurants have opened since the Team began its work.
  • Atlanta moved 1,022 chronically homeless individuals into permanent housing, quickly establishing itself as a national leader.

“Innovation Delivery has been an essential part of our effort to bring innovation, efficiency and improved services to our customers,” said Louisville Mayor Greg Fischer. “Philanthropy can play an important role in expanding the capacity of cities to deliver better, bolder results. Bloomberg Philanthropies is one of few foundations investing in this area, and it has truly been a game changer for our city.”
In addition to direct investments in cities, Bloomberg Philanthropies will fund technical assistance, research and evaluation, and partnerships with organizations to further spread the Innovation Delivery approach. The Innovation Delivery Playbook, which details the approach and some experiences of the original cities with which Bloomberg Philanthropies partnered, is available at: www.bloomberg.org …”

Not just the government’s playbook


at Radar: “Whenever I hear someone say that “government should be run like a business,” my first reaction is “do you know how badly most businesses are run?” Seriously. I do not want my government to run like a business — whether it’s like the local restaurants that pop up and die like wildflowers, or megacorporations that sell broken products, whether financial, automotive, or otherwise.
If you read some elements of the press, it’s easy to think that healthcare.gov is the first time that a website failed. And it’s easy to forget that a large non-government website was failing, in surprisingly similar ways, at roughly the same time. I’m talking about the Common App site, the site high school seniors use to apply to most colleges in the US. There were problems with pasting in essays, problems with accepting payments, problems with the app mysteriously hanging for hours, and more.
 
I don’t mean to pick on Common App; you’ve no doubt had your own experience with woefully bad online services: insurance companies, Internet providers, even online shopping. I’ve seen my doctor swear at the Epic electronic medical records application when it crashed repeatedly during an appointment. So, yes, the government builds bad software. So does private enterprise. All the time. According to TechRepublic, 68% of all software projects fail. We can debate why, and we can even debate the numbers, but there’s clearly a lot of software #fail out there — in industry, in non-profits, and yes, in government.
With that in mind, it’s worth looking at the U.S. CIO’s Digital Services Playbook. It’s not ideal, and in many respects, its flaws reveal its origins. But it’s pretty good, and should certainly serve as a model, not just for the government, but for any organization, small or large, that is building an online presence.
The playbook consists of 13 principles (called “plays”) that drive modern software development:

  • Understand what people need
  • Address the whole experience, from start to finish
  • Make it simple and intuitive
  • Build the service using agile and iterative practices
  • Structure budgets and contracts to support delivery
  • Assign one leader and hold that person accountable
  • Bring in experienced teams
  • Choose a modern technology stack
  • Deploy in a flexible hosting environment
  • Automate testing and deployments
  • Manage security and privacy through reusable processes
  • Use data to drive decisions
  • Default to open

These aren’t abstract principles: most of them should be familiar to anyone who has read about agile software development, attended one of our Velocity conferences, one of the many DevOps Days, or a similar event. All of the principles are worth reading (it’s not a long document). I’m going to call out two for special attention….”

How technology is beating corruption


Jim Yong Kim at World Economic Forum: “Good governance is critical for all countries around the world today. When it doesn’t exist, many governments fail to deliver public services effectively, health and education services are often substandard and corruption persists in rich and poor countries alike, choking opportunity and growth. It will be difficult to reduce extreme poverty — let alone end it — without addressing the importance of good governance.
But this is not a hopeless situation. In fact, a new wave of progress on governance suggests we may be on the threshold of a transformational era. Countries are tapping into some of the most powerful forces in the world today to improve services and transparency. These forces include the spread of information technology and its convergence with grassroots movements for transparency, accountability and citizen empowerment. In some places, this convergence is easing the path to better-performing and more accountable governments.
The Philippines is a good example of a country embracing good governance. During a recent visit, I spoke with President Benigno Aquino about his plans to reduce poverty, create jobs, and ensure that economic growth is inclusive. He talked in great detail about how improving governance is a fundamentally important part of their strategy. The government has opened its data and contract information so citizens can see how their tax money is spent. The Foreign Aid Transparency Hub, launched after Typhoon Yolanda, offers a real-time look at pledges made and money delivered for typhoon recovery. Geo-tagging tools monitor assistance for people affected by the typhoon.
Opening budgets to scrutiny
This type of openness is spreading. Now many countries that once withheld information are opening their data and budgets to public scrutiny.
Late last year, my organization, the World Bank Group, established the Open Budgets Portal, a repository for budget data worldwide. So far, 13 countries have posted their entire public spending datasets online — including Togo, the first fragile state to do so.
In 2011, we helped Moldova become the first country in central Europe to launch an open data portal and put its expenditures online. Now the public and media can access more than 700 datasets, and are asking for more.
The original epicenter of the Arab Spring, Tunisia, recently passed a new constitution and is developing the first open budget data portal in the Middle East and North Africa. Tunisia has taken steps towards citizen engagement by developing a citizens’ budget and civil society-led platforms such as Marsoum41, to support freedom of information requests, including via mobile.
Using technology to improve services
Countries also are tapping into technology to improve public and private services. Estonia is famous for building an information technology infrastructure that has permitted widespread use of electronic services — everything from filing taxes online to filling doctors’ drug prescriptions.
In La Paz, Bolivia, a citizen feedback system known as OnTrack allows residents of one of the city’s marginalized neighbourhoods to send a text message on their mobile phones to provide feedback, make suggestions or report a problem related to public services.
In Pakistan, government departments in Punjab are using smart phones to collect real-time data on the activities of government field staff — including photos and geo-tags — to help reduce absenteeism and lax performance….”

Technology’s Crucial Role in the Fight Against Hunger


Crowdsourcing, predictive analytics and other new tools could go far toward finding innovative solutions for America’s food insecurity.

National Geographic recently sent three photographers to explore hunger in the United States. It was an effort to give a face to a very troubling statistic: Even today, one-sixth of Americans do not have enough food to eat. Fifty million people in this country are “food insecure” — having to make daily trade-offs among paying for food, housing or medical care — and 17 million of them skip at least one meal a day to get by. When choosing what to eat, many of these individuals must choose between lesser quantities of higher-quality food and larger quantities of less-nutritious processed foods, the consumption of which often leads to expensive health problems down the road.
This is an extremely serious, but not easily visible, social problem. Nor does the challenge it poses become any easier when poorly designed public-assistance programs continue to count the sauce on a pizza as a vegetable. The deficiencies caused by hunger increase the likelihood that a child will drop out of school, lowering her lifetime earning potential. In 2010 alone, food insecurity cost America $167.5 billion, a figure that includes lost economic productivity, avoidable health-care expenses and social-services programs.
As much as we need specific policy innovations if we are to eliminate hunger in America, food insecurity is just one of many extraordinarily complex and interdependent “systemic” problems facing us that would benefit from the application of technology, not just to identify innovative solutions but to implement them as well. In addition to laudable policy initiatives by such states as Illinois and Nevada, which have made hunger a priority, or Arkansas, which suffers the greatest level of food insecurity but which is making great strides at providing breakfast to schoolchildren, we can — we must — bring technology to bear to create a sustained conversation between government and citizens to engage more Americans in the fight against hunger.

Identifying who is genuinely in need cannot be done as well by a centralized government bureaucracy — even one with regional offices — as it can through a distributed network of individuals and organizations able to pinpoint with on-the-ground accuracy where the demand is greatest. Just as Ushahidi uses crowdsourcing to help locate and identify disaster victims, it should be possible to leverage the crowd to spot victims of hunger. As it stands, attempts to eradicate so-called food deserts are often built around developing solutions for residents rather than with residents. Strategies to date tend to focus on the introduction of new grocery stores or farmers’ markets but with little input from or involvement of the citizens actually affected.

Applying predictive analytics to newly available sources of public as well as private data, such as that regularly gathered by supermarkets and other vendors, could also make it easier to offer coupons and discounts to those most in need. In addition, analyzing nonprofits’ tax returns, which are legally open and available to all, could help map where the organizations serving those in need leave gaps that need to be closed by other efforts. The Governance Lab recently brought together U.S. Department of Agriculture officials with companies that use USDA data in an effort to focus on strategies supporting a White House initiative to use climate-change and other open data to improve food production.

Such innovative uses of technology, which put citizens at the center of the service-delivery process and streamline the delivery of government support, could also speed the delivery of benefits, thus reducing both costs and, every bit as important, the indignity of applying for assistance.

Being open to new and creative ideas from outside government through brainstorming and crowdsourcing exercises using social media can go beyond simply improving the quality of the services delivered. Some of these ideas, such as those arising from exciting new social-science experiments involving the use of incentives for “nudging” people to change their behaviors, might even lead them to purchase more healthful food.

Further, new kinds of public-private collaborative partnerships could create the means for people to produce their own food. Both new kinds of financing arrangements and new apps for managing the shared use of common real estate could make more community gardens possible. Similarly, with the kind of attention, convening and funding that government can bring to an issue, new neighbor-helping-neighbor programs — where, for example, people take turns shopping and cooking for one another to alleviate time away from work — could be scaled up.

Then, too, advances in citizen engagement and oversight could make it more difficult for lawmakers to cave to the pressures of lobbying groups that push for subsidies for those crops, such as white potatoes and corn, that result in our current large-scale reliance on less-nutritious foods. At the same time, citizen scientists reporting data through an app would be able to do a much better job than government inspectors in reporting what is and is not working in local communities.

As a society, we may not yet be able to banish hunger entirely. But if we commit to using new technologies and mechanisms of citizen engagement widely and wisely, we could vastly reduce its power to do harm.

Better Governing Through Data


Editorial Board of the New York Times: “Government bureaucracies, as opposed to casual friendships, are seldom in danger from too much information. That is why a new initiative by the New York City comptroller, Scott Stringer, to use copious amounts of data to save money and solve problems, makes such intuitive sense.

Called ClaimStat, it seeks to collect and analyze information on the thousands of lawsuits and claims filed each year against the city. By identifying patterns in payouts and trouble-prone agencies and neighborhoods, the program is supposed to reduce the cost of claims the way CompStat, the fabled data-tracking program pioneered by the New York Police Department, reduces crime.

There is a great deal of money to be saved: In its 2015 budget, the city has set aside $674 million to cover settlements and judgments from lawsuits brought against it. That amount is projected to grow by the 2018 fiscal year to $782 million, which Mr. Stringer notes is more than the combined budgets of the Departments of Aging and Parks and Recreation and the Public Library.

The comptroller’s office issued a report last month that applied the ClaimStat approach to a handful of city agencies: the Police Department, Parks and Recreation, Health and Hospitals Corporation, Environmental Protection and Sanitation. It notes that the Police Department generates the most litigation of any city agency: 9,500 claims were filed against it in 2013, leading to settlements and judgments of $137.2 million.

After adjusting for the crime rate, the report found that several precincts in the South Bronx and Central Brooklyn had far more claims filed against their officers than other precincts in the city. What does that mean? It’s hard to know, but the implications for policy and police discipline would seem to be a challenge that the mayor, police commissioner and precinct commanders need to figure out. The data clearly point to a problem.
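The core of the ClaimStat idea, aggregating claims by precinct and normalizing by local activity to surface outliers, can be sketched in a few lines. The precinct names, counts, and threshold below are invented for illustration, not the comptroller's actual data or method:

```python
# A toy sketch of ClaimStat-style analysis: normalize claim counts by
# local crime counts and flag unusually high ratios. Numbers are invented.
claims = {"Precinct A": 120, "Precinct B": 45, "Precinct C": 130}
crimes = {"Precinct A": 800, "Precinct B": 900, "Precinct C": 400}

# Claims per recorded crime, as a crude adjustment for activity level.
rates = {p: claims[p] / crimes[p] for p in claims}

# Flag precincts whose rate exceeds an (arbitrary) threshold, worst first.
flagged = [p for p, r in sorted(rates.items(), key=lambda kv: -kv[1]) if r > 0.2]
print(flagged)  # → ['Precinct C']
```

As the editorial notes, a high ratio does not explain itself; it only tells policymakers where to look.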

Far more obvious conclusions may be reached from ClaimStat data covering issues like park maintenance and sewer overflows. The city’s tree-pruning budget was cut sharply in 2010, and injury claims from fallen tree branches soared. Multimillion-dollar settlements ensued.

The great promise of ClaimStat is making such shortsightedness blindingly obvious. And in exposing problems like persistent flooding from sewer overflows, ClaimStat can pinpoint troubled areas down to the level of city blocks. (We’re looking at you, Canarsie, and Community District 2 on Staten Island.)

Mayor Bill de Blasio’s administration has offered only mild praise for the comptroller’s excellent idea (“the mayor welcomes all ideas to make the city more effective and better able to serve its citizens”) while noting, perhaps a little defensively, that it is already on top of this, at least where the police are concerned. It has created a “Risk Assessment and Compliance Unit” within the Police Department to examine claims and make recommendations. The mayor’s aides also point out that the city’s payouts have remained flat over the last 12 years, for which they credit a smart risk-assessment strategy that knows when to settle claims and when to fight back aggressively in court.

But the aspiration of a well-run city should not be to hold claims even but to shrink them. And, at a time when anecdotes and rampant theorizing are fueling furious debates over police crime-fighting strategies, it seems beyond arguing that the more actual information, independently examined and publicly available, the better.”

Reddit, Imgur and Twitch team up as 'Derp' for social data research


in The Guardian: “Academic researchers will be granted unprecedented access to the data of major social networks including Imgur, Reddit, and Twitch as part of a joint initiative: The Digital Ecologies Research Partnership (Derp).
Derp – and yes, that really is its name – will be offering data to universities including Harvard, MIT and McGill, to promote “open, publicly accessible, and ethical academic inquiry into the vibrant social dynamics of the web”.
It came about “as a result of Imgur talking with a number of other community platforms online trying to learn about how they work with academic researchers,” says Tim Hwang, the image-sharing site’s head of special initiatives.
“In most cases, the data provided through Derp will already be accessible through public APIs,” he says. “Our belief is that there are ways of doing research better, and in a way that strongly respects user privacy and responsible use of data.
“Derp is an alliance of platforms that all believe strongly in this. In working with academic researchers, we support projects that meet institutional review at their home institution, and all research supported by Derp will be released openly and made publicly available.”
Hwang points to a Stanford paper analysing the success of Reddit’s Random Acts of Pizza subforum as an example of the sort of research Derp hopes to foster. In the research, Tim Althoff, Niloufar Salehi and Tuan Nguyen found that the likelihood of getting a free pizza from the Reddit community depended on a number of factors, including how the request was phrased, how much the user posted on the site, and how many friends they had online. In the end, they were able to predict with 67% accuracy whether or not a given request would be fulfilled.
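A prediction of the kind made in the Althoff et al. study can be illustrated with a simple feature-based score. The features echo the factors named above (phrasing, posting history, social ties), but the weights and threshold here are invented for illustration, not the paper's fitted model:

```python
# A toy scoring function in the spirit of the Random Acts of Pizza study.
# Weights and threshold are invented, not the study's fitted parameters.
def predict_fulfilled(request: dict) -> bool:
    score = 0.0
    score += 0.8 if request["mentions_need"] else 0.0  # how the request is phrased
    score += 0.02 * min(request["prior_posts"], 50)    # activity on the site
    score += 0.05 * min(request["friends"], 20)        # social ties online
    return score > 1.0

example = {"mentions_need": True, "prior_posts": 30, "friends": 10}
print(predict_fulfilled(example))  # → True
```

The real study fit such weights from thousands of labeled requests; the point of the sketch is only that the inputs are observable community signals, not the request's merit.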
The grouping aims to solve two problems academic research faces. Researchers themselves find it hard to get data outside of the largest social media platforms, such as Twitter and Facebook. The major services at least have a vibrant community of developers and researchers working on ways to access and use data, but for smaller communities, there’s little help provided.
Yet smaller is relative: Reddit may be a shrimp compared to Facebook, but with 115 million unique visitors every month, it’s still a sizeable community. And so Derp aims to offer “a single point of contact for researchers to get in touch with relevant team members across a range of different community sites….”

State Open Data Policies and Portals


New report by Laura Drees and Daniel Castro at the Center for Data Innovation: “This report provides a snapshot of states’ efforts to create open data policies and portals and ranks states on their progress. The six top-scoring states are Hawaii, Illinois, Maryland, New York, Oklahoma, and Utah. Each of these states has established an open data policy that requires basic government data, such as expenditure information, as well as other agency data, to be published on their open data portals in a machine-readable format. These portals contain extensive catalogs of open data, are relatively simple to navigate, and provide data in machine-readable formats as required. The next highest-ranked state, Connecticut, offers a similarly serviceable, machine-readable open data portal that provides wide varieties of information, but its policy does not require machine readability. Of the next three top-ranking states, Texas’s and Rhode Island’s policies require neither machine readability nor government data beyond expenditures; New Hampshire’s policy requires machine readability and many types of data, but its open data portal is not yet fully functional. States creating new open data policies or portals, or refreshing old ones, have many opportunities to learn from the experiences of early adopters in order to fully realize the benefits of data-driven innovation.”

Reprogramming Government: A Conversation With Mikey Dickerson


Q and A in The New York Times: “President Obama owes Mikey Dickerson two debts of gratitude. Mr. Dickerson was a crucial member of the team that, in just six weeks, fixed the HealthCare.gov website when the two-year, $400 million health insurance project failed almost as soon as it opened to the public in October.

Mr. Dickerson, 35, also oversaw the computers and wrote software for Mr. Obama’s 2012 re-election campaign, including crucial last-minute programs to figure out ad placement and plan “get out the vote” campaigns in critical areas. It was a good fit for him; since 2006, Mr. Dickerson had worked for Google on its computer systems, which have grown rapidly and are now among the world’s largest.

But last week Mr. Obama lured Mr. Dickerson away from Google. His new job at the White House will be to identify and fix other troubled government computer systems and websites. The engineer says he wants to change how citizens interact with the government as well as prevent catastrophes. He talked on Friday about his new role, in a conversation that has been condensed and edited….”